FAQ and Troubleshooting
Work in Progress 🏗️
This page is a work in progress. If you don't find an answer to your question here or in the guide, feel free to reach out or to open an issue on GitHub.
How can I troubleshoot connection problems between Thought Stream and my LLM-Provider?
You can troubleshoot connection problems by following these steps:
- Check your API key and ensure it is correctly entered in the Thought Stream settings.
- Check your API URL and ensure it is correctly entered in the Thought Stream settings (a quick way to verify both key and URL is sketched after this list).
- Ensure that you have a stable internet connection.
- Check the API status of your LLM-Provider to see if there are any ongoing issues.
- If you are using a self-hosted LLM, ensure that it is running and accessible from your network.
- Turn on the developer mode in the plugin settings and check the notices in the plugin interface for any error messages or warnings that might indicate the source of the problem.
- If the problem persists, check the developer console in Obsidian for any error messages related to Thought Stream. You can open the developer console by pressing `Ctrl + Shift + I` (or `Cmd + Option + I` on Mac) and navigating to the "Console" tab.
- If you still can't resolve the issue, consider reaching out for help by opening an issue on GitHub.
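While the developer console is open, you can also rule out network or credential issues by calling your configured endpoint directly. The snippet below is a minimal sketch against the standard `/models` route of an OpenAI-compatible API; substitute the API URL and key from your Thought Stream settings.

```typescript
// Minimal connectivity check against an OpenAI-compatible endpoint.
// Replace the URL and key with the values from your Thought Stream settings.
const apiUrl = "https://api.openai.com/v1"; // your configured API URL
const apiKey = "sk-...";                    // your configured API key

fetch(`${apiUrl}/models`, {
  headers: { Authorization: `Bearer ${apiKey}` },
})
  .then((res) => {
    // 200 means the endpoint is reachable and the key is accepted;
    // 401/403 points to the key, a network error to the URL or connection.
    console.log("Status:", res.status);
    return res.json();
  })
  .then((body) => console.log(body))
  .catch((err) => console.error("Connection failed:", err));
```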
How do I test the latest extension version from GitHub?
Is Thought Stream also available for Obsidian Mobile?
Yes, Thought Stream is available for Obsidian Mobile.
How do I build my own version of Thought Stream?
How does Thought Stream read note data?
Thought Stream reads note data from your Obsidian vault using the Obsidian plugin API. It does not access or read any data outside your vault.
Thought Stream usually reads only the currently active note, except when you explicitly upload an audio file.
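For reference, reading the active note through the Obsidian plugin API looks roughly like the sketch below. This is a simplified illustration of the mechanism, not Thought Stream's actual implementation.

```typescript
import { App, TFile } from "obsidian";

// Illustrative sketch: read the currently active note via the official Obsidian API.
async function readActiveNote(app: App): Promise<string | null> {
  const file: TFile | null = app.workspace.getActiveFile();
  if (!file) {
    return null; // no note is currently open
  }
  return app.vault.read(file); // returns the note's Markdown content as a string
}
```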
How do I include / exclude certain (private) notes from being processed?
You can include or exclude certain notes from being processed by Thought Stream by using the `include` and `exclude` settings of the plugin.
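The exact pattern format is defined by the plugin settings; as a purely hypothetical illustration, folder- or glob-style filters of the following shape are a common convention:

```typescript
// Hypothetical example only — check the Thought Stream settings UI for the
// exact format it expects. Excluded notes are never sent to the LLM-Provider.
const filters = {
  include: ["Journal/**", "Projects/**"],
  exclude: ["Private/**", "Templates/**"],
};
```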
How does Thought Stream handle my note data?
Thought Stream sends note data to the configured LLM-Provider for processing. The LLM-Provider might store the data according to its own privacy policy. The Thought Stream plugin has no control over how the LLM-Provider handles the data.
You can also configure Thought Stream to use a self-hosted / local LLM compatible with the OpenAI API. This way you can ensure that no sensitive data is sent to a third-party service.
The plugin itself does not store or log any data outside your vault. All the data storage and processing aside from transcription, question generation and content generation happens locally in your Obsidian vault.
Which LLM-Providers are supported?
Currently, Thought Stream supports all OpenAI-API-compatible providers, including:
- OpenAI
- OpenRouter
- Anthropic
- Azure OpenAI
How do I report a bug or request a feature?
You can report bugs or request features by opening an issue on the GitHub repository or by contacting me directly through the contact form.
How do I get an API key for my LLM-Provider?
You can get an API key for your LLM-Provider by following these steps:
- Visit the website of your LLM-Provider (e.g., OpenAI or OpenRouter).
- Sign up for an account if you don't have one.
- Navigate to the API section of the website.
- Follow the instructions to create a new API key.
How do I configure Thought Stream to use my (local) LLM-Provider?
You can configure Thought Stream to use your LLM-Provider by following these steps:
- Open the Obsidian settings.
- Navigate to the "Thought Stream" plugin settings.
- Enter your custom API URL and API key (example values for a local provider are shown after this list).
- Select the model you want to use.
- Select the language you want to use for the transcription.
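As an example, to point the plugin at a local Ollama server (which exposes an OpenAI-compatible API), the values would look roughly like this; the exact labels in the settings UI may differ, and the model name is only an example:

- API URL: `http://localhost:11434/v1` (Ollama's OpenAI-compatible endpoint)
- API key: any placeholder string (local servers typically ignore it)
- Model: `llama3.1` (or whichever model you have pulled locally)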
How do I use Thought Stream to transcribe existing audio files?
You can use Thought Stream to transcribe audio files by following these steps:
- Open the command palette with `Ctrl/Cmd + P`.
- Search for "Transcribe Audio File" and select it.
- A file dialog will appear. Choose the audio file you want to transcribe.
- The plugin will transcribe the selected file. The output will be saved according to your settings, either in a new note or at the cursor position in the current note.
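Under the hood, such a transcription is just a request to your provider's audio endpoint. The sketch below shows what that request roughly looks like for the standard OpenAI-compatible API; the endpoint path and model name (`whisper-1`) come from the OpenAI API, not from the plugin's source, so treat it as an illustration only.

```typescript
// Illustrative sketch of an OpenAI-compatible transcription request.
// apiUrl and apiKey correspond to the values in your Thought Stream settings.
async function transcribe(apiUrl: string, apiKey: string, audio: Blob): Promise<string> {
  const form = new FormData();
  form.append("file", audio, "recording.webm"); // the audio file to transcribe
  form.append("model", "whisper-1");            // example model; depends on your provider

  const res = await fetch(`${apiUrl}/audio/transcriptions`, {
    method: "POST",
    headers: { Authorization: `Bearer ${apiKey}` },
    body: form,
  });
  if (!res.ok) {
    throw new Error(`Transcription failed with status ${res.status}`);
  }
  const data = await res.json();
  return data.text; // the transcribed text
}
```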
How do I use Thought Stream to generate content?
You can use Thought Stream to generate content by following these steps:
- Using the command palette:
  - Open the command palette with `Ctrl/Cmd + P`.
  - Search for "Generate Content" and select it.
- Using the Thought Stream interface:
  - Open the Thought Stream interface, e.g. by clicking the ribbon icon.
  - Click the "Generate Content" button.
- A dialog will appear. Choose a preset or enter a configuration for the content you want to generate.
- Click the "Generate" button. The plugin will generate the content based on your configuration and create a new note.
How do I use Thought Stream to generate questions?
You can use Thought Stream to generate questions by following these steps:
- Open the Thought Stream interface, e.g. by clicking the ribbon icon.
- Click the "Generate Questions" button.
Tip 💡
If you want, you can configure the plugin to automatically generate questions for each note. You can do this by enabling the "Auto read active file" setting in the Thought Stream settings.
Caution 💸
Be aware that this will automatically send requests to your LLM-Provider for each note you open, which might incur costs depending on your provider's pricing model.