talon-ai-tools can be configured by changing settings in any .talon file. You can copy any of the following settings, uncomment them, and change the values to customize which model you use or its runtime behavior.
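Below is a sketch of what such a settings block in a .talon file might look like. The specific setting names (user.openai_model, user.model_temperature, user.model_endpoint) are assumptions for illustration; check the repository's settings definitions for the authoritative names and defaults.

```
settings():
    # Which model requests are sent to (assumed setting name)
    # user.openai_model = "gpt-4o"

    # Sampling temperature passed to the model (assumed setting name)
    # user.model_temperature = 0.6

    # An alternative OpenAI-compatible endpoint (assumed setting name)
    # user.model_endpoint = "https://api.openai.com/v1/chat/completions"
```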
Adding custom prompts
You do not need to fork the repository to add your own custom prompts. Copy the file below, place it anywhere inside your Talon user/ directory, and follow the pattern of the key-value mapping.
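A minimal sketch of such a file follows, assuming the list of custom prompts is named user.customPrompt (check the repository's lists for the actual list name and the file to copy). Each line maps a spoken phrase to the prompt text sent to the model.

```
# e.g. user/myCustomPrompts.talon-list
list: user.customPrompt
-

# spoken phrase: prompt text sent to the model
check spelling: Fix any spelling mistakes in the following text.
make concise: Rewrite the following text to be shorter and clearer.
```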
Advanced Customization
Configuring Model Name
The word model is the default prefix before all LLM commands to prevent collisions with other Talon commands. However, you can change or override it. To do so, just create another Talon list with the same name and a higher specificity. Here is an example that you can copy and paste into your own configuration files:
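The sketch below assumes the list is named user.model and that a user-supplied .talon-list definition takes precedence over the default one; the spoken form "please" is only an illustration, and the exact override mechanism may differ in your setup.

```
# e.g. user/model_prefix.talon-list
list: user.model
# If your setup needs higher specificity, add a context matcher here
# (e.g. hostname: my-computer) above the dash.
-

# spoken prefix: value passed through to the commands
please: model
```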
Providing Custom User Context
If you want to provide additional context to the LLM, there is a hook that you can override in your own Python code; anything it returns will be sent with every request. Here is an example:
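This sketch assumes the hook is the user.gpt_additional_user_context action and that it returns a list of strings; confirm the exact action name and return type in the repository's Python source before copying it.

```python
from talon import Context

ctx = Context()


@ctx.action_class("user")
class UserActions:
    def gpt_additional_user_context():
        """Override of the default hook; everything returned here is
        included with every request sent to the model."""
        # Assumed return type: a list of strings
        return ["The user is a software developer who prefers Python."]
```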