# Cache
LLM requests are NOT cached by default. However, you can turn on LLM request caching from the script metadata or from the CLI arguments.
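As a sketch, enabling caching in script metadata looks like this (the `title` value is illustrative; check your GenAIScript version's reference for the exact option set):

```js
script({
    title: "my-script",
    // enable caching of LLM requests for this script
    cache: true,
})
```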
The cache is stored in the `.genaiscript/cache/chat.jsonl` file. You can delete this file to clear the cache. This file is excluded from git by default.
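Since the cache is just a file, clearing it is a one-liner from the project root:

```shell
# delete the chat cache file to clear all cached LLM responses
rm -f .genaiscript/cache/chat.jsonl
```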
- .genaiscript
  - cache
    - chat.jsonl
## Custom cache file
Use the `cacheName` option to specify a custom cache file name. The name will be used to create a file in the `.genaiscript/cache` directory. Alternatively, use the `--cache-name` flag in the CLI.
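For example, a script that writes its cache entries under a custom name might declare (a sketch, assuming `cacheName` is set alongside `cache` in the script metadata):

```js
script({
    title: "summarizer",
    cache: true,
    // cache entries are stored in .genaiscript/cache/summary.jsonl
    cacheName: "summary",
})
```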
- .genaiscript
  - cache
    - summary.jsonl
## Programmatic cache
You can instantiate a custom cache object to manage the cache programmatically.