Description
gptscript version v0.8.2+1b5e068f
Steps to reproduce the problem:
- Execute the following chat-enabled script using a local provider (a quick reachability check for the local endpoint is sketched after the script):
cat examples/bobchat.gpt
tools: bob
chat: true
Ask Bob how he is doing and let me know exactly what he said.
---
name: bob
description: I'm Bob, a friendly guy.
args: question: The question to ask Bob.
chat: true
When asked how I am doing, respond with "Thanks for asking "${question}", I'm doing great fellow friendly AI tool!"
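For context, the local provider in this reproduction is an OpenAI-compatible server listening on http://localhost:1234/v1 (for example, LM Studio serving Llama-3-8b-function-calling-alpha-v1.gguf; the specific server is an assumption here). Assuming it exposes the standard /v1/models route, a minimal sanity check that the endpoint is reachable without any API key might look like:
# Illustrative sanity check: list models served by the local endpoint.
# No OpenAI API key should be required for this request.
curl -s http://localhost:1234/v1/models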
- The user is prompted to enter an OpenAI API key:
gptscript --disable-cache --default-model 'Llama-3-8b-function-calling-alpha-v1.gguf from http://localhost:1234/v1' examples/bobchat.gpt
12:32:27 WARNING: Changing the default model can have unknown behavior for existing tools. Use the model field per tool instead.
Please provide your OpenAI API key:
>
On entering an invalid (or empty) OpenAI API key, the script fails to execute:
gptscript --disable-cache --default-model 'Llama-3-8b-function-calling-alpha-v1.gguf from http://localhost:1234/v1' examples/bobchat.gpt
12:31:32 WARNING: Changing the default model can have unknown behavior for existing tools. Use the model field per tool instead.
2024/06/18 12:31:42 error, status code: 401, message: Incorrect API key provided: 234. You can find your API key at https://platform.openai.com/account/api-keys.
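The 401 above is returned in OpenAI's error format, which suggests the chat completion request was sent to api.openai.com rather than the configured local provider. As an illustrative comparison (an assumption, not part of the original report), sending the same placeholder key directly to the OpenAI API would be expected to produce a similar rejection:
# Illustrative only: the placeholder key "234" sent straight to OpenAI
# is rejected with the same "Incorrect API key provided" style of error.
curl -s https://api.openai.com/v1/models -H "Authorization: Bearer 234"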
Expected Behavior:
The user should not be prompted for an OpenAI API key when executing chat scripts with a local provider.
Note: when the TUI is disabled, e.g.
gptscript --disable-cache --default-model 'Llama-3-8b-function-calling-alpha-v1.gguf from http://localhost:1234/v1' --disable-tui examples/bobchat.gpt
there is no prompt for an OpenAI API key; chat completion requests are directed to the local model and succeed.
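For completeness, a minimal sketch of the kind of request the working (--disable-tui) path should end up sending to the local provider, assuming the server implements the standard OpenAI-compatible /v1/chat/completions endpoint (the JSON body is illustrative only, not captured from gptscript):
# Illustrative sketch of a chat completion request against the local provider;
# no real OpenAI API key is needed for a local OpenAI-compatible server.
curl -s http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "Llama-3-8b-function-calling-alpha-v1.gguf",
        "messages": [{"role": "user", "content": "Ask Bob how he is doing."}]
      }'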