Description
BASE::
MacBook Pro (13-inch, M1, 2020)
Chip: Apple M1, Memory: 16 GB, macOS Ventura 13.6.6
LM Studio: Version 0.2.21 (0.2.21)
Visual Studio Code: Version 1.88.1, Commit:
STEPS::
I changed .env to point at the local LLM served by LM Studio (a quick endpoint check is sketched after these steps):
# OPENAI_ENDPOINT=https://api.openai.com/v1/chat/completions
OPENAI_ENDPOINT=http://localhost:8081/v1
OPENAI_API_KEY=lm-studio
In VS Code I updated the location of the gpt-pilot folder.
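For reference, a minimal sketch of the sanity check I ran against LM Studio's OpenAI-compatible server at the endpoint configured above (my own script, not part of gpt-pilot; "local-model" is a placeholder, since LM Studio serves whatever model is currently loaded):

```python
# Sanity check against the LM Studio server configured in .env above.
# Assumptions: the server listens on localhost:8081 and exposes the usual
# OpenAI-compatible /v1/chat/completions route; "local-model" is a placeholder.
import requests

resp = requests.post(
    "http://localhost:8081/v1/chat/completions",
    headers={"Authorization": "Bearer lm-studio"},
    json={
        "model": "local-model",
        "messages": [{"role": "user", "content": "Say hello"}],
        "stream": False,
    },
    timeout=60,
)
print(resp.status_code)
print(resp.json()["choices"][0]["message"]["content"])
```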
BACKGROUND::
I can use the chat
ISSUE::
I cannot use the code creator
"There was a problem with request to openai API:
string indices must be integers, not 'str'"
"Do you want to try make the same request again? If yes, just press ENTER. Otherwise, type "no"."
NO::
Traceback (most recent call last):
File "/Users/business/Documents/1 Projects/GPTProjects/PythagoraVSCodeExtension/gpt-pilot/pilot/utils/llm_connection.py", line 219, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/Users/business/Documents/1 Projects/GPTProjects/PythagoraVSCodeExtension/gpt-pilot/pilot/utils/llm_connection.py", line 502, in stream_gpt_completion
raise ValueError(f'Error in LLM response: {json_line["error"]["message"]}')
~~~~~~~~~~~~~~~~~~^^^^^^^^^^^
TypeError: string indices must be integers, not 'str'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Users/business/Documents/1 Projects/GPTProjects/PythagoraVSCodeExtension/gpt-pilot/pilot/utils/llm_connection.py", line 151, in create_gpt_chat_completion
response = stream_gpt_completion(gpt_data, req_type, project, model_name)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/business/Documents/1 Projects/GPTProjects/PythagoraVSCodeExtension/gpt-pilot/pilot/utils/llm_connection.py", line 301, in wrapper
raise ApiError(f"Error making LLM API request: {err_str}") from e
helpers.exceptions.ApiError: Error making LLM API request: string indices must be integers, not 'str'
The request to OPENAI API failed with error: Error making LLM API request: string indices must be integers, not 'str'. Please try again later.
Error connecting to the API. Please check your API key/endpoint and try again.
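The relevant frame is the `json_line["error"]["message"]` lookup at llm_connection.py line 502. A minimal sketch of what I think is happening (my assumption: the local server's response line reaches that code as a raw string rather than a parsed dict; the "not 'str'" wording matches Python 3.11+):

```python
import json

# What I suspect reaches line 502: a raw response line that was never parsed.
json_line = '{"error": {"message": "model not loaded"}}'  # still a str

try:
    json_line["error"]["message"]   # indexing a str with a str key
except TypeError as exc:
    print(exc)                      # string indices must be integers, not 'str'

# What that line presumably expects: a parsed dict.
parsed = json.loads(json_line)
print(parsed["error"]["message"])   # model not loaded
```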
Traceback (most recent call last):
File "/Users/business/Documents/1 Projects/GPTProjects/PythagoraVSCodeExtension/gpt-pilot/pilot/utils/llm_connection.py", line 219, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/Users/business/Documents/1 Projects/GPTProjects/PythagoraVSCodeExtension/gpt-pilot/pilot/utils/llm_connection.py", line 502, in stream_gpt_completion
raise ValueError(f'Error in LLM response: {json_line["error"]["message"]}')
~~~~~~~~~~~~~~~~~~^^^^^^^^^^^
TypeError: string indices must be integers, not 'str'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/Users/business/Documents/1 Projects/GPTProjects/PythagoraVSCodeExtension/gpt-pilot/pilot/utils/llm_connection.py", line 81, in test_api_access
response = create_gpt_chat_completion(messages, 'project_description', project)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/business/Documents/1 Projects/GPTProjects/PythagoraVSCodeExtension/gpt-pilot/pilot/utils/llm_connection.py", line 163, in create_gpt_chat_completion
raise e
File "/Users/business/Documents/1 Projects/GPTProjects/PythagoraVSCodeExtension/gpt-pilot/pilot/utils/llm_connection.py", line 151, in create_gpt_chat_completion
response = stream_gpt_completion(gpt_data, req_type, project, model_name)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Users/business/Documents/1 Projects/GPTProjects/PythagoraVSCodeExtension/gpt-pilot/pilot/utils/llm_connection.py", line 301, in wrapper
raise ApiError(f"Error making LLM API request: {err_str}") from e
helpers.exceptions.ApiError: Error making LLM API request: string indices must be integers, not 'str'
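Not a proper fix, just a hedged illustration of the kind of guard that would turn this crash back into a readable error message (I have not tested it against the actual gpt-pilot code path):

```python
import json

def describe_llm_error(json_line):
    """Best-effort extraction of an error message from an LLM response line.

    Illustration only: handles the case where the local server hands back a
    raw string (or a non-JSON body) instead of the dict the code expects.
    """
    if isinstance(json_line, str):
        try:
            json_line = json.loads(json_line)
        except json.JSONDecodeError:
            return json_line  # surface the raw body instead of crashing
    if isinstance(json_line, dict):
        error = json_line.get("error")
        if isinstance(error, dict):
            return error.get("message", str(error))
        if error is not None:
            return str(error)
    return str(json_line)
```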
POSSIBLE SOLUTION::
continuedev/continue#801
"Adding the "apiBase": "http://localhost:1234/v1/" line to the config fixes the issue, however it wasn't necessary a few days ago. Seems like a regression."
where is this config file?
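My own guess (unverified): if the analogous knob in gpt-pilot is the OPENAI_ENDPOINT value in .env, the apiBase idea from that issue comes down to choosing between the bare base URL and the full chat-completions path; note that the commented-out OpenAI default above includes /chat/completions while my override stops at /v1. A quick sketch that probes both forms:

```python
import requests

# My assumption: the relevant knob in gpt-pilot is OPENAI_ENDPOINT in .env.
# The commented-out OpenAI default uses the full /chat/completions path, while
# my override stops at /v1 -- this probes both forms against LM Studio.
candidates = [
    "http://localhost:8081/v1",
    "http://localhost:8081/v1/chat/completions",
]
payload = {
    "model": "local-model",  # placeholder; LM Studio serves the loaded model
    "messages": [{"role": "user", "content": "ping"}],
    "stream": False,
}
for url in candidates:
    try:
        status = requests.post(url, json=payload, timeout=30).status_code
    except requests.RequestException as exc:
        status = f"request failed: {exc}"
    print(url, "->", status)
```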
Otherwise, I am in your hands.
Thank you