[code]
import json
from llamaapi import LlamaAPI

# Initialize the SDK
llama = LlamaAPI("")

# Build the API request
api_request_json = {
    "model": "llama3.1-70b",
    "messages": [
        {"role": "user", "content": "What is the weather like in Boston?"},
    ],
    "functions": [
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "days": {
                        "type": "number",
                        "description": "for how many days ahead you want the forecast",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
            },
            "required": ["location", "days"],
        }
    ],
    "stream": False,
    "function_call": "get_current_weather",
}

# Execute the request
response = llama.run(api_request_json)
print(json.dumps(response.json(), indent=2))
[/code]
It returns the following error:
[code]
Traceback (most recent call last):
  File "C:\Users\some_user\OneDrive - Ryder\Projects\Tutorials\Pycharm\AI_tests\llama_api.py", line 39, in
    response = llama.run(api_request_json)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\some_user\myenv\Lib\site-packages\llamaapi\llamaapi.py", line 67, in run
    return self.run_sync(api_request_json)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\some_user\myenv\Lib\site-packages\llamaapi\llamaapi.py", line 53, in run_sync
    raise Exception(f"POST {response.status_code} {response.json()['detail']}")
                           ~~~~~~~~~~~~~~~^^^^^^^^^^
TypeError: list indices must be integers or slices, not str
[/code]
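For context while debugging (my own reading of the traceback, not something from the quick start): the crash happens inside llamaapi itself. Its `run_sync` assumes `response.json()` is a dict with a `"detail"` key, but here the server evidently returned a top-level JSON list, so `['detail']` raises the `TypeError`. A minimal sketch of a shape-tolerant extractor (the helper name `extract_detail` is hypothetical, not part of llamaapi):

```python
def extract_detail(payload):
    """Pull an error detail out of a JSON body that may be a dict or a list.

    llamaapi's run_sync effectively does payload['detail'], which raises
    TypeError when the server returns a top-level JSON list instead of a dict.
    """
    if isinstance(payload, dict):
        return payload.get("detail", payload)
    return payload  # e.g. FastAPI-style validation errors arrive as a list


# The dict shape works the way llamaapi expects:
print(extract_detail({"detail": "Invalid API key"}))
# The list shape no longer crashes:
print(extract_detail([{"msg": "field required", "loc": ["body", "functions"]}]))
```

Wrapping the `llama.run(...)` call in a `try`/`except` and logging the raw response body this way would at least surface the server's actual error message instead of the secondary `TypeError`.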
The code above is taken from the Llama Quick Start. What is the fix here?
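One thing worth double-checking in the request (an observation, not a confirmed fix): in JSON Schema, the `required` list belongs inside the `parameters` object, next to `properties`, while the request above places it one level higher, beside `parameters`. A sketch of the function definition with the key relocated:

```python
import json

# Sketch: "required" moved inside "parameters", where JSON Schema expects
# it (in the original request it sits one level too high, at the function
# level next to "parameters").
function_def = {
    "name": "get_current_weather",
    "description": "Get the current weather in a given location",
    "parameters": {
        "type": "object",
        "properties": {
            "location": {
                "type": "string",
                "description": "The city and state, e.g. San Francisco, CA",
            },
            "days": {
                "type": "number",
                "description": "for how many days ahead you want the forecast",
            },
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
        },
        "required": ["location", "days"],  # moved from the function level
    },
}

print(json.dumps(function_def, indent=2))
```

If the server validates the schema strictly, a misplaced `required` could produce exactly the kind of validation-error response that then trips up llamaapi's error handling.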