I'm new to AI agents and am building one. I'm using Google Gemini 2.0 Flash and have bound the LLM to a few tools, but the LLM's responses are not consistent: sometimes it calls the tools and sometimes it doesn't. It feels as if the LLM were caching the response between the tool's API calls, but I'm not sure that is actually what is happening. I'm using LangGraph in Python to develop the agent.
from dotenv import load_dotenv
import os

load_dotenv()  # load variables from .env before reading them
APP_ENV = os.getenv("APP_ENV")
print("APP_ENV", APP_ENV)
from langchain_ollama import ChatOllama
from internal_tools.fetchProperties import getProperties,getLocationIds
from langgraph.checkpoint.memory import MemorySaver
from internal_tools.fetchFlightTickets import fetchFlightTickets
from internal_tools.travel_excursion import getTravelExcursion
from internal_tools.rentals import getRentals
from langgraph.graph import MessagesState
from langchain_core.messages import HumanMessage, SystemMessage
from langgraph.graph import START, StateGraph
from langchain_google_genai import ChatGoogleGenerativeAI
from langgraph.prebuilt import tools_condition
from langgraph.prebuilt import ToolNode
# from IPython.display import Image, display
tools = [getProperties,getLocationIds,fetchFlightTickets,getTravelExcursion,getRentals]
llm = None
if APP_ENV != "local":
    print("using env: ", APP_ENV)
    llm = ChatGoogleGenerativeAI(model="gemini-2.0-flash", temperature=0, cache=False)
else:
    print("using ollama model")
    llm = ChatOllama(
        model="llama3.1:8b",
        temperature=0,
        # other params...
    )
llm_with_tools = llm.bind_tools(tools,tool_choice="auto")
systemPrompt="""
You are a smart AI travel agent assisting users with flights, hotels, car rentals, and excursions. Always provide related recommendations:
1. If a user searches for flights, suggest hotels, car rentals, and excursions at the arrival location.
2. If a user searches for hotels, recommend flights, car rentals, and local activities.
3. If a user searches for car rentals, suggest flights, hotels, and excursions.
4. If a user searches for excursions, offer flights, hotels, and transport options.
Response Rules:
1. Flights: Do not include airline codes; instead, return the airline name.
2. Hotels:
a. Strictly avoid using the LLM’s knowledge base for hotel recommendations.
b. To fetch hotels, first retrieve location IDs using the getLocationIds tool, then pass them to the getProperties tool.
c. If getProperties does not return data, set hotels to null instead of making assumptions.
Conditional Data Inclusion:
1. If the user explicitly asks for flights or hotels, pre-fill the respective data.
2. Otherwise, set flight and hotels to null.
Response Format:
{
"contentMessage": "Message to show user on the screen",
"flight": null,
"hotels": null
}
1. If relevant data is available:
{
"contentMessage": "Your flight options are listed below.",
"flight": [
{
"flightName": "Extracted Flight Name",
"flightDuration": "Extracted Flight Duration",
"flightPrice": "Extracted Flight Price"
}
],
"hotels": [
{
"id": "Property Id",
"address": "Property address",
"description": "Property description",
"image": "Property image",
"title": "Property title"
}
]
}
2. If any tool returns null, set the respective field to null.
Assume the user is in the United States.
"""
systemPromptForDataFetch = """
You are a smart AI travel agent. When a user mentions a location:
1. First, extract the location the user is travelling to and use getLocationIds to find the location's ID
2. Then, use getProperties with the found location ID to fetch hotels
3. Return the results in the following JSON format:
{
"flight":[
{
"flightName":"Extracted Flight Name",
"flightDuration":"Extracted Flight Duration",
"flightPrice":"Extracted Flight Price"
}
],
"hotels":[
{
"id":"Property Id",
"address":"Property address",
"description":"Property description",
"image":"Property image",
"title":"Property title"
}
]
}
If no hotels are found or no location is identified, return an empty list.
"""
sys_msg = SystemMessage(content=systemPrompt)
sys_msg_html = SystemMessage(content=systemPromptForDataFetch)
# Node
def assistant(state: MessagesState):
    return {"messages": [llm_with_tools.invoke([sys_msg] + state["messages"])]}

def flightAssistant(state: MessagesState):
    return {"messages": [llm_with_tools.invoke([sys_msg_html] + state["messages"])]}
# flight_node=flightNode(llm_with_tools=llm_with_tools)
# flight_node=flight
builder = StateGraph(MessagesState)
flight = StateGraph(MessagesState)
flight.add_node("assistant",flightAssistant)
flight.add_node("tools",ToolNode(tools))
flight.add_edge(START, "assistant")
flight.add_conditional_edges(
    "assistant",
    # If the latest message (result) from assistant is a tool call -> tools_condition routes to tools
    # If the latest message (result) from assistant is not a tool call -> tools_condition routes to END
    tools_condition,
)
flight.add_edge("tools", "assistant")
# Define nodes: these do the work
builder.add_node("assistant", assistant)
builder.add_node("tools", ToolNode(tools))
# builder.add_node("flight_node",flight)
# Define edges: these determine how the control flow moves
builder.add_edge(START, "assistant")
builder.add_conditional_edges(
    "assistant",
    # If the latest message (result) from assistant is a tool call -> tools_condition routes to tools
    # If the latest message (result) from assistant is not a tool call -> tools_condition routes to END
    tools_condition,
)
builder.add_edge("tools", "assistant")
# Added memory to graph
memory = MemorySaver()
react_graph = builder.compile(checkpointer=memory)
flight_graph = flight.compile()
# Show
# display(Image(react_graph.get_graph(xray=True).draw_mermaid_png()))
# messages = [HumanMessage(content="I am looking for a flight from Chicago to New York on 25th April 2025. I am travelling with my wife. Find me the best flight prices.")]
# messages = react_graph.invoke({"messages": messages})
# for m in messages['messages']:
# m.pretty_print()
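For completeness, this is roughly how I invoke the graph and check whether the model actually emitted tool calls. It's only a debugging sketch: the thread_id value is just a placeholder (the MemorySaver checkpointer needs one in the config), and the loop simply prints the tool_calls attribute of each AI message:

from langchain_core.messages import AIMessage

# Placeholder thread id; the MemorySaver checkpointer requires one in the config
config = {"configurable": {"thread_id": "1"}}
messages = [HumanMessage(content="I am looking for a flight from Chicago to New York on 25th April 2025. I am travelling with my wife. Find me the best flight prices.")]
result = react_graph.invoke({"messages": messages}, config=config)

# Inspect which tools (if any) the model decided to call on each turn
for m in result["messages"]:
    if isinstance(m, AIMessage) and m.tool_calls:
        print("tool calls:", [tc["name"] for tc in m.tool_calls])
    m.pretty_print()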