I ran into an error while working with the langchain_core / langchain-openai libraries, and I hope someone can help me resolve this issue.
Error message:

[code]AttributeError: 'str' object has no attribute 'model_dump'[/code]

Sample code:

[code]import pandas as pd
from data_api import *
from langchain_openai import ChatOpenAI
# from langchain.chat_models import ChatOpenAI
from dotenv import load_dotenv, find_dotenv
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser
from perplexity.perplexity import search_chat_completion
from langchain.prompts import PromptTemplate
from langchain_core.runnables import RunnableLambda
from operator import itemgetter
import json
import os
from typing import List, Dict, Any
from tqdm import tqdm
return chain.invoke({"company": symbol})[/code]

Traceback:

[code]AttributeError("'str' object has no attribute 'model_dump'")
Traceback (most recent call last):
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/runnables/base.py", line 3022, in invoke input = context.run(step.invoke, input, config, **kwargs)
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/runnables/base.py", line 3727, in invoke output = {key: future.result() for key, future in zip(steps, futures)}
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/runnables/base.py", line 3727, in output = {key: future.result() for key, future in zip(steps, futures)}
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/concurrent/futures/_base.py", line 439, in result return self.__get_result()
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/concurrent/futures/_base.py", line 391, in __get_result raise self._exception
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/concurrent/futures/thread.py", line 58, in run result = self.fn(*self.args, **self.kwargs)
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/runnables/base.py", line 3711, in _invoke_step return context.run(
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/runnables/base.py", line 3022, in invoke input = context.run(step.invoke, input, config, **kwargs)
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/runnables/base.py", line 3727, in invoke output = {key: future.result() for key, future in zip(steps, futures)}
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/runnables/base.py", line 3727, in output = {key: future.result() for key, future in zip(steps, futures)}
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/concurrent/futures/_base.py", line 446, in result return self.__get_result()
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/concurrent/futures/_base.py", line 391, in __get_result raise self._exception
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/concurrent/futures/thread.py", line 58, in run result = self.fn(*self.args, **self.kwargs)
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/runnables/base.py", line 3711, in _invoke_step return context.run(
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/runnables/base.py", line 3024, in invoke input = context.run(step.invoke, input, config)
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py", line 286, in invoke self.generate_prompt(
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py", line 786, in generate_prompt return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py", line 643, in generate raise e
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py", line 633, in generate self._generate_with_cache(
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py", line 851, in _generate_with_cache result = self._generate(
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_openai/chat_models/base.py", line 718, in _generate return self._create_chat_result(response, generation_info)
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_openai/chat_models/base.py", line 745, in _create_chat_result response if isinstance(response, dict) else response.model_dump()
AttributeError: 'str' object has no attribute 'model_dump'[/code]

Environment information:
Attempted fix:

I updated the parser function to handle str objects, but the problem persists:

[code]from langchain_core.messages import AIMessage

def wrap_chain_output(chain_output):
    """Wrap child chain output."""
    if isinstance(chain_output, str):
        return chain_output
    elif isinstance(chain_output, AIMessage):
        return chain_output.content
    else:
        raise TypeError("Unsupported type for chain_output")[/code]
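For reference, this is roughly how the wrapper hangs off the chain (a minimal sketch; the invocation value is a placeholder). Judging by the traceback, though, the error is raised inside the model step itself, before any downstream wrapper ever runs, which would explain why this fix never gets a chance to help:

[code]from langchain_core.runnables import RunnableLambda

# Attach the wrapper as the final step so that whatever the model step
# emits (str or AIMessage) is normalized to plain text.
chain_with_wrapper = chain216 | RunnableLambda(wrap_chain_output)

result = chain_with_wrapper.invoke({"company": "ACME"})  # placeholder symbol[/code]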
Example chain:

[code]chain216 = (
    PromptTemplate.from_template(
        """
        **Search for the industry corresponding to the company **{company}** and then conduct research.**
        Engage in divergent thinking, including but not limited to the following keywords.
        """
    )
    | self.pplx
)[/code]
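From the traceback, `_create_chat_result` in langchain_openai expects the raw OpenAI response (a dict or a pydantic object exposing `model_dump()`), so my suspicion is that `self.pplx` ends up routing the plain string coming back from the Perplexity side through `ChatOpenAI`'s response parsing. Below is a minimal sketch of sidestepping that parsing by exposing the search call as a `RunnableLambda` instead; the `search_chat_completion` signature here is an assumption, not its documented API:

[code]from langchain_core.prompts import PromptTemplate
from langchain_core.runnables import RunnableLambda

def call_pplx(prompt_value):
    # PromptTemplate emits a StringPromptValue; .to_string() yields the text.
    # Assumption: search_chat_completion takes a prompt string and returns
    # the answer as a plain string.
    return search_chat_completion(prompt_value.to_string())

chain216 = (
    PromptTemplate.from_template("Research the industry of {company}.")
    | RunnableLambda(call_pplx)
)[/code]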
LangSmith error trace info: LG-Bugs.jpg
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py", line 633, in generate self._generate_with_cache(
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_core/language_models/chat_models.py", line 851, in _generate_with_cache result = self._generate(
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_openai/chat_models/base.py", line 718, in _generate return self._create_chat_result(response, generation_info)
File "/home/azureuser/miniconda3/envs/llm39/lib/python3.9/site-packages/langchain_openai/chat_models/base.py", line 745, in _create_chat_result response if isinstance(response, dict) else response.model_dump()
AttributeError: 'str' object has no attribute 'model_dump' [/code]
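For what it's worth, the failing branch from the traceback can be reproduced in isolation: line 745 of langchain_openai's chat_models/base.py accepts either a dict or an object with `model_dump()` (the openai-python pydantic response model), and a plain string is neither:

[code]# The branch from langchain_openai/chat_models/base.py, line 745:
response = "text returned by a str-returning client"  # stand-in value
parsed = response if isinstance(response, dict) else response.model_dump()
# -> AttributeError: 'str' object has no attribute 'model_dump'[/code]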