LangChain Agent Custom Tools Class: Input Parameters (Part 2)

Here we pass input parameters to custom tool functions, with one single-input example and one multi-input example:

from typing import Optional, Union
import os

from langchain.agents import Tool, initialize_agent, AgentType
from langchain.tools import BaseTool
from langchain.chat_models import ChatOpenAI
from langchain.chains.conversation.memory import ConversationBufferWindowMemory


class Calculate_Life(BaseTool):  # the RUL value here is made up for demonstration
    name = "Calculate Life Tool"
    description = "use this tool when you need to calculate the happiness of life using the RUL value"

    def _run(self, RUL: Union[int, float]):
        return f"The happiness of life is {RUL * 2}"

    def _arun(self, RUL: int):
        raise NotImplementedError("This tool does not support async")
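
As a quick sanity check (not part of the original flow), the single-input tool can be called directly before handing it to an agent; the variable name below is just for illustration:

# hypothetical quick check of the single-input tool, outside of any agent
life_tool = Calculate_Life()
print(life_tool._run(2))   # prints "The happiness of life is 4"
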
class Calculate_Volume(BaseTool):
    name = "Calculate Object Volume"
    description = (
        "use this tool when you need to calculate the volume of an object. "
        "The calculation needs multiple input parameters: the length, width, and height. "
        "To use the tool, you must provide all three of the following parameters "
        "['length', 'width', 'height']."
    )

    def _run(
        self,
        length: Optional[Union[int, float]] = None,
        width: Optional[Union[int, float]] = None,
        height: Optional[Union[int, float]] = None
    ):
        # check that all three values were given (0 is a valid value, so test against None)
        if length is not None and width is not None and height is not None:
            return length * width * height
        else:
            return "Could not calculate the volume. Need all three of `length`, `width`, `height`."

    def _arun(self, query: str):
        raise NotImplementedError("This tool does not support async")
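
If you want the parameter names and types of the multi-input tool to be explicit, LangChain's BaseTool also accepts an args_schema. The sketch below is optional and assumes the pydantic v1 style used by early LangChain releases; VolumeInput and Calculate_Volume_With_Schema are names made up for illustration:

from typing import Type
from pydantic import BaseModel, Field

class VolumeInput(BaseModel):
    length: Optional[float] = Field(None, description="length of the object")
    width: Optional[float] = Field(None, description="width of the object")
    height: Optional[float] = Field(None, description="height of the object")

class Calculate_Volume_With_Schema(Calculate_Volume):
    # same behaviour as Calculate_Volume, but the expected arguments are declared explicitly
    args_schema: Type[BaseModel] = VolumeInput

If you go down this route, the structured-chat agent type (AgentType.STRUCTURED_CHAT_ZERO_SHOT_REACT_DESCRIPTION) is generally a better fit for multi-input tools than the conversational agent used below.
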
os.environ["OPENAI_API_KEY"] = "your-openai-api-key"  # replace with your own key; never hard-code a real key in shared code
llm = ChatOpenAI(
    openai_api_key=os.environ["OPENAI_API_KEY"] ,
    temperature=0,
    model_name='gpt-3.5-turbo'
)
# initialize conversational memory
conversational_memory = ConversationBufferWindowMemory(
        memory_key='chat_history',
        k=5,
        return_messages=True
)

tools = [Calculate_Life(), Calculate_Volume()]
 
# sys_msg = """Assistant is a large language model trained by OpenAI.

# Assistant is designed to be able to assist with a wide range of tasks, from answering simple questions to providing in-depth explanations and discussions on a wide range of topics. As a language model, Assistant is able to generate human-like text based on the input it receives, allowing it to engage in natural-sounding conversations and provide responses that are coherent and relevant to the topic at hand.

# Assistant is constantly learning and improving, and its capabilities are constantly evolving. It is able to process and understand large amounts of text, and can use this knowledge to provide accurate and informative responses to a wide range of questions. Additionally, Assistant is able to generate its own text based on the input it receives, allowing it to engage in discussions and provide explanations and descriptions on a wide range of topics.

# Unfortunately, Assistant is terrible at maths. When provided with math questions, no matter how simple, assistant always refers to it's trusty tools and absolutely does NOT try to answer math questions by itself

# Overall, Assistant is a powerful system that can help with a wide range of tasks and provide valuable insights and information on a wide range of topics. Whether you need help with a specific question or just want to have a conversation about a particular topic, Assistant is here to assist.
# """
sys_msg = (
    "Do not try to figure out math questions by yourself. You might be blindly confident "
    "about your own mathematical capacity, so always refer to your trusty tools and "
    "absolutely do NOT try to answer math questions by yourself."
)



agent = initialize_agent(
    agent='chat-conversational-react-description',
    # agent='zero-shot-react-description',
    tools=tools,
    llm=llm,
    verbose=True,
    max_iterations=3,
    early_stopping_method='generate',
    memory=conversational_memory
)


new_prompt = agent.agent.create_prompt(
    system_message=sys_msg,
    tools=tools
)

agent.agent.llm_chain.prompt = new_prompt
# update the agent tools
agent.tools = tools

# agent("How much is the happiness of life when RUL value is equal to 2")#单个参数提问
agent("Given [length=10000,width=2,height=3],what's the size of this box?")#多个参数提问

Run result:

(Screenshot: verbose run output of the agent.)

Later posts will focus on structured output, which makes it easier to expose the results as an interface to other downstream applications.
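
As a rough preview (a minimal sketch, not the final design), the tool could return a JSON string instead of a free-text sentence so that downstream services can parse the result directly; Calculate_Volume_JSON is a name made up for illustration:

import json

class Calculate_Volume_JSON(Calculate_Volume):
    name = "Calculate Object Volume (JSON)"

    def _run(self, length=None, width=None, height=None):
        if length is not None and width is not None and height is not None:
            # machine-readable payload instead of a sentence
            return json.dumps({"length": length, "width": width,
                               "height": height, "volume": length * width * height})
        return json.dumps({"error": "length, width and height are all required"})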

Related links:

Building Custom Tools for LLM Agents | Pinecone (pinecone.io)
