A prompt generally contains the following parts:
Instructions: the overall structure and the model's persona
Instructions tell the model what to do, how to use external information if provided, what to do with the query, and how to construct the output.
External information: additional information supplied to the model
External information or context(s) act as an additional source of knowledge for the model. These can be manually inserted into the prompt, retrieved via a vector database (retrieval augmentation), or pulled in via other means (APIs, calculations, etc.).
User input or query: the question entered by the user
User input or query is typically (but not always) a query input into the system by a human user (the prompter).
Output indicator: what kind of result the model should produce
Output indicator marks the beginning of the to-be-generated text. If generating Python code, we may use `import` to indicate to the model that it must begin writing Python code (as most Python scripts begin with `import`).
# the prompt
prompt = """Answer the question based on the context below.
If the question cannot be answered using the information provided, answer with "I don't know".
Context: Large Language Models (LLMs) are the latest models used in NLP. Their superior performance over smaller models has made them incredibly useful for developers building NLP enabled applications.
These models can be accessed via Hugging Face's `transformers` library, via OpenAI using the `openai` library, and via Cohere using the `cohere` library.
Question: Which libraries and model providers offer LLMs?
Answer: """
from langchain.llms import OpenAI

# initialize the model
openai = OpenAI(
    model_name="text-davinci-003",
    openai_api_key="YOUR_API_KEY"
)

print(openai(prompt))
>>>Hugging Face's `transformers` library, OpenAI using the `openai` library, and Cohere using the `cohere` library.
We can abstract the question into a parameter while treating the persona, external information, and answer format as constants. This gives us a prompt template:
from langchain import FewShotPromptTemplate, PromptTemplate
template = """Answer the question based on the context below. If the question cannot be answered using the information provided, answer with "I don't know".
Context: Large Language Models (LLMs) are the latest models used in NLP. Their superior performance over smaller models has made them incredibly useful for developers building NLP enabled applications.
These models can be accessed via Hugging Face's `transformers` library, via OpenAI using the `openai` library, and via Cohere using the `cohere` library.
Question: {query}

Answer: """
# the question is a parameter; the persona, external information, and answer format are constants
prompt_template = PromptTemplate(
    input_variables=["query"],
    template=template
)
# we only need to supply the query to the template
print(openai(
    prompt_template.format(query="Which libraries and model providers offer LLMs?")
))
>>>Hugging Face's `transformers` library, OpenAI using the `openai` library, and Cohere using the `cohere` library.
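Under the hood, `PromptTemplate.format` is essentially plain Python string substitution: the template text stays fixed and only the input variables are filled in. A minimal sketch of that idea without LangChain (the template string is shortened here for brevity):

```python
# a minimal sketch of what a prompt template does:
# substitute the input variables into a fixed template string
template = """Answer the question based on the context below.

Question: {query}

Answer: """

def format_prompt(template: str, **inputs: str) -> str:
    # fill the named placeholders with the supplied values
    return template.format(**inputs)

prompt = format_prompt(template, query="Which libraries and model providers offer LLMs?")
print(prompt)
```

The constant parts (instructions, context, output indicator) live in the template; only the query changes between calls.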
The model's knowledge comes from two sources: what it acquired during training, and the extra information supplied in the input.
FewShotPromptTemplate injects additional information (examples) into the input, giving the model more knowledge to work with.
# a complete prompt providing the persona and answer style,
# which can be split into parts: instructions (persona), examples, query, and output indicator
prompt = """The following are excerpts from conversations with an AI assistant.
The assistant is typically sarcastic and witty, producing creative and funny responses to the user's questions. Here are some examples:
User: How are you?
AI: I can't complain but sometimes I still do.
User: What time is it?
AI: It's time to get a watch.
User: What is the meaning of life?
AI: """
# a higher temperature value makes the model's output more random
openai.temperature = 1.0 # increase creativity/randomness of output
# the examples (knowledge) provided to the model
examples = [
    {
        "query": "How are you?",
        "answer": "I can't complain but sometimes I still do."
    }, {
        "query": "What time is it?",
        "answer": "It's time to get a watch."
    }
]
# create a example template
example_template = """
User: {query}
AI: {answer}
"""
# create a prompt example from above template
example_prompt = PromptTemplate(
    input_variables=["query", "answer"],
    template=example_template
)
# the prefix is our instructions (the model's persona)
prefix = """The following are excerpts from conversations with an AI assistant.
The assistant is typically sarcastic and witty, producing creative and funny responses to the user's questions.
Here are some examples: """
# and the suffix our user input and output indicator
suffix = """
User: {query}
AI: """
# now create the few shot prompt template
few_shot_prompt_template = FewShotPromptTemplate(
    examples=examples,
    example_prompt=example_prompt,
    prefix=prefix,
    suffix=suffix,
    input_variables=["query"],
    example_separator="\n\n"
)
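Conceptually, FewShotPromptTemplate joins the pieces in order: the prefix, each example rendered through the example template, and finally the suffix with the user's query, separated by example_separator. A minimal sketch of that assembly without LangChain (strings shortened for brevity):

```python
# a minimal sketch of how a few-shot prompt is assembled:
# prefix + formatted examples + suffix, joined by a separator
prefix = ("The following are excerpts from conversations with an AI assistant.\n"
          "Here are some examples:")
suffix = "User: {query}\nAI: "
example_template = "User: {query}\nAI: {answer}"
examples = [
    {"query": "How are you?", "answer": "I can't complain but sometimes I still do."},
    {"query": "What time is it?", "answer": "It's time to get a watch."},
]

separator = "\n\n"
parts = (
    [prefix]
    + [example_template.format(**ex) for ex in examples]
    + [suffix.format(query="What is the meaning of life?")]
)
few_shot_prompt = separator.join(parts)
print(few_shot_prompt)
```

With LangChain itself, the equivalent is `few_shot_prompt_template.format(query="What is the meaning of life?")`, whose result can then be passed to the model just like the earlier templates.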
Reference:
https://www.pinecone.io/learn/series/langchain/langchain-prompt-templates/