
Detailed Introduction: LangChain Few-Shot Prompt Templates (Part One)

https://python.langchain.com.cn/docs/modules/model_io/prompts/prompt_templates/few_shot_examples

I. First Clarify the Core Objective of the Original Text (Use Case)

The original text defines the task at the very beginning: configure few-shot examples for “question-answering with search”. In simple terms, this means enabling the LLM to first determine “whether an intermediate follow-up question is needed” when faced with a problem, then derive the final answer step by step (e.g., when asked “Who lived longer?”, first follow up with “How old was each person when they died?” before comparing).
All code and steps are designed to achieve this task.

II. Part 1: Use the Example Set Directly (First Major Section of the Original Text)

The first part of the original text shows how to use few-shot examples by passing the full example set directly into the prompt template: every example is formatted and combined with the final question. It is divided into 3 specific steps, with each step corresponding to the original code.

Step 1: Create a Few-Shot Example Set (Core Code from the Original Text)

The original text states: “Each example is a dictionary containing input variables”. Here, the input variables are question (the question) and answer (the answer with follow-up steps). The code is copied exactly from the original text:

# Original code: Few-shot example list (4 question-answering examples)
from langchain.prompts.few_shot import FewShotPromptTemplate
from langchain.prompts.prompt import PromptTemplate

examples = [
    {
        "question": "Who lived longer, Muhammad Ali or Alan Turing?",
        "answer": """
Are follow up questions needed here: Yes.
Follow up: How old was Muhammad Ali when he died?
Intermediate answer: Muhammad Ali was 74 years old when he died.
Follow up: How old was Alan Turing when he died?
Intermediate answer: Alan Turing was 41 years old when he died.
So the final answer is: Muhammad Ali
""",
    },
    {
        "question": "When was the founder of craigslist born?",
        "answer": """
Are follow up questions needed here: Yes.
Follow up: Who was the founder of craigslist?
Intermediate answer: Craigslist was founded by Craig Newmark.
Follow up: When was Craig Newmark born?
Intermediate answer: Craig Newmark was born on December 6, 1952.
So the final answer is: December 6, 1952
""",
    },
    {
        "question": "Who was the maternal grandfather of George Washington?",
        "answer": """
Are follow up questions needed here: Yes.
Follow up: Who was the mother of George Washington?
Intermediate answer: The mother of George Washington was Mary Ball Washington.
Follow up: Who was the father of Mary Ball Washington?
Intermediate answer: The father of Mary Ball Washington was Joseph Ball.
So the final answer is: Joseph Ball
""",
    },
    {
        "question": "Are both the directors of Jaws and Casino Royale from the same country?",
        "answer": """
Are follow up questions needed here: Yes.
Follow up: Who is the director of Jaws?
Intermediate Answer: The director of Jaws is Steven Spielberg.
Follow up: Where is Steven Spielberg from?
Intermediate Answer: The United States.
Follow up: Who is the director of Casino Royale?
Intermediate Answer: The director of Casino Royale is Martin Campbell.
Follow up: Where is Martin Campbell from?
Intermediate Answer: New Zealand.
So the final answer is: No
""",
    },
]
  • Key note from the original text: The question and answer in each dictionary are “fixed input variable names”. The subsequent template will use these names to retrieve data, so they cannot be changed arbitrarily.
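The name-matching requirement can be illustrated with plain Python string formatting (`str.format` here is a stand-in for what the LangChain template does internally, not the library API itself):

```python
# A single example dictionary with the fixed keys "question" and "answer"
example = {
    "question": "Who lived longer, Muhammad Ali or Alan Turing?",
    "answer": "So the final answer is: Muhammad Ali",
}

# The template refers to the keys by name; renaming a key breaks formatting
template = "Question: {question}\n{answer}"
print(template.format(**example))

# Renaming "question" to "q" raises a KeyError, because the template
# still looks for a variable named "question"
try:
    template.format(q=example["question"], answer=example["answer"])
except KeyError as exc:
    print(f"Missing variable: {exc}")  # prints: Missing variable: 'question'
```

This is why the dictionary keys cannot be renamed without also changing the template that consumes them.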

Step 2: Create a “Formatter for Individual Examples” (Called example_prompt in the Original Text)

The original text explains that this step is to “define how a single example should be displayed”—converting each dictionary (containing question + answer) in examples into text in a fixed format. The code is copied exactly from the original text:

# Original code: Formatter template for individual examples
example_prompt = PromptTemplate(
    input_variables=["question", "answer"],  # Must match the keys in the example dictionaries
    template="Question: {question}\n{answer}",  # Display format for a single example: question first, then answer
)

# Original test: Print the formatted result of the first example
print(example_prompt.format(**examples[0]))
  • Expected output from the original text (what you will see):
Question: Who lived longer, Muhammad Ali or Alan Turing?

Are follow up questions needed here: Yes.
Follow up: How old was Muhammad Ali when he died?
Intermediate answer: Muhammad Ali was 74 years old when he died.
Follow up: How old was Alan Turing when he died?
Intermediate answer: Alan Turing was 41 years old when he died.
So the final answer is: Muhammad Ali
  • Explanation of a difficult point: **examples[0] is “dictionary unpacking”—it automatically passes question and answer from examples[0] to the input_variables of example_prompt. There is no need to manually write question=examples[0]["question"], answer=examples[0]["answer"].
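The equivalence can be checked with a plain Python function (a sketch of the unpacking mechanism, not LangChain code):

```python
def fmt(question, answer):
    """Stand-in for example_prompt.format: takes the same keyword arguments."""
    return f"Question: {question}\n{answer}"

example = {
    "question": "When was the founder of craigslist born?",
    "answer": "So the final answer is: December 6, 1952",
}

# These two calls are equivalent: ** unpacks the dictionary keys
# into keyword arguments with the same names
a = fmt(**example)
b = fmt(question=example["question"], answer=example["answer"])
assert a == b
print(a)
```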

Step 3: Create a Few-Shot Prompt Template (FewShotPromptTemplate)

The original text states that this step “combines the example set, formatter, and final question”. The code is copied exactly from the original text:

# Original code: Create a complete few-shot prompt template
prompt = FewShotPromptTemplate(
    examples=examples,  # All examples prepared in Step 1
    example_prompt=example_prompt,  # Formatter template for individual examples defined in Step 2
    suffix="Question: {input}",  # The "final question" after the examples ({input} is the question passed by the user)
    input_variables=["input"],  # Tell the template: the parameter the user needs to pass is "input" (i.e., the final question)
)

# Original test: Pass the final question and generate the complete prompt
print(prompt.format(input="Who was the father of Mary Ball Washington?"))
  • Expected output from the original text (what you will see):
    The template will automatically format all examples according to example_prompt, then append the question you passed:
Question: Who lived longer, Muhammad Ali or Alan Turing?

Are follow up questions needed here: Yes.
Follow up: How old was Muhammad Ali when he died?
Intermediate answer: Muhammad Ali was 74 years old when he died.
Follow up: How old was Alan Turing when he died?
Intermediate answer: Alan Turing was 41 years old when he died.
So the final answer is: Muhammad Ali

Question: When was the founder of craigslist born?

Are follow up questions needed here: Yes.
Follow up: Who was the founder of craigslist?
Intermediate answer: Craigslist was founded by Craig Newmark.
Follow up: When was Craig Newmark born?
Intermediate answer: Craig Newmark was born on December 6, 1952.
So the final answer is: December 6, 1952

Question: Who was the maternal grandfather of George Washington?

Are follow up questions needed here: Yes.
Follow up: Who was the mother of George Washington?
Intermediate answer: The mother of George Washington was Mary Ball Washington.
Follow up: Who was the father of Mary Ball Washington?
Intermediate answer: The father of Mary Ball Washington was Joseph Ball.
So the final answer is: Joseph Ball

Question: Are both the directors of Jaws and Casino Royale from the same country?

Are follow up questions needed here: Yes.
Follow up: Who is the director of Jaws?
Intermediate Answer: The director of Jaws is Steven Spielberg.
Follow up: Where is Steven Spielberg from?
Intermediate Answer: The United States.
Follow up: Who is the director of Casino Royale?
Intermediate Answer: The director of Casino Royale is Martin Campbell.
Follow up: Where is Martin Campbell from?
Intermediate Answer: New Zealand.
So the final answer is: No

Question: Who was the father of Mary Ball Washington?
  • Key note from the original text: suffix refers to the “suffix of the examples”—i.e., the question to be asked after all examples are displayed. input_variables=["input"] means you must pass the input parameter (your question) when calling format.
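The assembly logic can be sketched in a few lines of plain Python (a rough miniature of what the few-shot template does when you call format, not the actual library implementation):

```python
# Two shortened stand-in examples (the real ones carry the full follow-up chains)
examples = [
    {"question": "Who lived longer, Muhammad Ali or Alan Turing?",
     "answer": "So the final answer is: Muhammad Ali"},
    {"question": "When was the founder of craigslist born?",
     "answer": "So the final answer is: December 6, 1952"},
]
example_template = "Question: {question}\n{answer}"
suffix = "Question: {input}"

def build_prompt(user_input):
    # Format each example, join them, then append the suffix
    # with the user's question substituted for {input}
    blocks = [example_template.format(**ex) for ex in examples]
    return "\n\n".join(blocks + [suffix.format(input=user_input)])

print(build_prompt("Who was the father of Mary Ball Washington?"))
```

The output ends with the bare final question, which is exactly the position where the LLM is expected to continue writing.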

III. Part 2: Use an Example Selector (Second Major Section of the Original Text)

The original text describes this as an “advanced usage”: instead of including all examples in the prompt, it “selects the most similar examples to the user’s question” (reducing prompt length and helping the LLM focus). It is divided into 3 steps, with each step corresponding exactly to the original code.

Step 1: Create an Example Selector (SemanticSimilarityExampleSelector)

The original text uses “semantic similarity” to select examples (e.g., if the user asks “Who was Mary Ball Washington’s father?”, it selects the example about “George Washington’s maternal grandfather” since they are most relevant). The code is copied exactly from the original text:

# Original code: Import required tools (vector store, embedding model, example selector)
from langchain.prompts.example_selector import SemanticSimilarityExampleSelector
from langchain.vectorstores import Chroma
from langchain.embeddings import OpenAIEmbeddings

# Original code: Create the example selector
example_selector = SemanticSimilarityExampleSelector.from_examples(
    examples,  # Still the example set from Step 1 (all selectable examples)
    OpenAIEmbeddings(),  # OpenAI's embedding model (converts text into "numbers for similarity calculation")
    Chroma,  # Chroma vector store (stores the embedded numbers for easy similarity search)
    k=1,  # Select only the "1 most similar example"
)
  • Explanation of difficult points:
    • “Embedding model (OpenAIEmbeddings)”: converts text into a vector of numbers (e.g., “apple” becomes [0.1, 0.2, …]). The closer two vectors are, the closer the meanings of the texts.
    • “Chroma”: a vector store designed to hold these vectors, enabling quick identification of the example vectors most similar to the vector of the user’s question.
    • “k=1”: selects only the 1 most similar example (as used in the original text). Set k=2 to select the 2 most similar examples instead.

Step 2: Test the Example Selector (Select the Most Similar Example)

The original text uses the question “Who was the father of Mary Ball Washington?” to test which example is selected. The code is copied exactly from the original text:

# Original code: Define the test question
question = "Who was the father of Mary Ball Washington?"

# Original code: Select the most similar example based on the question
selected_examples = example_selector.select_examples({"question": question})

# Original code: Print the selected example
print(f"Examples most similar to the input: {question}")
for example in selected_examples:
    print("\n")
    for k, v in example.items():
        print(f"{k}: {v}")
  • Expected output from the original text (what you will see):
    The example about “George Washington’s maternal grandfather” will be selected, as both questions relate to “Mary Ball Washington’s relatives”:
Running Chroma using direct local API.
Using DuckDB in-memory for database. Data will be transient.
Examples most similar to the input: Who was the father of Mary Ball Washington?
question: Who was the maternal grandfather of George Washington?
answer:
Are follow up questions needed here: Yes.
Follow up: Who was the mother of George Washington?
Intermediate answer: The mother of George Washington was Mary Ball Washington.
Follow up: Who was the father of Mary Ball Washington?
Intermediate answer: The father of Mary Ball Washington was Joseph Ball.
So the final answer is: Joseph Ball

Step 3: Create a Few-Shot Template with the Example Selector

The original text states: “Replace the previous examples parameter with example_selector”—all other settings remain unchanged. The code is copied exactly from the original text:

# Original code: Create a few-shot template (using the example selector instead of all examples)
prompt = FewShotPromptTemplate(
    example_selector=example_selector,  # Example selector created in Step 1
    example_prompt=example_prompt,  # Still the formatter template for individual examples from Step 2
    suffix="Question: {input}",  # Still the format for the final question
    input_variables=["input"],  # The user still passes the "input" parameter
)

# Original test: Pass the question and generate the complete prompt
print(prompt.format(input="Who was the father of Mary Ball Washington?"))
  • Expected output from the original text (what you will see):
    This time, only “the 1 most similar example” is displayed (instead of all 4), with the question appended at the end:
Question: Who was the maternal grandfather of George Washington?

Are follow up questions needed here: Yes.
Follow up: Who was the mother of George Washington?
Intermediate answer: The mother of George Washington was Mary Ball Washington.
Follow up: Who was the father of Mary Ball Washington?
Intermediate answer: The father of Mary Ball Washington was Joseph Ball.
So the final answer is: Joseph Ball

Question: Who was the father of Mary Ball Washington?
  • Key note from the original text: The advantage of this approach is that “the prompt is shorter”—the LLM does not need to process irrelevant examples, leading to more accurate and faster responses.
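The “shorter prompt” point can be made concrete with a quick length comparison (the strings below are shortened stand-ins for the real formatted examples; the counts are illustrative only):

```python
# Toy stand-ins for the four formatted examples and the selected one
all_examples = [
    "example about Ali vs Turing",
    "example about craigslist",
    "example about George Washington",
    "example about Jaws and Casino Royale",
]
selected = ["example about George Washington"]
suffix = "Question: Who was the father of Mary Ball Washington?"

full_prompt = "\n\n".join(all_examples + [suffix])
short_prompt = "\n\n".join(selected + [suffix])

# Selecting one example always yields a prompt no longer than the full one
print(len(full_prompt), len(short_prompt))
assert len(short_prompt) < len(full_prompt)
```

With real examples (each carrying several follow-up lines) the saving is far larger than these toy strings suggest.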

IV. Core Summary of the Original Text (Extracted Exclusively from the Original Text)

  1. Two ways to use few-shot prompt templates:
    • Direct use of examples: Include all examples in the prompt (suitable for scenarios with few examples).
    • Use of example_selector: Select the most relevant examples based on semantic similarity (suitable for scenarios with many examples).
  2. 3 Essential Core Components:
    • Example set (examples): A list of dictionaries, where each dictionary contains keys corresponding to input_variables.
    • Individual example formatter template (example_prompt): Defines the display format of a single example.
    • Few-shot template (FewShotPromptTemplate): Combines the example set, formatter template, and final question.
  3. Core Tools for the Example Selector:
    • SemanticSimilarityExampleSelector: Selects examples based on semantics.
    • OpenAIEmbeddings: Converts text into numbers for similarity calculation.
    • Chroma: Stores numbers and performs similarity searches.

All of the core content above is drawn from the original text. If you have questions about any line of code, it can be broken down further!
