
Series index: LangChain Tutorial - Series Articles

LangChain provides a flexible and powerful expression language, the LangChain Expression Language (LCEL), for building complex logic chains. By composing different runnable objects, LCEL supports advanced patterns such as sequential chains, nested chains, parallel chains, routing, and dynamic construction, covering the needs of a wide range of scenarios. This article walks through each of these capabilities and how to implement them.

Sequential chains

The core capability of LCEL is composing runnables in sequence, where the output of one runnable is automatically passed to the next as input. A sequential chain can be built either with the pipe operator (|) or with the explicit .pipe() method.

Here is a simple example:

```python
from langchain_ollama import OllamaLLM
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

model = OllamaLLM(model="qwen2.5:0.5b")
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")

chain = prompt | model | StrOutputParser()

result = chain.invoke({"topic": "bears"})
print(result)
```

Output:

```
Here's a bear joke for you:

Why did the bear dissolve in water?
Because it was a polar bear!
```

In this example, the prompt template formats the input into the chat model's expected input, the chat model generates the joke, and the output parser finally converts the result into a string.
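The prose above mentions that the explicit .pipe() method can be used in place of the | operator, but only the operator form is demonstrated. Here is a minimal sketch of the equivalent .pipe() composition; it reuses the `prompt` and `model` defined above, and the name `piped_chain` is ours, not from the original post:

```python
from langchain_core.output_parsers import StrOutputParser

# Equivalent to: chain = prompt | model | StrOutputParser()
# Each .pipe() call appends the next runnable to the sequence.
piped_chain = prompt.pipe(model).pipe(StrOutputParser())

print(piped_chain.invoke({"topic": "bears"}))
```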
Nested chains

Nested chains let us combine multiple chains to build more complex logic. For example, the joke-generating chain can be composed with a second chain that assesses how funny the joke is.

```python
analysis_prompt = ChatPromptTemplate.from_template("is this a funny joke? {joke}")
composed_chain = {"joke": chain} | analysis_prompt | model | StrOutputParser()

result = composed_chain.invoke({"topic": "bears"})
print(result)
```

Output:

```
Haha, that's a clever play on words! Using "polar" to imply the bear dissolved or became polar/polarized when put in water. Not the most hilarious joke ever, but it has a cute, groan-worthy pun that makes it mildly amusing.
```

Parallel chains

RunnableParallel runs multiple chains in parallel and combines each chain's result into a dictionary. This is useful when several tasks need to be processed at the same time.

```python
from langchain_core.runnables import RunnableParallel

joke_chain = ChatPromptTemplate.from_template("tell me a joke about {topic}") | model
poem_chain = ChatPromptTemplate.from_template("write a 2-line poem about {topic}") | model

parallel_chain = RunnableParallel(joke=joke_chain, poem=poem_chain)

result = parallel_chain.invoke({"topic": "bear"})
print(result)
```

Output:

```
{'joke': "Why don't bears like fast food? Because they can't catch it!",
 'poem': 'In the quiet of the forest, the bear roams free\nMajestic and wild, a sight to see.'}
```
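Note that the `{"joke": chain}` dictionary used in the nested-chain example is shorthand for this same mechanism: a plain dict placed inside a chain is coerced into a RunnableParallel. A minimal sketch of the explicit form, reusing `chain`, `analysis_prompt`, and `model` from the earlier examples (the name `explicit_composed_chain` is ours, not from the original post):

```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.runnables import RunnableParallel

# {"joke": chain} | ... is coerced into RunnableParallel(joke=chain) | ...,
# so the composed chain can also be written explicitly:
explicit_composed_chain = (
    RunnableParallel(joke=chain) | analysis_prompt | model | StrOutputParser()
)

print(explicit_composed_chain.invoke({"topic": "bears"}))
```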
Routing

Routing dynamically selects which sub-chain to execute based on the input. LCEL provides two ways to implement it.

Using a custom function

Dynamic routing can be implemented with RunnableLambda:

```python
from langchain_core.prompts import PromptTemplate
from langchain_core.runnables import RunnableLambda

chain = (
    PromptTemplate.from_template(
        """Given the user question below, classify it as either being about `LangChain`, `Anthropic`, or `Other`.

Do not respond with more than one word.

<question>
{question}
</question>

Classification:"""
    )
    | OllamaLLM(model="qwen2.5:0.5b")
    | StrOutputParser()
)

langchain_chain = PromptTemplate.from_template(
    """You are an expert in langchain. \
Always answer questions starting with "As Harrison Chase told me". \
Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")

anthropic_chain = PromptTemplate.from_template(
    """You are an expert in anthropic. \
Always answer questions starting with "As Dario Amodei told me". \
Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")

general_chain = PromptTemplate.from_template(
    """Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")


def route(info):
    if "anthropic" in info["topic"].lower():
        return anthropic_chain
    elif "langchain" in info["topic"].lower():
        return langchain_chain
    else:
        return general_chain


full_chain = {"topic": chain, "question": lambda x: x["question"]} | RunnableLambda(route)

result = full_chain.invoke({"question": "how do I use LangChain?"})
print(result)
```

Using RunnableBranch

RunnableBranch selects a branch by evaluating conditions in order:

```python
from langchain_core.runnables import RunnableBranch

branch = RunnableBranch(
    (lambda x: "anthropic" in x["topic"].lower(), anthropic_chain),
    (lambda x: "langchain" in x["topic"].lower(), langchain_chain),
    general_chain,
)

full_chain = {"topic": chain, "question": lambda x: x["question"]} | branch
result = full_chain.invoke({"question": "how do I use Anthropic?"})
print(result)
```

Dynamic construction

Dynamic construction generates parts of the chain at runtime based on the input. It relies on the return-value mechanism of RunnableLambda: the wrapped function can return a new Runnable.

```python
from operator import itemgetter

from langchain_core.runnables import chain, RunnablePassthrough

llm = OllamaLLM(model="qwen2.5:0.5b")

contextualize_instructions = """Convert the latest user question into a standalone question given the chat history. Don't answer the question, return the question and nothing else (no descriptive text)."""
contextualize_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", contextualize_instructions),
        ("placeholder", "{chat_history}"),
        ("human", "{question}"),
    ]
)
contextualize_question = contextualize_prompt | llm | StrOutputParser()


@chain
def contextualize_if_needed(input_: dict):
    if input_.get("chat_history"):
        return contextualize_question
    else:
        return RunnablePassthrough() | itemgetter("question")


@chain
def fake_retriever(input_: dict):
    return "egypt's population in 2024 is about 111 million"


qa_instructions = (
    """Answer the user question given the following context:\n\n{context}."""
)
qa_prompt = ChatPromptTemplate.from_messages(
    [("system", qa_instructions), ("human", "{question}")]
)

full_chain = (
    RunnablePassthrough.assign(question=contextualize_if_needed).assign(
        context=fake_retriever
    )
    | qa_prompt
    | llm
    | StrOutputParser()
)

result = full_chain.invoke(
    {
        "question": "what about egypt",
        "chat_history": [
            ("human", "what's the population of indonesia"),
            ("ai", "about 276 million"),
        ],
    }
)
print(result)
```

Output:

```
According to the context provided, Egypt's population in 2024 is estimated to be about 111 million.
```
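The mechanism that makes this work is that a function wrapped with `@chain` (a RunnableLambda under the hood) may return another Runnable, which is then invoked with the same input. A minimal toy sketch of just that behavior, with no LLM involved; the names `upper`, `lower`, and `shout_or_whisper` are invented for illustration and are not part of the original post:

```python
from langchain_core.runnables import RunnableLambda, chain

upper = RunnableLambda(lambda x: x["text"].upper())
lower = RunnableLambda(lambda x: x["text"].lower())


@chain
def shout_or_whisper(input_: dict):
    # Returning a Runnable causes it to be invoked with the same input,
    # so the shape of the chain is decided at runtime.
    return upper if input_.get("shout") else lower


print(shout_or_whisper.invoke({"text": "Hello", "shout": True}))   # HELLO
print(shout_or_whisper.invoke({"text": "Hello", "shout": False}))  # hello
```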
Complete code example

```python
from operator import itemgetter

from langchain_ollama import OllamaLLM
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

print("\n-----------------------------------\n")

# Simple demo
model = OllamaLLM(model="qwen2.5:0.5b")
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")

chain = prompt | model | StrOutputParser()

result = chain.invoke({"topic": "bears"})
print(result)

print("\n-----------------------------------\n")

# Compose demo
analysis_prompt = ChatPromptTemplate.from_template("is this a funny joke? {joke}")
composed_chain = {"joke": chain} | analysis_prompt | model | StrOutputParser()

result = composed_chain.invoke({"topic": "bears"})
print(result)

print("\n-----------------------------------\n")

# Parallel demo
from langchain_core.runnables import RunnableParallel

joke_chain = ChatPromptTemplate.from_template("tell me a joke about {topic}") | model
poem_chain = ChatPromptTemplate.from_template("write a 2-line poem about {topic}") | model

parallel_chain = RunnableParallel(joke=joke_chain, poem=poem_chain)

result = parallel_chain.invoke({"topic": "bear"})
print(result)

print("\n-----------------------------------\n")

# Route demo
from langchain_core.prompts import PromptTemplate
from langchain_core.runnables import RunnableLambda

chain = (
    PromptTemplate.from_template(
        """Given the user question below, classify it as either being about `LangChain`, `Anthropic`, or `Other`.

Do not respond with more than one word.

<question>
{question}
</question>

Classification:"""
    )
    | OllamaLLM(model="qwen2.5:0.5b")
    | StrOutputParser()
)

langchain_chain = PromptTemplate.from_template(
    """You are an expert in langchain. \
Always answer questions starting with "As Harrison Chase told me". \
Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")

anthropic_chain = PromptTemplate.from_template(
    """You are an expert in anthropic. \
Always answer questions starting with "As Dario Amodei told me". \
Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")

general_chain = PromptTemplate.from_template(
    """Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")


def route(info):
    if "anthropic" in info["topic"].lower():
        return anthropic_chain
    elif "langchain" in info["topic"].lower():
        return langchain_chain
    else:
        return general_chain


full_chain = {"topic": chain, "question": lambda x: x["question"]} | RunnableLambda(route)

result = full_chain.invoke({"question": "how do I use LangChain?"})
print(result)

print("\n-----------------------------------\n")

# Branch demo
from langchain_core.runnables import RunnableBranch

branch = RunnableBranch(
    (lambda x: "anthropic" in x["topic"].lower(), anthropic_chain),
    (lambda x: "langchain" in x["topic"].lower(), langchain_chain),
    general_chain,
)

full_chain = {"topic": chain, "question": lambda x: x["question"]} | branch
result = full_chain.invoke({"question": "how do I use Anthropic?"})
print(result)

print("\n-----------------------------------\n")

# Dynamic demo
from langchain_core.runnables import chain, RunnablePassthrough

llm = OllamaLLM(model="qwen2.5:0.5b")

contextualize_instructions = """Convert the latest user question into a standalone question given the chat history. Don't answer the question, return the question and nothing else (no descriptive text)."""
contextualize_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", contextualize_instructions),
        ("placeholder", "{chat_history}"),
        ("human", "{question}"),
    ]
)
contextualize_question = contextualize_prompt | llm | StrOutputParser()


@chain
def contextualize_if_needed(input_: dict):
    if input_.get("chat_history"):
        return contextualize_question
    else:
        return RunnablePassthrough() | itemgetter("question")


@chain
def fake_retriever(input_: dict):
    return "egypt's population in 2024 is about 111 million"


qa_instructions = (
    """Answer the user question given the following context:\n\n{context}."""
)
qa_prompt = ChatPromptTemplate.from_messages(
    [("system", qa_instructions), ("human", "{question}")]
)

full_chain = (
    RunnablePassthrough.assign(question=contextualize_if_needed).assign(
        context=fake_retriever
    )
    | qa_prompt
    | llm
    | StrOutputParser()
)

result = full_chain.invoke(
    {
        "question": "what about egypt",
        "chat_history": [
            ("human", "what's the population of indonesia"),
            ("ai", "about 276 million"),
        ],
    }
)
print(result)

print("\n-----------------------------------\n")
```

J-LangChain implementation of the examples above: J-LangChain - Building Intelligent Chains

Summary

LangChain's LCEL gives developers powerful tools for building complex language tasks through sequential chains, nested chains, parallel chains, routing, and dynamic construction. Whether the flow is a simple pipeline or involves complex runtime decisions, LCEL can express it efficiently. Used appropriately, these features let developers quickly assemble efficient, flexible chains to support applications across a wide range of scenarios.