Introduction
j-langchain is a Java implementation of the LangChain development framework, designed to simplify and accelerate the development of large-model applications on the Java platform. It provides a set of practical tools and classes that make it easier for developers to build LangChain-style Java applications.
Dependencies
Maven
<dependency>
    <groupId>io.github.flower-trees</groupId>
    <artifactId>j-langchain</artifactId>
    <version>1.0.1-preview</version>
</dependency>

Gradle

implementation 'io.github.flower-trees:j-langchain:1.0.1-preview'

Notes:
The library is built on top of the salt-function-flow flow-orchestration framework; refer to that project for the specific orchestration syntax.
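All of the Java examples in the rest of this post assume a Spring container from which a j-langchain ChainActor bean can be injected. As a minimal orientation, here is a hypothetical wiring skeleton (the class name is made up, and the j-langchain package imports are omitted because the post does not show them):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;

// Hypothetical skeleton: each demo method shown in the sections below is assumed to live
// in a Spring component like this one, with ChainActor injected by the container.
@Component
public class JLangChainDemos {

    @Autowired
    ChainActor chainActor;

    // SimpleDemo(), SwitchDemo(), ComposeDemo(), ... from the sections below go here
    // and can be called from any other Spring-managed bean.
}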
Chain Construction

Sequential Invocation

LangChain implementation
from langchain_ollama import OllamaLLM
from langchain_core.prompts import ChatPromptTemplate
from langchain_core.output_parsers import StrOutputParser

model = OllamaLLM(model="qwen2.5:0.5b")
prompt = ChatPromptTemplate.from_template("tell me a joke about {topic}")

chain = prompt | model | StrOutputParser()

result = chain.invoke({"topic": "bears"})
print(result)

J-LangChain implementation
@Component
public class ChainBuildDemo {

    @Autowired
    ChainActor chainActor;

    public void SimpleDemo() {
        BaseRunnable<StringPromptValue, ?> prompt = PromptTemplate.fromTemplate("tell me a joke about ${topic}");
        ChatOpenAI chatOpenAI = ChatOpenAI.builder().model("gpt-4").build();

        FlowInstance chain = chainActor.builder()
                .next(prompt)
                .next(chatOpenAI)
                .next(new StrOutputParser())
                .build();

        ChatGeneration result = chainActor.invoke(chain, Map.of("topic", "bears"));
        System.out.println(result);
    }
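    // Sketch, not from the original post: the Python example above runs against a local
    // Ollama model, while SimpleDemo builds a ChatOpenAI client. Assuming the ChatOllama
    // builder used in later sections, an equivalent Ollama-backed chain would look like this.
    public void SimpleOllamaDemo() {
        BaseRunnable<StringPromptValue, ?> prompt = PromptTemplate.fromTemplate("tell me a joke about ${topic}");
        ChatOllama chatOllama = ChatOllama.builder().model("llama3:8b").build();

        FlowInstance chain = chainActor.builder()
                .next(prompt)
                .next(chatOllama)
                .next(new StrOutputParser())
                .build();

        ChatGeneration result = chainActor.invoke(chain, Map.of("topic", "bears"));
        System.out.println(result);
    }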
}

Branch Routing
J-LangChain implementation
public void SwitchDemo() {
    BaseRunnable<StringPromptValue, ?> prompt = PromptTemplate.fromTemplate("tell me a joke about ${topic}");
    ChatOllama chatOllama = ChatOllama.builder().model("llama3:8b").build();
    ChatOpenAI chatOpenAI = ChatOpenAI.builder().model("gpt-4").build();

    FlowInstance chain = chainActor.builder()
            .next(prompt)
            .next(
                Info.c("vendor == 'ollama'", chatOllama),
                Info.c("vendor == 'chatgpt'", chatOpenAI),
                Info.c(input -> "sorry, I don't know how to do that"))
            .next(new StrOutputParser())
            .build();

    Generation result = chainActor.invoke(chain, Map.of("topic", "bears", "vendor", "ollama"));
    System.out.println(result);
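    // Hypothetical follow-up call, not in the original post: with vendor set to "chatgpt",
    // the second Info.c condition matches and the request is handled by chatOpenAI instead.
    Generation viaOpenAI = chainActor.invoke(chain, Map.of("topic", "bears", "vendor", "chatgpt"));
    System.out.println(viaOpenAI);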
}

Composition and Nesting
LangChain implementation
analysis_prompt = ChatPromptTemplate.from_template("is this a funny joke? {joke}")

composed_chain = {"joke": chain} | analysis_prompt | model | StrOutputParser()

result = composed_chain.invoke({"topic": "bears"})
print(result)

J-LangChain implementation
public void ComposeDemo() {
    ChatOllama llm = ChatOllama.builder().model("llama3:8b").build();
    StrOutputParser parser = new StrOutputParser();

    BaseRunnable<StringPromptValue, ?> prompt = PromptTemplate.fromTemplate("tell me a joke about ${topic}");
    FlowInstance chain = chainActor.builder().next(prompt).next(llm).next(parser).build();

    BaseRunnable<StringPromptValue, ?> analysisPrompt = PromptTemplate.fromTemplate("is this a funny joke? ${joke}");

    FlowInstance analysisChain = chainActor.builder()
            .next(chain)
            .next(input -> Map.of("joke", ((Generation) input).getText()))
            .next(analysisPrompt)
            .next(llm)
            .next(parser)
            .build();

    ChatGeneration result = chainActor.invoke(analysisChain, Map.of("topic", "bears"));
    System.out.println(result);
}

Parallel Execution
LangChain implementation
from langchain_core.runnables import RunnableParallel

joke_chain = ChatPromptTemplate.from_template("tell me a joke about {topic}") | model
poem_chain = ChatPromptTemplate.from_template("write a 2-line poem about {topic}") | model

parallel_chain = RunnableParallel(joke=joke_chain, poem=poem_chain)

result = parallel_chain.invoke({"topic": "bear"})
print(result)

J-LangChain implementation
public void ParallelDemo() {
    ChatOllama llm = ChatOllama.builder().model("llama3:8b").build();

    BaseRunnable<StringPromptValue, ?> joke = PromptTemplate.fromTemplate("tell me a joke about ${topic}");
    BaseRunnable<StringPromptValue, ?> poem = PromptTemplate.fromTemplate("write a 2-line poem about ${topic}");

    FlowInstance jokeChain = chainActor.builder().next(joke).next(llm).build();
    FlowInstance poemChain = chainActor.builder().next(poem).next(llm).build();

    FlowInstance chain = chainActor.builder()
            .concurrent(
                (IResult<Map<String, String>>) (iContextBus, isTimeout) -> {
                    AIMessage jokeResult = iContextBus.getResult(jokeChain.getFlowId());
                    AIMessage poemResult = iContextBus.getResult(poemChain.getFlowId());
                    return Map.of("joke", jokeResult.getContent(), "poem", poemResult.getContent());
                },
                jokeChain, poemChain)
            .build();

    Map<String, String> result = chainActor.invoke(chain, Map.of("topic", "bears"));
    System.out.println(JsonUtil.toJson(result));
}

Dynamic Routing
LangChain implementation (dynamic routing via RunnableLambda)
from langchain_core.prompts import PromptTemplate
from langchain_core.runnables import RunnableLambda

chain = (
    PromptTemplate.from_template(
        """Given the user question below, classify it as either being about LangChain, Anthropic, or Other.

Do not respond with more than one word.

<question>
{question}
</question>

Classification:"""
    )
    | OllamaLLM(model="qwen2.5:0.5b")
    | StrOutputParser()
)

langchain_chain = PromptTemplate.from_template(
    """You are an expert in langchain. \
Always answer questions starting with "As Harrison Chase told me". \
Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")

anthropic_chain = PromptTemplate.from_template(
    """You are an expert in anthropic. \
Always answer questions starting with "As Dario Amodei told me". \
Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")

general_chain = PromptTemplate.from_template(
    """Respond to the following question:

Question: {question}
Answer:"""
) | OllamaLLM(model="qwen2.5:0.5b")

def route(info):
    if "anthropic" in info["topic"].lower():
        return anthropic_chain
    elif "langchain" in info["topic"].lower():
        return langchain_chain
    else:
        return general_chain

full_chain = {"topic": chain, "question": lambda x: x["question"]} | RunnableLambda(route)

result = full_chain.invoke({"question": "how do I use LangChain?"})
print(result)

J-LangChain implementation
public void RouteDemo() {
    ChatOllama llm = ChatOllama.builder().model("llama3:8b").build();

    BaseRunnable<StringPromptValue, Object> prompt = PromptTemplate.fromTemplate(
            """
            Given the user question below, classify it as either being about LangChain, Anthropic, or Other.

            Do not respond with more than one word.

            <question>
            ${question}
            </question>

            Classification:""");

    FlowInstance chain = chainActor.builder().next(prompt).next(llm).next(new StrOutputParser()).build();

    FlowInstance langchainChain = chainActor.builder()
            .next(PromptTemplate.fromTemplate(
                """
                You are an expert in langchain. \
                Always answer questions starting with "As Harrison Chase told me". \
                Respond to the following question:

                Question: ${question}
                Answer:"""))
            .next(ChatOllama.builder().model("llama3:8b").build())
            .build();

    FlowInstance anthropicChain = chainActor.builder()
            .next(PromptTemplate.fromTemplate(
                """
                You are an expert in anthropic. \
                Always answer questions starting with "As Dario Amodei told me". \
                Respond to the following question:

                Question: ${question}
                Answer:"""))
            .next(ChatOllama.builder().model("llama3:8b").build())
            .build();

    FlowInstance generalChain = chainActor.builder()
            .next(PromptTemplate.fromTemplate(
                """
                Respond to the following question:

                Question: ${question}
                Answer:"""))
            .next(ChatOllama.builder().model("llama3:8b").build())
            .build();

    FlowInstance fullChain = chainActor.builder()
            .next(chain)
            .next(input -> Map.of("topic", input, "question", ((Map<?, ?>) ContextBus.get().getFlowParam()).get("question")))
            .next(
                Info.c("topic == 'anthropic'", anthropicChain),
                Info.c("topic == 'langchain'", langchainChain),
                Info.c(generalChain))
            .build();

    AIMessage result = chainActor.invoke(fullChain, Map.of("question", "how do I use Anthropic?"));
    System.out.println(result.getContent());
}
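As a hypothetical follow-up (not in the original post), a second call added inside RouteDemo with a LangChain-related question would be expected to be classified as langchain by the first chain and routed to langchainChain instead:

// Hypothetical: the classifier chain is expected to emit "langchain" for this question,
// so the langchainChain branch handles it.
AIMessage langchainResult = chainActor.invoke(fullChain, Map.of("question", "how do I use LangChain?"));
System.out.println(langchainResult.getContent());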
Dynamic Construction

LangChain implementation
from operator import itemgetter
from langchain_core.runnables import chain, RunnablePassthrough

llm = OllamaLLM(model="qwen2.5:0.5b")

contextualize_instructions = """Convert the latest user question into a standalone question given the chat history. Don't answer the question, return the question and nothing else (no descriptive text)."""
contextualize_prompt = ChatPromptTemplate.from_messages(
    [
        ("system", contextualize_instructions),
        ("placeholder", "{chat_history}"),
        ("human", "{question}"),
    ]
)
contextualize_question = contextualize_prompt | llm | StrOutputParser()

@chain
def contextualize_if_needed(input_: dict):
    if input_.get("chat_history"):
        return contextualize_question
    else:
        return RunnablePassthrough() | itemgetter("question")

@chain
def fake_retriever(input_: dict):
    return "egypt's population in 2024 is about 111 million"

qa_instructions = (
    """Answer the user question given the following context:\n\n{context}."""
)
qa_prompt = ChatPromptTemplate.from_messages(
    [("system", qa_instructions), ("human", "{question}")]
)

full_chain = (
    RunnablePassthrough.assign(question=contextualize_if_needed).assign(
        context=fake_retriever
    )
    | qa_prompt
    | llm
    | StrOutputParser()
)

result = full_chain.invoke(
    {
        "question": "what about egypt",
        "chat_history": [
            ("human", "what's the population of indonesia"),
            ("ai", "about 276 million"),
        ],
    }
)
print(result)

J-LangChain implementation
public void DynamicDemo() {
    ChatOllama llm = ChatOllama.builder().model("llama3:8b").build();

    String contextualizeInstructions = "Convert the latest user question into a standalone question given the chat history. Don't answer the question, return the question and nothing else (no descriptive text).";
    BaseRunnable<ChatPromptValue, Object> contextualizePrompt = ChatPromptTemplate.fromMessages(
            List.of(
                Pair.of("system", contextualizeInstructions),
                Pair.of("placeholder", "${chatHistory}"),
                Pair.of("human", "${question}")));

    FlowInstance contextualizeQuestion = chainActor.builder()
            .next(contextualizePrompt)
            .next(llm)
            .next(new StrOutputParser())
            .build();

    FlowInstance contextualizeIfNeeded = chainActor.builder()
            .next(
                Info.c("chatHistory != null", contextualizeQuestion),
                Info.c(input -> Map.of("question", ((Map<String, String>) input).get("question"))))
            .build();

    String qaInstructions = "Answer the user question given the following context:\n\n${context}.";
    BaseRunnable<ChatPromptValue, Object> qaPrompt = ChatPromptTemplate.fromMessages(
            List.of(
                Pair.of("system", qaInstructions),
                Pair.of("human", "${question}")));

    FlowInstance fullChain = chainActor.builder()
            .all(
                (iContextBus, isTimeout) -> Map.of(
                        "question", iContextBus.getResult(contextualizeIfNeeded.getFlowId()).toString(),
                        "context", iContextBus.getResult("fakeRetriever")),
                Info.c(contextualizeIfNeeded),
                Info.c(input -> "egypt's population in 2024 is about 111 million").cAlias("fakeRetriever"))
            .next(qaPrompt)
            .next(input -> { System.out.println(JsonUtil.toJson(input)); return input; })
            .next(llm)
            .next(new StrOutputParser())
            .build();

    ChatGeneration result = chainActor.invoke(fullChain,
            Map.of(
                "question", "what about egypt",
                "chatHistory", List.of(
                        Pair.of("human", "what's the population of indonesia"),
                        Pair.of("ai", "about 276 million"))));
    System.out.println(result);
}
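A hypothetical extra invocation (not in the original post) could be added at the end of DynamicDemo to exercise the other branch of contextualizeIfNeeded; assuming the "chatHistory != null" condition fails when the key is absent, the question is passed through unchanged before the retriever context is attached:

// Hypothetical: no chatHistory supplied, so the default Info.c branch extracts the
// question as-is and the fake retriever context is still used to answer it.
ChatGeneration direct = chainActor.invoke(fullChain, Map.of("question", "what is the population of egypt"));
System.out.println(direct);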