# LLM

## Overview

## How to use LCEL in RAGchain

### Basic Usage
```python
from operator import itemgetter

from langchain.schema import StrOutputParser
from langchain.schema.runnable import RunnableLambda
from langchain.llms.openai import OpenAI

from RAGchain.schema import Passage, RAGchainPromptTemplate
from RAGchain.retrieval import BM25Retrieval

prompt = RAGchainPromptTemplate.from_template("""
You have to answer the question using the given passages.

Question: {question}
Passages: {passages}
Answer:""")

bm25_retrieval = BM25Retrieval(save_path="your/path/bm25.pkl")

runnable = {
    "question": itemgetter("question"),
    # Retrieve passages for the question and render them into prompt text.
    "passages": itemgetter("question")
    | RunnableLambda(lambda question: Passage.make_prompts(bm25_retrieval.retrieve(question))),
} | prompt | OpenAI() | StrOutputParser()

answer = runnable.invoke({"question": "your question"})
```

### 1. Streaming answers
### 2. Chat History
### Custom Prompt
### Custom LLM