RAG issue | loading the model in RetrievalQAWithSourcesChain
#2
by iammano - opened
Hi there,
I'm getting the error below when I try to pass the model to the `llm` parameter of LangChain's RetrievalQAWithSourcesChain. Here is my code:
```python
def __init__(self, **kwargs):
    """
    Initializes the QA bot with a Qdrant client, embeddings, and a language model.

    Parameters:
    - **kwargs: Additional keyword arguments for configuration.
    """
    self.client = QdrantClient(url=HOST_NAME, api_key=API_KEY)
    self.embedding = get_embedding()
    self.device = "cuda:0"
    if torch.cuda.is_available():
        print("Inside GPU")
        self.llm = AutoModelForCausalLM.from_pretrained(
            HF_MODEL,
            low_cpu_mem_usage=True,
            device_map=self.device,
        )
    else:
        self.llm = CTransformers(model=MODEL_PATH,
                                 model_type=MODEL_TYPE,
                                 config=MODEL_PARAMETERS)
```
```python
def answer(self, query):
    qa = RetrievalQAWithSourcesChain.from_chain_type(
        llm=self.llm,
        retriever=retriever,
        chain_type_kwargs={
            "prompt": my_prompt
        },
        device_map=self.device,
        reduce_k_below_max_tokens=True,
        return_source_documents=True)
```
These are methods of a class; I've only shared partial code here. Thanks for your understanding.
Error Message:
```
Traceback (most recent call last):
  File "/home/manoranjan.n/doc_test/backend/main.py", line 68, in ask_model
    result = qabot.answer(data['question'])
  File "/home/manoranjan.n/doc_test/backend/qa_bot.py", line 227, in answer
    qa = RetrievalQAWithSourcesChain.from_chain_type(
  File "/home/manoranjan.n/.local/lib/python3.8/site-packages/langchain/chains/qa_with_sources/base.py", line 85, in from_chain_type
    combine_documents_chain = load_qa_with_sources_chain(
  File "/home/manoranjan.n/.local/lib/python3.8/site-packages/langchain/chains/qa_with_sources/loading.py", line 183, in load_qa_with_sources_chain
    return _func(llm, verbose=verbose, **kwargs)
  File "/home/manoranjan.n/.local/lib/python3.8/site-packages/langchain/chains/qa_with_sources/loading.py", line 62, in _load_stuff_chain
    llm_chain = LLMChain(llm=llm, prompt=prompt, verbose=verbose)
  File "/home/manoranjan.n/.local/lib/python3.8/site-packages/langchain_core/load/serializable.py", line 107, in __init__
    super().__init__(**kwargs)
  File "pydantic/main.py", line 341, in pydantic.main.BaseModel.__init__
pydantic.error_wrappers.ValidationError: 2 validation errors for LLMChain
llm
  instance of Runnable expected (type=type_error.arbitrary_type; expected_arbitrary_type=Runnable)
llm
  instance of Runnable expected (type=type_error.arbitrary_type; expected_arbitrary_type=Runnable)
```
It's some kind of validation error. Please help me with this.
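For context, the error seems to say that `LLMChain` expects a LangChain `Runnable`, and a raw `transformers` `AutoModelForCausalLM` isn't one (the `CTransformers` branch works because it is already a LangChain LLM). Below is a sketch of what I think the GPU branch might need instead, wrapping the model in `HuggingFacePipeline` from `langchain_community` — not sure this is the right approach. It assumes my `HF_MODEL` constant and a hypothetical `max_new_tokens` choice:

```python
# Sketch: wrap the raw transformers model so LangChain chains accept it.
# HF_MODEL is defined elsewhere in my config; max_new_tokens=512 is a guess.
from transformers import AutoModelForCausalLM, AutoTokenizer, pipeline
from langchain_community.llms import HuggingFacePipeline

tokenizer = AutoTokenizer.from_pretrained(HF_MODEL)
model = AutoModelForCausalLM.from_pretrained(
    HF_MODEL,
    low_cpu_mem_usage=True,
    device_map="cuda:0",
)
text_gen = pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    max_new_tokens=512,
)
# HuggingFacePipeline is a Runnable, so it should pass LLMChain's validation.
self.llm = HuggingFacePipeline(pipeline=text_gen)
```

If that's correct, the `device_map=self.device` argument to `from_chain_type` would presumably also be unnecessary, since device placement already happened at model load time.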