Describe the bug
Additional OpenAILike params such as is_chat_model are not supported; they are never passed through to the underlying LLM.
To Reproduce
Configure a generator node with the llama_index_llm module and set the is_chat_model param as follows:
```yaml
- node_type: generator
  modules:
    - batch: 2
      module_type: llama_index_llm
      llm: openailike
      model: qwen-plus
      is_chat_model: True
      api_base: https://dashscope.aliyuncs.com/compatible-mode/v1
      api_key: x
```
Running this produces a 404 error against the '/v1/completions' endpoint. The error occurs because the is_chat_model param is not passed to the model, so requests go to /v1/completions instead of /v1/chat/completions; if the server does not support the /v1/completions endpoint, the call fails.
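For comparison, a quick standalone check (a sketch, assuming llama-index-llms-openai-like is installed and reusing the values from the config above) suggests that OpenAILike itself accepts is_chat_model and reports it through its metadata, so the param is being lost inside AutoRAG rather than by llama_index:

```python
from llama_index.llms.openai_like import OpenAILike

llm = OpenAILike(
    model="qwen-plus",
    api_base="https://dashscope.aliyuncs.com/compatible-mode/v1",
    api_key="x",
    is_chat_model=True,  # should route requests to /v1/chat/completions
)
print(llm.metadata.is_chat_model)  # expected: True
```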
Additional context
The root cause may be that the llm_instance params are collected with pop_params(llm_class.__init__, ...) instead of pop_params(llm_class, ...).
autorag/nodes/generator/llama_index_llm.py, line 52, in the __init__ method:
```python
self.llm_instance: BaseLLM = llm_class(**pop_params(llm_class.__init__, kwargs))
```
When using OpenAILike, this only keeps the params accepted by OpenAI's __init__ and silently ignores the OpenAILike-specific params such as is_chat_model.
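A minimal sketch of the suspected mechanism (assuming pop_params filters kwargs by the callable's signature, e.g. via inspect.signature):

```python
import inspect
from llama_index.llms.openai_like import OpenAILike

# OpenAILike does not appear to declare its own __init__, so the inherited
# OpenAI.__init__ signature is what pop_params(llm_class.__init__, kwargs) sees,
# and that signature has no is_chat_model parameter.
print("is_chat_model" in inspect.signature(OpenAILike.__init__).parameters)  # likely False

# The option only exists as a Pydantic field on the class itself.
print("is_chat_model" in OpenAILike.model_fields)  # expected True (Pydantic v2; use __fields__ on v1)
```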
Possible fix
```python
if llm_class.class_name() == "OpenAILike":
    llm_openaiLike_params = pop_params(llm_class, kwargs)
    original_llm_params = pop_params(llm_class.__init__, kwargs)
    llm_params = {**llm_openaiLike_params, **original_llm_params}
    self.llm_instance: BaseLLM = llm_class(**llm_params)
```
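A possibly more general variant (an untested sketch, assuming pop_params also accepts a class and picks up its Pydantic fields) would apply the same merge for every llm_class, so any wrapper whose options are declared only as fields is covered, not just OpenAILike:

```python
# Merge params accepted by the class (e.g. Pydantic fields) with those of __init__;
# __init__ params take precedence, matching the fix above.
class_params = pop_params(llm_class, kwargs)
init_params = pop_params(llm_class.__init__, kwargs)
self.llm_instance: BaseLLM = llm_class(**{**class_params, **init_params})
```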