Describe the bug
Using the QA creation script provided in the README, I tried to replace the OpenAI model with Gemini. After changing which model to use and my file paths, I ran the code and got this error: TypeError: ChatSession.send_message_async() got an unexpected keyword argument 'temperature'
To Reproduce
Steps to reproduce the behavior:
Use the Gemini LLM from LlamaIndex (llama-index-llms-gemini) in place of the OpenAI model in the README's QA creation script, then run the script.
Expected behavior
Normal execution.
Full error log
The error is raised in llama_index/llms/gemini/base.py:233, in achat.
Code where the bug happens
import pandas as pd
from llama_index.llms.gemini import Gemini
from autorag.data.qa.filter.dontknow import dontknow_filter_rule_based
from autorag.data.qa.generation_gt.llama_index_gen_gt import (
    make_basic_gen_gt,
    make_concise_gen_gt,
)
from autorag.data.qa.schema import Raw, Corpus
from autorag.data.qa.query.llama_gen_query import factoid_query_gen
from autorag.data.qa.sample import random_single_hop
from dotenv import load_dotenv
load_dotenv()
llm = Gemini(model="models/gemini-1.5-flash")
raw_df = pd.read_parquet("data/raw.parquet")
raw_instance = Raw(raw_df)
corpus_df = pd.read_parquet("data/corpus-semantic.parquet")
corpus_instance = Corpus(corpus_df, raw_instance)
initial_qa = (
    corpus_instance.sample(random_single_hop, n=3)
    .map(
        lambda df: df.reset_index(drop=True),
    )
    .make_retrieval_gt_contents()
    .batch_apply(
        factoid_query_gen,  # query generation
        llm=llm,
    )
    .batch_apply(
        make_basic_gen_gt,  # answer generation (basic)
        llm=llm,
    )
    .batch_apply(
        make_concise_gen_gt,  # answer generation (concise)
        llm=llm,
    )
    .filter(
        dontknow_filter_rule_based,  # filter don't know
        lang="vi",
    )
)
initial_qa.to_parquet('data/qa.parquet', 'data/corpus.parquet')
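The error message suggests that extra keyword arguments such as temperature are being forwarded to google-generativeai's ChatSession.send_message_async, which does not accept them. A minimal sketch of a possible workaround pattern, filtering kwargs against the callee's signature before forwarding; send_message_async below is a stand-in coroutine (not the real library method), and filter_kwargs is a hypothetical helper:

```python
import asyncio
import inspect

async def send_message_async(content):
    # Stand-in for google-generativeai's ChatSession.send_message_async,
    # which does not accept a `temperature` keyword argument.
    return f"echo: {content}"

def filter_kwargs(func, kwargs):
    """Keep only the kwargs that `func` actually accepts."""
    params = inspect.signature(func).parameters
    if any(p.kind is inspect.Parameter.VAR_KEYWORD for p in params.values()):
        return dict(kwargs)  # func takes **kwargs itself: pass everything through
    return {k: v for k, v in kwargs.items() if k in params}

async def achat(message, **kwargs):
    # Drop unsupported kwargs (e.g. `temperature`) before forwarding.
    safe = filter_kwargs(send_message_async, kwargs)
    return await send_message_async(message, **safe)

print(asyncio.run(achat("hi", temperature=0.2)))  # runs without the TypeError
```

If LlamaIndex's Gemini constructor accepts a temperature argument, setting it there (Gemini(model=..., temperature=...)) instead of per call may also sidestep the error, though I have not verified this.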