Your current code uses OpenAI’s API key to access the LLM service by default. I’d like to switch it to a local LLM that I have deployed with LLaMA-Factory; it is exposed through an OpenAI-compatible local API, for example at http://localhost:7788/v1/. Could you guide me on how to make this adjustment? Thank you!
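For anyone facing the same question: if the project uses the official `openai` Python SDK, pointing it at a LLaMA-Factory server usually only requires passing `base_url="http://localhost:7788/v1"` and a placeholder `api_key`, since local OpenAI-compatible servers typically ignore the key. As a self-contained sketch using only the standard library (the model name and placeholder key below are assumptions; adjust them to your deployment), an equivalent raw request to the `/chat/completions` endpoint looks like:

```python
import json
import urllib.request

# Assumed local endpoint for a LLaMA-Factory deployment; change to match yours.
LOCAL_BASE_URL = "http://localhost:7788/v1"

def build_chat_request(messages, model="local-model", base_url=LOCAL_BASE_URL):
    """Build an OpenAI-compatible /chat/completions request for a local server."""
    payload = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=payload,
        headers={
            "Content-Type": "application/json",
            # Local servers usually accept any placeholder token here.
            "Authorization": "Bearer EMPTY",
        },
        method="POST",
    )

req = build_chat_request([{"role": "user", "content": "Hello"}])
print(req.full_url)
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) should return the same JSON shape as the OpenAI API, so the rest of the code that parses responses should not need changes.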