Currently you have to use Ollama or other tools to run a server for local models. It would be nice if ChatWise itself supported downloading and running models.
I think I'll first add support for Apple's MLX framework, since it's faster and this app currently focuses on macOS.
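For reference, here is a minimal sketch of what running a model through MLX could look like, using the community `mlx-lm` Python package (`pip install mlx-lm`). This is just an illustration of the workflow, not how ChatWise would implement it; the model name is only an example.

```python
from mlx_lm import load, generate

# load() downloads the weights from Hugging Face on first use,
# which covers the "downloading" half of this request.
model, tokenizer = load("mlx-community/Mistral-7B-Instruct-v0.3-4bit")

prompt = "Why might MLX be fast on Apple Silicon?"
response = generate(model, tokenizer, prompt=prompt, max_tokens=256)
print(response)
```

Since MLX runs in-process, an app embedding it wouldn't need a separate local server the way Ollama does.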