Ollama Integration

1 minute read.

By Team Aquin


"Hugging Face and Ollama Integration Tutorial" Watch on YouTube →

Connect Your Ollama Models to Aquin

Ollama Models, Aquin's Interface

If you're running Ollama locally, you can now connect it directly to Aquin and use all of your models right inside Aquin!

  • Use your existing Ollama setup - no re-downloading models
  • Access everything through Aquin's UI - floating assistant, cursor follow, and more
  • Keep everything local - a privacy-first workflow
  • Switch models instantly - every model you've pulled is available (see the quick check after this list)
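Curious what "every model you've pulled" means in practice? Here's a minimal sketch that lists them, assuming Ollama's default local endpoint (http://localhost:11434) and using only the Python standard library. It talks to Ollama's local REST API directly; nothing here is Aquin-specific, and these are exactly the models Aquin can pick up once connected.

```python
# List the models your local Ollama install has already pulled.
# Assumes Ollama's default local endpoint (http://localhost:11434);
# adjust OLLAMA_URL if your server runs on another host or port.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"

with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
    models = json.load(resp)["models"]

# These are the same models Aquin can switch between once connected.
for model in models:
    print(model["name"])
```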

Steps:

  1. Install Ollama
  2. Run Ollama so its local server is up (the sanity check below confirms it's reachable)
  3. Open Aquin's settings
  4. Connect to Ollama
  5. Select any of your Ollama models
  6. Start using it
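Before step 4, it can help to confirm the Ollama server is reachable and able to answer a prompt. The sketch below is one way to do that with the Python standard library; the endpoint is Ollama's default, and the model name "llama3" is only an example, so swap in whatever `ollama list` shows on your machine.

```python
# Sanity check before connecting Aquin: confirm the Ollama server is up
# and can answer a prompt. Assumes Ollama's default endpoint; the model
# name "llama3" is only an example - use any model from `ollama list`.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"
MODEL = "llama3"

payload = json.dumps({
    "model": MODEL,
    "prompt": "Say hello in one short sentence.",
    "stream": False,  # ask for a single JSON response instead of a stream
}).encode("utf-8")

req = urllib.request.Request(
    f"{OLLAMA_URL}/api/generate",
    data=payload,
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["response"])

# If this prints a reply, Aquin should be able to reach the same server.
```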

Done.

Get Aquin's feature set with Ollama models.

Ollama: Local model management, privacy, control.

Aquin: Beautiful UI, advanced features, workflow integration.

Together: The perfect local AI setup.
