  • I’ve set up OpenWebUI with the Docker containers, which include Ollama in API mode and, optionally, Playwright if you want to add web scraping to your RAG queries. This gives you a ChatJippity-style webpage where you can manage your models for Ollama, add OpenAI usage as well if you want, and manage all the users.

    On top of that, you get API access to your own Ollama instance, and you can configure GPU usage for your local AI if a GPU is available.

    Honestly, it’s the easiest way to get local AI.
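
    A minimal `docker-compose.yml` sketch of the setup described above. The image names (`ollama/ollama`, `ghcr.io/open-webui/open-webui`) and the `OLLAMA_BASE_URL` variable are the projects’ published defaults, but the port mapping, volume names, and GPU block are assumptions you’d adapt to your host:

    ```yaml
    # Sketch only: Open WebUI talking to Ollama over the internal network.
    services:
      ollama:
        image: ollama/ollama
        volumes:
          - ollama:/root/.ollama
        # Uncomment to pass GPUs through to Ollama (needs the NVIDIA
        # Container Toolkit installed on the host):
        # deploy:
        #   resources:
        #     reservations:
        #       devices:
        #         - driver: nvidia
        #           count: all
        #           capabilities: [gpu]

      open-webui:
        image: ghcr.io/open-webui/open-webui:main
        ports:
          - "3000:8080"   # browse to http://localhost:3000
        environment:
          - OLLAMA_BASE_URL=http://ollama:11434
        volumes:
          - open-webui:/app/backend/data
        depends_on:
          - ollama

    volumes:
      ollama:
      open-webui:
    ```

    With something like this running, the web UI handles users and models, while the `ollama` service stays reachable on the compose network for direct API calls.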