What service can I use to ask questions about my database of blog posts? “Tell me everything you know about Grideon, my fictional character” etc

  • Scrubbles@poptalk.scrubbles.tech · 3 days ago

    That’s a great start! A lot of it depends on OpenAI, though; is there any guide you know of that lets me run completely locally? I use TabbyAPI for most of my inference, and I’m happy to run anything else for training.

    • Danitos@reddthat.com · edited · 3 days ago

      It would work the same way; you would just need to connect to your local model. For example, change the code so the embeddings come from your local model and store them in Milvus (rough sketch below), then do the inference by calling your local model.
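      A minimal sketch of that embed-and-store step, assuming Milvus Lite (the embedded, file-based Milvus) and a stand-in embedding model; the model name, file name, field names, and example posts are all placeholders, not anything from this thread:

      ```python
      from pymilvus import MilvusClient
      from sentence_transformers import SentenceTransformer

      # Placeholder local embedding model (384-dim output); swap in your own.
      embedder = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

      # Example blog posts standing in for your real database.
      posts = [
          "Grideon first appears in the post about the abandoned lighthouse.",
          "Grideon was raised by cartographers and is afraid of open water.",
      ]

      # Milvus Lite keeps everything in a single local file, no server needed.
      client = MilvusClient("blog_posts.db")
      client.create_collection(collection_name="blog_posts", dimension=384)

      # Embed locally and store vectors plus the original text in Milvus.
      vectors = embedder.encode(posts)
      client.insert(
          collection_name="blog_posts",
          data=[
              {"id": i, "vector": vectors[i].tolist(), "text": posts[i]}
              for i in range(len(posts))
          ],
      )
      ```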

      I haven’t used inference with a local API, so I can’t help with that, but for embeddings I used this model and it was quite fast, plus it was a top-2 model on the Hugging Face leaderboard. Leaderboard. Model.

      I didn’t do any training, just simple embed+inference.
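
      For completeness, a rough sketch of the retrieval plus local-inference step, assuming your local server exposes an OpenAI-compatible endpoint (e.g. the one TabbyAPI provides); the URL, port, and model name are placeholders and will differ on your setup:

      ```python
      from openai import OpenAI
      from pymilvus import MilvusClient
      from sentence_transformers import SentenceTransformer

      embedder = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
      milvus = MilvusClient("blog_posts.db")

      question = "Tell me everything you know about Grideon, my fictional character."

      # Find the stored posts most relevant to the question.
      hits = milvus.search(
          collection_name="blog_posts",
          data=[embedder.encode(question).tolist()],
          limit=3,
          output_fields=["text"],
      )
      context = "\n".join(hit["entity"]["text"] for hit in hits[0])

      # Point the OpenAI client at the local server instead of api.openai.com.
      llm = OpenAI(base_url="http://localhost:5000/v1", api_key="unused")
      answer = llm.chat.completions.create(
          model="local-model",  # whatever name your local server reports
          messages=[
              {"role": "system", "content": "Answer using only the provided blog excerpts."},
              {"role": "user", "content": f"Excerpts:\n{context}\n\nQuestion: {question}"},
          ],
      )
      print(answer.choices[0].message.content)
      ```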