
Ollama and articles

Is it possible to add/use a local Ollama as a new provider for creating articles, @Sven ?
I'd just need to add my Ollama IP:port, and most everything else should work the same.


Comments

  • Sven www.GSA-Online.de
    Sorry, but what is "Ollama"?
  • rastarr Thailand
    Sven said:
    Sorry, but what is "Ollama"?
    https://ollama.com/ is similar to other LLM services, except you run it all from your local homelab without any limitations. So similar to what Groq, ChatGPT etc. offer. I use Ollama for a number of automations and it would be great to see it implemented inside GSA SER.
  • Sven www.GSA-Online.de
    just edit the openai.dat file and add your model with the url.
  • rastarr Thailand
    Sven said:
    just edit the openai.dat file and add your model with the url.
    Hmmm, well I found the file, but I think there's more to it. From what I see, the base URL for the existing models in this file is already set to OpenAI somewhere else. So testing by adding a line like this:

    [Models]
    gemma2:2b=Gemma2-2Bmodel ollama|http://<my IP for local ollama>:11434/
    gpt-3.5-turbo=Most capable GPT-3.5 model and optimized for chat at 1/10th the cost of text-davinci-003. Will be updated with our latest model iteration.|/v1/chat/completions

    ... isn't going to work, from what I see.

    Am I missing something obvious here @Sven ?
  • Sven www.GSA-Online.de
    That should be it. However, you might need to add the path as well.
    I don't know if the API syntax is the same as in OpenAI though.
  • rastarr Thailand
    For those wanting to use Ollama, the syntax is:
    llama3.2:latest=Llama3.2 ollama|http://<your ollama ip>:11434/v1/chat/completions
    works like a charm :)
    and thanks @sven for the help
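    For anyone wiring this up and wanting to sanity-check the endpoint before (or after) editing openai.dat, here is a minimal Python sketch of the same kind of request the tool would send. The IP address is a placeholder and the `build_request` helper is purely illustrative; Ollama exposes an OpenAI-compatible /v1/chat/completions endpoint, which is why the OpenAI-style payload works.

```python
import json
import urllib.request

# Placeholders -- substitute your own Ollama host/IP and model tag.
OLLAMA_URL = "http://192.168.1.50:11434/v1/chat/completions"
MODEL = "llama3.2:latest"

def build_request(prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request for a local Ollama."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# To actually send it (requires a running Ollama instance):
# with urllib.request.urlopen(build_request("Write a short intro")) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

    If the commented-out call returns a JSON body with a `choices` list, the same host:port/path will work in the openai.dat entry above.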
  • Sven www.GSA-Online.de
    you're welcome ;)