
Local AI support

I am running a local AI model (LLaMA) that is fully API-compatible with OpenAI. How can I configure a custom API URL?

Comments

  • SvenSven www.GSA-Online.de
    Have a look in the openai.dat file, where you can add a model and its corresponding URL.
  • adtradtr Vietnam
    I don't see where to edit the custom URL. Do you mean I should edit it here?
    Original line:
    gpt-3.5-turbo-16k=offers 4 times the context length of gpt-3.5-turbo at twice the price|/v1/chat/completions

    Changed to:
    gpt-3.5-turbo-16k=offers 4 times the context length of gpt-3.5-turbo at twice the price|http://127.0.0.1:8888/v1/chat/completions



  • royalmiceroyalmice WEBSITE: ---> https://asiavirtualsolutions.com | SKYPE:---> asiavirtualsolutions
    edited January 14
    adtr said:
    I don't see where to edit the custom URL. Do you mean I should edit it here?

    In the openai.dat file, the [Models] section contains entries where you can define models and their corresponding URLs. Each line represents a model.
    1. Locate the [Models] section in the file.
    2. Add a new line for your model in the same format.
    my-custom-model=Description of your model|Custom API URL

    For example:
    llama-model=Local LLaMA model|http://localhost:5000/v1/completions
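    The line layout above (name, then a description, then the API URL after a `|`) can be illustrated with a small parser sketch. This is just an illustration of the `name=description|URL` format shown in this thread, not the actual parsing logic SER uses:

    ```python
    def parse_model_line(line):
        """Split an openai.dat [Models] line of the form
        name=description|url into its three parts."""
        # Everything before the first "=" is the model name.
        name, rest = line.split("=", 1)
        # Everything after the last "|" is the API URL;
        # the middle part is the human-readable description.
        description, _, url = rest.rpartition("|")
        return name.strip(), description.strip(), url.strip()

    line = "llama-model=Local LLaMA model|http://localhost:5000/v1/completions"
    print(parse_model_line(line))
    # → ('llama-model', 'Local LLaMA model', 'http://localhost:5000/v1/completions')
    ```

    A missing `|` (no URL) or a stray `=` in the description would need extra handling; the sketch only covers well-formed lines like the example above.
    
    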

    Thanked by 1adtr
  • SvenSven www.GSA-Online.de
    exactly as @royalmice wrote, thanks!
    Thanked by 1royalmice