
Deploy LibreChat with Scaleway Generative APIs as a ChatGPT alternative

Tags: docker, ai, generative-apis

LibreChat is a free, open-source AI chat platform that unifies interactions with various AI models through a single, customizable interface. It supports multiple AI providers, including Scaleway's Generative APIs, allowing users to switch seamlessly between different models.

Before you start

To complete the actions presented below, you must have:

  - A Scaleway account with a secret API key that has access to Generative APIs
  - Docker and the Docker Compose plugin installed on your machine

Deploying LibreChat

  1. Clone the LibreChat repository from GitHub and move into the project directory:

    git clone https://github.com/danny-avila/LibreChat.git
    cd LibreChat
  2. Copy the .env.example file into a file called .env:

    cp .env.example .env
  3. Add a docker-compose.override.yml file at the root of the project:

    touch docker-compose.override.yml
  4. Open the docker-compose.override.yml file in a text editor and add the following content. Then save the file and exit the editor:

    services:
      api:
        volumes:
          - type: bind
            source: ./librechat.yaml
            target: /app/librechat.yaml
  5. Add a librechat.yaml file at the root of the project:

    touch librechat.yaml
  6. Open the librechat.yaml file in a text editor and copy the configuration below into it. Then save and exit the editor.

    version: 1.2.1
    cache: true
    # Definition of a custom endpoint for Scaleway
    endpoints:
      custom:
        - name: 'Scaleway'
          apiKey: '<YOUR_SCW_SECRET_API_KEY>'
          baseURL: 'https://api.scaleway.ai/v1'
          models:
            default:
              - 'llama-3.3-70b-instruct'
            # Fetch the full list of available models from the endpoint
            fetch: true
          # Generate conversation titles automatically, using the model below
          titleConvo: true
          titleModel: 'llama-3.3-70b-instruct'
          modelDisplayLabel: 'Scaleway'
    Important

    Make sure that you replace <YOUR_SCW_SECRET_API_KEY> with your own secret API key. You can verify that the key and the endpoint respond correctly with the curl sketch after this procedure.

  7. Launch the application with Docker:

    docker compose up -d
  8. Open a web browser and access LibreChat at http://localhost:3080/. If the page does not load, see the troubleshooting sketch after this procedure:

    # Example of opening a browser (may vary by OS)
    open http://localhost:3080/
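
Verify your setup

To confirm that your secret API key and the https://api.scaleway.ai/v1 endpoint configured in step 6 respond as expected, you can query Scaleway Generative APIs directly. The commands below are a minimal sketch, assuming the OpenAI-compatible /models and /chat/completions routes exposed under that base URL; replace <YOUR_SCW_SECRET_API_KEY> with your own secret API key.

    # List the models available to your key (should include llama-3.3-70b-instruct)
    curl -s https://api.scaleway.ai/v1/models \
      -H "Authorization: Bearer <YOUR_SCW_SECRET_API_KEY>"

    # Send a short test request to the model configured in librechat.yaml
    curl -s https://api.scaleway.ai/v1/chat/completions \
      -H "Authorization: Bearer <YOUR_SCW_SECRET_API_KEY>" \
      -H "Content-Type: application/json" \
      -d '{
        "model": "llama-3.3-70b-instruct",
        "messages": [{"role": "user", "content": "Hello from my LibreChat setup"}]
      }'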
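
If the LibreChat interface does not come up at http://localhost:3080/, you can inspect the containers started in step 7. This is a minimal troubleshooting sketch; api is the name of the LibreChat service extended in the docker-compose.override.yml file above.

    # List the containers of the stack and their status
    docker compose ps

    # Follow the logs of the LibreChat api service (press Ctrl+C to stop)
    docker compose logs -f api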