Integrating Scaleway Generative APIs with popular AI tools

Reviewed on 18 February 2025 | Published on 18 February 2025

Scaleway’s Generative APIs are designed to provide easy access to the latest AI models and techniques. Our APIs are built on top of a robust infrastructure that ensures scalability, reliability, and security. With our APIs, you can integrate AI capabilities into your applications, such as text generation, image classification, and more.

Comparison of AI tools and libraries

The following table compares AI tools and libraries supported by Scaleway’s Generative APIs:

| Tool/Library | Description | Use cases | Integration effort |
|---|---|---|---|
| OpenAI client | Popular AI library for natural language processing | Text generation, language translation, text summarization | Low |
| LangChain | Library for building AI applications leveraging RAG | Inference, embeddings, document indexing and retrieval | Medium |
| LlamaIndex | Library for building advanced AI RAG applications | Knowledge graph building, document retrieval, data indexing | Medium |
| Continue Dev | IDE extension for AI-powered coding assistance | Code completion, code review | Low |
| Zed AI | IDE including AI-powered coding assistance | Code completion, code review | Low |
| Chatbox AI | Desktop client for generative APIs, available on Windows, Mac, Linux | AI copilot for documents, images, or code | Low |
| Open WebUI | User interface for chatbot applications | Creating web chatbot interfaces, RAG agents | Low |
| cURL/Python | Direct HTTP API calls for custom integrations | Custom applications, data processing | High |
Note

The integration effort is subjective and may vary depending on the specific use case and requirements.

OpenAI client libraries

Scaleway Generative APIs follow OpenAI’s API structure, making integration straightforward. To get started, you’ll need to install the OpenAI library and set up your API key.

Configuration

To use the OpenAI client library with Scaleway’s Generative APIs, first install the required dependencies:

pip install openai

Then set the API key and base URL in your OpenAI-compatible client:

from openai import OpenAI

client = OpenAI(
    base_url="https://api.scaleway.ai/v1",
    api_key="<API secret key>",
)
Tip

Make sure to replace <API secret key> with your actual API key.
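
To avoid hardcoding the key in your source code, you can read it from an environment variable instead. A minimal sketch, assuming you exported your secret key as SCW_SECRET_KEY beforehand (the variable name is illustrative, not a Scaleway requirement):

import os
from openai import OpenAI

# Assumes the secret key was exported in your shell first, e.g.:
#   export SCW_SECRET_KEY="<API secret key>"
client = OpenAI(
    base_url="https://api.scaleway.ai/v1",
    api_key=os.environ["SCW_SECRET_KEY"],
)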

Using OpenAI client for text generation

To generate text with the OpenAI client, call the create method on client.chat.completions:

response = client.chat.completions.create(
    model="llama-3.1-8b-instruct",
    messages=[{"role": "user", "content": "Tell me a joke about AI"}],
)
print(response.choices[0].message.content)
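
Because the endpoint is OpenAI-compatible, streaming should also work through the same client. A minimal sketch, assuming the client and model configured above:

# Stream tokens as they are generated instead of waiting for the full answer
stream = client.chat.completions.create(
    model="llama-3.1-8b-instruct",
    messages=[{"role": "user", "content": "Tell me a joke about AI"}],
    stream=True,
)
for chunk in stream:
    # Some chunks may carry no content (for example the final one), so guard for that
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()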

LangChain (RAG & LLM applications)

LangChain is a popular library for building AI applications. Scaleway’s Generative APIs support LangChain for both inference and embeddings.

Python

Tip

Refer to our dedicated documentation for implementing Retrieval-Augmented Generation (RAG) with LangChain and Scaleway Generative APIs.
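
If you only need plain chat completions from Python, a minimal LangChain sketch follows. It assumes the langchain-openai package (pip install langchain-openai) and the same model and endpoint used elsewhere on this page:

from langchain_openai import ChatOpenAI

# Point LangChain's OpenAI-compatible chat model at the Scaleway endpoint
chat = ChatOpenAI(
    api_key="<API secret key>",
    model="llama-3.1-8b-instruct",
    base_url="https://api.scaleway.ai/v1",
)

response = chat.invoke("Tell me a joke about AI")
print(response.content)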

LlamaIndex (advanced RAG applications)

LlamaIndex is an open-source framework for building Large Language Model (LLM) based applications, with a particular focus on optimizing RAG (Retrieval-Augmented Generation) pipelines.

  1. Install the required dependencies to use the LlamaIndex framework with Scaleway’s Generative APIs:

    pip install llama-index-llms-openai-like
  2. Create a file named main.py and add the following code to configure the OpenAILike client with your secret key:

    from llama_index.llms.openai_like import OpenAILike
    from llama_index.core.llms import ChatMessage

    llm = OpenAILike(
        model="llama-3.1-8b-instruct",
        api_key="<API secret key>",
        api_base="https://api.scaleway.ai/v1",
        max_tokens=512,
        temperature=0.7,
        top_p=1,
        presence_penalty=0,
    )
    Tip

    Make sure to replace <API secret key> with your actual API key.

  3. You can then interact with the model by sending it messages:

    response = llm.chat([ChatMessage("Could you tell me about Scaleway, please?")])
    print(response)
  4. Finally, run main.py:

    python main.py

    The LLM response should display an answer:

    Generally, Scaleway is a reliable and secure cloud provider that offers a range of services for businesses and developers.
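
To stream the answer token by token instead of waiting for the full response, the llama_index chat interface also exposes stream_chat. A minimal sketch, assuming the llm object configured in step 2:

# Iterate over partial responses and print each new token as it arrives
messages = [ChatMessage(role="user", content="Could you tell me about Scaleway, please?")]
for chunk in llm.stream_chat(messages):
    print(chunk.delta, end="", flush=True)
print()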

JavaScript (TypeScript)

To perform chat conversations with LangChain, first install the langchain and @langchain/openai packages using your Node.js package manager.

  1. Use the following command to install LangChain using npm (yarn and pnpm also work):

    npm install langchain @langchain/openai
  2. Edit your package.json file to ensure it has the "type": "module" property:

    {
      "type": "module",
      "dependencies": {
        "@langchain/openai": "^0.4.4",
        "langchain": "^0.3.19"
      }
    }
  3. Create a main.js file and add the following content to it:

    import { ChatOpenAI } from "@langchain/openai";

    const chat = new ChatOpenAI({
      apiKey: "<API secret key>",
      model: "llama-3.1-8b-instruct",
      configuration: {
        baseURL: "https://api.scaleway.ai/v1",
      },
    });

    const response = await chat.invoke("Tell me a joke");
    console.log(response.content);
    Tip

    Make sure to replace <API secret key> with your actual API secret key.

  4. Run main.js:

    node main.js

    The model answer should display:

    Why couldn't the bicycle stand up by itself? Because it was two-tired.

Note that other LangChain objects built on the OpenAI client library, such as OpenAI and OpenAIEmbeddings, are also compatible.
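
For example, embeddings can be generated through the same OpenAI-compatible endpoint. A minimal Python sketch, assuming the langchain-openai package and that an embedding model such as bge-multilingual-gemma2 is available in your Scaleway model catalog (check the catalog for the exact name):

from langchain_openai import OpenAIEmbeddings

embeddings = OpenAIEmbeddings(
    api_key="<API secret key>",
    base_url="https://api.scaleway.ai/v1",
    model="bge-multilingual-gemma2",  # assumed model name; verify in the Scaleway catalog
    check_embedding_ctx_length=False,  # skip OpenAI-specific token-length preprocessing
)

vector = embeddings.embed_query("What is Scaleway?")
print(len(vector))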

Continue Dev (AI coding assistance)

Continue Dev is an IDE extension that provides AI-powered coding assistance. Scaleway’s Generative APIs support Continue Dev for code completion and more.

Tip

Refer to our dedicated documentation for:

  • Integrating Continue Dev with Visual Studio Code
  • Integrating Continue Dev with IntelliJ IDEA

Zed AI (coding assistance)

Zed is an IDE (Integrated Development Environment) with built-in AI coding assistance. Scaleway’s Generative APIs support Zed AI for code completion and more.

Before you start

To complete the actions presented below, you must have:

  • A Scaleway account logged into the console
  • Owner status or IAM permissions allowing you to perform actions in the intended Organization
  • A valid API key for API authentication
  • Installed Zed on your local machine

Configure custom endpoints and models

  1. Edit the Zed settings file (settings.json) and add the following content:

    {
      "language_models": {
        "openai": {
          "api_url": "https://api.scaleway.ai/v1",
          "available_models": [
            {
              "name": "qwen2.5-coder-32b-instruct",
              "display_name": "Qwen 2.5 Coder 32B",
              "max_tokens": 128000
            }
          ],
          "version": "1"
        }
      },
      "assistant": {
        "default_model": {
          "provider": "openai",
          "model": "qwen2.5-coder-32b-instruct"
        },
        "version": "2"
      }
    }

This configuration adds the Scaleway-hosted qwen2.5-coder-32b-instruct model to Zed’s openai provider and sets it as the default model.

  2. Open the AI Assistant configuration, either by typing assistant: show configuration in the command palette, or by clicking the Assistant Panel button at the bottom right, then the Assistant menu at the top right, and finally Configure.

  3. Scroll down to the OpenAI Configuration section and paste your Scaleway secret key as the API Key credential. Note that this key is discarded when you restart Zed. To store it permanently, set your Scaleway secret key as the OPENAI_API_KEY environment variable and restart Zed.

Your setup is complete. If you open a new chat and select the Qwen 2.5 Coder 32B model, you can send messages and receive model answers. You can also use the Inline Assist feature when editing your code.

Chatbox AI

Chatbox AI is a powerful AI client and smart assistant, compatible with Scaleway’s Generative APIs service. It is available across multiple platforms, including Windows, macOS, Android, iOS, Web, and Linux.

Tip

Refer to our dedicated documentation for installing and configuring Chatbox AI with Generative APIs.

Open WebUI

Open WebUI is an open-source, self-hosted user interface designed for interacting with large language models (LLMs) through a browser. It offers an intuitive chat-based experience, similar to ChatGPT, making it simple to work with AI models locally or through API integrations. Fully compatible with Scaleway’s Generative APIs, Open WebUI enables users to deploy and manage an AI chat application with little effort.

Tip

Follow our guide on installing and configuring Open WebUI with Generative APIs to get started.

Custom HTTP integrations

You can interact with Scaleway’s Generative APIs directly using any HTTP client.

cURL example

To call Scaleway’s Generative APIs with cURL, use the following command:

curl https://api.scaleway.ai/v1/chat/completions \
  -H "Authorization: Bearer <API secret key>" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "llama-3.1-8b-instruct",
    "messages": [{"role": "user", "content": "What is quantum computing?"}]
  }'
Tip

Make sure to replace <API secret key> with your actual API key.

Python HTTP example

To perform HTTP requests with Scaleway’s Generative APIs, install the requests dependency:

pip install requests

Then, you can use the following code:

import requests

headers = {
    "Authorization": "Bearer <API secret key>",
    "Content-Type": "application/json",
}
data = {
    "model": "llama-3.1-8b-instruct",
    "messages": [{"role": "user", "content": "Explain black holes"}],
}

response = requests.post("https://api.scaleway.ai/v1/chat/completions", json=data, headers=headers)
print(response.json()["choices"][0]["message"]["content"])
Tip

Make sure to replace <API secret key> with your actual API key.
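
For production use, you may also want basic error handling and a request timeout. A minimal variation of the request above, assuming the same headers and data dictionaries:

# Fail fast on HTTP errors instead of parsing an error payload as a success
response = requests.post(
    "https://api.scaleway.ai/v1/chat/completions",
    json=data,
    headers=headers,
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])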
