Edgee provides an OpenAI-compatible API, which means you can use the official OpenAI SDKs for TypeScript and Python with Edgee. This allows you to leverage Edgee’s routing, observability, and cost tracking features without changing your existing code.

Why Use OpenAI SDK with Edgee?

  • Up to 50% Cost Reduction: Automatic token compression when enabled via headers or console
  • No Code Changes: Use your existing OpenAI SDK code as-is
  • Multi-Provider Access: Route to OpenAI, Anthropic, Google, and more through one API
  • Automatic Failover: Built-in reliability with fallback providers
  • Cost Tracking: Real-time visibility into token usage in the Edgee dashboard
  • Observability: Request tracing, compression metrics, and logging across all providers

Installation

Install the OpenAI SDK for your preferred language (the examples on this page use TypeScript):
npm install openai

Configuration

Configure the OpenAI SDK to use Edgee’s API endpoint:
import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "https://api.edgee.ai/v1",
  apiKey: process.env.EDGEE_API_KEY, // Your Edgee API key
});

async function main() {
  const completion = await openai.chat.completions.create({
    model: "gpt-4o",
    messages: [
      { role: "user", content: "What is the capital of France?" }
    ],
  });

  console.log(completion.choices[0].message.content);
}

main();

Token Usage Tracking

Access standard OpenAI token usage metrics in every response:
import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "https://api.edgee.ai/v1",
  apiKey: process.env.EDGEE_API_KEY,
});

const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
    { role: "user", content: "Summarize this long document..." }
  ],
});

console.log(completion.choices[0].message.content);
console.log(`Prompt tokens: ${completion.usage?.prompt_tokens}`);
console.log(`Completion tokens: ${completion.usage?.completion_tokens}`);
console.log(`Total tokens: ${completion.usage?.total_tokens}`);
When compression is enabled, prompt_tokens reflects the compressed token count. View detailed compression metrics in the Edgee dashboard.
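To verify this in your own application, you can send the same prompt through two clients, one of them using the compression header described in the next section, and compare the reported prompt_tokens. A minimal sketch; comparePromptTokens is an illustrative helper, not part of the SDK, and the exact savings will vary with your prompt:

import OpenAI from "openai";

// Illustrative helper: sends the same request with and without compression
// and logs the prompt token counts reported in each response.
async function comparePromptTokens(prompt: string) {
  const plain = new OpenAI({
    baseURL: "https://api.edgee.ai/v1",
    apiKey: process.env.EDGEE_API_KEY,
  });

  const compressed = new OpenAI({
    baseURL: "https://api.edgee.ai/v1",
    apiKey: process.env.EDGEE_API_KEY,
    defaultHeaders: { "x-edgee-enable-compression": "true" },
  });

  const [a, b] = await Promise.all([
    plain.chat.completions.create({
      model: "gpt-4o",
      messages: [{ role: "user", content: prompt }],
    }),
    compressed.chat.completions.create({
      model: "gpt-4o",
      messages: [{ role: "user", content: prompt }],
    }),
  ]);

  console.log(`Without compression: ${a.usage?.prompt_tokens} prompt tokens`);
  console.log(`With compression:    ${b.usage?.prompt_tokens} prompt tokens`);
}

comparePromptTokens("Summarize this long document...");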

Compression & Tags via Headers

When using the OpenAI SDK with Edgee, you can control token compression and add tags using HTTP headers:

Enabling Compression

import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "https://api.edgee.ai/v1",
  apiKey: process.env.EDGEE_API_KEY,
  defaultHeaders: {
    "x-edgee-enable-compression": "true",
    "x-edgee-compression-rate": "0.8", // Target 80% compression (0.0-1.0)
  },
});

// All requests will use compression with 80% target rate
const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
    { role: "user", content: "Analyze this long document..." }
  ],
});

Adding Tags for Analytics

Combine compression with tags to track requests in your dashboard:
import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "https://api.edgee.ai/v1",
  apiKey: process.env.EDGEE_API_KEY,
  defaultHeaders: {
    "x-edgee-enable-compression": "true",
    "x-edgee-compression-rate": "0.8",
    "x-edgee-tags": "production,openai-sdk,user-123",
  },
});
Available Headers:
  • x-edgee-enable-compression ("true" or "false"): Enable token compression for requests (overrides console settings)
  • x-edgee-compression-rate (string): Target compression rate (0.0-1.0, default 0.75)
  • x-edgee-tags (string): Comma-separated tags for analytics and filtering
You can also enable compression organization-wide or per API key in the Edgee console. Headers override console settings for specific requests.
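Because headers take precedence, you can also set them for a single request rather than on the client, using the OpenAI SDK's request-options argument (the second parameter to create). A sketch, assuming Edgee treats per-request headers the same way as defaultHeaders:

import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "https://api.edgee.ai/v1",
  apiKey: process.env.EDGEE_API_KEY,
});

// Headers set here apply to this request only and override console settings
const completion = await openai.chat.completions.create(
  {
    model: "gpt-4o",
    messages: [{ role: "user", content: "Analyze this long document..." }],
  },
  {
    headers: {
      "x-edgee-enable-compression": "true",
      "x-edgee-compression-rate": "0.8",
      "x-edgee-tags": "production,per-request",
    },
  }
);

console.log(completion.usage?.prompt_tokens);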

Advanced Usage

Function Calling (Tools)

Edgee fully supports OpenAI’s function calling interface:
import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "https://api.edgee.ai/v1",
  apiKey: process.env.EDGEE_API_KEY,
});

async function main() {
  const completion = await openai.chat.completions.create({
    model: "gpt-4o",
    messages: [
      { role: "user", content: "What's the weather in Paris?" }
    ],
    tools: [
      {
        type: "function",
        function: {
          name: "get_weather",
          description: "Get the current weather for a location",
          parameters: {
            type: "object",
            properties: {
              location: {
                type: "string",
                description: "City name",
              },
            },
            required: ["location"],
          },
        },
      },
    ],
    tool_choice: "auto",
  });

  if (completion.choices[0].message.tool_calls) {
    console.log("Tool calls:", completion.choices[0].message.tool_calls);
  } else {
    console.log("Response:", completion.choices[0].message.content);
  }
}

main();
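To complete the loop, execute the requested tool and send its result back as a tool message so the model can produce a final answer. A minimal sketch following OpenAI's standard tool-calling flow; getWeather is a placeholder for your own implementation:

import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "https://api.edgee.ai/v1",
  apiKey: process.env.EDGEE_API_KEY,
});

// Placeholder implementation of the get_weather tool declared below
async function getWeather(location: string): Promise<string> {
  return JSON.stringify({ location, forecast: "sunny", temperature_c: 22 });
}

async function main() {
  const messages: OpenAI.Chat.ChatCompletionMessageParam[] = [
    { role: "user", content: "What's the weather in Paris?" },
  ];
  const tools: OpenAI.Chat.ChatCompletionTool[] = [
    {
      type: "function",
      function: {
        name: "get_weather",
        description: "Get the current weather for a location",
        parameters: {
          type: "object",
          properties: {
            location: { type: "string", description: "City name" },
          },
          required: ["location"],
        },
      },
    },
  ];

  const first = await openai.chat.completions.create({ model: "gpt-4o", messages, tools });
  const toolCalls = first.choices[0].message.tool_calls;

  if (!toolCalls) {
    console.log(first.choices[0].message.content);
    return;
  }

  // Append the assistant message that requested the tool, then one tool
  // message per call carrying the result, keyed by tool_call_id.
  messages.push(first.choices[0].message);
  for (const call of toolCalls) {
    if (call.type !== "function") continue;
    const args = JSON.parse(call.function.arguments);
    messages.push({
      role: "tool",
      tool_call_id: call.id,
      content: await getWeather(args.location),
    });
  }

  const second = await openai.chat.completions.create({ model: "gpt-4o", messages, tools });
  console.log(second.choices[0].message.content);
}

main();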

Streaming Responses

Edgee supports streaming responses for real-time token delivery:
import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "https://api.edgee.ai/v1",
  apiKey: process.env.EDGEE_API_KEY,
});

async function main() {
  const stream = await openai.chat.completions.create({
    model: "gpt-4o",
    messages: [
      { role: "user", content: "Tell me a short story" }
    ],
    stream: true,
  });

  for await (const chunk of stream) {
    const content = chunk.choices[0]?.delta?.content || "";
    process.stdout.write(content);
  }
}

main();
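If you also want token usage for streamed requests, OpenAI's API exposes it through stream_options, with the usage object arriving on the final chunk. A sketch, assuming Edgee forwards this option to the underlying provider:

import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "https://api.edgee.ai/v1",
  apiKey: process.env.EDGEE_API_KEY,
});

const stream = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [{ role: "user", content: "Tell me a short story" }],
  stream: true,
  stream_options: { include_usage: true }, // request usage on the final chunk
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || "");
  // The last chunk has an empty choices array and carries the usage object
  if (chunk.usage) {
    console.log(`\nTotal tokens: ${chunk.usage.total_tokens}`);
  }
}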

Migration from OpenAI

If you’re already using the OpenAI SDK, migrating to Edgee is straightforward:
  1. Change the base URL: Update baseURL from https://api.openai.com/v1 to https://api.edgee.ai/v1
  2. Update API key: Use your Edgee API key instead of your OpenAI key
  3. That’s it! Your existing code will work without any other changes
All OpenAI SDK features are supported, including streaming, function calling, and response formatting. Edgee maintains full compatibility with the OpenAI API specification.
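For example, the response_format parameter passes through as you would send it to OpenAI directly. A sketch of JSON mode, assuming the model your request is routed to supports it:

import OpenAI from "openai";

const openai = new OpenAI({
  baseURL: "https://api.edgee.ai/v1",
  apiKey: process.env.EDGEE_API_KEY,
});

const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
    { role: "system", content: "Reply with a JSON object containing 'city' and 'country'." },
    { role: "user", content: "Where is the Eiffel Tower?" },
  ],
  response_format: { type: "json_object" }, // JSON mode, forwarded unchanged
});

console.log(JSON.parse(completion.choices[0].message.content ?? "{}"));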

What’s Next?