
Ensemble

Unified interface for multiple LLM providers

A TypeScript library that simplifies and standardizes AI model interactions across multiple providers.
With Ensemble you get streaming responses, tool calling, and cost tracking—all in one unified API.

Streaming responses with AsyncGenerator API

import { ensembleRequest } from '@just-every/ensemble';

// Stream responses from any provider
const stream = ensembleRequest('claude-3-5-sonnet-20241022', [
  { type: 'message', role: 'user', content: 'Hello, world!' }
]);

for await (const event of stream) {
  // Process streaming events
  console.log(event);
}
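
A common next step is to fold the streamed events into a single reply string. The message_delta type and delta field below are assumptions for illustration only; check the package's exported event types for the real shape:

// Accumulate streamed text into one string (event field names are assumed, not documented here)
let reply = '';
const textStream = ensembleRequest('claude-3-5-sonnet-20241022', [
  { type: 'message', role: 'user', content: 'Write a haiku about TypeScript.' }
]);

for await (const event of textStream) {
  const e = event as any; // sketch only: avoids assuming the exported event types
  if (e.type === 'message_delta' && typeof e.delta === 'string') {
    reply += e.delta;
  }
}

console.log(reply);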

Multi-provider support

// Supports all major providers with the same API
const providers = [
  'claude-3-5-sonnet-20241022',  // Anthropic
  'gpt-4o',                      // OpenAI
  'gemini-pro',                  // Google
  'deepseek-chat',               // DeepSeek
  'grok-beta',                   // xAI
  // ... and OpenRouter models
];

// Switch providers seamlessly
const stream = ensembleRequest(providers[0], messages);
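
Because the call signature stays the same across providers, a simple fallback chain is easy to build on top. This is a minimal sketch, assuming a failing provider throws while its stream is being iterated:

// Try each model in order and return the events from the first one that completes
type ChatMessage = { type: 'message'; role: string; content: string };

async function requestWithFallback(models: string[], history: ChatMessage[]) {
  for (const model of models) {
    try {
      const events: unknown[] = [];
      for await (const event of ensembleRequest(model, history)) {
        events.push(event);
      }
      return events; // first provider that finishes wins
    } catch (error) {
      console.warn(`${model} failed, trying the next provider`, error);
    }
  }
  throw new Error('All providers failed');
}

const fallbackEvents = await requestWithFallback(providers, [
  { type: 'message', role: 'user', content: 'Hello, world!' }
]);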

Tool calling & conversation history

// Built-in tool calling support
const tools = [
  {
    name: 'get_weather',
    description: 'Get the current weather for a city',
    // JSON Schema parameters, the format provider function-calling APIs expect
    parameters: {
      type: 'object',
      properties: {
        city: { type: 'string', description: 'City name' }
      },
      required: ['city']
    }
  }
];

const stream = ensembleRequest('claude-3-5-sonnet-20241022', messages, { tools });

// Automatically converts streaming events to conversation history
// Supports early stream termination
// Flexible tool processing with custom logging
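
Early termination is plain AsyncGenerator behaviour: breaking out of the for await loop closes the stream. The tool_call event name in this sketch is an assumption, not the package's documented event type:

// Stop reading the stream as soon as the model asks for a tool (event name is assumed)
for await (const event of stream) {
  if ((event as any).type === 'tool_call') {
    console.log('Model requested a tool:', event);
    break; // exiting the loop closes the AsyncGenerator and ends the stream early
  }
}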

Key Features

Multi-provider support: Claude, OpenAI, Gemini, DeepSeek, Grok, and OpenRouter.
AsyncGenerator API: streaming responses with early termination support.
Tool calling: built-in function calling with flexible processing.
Cost & quota tracking: monitor usage and spending across providers (see the sketch below).
TypeScript native: full type safety and pluggable logging system.
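
As a rough illustration of the cost-tracking idea, per-request token counts could be tallied from the stream; the usage field and its property names here are placeholders rather than the package's actual API:

// Tally token usage reported during a stream (field names are placeholders)
let inputTokens = 0;
let outputTokens = 0;

for await (const event of ensembleRequest('gpt-4o', [
  { type: 'message', role: 'user', content: 'Explain AsyncGenerators briefly.' }
])) {
  const usage = (event as any).usage; // assumed field; see the package docs for the real shape
  if (usage) {
    inputTokens += usage.input_tokens ?? 0;
    outputTokens += usage.output_tokens ?? 0;
  }
}

console.log(`Used ~${inputTokens} input / ${outputTokens} output tokens`);
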
npm i @just-every/ensemble