omgbwa-yasse/aibridge
=====================
Laravel / Ai Bridge - A unified Laravel package for interacting with OpenAI, Ollama, Onn, Gemini, Grok, and Claude.

v2.6.0 · MIT · PHP >= 8.1

[Source](https://github.com/omgbwa-yasse/AiBridge) · [Packagist](https://packagist.org/packages/omgbwa-yasse/aibridge)

AiBridge
========


Unified Laravel package for interacting with multiple LLM APIs (OpenAI, Ollama, Gemini, Claude, Grok, etc.) with complete support for:

- 💬 **Conversational chat** with history
- 🌊 **Real-time streaming**
- 🔍 **Embeddings** for semantic search
- 🎨 **Image generation** (DALL-E, Stable Diffusion via Ollama)
- 🔊 **Audio** (Text-to-Speech and Speech-to-Text)
- 📋 **Structured output** (JSON mode with schema validation)
- 🛠️ **Function calling** (native and generic)
- 🎯 **Extensible tool system**
- 🔧 **Laravel Facade** `AiBridge` for simplified access

> ✅ **Status**: Stable - Consolidated API after fixes

### Mistral AI

Mistral AI provides an OpenAI-compatible API. AiBridge includes a dedicated `MistralProvider` that extends OpenAI compatibility with Mistral-specific endpoints.

Environment example:

```
MISTRAL_API_KEY=your-mistral-key
# Optional override (defaults to https://api.mistral.ai/v1/chat/completions)
# MISTRAL_ENDPOINT=https://api.mistral.ai/v1/chat/completions
```

Usage examples (PHP):

```
use AiBridge\Facades\AiBridge;

// Chat
$res = AiBridge::chat('mistral', [
    ['role' => 'user', 'content' => 'Explain quantum computing in simple terms']
], [ 'model' => 'mistral-small-latest' ]);
echo $res['choices'][0]['message']['content'] ?? '';

// Streaming
foreach (AiBridge::stream('mistral', [
    ['role' => 'user', 'content' => 'Write a haiku about AI']
], [ 'model' => 'mistral-medium-latest' ]) as $chunk) {
    echo $chunk;
}

// Embeddings
$emb = AiBridge::embeddings('mistral', [
    'hello world',
    'bonjour le monde'
], [ 'model' => 'mistral-embed' ]);
$vectors = $emb['embeddings'];
```

Supported models:

- `mistral-small-latest` - Fast and efficient for everyday tasks
- `mistral-medium-latest` - Balanced performance and capability
- `mistral-large-latest` - Most capable model for complex tasks
- `mistral-embed` - Embedding model for semantic search

Get your API key from the Mistral console.

Installation
------------

```
composer require omgbwa-yasse/aibridge
```

### Configuration


Publish the configuration file:

```
php artisan vendor:publish --provider="AiBridge\AiBridgeServiceProvider" --tag=config
```

### Environment Variables


Configure your API keys in `.env`:

```
# OpenAI
OPENAI_API_KEY=sk-...

# Other providers
OLLAMA_ENDPOINT=http://localhost:11434
GEMINI_API_KEY=...
CLAUDE_API_KEY=...
GROK_API_KEY=...
ONN_API_KEY=...
# Mistral AI
MISTRAL_API_KEY=...
# Optional override (defaults to https://api.mistral.ai/v1/chat/completions)
# MISTRAL_ENDPOINT=https://api.mistral.ai/v1/chat/completions
# OpenRouter
OPENROUTER_API_KEY=...
# Optional override (defaults to https://openrouter.ai/api/v1)
# OPENROUTER_BASE_URL=https://openrouter.ai/api/v1
# Optional app discovery headers
# OPENROUTER_REFERER=https://your-app.example.com
# OPENROUTER_TITLE=Your App Name

# Ollama Turbo (SaaS)
OLLAMA_TURBO_API_KEY=...
# Optional override (defaults to https://ollama.com)
# OLLAMA_TURBO_ENDPOINT=https://ollama.com

# Custom providers (Azure OpenAI, etc.)
OPENAI_CUSTOM_API_KEY=...
OPENAI_CUSTOM_BASE_URL=https://your-azure-openai.openai.azure.com
OPENAI_CUSTOM_AUTH_HEADER=api-key
OPENAI_CUSTOM_AUTH_PREFIX=

# HTTP Configuration
LLM_HTTP_TIMEOUT=30
LLM_HTTP_RETRY=1
LLM_HTTP_RETRY_SLEEP=200
```

Basic Usage
-----------


### Access via Laravel Container


Get the manager directly from the container:

```
$manager = app('AiBridge'); // AiBridge\AiBridgeManager instance
$resp = $manager->chat('openai', [
    ['role' => 'user', 'content' => 'Hello']
]);
```

Register a custom provider at runtime (advanced):

```
$manager->registerProvider('myprov', new MyProvider());
```

Or via dependency injection:

```
use AiBridge\AiBridgeManager;

class MyService
{
    public function __construct(private AiBridgeManager $ai) {}

    public function run(): array {
        return $this->ai->chat('openai', [
            ['role' => 'user', 'content' => 'Hello']
        ]);
    }
}
```

### Basic Chat with Facade


```
use AiBridge\Facades\AiBridge;

$res = AiBridge::chat('openai', [
    ['role' => 'user', 'content' => 'Hello, who are you?']
]);
$text = $res['choices'][0]['message']['content'] ?? '';
```

### Laravel Alias (Optional)


The `AiBridge` facade is available via auto-discovery. For a custom alias, add to `config/app.php`:

```
'aliases' => [
    // ...
    'AI' => AiBridge\Facades\AiBridge::class,
],
```

### Normalized Response


```
use AiBridge\Support\ChatNormalizer;

$raw = AiBridge::chat('openai', [
    ['role' => 'user', 'content' => 'Hello']
]);
$normalized = ChatNormalizer::normalize($raw);
echo $normalized['text'];
```

Advanced Features
-----------------


### Fluent text builder (v2.1+)


Prefer short, explicit methods instead of large option arrays when generating text:

```
use AiBridge\Facades\AiBridge;

$out = AiBridge::text()
    ->using('claude', 'claude-3-5-sonnet-20240620', [ 'api_key' => getenv('CLAUDE_API_KEY') ])
    ->withSystemPrompt('You are concise.')
    ->withPrompt('Explain gravity in one sentence.')
    ->withMaxTokens(64)
    ->usingTemperature(0.2)
    ->asText();

echo $out['text'];
```

- `using(provider, model, config)` sets the provider, model, and optional per-call config (`api_key`, `endpoint`, `base_url`, ...).
- `withPrompt` appends a user message; `withSystemPrompt` prepends a system message.
- `withMaxTokens`, `usingTemperature`, `usingTopP` control generation.
- `asText()` returns a normalized array with `text`, `raw`, `usage`, and `finish_reason`.
- `asRaw()` returns the raw provider payload; `asStream()` yields string chunks.

This complements the classic API and can reduce errors versus large option arrays.

### Streaming Output (builder)


Show model responses as they generate:

```
use AiBridge\Facades\AiBridge;

$stream = AiBridge::text()
    ->using('openai', 'gpt-4o', ['api_key' => getenv('OPENAI_API_KEY')])
    ->withPrompt('Tell me a short story about a brave knight.')
    ->asStream();

foreach ($stream as $chunk) {
    // $chunk is AiBridge\Support\StreamChunk
    echo $chunk->text;
    if (function_exists('ob_flush')) { @ob_flush(); }
    if (function_exists('flush')) { @flush(); }
}
```

Laravel controller (Server-Sent Events):

```
use Illuminate\Http\Response;
use AiBridge\Facades\AiBridge;

return response()->stream(function() {
    $stream = AiBridge::text()
        ->using('openai', 'gpt-4o', ['api_key' => env('OPENAI_API_KEY')])
        ->withPrompt('Explain quantum computing step by step.')
        ->asStream();
    foreach ($stream as $chunk) {
        echo $chunk->text;
        @ob_flush(); @flush();
    }
}, 200, [
    'Cache-Control' => 'no-cache',
    'Content-Type' => 'text/event-stream',
    'X-Accel-Buffering' => 'no',
]);
```

Laravel 12 Event Streams:

```
Route::get('/chat', function () {
    return response()->eventStream(function () {
        $stream = AiBridge::text()
            ->using('openai', 'gpt-4o', ['api_key' => env('OPENAI_API_KEY')])
            ->withPrompt('Explain quantum computing step by step.')
            ->asStream();
        foreach ($stream as $resp) { yield $resp->text; }
    });
});
```

Note: Packages that intercept Laravel HTTP client streams (e.g., Telescope) can consume the stream. Disable or exclude AiBridge requests for streaming endpoints.
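With Telescope specifically, the reliable fix is to disable its HTTP client watcher in `config/telescope.php` (an excerpt of Telescope's stock config; the watcher class and env key are Telescope's own):

```php
// config/telescope.php (excerpt)
'watchers' => [
    // Recording client requests buffers the HTTP client response body,
    // which consumes AiBridge's stream - disable it when streaming.
    Watchers\ClientRequestWatcher::class => env('TELESCOPE_CLIENT_REQUEST_WATCHER', false),
    // ... other watchers unchanged
],
```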

### Real-time Streaming


```
foreach (AiBridge::stream('openai', [
    ['role' => 'user', 'content' => 'Explain gravity in 3 points']
]) as $chunk) {
    echo $chunk; // flush to SSE client
}
```

Event-based streaming from the manager (delta/end events):

```
foreach (app('AiBridge')->streamEvents('openai', [
    ['role' => 'user', 'content' => 'Stream me a short answer']
]) as $evt) {
    if ($evt['type'] === 'delta') echo $evt['data'];
    if ($evt['type'] === 'end') break;
}
```

### Embeddings for Semantic Search


```
$result = AiBridge::embeddings('openai', [
    'First text to vectorize',
    'Second text to analyze'
]);
$vectors = $result['embeddings'];
```

Normalize embeddings across providers:

```
use AiBridge\Support\EmbeddingsNormalizer;

$raw = AiBridge::embeddings('openai', ['hello world']);
$norm = EmbeddingsNormalizer::normalize($raw);
$vectors = $norm['vectors'];
```

### Image Generation


```
$result = AiBridge::image('openai', 'An astronaut cat in space', [
    'size' => '1024x1024',
    'model' => 'dall-e-3',
    'quality' => 'hd'
]);
$imageUrl = $result['images'][0]['url'] ?? null;
```

Normalize images from any provider:

```
use AiBridge\Support\ImageNormalizer;

$raw = AiBridge::image('openai_custom', 'A watercolor elephant');
$images = ImageNormalizer::normalize($raw);
foreach ($images as $img) {
    if ($img['type'] === 'url') { echo $img['url']; }
    if ($img['type'] === 'b64') { file_put_contents('out.png', base64_decode($img['data'])); }
}
```

Facade convenience for normalizers:

```
// Images
$imgs = AiBridge::normalizeImages($rawImage);
// Audio TTS
$tts = AiBridge::normalizeTTSAudio($rawTTS);
// Audio STT
$stt = AiBridge::normalizeSTTAudio($rawSTT);
// Embeddings
$emb = AiBridge::normalizeEmbeddings($rawEmb);
```

### Audio Text-to-Speech


```
$result = AiBridge::tts('openai', 'Hello world', [
    'voice' => 'alloy',
    'model' => 'tts-1-hd'
]);
file_put_contents('output.mp3', base64_decode($result['audio']));
```

Normalize audio responses:

```
use AiBridge\Support\AudioNormalizer;

$raw = AiBridge::tts('openai', 'Hello world');
$audio = AudioNormalizer::normalizeTTS($raw);
file_put_contents('tts.mp3', base64_decode($audio['b64']));
```

### Audio Speech-to-Text


```
$result = AiBridge::stt('openai', storage_path('app/audio.wav'), [
    'model' => 'whisper-1'
]);
$transcription = $result['text'];
```

Structured Output (JSON Mode)
-----------------------------


### With Schema Validation


```
$res = AiBridge::chat('openai', [
    ['role' => 'user', 'content' => 'Give me person info in JSON format']
], [
    'response_format' => 'json',
    'json_schema' => [
        'name' => 'person_schema',
        'schema' => [
            'type' => 'object',
            'properties' => [
                'name' => ['type' => 'string'],
                'age' => ['type' => 'number'],
                'city' => ['type' => 'string']
            ],
            'required' => ['name', 'age']
        ]
    ]
]);

// Check validation
if ($res['schema_validation']['valid'] ?? false) {
    $person = json_decode($res['choices'][0]['message']['content'], true);
    echo "Name: " . $person['name'];
} else {
    $errors = $res['schema_validation']['errors'] ?? [];
    echo "Validation errors: " . implode(', ', $errors);
}
```
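For reference, the `required` check behind `schema_validation` behaves roughly like this standalone sketch (a simplification; the package's actual validator may cover more of JSON Schema):

```php
// Minimal required-fields check, mirroring the shape of the
// schema_validation result shown above.
function validateRequired(array $data, array $schema): array
{
    $errors = [];
    foreach ($schema['required'] ?? [] as $field) {
        if (!array_key_exists($field, $data)) {
            $errors[] = "Missing required field: {$field}";
        }
    }
    return ['valid' => $errors === [], 'errors' => $errors];
}

$check = validateRequired(
    ['name' => 'Ada'],                 // model output, decoded
    ['required' => ['name', 'age']]    // schema from the request
);
// $check['valid'] is false; $check['errors'] lists the missing 'age'
```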

### Simple JSON Mode (Ollama)


```
$res = AiBridge::chat('ollama', [
    ['role' => 'user', 'content' => 'List 3 African countries in JSON']
], [
    'response_format' => 'json',
    'model' => 'llama3.1'
]);
```

Function Calling
----------------


### OpenAI Native Function Calling


```
$tools = [
    [
        'name' => 'getWeather',
        'description' => 'Get weather for a city',
        'parameters' => [
            'type' => 'object',
            'properties' => [
                'city' => ['type' => 'string', 'description' => 'City name']
            ],
            'required' => ['city']
        ]
    ]
];

$resp = AiBridge::chat('openai', [
    ['role' => 'user', 'content' => 'What\'s the weather in Paris?']
], [
    'tools' => $tools,
    'tool_choice' => 'auto'
]);

if (!empty($resp['tool_calls'])) {
    foreach ($resp['tool_calls'] as $call) {
        $functionName = $call['name'];
        $arguments = $call['arguments'];
        // Execute function...
    }
}
```

### Generic Tools System


Create a custom tool:

```
use AiBridge\Contracts\ToolContract;

class WeatherTool implements ToolContract
{
    public function name(): string {
        return 'get_weather';
    }

    public function description(): string {
        return 'Get current weather for a city';
    }

    public function schema(): array {
        return [
            'type' => 'object',
            'properties' => [
                'city' => ['type' => 'string']
            ],
            'required' => ['city']
        ];
    }

    public function execute(array $arguments): string {
        $city = $arguments['city'] ?? 'Paris';
        // Weather API call...
        return json_encode(['city' => $city, 'temp' => '22°C']);
    }
}
```

Register and use the tool:

```
$manager = app('AiBridge');
$manager->registerTool(new WeatherTool());

$result = $manager->chatWithTools('ollama', [
    ['role' => 'user', 'content' => 'What\'s the weather in Lyon?']
], [
    'model' => 'llama3.1',
    'max_tool_iterations' => 3
]);

echo $result['final']['message']['content'];
// Tool call history in $result['tool_calls']
```

Supported Providers
-------------------


| Provider | Chat | Stream | Embeddings | Images | Audio (TTS) | Audio (STT) | Tools |
| --- | --- | --- | --- | --- | --- | --- | --- |
| **OpenAI** | ✅ | ✅ | ✅ | ✅ (DALL-E) | ✅ | ✅ | ✅ Native |
| **Ollama** | ✅ | ✅ | ✅ | ✅ (SD) | ❌ | ❌ | ✅ Generic |
| **Ollama Turbo** | ✅ | ✅ | ✅ | ✅ (SD) | ❌ | ❌ | ✅ Generic |
| **Mistral** | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ Native |
| **Gemini** | ✅ | ✅ | ✅ | ❌ | ❌ | ❌ | ✅ Generic |
| **Claude** | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ✅ Generic |
| **Grok** | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ✅ Generic |
| **OpenRouter** | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ Native (OpenAI-compatible) |
| **ONN** | ✅ | ✅ (simulated) | ❌ | ❌ | ❌ | ❌ | ❌ |
| **Custom OpenAI** | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ | ✅ Native |

Advanced Configuration
----------------------


### Timeouts and Retry


```
# HTTP request timeout (seconds)
LLM_HTTP_TIMEOUT=30

# Number of retry attempts on failure
LLM_HTTP_RETRY=2

# Delay between retries (ms)
LLM_HTTP_RETRY_SLEEP=200
```

### File Security


```
# Maximum file size (bytes)
LLM_MAX_FILE_BYTES=2097152

# Allowed MIME types for files
# (configured in config/aibridge.php)
```
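Before handing a file to `stt()` or similar endpoints, you can mirror these limits client-side. A hypothetical pre-check helper (not a package API; `mime_content_type` requires the `fileinfo` extension):

```php
// Pre-validate a file against the configured limits before upload.
function isUploadable(string $path, int $maxBytes, array $allowedMimes): bool
{
    if (!is_file($path) || filesize($path) > $maxBytes) {
        return false;
    }
    return in_array(mime_content_type($path), $allowedMimes, true);
}
```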

### Custom Provider (Azure OpenAI)


```
OPENAI_CUSTOM_API_KEY=your-azure-key
OPENAI_CUSTOM_BASE_URL=https://your-resource.openai.azure.com
OPENAI_CUSTOM_AUTH_HEADER=api-key
OPENAI_CUSTOM_AUTH_PREFIX=
```

### Ollama via OpenAI-compatible API

Ollama exposes an experimental, OpenAI-compatible API. You can use AiBridge's "Custom OpenAI" provider to call Ollama with OpenAI-shaped requests (chat/completions, streaming, embeddings, vision as content parts).

Environment example:

```
# Ollama OpenAI compatibility
OPENAI_CUSTOM_API_KEY=ollama              # required by client but ignored by Ollama
OPENAI_CUSTOM_BASE_URL=http://localhost:11434/v1
# The default paths already match Ollama's OpenAI-compat endpoints:
#   /v1/chat/completions, /v1/embeddings, /v1/images/generations, etc.
# Keep defaults unless you run a proxy.
```

Usage example (PHP):

```
use AiBridge\AiBridgeManager;

$ai = new AiBridgeManager([
    'openai_custom' => [
        'api_key' => 'ollama',
        'base_url' => 'http://localhost:11434/v1',
        'paths' => [
            'chat' => '/v1/chat/completions',
            'embeddings' => '/v1/embeddings',
        ],
    ],
    'options' => [ 'default_timeout' => 30 ],
]);

// Chat
$resp = $ai->chat('openai_custom', [
    ['role' => 'user', 'content' => 'Say this is a test'],
], [ 'model' => 'llama3.2' ]);
echo $resp['choices'][0]['message']['content'] ?? '';

// Streaming
foreach ($ai->stream('openai_custom', [
    ['role' => 'user', 'content' => 'Explain gravity in one paragraph.'],
], [ 'model' => 'llama3.2' ]) as $chunk) {
    echo $chunk;
}

// Embeddings
$emb = $ai->embeddings('openai_custom', [
    'why is the sky blue?',
    'why is the grass green?',
], [ 'model' => 'all-minilm' ]);
$vectors = $emb['embeddings'];
```

Notes:

- Ollama supports base64 image content parts in chat messages (OpenAI-style). Provide an array of content parts with a data URL if needed.
- Not all OpenAI fields are supported (e.g., `tool_choice`, `logprobs`). See the Ollama docs for the current support matrix.

#### Vision (image content parts)


```
$imageB64 = base64_encode(file_get_contents('example.png'));
$messages = [
    [
        'role' => 'user',
        'content' => [
            [ 'type' => 'text', 'text' => "What's in this image?" ],
            [ 'type' => 'image_url', 'image_url' => 'data:image/png;base64,' . $imageB64 ],
        ],
    ],
];
$resp = $ai->chat('openai_custom', $messages, [ 'model' => 'llava' ]);
echo $resp['choices'][0]['message']['content'] ?? '';
```

#### Troubleshooting Ollama (OpenAI-compat)

- Ensure Ollama is running and exposing its OpenAI-compatible endpoints under `/v1` (by default at `http://localhost:11434/v1`).
- Use an arbitrary API key (e.g., "ollama"): some clients require a token header even if the server ignores it.
- If you see a 404 on `/v1/models`, set `paths` in the config to match your proxy or version.

### OpenRouter (OpenAI-compatible)

OpenRouter exposes an OpenAI-compatible API and is pre-wired in AiBridge via a `CustomOpenAIProvider`.

Environment example:

```
OPENROUTER_API_KEY=your-key
# Optional
# OPENROUTER_BASE_URL=https://openrouter.ai/api/v1
# OPENROUTER_REFERER=https://your-app.example.com
# OPENROUTER_TITLE=Your App Name
```

Usage examples (PHP):

```
use AiBridge\Facades\AiBridge;

// Chat
$res = AiBridge::chat('openrouter', [
    ['role' => 'user', 'content' => 'Give me a one-liner joke']
], [ 'model' => 'openai/gpt-4o-mini' ]);
echo $res['choices'][0]['message']['content'] ?? '';

// Streaming
foreach (AiBridge::stream('openrouter', [
    ['role' => 'user', 'content' => 'Stream a haiku about the sea']
], [ 'model' => 'meta-llama/llama-3.1-8b-instruct' ]) as $chunk) {
    echo $chunk;
}

// Embeddings
$emb = AiBridge::embeddings('openrouter', [
    'hello world',
    'bonjour le monde'
], [ 'model' => 'text-embedding-3-small' ]);
$vectors = $emb['embeddings'];

// Images (if the routed model supports it)
$img = AiBridge::image('openrouter', 'A watercolor fox in the forest', [
    'model' => 'openai/dall-e-3'
]);

// Audio (TTS/STT) if available through OpenRouter for your chosen model
$tts = AiBridge::tts('openrouter', 'Hello from OpenRouter', [ 'model' => 'openai/tts-1', 'voice' => 'alloy' ]);
```

Notes:

- Model IDs and capabilities depend on OpenRouter routing. Choose models accordingly.
- The Referer/Title headers are optional but recommended to surface your app in OpenRouter’s ecosystem.

### Models (list/retrieve) with OpenAI-compatible endpoints


```
// List models from an OpenAI-compatible base URL (e.g., Ollama /v1)
$models = $ai->models('openai_custom');
foreach (($models['data'] ?? []) as $m) {
    echo $m['id'] . PHP_EOL;
}

// Retrieve a single model
$model = $ai->model('openai_custom', 'llama3.2');
print_r($model);
```

Also works with built-in providers that speak the OpenAI schema, e.g. `openrouter` and `openai`.

### Streaming events (OpenAI)


```
use AiBridge\Providers\OpenAIProvider;

$prov = new OpenAIProvider(env('OPENAI_API_KEY'));
foreach ($prov->streamEvents([
    ['role' => 'user', 'content' => 'Stream me a short answer.']
], [ 'model' => 'gpt-4o-mini' ]) as $evt) {
    if ($evt['type'] === 'delta') { echo $evt['data']; }
    if ($evt['type'] === 'end') { echo "\n[done]\n"; }
}
```

ONN Provider
------------


Basic chat support with optional simulated streaming.

Environment:

```
ONN_API_KEY=your-onn-key
```

Usage:

```
use AiBridge\Facades\AiBridge;

$res = AiBridge::chat('onn', [
    ['role' => 'user', 'content' => 'Say hello']
]);
echo $res['response'] ?? '';

foreach (AiBridge::stream('onn', [
    ['role' => 'user', 'content' => 'Stream a short sentence']
]) as $chunk) {
    echo $chunk;
}
```

Practical Examples
------------------


### Conversational Assistant with History


```
class ChatbotService
{
    private array $conversation = [];

    public function __construct(private AiBridgeManager $ai) {}

    public function chat(string $userMessage): string
    {
        $this->conversation[] = ['role' => 'user', 'content' => $userMessage];

        $response = $this->ai->chat('openai', $this->conversation, [
            'model' => 'gpt-4',
            'temperature' => 0.7
        ]);

        $assistantMessage = $response['choices'][0]['message']['content'];
        $this->conversation[] = ['role' => 'assistant', 'content' => $assistantMessage];

        return $assistantMessage;
    }
}
```
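The `$conversation` array above grows without bound. A minimal trimming helper (hypothetical, not part of the package) that caps history while preserving system messages:

```php
// Keep only the last $max non-system messages so the request stays
// within the model's context window; system messages are preserved.
function trimHistory(array $messages, int $max): array
{
    $system = array_values(array_filter($messages, fn($m) => $m['role'] === 'system'));
    $rest = array_values(array_filter($messages, fn($m) => $m['role'] !== 'system'));
    return array_merge($system, array_slice($rest, -$max));
}
```

Call it on `$this->conversation` just before passing the messages to `chat()`.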

### Semantic Search with Embeddings


```
class SemanticSearch
{
    public function __construct(private AiBridgeManager $ai) {}

    public function search(string $query, array $documents): array
    {
        // Vectorize query and documents
        $inputs = [$query, ...$documents];
        $result = $this->ai->embeddings('openai', $inputs);

        $queryVector = $result['embeddings'][0];
        $docVectors = array_slice($result['embeddings'], 1);

        // Calculate cosine similarity
        $similarities = [];
        foreach ($docVectors as $i => $docVector) {
            $similarities[$i] = $this->cosineSimilarity($queryVector, $docVector);
        }

        // Sort by relevance
        arsort($similarities);

        return array_map(fn($i) => [
            'document' => $documents[$i],
            'score' => $similarities[$i]
        ], array_keys($similarities));
    }

    private function cosineSimilarity(array $a, array $b): float
    {
        $dotProduct = array_sum(array_map(fn($x, $y) => $x * $y, $a, $b));
        $normA = sqrt(array_sum(array_map(fn($x) => $x * $x, $a)));
        $normB = sqrt(array_sum(array_map(fn($x) => $x * $x, $b)));

        return $dotProduct / ($normA * $normB);
    }
}
```
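As a sanity check on the math, the same cosine formula can be exercised standalone with toy vectors:

```php
// Standalone copy of the private cosineSimilarity() method above.
function cosine(array $a, array $b): float
{
    $dot = array_sum(array_map(fn($x, $y) => $x * $y, $a, $b));
    $normA = sqrt(array_sum(array_map(fn($x) => $x * $x, $a)));
    $normB = sqrt(array_sum(array_map(fn($x) => $x * $x, $b)));
    return $dot / ($normA * $normB);
}

echo cosine([1.0, 0.0], [1.0, 0.0]); // 1 (identical direction)
echo cosine([1.0, 0.0], [0.0, 1.0]); // 0 (orthogonal)
```

Note that the formula divides by the norm product, so all-zero embedding vectors would need guarding in production code.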

### Streaming for Real-time Interface


```
Route::get('/chat-stream', function (Request $request) {
    $message = $request->input('message');

    return response()->stream(function () use ($message) {
        $manager = app('AiBridge');

        foreach ($manager->stream('openai', [
            ['role' => 'user', 'content' => $message]
        ]) as $chunk) {
            echo "data: " . json_encode(['chunk' => $chunk]) . "\n\n";
            ob_flush();
            flush();
        }

        echo "data: [DONE]\n\n";
    }, 200, [
        'Content-Type' => 'text/plain',
        'Cache-Control' => 'no-cache',
        'X-Accel-Buffering' => 'no'
    ]);
});
```

Testing
-------


Run the test suite:

```
composer test
```

Or via PHPUnit directly:

```
./vendor/bin/phpunit
```

Development
-----------


### Contributing


1. Fork the project
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request

### Roadmap


- Native Claude Function Calling support
- Automatic embeddings caching
- Additional providers (Cohere, Hugging Face)
- Web administration interface
- Integrated metrics and monitoring
- Advanced multimodal support (vision, audio)

License
-------


This package is open source under the [MIT](LICENSE) license.

Disclaimer
----------


This package is not officially affiliated with OpenAI, Anthropic, Google, or other mentioned providers. Please respect their respective terms of service.

Support
-------


- 📖 [Complete Documentation](https://github.com/omgbwa-yasse/AiBridge/wiki)
- 🐛 [Report a Bug](https://github.com/omgbwa-yasse/AiBridge/issues)
- 💬 [Discussions](https://github.com/omgbwa-yasse/AiBridge/discussions)
- ⭐ Don't forget to star the project if it helps you!

Per-call overrides (v2.0+)
--------------------------


You can now pass provider credentials and endpoints directly on each call, without editing config:

- OpenAI: `api_key`, optional `chat_endpoint`
- Ollama: `endpoint`
- Ollama Turbo: `api_key`, optional `endpoint`
- Claude/Grok/ONN/Gemini: `api_key`, optional `endpoint`
- Custom OpenAI-compatible: `api_key`, `base_url`, optional `paths`, `auth_header`, `auth_prefix`, `extra_headers`

Examples:

```
$res = app('AiBridge')->chat('ollama', $messages, [
    'endpoint' => 'http://localhost:11434',
    'model' => 'llama3',
]);

$res = app('AiBridge')->chat('openai', $messages, [
    'api_key' => getenv('OPENAI_API_KEY'),
    'chat_endpoint' => 'https://api.openai.com/v1/chat/completions',
]);

$res = app('AiBridge')->chat('openai_custom', $messages, [
    'api_key' => 'ollama', // for Ollama OpenAI-compatible mode
    'base_url' => 'http://localhost:11434/v1',
    'paths' => [ 'chat' => '/chat/completions' ],
]);
```

See `CHANGELOG.md` for details.

