livenetworks/ln-ai-bridge · PHPackages


Active · Library

livenetworks/ln-ai-bridge
=========================

AI provider abstraction layer for Laravel applications. Unified interface for Claude, OpenAI, and other AI services.

v1.0.0 (1mo ago) · MIT · PHP ^8.3

Since Mar 28 · Pushed 1mo ago

[Source](https://github.com/livenetworks/ln-ai-bridge) · [Packagist](https://packagist.org/packages/livenetworks/ln-ai-bridge) · [RSS](/packages/livenetworks-ln-ai-bridge/feed) · Default branch: main · Synced 1mo ago

Dependencies: 6 · Versions: 2 · Used by: 0

LN AI Bridge
============

AI provider abstraction layer for Laravel. Unified interface for Claude, OpenAI, and custom providers — direct Guzzle HTTP, no external SDKs.

Installation
------------

```
composer require livenetworks/ln-ai-bridge
```

```
php artisan vendor:publish --tag=ai-bridge-config
php artisan vendor:publish --tag=ai-bridge-migrations
php artisan migrate
```

Configuration
-------------

Add to your `.env`:

```
AI_BRIDGE_PROVIDER=claude
AI_BRIDGE_CLAUDE_API_KEY=your-claude-key
AI_BRIDGE_OPENAI_API_KEY=your-openai-key
```

Quick Start
-----------

```
use LiveNetworks\LnAiBridge\Facades\AiBridge;

$request = AiBridge::prompt()
    ->system('You are a helpful assistant.')
    ->context('customer_name', 'John')
    ->prompt('Write a greeting for the customer.')
    ->build();

$response = AiBridge::send($request);

echo $response->content;
```

Context pairs are injected as XML tags into the prompt:

```
<customer_name>John</customer_name>

Write a greeting for the customer.

```
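The injection step above can be sketched in plain PHP. This is an illustrative reconstruction, not the package's actual internals: it assumes each context key becomes an XML tag wrapping its value, followed by the prompt text.

```php
// Hypothetical sketch of context-pair injection. The real AiBridge
// implementation may differ; the tag name is assumed to match the key
// passed to ->context().
function buildContextPrompt(array $context, string $prompt): string
{
    $parts = [];
    foreach ($context as $key => $value) {
        // Wrap each context value in an XML tag named after its key.
        $parts[] = "<{$key}>{$value}</{$key}>";
    }
    $parts[] = $prompt;

    return implode("\n\n", $parts);
}

echo buildContextPrompt(
    ['customer_name' => 'John'],
    'Write a greeting for the customer.'
);
// <customer_name>John</customer_name>
//
// Write a greeting for the customer.
```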

Conversations
-------------

Multi-turn conversations with persistent history, managed via `ConversationManager`:

```
use LiveNetworks\LnAiBridge\Services\ConversationManager;

$manager = app(ConversationManager::class);

$conversation = $manager->startConversation(
    tenantId: 1,
    userId: 42,
    systemPrompt: 'You are a support agent.',
    title: 'Billing inquiry',
);

$response = $manager->sendMessage($conversation, 'I need help with my invoice.');
echo $response->content;

$response = $manager->sendMessage($conversation, 'Can you check order #12345?');
echo $response->content;
```

All messages are persisted to the database. History is automatically loaded and included with each request.
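A rough sketch of what "history is included with each request" implies, under the assumption that the provider receives an ordered role/content message list (the standard chat-API shape); the real payload building is internal to the package:

```php
// Illustrative only: fold persisted turns into the outgoing message list,
// appending the new user message last so the provider sees the full
// conversation on every call.
function buildMessagePayload(array $history, string $newMessage): array
{
    $messages = [];
    foreach ($history as $turn) {
        $messages[] = ['role' => $turn['role'], 'content' => $turn['content']];
    }
    $messages[] = ['role' => 'user', 'content' => $newMessage];

    return $messages;
}

$history = [
    ['role' => 'user',      'content' => 'I need help with my invoice.'],
    ['role' => 'assistant', 'content' => 'Sure, which invoice is it?'],
];

$payload = buildMessagePayload($history, 'Can you check order #12345?');
// 3 messages: two loaded from history plus the new user turn.
```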

Summarization
-------------

When a conversation exceeds the configured message threshold, older messages are automatically summarized via an AI call. The summary replaces the old messages in context, keeping the conversation within token limits.

Configure in `config/ai-bridge.php`:

```
'conversation' => [
    'summarize_threshold' => 20,  // Trigger after this many unsummarized messages
    'keep_recent'         => 6,   // Keep the last N messages unsummarized
    'summary_max_tokens'  => 500, // Max tokens for the summary response
],
```
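The windowing behaviour these settings imply can be sketched as follows. This is an assumed reading of `summarize_threshold` and `keep_recent`, not the package's actual code: once the unsummarized count exceeds the threshold, everything except the most recent `keep_recent` messages is handed to the summary call.

```php
// Assumed summarization-window logic based on the config above.
function splitForSummarization(array $messages, int $threshold, int $keepRecent): array
{
    // Under the threshold: nothing to summarize yet.
    if (count($messages) <= $threshold) {
        return ['to_summarize' => [], 'recent' => $messages];
    }

    return [
        // Older messages are condensed into a single summary...
        'to_summarize' => array_slice($messages, 0, count($messages) - $keepRecent),
        // ...while the last N stay in the context verbatim.
        'recent'       => array_slice($messages, -$keepRecent),
    ];
}

$messages = range(1, 25); // stand-ins for 25 message records
$split = splitForSummarization($messages, 20, 6);
// 19 messages go to the summary call; the last 6 stay verbatim.
```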

Usage Tracking
--------------

Every AI request is automatically logged to `ai_usage_log` with input/output token counts, provider, and model. Query aggregated usage per tenant or user:

```
use LiveNetworks\LnAiBridge\Services\UsageTracker;

$tracker = app(UsageTracker::class);

$usage = $tracker->getTenantUsage(
    tenantId: 1,
    from: now()->startOfMonth(),
);

// Returns: ['input_tokens' => ..., 'output_tokens' => ..., 'total_tokens' => ..., 'request_count' => ...]
```
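The aggregation behind that return shape is straightforward to sketch. This is a hypothetical stand-in for what `getTenantUsage()` computes over `ai_usage_log` rows, mirroring the documented array keys:

```php
// Hypothetical aggregation over ai_usage_log rows; mirrors the array
// shape getTenantUsage() is documented to return.
function aggregateUsage(array $rows): array
{
    $input = 0;
    $output = 0;
    foreach ($rows as $row) {
        $input  += $row['input_tokens'];
        $output += $row['output_tokens'];
    }

    return [
        'input_tokens'  => $input,
        'output_tokens' => $output,
        'total_tokens'  => $input + $output,
        'request_count' => count($rows),
    ];
}

$usage = aggregateUsage([
    ['input_tokens' => 120, 'output_tokens' => 340],
    ['input_tokens' => 80,  'output_tokens' => 210],
]);
// ['input_tokens' => 200, 'output_tokens' => 550,
//  'total_tokens' => 750, 'request_count' => 2]
```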

Disable tracking in `.env`:

```
AI_BRIDGE_USAGE_TRACKING=false
```

Adding Custom Providers
-----------------------

**1. Create the provider class** extending `AbstractProvider`:

```
use LiveNetworks\LnAiBridge\Providers\AbstractProvider;
use LiveNetworks\LnAiBridge\DTO\AiRequest;
use LiveNetworks\LnAiBridge\DTO\AiResponse;

class MistralProvider extends AbstractProvider
{
    public function name(): string { return 'mistral'; }
    public function model(): string { return $this->config['model']; }
    protected function endpoint(): string { return '/v1/chat/completions'; }
    protected function buildHeaders(): array { /* ... */ }
    protected function buildPayload(AiRequest $request): array { /* ... */ }
    protected function parseResponse(array $data): AiResponse { /* ... */ }
}
```

**2. Register it** in a service provider:

```
AiBridge::register('mistral', MistralProvider::class);
```

**3. Add config** to `config/ai-bridge.php` providers array:

```
'mistral' => [
    'api_key'  => env('AI_BRIDGE_MISTRAL_API_KEY'),
    'model'    => env('AI_BRIDGE_MISTRAL_MODEL', 'mistral-large-latest'),
    'base_url' => 'https://api.mistral.ai',
],
```
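For the `/* ... */` bodies in step 1, a hedged sketch of what `buildPayload()` and `parseResponse()` might contain for an OpenAI-compatible chat endpoint. Field names follow Mistral's public chat-completions API; the request/response shapes here are simplified plain-array stand-ins, not the package's real `AiRequest`/`AiResponse` DTOs:

```php
// Assumed payload shape for an OpenAI-compatible /v1/chat/completions call.
function buildPayload(string $model, array $messages, float $temperature, int $maxTokens): array
{
    return [
        'model'       => $model,
        'messages'    => $messages,
        'temperature' => $temperature,
        'max_tokens'  => $maxTokens,
    ];
}

// Pull the assistant text and token counts out of the provider's JSON.
function parseResponse(array $data): array
{
    return [
        'content'       => $data['choices'][0]['message']['content'] ?? '',
        'input_tokens'  => $data['usage']['prompt_tokens'] ?? 0,
        'output_tokens' => $data['usage']['completion_tokens'] ?? 0,
    ];
}
```

In the real provider class these would return the package's DTO types and read credentials from `$this->config`, per the skeleton above.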

Configuration Reference
-----------------------

| Key | Default | Description |
|---|---|---|
| `ai-bridge.default` | `claude` | Default provider |
| `ai-bridge.providers.claude.api_key` | — | Anthropic API key |
| `ai-bridge.providers.claude.model` | `claude-sonnet-4-20250514` | Claude model |
| `ai-bridge.providers.claude.base_url` | `https://api.anthropic.com` | Claude API base URL |
| `ai-bridge.providers.claude.version` | `2023-06-01` | Anthropic API version |
| `ai-bridge.providers.openai.api_key` | — | OpenAI API key |
| `ai-bridge.providers.openai.model` | `gpt-4o` | OpenAI model |
| `ai-bridge.providers.openai.base_url` | `https://api.openai.com` | OpenAI API base URL |
| `ai-bridge.logging` | `false` | Enable request/response logging |
| `ai-bridge.defaults.temperature` | `0.4` | Default temperature |
| `ai-bridge.defaults.max_tokens` | `8192` | Default max tokens |
| `ai-bridge.defaults.timeout` | `60` | HTTP timeout (seconds) |
| `ai-bridge.conversation.summarize_threshold` | `20` | Messages before auto-summarization |
| `ai-bridge.conversation.keep_recent` | `6` | Recent messages to keep unsummarized |
| `ai-bridge.conversation.summary_max_tokens` | `500` | Max tokens for summary |
| `ai-bridge.usage.tracking_enabled` | `true` | Enable usage tracking |

License
-------

MIT License — Live Networks DOOEL.

### Health Score

**38 / 100 (Low)** · Better than 85% of packages

- **Maintenance: 90** — Actively maintained with recent releases
- **Popularity: 2** — Limited adoption so far
- **Community: 6** — Small or concentrated contributor base
- **Maturity: 48** — Maturing project, gaining track record
- **Bus Factor: 1** — Top contributor holds 100% of commits (single point of failure)

How is this calculated?

**Maintenance (25%)** — Last commit recency, latest release date, and issue-to-star ratio. Uses a 2-year decay window.

**Popularity (30%)** — Total and monthly downloads, GitHub stars, and forks. Logarithmic scaling prevents top-heavy scores.

**Community (15%)** — Contributors, dependents, forks, watchers, and maintainers. Measures real ecosystem engagement.

**Maturity (30%)** — Project age, version count, PHP version support, and release stability.

### Release Activity

- Cadence: Unknown
- Total releases: 1
- Last release: 46d ago

### Community

Maintainers: [daliborsojic](/maintainers/daliborsojic)

Top contributors: [daliborsojic](https://github.com/daliborsojic) (6 commits)

Tags: laravel, ai, openai, Bridge, claude, llm

### Code Quality

Tests: PHPUnit

### Embed Badge

![Health badge](/badges/livenetworks-ln-ai-bridge/health.svg)

```
[![Health](https://phpackages.com/badges/livenetworks-ln-ai-bridge/health.svg)](https://phpackages.com/packages/livenetworks-ln-ai-bridge)
```

### Alternatives

- [sbsaga/toon](/packages/sbsaga-toon) — 🧠 TOON for Laravel: a compact, human-readable, and token-efficient data format for AI prompts & LLM contexts. Perfect for ChatGPT, Gemini, Claude, Mistral, and OpenAI integrations (JSON ⇄ TOON).
- [aedart/athenaeum](/packages/aedart-athenaeum) — Athenaeum is a mono repository; a collection of various PHP packages.

PHPackages © 2026
