
woutersf/ai-connection-bundle
=============================

Core AI connection plugin for Mautic - manages LiteLLM integration and shared AI services

v1.0.1 · GPL-3.0-or-later · PHP ^7.4|^8.0

[Source](https://github.com/woutersf/MauticAiConnectionBundle) · [Packagist](https://packagist.org/packages/woutersf/ai-connection-bundle)

Mautic AI Connection Bundle
===========================

[![Mautic AI Connection](Assets/img/mauticai.png)](Assets/img/mauticai.png)

A core AI connection plugin for Mautic that manages LiteLLM integration and provides centralized AI services for all Mautic AI-powered plugins.

Overview
--------

The Mautic AI Connection Bundle serves as the foundation for AI functionality in Mautic. It provides a centralized LiteLLM service that can be used by other AI-powered plugins such as:

- **Mautic AI Console** - AI-powered console interface with voice input
- **Mautic AI Reports** - AI-powered report generation
- **Mautic AI Eval** - AI evaluation features

Features
--------

- **Centralized AI Configuration** - Single source of truth for LiteLLM endpoint and credentials
- **LiteLLM Integration** - Connect to multiple AI providers (OpenAI, Anthropic Claude, Llama, etc.) through LiteLLM proxy
- **Shared Service Architecture** - Other plugins access AI capabilities through this bundle's service
- **Model Management** - Dynamically fetch available models from your LiteLLM instance
- **Secure Credential Storage** - API keys are encrypted and stored securely

Requirements
------------

- Mautic 4.0+ or Mautic 5.0+
- PHP 7.4 or 8.0+
- A running LiteLLM instance (proxy server)

Installation
------------

### Via Composer

```
composer require woutersf/ai-connection-bundle
```

### Manual Installation

1. Download or clone this repository
2. Place the `MauticAIconnectionBundle` folder in `docroot/plugins/`
3. Clear the Mautic cache:

    ```
    php bin/console cache:clear
    ```
4. Go to Mautic Settings → Plugins
5. Click "Install/Upgrade Plugins"
6. Find "Mautic AI Connection" and publish it

Configuration
-------------

Navigate to **Mautic Settings → Plugins → Mautic AI Connection** to configure the plugin.

[![AI Connection Settings](Assets/aiConnection.png)](Assets/aiConnection.png)

### Required Settings

1. **LiteLLM Endpoint**

    - URL of your LiteLLM proxy server (or another OpenAI-compatible API endpoint)
    - Example: `http://localhost:4000` or `https://your-litellm-server.com`
    - **Note:** This should point to your LiteLLM proxy, NOT directly to OpenAI or other providers
2. **LiteLLM Secret Key**

    - API key for authenticating with your LiteLLM instance (or your OpenAI key)
    - This credential is encrypted and stored securely

Usage in Other Plugins
----------------------

Other Mautic plugins can use the LiteLLM service provided by this bundle.

### Accessing the Service

```
// Get the service from the container
$liteLLMService = $this->container->get('mautic.ai_connection.service.litellm');
```

### Available Methods

#### 1. Chat Completion (with tools support)

```
$messages = [
    ['role' => 'system', 'content' => 'You are a helpful assistant.'],
    ['role' => 'user', 'content' => 'What is Mautic?'],
];

$options = [
    'model' => 'gpt-3.5-turbo',
    'temperature' => 0.7,
    'max_tokens' => 1000,
];

$response = $liteLLMService->getChatCompletion($messages, $options);
```
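The heading mentions tool support, but the example above passes no tools. Under the OpenAI-style schema that LiteLLM proxies, a tool definition could be supplied as sketched below; whether `getChatCompletion()` forwards a `tools` option unchanged is an assumption about this bundle's service, and the tool name is purely illustrative:

```
// Hypothetical sketch: assumes getChatCompletion() forwards an
// OpenAI-style "tools" array to LiteLLM's /chat/completions endpoint.
$options = [
    'model' => 'gpt-3.5-turbo',
    'tools' => [[
        'type'     => 'function',
        'function' => [
            'name'        => 'get_contact_count', // hypothetical tool name
            'description' => 'Return the number of contacts in Mautic',
            'parameters'  => ['type' => 'object', 'properties' => (object) []],
        ],
    ]],
];

$response = $liteLLMService->getChatCompletion($messages, $options);
```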

#### 2. Simple Completion

```
$response = $liteLLMService->getCompletion('Explain marketing automation in 50 words');
```

#### 3. Streaming Completion

```
$liteLLMService->streamCompletion('Write a blog post about email marketing', function($chunk) {
    echo $chunk;
});
```

#### 4. Speech-to-Text

```
$audioData = file_get_contents('recording.wav');
$transcription = $liteLLMService->speechToText($audioData, 'en', 'whisper-1');
```

#### 5. Get Available Models

```
$models = $liteLLMService->getAvailableModels();
// Returns: ['GPT-4' => 'gpt-4', 'Claude 3' => 'claude-3-sonnet', ...]
```
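Because the array is keyed label => value, it can be dropped straight into a Symfony form field. A minimal sketch (the `ai_model` field name is illustrative):

```
use Symfony\Component\Form\Extension\Core\Type\ChoiceType;

// Symfony's "choices" option expects label => value, which is
// exactly the shape getAvailableModels() returns.
$builder->add('ai_model', ChoiceType::class, [
    'label'   => 'AI Model',
    'choices' => $liteLLMService->getAvailableModels(),
]);
```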

### Subscribing to the Service in Controllers

```
use MauticPlugin\MauticAIconnectionBundle\Service\LiteLLMService;

class YourController extends CommonController
{
    public static function getSubscribedServices(): array
    {
        return array_merge(parent::getSubscribedServices(), [
            'mautic.ai_connection.service.litellm' => LiteLLMService::class,
        ]);
    }

    public function yourAction()
    {
        $liteLLMService = $this->container->get('mautic.ai_connection.service.litellm');
        // Use the service...
    }
}
```

Architecture
------------

This plugin follows a centralized service architecture:

```
┌─────────────────────────────────────┐
│   Mautic AI Connection Bundle       │
│  ┌───────────────────────────────┐  │
│  │   LiteLLM Service             │  │
│  │  - Chat Completions           │  │
│  │  - Streaming                  │  │
│  │  - Speech-to-Text             │  │
│  │  - Model Discovery            │  │
│  └───────────────────────────────┘  │
└─────────────────────────────────────┘
              ↑ ↑ ↑
              │ │ │
    ┌─────────┘ │ └─────────┐
    │           │           │
┌───┴────┐ ┌───┴────┐ ┌───┴────┐
│AI      │ │AI      │ │AI      │
│Console │ │Reports │ │Eval    │
│Bundle  │ │Bundle  │ │Bundle  │
└────────┘ └────────┘ └────────┘

```

Composer Dependency
-------------------

Other AI plugins should declare this bundle as a dependency in their `composer.json`:

```
{
    "require": {
        "woutersf/ai-connection-bundle": "^1.0"
    }
}
```

Security
--------

- API keys are encrypted using Mautic's encryption helper
- All requests use HTTPS when connecting to remote LiteLLM instances
- The service validates configuration before making API calls
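For plugin authors persisting their own credentials the same way, a minimal sketch using Mautic's `EncryptionHelper` (assuming the helper is injected; `$apiKey` and `$stored` are illustrative variables):

```
use Mautic\CoreBundle\Helper\EncryptionHelper;

// Encrypt before storing; decrypt only at the moment of the API call.
$stored = $encryptionHelper->encrypt($apiKey);
// ... persist $stored in plugin settings ...
$apiKey = $encryptionHelper->decrypt($stored);
```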

Troubleshooting
---------------

### "LiteLLM endpoint and secret key must be configured"

**Solution:** Configure the LiteLLM endpoint and secret key in the plugin settings.

### "404 Not Found" when making AI requests

**Issue:** The endpoint is pointing directly to OpenAI/Anthropic instead of LiteLLM proxy.

**Solution:** Ensure you're using your LiteLLM proxy URL (e.g., `http://localhost:4000`), not `https://api.openai.com`.

### Models not appearing in dropdown

**Issue:** LiteLLM instance is not reachable or not properly configured.

**Solution:**

1. Verify LiteLLM is running: `curl http://localhost:4000/models`
2. Check endpoint URL in plugin settings
3. Verify secret key is correct

Development
-----------

### Running Tests

```
php bin/phpunit --filter MauticAIconnectionBundle
```

### Code Style

Follow Mautic coding standards:

```
php bin/php-cs-fixer fix plugins/MauticAIconnectionBundle
```

Support
-------

- GitHub Issues: [Report an issue](https://github.com/woutersf/MauticAiConnectionBundle/issues)
- Mautic Community: [community.mautic.org](https://community.mautic.org)
- Documentation: [LiteLLM Docs](https://docs.litellm.ai/)

License
-------

GPL-3.0-or-later

Credits
-------

Created by Frederik Wouters

Version
-------

1.0.0

Changelog
---------

### 1.0.0 (2024)

- Initial release
- LiteLLM service integration
- Chat completion support
- Streaming support
- Speech-to-text support
- Model discovery
- Secure credential storage

