
LaraFlowAI
==========

[![Latest Version](https://camo.githubusercontent.com/0c27b6d1df5006a9a858905f6c3272de3e2006f5300a6a8f35a30f04f002bea9/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f76657273696f6e2d302e312e302d2d616c706861332d626c75652e737667)](https://packagist.org/packages/laraflowai/laraflowai)[![License](https://camo.githubusercontent.com/8bb50fd2278f18fc326bf71f6e88ca8f884f72f179d3e555e20ed30157190d0d/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f6c6963656e73652d4d49542d677265656e2e737667)](https://opensource.org/licenses/MIT)[![Laravel](https://camo.githubusercontent.com/73692ab0f1fac2901149539199fa738ac249d6cd2387048e8063666cfab3d736/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f4c61726176656c2d31302e7825323025374325323031312e7825323025374325323031322e782d7265642e737667)](https://laravel.com)[![PHP](https://camo.githubusercontent.com/fb7c72456e13f7d5ecf8486e29d02a2e6775aaf4d18622a63529976b0ed0740e/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f5048502d382e312532422d707572706c652e737667)](https://php.net)

A powerful Laravel package for building multi-agent AI workflows inspired by crewAI. Create intelligent agents, crews, and flows with support for multiple AI providers, advanced workflow management, and Model Context Protocol (MCP) integration.

✨ Features
----------

- 🤖 **Multi-Agent System**: Create intelligent agents with specific roles and goals
- 👥 **Crew Management**: Organize agents into collaborative teams with sequential or parallel execution
- 🔄 **Flow Control**: Build sophisticated workflows with conditional logic, steps, and events
- 🧠 **Memory System**: Short-term and long-term memory with intelligent recall and search
- 🔌 **Multi-Provider Support**: OpenAI, Anthropic, Grok, Gemini, DeepSeek, Groq, and Ollama
- 🛠️ **Extensible Tools**: HTTP, Database, Filesystem, MCP, and custom tool implementations
- 🌐 **MCP Integration**: Connect to external Model Context Protocol servers for extended capabilities
- ⚡ **Queue Integration**: Asynchronous execution with Laravel queues
- 📊 **Observability**: Comprehensive logging and performance analytics
- 💬 **Interactive Chat**: Built-in console chat interface for testing and development
- 🎯 **Artisan Commands**: Scaffold agents, crews, and flows from the console
- 🔄 **Streaming Support**: Real-time streaming responses with Server-Sent Events

🚀 Installation
--------------

### Requirements

- PHP 8.1 or higher
- Laravel 10.x, 11.x, or 12.x
- Composer

### Quick Install

```
# Install the package
composer require laraflowai/laraflowai

# Publish configuration and migrations
php artisan vendor:publish --provider="LaraFlowAI\LaraFlowAIServiceProvider"

# Run migrations
php artisan migrate
```

### Environment Setup

Add your API keys to your `.env` file:

```
# AI Provider API Keys
OPENAI_API_KEY=your_openai_api_key
ANTHROPIC_API_KEY=your_anthropic_api_key
GROK_API_KEY=your_grok_api_key
GEMINI_API_KEY=your_gemini_api_key
DEEPSEEK_API_KEY=your_deepseek_api_key
GROQ_API_KEY=your_groq_api_key

# Local AI (Ollama)
OLLAMA_HOST=http://localhost:11434
OLLAMA_MODEL=mistral

# LaraFlowAI Settings
LARAFLOWAI_DEFAULT_PROVIDER=openai

# Optional: Enable MCP integration
LARAFLOWAI_MCP_ENABLED=false

# Optional: Enable streaming
LARAFLOWAI_STREAMING_ENABLED=true

# Optional: Enable queue processing
LARAFLOWAI_QUEUE_ENABLED=false
```

🚀 Quick Start
-------------

### Basic Agent Usage

```
use LaraFlowAI\Facades\FlowAI;
use LaraFlowAI\Tools\HttpTool;

// Create an agent with fluent interface
$agent = FlowAI::agent(
    role: 'Content Writer',
    goal: 'Create engaging blog posts about Laravel',
    provider: 'openai' // Optional: defaults to configured provider
)
->addTool(new HttpTool())
->setContext(['style' => 'professional']);

// Create a task
$task = FlowAI::task('Write a blog post about Laravel 12 features');

// Handle the task
$response = $agent->handle($task);

// Access response
echo $response->getContent();
echo "Execution time: " . $response->getExecutionTime() . "s";
```

### Multiple Providers

```
// Use different providers for different tasks
$openaiAgent = FlowAI::agent('Writer', 'High-quality content', 'openai');
$grokAgent = FlowAI::agent('Writer', 'Creative and humorous content', 'grok');
$geminiAgent = FlowAI::agent('Writer', 'Google-powered insights', 'gemini');
$groqAgent = FlowAI::agent('Writer', 'Fast and efficient content', 'groq');

// Compare responses
$task = FlowAI::task('Explain Laravel 12 features');
$openaiResponse = $openaiAgent->handle($task);
$grokResponse = $grokAgent->handle($task);
$geminiResponse = $geminiAgent->handle($task);
$groqResponse = $groqAgent->handle($task);
```

### Crew Usage

```
use LaraFlowAI\Facades\FlowAI;

// Create specialized agents
$writer = FlowAI::agent('Content Writer', 'Write engaging content', 'openai');
$editor = FlowAI::agent('Editor', 'Review and improve content', 'grok');

// Create tasks
$tasks = [
    FlowAI::task('Write a comprehensive blog post about AI in web development'),
    FlowAI::task('Review and improve the blog post from the previous task. The blog post is provided in the context.'),
];

// Create crew with fluent interface
$crew = FlowAI::crew()
    ->addAgent($writer)
    ->addAgent($editor)
    ->addTasks($tasks);

// Execute crew
$result = $crew->execute();

if ($result->isSuccess()) {
    echo "Crew executed successfully!\n";
    echo "Total execution time: " . $result->getExecutionTime() . "s\n";

    foreach ($result->getResults() as $index => $taskResult) {
        echo "Task " . ($index + 1) . ":\n";
        echo $taskResult['response']->getContent() . "\n\n";
    }
}
```

### Memory Usage

```
use LaraFlowAI\Facades\FlowAI;

// Store information in memory
FlowAI::memory()->store('user_preferences', [
    'theme' => 'dark',
    'language' => 'en',
    'writing_style' => 'technical'
], 'long_term');

// Recall specific information
$preferences = FlowAI::memory()->recall('user_preferences');

// Search memory
$results = FlowAI::memory()->search('Laravel features', [
    'limit' => 10,
    'type' => 'long_term'
]);

// Store conversation context
FlowAI::memory()->store('conversation_context', [
    'topic' => 'Laravel 12',
    'user_questions' => ['What are the new features?', 'How to migrate?'],
    'agent_responses' => ['Feature list...', 'Migration guide...']
]);
```

### Using Tools

```
use LaraFlowAI\Tools\HttpTool;
use LaraFlowAI\Tools\DatabaseTool;
use LaraFlowAI\Tools\FilesystemTool;
use LaraFlowAI\Tools\MCPTool;

// Create a research agent with tools
$agent = FlowAI::agent('Research Assistant', 'Gather information from various sources')
    ->addTool(new HttpTool())
    ->addTool(new DatabaseTool())
    ->addTool(new FilesystemTool())
    ->addTool(new MCPTool());

// Create a task with tool inputs
$task = FlowAI::task('Research the latest Laravel features and create a summary')
    ->setToolInput('http', [
        'url' => 'https://laravel.com/news',
        'method' => 'GET',
        'headers' => ['User-Agent' => 'LaraFlowAI/1.0']
    ])
    ->setToolInput('database', [
        // Use standard single-quoted SQL string literals; double quotes are
        // only treated as string delimiters in non-ANSI MySQL modes.
        'query' => "SELECT * FROM articles WHERE category = 'laravel' ORDER BY created_at DESC LIMIT 10"
    ]);

$response = $agent->handle($task);

// Access tool results
$toolResults = $response->getToolResults();
foreach ($toolResults as $tool => $result) {
    echo "Tool {$tool}: " . $result['status'] . "\n";
}
```

### Streaming Responses

```
use LaraFlowAI\Facades\FlowAI;

// Create an agent with streaming support
$agent = FlowAI::agent('Content Writer', 'Create engaging content');

// Create a task
$task = FlowAI::task('Write a comprehensive guide about Laravel 12');

// Stream the response
$streamingResponse = $agent->stream($task, function ($chunk) {
    echo $chunk; // Output each chunk as it arrives
});

// Or get the complete response after streaming
$completeResponse = $streamingResponse->toResponse();
echo $completeResponse->getContent();
```

### MCP Integration

```
use LaraFlowAI\Tools\MCPTool;

// Create an agent with MCP tool
$agent = FlowAI::agent('MCP Assistant', 'Use external MCP servers')
    ->addTool(new MCPTool());

// Create a task that uses MCP
$task = FlowAI::task('Get weather information for New York')
    ->setToolInput('mcp', [
        'server' => 'weather_server',
        'action' => 'get_weather',
        'parameters' => ['location' => 'New York']
    ]);

$response = $agent->handle($task);
```

### Flow Usage

```
use LaraFlowAI\Facades\FlowAI;
use LaraFlowAI\FlowStep;
use LaraFlowAI\FlowCondition;

// Create a flow with multiple steps
$flow = FlowAI::flow()
    ->addStep(new FlowStep('Research Step', 'crew'))
    ->addStep(new FlowStep('Analysis Step', 'crew'))
    ->addStep(new FlowStep('Report Step', 'crew'))
    ->addCondition(new FlowCondition('if research_complete'));

// Add event handlers
$flow->onEvent('step_completed', function ($data) {
    echo "Step completed: " . $data['step']->getName() . "\n";
});

// Run the flow
$result = $flow->run();

if ($result->isSuccess()) {
    echo "Flow completed successfully!\n";
    echo "Total execution time: " . $result->getExecutionTime() . "s\n";

    foreach ($result->getResults() as $stepResult) {
        echo "Step: " . $stepResult['step']->getName() . "\n";
        echo "Success: " . ($stepResult['success'] ? 'Yes' : 'No') . "\n";
    }
}
```

### Custom Tools

```
use LaraFlowAI\Contracts\ToolContract;

class WeatherTool implements ToolContract
{
    public function getName(): string
    {
        return 'weather';
    }

    public function getDescription(): string
    {
        return 'Get current weather information for a location';
    }

    public function execute(array $inputs): array
    {
        $location = $inputs['location'] ?? 'New York';
        // Your weather API logic here
        return [
            'location' => $location,
            'temperature' => '72°F',
            'condition' => 'Sunny'
        ];
    }
}

// Register and use custom tool
$agent = FlowAI::agent('Weather Assistant', 'Provide weather information')
    ->addTool(new WeatherTool());

$task = FlowAI::task('What\'s the weather like in San Francisco?')
    ->setToolInput('weather', ['location' => 'San Francisco']);

$response = $agent->handle($task);
```

🔧 Configuration
---------------

### Provider Configuration

Each provider can be configured in `config/laraflowai.php`:

```
'providers' => [
    'openai' => [
        'driver' => \LaraFlowAI\Providers\OpenAIProvider::class,
        'api_key' => env('OPENAI_API_KEY'),
        'model' => env('OPENAI_MODEL', 'gpt-4'),
        'timeout' => 60,
    ],
    'grok' => [
        'driver' => \LaraFlowAI\Providers\GrokProvider::class,
        'api_key' => env('GROK_API_KEY'),
        'model' => env('GROK_MODEL', 'grok-4'),
        'timeout' => 120,
    ],
    'gemini' => [
        'driver' => \LaraFlowAI\Providers\GeminiProvider::class,
        'api_key' => env('GEMINI_API_KEY'),
        'model' => env('GEMINI_MODEL', 'gemini-1.5-flash'),
        'timeout' => 60,
    ],
    // ... other providers
],
```

### Available Providers

- **OpenAI**: GPT-4, GPT-3.5-turbo with chat and completion modes
- **Anthropic**: Claude models with chat mode
- **Grok**: Grok-4, Grok-3 with chat mode and humor
- **Gemini**: Google's Gemini models with chat mode
- **DeepSeek**: DeepSeek Chat and Reasoner models
- **Groq**: Fast inference engine for various models
- **Ollama**: Local models like Llama, Mistral, etc.
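
Because the provider is chosen per agent, local and hosted models can be mixed in a single workflow. A minimal sketch, assuming the package is installed and both the `ollama` and `openai` providers from the list above are configured:

```
use LaraFlowAI\Facades\FlowAI;

// Draft cheaply on a local Ollama model, then polish with a hosted model.
$draft = FlowAI::agent('Drafter', 'Produce a rough first draft', 'ollama')
    ->handle(FlowAI::task('Draft an intro paragraph about Laravel queues'));

$polished = FlowAI::agent('Editor', 'Tighten and correct the prose', 'openai')
    ->handle(FlowAI::task('Edit this draft: ' . $draft->getContent()));

echo $polished->getContent();
```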

🎯 Artisan Commands
------------------

LaraFlowAI includes several artisan commands for development and management:

```
# Interactive chat interface
php artisan laraflowai:chat

# Chat with specific agent/crew/flow
php artisan laraflowai:chat --agent=MyAgent
php artisan laraflowai:chat --crew=MyCrew
php artisan laraflowai:chat --flow=MyFlow

# Generate new classes
php artisan laraflowai:make:agent MyAgent --role="Content Writer" --goal="Create engaging content"
php artisan laraflowai:make:crew MyCrew --agents="Writer,Editor" --tasks="Write content,Review content"
php artisan laraflowai:make:flow MyFlow --steps="Step1,Step2" --conditions="Condition1"

# Memory management
php artisan laraflowai:cleanup-memory --days=30
php artisan laraflowai:cleanup-tokens --days=7

# View statistics
php artisan laraflowai:stats
```

🧪 Testing
---------

```
# Test a provider
php artisan laraflowai:test-provider openai

# View usage statistics
php artisan laraflowai:stats

# Clean up old data
php artisan laraflowai:cleanup-memory --days=30
```

📊 Performance
-------------

LaraFlowAI is optimized for performance:

- **Caching**: Intelligent caching of responses, memory, and MCP server data
- **Queue Integration**: Async processing for long-running tasks
- **Token Optimization**: Efficient token usage and cost tracking
- **Memory Management**: Smart memory cleanup and garbage collection
- **Streaming**: Real-time response streaming for better user experience
- **MCP Caching**: Cached tool and resource discovery for faster MCP operations
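
For long-running crews, the queue integration can also be combined with Laravel's own queue API. A minimal sketch, assuming the package is installed, `LARAFLOWAI_QUEUE_ENABLED=true`, and a queue connection plus worker are configured (the closure form of `dispatch()` is standard Laravel, not a LaraFlowAI API):

```
use LaraFlowAI\Facades\FlowAI;

// Build the crew synchronously, then run it on a queue worker.
$crew = FlowAI::crew()
    ->addAgent(FlowAI::agent('Writer', 'Draft release notes'))
    ->addTasks([FlowAI::task('Summarize the latest changes')]);

dispatch(function () use ($crew) {
    $result = $crew->execute();
    logger()->info('Crew finished', ['time' => $result->getExecutionTime()]);
});
```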

🔧 Troubleshooting
-----------------

### Common Issues

1. **Provider not found**: Check your API keys and provider configuration
2. **Memory issues**: Run `php artisan laraflowai:cleanup-memory`
3. **Queue not working**: Ensure a queue worker is running (`php artisan queue:work`) and `LARAFLOWAI_QUEUE_ENABLED=true` is set
4. **Token limits**: Check your provider's rate limits and quotas
5. **MCP connection issues**: Verify MCP server configuration and connectivity
6. **Streaming not working**: Check streaming configuration and provider support

### Debug Mode

```
# Enable debug logging
LARAFLOWAI_DEBUG=true

# View detailed logs
tail -f storage/logs/laraflowai.log
```

📚 Documentation
---------------

- **[API Documentation](docs/API.md)** - Complete API reference
- **[Artisan Commands](docs/ARTISAN_COMMANDS.md)** - Available console commands
- **[Streaming Guide](docs/STREAMING.md)** - Real-time streaming implementation
- **[Universal MCP Client](docs/UNIVERSAL_MCP_CLIENT.md)** - MCP integration guide
- **[Laravel Quick Start](docs/LARAVEL_QUICKSTART.md)** - 5-minute setup guide
- **[Laravel Usage Guide](docs/LARAVEL_USAGE.md)** - Comprehensive integration guide
- **[Examples](examples/)** - Real-world usage examples and patterns

🤝 Contributing
--------------

Contributions are welcome! Please feel free to submit a Pull Request.

### Development Setup

```
# Clone the repository
git clone https://github.com/laraflowai/laraflowai.git

# Install dependencies
composer install

# Run tests
composer test

# Run code quality checks
composer cs-fix
composer phpstan
```

📄 License
---------

This package is open-sourced software licensed under the [MIT license](https://opensource.org/licenses/MIT).

🆘 Support
---------

- **GitHub Issues**: [Report bugs and request features](https://github.com/laraflowai/laraflowai/issues)
- **Documentation**: [Comprehensive guides and examples](https://github.com/laraflowai/laraflowai/tree/main/docs)

🙏 Acknowledgments
-----------------

- Inspired by [crewAI](https://github.com/joaomdmoura/crewAI)
- Built for the Laravel community
- Powered by multiple AI providers
- Model Context Protocol (MCP) integration for extended capabilities

---

**Made with ❤️ for the Laravel community**
