filaforge/filament-ollama-chat (v1.0.0, MIT, PHP ^8.1)

Filament v4 Ollama chat plugin (conversations, settings, models).

[Source](https://github.com/filaforge/filament-ollama-chat) · [Packagist](https://packagist.org/packages/filaforge/filament-ollama-chat)

Filaforge Ollama Chat
=====================

A powerful Filament plugin that integrates Ollama AI chat capabilities directly into your admin panel for local AI model interactions.

Features
--------

- **Ollama AI Integration**: Chat with local AI models powered by Ollama
- **Local Model Support**: Run AI models locally without external API calls
- **Conversation Management**: Save, organize, and continue chat conversations
- **Model Selection**: Choose from available Ollama models on your system
- **Customizable Settings**: Configure Ollama server, models, and chat parameters
- **Real-time Chat**: Live chat experience with streaming responses
- **Conversation History**: Keep track of all your AI conversations
- **Export Conversations**: Save and share chat transcripts
- **Role-based Access**: Configurable user permissions and access control
- **Context Awareness**: Maintain conversation context across sessions

Installation
------------

### 1. Install via Composer

```
composer require filaforge/ollama-chat
```

### 2. Publish &amp; Migrate

```
# Publish provider groups (config, views, migrations)
php artisan vendor:publish --provider="Filaforge\\OllamaChat\\Providers\\OllamaChatServiceProvider"

# Run migrations
php artisan migrate
```

### 3. Register Plugin

Add the plugin to your Filament panel provider:

```
use Filament\Panel;

public function panel(Panel $panel): Panel
{
    return $panel
        // ... other configuration
        ->plugin(\Filaforge\OllamaChat\Filament\OllamaChatPanelPlugin::make());
}
```

Setup
-----

### Prerequisites

Before using this plugin, you need to have Ollama installed and running on your system:

1. **Install Ollama**: Visit [ollama.ai](https://ollama.ai) and follow installation instructions
2. **Start Ollama Service**: Ensure Ollama is running on your system
3. **Download Models**: Pull the AI models you want to use
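
On Linux, those three steps can be sketched as follows (macOS and Windows use the installers from the website instead; the install-script URL below is Ollama's documented one):

```shell
# 1. Install Ollama via its Linux install script
curl -fsSL https://ollama.com/install.sh | sh

# 2. Start the server if it is not already running as a system service
ollama serve &

# 3. Download a model to chat with
ollama pull llama3

# Sanity check: the API should answer on the default port
curl -s http://localhost:11434/api/version
```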

### Configuration

Running the `vendor:publish` command from step 2 will:

- Publish the configuration file to `config/ollama-chat.php`
- Publish view files to `resources/views/vendor/ollama-chat/`
- Publish migration files to `database/migrations/`

The service provider registers the plugin's routes and middleware automatically at boot.

### Ollama Configuration

Configure your Ollama connection in the published config file:

```
// config/ollama-chat.php
return [
    'base_url' => env('OLLAMA_BASE_URL', 'http://localhost:11434'),
    'default_model' => env('OLLAMA_MODEL', 'llama3'),
    'max_tokens' => env('OLLAMA_MAX_TOKENS', 4096),
    'temperature' => env('OLLAMA_TEMPERATURE', 0.7),
    'stream' => env('OLLAMA_STREAM', true),
    'timeout' => env('OLLAMA_TIMEOUT', 120),
    'system_prompt' => env('OLLAMA_SYSTEM_PROMPT', 'You are a helpful assistant.'),
];
```

### Environment Variables

Add these to your `.env` file:

```
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama3
OLLAMA_MAX_TOKENS=4096
OLLAMA_TEMPERATURE=0.7
OLLAMA_STREAM=true
OLLAMA_TIMEOUT=120
OLLAMA_SYSTEM_PROMPT=You are a helpful assistant.
```

### Installing Ollama Models

Install the AI models you want to use:

```
# Install Llama 3
ollama pull llama3

# Install Code Llama
ollama pull codellama

# Install Mistral
ollama pull mistral

# List the models installed on your system
ollama list
```

Usage
-----

### Accessing Ollama Chat

1. Navigate to your Filament admin panel
2. Look for the "Ollama Chat" menu item
3. Start chatting with local AI models

### Starting a Conversation

1. **Select Model**: Choose from available Ollama models on your system
2. **Type Your Message**: Enter your question or prompt
3. **Send Message**: Submit your message to the local AI
4. **View Response**: See the AI's response in real-time
5. **Continue Chat**: Keep the conversation going

### Managing Conversations

1. **New Chat**: Start a fresh conversation
2. **Save Chat**: Automatically save important conversations
3. **Load Chat**: Resume previous conversations
4. **Export Chat**: Download conversation transcripts
5. **Delete Chat**: Remove unwanted conversations

### Advanced Features

- **Model Switching**: Switch between different Ollama models
- **Parameter Tuning**: Adjust temperature, max tokens, and other settings
- **Context Management**: Maintain conversation context across sessions
- **Streaming Responses**: Real-time AI responses for better user experience
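
The plugin's chat parameters correspond to options in Ollama's REST API. As a reference sketch (assuming `llama3` is pulled and Ollama is on its default port; the exact request the plugin sends is internal to it), an equivalent direct request looks like:

```shell
# JSON body: temperature and num_predict mirror the plugin's temperature / max_tokens settings
payload='{
  "model": "llama3",
  "messages": [{"role": "user", "content": "Summarize what a service provider does in Laravel."}],
  "stream": false,
  "options": {"temperature": 0.7, "num_predict": 256}
}'

# Send it to the local Ollama server
curl -s http://localhost:11434/api/chat -d "$payload"
```

With `"stream": true` (the plugin's default), the endpoint instead returns newline-delimited JSON chunks as tokens are generated, which is what makes real-time display possible.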

Troubleshooting
---------------

### Common Issues

- **Connection refused**: Ensure Ollama service is running on your system
- **Model not found**: Verify the model is installed with `ollama list`
- **Slow responses**: Check system resources and model size
- **Memory issues**: Ensure sufficient RAM for the selected model

### Debug Steps

1. Check the plugin configuration:

```
php artisan config:show ollama-chat
```

2. Verify routes are registered:

```
php artisan route:list | grep ollama-chat
```

3. Test Ollama connectivity:

```
# Check if Ollama is running
curl http://localhost:11434/api/tags

# List available models
ollama list
```

4. Check environment variables:

```
php artisan tinker
# at the tinker prompt, check the resolved value:
>>> env('OLLAMA_BASE_URL')
```

5. Clear caches:

```
php artisan optimize:clear
```

6. Check logs for errors:

```
tail -f storage/logs/laravel.log
```

### Ollama Service Issues

- **Service not starting**: Check Ollama installation and permissions
- **Port conflicts**: Ensure port 11434 is available
- **Model download failures**: Check internet connection and disk space
- **Performance issues**: Monitor system resources and model size

Security Considerations
-----------------------

### Access Control

- **Role-based permissions**: Restrict access to authorized users only
- **Local network security**: Ensure Ollama is not exposed to external networks
- **User isolation**: Ensure users can only access their own conversations
- **Audit logging**: Track all chat activities and model usage

### Best Practices

- Run Ollama on localhost only
- Implement proper user authentication
- Monitor system resources and model usage
- Regularly update Ollama and models
- Use appropriate firewall rules
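
For the first two points, Ollama's `OLLAMA_HOST` environment variable controls the listen address; binding it to loopback keeps the server unreachable from other machines. (Shown here for a manual `ollama serve`; systemd installs set this in the service unit's environment instead.)

```shell
# Config fragment: bind Ollama to localhost only
export OLLAMA_HOST=127.0.0.1:11434
ollama serve
```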

Performance Optimization
------------------------

### System Requirements

- **RAM**: Minimum 8 GB; 16 GB+ recommended for larger models
- **CPU**: Multi-core processor recommended
- **Storage**: SSD recommended for faster model loading
- **GPU**: Optional, but speeds up inference considerably

### Model Selection

- **Small models** (1-3B parameters): Fast, lower quality
- **Medium models** (7-13B parameters): Balanced performance
- **Large models** (30B+ parameters): High quality, slower responses
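
As a rough sizing rule of thumb (an approximation for 4-bit quantized models, not a guarantee): budget about 0.5 GB of RAM per billion parameters, plus roughly 1 GB of runtime and context overhead.

```shell
# Rough RAM estimate (GB) for a 4-bit quantized model:
# ~0.5 GB per billion parameters + ~1 GB overhead
estimate_ram_gb() {
  awk -v p="$1" 'BEGIN { printf "%.1f", p * 0.5 + 1 }'
}

estimate_ram_gb 7    # 7B model  -> 4.5
echo
estimate_ram_gb 13   # 13B model -> 7.5
echo
```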

Uninstall
---------

### 1. Remove Plugin Registration

Remove the plugin from your panel provider:

```
// remove ->plugin(\Filaforge\OllamaChat\Filament\OllamaChatPanelPlugin::make())
```

### 2. Roll Back Migrations (Optional)

```
php artisan migrate:rollback
# or roll back only this plugin's migrations if other migrations have run since
```

### 3. Remove Published Assets (Optional)

```
rm -f config/ollama-chat.php
rm -rf resources/views/vendor/ollama-chat
```

### 4. Remove Package and Clear Caches

```
composer remove filaforge/ollama-chat
php artisan optimize:clear
```

### 5. Clean Up Environment Variables

Remove these from your `.env` file:

```
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=llama3
OLLAMA_MAX_TOKENS=4096
OLLAMA_TEMPERATURE=0.7
OLLAMA_STREAM=true
OLLAMA_TIMEOUT=120
OLLAMA_SYSTEM_PROMPT=You are a helpful assistant.
```

### 6. Stop Ollama Service (Optional)

If you no longer need Ollama:

```
# Unload any model that is still running (`ollama stop` takes a model name)
ollama stop llama3

# On Linux installs managed by systemd, stop the service itself
sudo systemctl stop ollama

# Remove downloaded models (optional)
ollama rm llama3
ollama rm codellama
```

Support
-------

- **Documentation**: [GitHub Repository](https://github.com/filaforge/ollama-chat)
- **Issues**: [GitHub Issues](https://github.com/filaforge/ollama-chat/issues)
- **Discussions**: [GitHub Discussions](https://github.com/filaforge/ollama-chat/discussions)

Contributing
------------

We welcome contributions! Please see our [Contributing Guide](CONTRIBUTING.md) for details.

License
-------

This plugin is open-sourced software licensed under the [MIT license](LICENSE).

---

**Made with ❤️ by the Filaforge Team**
