
WP Ollama Model Provider
========================

[![Latest Stable Version](https://camo.githubusercontent.com/1c4c46f0f9ed9e63716cc5a6a8b1d5d54dd5baa8f95c361f8b7c3201f90e8da3/68747470733a2f2f706f7365722e707567782e6f72672f6a6f6e617468616e626f7373656e6765722f77702d6f6c6c616d612d6d6f64656c2d70726f76696465722f762f737461626c65)](https://packagist.org/packages/jonathanbossenger/wp-ollama-model-provider)[![Total Downloads](https://camo.githubusercontent.com/31e44a81fef913f347a4254b05af90a4681c442c54121f3f3c08e27d9b455a9f/68747470733a2f2f706f7365722e707567782e6f72672f6a6f6e617468616e626f7373656e6765722f77702d6f6c6c616d612d6d6f64656c2d70726f76696465722f646f776e6c6f616473)](https://packagist.org/packages/jonathanbossenger/wp-ollama-model-provider)[![License](https://camo.githubusercontent.com/f014eaef486740172e554657c4684cdb73fc78d1254aecd995d5e2eec3324664/68747470733a2f2f706f7365722e707567782e6f72672f6a6f6e617468616e626f7373656e6765722f77702d6f6c6c616d612d6d6f64656c2d70726f76696465722f6c6963656e7365)](https://packagist.org/packages/jonathanbossenger/wp-ollama-model-provider)

A WordPress plugin that provides local and cloud AI model support (Ollama) for the WordPress AI Client.

Description
-----------

WP Ollama Model Provider enables WordPress to use AI models through Ollama, supporting both local installations and Ollama Cloud. This plugin acts as a provider for the [WordPress AI Client](https://github.com/WordPress/wordpress-ai-client), making Ollama models accessible to any WordPress plugin that uses the AI Client.

### Features

- **Flexible Deployment**: Support for both local Ollama installations and Ollama Cloud
- **Local AI Models**: Run AI models locally with Ollama - no cloud API keys required
- **Cloud Integration**: Use Ollama Cloud for easy access without local installation
- **Automatic Model Detection**: Discovers all available Ollama models on your system or cloud account
- **Simple Configuration**: Easy settings page at Settings > Ollama AI Models
- **Model Selection**: Choose which Ollama model to use from a dropdown
- **Model Caching**: Efficient 5-minute cache for model discovery
- **Public API**: Other plugins can easily check for and use your selected model
- **Privacy-First**: Local mode ensures text generation happens entirely on your machine

### Supported Providers

- **Ollama** (current)
    - Local installation
    - Ollama Cloud
- Future: LocalAI, LM Studio, and other local providers

Requirements
------------

- **PHP**: 8.0 or higher
- **WordPress**: 6.0 or higher
- **Dependencies**:
    - `wordpress/wp-ai-client` ^0.2.1
    - For local mode: [Ollama](https://ollama.com) installed and running locally
    - For cloud mode: Ollama Cloud API key

Installation
------------

### Via Composer (Recommended)

```
# Install in your WordPress project
composer require jonathanbossenger/wp-ollama-model-provider
```

The plugin will be installed to `wp-content/plugins/wp-ollama-model-provider/` (or `web/app/plugins/` for Bedrock). Activate it through the WordPress admin.

**For Bedrock/Roots.io users:** The package will automatically install to the correct plugins directory.

**For traditional WordPress installations:** Run Composer from your WordPress root with an appropriate `installer-paths` mapping configured, or run it directly in `wp-content/plugins/`.
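
For the traditional-install case, Composer needs to be told where `wordpress-plugin` packages belong. A minimal `composer.json` sketch, assuming the standard `composer/installers` convention (the paths shown are WordPress defaults; adjust to your layout):

```json
{
    "require": {
        "composer/installers": "^2.0",
        "jonathanbossenger/wp-ollama-model-provider": "^1.0"
    },
    "extra": {
        "installer-paths": {
            "wp-content/plugins/{$name}/": ["type:wordpress-plugin"]
        }
    }
}
```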

### Manual Installation

1. Download the latest release from [GitHub Releases](https://github.com/jonathanbossenger/wp-ollama-model-provider/releases)
2. Upload to `/wp-content/plugins/wp-ollama-model-provider/`
3. Run `composer install --no-dev` in the plugin directory
4. Activate the plugin through the WordPress admin

### Setup Ollama

#### Local Setup

1. Install Ollama from [ollama.com](https://ollama.com)
2. Pull at least one model:

    ```
    ollama pull llama3.2
    ```

3. Verify Ollama is running:

    ```
    curl http://localhost:11434/api/tags
    ```

#### Ollama Cloud Setup

1. Sign up for Ollama Cloud at [ollama.com](https://ollama.com)
2. Get your API key from the Ollama Cloud dashboard

Configuration
-------------

### For Local Ollama:

1. Navigate to **Settings > Ollama AI Models** in WordPress admin
2. Select **Local** as the deployment mode
3. Select your preferred Ollama model from the dropdown
4. Click **Save Settings**

### For Ollama Cloud:

1. Navigate to **Settings > Ollama AI Models** in WordPress admin
2. Select **Ollama Cloud** as the deployment mode
3. Enter your Ollama Cloud API key
4. Select your preferred model from the dropdown
5. Click **Save Settings**

Your selected model is now available to all WordPress plugins that use the AI Client.

Usage
-----

### For End Users

Once configured, the plugin runs automatically in the background. Any WordPress plugin that uses the WordPress AI Client can detect and use your selected Ollama model.

### For Plugin Developers

#### Check for Selected Model

```
if ( function_exists( 'wp_ollama_model_provider_get_selected_model' ) ) {
    $selected_model = wp_ollama_model_provider_get_selected_model( 'ollama' );

    if ( ! empty( $selected_model ) ) {
        // Use the selected Ollama model
    }
}
```

#### Use the Selected Model

```
// Check for selected Ollama model from wp-ollama-model-provider
if ( function_exists( 'wp_ollama_model_provider_get_selected_model' ) ) {
    $selected_ollama_model = wp_ollama_model_provider_get_selected_model( 'ollama' );

    if ( ! empty( $selected_ollama_model ) &&
         class_exists( 'WpOllamaModelProvider\Providers\Ollama\OllamaProvider' ) ) {
        $model = \WpOllamaModelProvider\Providers\Ollama\OllamaProvider::model( $selected_ollama_model );
        $registry = \WordPress\AiClient\AiClient::defaultRegistry();
        $registry->bindModelDependencies( $model );

        return \WordPress\AI_Client\AI_Client::prompt( $prompt )
            ->using_model( $model )
            ->generate_text();
    }
}

// Fall back to automatic provider/model selection
return \WordPress\AI_Client\AI_Client::prompt( $prompt )->generate_text();
```

#### Public API Functions

**`wp_ollama_model_provider_is_provider_registered( $provider_slug )`**

- Check if a provider (e.g., 'ollama') is registered
- Returns: `bool`

**`wp_ollama_model_provider_has_settings_page()`**

- Check if the settings page is available
- Returns: `bool`

**`wp_ollama_model_provider_get_selected_model( $provider_slug )`**

- Get the selected model for a provider (e.g., 'ollama')
- Returns: `string` (model ID) or empty string if none selected
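
Taken together, these functions let a dependent plugin fail soft when this provider is absent. A minimal sketch (the `myplugin_get_ollama_model` wrapper is illustrative, not part of this plugin's API):

```php
<?php
/**
 * Illustrative wrapper: return the admin-selected Ollama model ID, or an
 * empty string when wp-ollama-model-provider is inactive or no model is selected.
 */
function myplugin_get_ollama_model(): string {
    if ( ! function_exists( 'wp_ollama_model_provider_is_provider_registered' )
        || ! wp_ollama_model_provider_is_provider_registered( 'ollama' ) ) {
        return ''; // Provider plugin not installed or not active.
    }

    // Returns the selected model ID, or an empty string if none is selected.
    return wp_ollama_model_provider_get_selected_model( 'ollama' );
}
```

Callers can then branch on the empty string and fall back to the AI Client's automatic provider selection.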

Filters
-------

[](#filters)

### `wp_ai_client_ollama_base_url`

Change the Ollama base URL (default: `http://localhost:11434`).

```
add_filter( 'wp_ai_client_ollama_base_url', function() {
    return 'http://192.168.1.100:11434'; // Remote Ollama server
} );
```

### `wp_ai_client_default_request_timeout`

Adjust timeout for slower models (default: varies by plugin).

```
add_filter( 'wp_ai_client_default_request_timeout', function() {
    return 120; // 2 minutes
} );
```

Documentation
-------------

- **[Quick Start Guide](docs/QUICK-START.md)** - Get up and running in 5 minutes
- **[Ollama Models Reference](docs/OLLAMA-MODELS.md)** - Model recommendations and comparisons
- **[Integration Guide](docs/OLLAMA-INTEGRATION.md)** - Technical integration details

Troubleshooting
---------------

### No models appearing in dropdown

1. Verify Ollama is running:

    ```
    ollama list
    ```
2. Pull a model if needed:

    ```
    ollama pull llama3.2
    ```
3. Use the "Refresh Model List" button on the settings page

### Cannot connect to Ollama

1. Check if Ollama is running:

    ```
    curl http://localhost:11434/api/tags
    ```
2. Start Ollama if needed:

    ```
    ollama serve
    ```

### Slow generation times

- Use a smaller model (e.g., `llama3.2:1b` instead of `llama2:13b`)
- Increase the timeout using the `wp_ai_client_default_request_timeout` filter
- See [OLLAMA-MODELS.md](docs/OLLAMA-MODELS.md) for model recommendations

### Enable debug logging

```
// In wp-config.php
define( 'WP_DEBUG', true );
define( 'WP_DEBUG_LOG', true );
define( 'WP_DEBUG_DISPLAY', false );
```

Check `wp-content/debug.log` for error messages.

Limitations
-----------

- **Text Generation Only**: Ollama only supports text generation, not image generation
- **Local Models**: Requires Ollama to be installed and running locally (or on your network)
- **Resource Intensive**: Larger models require significant RAM and CPU/GPU resources

Contributing
------------

Contributions are welcome! Please feel free to submit issues or pull requests.

### Development Setup

1. Clone the repository
2. Run `composer install`
3. Ensure Ollama is installed and running
4. Activate the plugin in a WordPress development environment

### Coding Standards

This plugin follows WordPress Coding Standards. Check your code:

```
vendor/bin/phpcs --standard=WordPress wp-ollama-model-provider.php includes/
```

License
-------

GPL-2.0-or-later

Support
-------

- **Issues**: [GitHub Issues](https://github.com/jonathanbossenger/wp-ollama-model-provider/issues)
- **Documentation**: See `docs/` directory
- **Ollama**: [https://ollama.com](https://ollama.com)

Credits
-------

Created by Jonathan Bossenger

Built on top of:

- [WordPress AI Client](https://github.com/WordPress/wordpress-ai-client)
- [Ollama](https://ollama.com)
