modelflow-ai/ollama - PHPackages

Active Library · [API Development](/categories/api)

modelflow-ai/ollama
===================

Client for ollama API.

Version 0.3.0 (1y ago) · 131.3k downloads (↓15.5%) · 34 stars · MIT · PHP ^8.2

Since Apr 7 · Pushed 3mo ago

[Source](https://github.com/modelflow-ai/ollama) · [Packagist](https://packagist.org/packages/modelflow-ai/ollama) · [RSS](/packages/modelflow-ai-ollama/feed) · Synced 1mo ago

README · Changelog · Dependencies (11) · Versions (9) · Used By (4)

![Ollama Logo](https://avatars.githubusercontent.com/u/152068817?s=768&v=4)

Modelflow AI Ollama
===================


Ollama is a PHP package that provides an easy-to-use client for the ollama API.

> **Note**: This is part of the `modelflow-ai` project. Please create issues in the [main repository](https://github.com/modelflow-ai/.github).

> **Note**: This project is under heavy development, and any feedback is greatly appreciated.

Installation
------------


To install the Ollama package, you need PHP 8.2 or higher and Composer installed on your machine. Then add the package to your project by running the following command:

```
composer require modelflow-ai/ollama
```

Examples
--------


Here are some examples of how you can use the Ollama package in your PHP applications. You can find more detailed examples in the [examples directory](examples).

Usage
-----


```
use ModelflowAi\Ollama\Ollama;

// Create a client instance
$client = Ollama::client();

// Use the client
$chat = $client->chat();
$completion = $client->completion();
$embeddings = $client->embeddings();

// Example usage of chat
$chatResponse = $chat->create([
    'model' => 'llama2',
    'messages' => [['role' => 'user', 'content' => 'Hello, world!']],
]);
echo $chatResponse->message->content;

// Example usage of completion
$completionResponse = $completion->create([
    'model' => 'llama2',
    'prompt' => 'Once upon a time',
]);
echo $completionResponse->response;

// Example usage of embeddings
$embeddingsResponse = $embeddings->create(['prompt' => 'Hello, world!']);
print_r($embeddingsResponse->embedding); // the embedding is an array of floats, so print_r() rather than echo
```

For more examples, see the [examples](examples) directory.

Testing & Code Quality
--------------------------


Run the tests and all the code quality tools with the following commands:

```
composer fix
composer lint
composer test
```

Open Points
-----------


### Model API


The Model API is another area that we are actively working on. Once completed, this will provide users with the ability to manage and interact with their AI models directly from the Ollama package.

Contributing
------------


Contributions are welcome. Please open an issue or submit a pull request in the [main repository](https://github.com/modelflow-ai/.github).

License
-------


This project is licensed under the MIT License. For the full copyright and license information, please view the LICENSE file that was distributed with this source code.

### Health Score

**42** · Fair · Better than 89% of packages

- **Maintenance** 68: Regular maintenance activity
- **Popularity** 28: Limited adoption so far
- **Community** 16: Small or concentrated contributor base
- **Maturity** 48: Maturing project, gaining track record
- **Bus Factor** 1: Top contributor holds 97.3% of commits, a single point of failure
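The bus-factor share can be checked against the commit counts in the contributor list on this page (36 and 1 commits). A minimal sketch, assuming the share is simply the top contributor's commits over total commits:

```php
<?php
// Commit counts taken from the Top Contributors list on this page.
$commits = [
    'wachterjohannes'   => 36,
    'alexander-schranz' => 1,
];

// Top contributor's share of all commits, as a percentage.
$topShare = max($commits) / array_sum($commits) * 100;

echo round($topShare, 1), "%\n"; // 36 / 37 ≈ 97.3%
```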

How is this calculated?

**Maintenance (25%)**: Last commit recency, latest release date, and issue-to-star ratio. Uses a 2-year decay window.

**Popularity (30%)**: Total and monthly downloads, GitHub stars, and forks. Logarithmic scaling prevents top-heavy scores.

**Community (15%)**: Contributors, dependents, forks, watchers, and maintainers. Measures real ecosystem engagement.

**Maturity (30%)**: Project age, version count, PHP version support, and release stability.
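The stated weights can be checked against the subscores shown above. A minimal sketch, assuming the overall score is a rounded weighted average of the four subscores (the exact rounding rule is an assumption; the page does not document it):

```php
<?php
// Subscores as shown on this page.
$subscores = [
    'maintenance' => 68,
    'popularity'  => 28,
    'community'   => 16,
    'maturity'    => 48,
];

// Stated weights: Maintenance 25%, Popularity 30%, Community 15%, Maturity 30%.
$weights = [
    'maintenance' => 0.25,
    'popularity'  => 0.30,
    'community'   => 0.15,
    'maturity'    => 0.30,
];

$overall = 0.0;
foreach ($subscores as $key => $score) {
    $overall += $weights[$key] * $score;
}
$overall = (int) round($overall);

echo $overall, "\n"; // 17 + 8.4 + 2.4 + 14.4 = 42.2 → 42
```

The result matches the Health Score of 42 shown above.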

### Release Activity

- Cadence: every ~94 days (recently every ~106 days)
- Total releases: 8
- Last release: 99d ago

### Community

Maintainers: [wachterjohannes](/maintainers/wachterjohannes)

---

Top Contributors: [wachterjohannes](https://github.com/wachterjohannes) (36 commits), [alexander-schranz](https://github.com/alexander-schranz) (1 commit)

---

Tags: api, client, ai, ollama

### Code Quality

- Tests: PHPUnit
- Static Analysis: PHPStan, Rector
- Type Coverage: Yes

### Embed Badge

![Health badge](/badges/modelflow-ai-ollama/health.svg)

```
[![Health](https://phpackages.com/badges/modelflow-ai-ollama/health.svg)](https://phpackages.com/packages/modelflow-ai-ollama)
```

### Alternatives

- [deepseek-php/deepseek-php-client](/packages/deepseek-php-deepseek-php-client): a robust, community-driven PHP client library for seamless integration with the Deepseek API, offering efficient access to advanced AI and data processing capabilities.
- [gemini-api-php/client](/packages/gemini-api-php-client): API client for Google's Gemini API.
- [chriskonnertz/deeply](/packages/chriskonnertz-deeply): DeepLy is a PHP client for the DeepL.com translation API.
- [gemini-api-php/laravel](/packages/gemini-api-php-laravel): Gemini API client for Laravel.
- [smnandre/pagespeed-api](/packages/smnandre-pagespeed-api): PageSpeed Insight PHP API client; analyse web pages for performance metrics, core web vitals...

PHPackages © 2026

