lewenbraun/ollama-php-sdk
=========================

PHP SDK for interacting with the Ollama API, a service for running and deploying large language models.

Version 1.0.2 · MIT License · Requires PHP >= 8.1

[Source](https://github.com/lewenbraun/ollama-php-sdk) · [Packagist](https://packagist.org/packages/lewenbraun/ollama-php-sdk)

Ollama PHP Client Library
-------------------------

A lightweight PHP client library to interact with the Ollama API. This library provides a simple, fluent interface to work with text and chat completions, manage models, handle blobs, and generate embeddings.

Installation
------------

Install the library via Composer:

```
composer require lewenbraun/ollama-php-sdk
```

Usage
-----

### Initialize the Client

Create a new client instance (default host is `http://localhost:11434`):

```
use Lewenbraun\Ollama\Ollama;

$client = Ollama::client('http://localhost:11434');
```

### Completion

Generate a text completion using a model. Under the hood, this wraps the `/api/generate` endpoint.

#### Non-Streaming Completion

Returns a single response object:

```
$completion = $client->completion()->create([
    'model'  => 'llama3.2',
    'prompt' => 'Hello',
    // 'stream' is false by default
]);

echo $completion->response;
```

#### Streaming Completion

Set `'stream' => true` to receive the response in chunks as they are generated:

```
$completion = $client->completion()->create([
    'model'  => 'llama3.2',
    'prompt' => 'Hello',
    'stream' => true,
]);

foreach ($completion as $chunk) {
    echo $chunk->response;
}
```

### Chat Completion

Generate chat responses using a conversational context. This wraps the `/api/chat` endpoint.

#### Non-Streaming Chat Completion

Returns the final chat response as a single object:

```
$chat = $client->chatCompletion()->create([
    'model'    => 'llama3.2',
    'messages' => [
        ['role' => 'user', 'content' => 'why is the sky blue?']
    ],
    'stream' => false,
]);

echo $chat->message->content;
```

#### Streaming Chat Completion

Stream the chat response as it is generated:

```
$chat = $client->chatCompletion()->create([
    'model'    => 'llama3.2',
    'messages' => [
        ['role' => 'user', 'content' => 'why is the sky blue?']
    ],
    'stream' => true,
]);

foreach ($chat as $chunk) {
    echo $chunk->message->content;
}
```
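Because each request is stateless, a multi-turn conversation is built by appending the assistant's reply (and the next user message) to the `messages` array before the following call. A minimal sketch, assuming the response shape shown above (`$chat->message->content`); the model name and prompts are illustrative:

```php
use Lewenbraun\Ollama\Ollama;

$client = Ollama::client('http://localhost:11434');

// Start the conversation history with the first user message.
$messages = [
    ['role' => 'user', 'content' => 'Why is the sky blue?'],
];

$chat = $client->chatCompletion()->create([
    'model'    => 'llama3.2',
    'messages' => $messages,
    'stream'   => false,
]);

// Append the assistant's reply, then the follow-up question,
// so the model sees the full conversation on the next call.
$messages[] = ['role' => 'assistant', 'content' => $chat->message->content];
$messages[] = ['role' => 'user', 'content' => 'How does that change at sunset?'];

$followUp = $client->chatCompletion()->create([
    'model'    => 'llama3.2',
    'messages' => $messages,
    'stream'   => false,
]);

echo $followUp->message->content;
```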

### Model Management

Manage models through various endpoints (create, list, show, copy, delete, pull, push).

#### Create a Model

Create a new model from an existing model (or other sources):

```
$newModel = $client->models()->create([
    'model'  => 'mario',
    'from'   => 'llama3.2',
    'system' => 'You are Mario from Super Mario Bros.',
]);
```

#### List Models

Retrieve a list of available models:

```
$modelList = $client->models()->list();
print_r($modelList);
```

#### List Running Models

List models currently loaded into memory:

```
$runningModels = $client->models()->listRunning();
print_r($runningModels);
```

#### Show Model Information

Retrieve details about a specific model:

```
$modelInfo = $client->models()->show(['model' => 'llama3.2']);
print_r($modelInfo);
```

#### Copy a Model

Create a copy of an existing model:

```
$copied = $client->models()->copy([
    'source'      => 'llama3.2',
    'destination' => 'llama3-backup',
]);
```

#### Delete a Model

Delete a model by name:

```
$deleted = $client->models()->delete(['model' => 'llama3:13b']);
```

#### Pull a Model

Download a model from the Ollama library:

```
$pull = $client->models()->pull(['model' => 'llama3.2']);
```

#### Push a Model

Upload a model to your model library:

```
$push = $client->models()->push(['model' => 'your_namespace/your_model:tag']);
```

### Blobs

Work with binary large objects (blobs) used for model files.

#### Check if a Blob Exists

Check whether a blob, identified by its SHA-256 digest, already exists on the server:

```
$exists = $client->blobs()->exists(['digest' => 'sha256:your_digest_here']);
```

#### Push a Blob

Upload a blob to the Ollama server:

```
$blobPushed = $client->blobs()->push([
    'digest' => 'sha256:your_digest_here',
    // Additional parameters as required
]);
```

### Embeddings

Generate embeddings from a model using the `/api/embed` endpoint.

```
$embedding = $client->embed()->create([
    'model' => 'all-minilm',
    'input' => 'Why is the sky blue?',
]);

print_r($embedding);
```

### Version

Retrieve the version of the Ollama server.

```
$version = $client->version()->show();
echo "Ollama version: " . $version->version;
```

### Advanced Parameters

Each method accepts additional parameters (such as `suffix`, `format`, `options`, and `keep_alive`) to fine-tune the request. For example, you can pass advanced options in the request arrays for structured outputs, reproducible responses, and more. For full details, refer to the **[Ollama API documentation](https://github.com/ollama/ollama/blob/main/docs/api.md)**.
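As a sketch, an `options` map and a `format` value can be passed alongside the usual fields. The specific keys shown here (`temperature`, `seed`, `num_predict`, `format`, `keep_alive`) come from the Ollama API documentation rather than this SDK's own docs, so treat them as illustrative:

```php
use Lewenbraun\Ollama\Ollama;

$client = Ollama::client('http://localhost:11434');

$completion = $client->completion()->create([
    'model'   => 'llama3.2',
    'prompt'  => 'List three primary colors as JSON.',
    'format'  => 'json',          // ask the model to emit valid JSON
    'options' => [
        'temperature' => 0,       // deterministic sampling
        'seed'        => 42,      // reproducible output for a given seed
        'num_predict' => 128,     // cap the number of generated tokens
    ],
    'keep_alive' => '5m',         // keep the model loaded for 5 minutes
]);

echo $completion->response;
```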

Contributing
------------

Contributions, issues, and feature requests are welcome. Please check the issues page for more details.

License
-------

This project is licensed under the MIT License.
