
futuremeng/ollama-laravel
=========================

This is my package ollama-laravel

v1.1.6 (released 1y ago) · MIT · PHP ^8.2 · since Mar 27 · pushed 1y ago

[Source](https://github.com/futuremeng/ollama-laravel) · [Packagist](https://packagist.org/packages/futuremeng/ollama-laravel)

Ollama-Laravel Package
======================

Ollama-Laravel is a Laravel package that provides a seamless integration with the [Ollama API](https://github.com/jmorganca/ollama). It includes functionalities for model management, prompt generation, format setting, and more. This package is perfect for developers looking to leverage the power of the Ollama API in their Laravel applications.

If you are on Laravel 10.x, use version v1.0.9
-----------------------------------------------

```
https://github.com/futuremeng/ollama-laravel/releases/tag/v1.0.9
```

Installation
------------

```
composer require futuremeng/ollama-laravel
```

Configuration
-------------

```
php artisan vendor:publish --tag="ollama-laravel-config"
```

Published config file:

```
return [
    'model' => env('OLLAMA_MODEL', 'llama2'),
    'url' => env('OLLAMA_URL', 'http://127.0.0.1:11434'),
    'default_prompt' => env('OLLAMA_DEFAULT_PROMPT', 'Hello, how can I assist you today?'),
    'connection' => [
        'timeout' => env('OLLAMA_CONNECTION_TIMEOUT', 300),
        'verify_ssl' => env('OLLAMA_VERIFY_SSL', true),
        'auth' => [
            'type' => env('OLLAMA_AUTH_TYPE', null), // 'basic' or 'bearer'
            'username' => env('OLLAMA_AUTH_USERNAME', null),
            'password' => env('OLLAMA_AUTH_PASSWORD', null),
            'token' => env('OLLAMA_AUTH_TOKEN', null),
        ],
        'headers' => [
            // Add any custom headers here
            // 'X-Custom-Header' => 'value',
        ],
    ],
];
```
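Each of these values can be overridden from your application's `.env` file without touching the published config. For example (values shown are illustrative defaults, not requirements):

```
OLLAMA_MODEL=llama2
OLLAMA_URL=http://127.0.0.1:11434
OLLAMA_CONNECTION_TIMEOUT=300
OLLAMA_VERIFY_SSL=true
```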

### Authentication

> **Note:** The Ollama server itself does not include built-in authentication. The authentication features in this package are designed to work with reverse proxy setups (like Nginx, Caddy, or Apache) that add authentication in front of your Ollama server.

The package supports both Basic Authentication and Bearer Token Authentication when accessing Ollama through a reverse proxy. This is particularly useful when:

- Deploying Ollama in a production environment
- Securing access to your Ollama instance
- Running Ollama behind a corporate firewall

Common reverse proxy setups:

- Nginx with basic auth or token validation
- Caddy with authentication middleware
- Apache with auth modules

#### Basic Authentication

To use Basic Authentication with your reverse proxy, set the following environment variables:

```
OLLAMA_AUTH_TYPE=basic
OLLAMA_AUTH_USERNAME=your_username
OLLAMA_AUTH_PASSWORD=your_password
```
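For reference, a minimal Nginx sketch that puts basic auth in front of a local Ollama instance could look like the following. The server name, certificate setup, and the htpasswd file path are assumptions; adapt them to your deployment:

```
server {
    listen 443 ssl;
    server_name ollama.example.com;

    location / {
        # Credentials file created with: htpasswd -c /etc/nginx/.htpasswd your_username
        auth_basic "Ollama";
        auth_basic_user_file /etc/nginx/.htpasswd;

        # Forward authenticated requests to the local Ollama server
        proxy_pass http://127.0.0.1:11434;
    }
}
```

With this in place, `OLLAMA_URL` would point at the proxy (`https://ollama.example.com`) rather than at Ollama directly.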

#### Bearer Token Authentication

To use Bearer Token Authentication, set the following environment variables:

```
OLLAMA_AUTH_TYPE=bearer
OLLAMA_AUTH_TOKEN=your_bearer_token
```

#### Custom Headers

You can add custom headers in the configuration file:

```
'headers' => [
    'X-Custom-Header' => 'value',
    'Another-Header' => 'another-value',
],
```

Usage
-----

### Basic Usage

```
use futuremeng\Ollama\Facades\Ollama;

$schema = [
    "type"  => "array",
    "items" => [
        "type"       => "object",
        "properties" => [
            "s"    => ["type" => "string"],
            "p"    => ["type" => "string"],
            "o"    => ["type" => "string"],
            "type" => [
                "type" => "string",
                "enum" => ["resource", "literal"],
            ],
        ],
        "required" => ["s", "p", "o", "type"],
    ],
];

/** @var array $response */
$response = Ollama::agent('You are a language expert...')
    ->prompt('The sky is blue.')
    ->model('llama2')
    ->options(['temperature' => 0.7])
    ->format($schema)
    ->stream(false)
    ->ask();
```
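When a JSON schema is passed to `format()`, the generated text should itself be valid JSON matching that schema. A sketch of decoding it follows; it assumes the generated text is returned under a `response` key, as in Ollama's generate endpoint:

```
// Hypothetical follow-up: decode the structured output.
// Assumes $response['response'] holds the model's JSON text.
$triples = json_decode($response['response'], true, 512, JSON_THROW_ON_ERROR);

foreach ($triples as $triple) {
    // Each item matches the schema above: s, p, o, type.
    logger()->info("{$triple['s']} {$triple['p']} {$triple['o']} ({$triple['type']})");
}
```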

### Vision Support

```
/** @var array $response */
$response = Ollama::model('llava:13b')
    ->prompt('What is in this picture?')
    ->image(public_path('images/example.jpg'))
    ->ask();

// "The image features a close-up of a person's hand, wearing bright pink fingernail polish and blue nail polish. In addition to the colorful nails, the hand has two tattoos – one is a cross and the other is an eye."
```

### Chat Completion

```
$messages = [
    ['role' => 'user', 'content' => 'My name is Toni Soriano and I live in Spain'],
    ['role' => 'assistant', 'content' => 'Nice to meet you, Toni Soriano'],
    ['role' => 'user', 'content' => 'Where do I live?'],
];

$response = Ollama::agent('You know me really well!')
    ->model('llama2')
    ->chat($messages);

// "You mentioned that you live in Spain."

```
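Because `chat()` is stateless, a multi-turn conversation is built by appending each reply to the message list before the next call. A sketch, assuming the assistant's reply is exposed under `['message']['content']` as in Ollama's chat endpoint:

```
// Append the assistant's reply, then the next user turn.
$messages[] = ['role' => 'assistant', 'content' => $response['message']['content'] ?? ''];
$messages[] = ['role' => 'user', 'content' => 'And what is my name?'];

$response = Ollama::agent('You know me really well!')
    ->model('llama2')
    ->chat($messages);
```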

### Chat Completion with tools

```
$messages = [
    ['role' => 'user', 'content' => 'What is the weather in Toronto?'],
];

$response = Ollama::model('llama3.1')
    ->tools([
        [
            "type"     => "function",
            "function" => [
                "name"        => "get_current_weather",
                "description" => "Get the current weather for a location",
                "parameters"  => [
                    "type"       => "object",
                    "properties" => [
                        "location" => [
                            "type"        => "string",
                            "description" => "The location to get the weather for, e.g. San Francisco, CA",
                        ],
                        "format"   => [
                            "type"        => "string",
                            "description" => "The format to return the weather in, e.g. 'celsius' or 'fahrenheit'",
                            "enum"        => ["celsius", "fahrenheit"],
                        ],
                    ],
                    "required"   => ["location", "format"],
                ],
            ],
        ],
    ])
    ->chat($messages);
```
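The model does not execute the function itself; it returns a tool call that your code must dispatch. A sketch of handling it is below. The response shape (`message.tool_calls`, with already-decoded `arguments`) mirrors Ollama's chat endpoint and is an assumption about what this package passes through; `get_current_weather()` is a hypothetical helper you would implement:

```
foreach ($response['message']['tool_calls'] ?? [] as $call) {
    if ($call['function']['name'] === 'get_current_weather') {
        $args    = $call['function']['arguments'];
        $weather = get_current_weather($args['location'], $args['format']);

        // Feed the result back as a tool message so the model can
        // phrase the final answer on a follow-up chat() call.
        $messages[] = ['role' => 'tool', 'content' => json_encode($weather)];
    }
}
```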

### Streamable responses

```
use futuremeng\Ollama\Facades\Ollama;
use Illuminate\Console\BufferedConsoleOutput;

/** @var \GuzzleHttp\Psr7\Response $response */
$response = Ollama::agent('You are a snarky friend with one-line responses')
    ->prompt("I didn't sleep much last night")
    ->model('llama3')
    ->options(['temperature' => 0.1])
    ->stream(true)
    ->ask();

$output = new BufferedConsoleOutput();
$responses = Ollama::processStream($response->getBody(), function($data) use ($output) {
    $output->write($data['response']);
});

$output->write("\n");
$complete = implode('', array_column($responses, 'response'));
$output->write("$complete");
```


