
 [![Brimley the Morkie](https://raw.githubusercontent.com/brimleylabs/brim/main/art/brimley.svg)](https://raw.githubusercontent.com/brimleylabs/brim/main/art/brimley.svg)

Brim
====


 **Bringing Retrieval to Indexed Models**
 Semantic search for Laravel. Local-first, no API keys required.

 [![Latest Version](https://camo.githubusercontent.com/bf1c7f6cabbbd11f2efb9f2d86ea2cafbeb1224be9150ecaa47eb92649b99d76/68747470733a2f2f696d672e736869656c64732e696f2f7061636b61676973742f762f6272696d6c65796c6162732f6272696d2e7376673f7374796c653d666c61742d737175617265)](https://packagist.org/packages/brimleylabs/brim) [![License](https://camo.githubusercontent.com/55c0218c8f8009f06ad4ddae837ddd05301481fcf0dff8e0ed9dadda8780713e/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f6c6963656e73652d4d49542d627269676874677265656e2e7376673f7374796c653d666c61742d737175617265)](https://github.com/brimleylabs/brim/blob/main/LICENSE) [![PHP Version](https://camo.githubusercontent.com/ce42ccc094ad69407f582b91534813154dc2ba8c653f297a3858e21a5db3d2fe/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f7068702d382e322b2d3838393242462e7376673f7374796c653d666c61742d737175617265)](https://php.net) [![Laravel Version](https://camo.githubusercontent.com/3d87f1c711acac2101f225b5068dc0dfdf95ccfe56dc849f04c65d0a6b6307d3/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f6c61726176656c2d31302e782d2d31322e782d4646324432302e7376673f7374796c653d666c61742d737175617265)](https://laravel.com)

 [Quick Start](#-quick-start) • [Installation](#-installation) • [Usage](#-usage) • [Configuration](#-configuration) • [Advanced](#-advanced) • [Testing](#-testing)

---

What is Brim?
-------------


Brim adds **AI-powered semantic search** to your Laravel models. Instead of matching exact keywords, Brim understands the *meaning* behind your queries.

```
// Traditional search: only finds exact matches
Article::where('title', 'LIKE', '%red car%')->get();

// Brim semantic search: understands meaning
Article::semanticSearch('red car')->get();
// ✓ Finds: "crimson automobile", "scarlet vehicle", "cherry-colored sedan"
```

**How it works:** Brim converts your text into numerical vectors (embeddings) using AI, then finds similar content using vector mathematics. All processing happens locally on your machine using [Ollama](https://ollama.ai) - no API keys, no usage limits, no data leaving your server.
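
To make the vector idea concrete, here is a minimal, framework-free sketch of the similarity step (plain PHP, not Brim's internals): embeddings that point in similar directions score a cosine similarity close to 1.

```
// Toy illustration of vector similarity. Real embeddings have hundreds of
// dimensions; three are used here so the numbers stay readable.
function cosineSimilarity(array $a, array $b): float
{
    $dot = 0.0;
    $normA = 0.0;
    $normB = 0.0;

    foreach ($a as $i => $value) {
        $dot   += $value * $b[$i];
        $normA += $value ** 2;
        $normB += $b[$i] ** 2;
    }

    return $dot / (sqrt($normA) * sqrt($normB));
}

$redCar  = [0.82, 0.10, 0.55]; // hypothetical embedding for "red car"
$crimson = [0.79, 0.14, 0.58]; // hypothetical embedding for "crimson automobile"

echo cosineSimilarity($redCar, $crimson); // close to 1.0, i.e. semantically similar
```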

---

✨ Why Brim?
-----------


- **🔒 Privacy First:** Your data never leaves your server. Embeddings are generated locally using Ollama. No third-party API calls unless you explicitly choose OpenAI.
- **🎯 Laravel Native:** Eloquent trait integration. Familiar syntax. Model observers for automatic sync. Works with your existing codebase.
- **💰 Zero API Costs:** No per-query charges. No token limits. Generate unlimited embeddings and run unlimited searches.
- **🔌 Extensible:** Swap between Ollama (local) and OpenAI (cloud). Custom drivers supported. Multiple embedding namespaces per model.
- **⚡ Production Ready:** HNSW indexing for sub-millisecond searches. Batch operations. Queue support. Built-in telemetry.
- **📊 Observable:** Built-in telemetry tracks embedding generation, search performance, and system health.

---

🚀 Quick Start
-------------


Get semantic search running in under 5 minutes:

```
composer require brimleylabs/brim
php artisan vendor:publish --provider="Brim\BrimServiceProvider"
php artisan migrate
```

```
// 1. Add trait to your model
class Article extends Model
{
    use HasEmbeddings;

    public function toEmbeddableText(): string
    {
        return "{$this->title}\n\n{$this->content}";
    }
}

// 2. Generate embeddings
Article::all()->each->generateEmbedding();

// 3. Search semantically
Article::semanticSearch('climate change solutions')->get();
```

That's it. Your app now has AI-powered search.
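
Because the search call chains into `->get()` like a normal query, the results can be handled as an ordinary Eloquent collection:

```
// Results behave like any other collection of Article models.
$articles = Article::semanticSearch('climate change solutions')->get();

foreach ($articles as $article) {
    echo $article->title . PHP_EOL;
}
```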

---

📋 Requirements
--------------


| Requirement | Version | Notes |
| --- | --- | --- |
| PHP | 8.2+ | Required |
| Laravel | 10.x, 11.x, 12.x | Any recent version |
| PostgreSQL | 12+ | With pgvector extension |
| Ollama | Latest | For local embeddings |

---

📦 Installation
--------------


### Step 1: Install the Package


```
composer require brimleylabs/brim
```

### Step 2: Publish Configuration & Migrations


```
php artisan vendor:publish --provider="Brim\BrimServiceProvider"
```

This publishes:

- `config/brim.php` - Configuration file (see the sketch below)
- `database/migrations/*_create_brim_embeddings_table.php` - Vector storage table
- `database/migrations/*_create_brim_telemetry_table.php` - Telemetry table (optional)
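
The exact contents of `config/brim.php` depend on the package version; the sketch below only illustrates the kind of settings to expect (driver, Ollama URL, model). The keys and env names shown are assumptions, not the published file:

```
// config/brim.php (illustrative only; the published file and its keys may differ)
return [
    // Embedding backend: local Ollama by default, with OpenAI as an alternative.
    'driver' => env('BRIM_DRIVER', 'ollama'),

    // Where the local Ollama server listens (see Step 7).
    'ollama_url' => env('BRIM_OLLAMA_URL', 'http://localhost:11434'),

    // Embedding model pulled in Step 6.
    'model' => env('BRIM_EMBEDDING_MODEL', 'nomic-embed-text'),
];
```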

### Step 3: Configure PostgreSQL with pgvector


Brim uses pgvector for efficient vector storage and similarity search.
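
For context, this is the style of query pgvector makes possible once the extension is installed. The SQL below is illustrative only, not Brim's actual schema or internal query; the column name is an assumption:

```
// Illustrative only: the kind of SQL pgvector enables (not Brim's internal query).
// '<=>' is pgvector's cosine-distance operator; smaller distance means more similar.
use Illuminate\Support\Facades\DB;

$queryVector = '[0.82,0.10,0.55]'; // a pre-computed embedding, serialized for pgvector

$nearest = DB::select(
    'SELECT * FROM brim_embeddings ORDER BY embedding <=> CAST(? AS vector) LIMIT 10',
    [$queryVector]
);
```

Install the extension using whichever option fits your environment: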

**Option A: If you have superuser access:**

```
CREATE EXTENSION IF NOT EXISTS vector;
```

**Option B: Using Docker (recommended for local development):**

```
# docker-compose.yml
services:
  postgres:
    image: pgvector/pgvector:pg16
    environment:
      POSTGRES_DB: your_database
      POSTGRES_USER: your_user
      POSTGRES_PASSWORD: your_password
    ports:
      - "5432:5432"
```

**Option C: Cloud PostgreSQL:**

- **Supabase**: pgvector enabled by default
- **Neon**: Enable via dashboard extensions
- **AWS RDS**: Enable pgvector extension in parameter group
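
Whichever option you use, you can confirm the extension is active before running the migrations. This is a plain PostgreSQL catalog query issued through Laravel's DB facade, not a Brim command:

```
// Check for the pgvector extension via PostgreSQL's catalog (e.g. from Tinker).
use Illuminate\Support\Facades\DB;

$hasPgvector = DB::table('pg_extension')->where('extname', 'vector')->exists();

if (! $hasPgvector) {
    // Requires a database role that is allowed to create extensions.
    DB::statement('CREATE EXTENSION IF NOT EXISTS vector');
}
```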

### Step 4: Run Migrations


```
php artisan migrate
```

### Step 5: Install Ollama


Ollama runs AI models locally on your machine.

**macOS:**

```
brew install ollama
```

**Linux:**

```
curl -fsSL https://ollama.ai/install.sh | sh
```

**Windows:**

Download from [ollama.ai/download](https://ollama.ai/download)

### Step 6: Pull the Embedding Model


```
ollama pull nomic-embed-text
```

This downloads the `nomic-embed-text` model (~274MB), optimized for generating text embeddings.
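
Once Ollama is running (next step), you can ask it for an embedding directly to see what the model produces. This calls Ollama's own HTTP API, not a Brim method:

```
// Fetch a raw embedding from Ollama's HTTP API (standard Ollama endpoint, not Brim).
use Illuminate\Support\Facades\Http;

$response = Http::post('http://localhost:11434/api/embeddings', [
    'model'  => 'nomic-embed-text',
    'prompt' => 'red car',
]);

$embedding = $response->json('embedding'); // array of floats (768 dimensions for this model)
```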

### Step 7: Start Ollama


```
ollama serve
```

Ollama runs on `http://localhost:11434` by default.

### Step 8: Verify Installation


```
php artisan brim:health
```

You should see:

```
✓ Ollama connection: healthy
✓ Model nomic-embed-text: available
✓ PostgreSQL pgvector: enabled
✓ Brim is ready to use!

```

---

🎯 Usage
-------


### Adding Semantic Search to a Model


**Step 1:** Add the `HasEmbeddings` trait and implement `toEmbeddableText()`:

```
