jtsternberg/localgpt (v1.4.0, MIT, PHP)

[Source](https://github.com/jtsternberg/LocalGPT) · [Packagist](https://packagist.org/packages/jtsternberg/localgpt)

LocalGPT
========

*(LocalGPT logo)*

A command-line interface for creating and interacting with local, file-based custom GPTs, powered by your favorite AI providers.

This tool is designed to be extensible, allowing you to wrap any AI API.

Providers currently supported:

- **Google Gemini**
- **OpenAI**
- **Anthropic**
- **Ollama**

With more providers coming soon.

**Table of Contents** *generated with [DocToc](https://github.com/thlorenz/doctoc)*

- [Features](#features)
- [Prerequisites](#prerequisites)
- [Installation &amp; Configuration](#installation--configuration)
- [Getting Started](#getting-started)
- [Commands](#commands)
    - [Command: `new`](#command-new)
    - [Command: `chat`](#command-chat)
    - [Command: `models`](#command-models)
    - [Command: `reference`](#command-reference)
- [Examples](#examples)
- [Additional Features](#additional-features)
    - [Reference Files](#reference-files)
    - [Manual Configuration](#manual-configuration)

Features
--------

- **CLI First**: No UI needed. Manage everything from your terminal.
- **Provider Agnostic**: Designed for extension, with support for Google Gemini, OpenAI, Anthropic, and Ollama.
- **Local Configuration**: Define your custom GPTs in simple JSON files.
- **Interactive Chat**: Chat with your custom GPTs directly from the command line.
- **Non-interactive Chat**: Send a single message or a file to a GPT without entering an interactive session.
- **Reference Files**: Include local files in your GPT's context to provide additional information and data.
- **GPT Builder**: An interactive wizard to help you create your custom GPT configuration files.

Prerequisites
-------------

- PHP 8.1 or higher
- [Composer](https://getcomposer.org/)

Installation &amp; Configuration
--------------------------------

1. **Install the tool via Composer**:

    ```
    composer global require jtsternberg/localgpt
    ```

    *Note: Make sure your global Composer `bin` directory is in your system's `PATH`.*
2. **Set up your API keys**. Copy the example environment file:

    ```
    cp .env.example .env
    ```
3. **Add your API keys** to the new `.env` file. Only the key for the provider your GPT uses is required:

    ```
    # .env
    GEMINI_API_KEY="your_gemini_api_key_here"
    OPENAI_API_KEY="your_openai_api_key_here"
    ANTHROPIC_API_KEY="your_anthropic_api_key_here"

    ```

The CLI will automatically load the required API key based on the `provider` specified in your GPT's configuration file.
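The provider-to-key lookup described above can be sketched in plain PHP. This is an illustration only, not the tool's actual code: the function name `apiKeyFor` and the mapping array are assumptions mirroring the `.env` example.

```php
<?php
// Illustrative only: resolve which env var holds a provider's API key.
// The mapping mirrors the .env example above; names are assumptions.
function apiKeyFor(string $provider): ?string
{
    $envVars = [
        'gemini'    => 'GEMINI_API_KEY',
        'openai'    => 'OPENAI_API_KEY',
        'anthropic' => 'ANTHROPIC_API_KEY',
    ];

    $var = $envVars[$provider] ?? null;
    if ($var === null) {
        return null; // e.g. ollama, which runs locally and needs no key
    }

    $key = getenv($var);
    return $key === false ? null : $key;
}
```

A lookup like this lets each GPT's `gpt.json` name only its `provider`, while the key stays out of version control in `.env`.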

Getting Started
---------------

This guide will walk you through creating and interacting with your first custom GPT.

**Step 1: Create a New GPT**

Use the `new` command to start the interactive wizard:

```
localgpt new my-first-gpt
```

The wizard will guide you through configuring your GPT's title, description, provider, model, and system prompt.

You will see a new directory created named `my-first-gpt/` with the following files:

- `gpt.json`
- `SYSTEM_PROMPT.md`
- `reference-files/`

If you want to modify your GPT's prompt in the future, simply edit the `SYSTEM_PROMPT.md` file.

**Step 2: Chat with Your GPT**

Once your GPT is created, start a chat session with the `chat` command:

```
localgpt chat my-first-gpt
```

You can now interact with your custom GPT directly from your terminal.

To get a one-off response from your GPT without entering a session, use the `chat` command with the `--message` flag:

```
localgpt chat my-first-gpt --message "What is the capital of France?"
```

(or use `--messageFile` to send a file's content as the message)

**Step 3: Chat with Examples**

You can also chat with one of the [examples](./examples):

```
localgpt chat examples/pizza-pro-gemini-2.5-flash
```

Commands
--------

### Command: `new`

Creates a new GPT configuration via an interactive wizard.

```
localgpt new my-first-gpt
```

This will launch a step-by-step wizard that asks you for the following information:

- **Title**: A name for your GPT.
- **Description**: A short description of what it does.
- **Provider**: Select a provider from a list of supported options (e.g., `gemini`, `openai`, or `anthropic`).
- **Model**: Select a model from the chosen provider's available list.
- **System Prompt**: The core instructions for the GPT. This can be typed directly or pasted into the terminal. The prompt is stored in a `SYSTEM_PROMPT.md` file inside the new GPT's directory.

The wizard will create a new directory named after the slug you provide (e.g., `my-first-gpt/`). This directory will contain your `gpt.json` configuration file and any other related files, like the `SYSTEM_PROMPT.md`.

This approach keeps your custom GPTs organized and makes them easy to track in version control alongside your projects.
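The layout the wizard produces could be scaffolded roughly like this. A hedged sketch: `scaffoldGpt` is an invented name, and the real wizard gathers and writes more fields than shown.

```php
<?php
// Illustrative sketch of the directory layout `localgpt new` creates:
// <slug>/gpt.json, <slug>/SYSTEM_PROMPT.md, <slug>/reference-files/
function scaffoldGpt(string $baseDir, string $slug, array $config, string $systemPrompt): string
{
    $dir = rtrim($baseDir, '/') . '/' . $slug;
    mkdir($dir . '/reference-files', 0777, true);

    file_put_contents(
        $dir . '/gpt.json',
        json_encode($config, JSON_PRETTY_PRINT | JSON_UNESCAPED_SLASHES)
    );
    file_put_contents($dir . '/SYSTEM_PROMPT.md', $systemPrompt);

    return $dir;
}
```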

### Command: `chat`

Starts an interactive chat session with a specified GPT.

```
localgpt chat my-first-gpt
```

**Non-interactive Mode**

You can also send a single message or a file directly to the GPT without starting an interactive session.

**Send a message:**

```
localgpt chat my-first-gpt --message "What is the capital of France?"
```

**Send a file's content:**

```
localgpt chat my-first-gpt --messageFile "path/to/your/message.md"
```

*In non-interactive mode, the response is sent directly to `stdout`.*
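The way the two flags might resolve to a single message body can be sketched as follows. The helper name `resolveMessage` is hypothetical; it only illustrates the precedence implied above.

```php
<?php
// Illustrative: pick the message body from --message or --messageFile.
// Returns null when neither flag is present (i.e., interactive mode).
function resolveMessage(array $options): ?string
{
    if (isset($options['message'])) {
        return $options['message'];
    }
    if (isset($options['messageFile'])) {
        $contents = file_get_contents($options['messageFile']);
        if ($contents === false) {
            throw new RuntimeException("Cannot read {$options['messageFile']}");
        }
        return $contents;
    }
    return null;
}
```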

### Command: `models`

Lists all available models from the supported AI providers.

```
localgpt models
```

*Note: For the `ollama` provider, this command will list the models you have pulled locally.*

### Command: `reference`

Adds, removes, or lists reference files for a specified GPT.

```
localgpt reference my-first-gpt
```

**List reference files:**

```
localgpt reference my-first-gpt --list
```

**Add a reference file:**

```
localgpt reference my-first-gpt ./path/to/your/file.md
```

**Remove a reference file:**

```
localgpt reference my-first-gpt --delete ./reference-files/file.md
```

This command will copy the specified file to the GPT's `reference-files` directory and update the `gpt.json` configuration.
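The copy-and-update behavior might look roughly like this (a sketch under assumptions; `addReferenceFile` is an invented name, and the real command also handles `--list` and `--delete`):

```php
<?php
// Illustrative: copy a file into reference-files/ and record it in gpt.json.
function addReferenceFile(string $gptDir, string $sourcePath): void
{
    $target = $gptDir . '/reference-files/' . basename($sourcePath);
    copy($sourcePath, $target);

    $configPath = $gptDir . '/gpt.json';
    $config = json_decode(file_get_contents($configPath), true);
    $config['reference_files'][] = './reference-files/' . basename($sourcePath);
    // Avoid duplicate entries if the same file is added twice.
    $config['reference_files'] = array_values(array_unique($config['reference_files']));

    file_put_contents($configPath, json_encode($config, JSON_PRETTY_PRINT | JSON_UNESCAPED_SLASHES));
}
```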

Examples
--------

This repository includes several [examples](./examples) to help you get started. Check out the [README](./examples#readme) for more information.

**Example Session:**

```
$ localgpt chat examples/pizza-pro-gemini-2.5-flash
Loading GPT: Pizza Pro - Lover of all things pizza
Provider: gemini
Model: gemini-2.5-flash

You can start chatting now. (type 'exit' to quit)

> What is the best pizza in the world?

🤖 The best pizza in the world is the Margherita pizza.
>

```

Additional Features
-------------------

### Reference Files

You can provide your GPT with local files to use as reference material. This is useful for providing additional context.

To use reference files, add a `reference_files` array to your `gpt.json` file. The array should contain a list of relative paths to the files you want to include.

```
{
    "reference_files": [
        "./reference-files/my-file-1.txt",
        "./reference-files/my-file-2.md"
    ]
}
```

When you start a chat session, the content of these files will be loaded and included in the system prompt that is sent to the LLM. This allows the GPT to use the information in the files to inform its responses.
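The injection step above can be sketched as follows; a minimal illustration, assuming a made-up `buildSystemPrompt` helper and a simple delimiter format, not the tool's actual implementation:

```php
<?php
// Illustrative: inline each reference file's contents into the system
// prompt text before it is sent to the LLM.
function buildSystemPrompt(string $gptDir, string $basePrompt, array $referenceFiles): string
{
    $prompt = $basePrompt;
    foreach ($referenceFiles as $relativePath) {
        $path = $gptDir . '/' . ltrim($relativePath, './');
        $contents = file_get_contents($path);
        $prompt .= "\n\n--- Reference: {$relativePath} ---\n" . $contents;
    }
    return $prompt;
}
```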

### Manual Configuration

You can create a `[name]/gpt.json` file yourself. Review the [`examples/pizza-pro-gemini-2.5-flash/gpt.json` file](./examples/pizza-pro-gemini-2.5-flash/gpt.json) for an example of the structure.
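For orientation, a hand-written `gpt.json` might look like the following. Only `reference_files` is documented above; the other field names are assumptions inferred from the wizard's prompts (title, description, provider, model), so treat the linked example file as the authoritative structure:

```json
{
    "title": "My First GPT",
    "description": "A short description of what it does.",
    "provider": "gemini",
    "model": "gemini-2.5-flash",
    "reference_files": [
        "./reference-files/my-file-1.txt"
    ]
}
```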
