PHPackages — jonaspoelmans/laravel-gpt


jonaspoelmans/laravel-gpt
=========================

This package integrates with LLM APIs such as OpenAI and Mistral

v1.1.3 · MIT license · PHP ^8.1

[Source](https://github.com/jonaspoelmans/laravel-gpt) · [Packagist](https://packagist.org/packages/jonaspoelmans/laravel-gpt)

Connect with LLMs such as OpenAI ChatGPT and Mistral
====================================================


[![Latest Version on Packagist](https://camo.githubusercontent.com/c8b676627ea2e8f53788d8885bcb6310eae056464a2e110b1fbb40853f404ee1/68747470733a2f2f696d672e736869656c64732e696f2f7061636b61676973742f762f6a6f6e6173706f656c6d616e732f6c61726176656c2d6770742e737667)](https://packagist.org/packages/jonaspoelmans/laravel-gpt)[![GitHub Tests Action Status](https://camo.githubusercontent.com/e6ff76d791e16a4c5602d7ceec6d85fbd8d09f94444759850c5a45d953681c43/68747470733a2f2f696d672e736869656c64732e696f2f6769746875622f616374696f6e732f776f726b666c6f772f7374617475732f6a6f6e6173706f656c6d616e732f6c61726176656c2d6770742f7068702e796d6c3f6272616e63683d6d61696e266c6162656c3d5465737473)](https://github.com/jonaspoelmans/laravel-gpt/actions?query=workflow%3ATests+branch%3Amain)[![Total Downloads](https://camo.githubusercontent.com/412f97c41eefbc8bd9daba954f3c005adfd26f4dd1e99277df868d438f7309b2/68747470733a2f2f696d672e736869656c64732e696f2f7061636b61676973742f64742f6a6f6e6173706f656c6d616e732f6c61726176656c2d6770742e7376673f7374796c653d666c61742d737175617265)](https://packagist.org/packages/jonaspoelmans/laravel-gpt)

Documentation, Installation, and Usage Instructions
---------------------------------------------------


### Laravel


Require this package via Composer. This downloads laravel-gpt together with its dependencies:

```
composer require jonaspoelmans/laravel-gpt

```

You also need to create an OpenAI API key and add it to your `.env` file:

```
OPENAI_API_KEY=xxxxxxxxxxxxx

```
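In a typical Laravel setup, the value from `.env` is read inside the package's config file through the `env()` helper. The following is only a sketch; the `openai_api_key` key name is an assumption, not confirmed by the package's documentation:

```php
<?php

// config/laravelgpt.php (sketch — the exact key name is assumed)
return [
    // Pull the secret from the environment instead of hard-coding it
    'openai_api_key' => env('OPENAI_API_KEY'),

    // ... the other laravel-gpt options shown further below
];
```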

What It Does
------------


This package allows you to connect via API to large language models (LLMs) such as OpenAI ChatGPT and Mistral.

Once installed, you can build a prompt from any textual content you wish to feed to OpenAI:

```
// Create your prompt for the LLM
$prompt = "Give me the list of 10 best PHP frameworks.";

// Instantiate the Laravel GPT service
$laravelGPT = new LaravelGPTService();

// Retrieve a response from ChatGPT
$response = $laravelGPT->generateOpenAIResponse($prompt);
```

Instead of a single prompt, you can also feed OpenAI with prior message history:

```
// Prior messages are generated by a user, the system or the chat assistant
$priorMessageUser = new OpenAIMessage('user', 'I love Laravel but exclude it from the list.');
$priorMessageAssistant = new OpenAIMessage('assistant', 'Which list?');
$history = [$priorMessageUser, $priorMessageAssistant];

// The prompt can be fed to ChatGPT alongside the chat history
$response = $laravelGPT->generateOpenAIResponse($prompt, $history);
```
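Over a multi-turn conversation, the history array simply grows with each exchange. A minimal sketch of that loop, assuming each reply is wrapped back into an `OpenAIMessage` before the next call (the shape of the response array is not documented here, so it is stored JSON-encoded):

```php
<?php

$laravelGPT = new LaravelGPTService();
$history = [];

foreach (['What are the best PHP frameworks?', 'Which suits APIs best?'] as $question) {
    $response = $laravelGPT->generateOpenAIResponse($question, $history);

    // Append both sides of the exchange so the next turn has full context
    $history[] = new OpenAIMessage('user', $question);
    $history[] = new OpenAIMessage(
        'assistant',
        is_array($response) ? json_encode($response) : $response
    );
}
```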

The response is an associative array if the request succeeded, or an empty string if an error occurred.
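Because a failed request yields an empty string rather than an array, it is worth checking the type before using the result. A minimal guard, assuming the exact keys inside the array mirror the OpenAI API payload:

```php
<?php

use Illuminate\Support\Facades\Log;

$response = $laravelGPT->generateOpenAIResponse($prompt);

if ($response === '') {
    // The call failed; with 'openai_logging' enabled, details are in the Laravel log
    Log::warning('LLM request failed');
} else {
    // Success: $response is an associative array with the model's output
    dump($response);
}
```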

The parameters for the LLMs can be configured through the laravelgpt.php config file. You can publish the configuration file as follows:

```
php artisan vendor:publish --tag="laravel-gpt-config"

```

For OpenAI you can tweak these parameters:

```
// The base URI for the OpenAI API.
// This is the endpoint where all API requests will be sent.
'openai_base_uri' => 'https://api.openai.com/v1/',

// The default model to be used for generating responses.
// You can change this to any valid model identifier provided by OpenAI,
// such as 'gpt-3.5-turbo' or 'gpt-4-1106-preview'.
'openai_model' => 'gpt-4-1106-preview',

// The maximum number of tokens to generate in the response.
// Tokens can be thought of as pieces of words. The maximum number
// of tokens allowed is determined by the model you are using.
'openai_max_tokens' => 4000,

// The temperature setting for the response generation.
// Temperature controls the randomness of the output.
// A value closer to 0 makes the output more deterministic and repetitive,
// while a value closer to 1 makes it more random.
'openai_temperature' => 0.7,

// Enable or disable logging of errors.
// When set to true, any errors encountered while using the API
// will be logged using Laravel's built-in logging system.
'openai_logging' => true,
```
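Once published, these values can be read anywhere in your application with Laravel's standard `config()` helper, assuming the file keeps its default `laravelgpt.php` name:

```php
<?php

// Read the configured model and temperature (key names as in the published file)
$model = config('laravelgpt.openai_model');             // e.g. 'gpt-4-1106-preview'
$temperature = config('laravelgpt.openai_temperature'); // e.g. 0.7
```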

Contributing
------------


Please send me a direct message if you would like to contribute.

### Testing


```
composer test
```

### Security


If you discover any security-related issues, please email the maintainer directly instead of using the issue tracker.

Postcardware
------------


You're free to use this package, but if it makes it into your production environment I would highly appreciate it if you send me a message :-) I reply to all messages!

Credits
-------


- [Jonas Poelmans](https://github.com/jonaspoelmans)

License
-------


The MIT License (MIT). Please see [License File](LICENSE.md) for more information.
