evolvoltd/laravel-openai-assistants
===================================

Method to send messages to an OpenAI Assistant of your choice.

Latest version 11.0 (released 2 years ago) · MIT license · PHP >= 8.2.0

[Source](https://github.com/evolvoltd/laravel-openai-assistants) · [Packagist](https://packagist.org/packages/evolvoltd/laravel-openai-assistants)


Get Started
-----------


First, install the package using Composer:

```
composer require evolvoltd/laravel-openai-assistants

```

Then set the required `.env` variables:

```
OPENAI_API_KEY=
GPT_ASSISTANT_ID=

```

Note that this package uses queues, so a queue must be configured. It dispatches to the `default` queue by default; set the `.env` variable below to change this.

```
QUEUE_NAME=

```

You can also set a maximum retry count (the default is 60) via the `.env` variable below.

```
MAX_RETRIES=

```
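Because responses are retrieved by a queued job, a queue worker must be running. This is standard Laravel tooling rather than anything package-specific; the queue name shown is the default:

```shell
# Process jobs from the queue the package dispatches to
php artisan queue:work --queue=default
```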

Usage
-----


The key part of this package is the **ask** function:

```
(new GPTService)->ask($message, $parser)

```

This function adds a message to the assistant's thread, runs the thread, and dispatches a job that polls the run's status and retrieves the response once the run completes.

The first parameter, **$message**, is the message you want to send to the assistant.

The second parameter, **$parser**, is an instance of a class you create that implements the **ParseGPTResponse** interface.
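For example, a parser class could be sketched like this. This is a minimal skeleton only; the namespace in the `use` statement is an assumption, so check the package source for the actual import path. Each method is explained in the next section.

```php
<?php

namespace App\Services;

// Hypothetical import path; adjust to match the package's actual namespace.
use Evolv\OpenAIAssistants\ParseGPTResponse;

class MyGPTParser implements ParseGPTResponse
{
    public function parseResponse(string $response)
    {
        return trim($response);
    }

    public function executeFunctions(string $name, $arguments, $toolCall)
    {
        return [];
    }

    public function statusUpdate(string $status, string $run_id, string $thread_id, ?string $message = null, ?string $file_path = null, ?string $file_name = null, ?string $response = null)
    {
        // Persist or log run status as needed.
    }
}
```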

ParseGPTResponse Interface
--------------------------


This interface has three functions:

**`public function parseResponse(string $response);`**

This function allows you to set up the functionality to parse the response from the assistant according to your requirements. Example below:

```
public function parseResponse(string $response)
{
    return strtoupper($response);
}

```

**`public function executeFunctions(string $name, $arguments, $toolCall);`**

This function allows you to set up your assistant's predefined functions for when the assistant requests data from them. An example is displayed below. Note that **$name** depends entirely on the functions you assigned to your assistant. The output is hardcoded in this example but could return any data of your choice.

```
public function executeFunctions(string $name, $arguments, $toolCall)
{
    if ($name === 'get_weather') {
        return [
            'tool_call_id' => $toolCall->id,
            'output' => '25 degrees celsius',
        ];
    }

    return [];
}

```

**`public function statusUpdate(string $status, string $run_id, string $thread_id, ?string $message = null, ?string $file_path = null, ?string $file_name = null, ?string $response = null);`**

This function allows you to set up your own way to store the status and data of the run for future reference. An example is displayed below. Note that in this example, GptQuery is an Eloquent model backed by a database table. You will need to create your own database table and model to use this function.

```
public function statusUpdate(string $status, string $run_id, string $thread_id, ?string $message = null, ?string $file_path = null, ?string $file_name = null, ?string $response = null)
{
    if ($status === 'new') {
        $query = GptQuery::query()->create([
            'user_id' => Auth::id(),
            'status' => $status,
            'run_id' => $run_id,
            'thread_id' => $thread_id,
            'message' => $message,
            'response' => $response,
            'file_path' => $file_path,
            'file_name' => $file_name,
        ]);
    } else {
        $query = GptQuery::query()
            ->where('run_id', $run_id)
            ->where('thread_id', $thread_id)
            ->first();

        $query->update([
            'status' => $status,
            'response' => $response,
        ]);
    }

    return $query;
}

```
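If you store run data as in the example above, a matching migration might look like the sketch below. The table and column names mirror the `GptQuery` example; everything else (nullability, column types) is a reasonable assumption, not something the package prescribes.

```php
<?php

use Illuminate\Database\Migrations\Migration;
use Illuminate\Database\Schema\Blueprint;
use Illuminate\Support\Facades\Schema;

return new class extends Migration
{
    public function up(): void
    {
        Schema::create('gpt_queries', function (Blueprint $table) {
            $table->id();
            $table->foreignId('user_id')->nullable();
            $table->string('status');
            $table->string('run_id');
            $table->string('thread_id');
            $table->text('message')->nullable();
            $table->text('response')->nullable();
            $table->string('file_path')->nullable();
            $table->string('file_name')->nullable();
            $table->timestamps();
        });
    }

    public function down(): void
    {
        Schema::dropIfExists('gpt_queries');
    }
};
```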

Multiple Queues/Assistants
--------------------------


You can also pass in optional variables when using the **ask** function to specify the queue and assistant for the specific message sent. This allows you to use multiple queues and/or assistants in your application. An example is below:

```
(new GPTService)->ask($message, $parser, null, $assistantId, $queueName)

```

Here, **$assistantId** lets you specify the assistant to use for this request, and **$queueName** the queue. The **ask** function uses each value if it is provided, and otherwise falls back to the value from the config.
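For instance, a controller action could route requests to a dedicated assistant and queue. This is a sketch only: the config key, queue name, and `MyGPTParser` class are illustrative placeholders, not package defaults.

```php
// Hypothetical controller action; the config key and 'weather-queue'
// are illustrative values of your own choosing.
public function askWeather(Request $request)
{
    $parser = new MyGPTParser();
    $weatherAssistantId = config('services.openai.weather_assistant_id');

    return (new GPTService)->ask(
        $request->input('message'),
        $parser,
        null,                 // third parameter left as in the example above
        $weatherAssistantId,  // overrides GPT_ASSISTANT_ID for this request
        'weather-queue'       // overrides the configured queue for this request
    );
}
```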
