PHPackages

[Directory](/) / [API Development](/categories/api) / vizra-ai/ai-tokens

Active · Library

vizra-ai/ai-tokens
==================

Estimate AI API costs before making expensive calls

v0.0.1 (8mo ago) · 1,850 downloads (↓40%) · 1 star · MIT · PHP ^8.1

Since Sep 11 · Pushed 8mo ago

[Source](https://github.com/vizra-ai/ai-tokens) · [Packagist](https://packagist.org/packages/vizra-ai/ai-tokens) · [RSS](/packages/vizra-ai-ai-tokens/feed) · branch `main` · synced 1mo ago

Dependencies: 1 · Versions: 2 · Used by: 0

AI Tokens
=========


A lightweight PHP package for AI token cost management. Estimate costs before API calls, calculate actual costs from token usage, and track expenses across OpenAI, Claude, Gemini, and other popular AI models.

Installation
------------


```
composer require vizra-ai/ai-tokens
```

Usage
-----


### 1. Count Tokens & Estimate Cost from Text


Estimate tokens and cost for a text message:

```
use VizraAi\AiTokens\TokenCounter;

$result = TokenCounter::count('Hello, how are you?', 'gpt-4o');
// Returns: ['input_tokens' => 5, 'estimated_cost' => 0.0000125]
```
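Under the hood this is an estimate, not an exact tokenizer count. A self-contained sketch of a chars-per-token heuristic that reproduces the numbers above, using the gpt-4o figures `getModelInfo()` reports (4.0 chars per token, $2.50 per million input tokens); the function name is illustrative, not the package's internal API:

```
// Hypothetical re-implementation of the estimate: divide character
// count by a per-model chars-per-token ratio, then price per million.
function estimateTokensAndCost(string $text, float $charsPerToken, float $inputPricePerMillion): array
{
    $tokens = (int) ceil(strlen($text) / $charsPerToken);

    return [
        'input_tokens'   => $tokens,
        'estimated_cost' => $tokens * $inputPricePerMillion / 1_000_000,
    ];
}

$result = estimateTokensAndCost('Hello, how are you?', 4.0, 2.5);
// ['input_tokens' => 5, 'estimated_cost' => 1.25E-5]
```

Because the heuristic is character-based, counts can drift from the provider's real tokenizer for unusual text (code, non-Latin scripts), so treat the result as a planning figure.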

### 2. Calculate Actual Cost from Token Usage


Calculate costs when you know the exact token counts (e.g., after an API call):

```
use VizraAi\AiTokens\TokenCounter;

$result = TokenCounter::calculateActualCost(
    model: 'gpt-4o',
    inputTokens: 500,
    outputTokens: 600
);
// Returns: ['total_cost' => 0.00725, 'input_cost' => 0.00125, 'output_cost' => 0.006]
```

### 3. Estimate Cost Before API Calls


Plan costs before making API calls:

```
use VizraAi\AiTokens\TokenCounter;

$result = TokenCounter::estimateCost([
    'model' => 'claude-3.5-sonnet',
    'input_tokens' => 2000,
    'expected_output_tokens' => 1500  // optional
]);
// Returns: ['total_cost' => 0.0285, 'input_cost' => 0.006, 'output_cost' => 0.0225]
```
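Because estimates are cheap, one natural use is comparing models before committing to a call. A self-contained sketch using the per-million prices implied by the examples in this README (gpt-4o: $2.50 in / $10.00 out; claude-3.5-sonnet: $3.00 in / $15.00 out); in real code you would read prices via `TokenCounter::getModelInfo()` rather than hard-coding them:

```
// Estimate the same 2000-in / 1500-out workload on each candidate
// model, then sort cheapest-first.
$prices = [
    'gpt-4o'            => ['in' => 2.5, 'out' => 10.0],
    'claude-3.5-sonnet' => ['in' => 3.0, 'out' => 15.0],
];

$estimates = [];
foreach ($prices as $model => $p) {
    $estimates[$model] = (2000 * $p['in'] + 1500 * $p['out']) / 1_000_000;
}
asort($estimates); // cheapest first
// ['gpt-4o' => 0.02, 'claude-3.5-sonnet' => 0.0285]
```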

Supported Models
----------------


The package supports 100+ models across major AI providers:

### OpenAI


- GPT-4o series (`gpt-4o`, `gpt-4o-mini`, `gpt-4o-realtime`)
- GPT-4 series (`gpt-4`, `gpt-4-turbo`)
- GPT-3.5 (`gpt-3.5-turbo`)
- ChatGPT models (`chatgpt-4o-latest`, `o1`, `o3` series)

### Claude (Anthropic)


- Claude 3.5 Sonnet (`claude-3-5-sonnet-20241022`)
- Claude 3 series (Opus, Sonnet, Haiku)

### Google


- Gemini 2.0 Flash (`gemini-2.0-flash-exp`, `gemini-2.0-flash-thinking-exp`)
- Gemini 1.5 Pro & Flash series

### Others


- DeepSeek (`deepseek-v3`, `deepseek-r1`, `deepseek-r1-lite`)
- Mistral models
- Llama 3.1 & 3.2 series
- Grok models
- ...and many more

### Utility Methods


```
use VizraAi\AiTokens\TokenCounter;

// Get all supported models
$models = TokenCounter::getSupportedModels();

// Get model pricing information
$info = TokenCounter::getModelInfo('gpt-4o');
// Returns: ['input_price_per_million' => 2.5, 'output_price_per_million' => 10.0, 'chars_per_token' => 4.0, 'max_tokens' => 128000]

// Get last pricing update date
$date = TokenCounter::getLastUpdated();
// Returns: '2025-01-04'
```
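The `getModelInfo()` fields are enough to answer practical planning questions, such as "how many output tokens can I afford?". A self-contained sketch (the helper name is illustrative; the model-info array uses the gpt-4o values shown above):

```
// Convert a dollar budget into a token ceiling, clamped to the
// model's context limit.
function maxOutputTokensForBudget(float $budgetUsd, array $modelInfo): int
{
    $affordable = (int) floor($budgetUsd * 1_000_000 / $modelInfo['output_price_per_million']);

    return min($affordable, $modelInfo['max_tokens']); // never exceed the context limit
}

$gpt4o = [
    'input_price_per_million'  => 2.5,
    'output_price_per_million' => 10.0,
    'chars_per_token'          => 4.0,
    'max_tokens'               => 128000,
];

maxOutputTokensForBudget(0.50, $gpt4o); // 50000
maxOutputTokensForBudget(5.00, $gpt4o); // capped at 128000
```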

### Exception Handling


```
use VizraAi\AiTokens\TokenCounter;
use VizraAi\AiTokens\Exceptions\TooManyTokensException;
use InvalidArgumentException;

try {
    $result = TokenCounter::count($longText, 'gpt-3.5-turbo');
} catch (TooManyTokensException $e) {
    // Text exceeds model's token limit
    echo $e->getMessage();
} catch (InvalidArgumentException $e) {
    // Unsupported model name
    echo $e->getMessage();
}
```
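Rather than relying on the exception, you can run a pre-flight check using the same chars-per-token heuristic and `max_tokens` value that `getModelInfo()` exposes. A self-contained sketch (function name is illustrative; values assume gpt-4o's 4.0 chars/token and 128,000-token limit):

```
// Cheap guard: estimate tokens from character count and compare
// against the model's context limit before calling the API.
function fitsInContext(string $text, float $charsPerToken, int $maxTokens): bool
{
    return (int) ceil(strlen($text) / $charsPerToken) <= $maxTokens;
}

fitsInContext('Hello, how are you?', 4.0, 128000);        // true
fitsInContext(str_repeat('a', 600000), 4.0, 128000);      // false (150000 tokens)
```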

Configuration
-------------


The package automatically fetches the latest pricing data from the Vizra AI API. You can configure this behavior:

```
use VizraAi\AiTokens\Config\Pricing;

// Disable remote pricing (use local data only)
Pricing::configure([
    'use_remote_pricing' => false
]);

// Custom API endpoint
Pricing::configure([
    'api_endpoint' => 'https://your-api.com/pricing',
    'cache_duration' => 7200, // 2 hours
]);

// Clear pricing cache
Pricing::clearCache();
```

### Environment Variables


You can also configure via environment variables:

```
# Disable remote pricing fetching
AI_TOKENS_USE_REMOTE=false

# Custom API endpoint
AI_TOKENS_API_ENDPOINT=https://your-api.com/pricing

# Cache duration in seconds (default: 3600)
AI_TOKENS_CACHE_DURATION=7200

# API timeout in seconds (default: 5)
AI_TOKENS_API_TIMEOUT=10
```

Pricing Updates
---------------


Pricing data is automatically fetched from the [Vizra AI Pricing API](https://vizra.ai/ai-llm-model-pricing) which is updated daily. The package includes local fallback data as a backup.

- **Remote Updates**: Fetched automatically (cached for 1 hour)
- **Local Fallback**: Used if API is unavailable
- **Zero Maintenance**: No need to update the package for pricing changes

Last local pricing update: **2025-01-04**
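The remote-with-fallback behavior described above follows a common pattern: try the API, fall back to bundled data, and cache whatever you got for the configured TTL. A hypothetical sketch of that pattern (names are illustrative, not the package's internals):

```
// Resolve pricing data: cached -> remote -> local fallback.
function resolvePricing(callable $fetchRemote, array $localFallback, int $ttlSeconds, array &$cache): array
{
    if (isset($cache['data']) && (time() - $cache['fetched_at']) < $ttlSeconds) {
        return $cache['data']; // fresh cache hit
    }

    try {
        $data = $fetchRemote();
    } catch (\Throwable $e) {
        $data = $localFallback; // API unavailable: use bundled prices
    }

    $cache = ['data' => $data, 'fetched_at' => time()];

    return $data;
}

$cache = [];
$pricing = resolvePricing(
    fn () => throw new \RuntimeException('API down'), // simulate outage
    ['gpt-4o' => ['input' => 2.5, 'output' => 10.0]],
    3600,
    $cache
);
// falls back to the bundled local data
```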

License
-------


MIT

### Health Score

**33 (Low)** · Better than 75% of packages

- Maintenance: 61 · Regular maintenance activity
- Popularity: 22 · Limited adoption so far
- Community: 7 · Small or concentrated contributor base
- Maturity: 34 · Early-stage or recently created project
- Bus Factor: 1 · Top contributor holds 100% of commits (single point of failure)

How is this calculated?

**Maintenance (25%)** — Last commit recency, latest release date, and issue-to-star ratio. Uses a 2-year decay window.

**Popularity (30%)** — Total and monthly downloads, GitHub stars, and forks. Logarithmic scaling prevents top-heavy scores.

**Community (15%)** — Contributors, dependents, forks, watchers, and maintainers. Measures real ecosystem engagement.

**Maturity (30%)** — Project age, version count, PHP version support, and release stability.

### Release Activity

- Cadence: Unknown
- Total releases: 1
- Last release: 243d ago

### Community

Maintainers: [Vizra](/maintainers/vizra) ([@vizra](https://github.com/vizra))

Top contributors: [aaronlumsden](https://github.com/aaronlumsden) (4 commits)

---

Tags

ai, ai-pricing, ai-tools, anthropic, api-cost-calculator, claude, composer, cost-estimation, gemini, gpt-3, gpt-4, large-language-models, llm, machine-learning, openai, php, php-package, token-counter, token-estimation, token-management, api, laravel, tokens, gpt, cost, pricing, estimation, cost-calculator, api-cost

### Code Quality

Tests: PHPUnit

### Embed Badge

![Health badge](/badges/vizra-ai-ai-tokens/health.svg)

```
[![Health](https://phpackages.com/badges/vizra-ai-ai-tokens/health.svg)](https://phpackages.com/packages/vizra-ai-ai-tokens)
```

### Alternatives

- [claude-php/claude-php-sdk-laravel](/packages/claude-php-claude-php-sdk-laravel): Laravel integration for the Claude PHP SDK - Anthropic Claude API (50 · 10.8k)
- [sbsaga/toon](/packages/sbsaga-toon): 🧠 TOON for Laravel — a compact, human-readable, and token-efficient data format for AI prompts & LLM contexts. Perfect for ChatGPT, Gemini, Claude, Mistral, and OpenAI integrations (JSON ⇄ TOON). (61 · 15.6k)
- [deepseek-php/deepseek-php-client](/packages/deepseek-php-deepseek-php-client): deepseek PHP client is a robust and community-driven PHP client library for seamless integration with the Deepseek API, offering efficient access to advanced AI and data processing capabilities. (470 · 73.9k · 5)
- [vizra/vizra-adk](/packages/vizra-vizra-adk): Vizra Agent Development Kit - A comprehensive Laravel package for building intelligent AI agents. (290 · 26.1k)
- [helgesverre/toon](/packages/helgesverre-toon): Token-Oriented Object Notation - A compact data format for reducing token consumption when sending structured data to LLMs (118 · 41.4k · 9)
- [mozex/anthropic-laravel](/packages/mozex-anthropic-laravel): Anthropic PHP for Laravel is a supercharged PHP API client that allows you to interact with the Anthropic API (712 · 26.4k · 1)

PHPackages © 2026

