mahdyfo/rotifer
===============

A genetic AI framework that evolves its own deep learning architecture (AutoML)

v1.1.0 · GPL-3.0-or-later · PHP >=8.0.0

[Source](https://github.com/mahdyfo/rotifer) · [Packagist](https://packagist.org/packages/mahdyfo/rotifer)

🌱 Rotifer: A Genetic AI That Evolves Itself
===========================================


🧠 Autonomous Neural Evolution | Faster Convergence Than Keras
-------------------------------------------------------------


**"The most powerful AI is one that builds itself."** - Mahdi Forghani Fard, 2023

**Our brain wasn't designed by an engineer. Why should our AIs be?**

[![PHP](https://camo.githubusercontent.com/5ce9fec63d2fc709c4840df682256782449cedaab15cae56caa7575ef16f0c9b/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f4275696c74253230576974682d5048502d626c75652e737667)](#)

Rotifer is a cutting-edge **Genetic Machine Learning (AutoML) Framework** that **evolves its neural network architecture, layers, and weights** - all without manual intervention.

Unlike Keras or traditional frameworks, where the architecture is fixed, **Rotifer builds itself from scratch**, adding and removing neurons and connections via genetic algorithms. The result is a highly optimized, truly emergent structure for the training dataset, and **faster convergence** - like nature intended.

🔥 Why Use Rotifer?
------------------


- 🧬 **Auto-evolving neural networks** (AutoML)
- 🚀 **Faster convergence** than traditional fixed-structure models
- 🧠 **No manual layer or neuron count tuning** - you can fix the structure if you want, but by default it evolves to find the best setup
- 🔄 **Mutation, crossover, and self-replication**
- 💡 **Ideal for creative AI and problem solving without predefined architecture**
- 🧩 Examples in the repository: **Autoencoder**, **Memory-Based learning**, and **XOR**
- 💻 Written in **pure PHP** - lightweight and hackable

🧬 How It Works
--------------


Rotifer simulates a digital world of agents. Each agent carries genes that define a neural network and its weights. The framework handles:

- Neurons
- Connections
- Mutation and crossover
- Activation functions
- Selection and reproduction

Over generations, the population evolves better architectures, learning the task without ever knowing the number of layers or neurons required.
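
For intuition, the loop above can be sketched in a few lines of plain PHP. This is an illustration of the generic genetic loop (selection, crossover, mutation) over flat weight arrays, not Rotifer's actual implementation; `evolve` and its parameters are hypothetical:

```
// Illustrative sketch only - not Rotifer's real code. Agents are flat arrays of
// weights; $fitness scores an agent; the best fraction survives and reproduces.
function evolve(array $population, callable $fitness, float $survivalRate, float $mutationRate): array
{
    // 1. Selection: rank by fitness (highest first), keep the top fraction
    usort($population, fn($a, $b) => $fitness($b) <=> $fitness($a));
    $survivors = array_slice($population, 0, max(2, (int)(count($population) * $survivalRate)));

    // 2. Reproduction: survivors carry over (elitism), children fill the rest
    $next = $survivors;
    while (count($next) < count($population)) {
        $mom = $survivors[array_rand($survivors)];
        $dad = $survivors[array_rand($survivors)];

        // Crossover: each gene (weight) is taken from one parent at random
        $child = array_map(fn($m, $d) => mt_rand(0, 1) ? $m : $d, $mom, $dad);

        // Mutation: occasionally nudge a weight by a small random amount
        foreach ($child as $i => $w) {
            if (mt_rand() / mt_getrandmax() < $mutationRate) {
                $child[$i] = $w + (mt_rand() / mt_getrandmax() - 0.5);
            }
        }
        $next[] = $child;
    }
    return $next;
}
```

Because survivors are carried into the next generation unchanged, the best fitness can never decrease between generations. Rotifer goes further and also mutates the topology (neurons and connections), not just the weights.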

[![Single Layer neural network with intra-connections](https://github.com/mahdyfo/php-genetic-ai-automl/raw/main/neural_layerings.jpg?raw=true)](https://github.com/mahdyfo/php-genetic-ai-automl/blob/main/neural_layerings.jpg?raw=true)

These two neural networks are identical: all hidden layers can be combined into a single layer with intra-connections.

Unlike traditional networks with rigid layers, Rotifer condenses complexity into a single, dynamically growing hidden layer, which eliminates the need to configure neuron and layer counts manually. After several generations of evolution this single hidden layer becomes too complex for humans to understand, but that doesn't matter: the goal is not to analyze the network, only to make it powerful.
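
To make the picture concrete, here is a toy sketch of such a layer: neurons are evaluated in a fixed order, and any neuron may read the output of any earlier neuron or input, so a conventional layer stack is just one special case of the connection graph. This is illustrative only - the function and data layout are not Rotifer's API:

```
// Toy model of a single hidden layer with intra-connections (illustrative only).
// Each connection is [from, to, weight]; "from" is an input key like "i0" or an
// earlier neuron's index; "to" is a neuron index or "out".

function sigmoid(float $x): float
{
    return 1.0 / (1.0 + exp(-$x));
}

function forward(array $inputs, int $neuronCount, array $connections): float
{
    $values = [];
    foreach ($inputs as $k => $v) {
        $values["i$k"] = $v;
    }

    // Evaluate neurons in order; neuron $n only sees values computed before it,
    // which is exactly how stacked layers behave, but without fixed boundaries.
    for ($n = 0; $n < $neuronCount; $n++) {
        $sum = 0.0;
        foreach ($connections as [$from, $to, $w]) {
            if ($to === $n && isset($values[$from])) {
                $sum += $values[$from] * $w;
            }
        }
        $values[$n] = sigmoid($sum);
    }

    // Output: weighted sum of whichever neurons connect to it
    $out = 0.0;
    foreach ($connections as [$from, $to, $w]) {
        if ($to === 'out' && isset($values[$from])) {
            $out += $values[$from] * $w;
        }
    }
    return sigmoid($out);
}
```

A chain like input → neuron 0 → neuron 1 → output behaves exactly like two stacked one-neuron layers, while extra cross-connections between the neurons would have no clean multi-layer equivalent.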

📦 Install
---------


```
git clone https://github.com/mahdyfo/rotifer.git
cd rotifer
composer install
```

🧪 Built-in Examples
-------------------


| Task | Run File | Description |
| --- | --- | --- |
| 🧠 XOR | xor.php | Classic XOR learning in a few generations |
| 🧠 Memory | memory.php | Remembers sequence outputs even if inputs repeat |
| 🧠 AutoEncoder | autoencoder.php | Compresses and decompresses data through evolved layers |
| 🧠 Other | yourfile.php | Write any other script you want, using the examples |

🚀 Quick Start
--------------


✅ Solve XOR in seconds:

```
php xor.php
```

Example output:

```
Generation 1 - Best generation fitness: 5.3965296271639 - Best overall fitness: 5.3965296271639
Generation 50 - Best generation fitness: 5.9992278738651 - Best overall fitness: 5.9992278738651
Generation 100 - Best generation fitness: 6.0455893609229 - Best overall fitness: 6.7389574321586
Generation 150 - Best generation fitness: 7.4842880310069 - Best overall fitness: 7.6137585607025
Generation 200 - Best generation fitness: 7.5486734099125 - Best overall fitness: 7.9401862706856

Report:
  Best fitness => 7.940186270685596 (value based on the fitness function)
  Hidden Neurons Count => 7 (The hidden neurons created automatically for the best agent after 200 generations of evolution)
  Connections Count => 52 (The total connections count between neurons. Not all neurons are connected which results in being precise, fast and powerful)

Test:
    Rounded Output: 1 - Raw output: 0.99712500243069
    Rounded Output: 0 - Raw output: 0.00030062252549047
    Rounded Output: 0 - Raw output: 0.0019566823546141
    Rounded Output: 1 - Raw output: 0.99504714984784
    Rounded Output: 1 - Raw output: 0.99970922413458
    Rounded Output: 0 - Raw output: 0.00004324516515
    Rounded Output: 0 - Raw output: 0.0042442188361674
    Rounded Output: 1 - Raw output: 0.95487230539894
(Notice how close the numbers are to the intended outputs. Try it with Keras to see the difference!)

```
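
A note on the fitness scale (my reading of the example output, not something the repository states explicitly): if the per-row fitness `1.0 - abs(error)` is summed over the 8 test rows, the theoretical maximum is 8.0, so a best fitness of about 7.94 corresponds to an average absolute error under 1%:

```
// Assumption: total fitness = sum over rows of (1.0 - |predicted - actual|),
// so 8 rows give a theoretical maximum fitness of 8.0.
$bestFitness = 7.940186270685596; // reported by the example run above
$rows = 8;
$avgError = ($rows - $bestFitness) / $rows; // average absolute error per row
printf("average error per row: %.4f\n", $avgError);
```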

📚 Code Breakdown
----------------


Create world and agents:

```
$population = 100;
// XOR data with bias value
$data = [
    [[1, 0, 0], [0]],
    [[1, 0, 1], [1]],
    [[1, 1, 0], [1]],
    [[1, 1, 1], [0]],
    //^ Here we set the first value of each input as bias (1). You can skip adding bias by removing them
];
$inputs = 3; // Inputs dimension
$outputs = 1; // Outputs dimension
$hasMemory = false; // Whether the agent should remember the input sequence. XOR doesn't need memory; sequence tasks (like language models) do.

$layers = []; // Empty: the optimal layer/neuron structure is found automatically by evolution
//$layers = [3, 2]; // Or fixed: 3 neurons in the 1st hidden layer, 2 in the 2nd. Evolution then only tunes the weights.
// When using dynamic layers, set the PROBABILITY_MUTATE_* constants to values greater than zero

$world = new World('MyExampleWorld');
$world->createAgents($population, $inputs, $outputs, $layers, $hasMemory);
```

---

Fitness function:

```
$fitnessFunction = function (Agent $agent, $dataRow, $otherAgents, $world) {
    // Here you have access to important variables such as the current $agent, the current inputs and outputs in $dataRow,
    // Other agents and the world instance. So you can even make the agents communicate with each other!
    $predicted = $agent->getOutputValues()[0]; // The predicted value based on the current agent genes
    $actual = $dataRow[1][0]; // The training set actual value
    return 1.0 - abs($predicted - $actual); // The fitness: the opposite of an error function, so higher is better
};
```

---

Run the world:

```
$generations = 30; // How many generations the world simulation should run. 0 makes it endless.
$survivalRate = 0.2; // The best 20% of agents in the world reproduce the next generation
$world->step($fitnessFunction, $data, $generations, $survivalRate);
```

---

Test best agent:

```
$agent = $world->getBestAgent();
$agent->reset(); // Reset any memory left over from the training phase, like a new-born child
// Iterate through data and test the inputs one by one to see output values
foreach ($data as $row) {
    $agent->step($row[0]);
    echo $agent->getOutputValues()[0];
}
```

---

Load saved worlds:

```
// Load the saved world to continue the training process
$world = World::loadAutoSaved('MyExampleWorld');
```

---

Constants:

```
const PROBABILITY_CROSSOVER = 0.5; // Crossover probability; 0.5 means half the genes come from the mother, half from the father
const PROBABILITY_MUTATE_WEIGHT = 0.4; // Fraction of agents that get mutated
const MUTATE_WEIGHT_COUNT = 1; // Number of weight mutations in every agent that gets mutated
const PROBABILITY_MUTATE_ADD_NEURON = 0.04; // Probability of adding a neuron in a mutation
const PROBABILITY_MUTATE_REMOVE_NEURON = 0.04; // Probability of removing a neuron in a mutation
const PROBABILITY_MUTATE_ADD_GENE = 0.1; // Probability of adding a new connection between neurons in a mutation
const PROBABILITY_MUTATE_REMOVE_GENE = 0.1; // Probability of removing a connection between two neurons in a mutation
const ACTIVATION = [Activation::class, 'sigmoid']; // The activation function. Find more values in Activation class
const SAVE_WORLD_EVERY_GENERATION = 0; // 0 means don't save
const CALCULATE_STEP_TIME = false; // Measure each step's time for the agent and save it in $agent->stepTime
const ONLY_CALCULATE_FIRST_STEP_TIME = false; // Only measure the first step's time instead of all steps
```

---

❤️ Support & Contribution
-------------------------


We welcome issues, stars, pull requests, and collaborations. Feel free to contribute ideas or to optimize the agent training strategies. Potential ideas:

- Feed the outputs of the memory network back into a second, smaller network with 3 outputs (type: add/delete neuron or connection, index1, index2), so the network can modify itself during testing
- Parallelization
- GPU
- Visualizer for agent evolution
- Web interface for experiments
- CLI commands for training/testing
- Example for multi-agent cooperative environments

⭐️ Star Us
----------


If you believe in self-evolving AI, please give this repo a ⭐️ to support future development and spread awareness!

genetic ai - automl - evolving neural network - genetic algorithm neural net - neural architecture search - php ai - machine learning - self-evolving ai - keras alternative - pytorch alternative - autoencoder genetic - xor ai
