
> ⚠️ This project is abandoned and its repository has been archived.

bluesparklabs/spark
===================

✨ Toolkit to develop, test and run Drupal websites.

Version 0.2.0 (7 years ago) · [13 issues](https://github.com/BluesparkLabs/spark/issues) · MIT · PHP

Since Jul 27 · last pushed 7 years ago · 13 watchers

[Source](https://github.com/BluesparkLabs/spark) · [Packagist](https://packagist.org/packages/bluesparklabs/spark)


Spark ✨
=======


Toolkit to develop, test and run PHP applications.

Spark provides a turnkey Docker-based **environment** for development and continuous integration. It ships **commands** to create anonymized database exports, execute test suites, initialize a Solr index, and more. Spark simply needs to be added as your project's **dependency**, and after a few minimal configuration steps you're ready to go.

["Concerning toolkits"](https://blog.kentcdodds.com/concerning-toolkits-4db57296e1c3), an article by [Kent C. Dodds](https://github.com/kentcdodds), was a great inspiration for Spark's architecture.

Roadmap
-------


- We're in the middle of implementing key database interactions in order to be able to use Spark for creating GDPR-compliant, anonymized database exports. ([See our board here.](https://github.com/BluesparkLabs/spark/projects/1))
- Once the database features are in place, we'll turn to preparing our first alpha release, which will introduce a more flexible way to define the services a project's environment requires.

Getting Started — How to Sparkify your project
----------------------------------------------


Check out the [Drupal 8 example project](https://github.com/BluesparkLabs/spark/tree/master/examples/drupal8).

Here are the main steps:

**1. Add Spark as a dependency:**

```
$ composer require bluesparklabs/spark
```

**2. Define a new script in your `composer.json`:**

```
"scripts": {
  "spark": "SPARK_WORKDIR=`pwd` robo --ansi --load-from vendor/bluesparklabs/spark"
}
```

**3. Add autoload information to your `autoload` field in your `composer.json`:**

```
"autoload": {
    "psr-4": {
        "BluesparkLabs\\Spark\\": "./vendor/bluesparklabs/spark/src/"
    }
},
```
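After editing the `autoload` field, Composer's autoloader needs to be regenerated so the new PSR-4 mapping takes effect:

```shell
# Regenerate the autoloader so the BluesparkLabs\Spark\ mapping is picked up.
composer dump-autoload
```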

**4. Create a file named `.spark.yml` in your project's root.** This will be your project-specific configuration that Spark will use.

To learn about how to write your project-specific configuration, please refer to our [`.spark.example.yml` file](https://github.com/BluesparkLabs/spark/blob/master/.spark.example.yml).
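As a purely illustrative sketch of what a project-specific configuration file might look like, consider the following. Every key name below is invented for illustration only; the authoritative structure is the [`.spark.example.yml` file](https://github.com/BluesparkLabs/spark/blob/master/.spark.example.yml) linked above, so copy your keys from there, not from this block.

```yaml
# Illustrative sketch only — these key names are NOT Spark's actual schema.
# See .spark.example.yml in the repository for the real structure.
name: my-project
drupal:
  docroot: web
mysql:
  sanitize:
    users: truncate
```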

**5. Optional: Create a file named `.spark.local.yml` in your project's root.** This will be your environment-specific configuration that Spark will use. Do not commit this file to your repository. If you want to leverage environment-specific configuration for CI builds or in your hosting environment, the recommended way is to keep these files in your repository named specifically, i.e. `.spark.local.ci.yml`, and then ensure you have automation in place that renames it to `.spark.local.yml` in the appropriate environment.

To learn about how to write your environment-specific configuration, please refer to our [`.spark.local.example.yml` file](https://github.com/BluesparkLabs/spark/blob/master/.spark.local.example.yml).
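The rename step described above can be sketched as a small CI step. The filenames follow the convention from the text; everything else (where this runs in your pipeline) is an assumption about your CI setup:

```shell
# In CI, promote the environment-specific config before Spark runs.
# .spark.local.ci.yml is the committed CI variant; .spark.local.yml is
# the uncommitted file Spark actually reads.
if [ -f .spark.local.ci.yml ]; then
  cp .spark.local.ci.yml .spark.local.yml
fi
```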

### Recommended `composer.json` bits


See the [Drupal 8 example project's `composer.json` file](https://github.com/BluesparkLabs/spark/blob/master/examples/drupal8/composer.json).

**1.** Composer by default installs all packages under a directory called `./vendor`. **Use [`composer/installers`](https://packagist.org/packages/composer/installers) to define installation destinations** for Drupal modules, themes etc. Example configuration in `composer.json`:

```
"extra": {
    "installer-paths": {
        "web/core": ["type:drupal-core"],
        "web/libraries/{$name}": ["type:drupal-library"],
        "web/modules/contrib/{$name}": ["type:drupal-module"],
        "web/profiles/contrib/{$name}": ["type:drupal-profile"],
        "web/themes/contrib/{$name}": ["type:drupal-theme"],
        "drush/contrib/{$name}": ["type:drupal-drush"]
    }
}
```

**2.** In case you're working with a **Drupal site, use [drupal-composer/drupal-scaffold](https://packagist.org/packages/drupal-composer/drupal-scaffold) to install and update files that are outside of the `core`** folder and which are not part of the [drupal/core](https://packagist.org/packages/drupal/core) package. This Composer plugin will take care of the files whenever you install or update `drupal/core`, but to run it manually, you can add a script to your `composer.json`:

```
"scripts": {
    "drupal-scaffold": "DrupalComposer\\DrupalScaffold\\Plugin::scaffold"
},
```

**3.** Spark has a command, `drupal:files`, to ensure the `files` folder exists with the right permissions, and that there's a `settings.php` file and a `settings.spark.php` which currently holds Spark's Docker-specific configuration, i.e. database connection etc. You may want to add this command to your `scripts` field in your `composer.json`, so that Composer executes it when packages are installed or updated:

```
"scripts": {
    "post-install-cmd": "composer run spark drupal:files",
    "post-update-cmd": "composer run spark drupal:files"
}
```

Usage
-----


This is how you can run a Spark command:

```
$ composer run spark
```

Tip: Set up `spark` as a command-line alias to `composer run spark`.
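In Bash or Zsh, for example, that alias could look like this (shell-specific; adapt for your shell):

```shell
# Add to ~/.bashrc or ~/.zshrc, so that `spark <command>` forwards to Composer.
alias spark='composer run spark'
```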

To list all available commands, just issue `$ composer run spark`. Here is a high-level overview of what you can currently do with Spark:

| Command namespace | Description |
| --- | --- |
| `drush` | Execute Drush commands |
| `containers` | Manage a Docker-based environment |
| `db` | Drop or import database, check its availability |
| `drupal` | (Being deprecated.) Create backup and upload to an Amazon S3 bucket, ensure `files` directory and `settings.php` files, install Drupal |
| `mysql` | Import or export database. (Will eventually deprecate the `db` command group.) |
| `solr` | Initialize a Solr core with configuration for Drupal, check its availability |
| `test` | Execute test suite |

Commands
--------


Notes:

- Commands will be documented here as they become ready for prime time.
- ⚠️ When using command-line arguments, you need to include a double-dash (`--`) before your arguments. E.g. `composer run spark mysql:dump -- --non-sanitized`. (See the [reason for this](https://github.com/BluesparkLabs/spark/issues/10#issuecomment-424646525) and the [proposed solution](https://github.com/BluesparkLabs/spark/issues/36).)

### `mysql:dump`


Exports the database to a file. By default the file is placed in the current directory and the data is sanitized according to the sanitization rules in `.spark.yml`. The following command-line arguments are optional.

| Argument | Description |
| --- | --- |
| `--non-sanitized` | Produces a non-sanitized data export. |
| `--destination` | Directory where the export file will be placed. Can be an absolute or a relative path. |
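Combining these arguments with the double-dash rule from the notes above, an invocation might look like this (the `./backups` path is just an example):

```shell
# Write a non-sanitized dump into ./backups. The `--` separates Composer's
# own arguments from the ones passed through to Spark.
composer run spark mysql:dump -- --non-sanitized --destination=./backups
```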

