alex-moreno/glitcherbot
=======================


Visual regression testing made easy
===================================


Automating the boring stuff.

Managing a website can be difficult. You can end up with hundreds of pages, and you need to make sure all of them still work after an event such as a deployment, a traffic spike, editorial changes, or ads and JavaScript behaving badly. Creating software and maintaining websites and apps is hard; regression testing should not be.

Before you start, clone or fork this repository and change into its directory. There are two ways of using this tool: with [Vagrant](#vagrant) or [Docker](#docker).

Vagrant
-------


Download and install [Vagrant](https://www.vagrantup.com/downloads).

### Requirements


- PHP >= 7.4
- SQLite
- A (possibly large) list of URLs to crawl; robots.txt and sitemaps are detected automatically

### Installation


1. Download and unzip the package:

```
curl -L -# -C - -O "https://github.com/alex-moreno/glitcherbot/archive/main.zip"
unzip main.zip
cd glitcherbot-main

```

2. Run: `composer install`
3. Make a copy of the sample configuration:

`cp config.sample.php config.php`

### Usage


Create a .csv file containing the list of URLs to iterate over (see example.csv).
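The input file is simply a list of URLs. A minimal sketch, assuming one URL per line as in the repository's example.csv (these URLs are illustrative, not from the project):

```
https://www.example.com
https://www.example.com/about
https://www.example.com/news
```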

If using Acquia Site Factory, a command is supplied to generate a list of sites from a sites.json file. You'll need to:

1. Download the sites.json from your Acquia Cloud subscription:

`scp  [subscription][ENV].[ENV]@[subscription][ENV].ssh.enterprise-g1.acquia-sites.com:/mnt/files/[subscription][ENV]/files-private/sites.json ./sites-dev.json`

2. Run `vagrant up` if you want to use the crawler inside the virtual machine (recommended).
3. Run the crawl against that JSON file:

`php bin/visual_regression_bot.php acquia:acsf-crawl-sites sites.json`

You can see all available commands by running:

`php bin/visual_regression_bot.php`

For help with a specific command use:

`php bin/visual_regression_bot.php help <command>`

While debugging, increase verbosity by adding `-v` flags:

- `-v`: verbose
- `-vv`: very verbose
- `-vvv`: debug

For example: `php bin/visual_regression_bot.php acquia:acsf-crawl-sites sites.json -vvv`

### Configuration


There are some settings you can configure, such as the headers sent to the site or the concurrency you want to use.

Copy config.sample.php to config.php and adapt it to your needs.
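A hypothetical sketch of what config.php might contain, based on the settings mentioned above; the key names `headers` and `concurrency` are illustrative assumptions, not the package's actual schema — check config.sample.php in the repository for the real keys and defaults:

```
<?php

// Illustrative configuration values only; the real option names
// live in config.sample.php in the glitcherbot repository.
return [
    // HTTP headers sent with each crawl request.
    'headers' => [
        'User-Agent' => 'GlitcherBot/0.1',
        'Accept'     => 'text/html',
    ],
    // Number of concurrent requests the crawler may issue.
    'concurrency' => 10,
];
```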