

Abandoned · Archived · Library

juhlinus/depictr
================

v2.0.1 (5y ago) · 754.1k downloads · [4 issues](https://github.com/Juhlinus/depictr/issues) · [2 PRs](https://github.com/Juhlinus/depictr/pulls) · MIT · PHP ^7.4

Since Apr 18 · Pushed 5y ago · 5 watchers

[Source](https://github.com/Juhlinus/depictr) · [Packagist](https://packagist.org/packages/juhlinus/depictr) · [GitHub Sponsors](https://github.com/juhlinus)


📽 Depictr
=========


💰 Is this useful to you?
------------------------


**Consider [sponsoring me on github](https://github.com/sponsors/juhlinus)! 🙏**

💾 Installation
--------------


```
composer require juhlinus/depictr

```

📝 Config
--------


You can publish the config by running the `vendor:publish` Artisan command like so:

```
php artisan vendor:publish --provider="Depictr\ServiceProvider"

```

### 🕷 Crawlers


The following crawlers are defined out of the box:

```
return [
    'crawlers' => [
        /*
        |--------------------------------------------------------------------------
        | Search engines
        |--------------------------------------------------------------------------
        |
        | This is the list of regular search engines that crawl your
        | website on a regular basis; allowing them is crucial for
        | good SEO.
        |
        */
        'googlebot',            // Google
        'duckduckbot',          // DuckDuckGo
        'bingbot',              // Bing
        'yahoo',                // Yahoo
        'yandexbot',            // Yandex

        /*
        |--------------------------------------------------------------------------
        | Social networks
        |--------------------------------------------------------------------------
        |
        | Allowing social networks to crawl your website lets them create
        | "social cards", which are what people see when your website is
        | linked on those networks.
        |
        */
        'facebookexternalhit',  // Facebook
        'twitterbot',           // Twitter
        'whatsapp',             // WhatsApp
        'linkedinbot',          // LinkedIn
        'slackbot',             // Slack

        /*
        |--------------------------------------------------------------------------
        | Other
        |--------------------------------------------------------------------------
        |
        | For posterity's sake you want to make sure that your website can be
        | crawled by Alexa. This will archive your website so that future
        | generations may gaze upon your craftsmanship.
        |
        */
        'ia_archiver',          // Alexa
    ],
];
```
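Crawler detection of this kind is typically a case-insensitive substring check of the request's `User-Agent` header against the configured list. A minimal sketch of that idea (this is an illustration of the technique, not Depictr's actual implementation):

```php
<?php

// Sketch: return true when the User-Agent contains any of the
// configured crawler tokens, matched case-insensitively.
function isCrawler(string $userAgent, array $crawlers): bool
{
    foreach ($crawlers as $crawler) {
        if (stripos($userAgent, $crawler) !== false) {
            return true;
        }
    }

    return false;
}

$crawlers = ['googlebot', 'bingbot', 'twitterbot'];

var_dump(isCrawler('Mozilla/5.0 (compatible; Googlebot/2.1)', $crawlers)); // true
var_dump(isCrawler('Mozilla/5.0 (Windows NT 10.0) Chrome/120.0', $crawlers)); // false
```

When a crawler is matched, the middleware can serve a prerendered page; all other visitors get the normal JavaScript application.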

### ⛔ Exclusion


Depictr comes with the option of excluding an array of URLs that should not be processed.

This is useful for plain-text files like `sitemap.txt`, which Panther would otherwise wrap in a stripped-down HTML document. Wildcards are permitted.

By default, the `admin` route and its sub-routes are excluded in the config file.
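For illustration, an exclusion list in the published config might look like the fragment below; the `excluded` key name and the exact entries are assumptions based on the description above, not the package's shipped defaults:

```php
<?php

// Hypothetical config fragment: URLs Depictr should skip.
return [
    'excluded' => [
        'admin',       // the admin route itself
        'admin/*',     // all admin sub-routes (wildcards are permitted)
        'sitemap.txt', // plain-text files should not be wrapped in HTML
    ],
];
```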

### 🏞 Environments


You can specify which environments Depictr should run in. The defaults are `testing` and `production`.
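That setting presumably looks something like the fragment below in the published config; the `environments` key name is an assumption, though the default values come from the text above:

```php
<?php

// Hypothetical config fragment: environments where Depictr is active.
return [
    'environments' => [
        'testing',
        'production',
    ],
];
```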

### Health Score

**36 (Low)**: better than 81% of packages

- Maintenance: 19 (infrequent updates; may be unmaintained)
- Popularity: 33 (limited adoption so far)
- Community: 16 (small or concentrated contributor base)
- Maturity: 63 (established project with proven stability)
- Bus factor: 1 (top contributor holds 50% of commits, a single point of failure)

How is this calculated?

- **Maintenance (25%)**: last commit recency, latest release date, and issue-to-star ratio. Uses a 2-year decay window.
- **Popularity (30%)**: total and monthly downloads, GitHub stars, and forks. Logarithmic scaling prevents top-heavy scores.
- **Community (15%)**: contributors, dependents, forks, watchers, and maintainers. Measures real ecosystem engagement.
- **Maturity (30%)**: project age, version count, PHP version support, and release stability.

### Release Activity

- Cadence: every ~16 days (recently: every ~29 days)
- Total releases: 9
- Last release: 2080 days ago
- Major versions: v1.1.1 → v2.0.0 (2020-08-03)

### Community

Maintainers: [Linus Juhlin (@Juhlinus)](https://github.com/Juhlinus)

Top contributors: [c17r](https://github.com/c17r) (2 commits), [dependabot[bot]](https://github.com/apps/dependabot) (1 commit), [TheDoctor0](https://github.com/TheDoctor0) (1 commit)

Tags: `inertiajs`, `laravel`, `middleware`, `php`, `ssr`

### Embed Badge

![Health badge](/badges/juhlinus-depictr/health.svg)

```
[![Health](https://phpackages.com/badges/juhlinus-depictr/health.svg)](https://phpackages.com/packages/juhlinus-depictr)
```

### Alternatives

- [zrashwani/arachnid](/packages/zrashwani-arachnid): a crawler to find all unique internal pages on a given website
- [robertfausk/mink-panther-driver](/packages/robertfausk-mink-panther-driver): Symfony Panther driver for the Mink framework
- [guikingone/panther-extension](/packages/guikingone-panther-extension): Panther extension for Mink

PHPackages © 2026

