

vielhuber/runpodhelper
======================

Automates self-hosted LLM inference.

Version 1.2.9 · MIT · Shell · PHP >= 8.1

[Source](https://github.com/vielhuber/runpodhelper) · [Packagist](https://packagist.org/packages/vielhuber/runpodhelper)

⛈ runpodhelper ⛈
================


runpodhelper automates the full lifecycle of self-hosted LLM inference on the RunPod GPU cloud: it provisions pods via the RunPod GraphQL API, installs LM Studio, downloads GGUF models from Hugging Face, and serves them behind a Cloudflare Tunnel.

usage
-----


```
./vendor/bin/runpod.sh create \
    --id 001 \
    --gpu "NVIDIA A40" \
    --hdd 50 \
    --model "unsloth/Qwen3.5-27B-GGUF-UD-Q4_K_XL" \
    --context-length 65536 \
    --lmstudio-api-key "your-static-api-key" \
    --parallel 16 \
    --auto-destroy 3600
```
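once `create` finishes, you can sanity-check that the pod is actually serving. each pod is reachable at `<config-id>.<your-domain>` (see the installation section below); a minimal sketch, where `custom.xyz` and the API key are placeholders for your own values:

```shell
# build the per-pod base URL from a config id; CLOUDFLARE_DOMAIN is a placeholder
CLOUDFLARE_DOMAIN="custom.xyz"

pod_url() {
    printf 'https://%s.%s' "$1" "$CLOUDFLARE_DOMAIN"
}

pod_url 001   # prints https://001.custom.xyz

# then query the OpenAI-compatible models route served by LM Studio:
# curl -fsS -H "Authorization: Bearer your-static-api-key" "$(pod_url 001)/v1/models"
```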

- `./vendor/bin/runpod.sh status`
- `./vendor/bin/runpod.sh delete --all`
- `./vendor/bin/runpod.sh delete --id 001`
- `./vendor/bin/runpod.sh test quality --runs 5`
- `./vendor/bin/runpod.sh test quantity --runs 30`

```
./vendor/bin/runpod.sh loadbalancer --start \
    --gpu "NVIDIA A40" \
    --hdd 50 \
    --model "unsloth/Qwen3.5-27B-GGUF-UD-Q4_K_XL" \
    --context-length 65536 \
    --lmstudio-api-key "your-static-api-key" \
    --parallel 16 \
    --min-pods 8 \
    --max-pods 16 \
    --auto-destroy 3600

./vendor/bin/runpod.sh loadbalancer --stop
```

installation
------------


- install library
    - `composer require vielhuber/runpodhelper`
    - `./vendor/bin/runpod.sh init`
- setup cloudflare
    - Create a domain `custom.xyz`
    - Profile > API Tokens > Create Token
        - Permissions:
            - `Zone / DNS / Edit`
            - `Zone / Single Redirect / Edit`
            - `Account / Cloudflare Tunnel / Edit`
        - Account Resources:
            - `Include / Your account`
        - Zone Resources:
            - `Include / Specific zone / custom.xyz`
    - Set `CLOUDFLARE_DOMAIN`/`CLOUDFLARE_API_KEY` in `.env`
    - Each pod gets a subdomain based on its config ID:
        - `001.custom.xyz`
        - `002.custom.xyz`
        - …
- edit config
    - `vi ./.env`
    - `vi ./models.yaml`
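a minimal `.env` sketch — all values are placeholders, and only `CLOUDFLARE_DOMAIN`/`CLOUDFLARE_API_KEY` are named above; any other variable (such as a RunPod API key for the GraphQL calls) is an assumption, so check the template generated by `runpod.sh init`:

```
# cloudflare zone and the api token created above
CLOUDFLARE_DOMAIN=custom.xyz
CLOUDFLARE_API_KEY=your-cloudflare-api-token

# assumed: the runpod graphql api needs its own key
RUNPOD_API_KEY=your-runpod-api-key
```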

mcp server
----------


```
{
    "mcpServers": {
        "runpodhelper": {
            "command": "/usr/bin/php",
            "args": ["/path/to/project/runpodhelper/bin/mcp-server.php"]
        }
    }
}
```
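since MCP servers speak JSON-RPC 2.0 over stdio, a quick smoke test is to pipe an `initialize` request into the server and look for a JSON response. the request shape below follows the MCP specification; the protocol version string and client name are illustrative:

```shell
# a hypothetical initialize request per the MCP spec (JSON-RPC 2.0 over stdio)
REQUEST='{"jsonrpc":"2.0","id":1,"method":"initialize","params":{"protocolVersion":"2024-11-05","capabilities":{},"clientInfo":{"name":"smoke-test","version":"0.0.0"}}}'
printf '%s\n' "$REQUEST"

# pipe it into the server (same path as in the config above):
# printf '%s\n' "$REQUEST" | /usr/bin/php /path/to/project/runpodhelper/bin/mcp-server.php
```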

recommended models
------------------


| Name                    | HDD   | Model                       | Context length | tok/s | Notes                                        |
| ----------------------- | ----- | --------------------------- | -------------- | ----- | -------------------------------------------- |
| NVIDIA GeForce RTX 5090 | 50 GB | Qwen3.5-27B-GGUF-UD-Q4_K_XL | 65536          | ~43   | best current MCP/tool-use baseline           |
| NVIDIA A40              | 50 GB | Qwen3.5-27B-GGUF-UD-Q4_K_XL | 65536          | ~20   | ~2× slower than RTX 5090, identical quality  |

manual deployment
-----------------


- RunPod > Pods > Deploy
- Pod template > Edit
- Expose HTTP ports (comma separated): `1234`
- Container Disk: `100 GB`
- Copy: SSH over exposed TCP
- `ssh root@xxxxxxxxxx -p xxxxx`

```
curl -fsSL https://lmstudio.ai/install.sh | bash
export PATH="/root/.lmstudio/bin:$PATH"
# this is unreliable
#lms get -y qwen/qwen3-coder-next
mkdir -p ~/.lmstudio/models/unsloth/MiniMax-M2.1-GGUF
cd ~/.lmstudio/models/unsloth/MiniMax-M2.1-GGUF
wget -c https://huggingface.co/unsloth/MiniMax-M2.1-GGUF/resolve/main/MiniMax-M2.1-UD-TQ1_0.gguf
mkdir -p ~/.lmstudio/models/lmstudio-community/Qwen3.5-35B-A3B-GGUF
cd ~/.lmstudio/models/lmstudio-community/Qwen3.5-35B-A3B-GGUF
wget -c https://huggingface.co/lmstudio-community/Qwen3.5-35B-A3B-GGUF/resolve/main/Qwen3.5-35B-A3B-Q4_K_M.gguf
lms server start --port 1234 --bind 0.0.0.0
```
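after `lms server start`, the API can take a moment to come up. a small helper (an assumption, not part of the package) that polls the local models route until it answers:

```shell
# poll the LM Studio server until it responds, or give up after 30 tries
wait_for_server() {
    local tries=0
    until curl -fsS "http://localhost:1234/v1/models" > /dev/null 2>&1; do
        tries=$((tries + 1))
        [ "$tries" -ge 30 ] && return 1
        sleep 2
    done
}
```

call `wait_for_server` before pointing clients at the pod; it returns non-zero if the server never comes up.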

alternative: use runpodctl
--------------------------


- `ssh-keygen -t ed25519 -C "name@tld.com"`
- `wget https://github.com/Run-Pod/runpodctl/releases/download/v1.14.3/runpodctl-linux-amd64 -O runpodctl`
- `chmod +x runpodctl`
- `mv runpodctl /usr/bin/runpodctl`
- `runpodctl config --apiKey `
- `runpodctl version`

more commands
-------------


- `curl http://localhost:1234/v1/models`
- `lms --help`
- `lms status`
- `lms server stop`
- Copy: HTTP services > URL

```
curl https://xxxxxxxxx-1234.proxy.runpod.net/v1/responses \
  -X POST \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer your-static-api-key" \
  -d '{
    "model": "xxxxxxxxxxxxx",
    "messages": [
        {"role": "user", "content": [{"type": "input_text", "text": "hi"}]}
    ],
    "temperature": 1.0,
    "stream": true
  }'
```
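with `"stream": false` the endpoint returns a single JSON object instead of SSE chunks. assuming the response follows the usual OpenAI chat-completions shape (a guess — inspect your actual output), the assistant text can be pulled out in plain shell; shown here on a canned sample:

```shell
# canned sample response (shape assumed from the OpenAI chat-completions format)
RESPONSE='{"choices":[{"message":{"role":"assistant","content":"hello"}}]}'

# crude extraction without jq; fine for quick checks, not for production parsing
printf '%s\n' "$RESPONSE" | sed -n 's/.*"content":"\([^"]*\)".*/\1/p'   # prints hello
```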

