PHPackages — ascentech/massive-csv-import


Active · Library

ascentech/massive-csv-import
============================

This lightweight package helps developers import CSV files with millions of records efficiently using Laravel queues.

v1.1.3 (2y ago) · 114 · MIT · PHP

Since Jul 7 · Pushed 2y ago · 1 watcher

[Source](https://github.com/giftonian/massive-csv-import) · [Packagist](https://packagist.org/packages/ascentech/massive-csv-import) · Synced 1mo ago


Massive CSV Import
==================


- This lightweight package helps developers import CSV files with millions of records efficiently using Laravel queues.

Prerequisites
-------------


- You must be using Laravel queues, and the `jobs` table must exist in your database. If you are not using queues yet, set them up by following the [Laravel queue documentation](https://laravel.com/docs/10.x/queues).
- Write privileges on the `storage` directory of your Laravel project. You can change this location in the package's configuration file as well.
- By default, this package looks for the required model class in the `App\Models\` namespace. If you have placed your models in another directory, set their path in the configuration file, i.e. `vendor/ascentech/massive-csv-import/config/massive-csv-import.php`.
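If you are starting from scratch with the database queue driver, the standard Laravel setup is two artisan commands plus one `.env` setting (a setup fragment; see the linked queue documentation for other drivers):

```shell
# Generate the migration for the jobs table, then run it
php artisan queue:table
php artisan migrate

# In your .env, select the database queue driver:
# QUEUE_CONNECTION=database
```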

Installation
------------


- Run `composer require ascentech/massive-csv-import`.
- Add `Ascentech\MassiveCsvImport\MassiveCsvImportServiceProvider::class,` to the `providers` array in your project's `config/app.php` file.
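For Laravel versions that register providers in `config/app.php`, the entry looks roughly like this (a config fragment, not a complete file):

```php
// config/app.php (fragment)
'providers' => [
    // ... your other service providers ...
    Ascentech\MassiveCsvImport\MassiveCsvImportServiceProvider::class,
],
```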

Usage
-----


- Prepare a large CSV file (without headers) to import.
- Prepare a file upload interface in your project and add the following two lines to your controller code:
- `use Ascentech\MassiveCsvImport\MassiveCsvImportFacade;`
- `$result = MassiveCsvImportFacade::import($path, $table_name, $columns);`
- `$path` is the temporary path of the uploaded CSV file.
- `$table_name` is the name of the database table into which you want to import the large CSV file.
- `$columns` is the array of columns for that table, e.g. `$columns = ['name', 'description', 'status'];`
- This package splits the large file into multiple smaller CSV files and saves them in the `storage/table_name/` directory. The default chunk size is 1000 rows; you can change it via the `csv_chunk_size` variable in the configuration file, i.e. `vendor/ascentech/massive-csv-import/config/massive-csv-import.php`.
- A separate job is created to process each smaller CSV file in the background.
- Run `php artisan queue:work` to process the jobs.
- All processed files are placed, with a `.csv-processed` extension, in the same `storage/table_name/` directory.
- Remember! If a particular record (from a smaller CSV file) fails to insert into the database, an error message is written to `laravel.log`, but the job keeps processing the remaining records without failing. A separate directory, `storage/table_name/failed`, is created automatically and holds CSV files containing only the failed records. You can fix these and import them later as a separate CSV file.
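To make the chunking step concrete, here is a minimal plain-PHP sketch of what "split the large file into chunks of `csv_chunk_size` rows" means. The function name `splitCsv` and its signature are illustrative only and are not this package's API:

```php
<?php
// Hypothetical sketch of the chunking step: split a headerless CSV into
// smaller files of $chunkSize rows each, returning the chunk file paths.
function splitCsv(string $inputPath, string $outputDir, int $chunkSize = 1000): array
{
    if (!is_dir($outputDir)) {
        mkdir($outputDir, 0775, true);
    }

    $chunkPaths = [];
    $out = null;
    $rowCount = 0;

    $in = fopen($inputPath, 'r');
    while (($row = fgetcsv($in)) !== false) {
        if ($rowCount % $chunkSize === 0) {
            // Start a new chunk file every $chunkSize rows.
            if ($out !== null) {
                fclose($out);
            }
            $chunkPaths[] = $path = $outputDir . '/chunk_' . intdiv($rowCount, $chunkSize) . '.csv';
            $out = fopen($path, 'w');
        }
        fputcsv($out, $row);
        $rowCount++;
    }

    if ($out !== null) {
        fclose($out);
    }
    fclose($in);

    return $chunkPaths; // one path per smaller CSV file
}
```

In the real package each resulting chunk is then handed to a queued job, which is what lets `php artisan queue:work` import the rows in the background.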

### License


- MIT

### Health Score

23 — Low (better than 27% of packages)

- Maintenance: 20 — infrequent updates; may be unmaintained
- Popularity: 7 — limited adoption so far
- Community: 7 — small or concentrated contributor base
- Maturity: 49 — maturing project, gaining a track record
- Bus factor: 1 — top contributor holds 100% of commits, a single point of failure

How is this calculated?

**Maintenance (25%)** — Last commit recency, latest release date, and issue-to-star ratio. Uses a 2-year decay window.

**Popularity (30%)** — Total and monthly downloads, GitHub stars, and forks. Logarithmic scaling prevents top-heavy scores.

**Community (15%)** — Contributors, dependents, forks, watchers, and maintainers. Measures real ecosystem engagement.

**Maturity (30%)** — Project age, version count, PHP version support, and release stability.
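Assuming the overall score is a simple weighted average of the four subscores at the listed weights (the site's exact formula may differ), the displayed value of 23 can be reproduced:

```php
<?php
// Hypothetical reconstruction of the overall health score as a weighted
// average of the subscores, using the weights from the explanation above.
function healthScore(array $subscores, array $weights): int
{
    $total = 0.0;
    foreach ($weights as $name => $weight) {
        $total += $subscores[$name] * $weight;
    }
    return (int) round($total);
}

$subscores = ['maintenance' => 20, 'popularity' => 7, 'community' => 7, 'maturity' => 49];
$weights   = ['maintenance' => 0.25, 'popularity' => 0.30, 'community' => 0.15, 'maturity' => 0.30];

// 20*0.25 + 7*0.30 + 7*0.15 + 49*0.30 = 22.85, which rounds to 23
echo healthScore($subscores, $weights);
```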

### Release Activity

- Cadence: every ~2 days
- Total releases: 7
- Last release: 1032d ago

### Community

Maintainers: [giftonian](/maintainers/giftonian)

Top contributor: [giftonian](https://github.com/giftonian) (1 commit)

### Embed Badge

![Health badge](/badges/ascentech-massive-csv-import/health.svg)

```
[![Health](https://phpackages.com/badges/ascentech-massive-csv-import/health.svg)](https://phpackages.com/packages/ascentech-massive-csv-import)
```

PHPackages © 2026

