
Laravel 12 – Import Very Large CSV into Database with Seeder

Importing very large CSV files (100k+ or even millions of rows) into a database can cause memory exhaustion, timeouts, or poor performance if not handled properly.
This repository demonstrates an efficient, memory-safe way to import very large CSV files in Laravel 12 using seeders.

📖 Tutorial:
https://itstuffsolutiotions.io/laravel-12-import-very-large-csv-into-database-with-seeder/


🚀 Features

  • Import very large CSV files without memory overflow
  • Uses Laravel 12 Seeders for automation
  • Chunked database inserts for better performance
  • Streaming CSV reading (no full file loading)
  • Suitable for millions of records
  • Clean and reusable implementation
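
The streaming read plus chunked-insert approach from the feature list can be sketched roughly as follows. This is a minimal illustration, not the repository's exact code: the class name `CsvImportSeeder`, the `users` table, and the CSV path are all placeholders to adapt to your schema.

```php
<?php

namespace Database\Seeders;

use Illuminate\Database\Seeder;
use Illuminate\Support\Facades\DB;

class CsvImportSeeder extends Seeder
{
    public function run(): void
    {
        // Illustrative path, table, and chunk size -- adjust to your project.
        $path = database_path('data/users.csv');
        $chunkSize = 1000;

        $handle = fopen($path, 'r');
        $header = fgetcsv($handle); // first row holds the column names

        $rows = [];
        while (($line = fgetcsv($handle)) !== false) {
            // Map each CSV row onto the header so keys match column names.
            $rows[] = array_combine($header, $line);

            // Insert in fixed-size chunks so memory stays flat
            // no matter how large the file is.
            if (count($rows) === $chunkSize) {
                DB::table('users')->insert($rows);
                $rows = [];
            }
        }

        // Flush the final partial chunk.
        if ($rows !== []) {
            DB::table('users')->insert($rows);
        }

        fclose($handle);
    }
}
```

Because `fgetcsv()` reads one line at a time, the whole file is never loaded into memory, and batching 1,000 rows per `insert()` keeps the number of round-trips to MySQL low.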

📋 Requirements

  • PHP 8.2+
  • Laravel 12
  • MySQL
  • Composer

📦 Installation

Clone the repository:

git clone https://github.com/itstuffsolutions/laravel-12-import-very-large-csv-into-database-with-seeder.git
cd laravel-12-import-very-large-csv-into-database-with-seeder
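
After cloning, the usual Laravel setup applies before running the seeder. The seeder class name below (`CsvImportSeeder`) is illustrative; check `database/seeders/` for the actual class in this repository:

```shell
composer install
cp .env.example .env
php artisan key:generate
# configure your MySQL credentials in .env, then:
php artisan migrate
php artisan db:seed --class=CsvImportSeeder
```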
