Laravel has what we call Database Seeders. I recommend you check the documentation regarding those. Seeders are basically classes that are used to populate your database with test data.
Let's say you have your cities data available as a CSV file. You could create a seeder that parses the data and inserts it into your database. If you don't have a dataset available, you could use the Faker library, which has a lot of helper functions to generate random data.
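For example, a bare-bones seeder might look something like this. This is just a rough sketch: the cities.csv path, the column order, and the cities table name are placeholders you'd adapt to your own data.

<?php

use Illuminate\Database\Seeder;
use Illuminate\Support\Facades\DB;

class CitySeeder extends Seeder
{
    public function run()
    {
        // Placeholder path: adjust to wherever your CSV actually lives.
        $handle = fopen(database_path('seeds/cities.csv'), 'r');

        fgetcsv($handle); // skip the header row (assuming there is one)

        while (($row = fgetcsv($handle)) !== false) {
            DB::table('cities')->insert([
                'name'    => $row[0],
                'country' => $row[1],
            ]);
        }

        fclose($handle);
    }
}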
I already have the dataset for that table. It's actually quite a lot of data, so does it make sense to create a seeder for it?
Or is it preferable to do it manually?
Maybe it's better to import it via phpMyAdmin.
Seeders are typically intended for test data, not for actual production data.
My preferred way is to set up an artisan command that executes this import. Have the command parse the source file incrementally (this avoids loading the entire source into memory), and insert the data into the database in small chunks (50-100 rows at a time). This usually gives you the best insertion performance.
DB::table($name)->insert($arrayOfFiftyRowsOfData);
I've made commands that import millions of rows this way; it's a pretty safe strategy. Also look up Symfony's progress bar component to make better-looking commands.
Don't forget to do other sanity checks, like first clearing the table if it's already been filled.
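Putting that together, a stripped-down version of such a command could look like this. The file path handling, the cities table name, and the column mapping are assumptions for illustration, so adapt them to your dataset.

<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\DB;

class ImportCities extends Command
{
    protected $signature = 'import:cities {file}';
    protected $description = 'Import cities from a CSV file in small chunks';

    public function handle()
    {
        // Sanity check: clear the table before re-importing.
        DB::table('cities')->truncate();

        $handle = fopen($this->argument('file'), 'r');
        fgetcsv($handle); // skip the header row (assumed)

        $chunk = [];

        while (($row = fgetcsv($handle)) !== false) {
            $chunk[] = ['name' => $row[0], 'country' => $row[1]];

            // Insert in batches of 50 rows to keep memory usage low.
            if (count($chunk) === 50) {
                DB::table('cities')->insert($chunk);
                $chunk = [];
            }
        }

        // Insert whatever is left over after the last full batch.
        if (! empty($chunk)) {
            DB::table('cities')->insert($chunk);
        }

        fclose($handle);

        $this->info('Import finished.');
    }
}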