Why don't you use database seeding for that?
shabushabu said:
Why don't you use database seeding for that?
I use Homestead not only for my personal Laravel projects, but also for work. Sometimes I have static data, but sometimes I just want to save everything that has been collected in my DB. Why should I write a new seed for that when I can just archive and restore the DB?
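For me the whole thing is just a couple of commands, roughly like this (a sketch, assuming a MySQL database called homestead with the default Homestead homestead/secret credentials, and dumping into a shared folder so the file survives a destroy):

    # before destroying the box: dump the database into a shared folder
    mysqldump -u homestead -psecret homestead > /home/vagrant/Code/homestead-backup.sql

    # after the new box is up: recreate the database and load the dump back in
    mysql -u homestead -psecret -e "CREATE DATABASE IF NOT EXISTS homestead"
    mysql -u homestead -psecret homestead < /home/vagrant/Code/homestead-backup.sql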
Well, a vagrant machine is for development purposes, so all you normally need is a set of predefined data to work with. Whether it's for a personal or work project doesn't really matter. Seeds just help you keep your database at a known state. Kind of like a clean slate. A db dump just doesn't give you that.
I'm currently working on a kind of social network for gamers, using our own small framework rather than Laravel. We also use MongoDB. I have some static data which I can seed: for example, a static games database (just a list of games a user can select in their profile). But right now I'm writing and testing the user interface for this and some other features, so I edit the preferences in my profile, add games to my list, and so on. Yesterday Homestead was updated, so I had to destroy and re-up the box. Which is quicker: writing a seed for the current state of my user so I can continue my work, or just backing up and restoring the DB? I don't want to lose my user and recreate it, repeating the same things over again. I think the answer is obvious :)
Well, maybe our workflows are different. For me db dumps just don't make any sense. Seeds should always represent the latest state, so all you need to do is update them occasionally, when your schema changes. Also, destroying a VM is something I do maybe twice a year and only if there's an unrecoverable error with the VM. vagrant reload/resume does the job just fine 99% of the time. I'm not using Homestead, though. This is just something we'll have to agree to disagree on, I guess :)
I'm a little late picking up on this thread, but my scenario is that I often develop using databases with specific data in them, so seeded data isn't ideal. For instance, if I'm developing an ecommerce site I'll populate the database with real products and options, as it's easier for me and the client to test/review the site when we have real data.
This has caused me a few problems, as I forgot once or twice that the MySQL databases aren't persistent when you destroy Homestead or it hits an error, but luckily I had backups so I didn't lose much.
My solution is quite low-tech but works for me. I simply created another VirtualBox VM and did a base install of Ubuntu Server and MySQL. I keep it as clean as possible and didn't install anything else, so it's ultra stable. With a new Laravel project I simply change the database config to point to this database server VM, and then if my development environment in Homestead gets messed up or needs refreshing or destroying, it doesn't affect any of my project databases.
Yes, my database VM could get corrupted (I still do backups), but as it's only running Ubuntu and MySQL and there are very few OS or environment changes ever needed, it's been rock solid so far, and I can happily mess up my Homestead VM without worrying about backing up the databases at that point in time or having to recreate and restore them on a new Homestead instance.
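If anyone wants to copy this setup, the only slightly fiddly part is that a stock MySQL install only listens on localhost, so the database VM has to be told to accept remote connections before Homestead can reach it. Roughly something like this (the IP address, database name and credentials are placeholders for whatever your own VMs use):

    # on the database VM: make MySQL listen on all interfaces, then restart it
    sudo sed -i 's/^bind-address.*/bind-address = 0.0.0.0/' /etc/mysql/my.cnf
    sudo service mysql restart

    # on the database VM: let the application user connect from other hosts
    mysql -u root -p -e "GRANT ALL ON myproject.* TO 'homestead'@'%' IDENTIFIED BY 'secret'; FLUSH PRIVILEGES;"

    # from the Homestead VM: quick check that the remote connection works
    mysql -h 192.168.56.20 -u homestead -psecret -e "SELECT 1"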
There's a much simpler solution!
1. Map a folder for your MySQL data in Homestead.yaml, like so:

       - map: C:/myprojects/mysqldata
         to: /home/vagrant/mysqldata

2. vagrant up your VM.

3. Copy the /etc/mysql/my.cnf and /etc/apparmor.d/tunables/alias files to /home/vagrant/mysqldata.

4. Then copy the existing MySQL data across:

       sudo cp -R -p /var/lib/mysql /home/vagrant/mysqldata/mysql

5. Edit /etc/mysql/my.cnf and change datadir to point to /home/vagrant/mysqldata/mysql.

6. Edit /etc/apparmor.d/tunables/alias and include the line:

       alias /var/lib/mysql/ -> /home/vagrant/mysqldata/mysql/,

   NOTE: The final ',' is important above!

7. Create a script, say myup, in /home/vagrant/mysqldata, like so:

       #!/bin/bash
       cp /home/vagrant/mysqldata/my.cnf /etc/mysql/my.cnf
       cp /home/vagrant/mysqldata/alias /etc/apparmor.d/tunables/alias
       service apparmor reload
       service mysql restart

8. Run sudo ./myup.

9. Now, whenever you vagrant destroy and vagrant up, after the vagrant up just cd to mysqldata and run sudo ./myup again.

And that should do it.
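To confirm it worked, you can ask MySQL which data directory it is actually using; it should now report the shared folder rather than /var/lib/mysql (the homestead/secret credentials below are just the Homestead defaults, adjust to your own):

    mysql -u homestead -psecret -e "SHOW VARIABLES LIKE 'datadir';"
    # expected value: /home/vagrant/mysqldata/mysql/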
@imukhurje thanks a lot for sharing your awesome solution!
A few notes... on Ubuntu 14.10, running service apparmor reload results in an error. This command works instead:
start apparmor ACTION=reload
Also, when I ran sudo cp -R -p /var/lib/mysql /home/vagrant/mysqldata/mysql I got a ton of "warnings" saying:
cp: failed to preserve ownership for ‘/home/vagrant/mysqldata/xxx’: Operation not permitted
The files were copied, though, and I guess this is fine, because I ran many tests with your solution and so far everything works well.
And lastly... when you say to edit the two files inside /etc/, that confused me for a moment, but it's pretty clear you meant to edit the files you copied in step 3 that are now inside /home/vagrant/mysqldata/. Those are the files you want to edit, and the myup script then copies them back, overwriting the ones located in /etc/.
All in all, this is a great solution!
Now if only there were a way to automatically run sudo ./myup upon vagrant up, but that is a very small wish!
Thanks for sharing!
I wanted something like this too! I forked Homestead and used a Vagrant plugin to import/export on halt and destroy.
If you want to take a look, it's on my GitHub at https://github.com/erikbelusic/farmhouse
There are no docs yet and it's not up on Packagist either.
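Conceptually it boils down to two small scripts the plugin hooks call around halt/destroy and up; something along these lines (a rough sketch with assumed paths and the default Homestead credentials, not the actual code from the repo):

    # export.sh - run before halt/destroy: dump everything into the shared folder
    mysqldump -u homestead -psecret --all-databases > /home/vagrant/Code/all-databases.sql

    # import.sh - run after up: restore the dump if one exists
    [ -f /home/vagrant/Code/all-databases.sql ] && mysql -u homestead -psecret < /home/vagrant/Code/all-databases.sql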
If you use Puphpet (puff-pet, a combination of PHP/Puppet) to create your Vagrant/VirtualBox machine, there are scripts that can be run once or every time you start up Vagrant: https://puphpet.com/#custom-files
I'm going to try to implement this solution with vagrant until we get a dedicated dev file/db server.
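If I remember the Puphpet layout correctly, a script dropped into puphpet/files/exec-always/ is run on every vagrant up/provision, so something like this could re-apply the relocated MySQL data directory automatically (the exec-always location and paths here are assumptions based on how I recall Puphpet's custom files working):

    #!/bin/bash
    # re-apply the relocated MySQL data directory on every boot/provision
    cp /home/vagrant/mysqldata/my.cnf /etc/mysql/my.cnf
    cp /home/vagrant/mysqldata/alias /etc/apparmor.d/tunables/alias
    service apparmor reload
    service mysql restart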
Try taking backups with CloudBacko; I have experience using it for my MySQL database backups and am still using it because it gives me good security and protection for my data. A backup of a large 100 GB MySQL database can be finished overnight, it does fast multi-threaded MySQL backups, block-level incremental hot backups with zero downtime, pre-requisite checks that help guarantee restorability, concurrent backups to multiple destinations, and direct restore from backup to the original database. The service has been very good, so I'd suggest CloudBacko for your backup plan.
vesper8 said:
@imukhurje thanks a lot for sharing your awesome solution! [...]
Hi @vesper8, I'm trying to keep my DB data after each vagrant destroy. I found that MySQL doesn't like it and complains that the owner of mysqldata is www-data. I also tried assigning
owner: mysql
group: mysql
to my shared folder, but it ends up saying that the user mysql does not exist.
I found a workaround using a bash script like this one:

    bindfs --force-user='mysql' --force-group='mysql' /home/vagrant/mysqldata /home/vagrant/mysqldata
    sudo cp /home/vagrant/mysqldata/alias /etc/apparmor.d/tunables/alias
    /etc/init.d/apparmor restart
    sudo cp /home/vagrant/mysqldata/my.cnf /etc/mysql/my.cnf
    service mysql restart

But it only works if I log in over SSH and launch it with sudo ./myscript.sh; it does not work when run as a provisioner:
config.vm.provision "shell", path: "#{dir}/data/scripts/myscript.sh"
I'm on Windows, and I've also tried with privileged: false:
config.vm.provision "shell", path: "#{dir}/data/scripts/myscript.sh", privileged: false