I'm used to working on cPanel servers, where vhosts and suphp are set up so that each site hosted on the server is separated into its own /home/[username]/public_html/ directory, and you just upload files as the user that owns the home directory and everything just works.
However, I'm experimenting with VPS kinda stuff, so I've set up a DigitalOcean Droplet which will serve a single site. I've just installed the default Ubuntu LAMP stack, so right now there's a single vhost (as it were) with /var/www/html as its docroot. Interestingly, it doesn't use www-data but root as the owner of these files.
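For reference, this is roughly how it looks (output abbreviated from memory; the default install just drops an index.html in there):

```
$ ls -l /var/www/html
-rw-r--r-- 1 root root ... index.html
```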
So first, is the general recommendation to just go along with this: use the root user for everything and simply edit the vhost so that the docroot is /var/www/public rather than /var/www/html? Or should I 'do it properly': create a directory to house the site, add an Apache vhost pointing to [said directory]/public as its docroot, and somehow hope that permissions magically work so Laravel can handle uploaded files, logs, etc.?
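To make that second option concrete, here's roughly the sort of vhost I'm picturing (example.com and /var/www/example.com are just placeholder names I've made up):

```
# /etc/apache2/sites-available/example.com.conf -- placeholder names throughout
<VirtualHost *:80>
    ServerName example.com

    # Point the docroot at Laravel's public/ directory, not the project root
    DocumentRoot /var/www/example.com/public

    <Directory /var/www/example.com/public>
        # Let Laravel's public/.htaccess handle the front-controller rewrites
        AllowOverride All
        Require all granted
    </Directory>

    ErrorLog ${APACHE_LOG_DIR}/example.com-error.log
    CustomLog ${APACHE_LOG_DIR}/example.com-access.log combined
</VirtualHost>
```

I'd then enable it with a2ensite example.com and reload Apache, if I've understood the Debian/Ubuntu layout correctly.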
If the latter of those choices makes sense (it does feel very icky using the root user for this kinda stuff, even though this server will only host a single website), what's the best way to integrate that with Laravel's remote module and Envoy? SSH in, git pull, then chown/chmod things? Should I maybe set up a specific user and try to get suphp running?
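To give an idea of what I mean, this is the kind of Envoy task I've been sketching out, assuming a dedicated deploy user and a project path I've invented for the example (and that the deploy user can sudo):

```
{{-- Envoy.blade.php -- 'deployer', the host and the path are all placeholders --}}
@servers(['web' => 'deployer@my-droplet-ip'])

@task('deploy', ['on' => 'web'])
    cd /var/www/example.com
    git pull origin master
    composer install --no-dev --prefer-dist

    {{-- Make sure the web server can still write logs/uploads after the pull --}}
    sudo chown -R deployer:www-data storage
    sudo chmod -R 775 storage
@endtask
```

Is that the right shape, or is there a cleaner way that avoids the chown/chmod step entirely?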
As you can see, this is the kind of stuff I've never really had to deal with before. I'm used to adding vhosts to my local dev box, but I've never had to do this on a production server or navigate the minefield that is getting permissions right (and I don't really like the "chmod directories to 777" thing that a lot of people do).
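Just so it's clear what I'd rather be doing than 777, I imagine it's something along these lines (again, the user and path are placeholders, and Apache is assumed to run as www-data):

```
# Code owned by a non-root deploy user, group set to the web server's group
sudo chown -R deployer:www-data /var/www/example.com

# Sane defaults: directories 755, files 644
sudo find /var/www/example.com -type d -exec chmod 755 {} \;
sudo find /var/www/example.com -type f -exec chmod 644 {} \;

# Group write only on the directories Laravel actually writes to
sudo chmod -R 775 /var/www/example.com/storage
```

...but I'm not sure whether that's actually the accepted way to do it, hence the question.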
Any help/recommendations/best practices would be greatly appreciated.