You would need everything that is under your project folder and all its subfolders (I believe):
/app as it contains your config and app files.
/vendor because all your packages are there, including all the classes Laravel uses.
/public as that is where your css, js, and imgs are.
/bootstrap as that folder starts the app.
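The copy step for the folders listed above can be sketched as a small script. This is only a sketch: the folder list comes from the post, but the function name and paths are assumptions, and a real deploy would more likely use rsync or scp.

```shell
#!/bin/sh
# Sketch: copy the Laravel folders listed above from a local checkout
# to a deploy target. Function name and paths are hypothetical.

copy_app() {
    src=$1
    dst=$2
    mkdir -p "$dst"
    for d in app bootstrap public vendor; do
        cp -R "$src/$d" "$dst/"   # copy each required folder wholesale
    done
}
```

On a real server the destination would be the web root, and you would still need composer.json and any other root-level files your app reads.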
I suppose you wouldn't need to actually copy all your vendor folders if you just brought over your composer.lock and did a composer install on the production site, but this seems pointless to me. I think just zipping up the whole thing would be better, but I stand to be corrected.
Thanks Mojo... in my case, the vendors directory has A LOT OF FILES!
Is there any way to get around this? Are all of those files needed? Do I need to run composer on each install? What if this is an app that does not have internet access?
Anyways, thanks for the reply.
If there is no internet in your production environment, your application won't work, as Laravel is a web application framework. :-) Any plans to use Git? We use Git to manage source code and to deploy to the production environment.
You shouldn't commit the vendor directory to git. To deploy your app on a production server you can get the vendor directory back with composer update.
There are a number of scenarios here, and you need to use the one that works for you.
I know that a lot of people use shared hosting for production. Many shared hosting providers do not give you the command-line access necessary for running composer on the production deployment. In that case, it might be best to deploy the app with the /vendor folder included, as you won't be able to re-fetch the packages.
By the way, you should not do a composer update on a production machine if you are at all concerned about testing your apps in test before deployment to production. If you have access to the command line in production, the better way to deploy is to run composer install on the production machine to get all the vendor packages, most importantly *in the versions you tested on* before the production deployment.
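That install step can be sketched as below. The function name and the guard are assumptions added for illustration; the key point is that composer install reads composer.lock (the versions you tested), while composer update resolves new versions and belongs on the dev machine only.

```shell
#!/bin/sh
# Sketch: install the exact dependency versions recorded in composer.lock.
# Assumes "composer" is on the server's PATH; function name is hypothetical.

install_locked() {
    app_root=$1
    if [ ! -f "$app_root/composer.lock" ]; then
        # Without the lock file, install would resolve fresh (untested) versions.
        echo "composer.lock missing in $app_root" >&2
        return 1
    fi
    # --no-dev skips dev-only packages such as test frameworks.
    (cd "$app_root" && composer install --no-dev)
}
```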
I hope this helps
You can use the following package to remove the test and documentation files from the vendor folder -- https://github.com/barryvdh/laravel-vendor-cleanup
It will reduce the number of files in your vendor folder.
Remember to turn off debug mode once you have migrated all the files to the production server.
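For the Laravel 4-era apps discussed in this thread, debug mode is the 'debug' option in app/config/app.php. A deploy script could flip it off mechanically; this is just a sketch, and the function name and sed approach are assumptions, not anything from the thread.

```shell
#!/bin/sh
# Sketch: force debug off in a Laravel 4 style app/config/app.php
# as part of a deploy. Editing config with sed is an illustration only.

disable_debug() {
    config=$1
    sed "s/'debug' => true/'debug' => false/" "$config" > "$config.tmp" &&
        mv "$config.tmp" "$config"
}
```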
I come from CodeIgniter as well and have set up the folders much the same locally and on our server.
My file structure:
|-public_html
|----all the files in the public folder
|-system
|---app
|---bootstrap
|---vendor
|---all the other files
Once you have put them in there you need to change the paths in a couple files:
index.php (in public_html)
require __DIR__.'/../system/bootstrap/autoload.php';

/*
|--------------------------------------------------------------------------
| Turn On The Lights
|--------------------------------------------------------------------------
|
| We need to illuminate PHP development, so let's turn on the lights.
| This bootstraps the framework and gets it ready for use, then it
| will load up this application so that we can run it and send
| the responses back to the browser and delight these users.
|
*/

$app = require_once __DIR__.'/../system/bootstrap/start.php';
bootstrap/paths.php (now under system):

'public' => __DIR__.'/../../public_html',
I do this before running composer install. We use Codebase to manage our source code, which is also part of DeployHQ. Once we commit changes they are deployed to our servers without any issues.
So far we have had no issues with this setup. No problems updating Laravel or any other package for that matter.
I hope this helps.
I zip up the entire directory using 7-Zip so that I can exclude the .svn, .git and .idea directories, and then upload the whole lot to the server. Just in case you're interested, the 7z command I use is as follows:
7z a -xr!.svn -xr!.git -xr!.idea -mmt -tzip files.zip *.* -r
Once on the server I bring up a remote shell and unzip the file.
@zawmyohtet I can think of several environments where there is an internal network of computers that talk to "web apps" hosted on that network, but that network is not connected to the Internet, eg. factories, chemical plants, etc. (Yeah, I know, a lot of these networks actually are connected to the Internet these days, but they shouldn't be! :-)
I've never felt entirely comfortable relying on composer update on my production environment. Call me paranoid, but I like to make sure I'm running the exact same code in production that I ran in dev and test. Composer is very reliable, but also very complex, and I don't want to take the risk that production ends up broken due to an obscure issue with composer when the alternative solution of copying across the vendor directory is so simple and foolproof.
I have a script called "deploy_vendor" that zips up my vendor folder and leaves the zip file in a staging area on my test and production servers. Whenever I deploy a new version of the app, the deployment script creates the vendor folder based on the current version of that zip file.
Of course, the zip file in the staging area only needs to be updated whenever I run composer update. I automated this problem away by simply adding "deploy_vendor" to the "post-update-cmd" section of composer.json.
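The actual deploy_vendor script isn't shown in the thread, but the idea can be sketched as follows. This is a sketch under assumptions: tar is used here instead of zip, and the function names, staging path layout, and archive name are all hypothetical.

```shell
#!/bin/sh
# Sketch of a "deploy_vendor"-style workflow (the poster's real script is
# not shown). Archives the vendor folder into a staging area so a deploy
# can later recreate it without running composer on the server.

pack_vendor() {
    app_root=$1
    staging=$2
    # Snapshot the vendor folder as tested on the dev machine.
    tar -czf "$staging/vendor.tar.gz" -C "$app_root" vendor
}

unpack_vendor() {
    staging=$1
    app_root=$2
    # At deploy time, restore exactly the snapshotted vendor folder.
    tar -xzf "$staging/vendor.tar.gz" -C "$app_root"
}
```

Wiring pack_vendor into the post-update-cmd hook in composer.json, as the post describes, keeps the staged archive in sync with every composer update.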