With Laravel, we get used to command-line commands like composer install or artisan migrate. But what if the client only provides shared hosting, with just FTP access and phpMyAdmin to manage the database? Laravel is still usable in this case, but there are some tricks you need to make it work.
Uploading the project for the first time
To make the project work on the client's server, you need to do these things:
1. Upload the files via FTP
Use FileZilla or similar software.
2. Make sure your domain is pointing to /public folder
Depending on where the domain is managed, this will be done in Plesk, DirectAdmin, or a similar control panel on the client's server.
3. Database: export/import with phpMyAdmin
Export your local database into an SQL file, and then just import it on the server, as is.
4. .env file or config/database.php
Edit one of these files so it has the correct credentials to connect to the database.
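For example, the database section of the .env file might look like this (all values here are placeholders – use the credentials your hosting provider gave you):

```shell
DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3306
DB_DATABASE=your_database
DB_USERNAME=your_user
DB_PASSWORD=your_password
```

On shared hosting, DB_HOST is often localhost, but some providers use a separate database server – check the hosting panel.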
Ta-daaa – your project is up and running! But that's only the first part of the story.
Maintaining the project and uploading changes
Ok, now the second phase: how to properly upload changes without breaking anything?
1. How to upload files
This is pretty simple, right? Just upload the files that have been changed via FTP and refresh the browser.
Not so fast. What if we have a lot of files in different folders and we want to minimize downtime? If you just bulk-upload 100+ files, for some seconds or even minutes the website will become unpredictable. It won't even be down – it's worse: its status will change depending on which file(s) are uploaded at that exact moment.
Imagine a situation: a visitor might see an old Blade template with an old form, but by the time it's submitted, a new Controller might expect a new set of fields.
The thing is that while deploying, we have to make sure the website is completely inaccessible during those important seconds. Visitors should see "under construction", "in progress" or something similar.
If we had command-line SSH access, it would be easy:
php artisan down
git pull
php artisan up
And then while "down", visitors would see Laravel's default maintenance page.
But we don't have SSH access. So how do we "fake" the artisan down command? What does it actually do?
It creates an empty file called down (without extension) in the storage/framework folder.
So to "fake" it, you just need to create an empty file named down in storage/framework, and that's it. Whenever you've uploaded all the necessary files, just delete the down file.
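Since there's no SSH, the file has to be created locally and uploaded (or created directly in your FTP client, if it supports creating empty files). A minimal sketch, assuming a Unix-like local machine:

```shell
# Create an empty file named "down" (no extension) locally.
touch down
# Then upload it to storage/framework/ on the server via FTP
# before deploying, and delete it from the server when the
# upload is finished to bring the site back up.
```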
2. How to manage database migrations
Basically, when you prepare a migration, you need to run the same SQL query on the live server with phpMyAdmin. But how do you know what that query is? All you can see is the migration code, which works "by magic".
There's a helpful option for the artisan migrate command: locally, run it with the --pretend option, i.e. php artisan migrate --pretend. Instead of actually running the migrations, it will show the SQL for them!
And then you copy-paste that SQL into the SQL tab of the server's phpMyAdmin, and you're done.
3. Composer package updates
Ok, you need a new package, or a new version of an existing one. Basically, you've run composer install or composer update locally, and now you need to transfer the result to the server.
In this case, you can just upload the whole /vendor folder, and that's it. But if your internet connection is slow, or you don't want to upload tens of megabytes and thousands of files (yes, the vendor folder is that heavy), you should know what actually changed there.
All you actually need to upload and overwrite are these 3 things:
- Folders of the packages that were ACTUALLY changed – look at their modified time locally;
- Whole folder of /vendor/composer
- File vendor/autoload.php
And that’s it – as I said, no need to transfer full /vendor.
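To find out which package folders actually changed, you can check modification times locally. A quick sketch, assuming a Unix-like local machine, that you run the command from the project root, and the standard vendor/<author>/<package> layout:

```shell
# List vendor package folders modified within the last day –
# these are the candidates to upload via FTP.
# Adjust "-mtime -1" to match when you last deployed.
find vendor -mindepth 2 -maxdepth 2 -type d -mtime -1 | sort
```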
Basically, that's it – these are the main tips for working on shared hosting. But my ultimate advice is to avoid it if at all possible. It's a real pain in the neck – you could easily break something or overwrite the wrong file. Expensive mistakes.
Talk to the client upfront about server requirements, and if their argument is additional cost, then your argument should be the same: just look at how much more time it takes to deploy files to shared hosting. Your client will be much happier if you spend that time actually CREATING something, not deploying.
Awesome… Tips
Thanks!
The --pretend option, that was news to me 🙂
I would suggest git-ftp to make it even easier, without the need for an FTP client like FileZilla. Thanks.
Another nice post man 🙂 I would skip step 2 – it is possible to set the root folder from Laravel itself (I have done this before, but check Google for details on how to). The problem with that: you have to remember this when updating something. There are a lot of hosting platforms and you can never know whether you'll find the options you need 🙂
Hi, from my experience (I did it locally), after copying the newly installed packages' files, in vendor/composer you only need to update 2 files: autoload_classmap.php and autoload_psr4.php
FTP is waaaaaaay too slow. I just started using https://www.npmjs.com/package/flightplan to push stuff to my Laravel shared host and it works awesome and very easy to set up.
Hi Povilas!
Here is the script I use when updating to a new version:
#!/bin/bash
php artisan down
composer dump-autoload
php artisan clear-compiled
php artisan cache:clear
php artisan config:clear
php artisan view:clear
git pull
composer dumpautoload -o
php artisan config:cache
php artisan route:cache
php artisan optimize
php artisan up
Hi Evert, how do you run this script on a shared hosting server?
Hi Evert,
THANKS A LOT !! U SOLVED MY PROBLEM… XOXO
Or you can try my utility, "FTP File Difference Utility" (https://github.com/ahsankhatri/FTP-File-Difference-Utility). It compares your local files with the remote ones and tells you the modified files along with their paths. I know one should use a VCS instead of this, but once you use it you'll find it easy and convenient.
You can use Beyond Compare to see which files have been changed and not yet uploaded to the server 🙂 Give it a try!
And use Sublime Text as your editor with FTPSync for Sublime to make your work easier 🙂
Isn’t it better to host Laravel on VPS, instead of shared servers. You can scale VPS, but you cannot scale shared servers. There are also issues regarding performance and security with shared hosting. However, if you don’t have sysadmin skills then hosting Laravel on it could prove to be difficult. You can instead use managed platform, like Cloudways, to host Laravel on it.
After running the migrate command with the --pretend option, how do I get the data onto the live server?
Are there other options apart from importing a CSV file, for example?