How to Deploy Laravel Projects to Live Server: The Ultimate Guide

There are a lot of courses and articles about writing code, but far fewer about putting it into production. Since I get asked about this a lot, I decided to write this (hopefully) ultimate guide to deploying Laravel projects.

Before we dive in, two notices:

Notice 1: please take this article as personal advice rather than a 100% definitive process. Every team has its own way to deploy projects, so you may read different advice elsewhere.

Notice 2: this article was written in October 2018. Check for updates to software and Laravel/PHP versions at the time you’re reading it; the IT world is constantly changing.

Enough formalities, let’s dive in. In this article, we will cover these steps:

  1. Prepare your dedicated server for Laravel
  2. Initial launch of the project
  3. How to deploy new changes
  4. Zero-downtime deployment
  5. Teamwork, staging server and branches
  6. Brief overview: Automated testing and continuous integration

Step 1. Prepare your dedicated server for Laravel

I’ve recently written a recommendation article about choosing a server for Laravel projects. My personal preference is Digital Ocean; they have their own guide for Laravel projects.

To be honest, I don’t want to focus on this step too much, because server preparation is not really part of deployment. But in short, these are the parts you need to prepare:

Part 1. Create/purchase your dedicated server. In the case of Digital Ocean it’s called a Droplet; Amazon AWS calls it an EC2 instance, etc.

Part 2. Install a LEMP or LAMP stack. LAMP/LEMP stands for Linux (comes with the server), a web server (Nginx for the “E”, Apache for the “A”), a MySQL database, and PHP. A lot of server providers offer this as a one-click install – see the Digital Ocean example below:

Notice: the Linux server is the reason I advise using a Linux environment on your local computer, too – this way you get used to the configuration. But don’t worry, you don’t need to install Linux or buy a specific new computer; it can all be emulated with Laravel Homestead or Laravel Valet.

Part 3. Configure SSH access for yourself. You will probably deploy changes by SSHing into the machine and running commands like git pull, php artisan migrate, etc. Forget uploading via FTP, if you still do that. Learn more about users/privileges for SSH access in this Digital Ocean article.
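A common way to set this up is with a dedicated SSH key pair. The sketch below generates one into a throwaway temp folder so it is safe to try anywhere; the user and IP in the comments are made-up placeholders (on a real machine you would use ~/.ssh):

```shell
#!/usr/bin/env bash
# Sketch: generate a dedicated deploy key pair. The demo writes into a
# throwaway temp folder so it is safe to run; on a real machine you would
# use ~/.ssh, and the user/IP below are made-up placeholders.
set -eu

keydir=$(mktemp -d)
ssh-keygen -q -t ed25519 -f "$keydir/deploy_key" -N ""   # key pair, no passphrase
cat "$keydir/deploy_key.pub"                             # the public half goes on the server

# On a real setup you would then run:
#   ssh-copy-id -i "$keydir/deploy_key.pub" youruser@203.0.113.10
#   ssh -i "$keydir/deploy_key" youruser@203.0.113.10
```

Key-based access also makes it easier to disable password logins later, which is a common hardening step.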

Part 4. PHP and dependencies for Laravel. At the time of writing, the latest Laravel version is 5.7, with these requirements:

  • PHP >= 7.1.3
  • OpenSSL PHP Extension
  • PDO PHP Extension
  • Mbstring PHP Extension
  • Tokenizer PHP Extension
  • XML PHP Extension
  • Ctype PHP Extension
  • JSON PHP Extension

Part 5. Install/configure Composer. You do want to run the composer install command on the server, right? Follow the instructions on the official website.

Part 6. Install/configure Git. The code will get onto the server by being pulled down from a Git repository. This official instruction will probably help.

Part 7. Configure MySQL. You need to create a database specific to your project, and a user to access it. You may create a separate user, granting only specific privileges. Digital Ocean has an article on that, too.

Part 8. Configure the web server. Prepare a specific folder for your website. Here’s an example Nginx config provided in the official Laravel documentation:

server {
    listen 80;
    root /; # set this to your project's public folder

    add_header X-Frame-Options "SAMEORIGIN";
    add_header X-XSS-Protection "1; mode=block";
    add_header X-Content-Type-Options "nosniff";

    index index.html index.htm index.php;

    charset utf-8;

    location / {
        try_files $uri $uri/ /index.php?$query_string;
    }

    location = /favicon.ico { access_log off; log_not_found off; }
    location = /robots.txt  { access_log off; log_not_found off; }

    error_page 404 /index.php;

    location ~ \.php$ {
        fastcgi_split_path_info ^(.+\.php)(/.+)$;
        fastcgi_pass unix:/var/run/php/php7.1-fpm.sock;
        fastcgi_index index.php;
        include fastcgi_params;
    }

    location ~ /\.(?!well-known).* {
        deny all;
    }
}
Part 9. Point the domain to the server. Your new server has an IP address given to you by Digital Ocean (or another provider), so now you need to go to your domain configuration page (wherever you bought the domain) and change its DNS records, specifically the A record, to point to that new IP address. Here’s an article on Quora about it.

Ok, at this stage, if you’ve completed all of the above successfully, you have your server ready. Not that simple, huh? To avoid all that manual work, I totally recommend Laravel Forge – it prepares the server for you, with all the correct settings, in just a few clicks. If you work with multiple clients, it’s totally worth $19/month (the most popular plan’s price at the time of writing). Also, by using Forge you support Taylor Otwell and Laravel.

Step 2. Initial launch of the project

Now, as we have server ready, let’s put the code there.

Part 1. Putting the code into a repository. Choose the system you want to use – GitHub, Bitbucket or GitLab. Then create a repository there and push your code from your local computer.

Here’s a sample sequence of commands from GitHub, starting from initializing the Git repository (use your own repository URL in the remote add line):

git init
git add .
git commit -m "first commit"
git remote add origin <repository-url>
git push -u origin master

Let’s take a totally fresh Laravel 5.7 project as an example. If you’ve pushed your code to GitHub, your repository should look like this.

Part 2. Cloning the repository to the server. We need to SSH into our server, navigate to the folder prepared for the project, and launch the git clone command.

Keep in mind that your server needs access to the repository, if it’s not public. You can do that by specifying your username/password every time you’re asked, or you can set up the access by following these instructions.

After cloning is successful, we should see the files downloaded to the server.

An important thing about the public folder: make sure your web server configuration points to the public folder correctly, and that the full repository is cloned one folder above, not into public. Here’s an example from an Nginx config:

server {
    root /home/forge/;
    ...
}

Part 3. .env file.

If you’ve done everything correctly, you should see this:

This means that we haven’t run composer install. But to do that, first we need to configure our environment.

Laravel comes with a file called .env.example, containing all the typical configuration values.

But on your server they will be different, so we need to do two things:

1. Copy that example file as our main .env file with this command:

cp .env.example .env

2. Edit that new .env file, with Linux editors like Vim:

vi .env

You can change a lot of variables, but the main ones are the application URL and the database credentials. Set those up; everything else can be edited when you actually need it.
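For illustration, a minimal server-side .env might look like this (all values here are hypothetical placeholders – use your real domain and credentials):

```ini
APP_NAME=Laravel
APP_ENV=production
# APP_KEY gets filled in by "php artisan key:generate" in a later step
APP_KEY=
APP_DEBUG=false
APP_URL=https://example.com

DB_CONNECTION=mysql
DB_HOST=127.0.0.1
DB_PORT=3306
DB_DATABASE=myapp
DB_USERNAME=myapp_user
DB_PASSWORD=secret
```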

Part 4. Writeable folders. Here I’ll just quote the official Laravel documentation:

Directories within the storage and the bootstrap/cache directories should be writable by your web server or Laravel will not run.
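In practice that usually means a chown/chmod on those two folders. The perm_commands helper below is my own sketch: it only prints the commands so you can review them before running with sudo. The www-data user is an assumption (the Ubuntu/Nginx default); it differs per distro:

```shell
#!/usr/bin/env bash
# perm_commands is a hypothetical helper: it only PRINTS the chown/chmod
# commands so you can review them first. www-data is an assumed web-server
# user (the Ubuntu/Nginx default); it differs per distro.
set -eu

perm_commands() {
    dir="$1"; user="${2:-www-data}"
    echo "chown -R $user:$user $dir/storage $dir/bootstrap/cache"
    echo "chmod -R ug+rwx $dir/storage $dir/bootstrap/cache"
}

perm_commands /var/www/myapp
# Review the output, then execute it: perm_commands /var/www/myapp | sudo sh
```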

Part 5. Composer install. Let’s run this “magic” command. At this step you may encounter errors if some packages are not compatible with your PHP version or extensions, so check for any messages.

In case of success, it looks something like this:

Part 6. Generate application key. We need to run this command:

php artisan key:generate

It generates a random key, which is automatically written to the APP_KEY variable in the .env file.

Part 7. Migrating the DB schema. Do you have any database tables to migrate? Then let’s launch this:

php artisan migrate

Please make sure you’ve set up correct database credentials in .env file.

Part 8. Seeding data? If you have set up your database/seeds, this is probably the time to run them:

php artisan db:seed

To speed up the process, you may run two commands above as one:

php artisan migrate --seed

Part 9. Launch! Finally, let’s try it out in the browser… It works!

Congratulations, if you’ve reached this far, you have deployed Laravel project to your server!

But that’s not the end of the article, let’s see how to work with this project and deploy future changes.

Step 3. How to deploy new changes

Ok so we have the first version of the project deployed. Now, how to deploy the changes in our code? Also, how to do it without breaking anything and with least amount of downtime possible? Here’s the plan:

Part 1. Artisan down. There is an Artisan command to take the whole website “down”:

php artisan down

If you launch it on the server, the whole website becomes unavailable, so you can do any deployment actions without visitors interrupting you. Otherwise you risk someone changing database data in live mode, which may even break your deployment.

After this command, your visitors will see this (in Laravel 5.7):

To get the site up and running again, after all the deployment actions, run this command:

php artisan up

Part 2. Git pull. You need to pull the latest version of the code to the folder. It’s simple:

git pull

You can add more parameters like branch, but we will be talking about branching strategy a little later in the article.

Part 3. Composer install. The next thing is to check whether there are any new changes in the composer.lock file, so run composer install.

Important notice: don’t run composer update on a live server. It will take a lot of time (a few minutes) and may break consistency with the composer.lock in your repository.

This is how you install a new package to the project:

  • On your local computer, run composer require with whatever package you need
  • Or add the package to the composer.json file and run composer update
  • When committing to the repository, you will have both composer.json and composer.lock changed. That’s ok.
  • When you run composer install on the server, it looks into composer.lock and makes only the necessary changes, without resolving versions for all packages again.

Part 4. DB migrations. If you have any database changes in database/migrations, run this:

php artisan migrate

Actually, run this anyway, even if you don’t have changes. It won’t error out, it will just say “Nothing to migrate”.

But an important notice: don’t edit migration files that are already in the repository; for every change, add a new migration file.

Part 5 (optional). Restart FPM. This step is not always necessary, but it reloads PHP-FPM so the workers pick up the new code. Laravel Forge advises this command:

echo "" | sudo -S service php7.1-fpm reload

Change the 7.1 here to your actual PHP version.

Part 6 (optional). Restart queues.

If you have any Queue mechanism running, it is advisable to run this:

php artisan queue:restart

Part 7 (optional). Clearing cache.

Again, this step may be unnecessary, but if you’re using Laravel cache, this command will clear it:

php artisan cache:clear

Part 8. Up and running again! And that’s it, let’s get the site working again.

php artisan up

To sum up, here’s an example sequence of commands in one of my projects:

cd /home/forge/
php artisan down
git pull origin master
composer install 
php artisan migrate
php artisan cache:clear
php artisan queue:restart
php artisan up
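That sequence can be wrapped into a small script. This is a sketch of mine, not a Forge or Envoyer script: APP_DIR and BRANCH are assumptions, and DRY_RUN defaults to 1 so the commands are only printed, never executed by accident:

```shell
#!/usr/bin/env bash
# A sketch of the deployment sequence as a script (not a Forge/Envoyer
# script). APP_DIR and BRANCH are assumptions; DRY_RUN defaults to 1 so
# the commands are only printed, never executed by accident.
set -eu

APP_DIR="${APP_DIR:-/home/forge/myapp}"
BRANCH="${BRANCH:-master}"

run() {
    if [ "${DRY_RUN:-1}" = "1" ]; then echo "$*"; else "$@"; fi
}

deploy() {
    run cd "$APP_DIR"
    run php artisan down
    run git pull origin "$BRANCH"
    run composer install
    run php artisan migrate --force   # --force skips the production confirmation prompt
    run php artisan cache:clear
    run php artisan queue:restart
    run php artisan up
}

deploy
```

On the server you would run it with DRY_RUN=0 (and your real path/branch) once you are happy with the printed sequence.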

Step 4. Zero-downtime deployment

You probably noticed that it takes quite some time to (properly) deploy changes. While you’re running the various commands and investigating potential issues, visitors may be waiting for the site to work again. For minutes. Not cool.

To solve this problem, people came up with a solution called zero-downtime deployment. The principle is pretty easy:

  • You create a new folder on the server
  • You clone the newest version of the code there from repository
  • You run all the commands above on that new folder (while website is still running from the old folder)
  • Now you have two folders, and you just need to change the Linux so-called symbolic link (symlink) to point to the new folder

This way, your visitors experience absolutely zero downtime – switching a symlink happens almost instantly.
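You can see the switch mechanics with plain ln commands. The sketch below runs entirely in a throwaway temp directory, so it is safe to try anywhere; real tools do the same thing on a releases/ folder:

```shell
#!/usr/bin/env bash
# Demonstrates the symlink switch on throwaway directories, so it is safe
# to run anywhere; zero-downtime tools do the same on a releases/ folder.
set -eu

base=$(mktemp -d)
mkdir "$base/release-1" "$base/release-2"

ln -s "$base/release-1" "$base/current"     # the web server's root points at "current"
# ...clone new code into release-2, run composer install / migrations there...

ln -sfn "$base/release-2" "$base/current"   # -n replaces the link itself: the "switch"
readlink "$base/current"
```

The web server keeps serving from "current" the whole time; only the link's target changes.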

There are a few tools that arrange this zero-downtime deployment for you, with the ability to roll back to a previous folder, logging the process and reporting errors. My favorite is the official tool by Taylor Otwell called Laravel Envoyer.

It’s maybe not as valuable as Laravel Forge is for server provisioning, but if you want one-click zero-downtime deployment, try it out for $10/month.

Step 5. Teamwork, Staging Server and Branches

Let’s take it one step further. How do you deploy changes while working in a team, not necessarily sitting in the same office or even working in the same timezone? How do you avoid breaking each other’s code?

You need to use git branching, here’s an example screenshot from my own SourceTree for one of the projects:

There are quite a few options to organize this process, but here’s the process I would personally recommend.

  1. You should have (at least) two branches in the repository – master and develop
  2. You should have two servers – one is the LIVE version, which pulls code from the master branch (git pull origin master), and the other a testing server (also often called “staging”), which pulls code from develop (git pull origin develop)
  3. No developer is allowed to commit directly to the master branch. This can even be enforced at the repository settings level. Everyone commits ONLY to the develop branch (or feature branches, more on that below)
  4. When you’re ready to release some feature to the live version, it should be done in the develop branch and tested (manually and maybe with automated tests) on the staging server. Then you create a Pull Request in the repository, which merges develop into master. The Pull Request may be reviewed by teammates and then officially approved/merged. Then the live server pulls the code from master.
  5. If someone is working on a bigger feature, they should create a so-called “feature branch” from the develop branch, and at some point do a Pull Request from that branch into develop (not master!). There will likely be code conflicts to resolve manually, and conversations between teammates.
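The workflow above can be sketched with plain Git commands. The snippet runs in a throwaway local repository, and the feature-branch and commit names are made up:

```shell
#!/usr/bin/env bash
# Sketch of the branching flow, run in a throwaway local repository;
# the feature-branch and commit names are made up.
set -eu

repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "dev@example.com"
git config user.name "Dev"

git commit -q --allow-empty -m "initial commit"   # first commit on the default branch
git checkout -qb develop                          # long-lived develop branch
git checkout -qb feature/invoice-export           # feature branch off develop
git commit -q --allow-empty -m "add invoice export"

git checkout -q develop
git merge -q --no-ff feature/invoice-export -m "Merge feature/invoice-export"   # what the PR does
git log --oneline -1
```

In a real team the merge into develop (and later develop into master) happens through a reviewed Pull Request rather than a local merge.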

.env File on Staging Server

Speaking of the staging server: this is probably the one your client will test on to confirm features are ready. So you need to prepare a separate environment for it, specifically the .env file. Here are the important variables:

Don’t leave APP_ENV as “local”; it should be different (but not “production”).
Also, APP_DEBUG=true means errors will be shown with a full stack trace, as they should be for testing purposes. But don’t forget that on the live server it must be strictly APP_DEBUG=false, otherwise you have a security issue.
Finally, put your staging server’s URL as APP_URL. Simple.

Have a separate database for the staging server. Don’t ever play around with live data for testing.

And if you have any external services like Stripe, don’t forget to put their sandbox/testing credentials in the .env file. For email sending, you can use a separate driver and a service like Mailtrap.
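Pulling those points together, a staging .env might differ from production roughly like this. Every value here is a hypothetical placeholder, and the STRIPE_* lines assume you keep your Stripe keys in the env file:

```ini
APP_ENV=staging
APP_DEBUG=true
APP_URL=https://staging.example.com

DB_DATABASE=myapp_staging

# Sandbox/testing credentials for external services:
MAIL_DRIVER=smtp
MAIL_HOST=smtp.mailtrap.io
STRIPE_KEY=pk_test_yourkey
STRIPE_SECRET=sk_test_yourkey
```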

Bonus: Automated testing and continuous integration

A kind of separate “advanced” topic of deployment is that you may want to run automated tests on a separate testing server. You can set it up manually or use tools like Travis CI, CircleCI, or Jenkins.

There are many ways developers set this up and configure it, but the short version is this:

  • You have written automated tests – unit, feature, behavior or other types of tests. Check out the awesome course Test-Driven Laravel for this.
  • You have prepared database seeders to create a fresh database with some dummy data for running your tests.
  • Every time you commit code to the develop branch, the testing server pulls it and runs the tests, informing you whether the code is “safe” to go live (but don’t trust that “green light” blindly).
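As an illustration, a minimal .travis.yml for a Laravel 5.7 project could look roughly like this. The PHP version, database name and the seeding step are assumptions – adapt them to your project:

```yaml
language: php
php:
  - 7.2
services:
  - mysql
before_script:
  - cp .env.example .env
  - composer install --no-interaction
  - php artisan key:generate
  - mysql -e 'CREATE DATABASE IF NOT EXISTS homestead;'
  - php artisan migrate --seed
script:
  - vendor/bin/phpunit
```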

Here’s an example automated check by Travis CI in one of our Pull Requests on GitHub:

In the ideal scenario, which is called Continuous Integration, everything is automatic: developers just commit their code, and the system informs the team if any tests fail.

But it can also be done in a simpler, more manual way: just have basic automated tests and run them manually by typing phpunit on your staging server. Just don’t forget to prepare the database for it, with seeders and fake data (recommended) or some semi-real testing data (be REALLY careful with that).

Final Notes

So here it is: I have described the deployment process for Laravel (though most of it is applicable to other frameworks/languages, too). But even in this big article I haven’t touched on some topics that can make the process even smoother, so I’m leaving these for you to explore:

  • Bug tracking and reporting software: we’re using Bugsnag for this – here’s my own video
  • Backup your database before every deployment, it’s really easy with Spatie Laravel Backup package or using a paid service like SnapShooter Backups
  • After a successful (or failed) deployment you may configure notifications to email or Slack; it depends on which repository system or auto-deployment tool (like Envoyer) you use. Here’s how Laravel Forge informs us via Slack.

I guess that’s it. I wish you more success messages like the one above 🙂

Again, please take this article as advice for a “typical scenario”, but keep in mind there are dozens of other ways to deploy projects, and you may hear different tips and processes from other teams. Come up with your own process that suits you!

Like our articles?
Check out our Laravel online courses!


  1. How do you handle building frontend assets?

    Having Node installed on the live server just for the build seems like the same bad practice as committing these assets to Git.

  2. Great article. Everyone ignores the “future updates” part, which I think is more important. But one question: is this zero downtime real? Let’s say Envoyer copies my new files to the new folder and runs migrate, and at the same time my old files, which are still live, request something that was in the old migration (a db field that got deleted). Will that error? Because the database is the same.
    I read advice on Twitter suggesting another way to handle migration changes in here:

    • Great comment, Amin. Yes, you’re right, with the DATABASE it’s not exactly zero-downtime if the migration operation takes time. There are various ways to deal with it; that tweet is one of them, but it sounds really complicated. If the downtime is 5-10 seconds, I would probably just run php artisan down for that moment to stop all DB operations. If it’s more sensitive, with a huge amount of data, then Envoyer is probably not for you, and you’d have to create your own script/process for deployment.

  3. I currently use CircleCI and Envoyer, where once tests have successfully been completed, CircleCI informs Envoyer to do a new release during the deploy step and a simple curl call. This works great but one problem I see is wouldn’t Envoyer always checkout the latest commit against the targeted branch? What if for example, the Master branch receives 2 simultaneous commits? CircleCI will run tests on the first commit, then inform Envoyer and proceed with running tests against the second commit. Envoyer on the other hand will run a checkout and see the latest, 2nd commit and proceed to create a release against that, before CircleCI even has a chance to complete running tests on that second commit.

    Wouldn’t that be a problem? If so, how would one get around that? Admittedly at the moment, I only make commits directly to my Master branch so I can see this issue happening more frequently with rapid commits. If I move to using a dev branch before merging all changes to Master, wouldn’t this just improve my chances of this problem not occurring and not actually make it go away?

    • Hi Geo,
      It’s hard for me to comment, because I haven’t used CircleCI, only Travis CI. But general logic should be that you shouldn’t trigger Envoyer immediately after CircleCI, or trigger it with some other condition – when ALL tests pass from all commits.

  4. Hi Povilas, excellent article – thank you. I deployed my site using Forge to a Digital Ocean droplet (LEMP). I’m using PHPjasper for reports and my app online works fine except for reports, it bombs trying to connect the report engine to the database. My app connects fine, so I know the dB connection parameters work. I get an obscure error saying the database name is missing, but I can see it in the parameters being passed. My app works fine locally using Laragon with Nginx & MySQL same as my droplet. I can’t find any reason this shouldn’t work. The only difference I can see is that the working host is Windows and the server is Ubuntu. Any ideas how to figure this one out?

    • These should be done on the server side while deploying.
      That’s part of the reason why Envoyer is so valuable – it makes no-downtime deployment, even if you have many optimization or other commands running while deployment.

      • Hi Povilas,

        Thank you.

        One source of confusion.

        When we were working on the media library the other day (on this site), those media folders were committed.

        How do you keep local testing data from making it to GitHub? Do you use gitignore for folders like that? Are there other commands you use to prepare/remove non-live data from the local copy before the commit?


  5. Okay, this all makes sense, but let me entertain you with an example…
    I have a database made with the artisan migrate command…
    I’ve loaded the database with data, from Laravel or phpMyAdmin, whatever.
    When I run migrate on the live server, I get the schema but not the data.

    How do I transfer the data as well? If I change the schema in a table that is already live and already has a lot of data, how do I update that table without destroying all the data/dropping the table?

    This is something I can’t seem to find an answer to. Is the best way to export via phpMyAdmin and re-import?

    • It depends on the state of your database, what changes you need to make etc.
      But, practical “rule of thumb” is always use migration files.
      Want to make changes in tables – use new migration file for it and then run “php artisan migrate”
      Want to add more data – create a new migration file and add the insert statements there. Or create a new “seed” file and call it from a migration like Artisan::call('db:seed --class=YourSeedFile');

      • In the case where I might have hundreds of thousands of entries, let’s say in a standard users table: how do I keep that data when adding a new migration file?

        • Your new migration file doesn’t delete any data, unless you’re doing “migrate:fresh” or specifically deleting something in that file.

  6. Filling in the .env data for production on the server from scratch, with tons of tokens, while most of them are the same as on local, is not easy! What is your solution?

  7. Just want to say thanks! Did my first deployment the “right way” using the steps above today. Ran into a few snags with PHP and MySQL versions, and a few other things, but I stuck with the steps above throughout and it was invaluable. The app is up and running – thanks!

    • Unfortunately I don’t have any experience with them; I’ve personally used Envoyer from the beginning. But I agree there are free tools, and they’re probably quite good if they’ve become popular.

  8. Finally someone explains this process from the ground up with all the details. Most of other tutorials just skip important steps and after deployment totally skip the part of adding new features, adding master and develop branches etc… Thanks!

  9. Hi Dear!
    You have done a superb job.
    God bless you. I have no words. I was stuck in the middle and might have left Laravel if I hadn’t found this article.
    Thanks a lot.

  10. Hi Povilas,
    How do you deploy Laravel 8 with Livewire in a subfolder?
    We share a server with others apps and frameworks.
    With the tutorials I found on the internet it works partially, but I can’t make Livewire work.
    I’ve already run the discover command from artisan and it doesn’t work.
    Thanks a lot.

