
DB Backups: Automate and TEST Them

Premium
6 min read

Now that we have some features and the project is coming along - we should think about the worst...

What if we lose the data?

Like with the code in GitHub - we should also create database backups!

Seriously, backups are one of those things we don't think about until it's too late. So, let's get ahead of the curve and add backups to our project.


Adding Backups

For our backups, we will use the good old Spatie Backups package. It's tested, reliable, and easy to use.

First, let's install the package:

composer require spatie/laravel-backup

Then publish the configuration file:

php artisan vendor:publish --provider="Spatie\Backup\BackupServiceProvider"

And that's it for the installation. Now, we need to configure the package.


Configuring Backups

We need to edit the config/backup.php file to configure our backups. In this case, we want to add s3 as an additional disk to store our backups:

config/backup.php

// ...

/*
 * The disk names on which the backups will be stored.
 */
'disks' => [
    'local',
    's3',
],

// ...

Note: Don't forget to configure your S3 credentials in the .env file.
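For reference, the s3 disk reads the standard Laravel filesystem credentials. A minimal sketch of the relevant .env entries, assuming the default disk definition in config/filesystems.php (all values below are placeholders):

```ini
AWS_ACCESS_KEY_ID=your-key-id
AWS_SECRET_ACCESS_KEY=your-secret-key
AWS_DEFAULT_REGION=eu-west-1
AWS_BUCKET=my-app-backups
```

With these set, the package will upload each backup archive to the bucket in addition to the local disk.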

Other settings are personal preference, but we could recommend...

The Full Lesson is Only for Premium Members

Comments & Discussion

Sisifo

Hi, if I use Spatie Media Library with Amazon Web Services S3, would the images be backed up as well? Wouldn't it be better to use a different provider for the backups, so that the backup provider is separate from whichever provider stores the media?

Modestas

By default, S3 images will not be backed up - you need to configure that separately. I'm not sure this is even supported in this package, since it would mean moving huge files between two services. For that, there are better processes than running them through PHP :)

Sisifo

Thanks Modestas,

I understand that backing up the media has to be a separate process. Could you point us in any direction, based on your experience? Another S3 bucket with an "internal" S3 script copying between buckets? A third-party backup service? Is an incremental process (like Apple's Time Machine) possible - does one exist?

Modestas

I'm not that experienced in this, so sorry, I can't 100% recommend anything.

amn uteri

I did what's in the tutorial above, and I wanted to add a slack.com webhook so I can get a notification in a channel, so I ran composer require laravel/slack-notification-channel because I saw this command mentioned in the config/backup.php comments.

(I don't think I can do this step: https://laraveldaily.com/uploads/2024/12/forge-scheduler.png, since I was testing this on localhost. Should I install the Forge CLI or something like that? Or could this be done using cron jobs?)

  • Testing this on a Plesk server would be easier, I think, since it has a scheduled tasks section (https://i.imgur.com/MZj1I7I.png), but what about localhost? Is there any tutorial for that?
admin@Med-Macbook-M3-Max elearny_app_web % php artisan backup:run

   Error 

  Non-static method Illuminate\Console\Scheduling\Schedule::command() cannot be called statically

  at routes/console.php:26
     22▕ 
     23▕ 
     24▕ 
     25▕ 
    26▕ Schedule::command('backup:clean')->daily()->at('01:00');
     27▕ Schedule::command('backup:run')->daily()->at('01:30');
     28▕ 

  1   app/Console/Kernel.php:30
      +2 vendor frames 

  4   artisan:35
      Illuminate\Foundation\Console\Kernel::handle(Object(Symfony\Component\Console\Input\ArgvInput), Object(Symfony\Component\Console\Output\ConsoleOutput))

admin@Med-Macbook-M3-Max elearny_app_web % 

I tried adding this part of the code in app/Console/Kernel.php:

protected function schedule(Schedule $schedule): void
{
    $schedule->command('backup:clean')->daily()->at('01:00');
    $schedule->command('backup:run')->daily()->at('01:30');
}

I also kept getting these errors while running php artisan backup:run:

Error Output
============
mysqldump: Got error: 2005: "Unknown server host 'localhost:3306' (8)" when trying to connect
.... ..... ......
Output\ConsoleOutput))
#24 {main}
Backup failed because: The dump process failed with a none successful exitcode.
Exitcode
========
2: Misuse of shell builtins

Output
======
<no output>

Error Output
============
mysqldump: Got error: 2005: "Unknown server host 'localhost:3306' (8)" when trying to connect
.

One more question: since this works with S3, would it also work with Cloudflare R2 cloud storage?

Modestas

You can't really test the scheduler locally (or at least, I haven't found an easy way to do it). You can, however, run the command manually, and that will be exactly what you should expect.

Now, to answer the two other questions:

The error you are having - there is an issue with the database configuration. Other than that, I'm not sure what is going on. Check that the credentials are set up correctly.

As for R2 - it should work, but you might need an S3-compatible driver for it. In theory there shouldn't be many differences, and the framework supports it (https://laravel.com/docs/12.x/filesystem#amazon-s3-compatible-filesystems)
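For reference, an R2 disk can typically reuse Laravel's built-in s3 driver by pointing the endpoint at your R2 account. A hedged sketch of a disk entry in config/filesystems.php - the "r2" disk name, env variable names, and endpoint are placeholders, not values from this lesson:

```php
// config/filesystems.php - hypothetical "r2" disk reusing the s3 driver
'r2' => [
    'driver' => 's3',
    'key' => env('R2_ACCESS_KEY_ID'),
    'secret' => env('R2_SECRET_ACCESS_KEY'),
    'region' => 'auto', // R2 does not use regions; "auto" is the usual value
    'bucket' => env('R2_BUCKET'),
    // e.g. https://<account-id>.r2.cloudflarestorage.com
    'endpoint' => env('R2_ENDPOINT'),
],
```

You could then list 'r2' in the 'disks' array of config/backup.php instead of (or alongside) 's3'.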

amn uteri

Thanks a lot!