Laravel
Yuri Kalnin, 2019-04-14 13:02:45

What is the best way to organize the development and updating of a site on Laravel?

While developing the application I used seeders and reloaded the migrations each time; I put this into my own Artisan command and used it:

public function handle()
{
    $this->call('optimize:clear');                       // flush cached config, routes and views
    $this->call('migrate:refresh', ['--seed' => true]);  // roll back, re-run all migrations and reseed
    $this->call('dusk');                                 // run the Dusk browser tests
}

But now that the application is nearly complete, I am wondering how to organize a convenient workflow: local server - test server - production.
I would like to drop the seeders and work with the real site data, but only on my local machine.
I installed the spatie/laravel-backup package, which makes a backup on the production server every day, and now I am going to write a script that downloads the latest database backup and restores it in my local development environment.
There is also an upload folder (with files uploaded by users) that I want to automate in roughly the same way.
But I have real doubts about the correctness of this approach; surely there are established schemes for this, so please tell me how you organize everything.
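Since spatie/laravel-backup stores each backup as a zip archive on a filesystem disk, one workable approach is a small Artisan command on the local machine that grabs the newest archive and restores the SQL dump inside it. The sketch below is a rough outline under several assumptions: a disk named "backups" is configured locally to point at the production backup destination (e.g. the same S3 bucket), the archive follows spatie's default layout (db-dumps/*.sql), and the local database is MySQL. The command name db:pull and all paths are illustrative.

<?php

// app/Console/Commands/PullProductionDb.php — a rough sketch, not production-ready.
// Assumes a "backups" disk pointing at the production backup destination and
// spatie/laravel-backup's default zip layout (db-dumps/*.sql).

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Storage;
use Illuminate\Support\Str;
use Symfony\Component\Process\Process;
use ZipArchive;

class PullProductionDb extends Command
{
    protected $signature = 'db:pull';
    protected $description = 'Download the latest production backup and import it locally';

    public function handle()
    {
        $disk = Storage::disk('backups');

        // 1. Pick the newest backup archive from the shared destination.
        $latest = collect($disk->files(config('backup.backup.name')))
            ->filter(function ($file) {
                return Str::endsWith($file, '.zip');
            })
            ->sortByDesc(function ($file) use ($disk) {
                return $disk->lastModified($file);
            })
            ->first();

        if (! $latest) {
            $this->error('No backup archives found.');
            return;
        }

        // 2. Download it and unzip it into a temporary directory.
        $archive = storage_path('app/tmp-backup.zip');
        file_put_contents($archive, $disk->readStream($latest));

        $zip = new ZipArchive();
        $zip->open($archive);
        $zip->extractTo(storage_path('app/tmp-backup'));
        $zip->close();

        // 3. Feed the SQL dump into the local database (MySQL assumed).
        $dump = glob(storage_path('app/tmp-backup/db-dumps/*.sql'))[0];

        Process::fromShellCommandline(sprintf(
            'mysql -u%s -p%s %s < %s',
            escapeshellarg(config('database.connections.mysql.username')),
            escapeshellarg(config('database.connections.mysql.password')),
            escapeshellarg(config('database.connections.mysql.database')),
            escapeshellarg($dump)
        ))->setTimeout(3600)->mustRun();

        $this->info("Local database refreshed from {$latest}");
    }
}

Running php artisan db:pull locally would then replace the seed-based refresh with a copy of yesterday's production data.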

4 answers
Ernest Faizullin, 2019-04-14
@erniesto77

Regarding the test server: we clone the project to a new server and create a separate branch (dev) for it in version control. All improvements are done in feature branches that branch off from the production branch and are merged into dev; once approved, the feature branch is merged back into the production branch.
The environment names are "local" on the local machine, "development" on the dev server and "production" in production (take the env into account in your code). If the database is large, you can connect from local to the database on dev, but this is discouraged when there is more than one developer; it is better to set up some free ~1 TB storage for your local copies, like https://stackstorage.com or something similar.
Cron takes database snapshots on every environment nightly; only the last 5 dumps are kept in the file system. There is a separate storage for dumps and a separate storage for images; in short, every storage is created according to the kind of data it holds.
P.S. I don't think there is one template that fits every case, but there are good practices.
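The "separate storage per kind of data" point maps directly to disks in Laravel's config/filesystems.php. A small illustrative excerpt; the disk names and paths below are examples, not taken from the answer:

// config/filesystems.php (excerpt)
'disks' => [

    // nightly database dumps get their own disk, so retention logic never touches user files
    'dumps' => [
        'driver' => 'local',
        'root'   => storage_path('app/dumps'),
    ],

    // user-uploaded images live on a separate disk (could just as well be s3)
    'uploads' => [
        'driver' => 'local',
        'root'   => storage_path('app/uploads'),
    ],
],

Code then addresses each store explicitly, e.g. Storage::disk('dumps')->put(...) versus Storage::disk('uploads')->put(...).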

Roman Frank, 2019-04-14
@Akellacom

On a large project we dump only the last day's data; the dump comes out to 200-300 MB and is restored onto the test or local environment.
If a couple of months of data are needed for testing, then we dump a couple of months.
As written above, there is really no point in pulling down the entire database.
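One way to implement the "only the last day" dump described here is mysqldump's --where flag, which is applied to every table listed. A minimal sketch in PHP (e.g. inside a custom Artisan command); the table names and the created_at column are assumptions, adjust to your schema:

use Symfony\Component\Process\Process;

// Hypothetical tables that actually grow; small reference tables can be dumped in full separately.
$tables = ['orders', 'order_items', 'activity_log'];

$process = new Process(array_merge([
    'mysqldump',
    '--user='     . config('database.connections.mysql.username'),
    '--password=' . config('database.connections.mysql.password'),
    '--where=created_at >= NOW() - INTERVAL 1 DAY',
    config('database.connections.mysql.database'),
], $tables));

$process->mustRun();

// Write the partial dump next to the full backups; the path is illustrative.
file_put_contents(storage_path('app/dumps/partial-' . date('Y-m-d') . '.sql'), $process->getOutput());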

jazzus, 2019-04-14
@jazzus

I use Git (Bitbucket) + Laravel Forge with a VPS.
The latest changes plus the database migrations are pushed to Bitbucket, which Forge picks up automatically. With one button press all changes are deployed to the server and the tables are created. The deployment script and the env file can be edited right in the service. Forge also has cron, so backups and other commands are easy to launch through it. There have never been any problems.
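Forge's scheduler entry usually just runs php artisan schedule:run every minute, so the actual timing of the backup can live in code. A minimal sketch, assuming spatie/laravel-backup (which the asker already uses) provides the backup:run and backup:clean commands:

// app/Console/Kernel.php (excerpt)
protected function schedule(Schedule $schedule)
{
    // nightly backup of the database (and files, depending on config/backup.php)
    $schedule->command('backup:run')->dailyAt('02:00');

    // prune old archives according to the retention rules in config/backup.php
    $schedule->command('backup:clean')->dailyAt('03:00');
}

Keeping the schedule in code means it travels with the repository and behaves the same on any server Forge provisions.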

Dmitry Kazarmin, 2019-04-15
@fenix163

Make the backup on production via cron. Pull the backups to your local machine. Locally, either via cron before the working day or simply by running a script manually, restore the latest backup. The upload folder is also pulled with a sync.
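For the upload folder, "pulled with a sync" usually means rsync over SSH. A minimal sketch wrapped in PHP so it can sit in the same Artisan command as the database pull; the host, user and paths are placeholders:

use Symfony\Component\Process\Process;

$process = new Process([
    'rsync', '-az', '--delete',
    'deploy@example.com:/var/www/site/storage/app/uploads/', // hypothetical production path
    storage_path('app/uploads/'),                            // local destination
]);
$process->setTimeout(3600);
$process->mustRun();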
