What is the best way to organize synchronization of some tables of two projects (production and test) in Laravel?
I'm mainly interested in Laravel-specific approaches.
Task: there are two databases. An operator enters data into the production database through a web interface. Whenever a record is saved (added, changed, deleted, ...), that change must automatically be propagated to the neighbouring test database, located on the same server, so that the test database stays up to date.
What is the best way to implement this?
My idea is the following:
1) Create an artisan command that dumps the table and restores it into the other database.
2) Create a Job that runs this command.
3) Dispatch that job every time one of the actions occurs (save, delete, update, ...).
This seems more convenient to me than connecting to the other database from Laravel on every change and inserting the data there directly, especially since we are talking not about a single table but about several interrelated ones.
How else could this be done?
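As a rough illustration of step 1, the artisan command could shell out to `pg_dump`/`pg_restore` with a `-t` flag per table, so only the interrelated tables are copied. This is a hedged sketch, not the asker's actual code: the table and database names below are made up, and the commands are printed rather than executed.

```shell
# Build pg_dump flags for just the interrelated tables.
# Table names are illustrative placeholders; substitute the real ones.
TABLES="orders order_items customers"
FLAGS=""
for t in $TABLES; do FLAGS="$FLAGS -t $t"; done

# The commands the artisan command could run (echoed here, not executed):
echo "pg_dump -Fc$FLAGS -d <prod_db> -f /tmp/tables.dump"
echo "pg_restore --clean --if-exists -d <test_db> /tmp/tables.dump"
```

One caveat: with `-t`, pg_dump makes no attempt to dump objects the selected tables depend on, so every interrelated table has to be listed explicitly or the restore can fail on foreign-key references.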
This question is outside Laravel's scope. Besides, what will you do when the database structure differs between production and test? It is probably best to copy the production database into the test environment as part of the CI/CD process, right after the git pull in the test environment and before running any migrations.
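One possible shape for that CI/CD step, under the assumed ordering above (pull, refresh the test database from production, then migrate). All paths and database names are placeholders, and the sketch only prints the sequence instead of running it:

```shell
# Print the intended CI/CD sequence; the commands are placeholders
# and are echoed rather than executed in this sketch.
ci_refresh_steps() {
  echo 'git -C /var/www/<project> pull'
  echo 'pg_dump -Fc -d <prod_db> -f /tmp/prod.dump'
  echo 'pg_restore --clean --if-exists --no-owner -d <test_db> /tmp/prod.dump'
  echo 'php artisan migrate --force'
}
ci_refresh_steps
```

Running the migrations last means any schema changes that exist only in the test branch are reapplied on top of the fresh production copy.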
For this I simply wrote a bash script with the following content (I use laradock). It makes a backup on prod, drops the local database, and restores the dump copied from prod onto the local machine.
It's a crutch, but it works like clockwork. A proper devops engineer would obviously have done it differently.
# dump the production database from inside its postgres container
ssh [email protected] 'cd /var/www/<project>/laradock; docker-compose run postgres pg_dump -Fc --dbname=postgresql://<user>:<password>@postgres:5432/<database> > /var/www/dump.sql'
# copy the dump to the local machine, then into the local postgres container
scp [email protected]:/var/www/dump.sql .
docker cp dump.sql <container_name>:/tmp/
# recreate the local database from scratch
docker-compose exec postgres dropdb default -U default
docker-compose exec postgres psql -U <username> -W -qAt -c 'CREATE DATABASE "default"'
docker-compose exec postgres psql -U <username> -W -qAt -c 'grant all privileges on database "default" to "default"'
# restore the dump into it
docker-compose exec postgres pg_restore -U default --no-owner --role=default -d default -c /tmp/dump.sql
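As a possible simplification (an untested sketch, not the author's script): pg_restore can drop and recreate the target database itself when given `--clean --create`, which would replace the manual dropdb/CREATE DATABASE steps; with `--create` the `-d` argument only names the maintenance database to connect to first. The command is printed rather than executed here, and the grant step would still be needed separately if the privileges matter.

```shell
# --clean --create: drop and recreate the "default" database from the dump;
# -d postgres: connect to the maintenance DB to issue DROP/CREATE DATABASE.
restore_cmd='pg_restore -U default --no-owner --role=default --clean --create --if-exists -d postgres /tmp/dump.sql'
echo "docker-compose exec postgres $restore_cmd"   # printed, not executed
```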