Alexander, 2012-12-28 10:49:23

Bacula, executing scripts after backup?

Good afternoon.
We use Bacula for backups. After the backup finishes on the client, Bacula runs a script that additionally compresses the relevant data with tar.
In the Director's Job definition for that client it looks like this:

Run Script {
  Runs When = After
  Runs On Client = yes
  Runs On Success = yes
  Runs On Failure = no
  Fail Job On Error = no
  Command = "sh /home/test2.sh"
}

Everything works: the script starts after the backup. The problem is that Bacula does not consider the job finished until the script completes, and that can take a long time, which can create unnecessary queues on the Director.

I tried to get around this with a standard shell ampersand (&):

Command = "sh /home/test2.sh &"

It didn't help. I even tried launching a second script with an ampersand from inside the first one; Bacula still treats the job as unfinished until everything completes. Of course, I could have the script that Bacula runs just create a marker file in /tmp, and have a cron job check for that file and run tar, but I don't want such a crutch. There must be a proper way. Has anyone dealt with this in Bacula, and how did you solve it?
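One commonly suggested workaround (a sketch, not verified against any particular Bacula version): the job often stays "running" because the backgrounded child inherits the script's stdout/stderr pipes from Bacula, which then waits for EOF on them. Detaching the worker from Bacula's process tree and from those descriptors usually lets the wrapper exit immediately. The file names here (/home/test2.sh as the wrapper, /home/test2-worker.sh as the long tar step) are illustrative, and setsid assumes a Linux client with util-linux:

```shell
#!/bin/sh
# Hypothetical wrapper: Bacula runs this script and it returns at once.
# The real work (the long tar step) lives in a separate worker script.

# setsid puts the worker in its own session, so it is no longer part of
# Bacula's process tree; redirecting stdin/stdout/stderr drops the pipe
# descriptors Bacula would otherwise keep waiting on for EOF.
setsid sh /home/test2-worker.sh </dev/null >/tmp/test2-worker.log 2>&1 &

exit 0
```

Whether this is enough depends on how the file daemon supervises RunScript children, so treat it as something to test rather than a guaranteed fix.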


2 answer(s)
MealstroM, 2012-12-28

Have you tried the `mycommand.sh | at now` construction? You can also use `at now + 1 hour`. Try it that way.
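A note on the idiom (an assumption on my part about what was meant): `at` reads the commands to schedule from standard input, so the command line itself is usually echoed into the pipe rather than executed in front of it. Using the script path from the question:

```shell
# Hand the long-running tar step to atd instead of Bacula: atd runs it
# outside Bacula's process tree, so the RunScript can finish immediately.
# Requires the at/atd package to be installed and running on the client.
echo "sh /home/test2.sh" | at now             # run as soon as possible
echo "sh /home/test2.sh" | at now + 1 hour    # or delay it by an hour
```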

merlin-vrn, 2012-12-29

Bacula waits for all child processes to terminate. In other words, you cannot leave the task "in the background" if your processes were launched by Bacula: even if you fork inside the script and kill the parent, Bacula will adopt the orphan and wait for it to finish. This is logical, because Bacula serializes jobs precisely to speed each of them up: the system is not spread across several jobs at once, so each individual job, and the whole run, finishes faster.
You can partially get around this by setting Maximum Concurrent Jobs to a value greater than 1 in the Client, Storage and Director resources (wherever needed). The system will then be able to start other jobs while this one is still running.
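For example, the limits might be raised like this (a hypothetical bacula-dir.conf fragment; the resource names are made up, and each resource keeps whatever other directives it already has):

```
Director {
  Name = backup-dir              # assumed name
  Maximum Concurrent Jobs = 10
  ...
}

Client {
  Name = test2-fd                # assumed name
  Maximum Concurrent Jobs = 2
  ...
}

Storage {
  Name = File                    # assumed name
  Maximum Concurrent Jobs = 10
  ...
}
```

Note that the effective concurrency is capped by the smallest of these limits along the job's path, so all three resources usually need adjusting.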
Another possible solution is to restore the backup you have just made from Bacula and pack that restored copy into tar. If memory serves, restore jobs are handled differently by these concurrency limits.
In general, I have a strong feeling that you are using it incorrectly; more precisely, your backup architecture is wrong. What do you need this tar for? If it is a "second backup", why is it launched from Bacula at all, when Bacula has nothing to do with it? Are you trying to serialize the tasks? Well, they are already serialized correctly: the disk and other resources are busy with exactly one job at any given time.
