Backup files (archiving)
Hello, please advise.
I have a file base of about 350 GB that needs to be backed up daily; I use 7-Zip.
While the archive stayed under 160 GB there were no problems. Now that it is over 160 GB, the archiver builds the archive fine the first time but corrupts it on the second run (it becomes unreadable).
Previously I used WinRAR; the problem appeared there once the archive file grew past 130 GB, so I switched to 7-Zip, and now it has the same problem. I decided to make a multi-volume archive, but now, after the first archiving run, it reports that the file was not found.
What would you advise? Can the script be changed somehow? The archive needs to be updated in place (appended to), not rebuilt from scratch!
The script:
set PATHTO=G:
set FILEBACKUPNAME=backup
rem u = update the existing archive, -w = working (temp) directory, -v20g = split into 20 GB volumes
7z.exe u -w%PATHTO% -v20g %PATHTO%\%FILEBACKUPNAME%.7z @listfiles.txt > %PATHTO%\%FILEBACKUPNAME%.log
I use tar+gzip (there should be a Windows build as well); it archives large volumes (up to several terabytes) and I have had no problems.
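A minimal sketch of that approach, assuming a tar build with gzip support is on PATH (GNU tar; recent Windows versions also bundle bsdtar) and reusing the listfiles.txt from the question:

rem -c create, -z compress with gzip, -f output file, -T read the list of paths from a file
tar -czf "G:\backup.tar.gz" -T listfiles.txt

Note that a gzip-compressed tar cannot be appended to or updated in place, so this creates a fresh archive on every run.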
Do you really need to back up the whole 350 GB every day? Or would a full backup plus the day's changes do?
But in fact, you are preparing 7-Zip the wrong way somehow; it is perfectly capable of making multi-volume archives (see the sketch below).
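A minimal sketch of multi-volume creation with the question's listfiles.txt. One caveat, and a likely cause of the "file was not found" error: as far as I know, 7-Zip can create multi-volume archives with the a (add) command but cannot update (u) an archive that is already split into volumes, so this sketch builds a fresh, dated archive set on each run:

rem %DATE% is locale-dependent; adjust the substitution (here it replaces / with -) for your locale
set STAMP=%DATE:/=-%
rem a = add to a new archive, -v20g = split into 20 GB volumes (backup_....7z.001, .002, ...)
7z.exe a -v20g "G:\backup_%STAMP%.7z" @listfiles.txt > "G:\backup_%STAMP%.log"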
You can try Cobian Backup; we use it in many places and have no complaints about how it works. It can compress with 7-Zip, split archives into volumes, and do incremental/differential backups.
A silly question, but are you running the latest version of 7-Zip, and, most importantly, is it not a 32-bit system? Perhaps you have hit some non-obvious overflow bug.