Curl error "SSL certificate problem: self signed certificate in certificate chain" in UMICMS on localhost?
In UMICMS on localhost, the page localhost/emarket/purchasing_one_step shows the error "Curl error - SSL certificate problem: self signed certificate in certificate chain". What could be the problem?
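One way to narrow this down, assuming the checkout step makes an HTTPS request to some endpoint that presents a self-signed certificate (the host, port and certificate path below are placeholders for whatever URL the module actually calls), is to reproduce the request outside the CMS:

# Inspect the certificate chain the endpoint presents (host/port are placeholders):
openssl s_client -connect localhost:443 -showcerts </dev/null

# Repeat the request with the curl CLI: --cacert trusts a specific CA file,
# -k skips verification entirely (diagnostics only, not a fix for production):
curl --cacert /path/to/self-signed-ca.pem https://localhost/
curl -k https://localhost/

If the standalone curl call fails the same way, the fix is on the certificate/trust-store side rather than in the CMS itself.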
What for?
Everything here is fairly optimal.
Well, except for replacing the backticks `` with $(),
and computing the year and month once instead of recalculating them every time.
#!/bin/sh
# Example site path: /var/www/site.ru/web/
SITEDIR="/var/www" # folder up to the domain name
WEB="web" # folder after the domain name (if any)
LIST=${1:-$(ls -L "$SITEDIR" | grep -E ".*\.\w{1,5}")} # site name: either the first argument or everything that looks like a domain; the regex can be dropped, leaving only -L ;)
DBCONN="bitrix/php_interface/dbconn.php" # where to read the MySQL settings from
FTPUSER="*******" # FTP login
FTPPASS="****************" # FTP password
FTPHOST="**************************" # FTP server
FTPDIR="/server8/site/" # folder on the FTP server
TMPDIR="/var/backup/site" # where temporary files are kept
DATE=$(date +"%Y-%m-%d")
YEAR=$(date +%Y)
MONTH=$(date +%m)
mkdir -p "$TMPDIR/$YEAR/$MONTH" # create the directory structure if it does not exist
for ELEMENT in $LIST
do
    if [ ! -f "$SITEDIR/$ELEMENT/$WEB/$DBCONN" ]; then
        echo "File $SITEDIR/$ELEMENT/$WEB/$DBCONN not found, the site will be backed up without its DB!"
    else
        DBLOGIN=$(grep "^\$DBLogin =" "$SITEDIR/$ELEMENT/$WEB/$DBCONN" | cut -f2 -d'"')
        DBPASS=$(grep "^\$DBPassword =" "$SITEDIR/$ELEMENT/$WEB/$DBCONN" | cut -f2 -d'"')
        DBNAME=$(grep "^\$DBName =" "$SITEDIR/$ELEMENT/$WEB/$DBCONN" | cut -f2 -d'"')
        mysqldump -u"$DBLOGIN" -p"$DBPASS" "$DBNAME" > "$SITEDIR/$ELEMENT/$WEB/$DBNAME-$DATE.sql" && echo "Dump of DB $DBNAME will be saved in the site root" || echo "Error dumping database $DBNAME"
    fi
    echo "Archiving site $ELEMENT"
    tar -cvpzf "$TMPDIR/$YEAR/$MONTH/$ELEMENT-$DATE.tar.gz" --directory "$SITEDIR/$ELEMENT/$WEB" --ignore-failed-read --exclude='./bitrix/tmp' --exclude='./bitrix/updates' --exclude='./bitrix/backup/*\.gz*' --exclude='./bitrix/backup/*\.tar*' --exclude='./bitrix/cache' --exclude='./bitrix/managed_cache' --exclude='./bitrix/stack_cache' --exclude='./upload/resize_cache' --exclude='./stats' . > /dev/null 2> /var/log/backup_error.log
    rm -f "$SITEDIR/$ELEMENT/$WEB/$DBNAME-$DATE.sql" # remove the dump from the live site, since adding it to the archive cleanly did not work out
    echo "Saving $ELEMENT to FTP"
    wput --basename=$TMPDIR --limit-rate=60000K --timestamping --remove-source-files --tries=2 "$TMPDIR/$YEAR/$MONTH/$ELEMENT-$DATE.tar.gz" ftp://$FTPUSER:$FTPPASS@$FTPHOST$FTPDIR
    rm -f "$TMPDIR/$YEAR/$MONTH/$ELEMENT-$DATE.tar.gz"
done
rm -r "$TMPDIR"
I don't like the implementation; I've already cut out a pile of junk.
At least the config reading: the variable syntax in dbconn.php is the same as in bash, so the values could be read directly (probably via read or eval), but I haven't done that. As it is, I grep into the same file three times in a row, then cut and assign, when ideally the values would just be read from the file once. The database also has to be archived awkwardly, because the dump is written into the site root and then deleted; and TMPDIR isn't really needed either, in theory the archive could be streamed straight to FTP instead of writing everything to the SSD first. ;(
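A minimal sketch of both ideas, assuming the dbconn.php lines keep exactly the $DBLogin = "..."; form the greps above rely on, and that curl is available for the FTP upload (unlike wput, it can read the upload body from stdin); the sed pattern and the streaming pipeline are illustrative, not tested against a live Bitrix install:

# Pull all three settings in one pass: sed turns the PHP assignments into
# plain NAME="value" lines that sh can eval (assumes the values contain no
# shell metacharacters).
eval "$(sed -n 's/^\$\(DBLogin\|DBPassword\|DBName\) *= *\(".*"\);.*/\1=\2/p' "$SITEDIR/$ELEMENT/$WEB/$DBCONN")"

# Dump the database into the backup area instead of the site root, then
# stream the tarball straight to FTP without a local copy: tar writes to
# stdout (-f -) and curl uploads from stdin (-T -).
mysqldump -u"$DBLogin" -p"$DBPassword" "$DBName" > "$TMPDIR/$DBName-$DATE.sql"
tar -cpzf - --ignore-failed-read \
    -C "$SITEDIR/$ELEMENT/$WEB" . \
    -C "$TMPDIR" "$DBName-$DATE.sql" \
  | curl -sS -T - "ftp://$FTPUSER:$FTPPASS@$FTPHOST$FTPDIR$ELEMENT-$DATE.tar.gz"
rm -f "$TMPDIR/$DBName-$DATE.sql"

Whether streaming is worth it is a trade-off: with a temporary file the FTP upload can be retried, with a stream it cannot.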