vakavakvaka, 2015-11-24 21:38:34
linux

How to protect websites from hacking?

There is a server that hosts a decent number of different sites. Unfortunately, it is not possible to keep their scripts up to date and close the holes in them.
Suggest a system that would block at least some of the attacks.
What is needed is something like a fast layer between the user and the server that cuts off the most common malicious requests (for example, a request containing the string "<?php").
I would also like to keep logs that would show when new PHP scripts appear in the site folders.
Please suggest something.



7 answers
xmoonlight, 2015-11-24
@vakavakvaka

Since the various rules are constantly being updated and revised, the current version of all the rules and recommendations for protecting a web server is available at this link.

Vlad Zhivotnev, 2015-11-24
@inkvizitor68sl

First of all, forbid the web server from writing anywhere except its tmp_dir (a separate one per site) and session_dir. Everything else is half measures.
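As a sketch of that permission scheme (the paths, the deploy user, and the www-data user below are assumptions, adjust to your layout): site code is owned by a non-web user, and the web server gets write access only to its own tmp and session directories:

```shell
#!/bin/sh
# Hypothetical layout: code owned by a "deploy" user, php/apache runs as www-data.
SITE=/var/www/example-site

# Code: readable by everyone, writable only by the deploy user.
chown -R deploy:deploy "$SITE"
find "$SITE" -type d -exec chmod 755 {} +
find "$SITE" -type f -exec chmod 644 {} +

# The only writable locations for the web server: per-site tmp and sessions.
mkdir -p "$SITE/tmp" "$SITE/sessions"
chown www-data:www-data "$SITE/tmp" "$SITE/sessions"
chmod 700 "$SITE/tmp" "$SITE/sessions"
```

With this, an exploited script cannot drop a new .php file into the docroot, because the process simply has no write permission there.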
> I would also like to keep logs that would show when new PHP scripts appear in the site folders.
That's called git + a cron job that runs git status and mails the output (if it is non-empty).

Alexander Aksentiev, 2015-11-24
@Sanasol

clamav
At this point it seems a little too late to do anything.
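Still, a regular clamav sweep can be sketched roughly like this (the scan path and mail address are assumptions); clamscan prints infected files, and we only mail when the log is non-empty:

```shell
#!/bin/sh
# Hypothetical nightly scan of the docroots with clamav's clamscan.
LOG=$(mktemp)
# -r: recurse, -i: print only infected files, --no-summary: keep the log terse.
clamscan -r -i --no-summary /var/www > "$LOG"
if [ -s "$LOG" ]; then
    mail -s "clamscan findings on $(hostname)" admin@example.com < "$LOG"
fi
rm -f "$LOG"
```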

Vladimir Seregin, 2015-11-30
@Heavis

Read the CIS Apache HTTP Server Benchmark (download from here, you only need to fill out a form). It contains baseline recommendations for setting access rights, minimizing functionality, configuring TLS, logging, and so on. There are two versions of the document, for Apache 2.4 and 2.2.
To protect against some attacks, use modsecurity + the Core Rule Set. It is not as harsh as naxsi: it works as a blacklist rather than a whitelist. After tuning the rules to exclude false positives, you can live with it.
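A minimal sketch of wiring modsecurity to the Core Rule Set on Apache 2.4; the file paths below are assumptions (they depend on how CRS was installed), so adjust them to your distribution:

```apacheconf
# Hypothetical /etc/apache2/conf-enabled/modsecurity-crs.conf
<IfModule security2_module>
    SecRuleEngine On
    # Pull in the OWASP Core Rule Set setup and rules (paths are illustrative)
    IncludeOptional /etc/modsecurity/crs/crs-setup.conf
    IncludeOptional /etc/modsecurity/crs/rules/*.conf
</IfModule>
```

Start with `SecRuleEngine DetectionOnly` instead of `On` to log matches without blocking while you weed out false positives.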

Alexander Kubintsev, 2015-11-25
@akubintsev

There is a good module, nginx-naxsi, but it is very strict: you need to test it so that it doesn't cut off too much. In other words, it takes a decent amount of time to set up (at the very least it requires reviewing the logs), although it is not complicated in itself.
It doesn't solve the problem of tracking the appearance of new files, their changes, etc., but it partially closes the holes exploitable through requests.
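A rough naxsi setup, shown in learning mode first so it logs instead of blocking while you tune the whitelists (file paths, score thresholds, and the blocked-page location are illustrative):

```nginx
http {
    # Core attack signatures shipped with naxsi (path is an assumption)
    include /etc/nginx/naxsi_core.rules;

    server {
        location / {
            SecRulesEnabled;
            LearningMode;          # log matches, don't block; remove once tuned
            DeniedUrl "/blocked";  # internal location for rejected requests
            CheckRule "$SQL >= 8" BLOCK;
            CheckRule "$XSS >= 8" BLOCK;
        }
        location /blocked {
            return 403;
        }
    }
}
```

Once the learning-phase logs stop showing false positives on legitimate traffic, drop `LearningMode` to start blocking.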

repeat, 2015-11-25
@repeat

I recently had my database pulled out completely with sqlmap. All because of inexperience, laziness, and unsafe queries.
I changed the queries to something like

// Parameterised query via PDO: user input is bound, never concatenated into the SQL string.
$stmt = $dbLFSP->prepare("SELECT online, username FROM users WHERE hostname = :hostname");
$stmt->bindValue(':hostname', $hostname, PDO::PARAM_STR);
$stmt->execute();
$users = $stmt->fetchAll(PDO::FETCH_ASSOC);

and there were far fewer problems after that.
