Burglary protection
weranda, 2016-04-06 20:57:19

Can anyone other than the hosting provider access a site's files?

Greetings

Suppose there is a domain name and a few site files with a database, hosted with a popular hosting provider in Russia. The files sit not in the site root but in a subfolder: site.com/path/path/files. The main page contains nothing but a stub image in an HTML file. The robots.txt file disallows crawling of the site. Can determined people reach the files in those subfolders, or somehow discover the site's structure so that they can then try to reach the files themselves? If they can, how can such folders and files be reliably hidden, to make life as hard as possible for such people and to have at least some well-founded confidence that the data is safe? There is nothing special in the files, just a database of orders from the store.

Only a few simple steps come to mind: setting up .htaccess and robots.txt, and restricting access to specific IP addresses.
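A minimal sketch of the IP-restriction idea in Apache 2.4 .htaccess syntax (the IP address below is a documentation placeholder, and this assumes the host's AllowOverride settings permit these directives):

```apache
# .htaccess placed in site.com/path/path/files/ (Apache 2.4 syntax)
# Allow only one trusted IP; 203.0.113.10 is a placeholder, use your own
Require ip 203.0.113.10

# Disable directory listings so the folder contents cannot be browsed
Options -Indexes
```

With this in place, requests from any other address get a 403 instead of the files.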


4 answers
Sanes, 2016-04-06
@Sanes

They can. You have access, don't you? Then others can get it too: password theft, vulnerabilities in the server or the application. And thermorectal cryptanalysis as a last resort.

nirvimel, 2016-04-07
@nirvimel

Only a few simple steps come to mind: setting up .htaccess and robots.txt

robots.txt has nothing to do with access control; it is merely an agreement between search-engine bots and the site owner so that they do not complicate each other's lives.
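To illustrate: a robots.txt like the one below is purely advisory. A well-behaved crawler will skip the path, but nothing stops anyone from requesting it directly, and the Disallow line actually tells any human reader exactly which path you wanted hidden (the path here is the one from the question):

```
User-agent: *
Disallow: /path/path/files/
```

So robots.txt is a hint for crawlers, not a lock, and for "secret" paths it can even act as a signpost.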
.htaccess is an Apache config file that determines how the web server handles HTTP requests and what it returns, how, and to whom (authorization / user sessions). The web server can be configured to serve or withhold file-system content however you like. In general, the content a "site" (a vague concept anyway) exposes to an HTTP client does not have to mirror the directory structure of the server's file system (the notion that a web server "opens up" the file system to the Web arose historically from the architecture of Apache, which was originally designed for static sites).
Even without any .htaccess configuration, on all serious hostings only the contents of public_html are publicly accessible by default (only on very amateurish hostings does public access start from the root); everything above that level is unreachable from the Web side (unless you deliberately expose it through .htaccess).
Plenty of these "ideological" types thaw out every spring. They can get access anywhere (in words only), couch hackers, parroting one another.

Puma Thailand, 2016-04-07
@opium

Well, I would simply scan the site and download all the files. What gave you the idea that robots.txt forbids me anything?

xmoonlight, 2016-04-07
@xmoonlight

Encrypt each record with a public key and give the files names like:
a2c4d612e1aac5.bin
No one will even try to look at what is in there. And even if they do look, they still won't see anything interesting, because it can only be decrypted with the private key.
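A toy illustration of the public-key idea (textbook RSA with tiny numbers, utterly insecure; real data should be encrypted with a vetted library or tool such as GnuPG, and all numbers and names below are made up for the example):

```python
# Toy textbook RSA: anyone can encrypt with the public key (n, e),
# but only the holder of the private exponent d can decrypt.
p, q = 61, 53
n = p * q        # 3233, the shared modulus
e = 17           # public exponent
d = 2753         # private exponent: (e * d) % lcm(p-1, q-1) == 1

def encrypt(m: int) -> int:
    return pow(m, e, n)   # anyone with the public key can do this

def decrypt(c: int) -> int:
    return pow(c, d, n)   # only the private-key holder can do this

record = 65                   # one "order record" encoded as a number < n
ciphertext = encrypt(record)  # this is what would sit in a .bin file
assert decrypt(ciphertext) == record
```

On the server you would store only the ciphertext files; the private key never leaves your own machine, so even a complete copy of the hosting account yields nothing readable.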
https://ru.wikipedia.org/wiki/%D0%9A%D1%80%D0%B8%D...
