Apache HTTP Server
Monty Python, 2018-03-31 01:50:23

What are some good ways to make a site completely private?

How can you completely hide a site so that it cannot be found from the outside, by people or by any kind of robot? So far I have:
1. A robots.txt that disallows everything:

User-agent: *
Disallow: /

2. Changing the standard ports 80 and 443 to non-standard ones.
What should be done before registering a domain, where do I start, and what else can be done?
P.S. Restricting access by IP was suggested below, but that won't work here: it must be possible to reach the site from any IP.
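For point 2, a minimal sketch of what the port change looks like in the Apache configuration (the port numbers here are just example values):

```apacheconf
# Listen on non-standard ports instead of 80/443 (example values)
Listen 8080
Listen 8443 https
```

Note that clients then have to include the port in the URL explicitly, e.g. http://example.com:8080/ — which by itself is obscurity, not access control.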


7 answers
forspamonly2, 2018-04-01
@Austin1

Add HTTP Basic authentication, and give clients links with the login and password embedded, like http://user:[email protected]/
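A minimal sketch of Basic authentication in .htaccess (the file path is an assumption; the password file would be created with `htpasswd -c /home/example/.htpasswd user`):

```apacheconf
# .htaccess — require a valid login for the whole site
# (the .htpasswd path below is an assumed example)
AuthType Basic
AuthName "Private area"
AuthUserFile /home/example/.htpasswd
Require valid-user
```

Keep the .htpasswd file outside the document root so it cannot be downloaded.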

Orkhan Hasanli, 2018-03-31
@azerphoenix

Why not restrict access to the site via .htaccess to specific IPs only?

Order deny,allow
Deny from all
# Space-separated list of IPs that are allowed access
Allow from 194.111.70.48 194.78.47.128

cPanel also has features like Leech Protection and directory password protection: you can set a password on a directory, and a user sees the site only after entering it.
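The Order/Deny/Allow directives above are Apache 2.2 syntax. On Apache 2.4 (mod_authz_core) the equivalent policy, using the same example IPs, would be sketched as:

```apacheconf
# Apache 2.4 syntax — same policy as the 2.2 example above
<RequireAny>
    Require ip 194.111.70.48
    Require ip 194.78.47.128
</RequireAny>
```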

xmoonlight, 2018-03-31
@xmoonlight

A local HTML page with a single "Login" button.
The form sends a POST request with one dynamic field (a token) derived from a public key previously received from the server and hardcoded into the page.

Sergey Sergey, 2018-03-31
@hahenty

You just need to give knowledgeable clients a secret hostname via their hosts file. Clients arriving by bare IP, without the correct hostname, land on the default "It works" page, per the standard Apache setup. Registering a domain name is optional in this case.
Either way, such "stealth" requires some kind of identification from clients: a hostname (my example), a password, or an encryption key.
A similar option is for knowledgeable clients to add a self-signed certificate to their trusted store.
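The hosts-file scheme above can be sketched with two name-based virtual hosts (the hostname, IP, and paths are assumptions):

```apacheconf
# Default vhost: anyone hitting the bare IP or a wrong name gets a stub page
<VirtualHost *:80>
    ServerName default.invalid
    DocumentRoot /var/www/stub
</VirtualHost>

# Secret vhost, reachable only with the right Host header, e.g. via a
# client-side hosts-file entry:  203.0.113.10 secret.internal
<VirtualHost *:80>
    ServerName secret.internal
    DocumentRoot /var/www/site
</VirtualHost>
```

Apache serves the first matching vhost, so the stub must come first to act as the catch-all.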

Vladislav Lyskov, 2018-03-31
@Vlatqa

Put it behind FTP authorization.

Dimonchik, 2018-03-31
@dimonchik2013

Change the ports + authorization by client TLS/SSL certificates.
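Requiring client certificates can be sketched in the mod_ssl configuration like this (the CA file path is an assumption; each client would install a certificate signed by that CA):

```apacheconf
# Sketch: only clients presenting a certificate signed by your own CA get in
SSLVerifyClient require
SSLVerifyDepth 1
SSLCACertificateFile /etc/ssl/private/my-client-ca.pem
```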

Denis Verbin, 2018-03-31
@rez0n

Authorization (in the application or via .htaccess) or an IP restriction. In any other case, Googlebot will sooner or later discover and crawl the site despite robots.txt (robots.txt is purely advisory, and search-engine bots do not always honor it).
