Laravel
Pavel Konchich, 2021-01-15 00:53:38

How to hide a website from all users except search crawlers in Laravel?

I don't have any sample code yet, because I don't even understand how this could be done in plain PHP, so I'm asking for your help!

In general, the task is this:

Close the website to all users.
Show a stub (any HTML template) for every route instead of the site.
Allow access only to search robots.
Allow access via a static parameter in the URL.
That's all, but this is exactly the part I can't manage. Please tell me where to start, or suggest an overall solution to this problem!

2 answers
jazzus, 2021-01-15
@konchychp

Install the jenssegers/agent package and create an IsRobot middleware with this code:

// Inside the IsRobot middleware's handle() method (Agent is Jenssegers\Agent\Agent)
if (! (new Agent())->isRobot()) {
    abort(404); // non-robots get a 404
}

return $next($request);

Then attach the middleware to the routes in RouteServiceProvider:
Route::middleware('web', 'isRobot')
...
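
For reference, a minimal sketch of how the pieces could fit together, assuming Laravel 8 and jenssegers/agent. The 'isRobot' alias, the stub Blade view name, and the secret preview query parameter are illustrative assumptions addressing the question's requirements, not part of the original answer (which simply returns a 404):

<?php

namespace App\Http\Middleware;

use Closure;
use Illuminate\Http\Request;
use Jenssegers\Agent\Agent;

class IsRobot
{
    public function handle(Request $request, Closure $next)
    {
        // Hypothetical bypass for "access by a static URL parameter",
        // e.g. https://example.com/?preview=secret-token
        if ($request->query('preview') === 'secret-token') {
            return $next($request);
        }

        // Search robots detected by jenssegers/agent pass through
        if ((new Agent())->isRobot()) {
            return $next($request);
        }

        // Everyone else gets the stub template for every route
        // (resources/views/stub.blade.php is an assumed name);
        // the answer above uses abort(404) here instead
        return response()->view('stub');
    }
}

// The 'isRobot' alias would be registered in app/Http/Kernel.php:
// protected $routeMiddleware = [
//     // ...
//     'isRobot' => \App\Http\Middleware\IsRobot::class,
// ];

With that alias in place, wrapping the routes in Route::middleware('web', 'isRobot')->group(...) inside RouteServiceProvider covers all four requirements from the question.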

MKE, 2021-01-15

One of the best options is to write the rules in .htaccess, something like this:

# Flag requests whose User-Agent looks like a known search bot
SetEnvIfNoCase User-Agent yandex AllowGroup
SetEnvIfNoCase User-Agent google AllowGroup
...
# Deny everyone except the flagged requests (Apache 2.2 syntax)
Order Deny,Allow
Deny from all
Allow from env=AllowGroup
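
Note that Order/Deny/Allow is Apache 2.2 syntax; on Apache 2.4 the equivalent check would be Require env AllowGroup. Denied requests receive a 403, so an ErrorDocument 403 directive pointing at a static stub page could cover the "show a stub" requirement, as long as the stub file itself is exempted from the deny rule (for example with an extra SetEnvIf on Request_URI). The stub file name and the exact list of bot user agents are up to you.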
