Why does DNS not resolve domains with a huge number of HTTP requests?
I need to check 200 million domains for availability and for whether each one runs a particular CMS.
I'm using PHP 7.1 and running the checks across many parallel processes.
HARDWARE AND SETTINGS
Set up Unbound; it will perform recursive queries itself. Correct configuration is described in its documentation. In principle, the default config is enough: you only need to specify the external/internal interfaces and, in the access control list, allow resolving only from 127.0.0.1.
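Following that advice, a minimal unbound.conf sketch might look like this (the addresses and netblocks are placeholders; adjust them to your own host):

```
# Minimal local-resolver sketch for /etc/unbound/unbound.conf
server:
    interface: 127.0.0.1
    port: 53
    # Allow recursion only from the local machine, refuse everyone else
    access-control: 127.0.0.0/8 allow
    access-control: 0.0.0.0/0 refuse
```

Your PHP workers would then query 127.0.0.1 instead of a public resolver, so no external rate limit applies.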
The cause of the problem is that Google DNS is rate-limiting you.
The solution is to use a different recursive DNS server, preferably your own, for example as athacker advised you above.
I have a tool that can resolve a large volume of DNS queries through third-party DNS servers. Say there is a list of 30k such servers: my tool can take a 4 GB file of email addresses and resolve their MX and A records in about 40 minutes on an ordinary server with no bells and whistles. If you're interested, it can be adapted to your needs.
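The tool itself isn't shown, but the core idea — spreading queries across a pool of third-party resolvers so no single server sees enough traffic to rate-limit you — can be sketched in a few lines of Python (the resolver IPs and domains are placeholders):

```python
from itertools import cycle

# Hypothetical pool of third-party recursive resolvers (placeholder addresses);
# the answer above assumes a list of ~30k of these.
NAMESERVERS = ["9.9.9.9", "1.1.1.1", "8.8.8.8"]

def assign_resolvers(domains, nameservers):
    """Pair each domain with a resolver round-robin, so the query load
    is spread evenly across the whole pool."""
    pool = cycle(nameservers)
    return [(domain, next(pool)) for domain in domains]

jobs = assign_resolvers(["a.com", "b.com", "c.com", "d.com"], NAMESERVERS)
# Each (domain, resolver) job can then be handed to a worker process
# that sends the actual MX/A query to that one resolver.
```

The scheduling is deliberately separated from the actual network I/O so it can be driven by whatever worker model you already use (PHP processes, a thread pool, etc.).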
First decide on the result you want to get, then work out how to achieve it.
1. For starters, I would look up the domain's data via WHOIS.
https://www.imena.ua/domains/whois?domain=toster.ru
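WHOIS is a plain-text protocol over TCP port 43, so the lookup and the extraction of the name-server lines are easy to script; a minimal Python sketch (the WHOIS server is TLD-dependent and given here only as a placeholder, and the sample record is illustrative):

```python
import socket

def whois_query(domain, server="whois.iana.org", timeout=10):
    """Fetch the raw WHOIS record for a domain over TCP port 43.
    The right server depends on the TLD (e.g. a .ru registry server for .ru)."""
    with socket.create_connection((server, 43), timeout=timeout) as sock:
        sock.sendall(domain.encode() + b"\r\n")
        chunks = []
        while chunk := sock.recv(4096):
            chunks.append(chunk)
    return b"".join(chunks).decode(errors="replace")

def parse_nservers(whois_text):
    """Pull the name-server hostnames out of 'nserver:' lines."""
    return [line.split(":", 1)[1].strip().rstrip(".")
            for line in whois_text.splitlines()
            if line.lower().startswith("nserver:")]

sample = """nserver: ns1.habradns.net.
nserver: ns2.habradns.net.
nserver: ns3.habradns.net."""
print(parse_nservers(sample))
```

In a real run you would feed `whois_query(domain)` into `parse_nservers`; the sample text here just mirrors the record shown below.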
2. Then, for the domain's name servers:
nserver: ns1.habradns.net.
nserver: ns2.habradns.net.
nserver: ns3.habradns.net.
I would check whether the records are current and look at their TTLs. From that data you can tell how often you need to refresh the information about each domain.
3. After that, knowing the site's name and IP, you can check whether it is up and what CMS, services, etc. are installed there.
You can check this either directly or indirectly, through a search engine's cache.
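A common way to do the direct CMS check is to fetch the front page and look for telltale markers in the HTML; a minimal Python sketch, where the fingerprint strings are illustrative examples rather than an exhaustive list:

```python
# Hypothetical marker-string -> CMS mapping; real fingerprint lists are
# much longer and also look at headers, cookies, and well-known paths.
FINGERPRINTS = {
    "wp-content/": "WordPress",
    'name="generator" content="Joomla': "Joomla",
    "/bitrix/": "1C-Bitrix",
}

def detect_cms(html):
    """Return the first CMS whose marker appears in the page HTML, else None."""
    for marker, cms in FINGERPRINTS.items():
        if marker in html:
            return cms
    return None

page = '<link rel="stylesheet" href="/wp-content/themes/x/style.css">'
print(detect_cms(page))  # WordPress
```

The same function works on HTML pulled from a search engine's cache, which is the indirect route mentioned above.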
PS And for anonymity, I would do it through Tor or a VPN or, at worst, through a proxy.