What free software is capable of processing text files with server access logs?
There are logs saved in text files, each at least a gigabyte.
I need to analyze or process them to identify the most frequent addresses the requests come from. Log density is 50-70 lines per second, and the period to analyze is 3 days.
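Since the goal is just the most frequent source addresses, a streaming counter is often enough even for gigabyte files, because nothing is loaded into memory except the counts. A minimal Python sketch; it assumes the common/combined log format where the client IP is the first whitespace-separated field, so adjust the split for other layouts:

```python
# Stream a large access log line by line and count the most
# frequent client IPs with collections.Counter.
# Assumption: the IP is the first whitespace-separated field
# (as in Apache common/combined log format).
from collections import Counter

def top_ips(path, n=10):
    counts = Counter()
    with open(path, errors="replace") as f:
        for line in f:                      # one line at a time: O(1) memory per line
            parts = line.split(None, 1)     # split off the first field only
            if parts:
                counts[parts[0]] += 1
    return counts.most_common(n)            # list of (ip, hit_count), most frequent first
```

For ~18 million lines this runs in minutes and never holds more than the distinct-IP table in memory.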
Look at www.weblogexpert.com/lite.htm (a Pro version exists; it also circulates on rutracker),
and the classic www.awstats.org, though that one takes some fiddling - installing it, feeding it the logs, and so on.
There are also scripts at https://technet.microsoft.com/en-us/scriptcenter/d...
The problem with all these programs is that they can crash on a long file, so for this particular task I recommend doing it by hand:
1) Take www.powergrep.com - the most powerful tool; it handles large files and processes everything imaginable (also sold on rutracker). Use it to extract the IP from each log line (plus, say, the GET request and the response code).
2) Take www.textpad.com - the best text editor in the world (I'm not joking: it's the only one that doesn't choke on gigabyte TXT files; its nearest 1-2 competitors hang at 20-30 GB, while it copes with 40+ GB). Sort the extracted lines in it with F9 and you get the picture.
Instead of (2) you can load the logs - for three days that's only about 18 million lines - into MySQL and run selections with dbForge Studio or MySQL Front.