System administration
dbmaster, 2012-04-09 18:00:41

How to automate remote file search?

Hey!

With spring here, the system administrators are complaining that the weekly backup of all the servers takes more than a day, and they have asked us to clean up the backups. We have more than 20 SQL servers, so searching them by hand is not appealing.

The task is as follows:
- find "large" *.bak files that have not been updated for more than a month
- find large installer files (*.exe, *.zip, etc.)
- find detached database files (mdf / ndf / ldf)

As a solution I chose WMI.

The query
select * from CIM_DataFile where extension='bak'
runs at an acceptable speed,

but the queries
select * from CIM_DataFile where name like '%.bak' and FileSize > 1000000
select * from CIM_DataFile where FileSize > 100000000
select * from CIM_DataFile

work "infinitely" the

server is not 100% loaded, only the antivirus is noticeable on loading.
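
For example, such a query can be run against a remote server from PowerShell on an admin workstation (a minimal sketch; the server name below is just a placeholder):

# Placeholder server name; Get-WmiObject queries the remote machine over DCOM
Get-WmiObject -ComputerName "SQL01" `
    -Query "SELECT Name, FileSize, LastModified FROM CIM_DataFile WHERE Extension = 'bak'" |
    Select-Object Name, FileSize, LastModified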

A Google search led me to the article technet.microsoft.com/en-us/library/ee176621.aspx, which says that searching 80,000 files through WMI is about 6 times slower, but that was on Windows 2000.

Our environment is Windows Server 2003 - 2008 R2.

How can I speed up the search and what other options are there?

Thanks

3 answers
AlekseyPolyakov, 2012-05-26
@AlekseyPolyakov

I think your approach needs a slight change: in my opinion it is enough to filter on the file's creation date and its extension. Since you have MSSQL, I don't see a problem - set up cleanup of backup files by age, that is one of the standard scheduled MSSQL maintenance tasks. Solutions for searching files by date are easy to find on Google.
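
For example, a minimal sketch of that kind of extension-plus-date search in PowerShell (the UNC path and the one-month cutoff are just examples; this only reports the matches, it does not delete anything):

# Example admin-share path and cutoff; use CreationTime instead of LastWriteTime to filter on creation date
$cutoff = (Get-Date).AddMonths(-1)
Get-ChildItem -Path "\\SQL01\d$\Backup" -Recurse -Filter *.bak |
    Where-Object { -not $_.PSIsContainer -and $_.LastWriteTime -lt $cutoff } |
    Select-Object FullName, Length, LastWriteTime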

Puma Thailand, 2012-04-10
@opium

It seems to me that a simple file enumeration script with instant deletion will suit you.
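
For instance, a minimal sketch of such a script in PowerShell (the path, extensions and age threshold are just examples; -WhatIf only previews the deletions, drop it once the output looks right):

# Example path and threshold; -WhatIf shows what would be deleted without removing anything
Get-ChildItem -Path "D:\Backup" -Recurse -Include *.bak, *.zip, *.exe |
    Where-Object { $_.LastWriteTime -lt (Get-Date).AddMonths(-1) } |
    Remove-Item -WhatIf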

ComodoHacker, 2012-05-26
@ComodoHacker

What other options are there?

  1. Log Parser. The syntax is much the same - SQL - which is a big plus: it lets you set tricky conditions and generate reports. As for performance, I'm sure it will be faster; as far as I remember, it does not go through WMI but works directly with WinAPI. I recall that finding all the recently changed files on the C: drive (a home system) took no more than a couple of minutes. (A sketch is after this list.)
  2. The forfiles command. It is built in on 2008; on 2003 you can copy it over. The search performance itself, I think, is not much different from dir /s. The catch is that a new process is spawned to run the action on each selected file, which can slow things down.
  3. PowerShell. I can't say anything about speed, I have no experience with it, but I think it will be fine (as long as it doesn't work through WMI).
  4. And finally, a plain dir with the output processed in a batch file or in JS.
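
For option 1, a minimal sketch, assuming Log Parser 2.2 is installed in its default folder (the install path, backup folder and size threshold below are just examples); the FS input format walks the file system directly instead of going through WMI:

# Run from PowerShell; adjust the install path and folder to your environment
& "C:\Program Files (x86)\Log Parser 2.2\LogParser.exe" -i:FS -recurse:-1 `
    "SELECT Path, Size, LastWriteTime FROM 'D:\Backup\*.bak' WHERE Size > 100000000"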
