How do I write a huge amount of data to a file efficiently (time-wise)?
Hi all. I'm a beginner Java developer and I could use some expert advice. The task is to process a huge (really huge) number of files (search, indexing, etc.). In the process I need to write 10^6 - 20^6 lines (information about the files) to a file (at least, that's how I'm approaching the task). Please advise the fastest way to do this.
I would be very grateful for your help.)
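A minimal sketch of the usual approach, assuming the records are plain text lines: write everything through a single buffered writer, so the disk sees large sequential writes instead of millions of tiny ones. The class name, file name and line format below are made up for illustration.

import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class IndexWriter {

    // Writes all lines through one BufferedWriter; the buffer batches the
    // small per-line writes into large writes to the file system.
    static void writeLines(Path target, Iterable<String> lines) throws IOException {
        try (BufferedWriter out = Files.newBufferedWriter(target, StandardCharsets.UTF_8)) {
            for (String line : lines) {
                out.write(line);
                out.newLine();
            }
        } // try-with-resources flushes and closes the writer
    }

    public static void main(String[] args) throws IOException {
        writeLines(Path.of("index.txt"),
                   List.of("file1.txt;1024;2023-01-01", "file2.txt;2048;2023-01-02"));
    }
}

Opening the file once and keeping the writes buffered is usually the single biggest win; opening/closing the file or flushing per line is what kills performance.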
Answer the question
In order to leave comments, you need to log in
Do a test run and see which part of the system is the bottleneck (there is a rough timing sketch after this list).
If the bottleneck is writing to the storage medium, apply data compression.
If the bottleneck is processing the results, use a more powerful CPU.
If the bottleneck is gathering the metadata about the files, speed up the file system.
And so on.
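A rough way to do such a test run, assuming the per-file work can be wrapped in a single method: time the processing phase and the writing phase separately. processFile() and the file list below are placeholders for the asker's real code.

import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class BottleneckCheck {

    // Stand-in for the real search/indexing work done on one file.
    static String processFile(Path file) {
        return file.toString();
    }

    public static void main(String[] args) throws IOException {
        List<Path> filesToIndex = List.of(Path.of("a.txt"), Path.of("b.txt")); // your real file list
        long processingNanos = 0;
        long writingNanos = 0;

        try (BufferedWriter out = Files.newBufferedWriter(Path.of("index.txt"))) {
            for (Path file : filesToIndex) {
                long t0 = System.nanoTime();
                String record = processFile(file);   // processing phase
                long t1 = System.nanoTime();
                out.write(record);
                out.newLine();                       // writing phase
                long t2 = System.nanoTime();
                processingNanos += t1 - t0;
                writingNanos += t2 - t1;
            }
        }
        System.out.printf("processing: %d ms, writing: %d ms%n",
                processingNanos / 1_000_000, writingNanos / 1_000_000);
    }
}

Whichever number dominates tells you whether to optimize the I/O side or the processing side.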
You can also apply distributed processing and look towards Hadoop, for example, or Spark, Hazelcast, Ignite.
Well, as I wrote in a comment to pfg21's answer: apply LZ4 or Snappy compression.
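A sketch of what "write through a compressing stream" looks like. GZIPOutputStream is used here only because it ships with the JDK; an LZ4 or Snappy output stream from a third-party library (e.g. lz4-java or snappy-java) would be plugged in at the same spot in the chain. The file name and sample lines are made up.

import java.io.BufferedWriter;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.zip.GZIPOutputStream;

public class CompressedIndexWriter {

    // File stream -> compressing stream -> character encoder -> text buffer.
    // Compression trades CPU time for fewer bytes hitting the disk, which helps
    // when the storage medium is the bottleneck.
    static void writeCompressed(Path target, Iterable<String> lines) throws IOException {
        try (BufferedWriter out = new BufferedWriter(
                new OutputStreamWriter(
                        new GZIPOutputStream(Files.newOutputStream(target)),
                        StandardCharsets.UTF_8))) {
            for (String line : lines) {
                out.write(line);
                out.newLine();
            }
        }
    }

    public static void main(String[] args) throws IOException {
        writeCompressed(Path.of("index.txt.gz"), List.of("file1;1024", "file2;2048"));
    }
}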