Java
medovuk, 2018-07-19 11:23:16

How do I efficiently (in terms of time) write a huge amount of data to a file?

Hi all. I am a beginner Java developer and I need some expert advice. The task: process a huge (really huge) number of files (search, indexing, etc.). In the process I need to write 10^6 - 20^6 lines (info about the files) to a file (well, that is how I am approaching the task). Please advise the fastest way to do this.
I would be very grateful for your help. :)

3 answers
pfg21, 2018-07-19
@pfg21

Do a test run and see which part of the system is the bottleneck (see the timing sketch below).
If the bottleneck is writing to the storage medium, apply data compression.
If the bottleneck is processing the results, get a more powerful CPU.
If the bottleneck is gathering all the information about the files, speed up the file system.
And so on.
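
To illustrate the first point, here is a minimal sketch of such a test run that times the "gather file info" and "write to file" phases separately. collectFileInfo() is a hypothetical stand-in for the real scanning logic, and index.txt is a made-up output name; whichever phase dominates is the one worth optimizing first.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class PhaseTiming {
    public static void main(String[] args) throws IOException {
        long t0 = System.nanoTime();
        List<String> lines = collectFileInfo();   // hypothetical "gather info" phase
        long t1 = System.nanoTime();

        Files.write(Path.of("index.txt"), lines); // "write to file" phase
        long t2 = System.nanoTime();

        System.out.printf("gathering: %d ms, writing: %d ms%n",
                (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000);
    }

    // Stand-in for the real file-scanning code.
    private static List<String> collectFileInfo() {
        return List.of("example.txt;1024;2018-07-19");
    }
}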

m0nym, 2018-07-19
@m0nym

Use a buffer. Writing byte by byte is wildly slow.
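
A minimal sketch of the difference, assuming plain text lines are written (the file name and line content are made up): wrapping the FileWriter in a BufferedWriter collects the data in memory and flushes it to disk in large chunks instead of issuing one tiny OS write per call.

import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;

public class BufferedWriteDemo {
    public static void main(String[] args) throws IOException {
        // new FileWriter("out.txt") alone would push every small write towards the OS;
        // the BufferedWriter (here with a 64 KB buffer) batches them.
        try (BufferedWriter out = new BufferedWriter(new FileWriter("out.txt"), 1 << 16)) {
            for (int i = 0; i < 1_000_000; i++) {
                out.write("file #" + i + ": some info");
                out.newLine();
            }
        } // try-with-resources flushes and closes the buffer
    }
}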

Alexey Cheremisin, 2018-07-19
@leahch

You can also apply distributed processing and look towards Hadoop, for example, or Spark, Hazelcast, Ignite.
And, as I wrote in a comment to pfg21's answer, apply LZ4 or Snappy compression.
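
LZ4 and Snappy need third-party libraries (e.g. lz4-java or snappy-java), but the wiring is the same idea as with the JDK's built-in GZIP: put a compressing stream between the writer and the file. A minimal sketch using only the standard library (the file name and line content are made up):

import java.io.BufferedWriter;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStreamWriter;
import java.nio.charset.StandardCharsets;
import java.util.zip.GZIPOutputStream;

public class CompressedWriteDemo {
    public static void main(String[] args) throws IOException {
        // Text about files compresses well, so this trades some CPU for much less disk I/O.
        try (BufferedWriter out = new BufferedWriter(
                new OutputStreamWriter(
                        new GZIPOutputStream(new FileOutputStream("index.txt.gz")),
                        StandardCharsets.UTF_8))) {
            for (int i = 0; i < 1_000_000; i++) {
                out.write("file #" + i + ": some info");
                out.newLine();
            }
        }
    }
}

With lz4-java or snappy-java, the GZIPOutputStream would be replaced by the corresponding compressing OutputStream from that library.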
