Java
Therapyx, 2016-03-14 18:00:40

BufferedReader or Scanner?

Good evening. I have already Googled plenty, and nowhere did I find a single definitive opinion or any concrete numbers.
What is better to use for reading and splitting huge text files (for example, CSV)?
A trivial code example:

String filePath = "C:/test.csv";
// try-with-resources closes the reader even if an exception is thrown
try (BufferedReader br = new BufferedReader(
        new InputStreamReader(new FileInputStream(filePath), StandardCharsets.UTF_8))) {
    String line;
    while ((line = br.readLine()) != null) {
        String[] parts = line.split("\t");
        for (String part : parts) {
            System.out.println(part);
        }
    }
}

Or would it be better with a Scanner, and why?
P.S. Unfortunately, I won't have any files to test with for the next couple of days. Maybe someone here has already measured it? :)
ANSWER: I measured it, even with a not-very-large file.
The result: BufferedReader = 80 milliseconds, Scanner = 400 milliseconds.
In general, for parsing big data the answer is obvious.
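A rough sketch of how such a comparison can be made with standard-library classes only (the class name ReadBench and the in-memory test data are illustrative; the original poster read from a real file, and a serious measurement would use a benchmark harness like JMH):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.io.UncheckedIOException;
import java.util.Scanner;

public class ReadBench {
    // Count lines using BufferedReader.readLine()
    static int countBuffered(String data) {
        int n = 0;
        try (BufferedReader br = new BufferedReader(new StringReader(data))) {
            while (br.readLine() != null) n++;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return n;
    }

    // Count lines using Scanner.nextLine()
    static int countScanner(String data) {
        int n = 0;
        try (Scanner sc = new Scanner(data)) {
            while (sc.hasNextLine()) {
                sc.nextLine();
                n++;
            }
        }
        return n;
    }

    public static void main(String[] args) {
        // Build ~200k tab-separated lines in memory
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 200_000; i++) sb.append("a\tb\tc\n");
        String data = sb.toString();

        long t0 = System.nanoTime();
        int a = countBuffered(data);
        long t1 = System.nanoTime();
        int b = countScanner(data);
        long t2 = System.nanoTime();

        System.out.printf("BufferedReader: %d lines in %d ms%n", a, (t1 - t0) / 1_000_000);
        System.out.printf("Scanner:        %d lines in %d ms%n", b, (t2 - t1) / 1_000_000);
    }
}
```

Scanner is slower here mainly because each nextLine() call runs a regex match, while readLine() just scans for line terminators.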


2 answer(s)
Alexander Dorofeev, 2016-03-14
@Therapyx

Nowadays they are almost the same in terms of speed.
However, Scanner has the useDelimiter(Pattern) and findInLine(Pattern) methods, which let you search for delimiters directly, so it is more convenient for .csv files.
In general, why reinvent the wheel when there are excellent libraries like Apache Commons CSV or opencsv?
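A minimal sketch of the useDelimiter approach mentioned above (the class name ScannerSplit and the delimiter pattern are illustrative; here the pattern treats both tabs and line breaks as token separators):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Scanner;

public class ScannerSplit {
    // Tokenize tab-separated input with Scanner.useDelimiter:
    // one or more tabs or line-break characters end a token
    static List<String> tokens(String data) {
        List<String> out = new ArrayList<>();
        try (Scanner sc = new Scanner(data)) {
            sc.useDelimiter("[\\t\\r\\n]+");
            while (sc.hasNext()) {
                out.add(sc.next());
            }
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(tokens("id\tname\n1\tAlice\n2\tBob"));
        // → [id, name, 1, Alice, 2, Bob]
    }
}
```

Note that this flattens the row structure; a real CSV parser (such as Apache Commons CSV) also handles quoting and embedded delimiters, which neither Scanner nor String.split does.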

asd111, 2016-03-15
@asd111

The fastest way to parse large files is C++: a Boost memory-mapped file plus the Boost.Spirit parser.
https://habrahabr.ru/post/246257/
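The linked article is about C++, but the same memory-mapping idea is available in Java through java.nio. A minimal sketch (class name, file names, and the temp-file helper are illustrative):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.MappedByteBuffer;
import java.nio.channels.FileChannel;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

public class MappedRead {
    // Read a whole file through a memory-mapped buffer: the OS pages the
    // file into memory on demand instead of copying via read() calls
    static String readMapped(Path path) {
        try (FileChannel ch = FileChannel.open(path, StandardOpenOption.READ)) {
            MappedByteBuffer buf = ch.map(FileChannel.MapMode.READ_ONLY, 0, ch.size());
            byte[] bytes = new byte[buf.remaining()];
            buf.get(bytes);
            return new String(bytes, StandardCharsets.UTF_8);
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    // Helper: write content to a temp file and return its path
    static Path writeTemp(String content) {
        try {
            Path tmp = Files.createTempFile("mapped-demo", ".csv");
            Files.writeString(tmp, content);
            return tmp;
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }

    public static void main(String[] args) {
        Path tmp = writeTemp("1\t2\t3\n");
        System.out.print(readMapped(tmp));
    }
}
```

For files that fit comfortably in memory this buys little over BufferedReader; the technique pays off on multi-gigabyte files where avoiding copies and seeks matters.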
