eliaskosunen/scnlib

Right way of reading a large file without performance loss

Closed · 1 comment

What is the right way of reading a large file without performance loss?

If I try something like:

  scn::owning_file file{"bench_stdio.txt", "r"};
  auto ran = scn::make_result(file);
  std::vector<double> v;
  double x;
  while ((ran = scn::scan(ran.range(), "{}", x))) {
    v.push_back(x);
  }

Performance seems to be bad.

Can we do this better?

Memory-mapped files generally make this faster:

scn::mapped_file file{"bench_stdio.txt"};
auto result = scn::make_result(file);
double d{};
while ((result = scn::scan(result.range(), "{}", d))) {
    v.push_back(d);
}
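
For reference, here is the snippet above as a self-contained program. This is only a sketch: it sticks to the v0.x API used in this thread (scn::mapped_file, scn::make_result, scn::scan) and assumes the single <scn/scn.h> header pulls in the file support; adjust the include for your scnlib version.

#include <scn/scn.h>  // assumed v0.x umbrella header; adjust for your version

#include <cstdio>
#include <vector>

int main() {
    // Map the whole file into memory; scanning then walks the mapped buffer
    // instead of issuing repeated file reads.
    scn::mapped_file file{"bench_stdio.txt"};

    std::vector<double> v;
    auto result = scn::make_result(file);
    double d{};
    // The loop stops when a scan fails, e.g. at the end of the input.
    while ((result = scn::scan(result.range(), "{}", d))) {
        v.push_back(d);
    }

    std::printf("read %zu values\n", v.size());
}

In a real program you would also want to check why the loop stopped (end of input vs. a parse error) by inspecting the error stored in the final result.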