JavaScript
randomguyasking, 2020-06-01 09:54:15

How to manage memory in JavaScript?

While testing a binary search algorithm, I became curious how it would behave on an array of 4 billion numbers. As expected, when I try to create such an array, Node.js tells me there is no more space on the heap:

<--- Last few GCs --->

[133294:0x5da3760]     1203 ms: Scavenge 1148.2 (1181.1) -> 1148.2 (1181.1) MB, 42.9 / 0.0 ms  (average mu = 1.000, current mu = 1.000) allocation failure 
[133294:0x5da3760]     2175 ms: Mark-sweep 1722.0 (1755.0) -> 1709.7 (1743.0) MB, 479.6 / 0.0 ms  (+ 37.7 ms in 11 steps since start of marking, biggest step 4.7 ms, walltime since start of marking 2099 ms) (average mu = 1.000, current mu = 1.000) allocat

<--- JS stacktrace --->

FATAL ERROR: invalid array length Allocation failed - JavaScript heap out of memory
 1: 0xa295e0 node::Abort() [node]
 2: 0x9782df node::FatalError(char const*, char const*) [node]
 3: 0xb99c2e v8::Utils::ReportOOMFailure(v8::internal::Isolate*, char const*, bool) [node]
 4: 0xb99fa7 v8::internal::V8::FatalProcessOutOfMemory(v8::internal::Isolate*, char const*, bool) [node]
 5: 0xd3a3b5  [node]
 6: 0xd1fe8d  [node]
 7: 0xe7fc9e  [node]
 8: 0xe837fa  [node]
 9: 0x10243a3 v8::internal::Runtime_GrowArrayElements(int, unsigned long*, v8::internal::Isolate*) [node]
10: 0x13a5a99  [node]
[4]    133294 abort (core dumped)  node --heap-prof binarySearch.js


There are two questions:
1) In JavaScript, the Number type takes 64 bits (I did not find any smaller alternatives). Accordingly, an array of 4 billion such elements would take 4,000,000,000 * 64 = 256,000,000,000 bits, or 32 gigabytes. Is my reasoning correct? (A quick check is sketched below the list.)
2) Is there any way to solve this problem, or would simply increasing the memory and heap size help?
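
For what it's worth, the arithmetic in question 1 can be checked directly, and the "increase the heap" option from question 2 corresponds to Node's --max-old-space-size flag. A quick sketch (a plain Array would in practice cost even more than this, because of per-element overhead):

// Question 1: memory for 4 billion 64-bit (8-byte) numbers.
const n = 4e9;
const bytes = n * 8;              // 32,000,000,000 bytes
console.log(bytes / 1e9 + ' GB'); // 32 GB decimal (~29.8 GiB)

// Typed arrays do offer smaller fixed-size elements than Number:
// Float32Array is 4 bytes per element (~16 GB), Int8Array is 1 byte (~4 GB).

// Question 2: the V8 heap limit can be raised, e.g.
//   node --max-old-space-size=40000 binarySearch.js
// but the machine still needs that much RAM (plus swap) to back it.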


1 answer
ayazer, 2020-06-01
@randomguyasking

This isn't specific to JavaScript; it's the typical problem of having too much data to process head-on.
So (if, out of interest, you really want to solve this through binary search) split the tree into several smaller ones. After all, each step halves the search area: if all the data takes up 60 GB (as Sergei Sokolov wrote), then after the first 5 steps you only need to search in 60 / 2^5 ≈ 1.9 GB. One option is to use one tree to narrow the search area down to a size that already fits in memory, then load the second tree and search in it.
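
A minimal sketch of that two-level idea in Node.js, assuming the sorted data has been pre-split into sorted chunk files of 64-bit floats (chunk-0.bin, chunk-1.bin, ...) plus a small index holding each chunk's first value — the file names and layout here are hypothetical:

const fs = require('fs');

// First-level index: the smallest value stored in each sorted chunk file.
// Assumed to have been built once when the data was split (hypothetical).
const chunkStarts = JSON.parse(fs.readFileSync('index.json', 'utf8'));

// Level 1: binary-search the small index to pick the chunk that may hold target.
function findChunk(target) {
  let lo = 0, hi = chunkStarts.length - 1;
  while (lo < hi) {
    const mid = (lo + hi + 1) >>> 1;          // round up so lo can advance
    if (chunkStarts[mid] <= target) lo = mid;
    else hi = mid - 1;
  }
  return lo;
}

// Level 2: load one chunk (small enough for the heap) and binary-search it.
function searchChunk(chunkIdx, target) {
  const buf = fs.readFileSync(`chunk-${chunkIdx}.bin`); // hypothetical name
  // Copy into an aligned ArrayBuffer and view it as 64-bit floats.
  const arr = new Float64Array(
    buf.buffer.slice(buf.byteOffset, buf.byteOffset + buf.byteLength)
  );
  let lo = 0, hi = arr.length - 1;
  while (lo <= hi) {
    const mid = (lo + hi) >>> 1;
    if (arr[mid] === target) return { chunk: chunkIdx, index: mid };
    if (arr[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return null;                                // not found
}

function search(target) {
  return searchChunk(findChunk(target), target);
}

console.log(search(42));

With chunks capped at, say, 1 GB, only the small index stays in memory permanently, and exactly one chunk is loaded per lookup.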
