What is the easiest way to radically speed up R code?
I am a scientist rather than a programmer; I use R to develop new algorithms and run computational experiments. For turning ideas into working tools quickly, and for preparing unusual scientific graphics, R is excellent and never ceases to please. But in some cases, especially in fairly heavy computations, execution speed is depressing. Everything is already parallelized and the fastest constructs are used, yet this seems to be the limit. Plain loops are out of the question: everything has to be rewritten as future_apply and its analogues, which are mind-bending and extremely inconvenient to debug, and even their speed is no longer enough. Where should I go from here? As usual, I want the maximum result for minimal effort. So far I see these options:
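For readers unfamiliar with the pattern mentioned above, a minimal sketch of replacing a loop with `future_lapply` from the future.apply package might look like this (the per-iteration work here is a placeholder, not the asker's actual computation):

```r
library(future.apply)

# Choose a parallel backend; multisession works on all platforms.
plan(multisession)

# Instead of a for-loop accumulating results, each iteration becomes
# an independent function call that the workers execute in parallel.
results <- future_lapply(1:1000, function(i) {
  sum(rnorm(100))  # placeholder for the real per-iteration work
})

# Shut the workers down when done.
plan(sequential)
```

The inconvenience the asker describes is real: errors inside the worker function surface only when results are collected, which makes interactive debugging harder than with an ordinary loop.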
"the fastest constructs are used" — really? I don't know anyone who could honestly say that about their own code =)
Your optimization ideas look quite daunting to me. Wouldn't it be easier to switch to Python if you are already considering such heavyweight solutions?
Where is your data stored? Hopefully in a data.table? It is very, very fast, and it leaves room for further optimization. An extremely superficial suggestion, but it may still help.
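To illustrate the comment above, grouped aggregation in data.table runs in optimized C code and updates by reference, which is where its speed advantage over base data frames comes from. A minimal sketch using the built-in mtcars dataset:

```r
library(data.table)

# Convert an ordinary data frame to a data.table.
dt <- as.data.table(mtcars)

# Grouped aggregation: mean mpg per cylinder count.
# The j expression is evaluated once per group, inside data.table's
# optimized C machinery rather than in interpreted R code.
res <- dt[, .(mean_mpg = mean(mpg)), by = cyl]
```

The same `dt[i, j, by]` form covers filtering, computing, and grouping in one call, avoiding the intermediate copies that chained base-R operations create.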