I am building a workstation for Data Science. Do I need ECC memory?
I am building a workstation for Data Science and have settled on 128 GB of RAM. Do I need ECC memory in this case?
It seems that the more memory there is (more chips), the higher the likelihood of spontaneous bit errors. What is the probability that 24 continuous hours of computation will go down the drain because of one or a few such errors?
Non-zero but fairly small, perhaps on the order of 10%.
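For a feel of the numbers, here is a minimal back-of-envelope sketch in Python. The per-gigabyte error rate is a placeholder assumption, not a measured value; published DRAM field studies disagree by orders of magnitude, so substitute your own figure.

```python
import math

# Assumed soft-error rate per gigabyte of DRAM per hour.
# This is an illustrative assumption, NOT a measured value.
ERRORS_PER_GB_PER_HOUR = 1e-5

ram_gb = 128
hours = 24

# Poisson model: bit flips arrive independently at a constant rate.
expected_errors = ERRORS_PER_GB_PER_HOUR * ram_gb * hours
p_at_least_one = 1 - math.exp(-expected_errors)

print(f"Expected bit flips over the run: {expected_errors:.3f}")
print(f"P(at least one flip in {hours} h): {p_at_least_one:.1%}")
```

With the rate assumed above the result is about 3%; a rate ten times higher pushes it past 25%. So the honest answer depends on a number that is poorly characterized for consumer DRAM, and that uncertainty is exactly the gap ECC closes.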
Such errors are largely compensated for by the software itself.
It depends on how the neural network is logically designed, how many classifiers it contains, and whether the final weights are dynamically rebalanced between those classifiers.
If the NN is not an exact mathematical model and the classifier weights are rebalanced dynamically, you will not even notice half of such errors.
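As a purely illustrative sketch of why many flips go unnoticed: flipping the least-significant mantissa bit of a single float32 weight in a toy linear classifier barely moves the output. The toy model below is an assumption for the demonstration, not anyone's actual network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "classifier": probabilities = softmax(W @ x)
W = rng.normal(size=(10, 100)).astype(np.float32)
x = rng.normal(size=100).astype(np.float32)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

baseline = softmax(W @ x)

# Simulate a soft error: reinterpret the float32 weights as raw bits
# and flip the least-significant mantissa bit of one weight.
W_flipped = W.copy()
bits = W_flipped.reshape(-1).view(np.uint32)
bits[0] ^= 1  # relative change to the weight is about 1e-7

perturbed = softmax(W_flipped @ x)
print("Max change in output probabilities:",
      np.abs(perturbed - baseline).max())
```

The caveat: a flip in an exponent or sign bit can scale a weight by huge factors or flip its sign, so "the software compensates" holds for flips landing in the low-order mantissa bits (the majority of a float32's bits), not for every flip, and it says nothing about flips in pointers, indices, or file buffers.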