How to increase the variance of a series (an array of numbers)?
The task is as follows: there is an array of numbers (say, 70 elements), all of type byte (0 to 255). The problem is that the values do not differ enough; for example, the minimum is 1 and the maximum is 20 (although in a particular case it could be 255). I would like to increase the "contrast" of the numbers so that the maximum stays the same (does not grow) while the rest change: the small values become even smaller and the mid-range values decrease, but not too much, so that the spread is even. I have tried to work out what manipulations on this data would achieve that, but I can't formulate it mathematically; trigonometric functions probably need to be applied here, but I'm not good at them. I didn't find anything similar on Google, so if anyone is familiar with this or well versed in it: any ideas on how the array could be processed?
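One possible formalization of what is being asked, offered as a minimal sketch rather than a known answer: a power-law ("gamma") curve keeps the maximum fixed while pushing smaller values down harder. The function name and the gamma value here are illustrative assumptions, not anything from the question.

```python
# Sketch: power-law ("gamma") contrast stretch. With gamma > 1 the maximum
# maps to itself, small values shrink the most, and mid-range values come
# down moderately. gamma=2.0 is an arbitrary illustrative pick.
def stretch_contrast(data, gamma=2.0):
    peak = max(data)
    if peak == 0:
        return list(data)  # all zeros: nothing to stretch
    # x -> peak * (x / peak) ** gamma fixes 0 and peak and, for gamma > 1,
    # pulls every value in between downward.
    return [round(peak * (x / peak) ** gamma) for x in data]

print(stretch_contrast([1, 5, 10, 20]))  # -> [0, 1, 5, 20]
```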
Without delving into the mathematical side of the issue, the first solution that comes to mind is to build a map for the elements of the array: the smaller the element, the higher its index in this map, and the stronger the subsequent change applied to that element.
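A minimal sketch of that rank-map idea, assuming "stronger change" means a value loses a larger fraction of itself the further its rank sits below the maximum; the 0.5 cap on that fraction is an illustrative choice, not part of the answer:

```python
# Sketch of the rank-map idea: rank each distinct value, then scale values
# down in proportion to how far their rank sits below the maximum's rank.
def rank_map_stretch(data):
    ranks = {v: i for i, v in enumerate(sorted(set(data)))}
    top = max(ranks.values())
    out = []
    for x in data:
        # weight is 0 for the largest value (kept as-is) and 1 for the
        # smallest, so smaller elements are changed more strongly
        weight = (top - ranks[x]) / top if top else 0.0
        out.append(round(x * (1 - 0.5 * weight)))
    return out

print(rank_map_stretch([1, 5, 10, 20]))  # -> [0, 3, 8, 20]
```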
Clumsy, quick, simple:
Increase the dynamic range. Find the min and max in the array, then map all numbers from the range [min..max] onto the range [0..max] (or, if you need the maximum range, onto [0..0xFF]); that is, normalize.
And what does trigonometry have to do with it, by the way?
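A minimal sketch of the normalization described above, for plain Python lists. Note that mapping onto [0..0xFF] raises the maximum, while passing new_max=max(data) maps onto [0..max] and keeps the maximum fixed, as the question asked:

```python
# Sketch: map [min..max] linearly onto [0..new_max]. The default new_max of
# 0xFF stretches to the full byte range.
def normalize(data, new_max=0xFF):
    lo, hi = min(data), max(data)
    if hi == lo:
        return [0] * len(data)  # no spread: nothing to normalize
    return [round((x - lo) * new_max / (hi - lo)) for x in data]

print(normalize([1, 5, 10, 20]))  # -> [0, 54, 121, 255]
```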