Neural networks
vovak1919, 2019-11-07 19:47:45

How does normalization work?

I am studying an example from F. Chollet's book "Deep Learning with Python".
Link to example
There is the following code:

As we process the data, we will subtract the mean of each time series and divide by the standard deviation. The first 200,000 samples will be used for training, so the mean and standard deviation are computed only over this portion of the data.

In [8]:
mean = float_data[:200000].mean(axis=0)  # per-feature mean of the first 200,000 rows
float_data -= mean
std = float_data[:200000].std(axis=0)    # per-feature standard deviation of the first 200,000 rows
float_data /= std

It seems that normalization should concern only the first 200K samples, but the debugger shows that the data beyond this range (over 200K) also changes. Is this a mistake or have I misunderstood the meaning of the operation?


1 answer
Vladimir Olohtonov, 2019-11-07
@vovak1919

The mean and standard deviation are computed from the first 200k samples only, and then the entire dataset is normalized with those statistics.
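
A minimal sketch of what happens, using NumPy and made-up data in place of the book's float_data (the array shape and values here are assumptions for illustration):

import numpy as np

# Stand-in for float_data: 300,000 rows, 5 features (made-up values)
float_data = np.random.randn(300000, 5) * 10 + 3

# Statistics are taken from the training slice only ...
mean = float_data[:200000].mean(axis=0)
float_data -= mean            # ... but the in-place subtraction changes every row
std = float_data[:200000].std(axis=0)
float_data /= std             # the in-place division is also applied to the whole array

# Rows beyond 200,000 have changed too, which is what the debugger shows
print(float_data[200000:].mean(axis=0))

Normalizing the validation and test portions with statistics computed only on the training portion is deliberate: it keeps information from the later part of the series from leaking into the preprocessing step.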
