# Do we have to scale the data to exactly (-1,1) range?

Do we have to scale the data to exactly the (-1,1) range? Could it be (0,1), or could we just standardize it? If not, why?

Also, when we scale the test data with a MinMaxScaler fitted on the train data, we may not get the (-1,1) range. Could that be a problem?

Answering your second question: yes, when you apply the scaler to test data, the target range can be exceeded. The formula MinMaxScaler uses is

`X_std = (X - X_min) / (X_max - X_min)`

`X_scaled = X_std * (max - min) + min`

where `X_min` and `X_max` come from the training data you fitted on, and `(min, max)` is the target feature range. If a test value falls outside the training min/max, its scaled value falls outside the target range.
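To make that concrete, here is a minimal pure-Python sketch of the formula above (the function names are mine, not scikit-learn's): the scaler learns min/max from the training data only, so an unseen test value larger than the training max maps above 1.

```python
def fit_minmax(train, feature_range=(-1.0, 1.0)):
    """Learn min/max from the training data; return a transform function.

    Mirrors the MinMaxScaler formula:
      X_std    = (X - X_min) / (X_max - X_min)
      X_scaled = X_std * (max - min) + min
    """
    train_min, train_max = min(train), max(train)
    lo, hi = feature_range

    def transform(values):
        return [
            (x - train_min) / (train_max - train_min) * (hi - lo) + lo
            for x in values
        ]

    return transform


scale = fit_minmax([0.0, 5.0, 10.0])  # train data: min=0, max=10
print(scale([0.0, 10.0]))             # training extremes -> [-1.0, 1.0]
print(scale([12.0]))                  # test value 12 > train max -> 1.4, outside (-1, 1)
```

Whether that matters depends on the model: some algorithms only assume features are on comparable scales, while anything that strictly requires bounded inputs would need the training data to cover the test range.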

Hi Adam

You cannot substitute standardisation here because it would produce numbers outside the range. Scaling is a very different thing from standardisation.

Think of a map: the area represents a few miles, but it is shown on an A4 piece of paper. All lengths are divided by the scale used, so everything fits on the page while keeping the same proportions. That is what we are doing here: reducing the max value to 1 and the min value to -1 while keeping the same distance ratios between all the numbers in between (as you would with the features on a map).

Standardisation has a similar outcome in that it also transforms values into small numbers, but the theory behind it is different. In standardisation you assume a normal distribution and set the mean to 0, and the numbers produced represent how many standard deviations from the mean each observation is. For a normal distribution, only about 68% of the values fall between -1 and 1, so nothing bounds the result to that range; the theory is completely different. Standardisation is often called feature scaling because it puts the feature values on the same scale, but it is not scaling in this sense of the word. All we are doing here is changing the size of the numbers while keeping the same relative distances between them (just like towns on the map).

Hope this helps

Hi Alastair

I didn't ask that question, but your answer improved my knowledge of scaling a bit. Thank you for your input.