

Standardization vs Normalization

Oftentimes, the input features in our data can have different units of measurement. As a result, each feature can have its own unique distribution of values. Unfortunately, incorporating features with different distributions can lead to a model showing bias towards features with larger values and variance. Feature scaling addresses this issue by fitting all data to a specific scale, which is why it is often a necessary component in feature engineering. The two most common methods of feature scaling are standardization and normalization.
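The two methods can be sketched in a few lines. This is a minimal illustration using NumPy on made-up data, not the full preprocessing pipeline a library like scikit-learn provides:

```python
import numpy as np

def standardize(x):
    # Standardization (z-scoring): subtract the mean, divide by the
    # standard deviation, so the result has mean 0 and std 1.
    return (x - x.mean()) / x.std()

def normalize(x):
    # Normalization (min-max scaling): map values into the [0, 1] range.
    return (x - x.min()) / (x.max() - x.min())

# Illustrative feature with an arbitrary unit and range.
feature = np.array([10.0, 20.0, 30.0, 40.0, 50.0])

z = standardize(feature)   # centered at 0, unit variance
n = normalize(feature)     # squeezed into [0, 1]
```

In practice, scikit-learn's `StandardScaler` and `MinMaxScaler` implement the same transforms while also remembering the fitted statistics so test data can be scaled consistently.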


Standardization Vs Normalization in Machine Learning

The figure above shows a distribution with a mean of 0 and a standard deviation of 1, which is exactly what standardization produces. Comparing the scatter plots before and after scaling, the shape of the data is unchanged: standardization only shifts and rescales the values. Accordingly, the probability density function after standardization has the same form as before, now centered at 0, as shown below. In machine learning, the goal is to improve the model and its accuracy scores, and standardization is a common step toward building a good predictive model.
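The two claims above (mean 0 and std 1 after scaling, with the data's shape left intact) can be checked directly. This sketch uses randomly generated illustrative data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative feature: arbitrary mean and spread.
feature = rng.normal(loc=50.0, scale=5.0, size=1000)

# Standardize: mean 0, standard deviation 1.
scaled = (feature - feature.mean()) / feature.std()

# The transform is linear, so the relative ordering of the points
# (the "shape" seen in a scatter plot) is unchanged.
same_shape = (np.argsort(feature) == np.argsort(scaled)).all()
```

Because only the location and scale change, any model that is sensitive to feature magnitudes sees comparable inputs, while the information in the data is preserved.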