Feature Scaling

Why do we require feature scaling?

Whenever we talk about feature scaling, we are talking about features.

Let me consider an example: I have features like height and weight, and based on these I want to predict Body Mass Index (BMI).

Here, height and weight are my independent features and BMI is my dependent feature.

Every feature has two properties.

1. Magnitude

2. Unit

Magnitude is nothing but the value of the feature, and unit is the measurement it is expressed in, like kg, cm, feet, etc.

In the above example, height is in cm and weight is in kg. Suppose we skip feature scaling and feed the raw magnitudes, in their given units, to an algorithm that works on distances (such as KNN or K-means). Then the feature with the larger magnitude dominates the distance calculation, and model accuracy goes down simply because of the choice of units, as the sketch below shows.
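Here is a minimal sketch, using NumPy with made-up numbers, of how the unit choice alone changes the Euclidean distance between the same two people:

```python
import numpy as np

# Two people as [height, weight in kg]; height once in metres, once in cm.
a_m,  b_m  = np.array([1.80, 80.0]),  np.array([1.60, 60.0])
a_cm, b_cm = np.array([180.0, 80.0]), np.array([160.0, 60.0])

# Same two people, different units, different distances:
print(np.linalg.norm(a_m - b_m))    # ~20.00 (weight dominates height)
print(np.linalg.norm(a_cm - b_cm))  # ~28.28 (height now contributes equally)
```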

So we have to scale these features using techniques like normalization, which squeezes each feature into a range such as 0 to 1, and standardization, which centres each feature around 0 with unit variance.
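As a minimal sketch, scikit-learn provides both techniques out of the box; the toy height/weight values below are made up for illustration:

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler, StandardScaler

# Toy data: each row is [height in cm, weight in kg]
X = np.array([[150.0, 50.0],
              [160.0, 65.0],
              [180.0, 80.0],
              [190.0, 95.0]])

# Normalization (min-max): squeezes each feature into the range [0, 1].
print(MinMaxScaler().fit_transform(X))

# Standardization (z-score): centres each feature at 0 with unit variance.
print(StandardScaler().fit_transform(X))
```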

For example, in linear regression the coefficients are found with the help of gradient descent.

If we perform the above scaling, the cost-function contours become more symmetric, so the randomly initialized coefficients start out effectively much nearer to the global minimum.

If we do not scale, the feature values differ widely, the contours are stretched, and the starting point is effectively very far away. So our algorithm runs faster when we do scaling, as the sketch below shows.
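A minimal sketch of this, assuming scikit-learn's SGDRegressor as the gradient-descent-based linear model and a synthetic dataset standing in for height/weight vs. BMI:

```python
from sklearn.datasets import make_regression
from sklearn.linear_model import SGDRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic regression data; exaggerate one feature's scale (like cm vs. kg).
X, y = make_regression(n_samples=500, n_features=2, noise=5.0, random_state=0)
X[:, 0] *= 100.0

# Scaling inside a pipeline keeps gradient descent stable and fast to converge.
model = make_pipeline(StandardScaler(), SGDRegressor(max_iter=1000, random_state=0))
model.fit(X, y)
print(model.score(X, y))  # R^2 on the training data
```

Without the StandardScaler step, SGDRegressor on this data would need a much smaller learning rate and many more iterations to converge, if it converges at all.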

In almost every deep learning technique, feature scaling is very important.


When should we not perform feature scaling?

There are some algorithms, like decision trees, and tree-based ensemble techniques like random forest and XGBoost, which do not require feature scaling.

Because their splits depend only on the ordering of feature values, scaling the features makes no impact on the model.
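A quick sketch of this invariance, using a decision tree on synthetic data (the dataset and settings are illustrative assumptions); it should print True, since standardization preserves the ordering of values that the splits depend on:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.preprocessing import StandardScaler
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=200, n_features=4, random_state=0)
X_scaled = StandardScaler().fit_transform(X)

# The same tree fitted on raw and on standardized features.
raw    = DecisionTreeClassifier(random_state=0).fit(X, y)
scaled = DecisionTreeClassifier(random_state=0).fit(X_scaled, y)

# Split thresholds shift with the scale, but the learned partition is the same.
print(np.array_equal(raw.predict(X), scaled.predict(X_scaled)))
```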
