Types of Feature Transformation

Why do we need to transform features?

Scaling the features is important because, if the values are not scaled, quantities such as the Euclidean distance and the gradient descent updates are inflated by the features with the largest raw values.

Also, if the values in one feature are much larger than in the others, the model will give more weight to that feature simply because of its magnitude, not because it is actually more important.

Not every model needs feature transformation, but models that rely on Euclidean distance (e.g. K-means) or gradient descent (e.g. linear regression) do.

Deep learning models also need transformation, as they internally rely on gradient descent.

Types of Transformation

  1. Standardization
  2. Scaling to Minimum and Maximum
  3. Scaling to Median and Quartiles
  4. Gaussian Transformation
    a. Logarithmic Transformation
    b. Reciprocal Transformation
    c. Square root Transformation
    d. Exponential Transformation
    e. Box-Cox Transformation

Standardization

  1. All the variables or features are brought to a similar scale.
  2. Standardization centers the values around zero, with unit variance.
  3. Standardization uses the z-score formula: z = (x − μ) / σ, where μ is the mean and σ is the standard deviation of the feature.
  4. Standardization is widely used for machine learning algorithms.
  5. Standardization can be implemented easily with sklearn's StandardScaler, as shown below.
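
A minimal sketch of standardization with scikit-learn's StandardScaler (the toy data and its values below are made up purely for illustration):

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

# Toy data: two features on very different scales (illustrative values only)
X = np.array([[25.0, 50000.0],
              [32.0, 64000.0],
              [47.0, 120000.0],
              [51.0, 98000.0]])

scaler = StandardScaler()
X_std = scaler.fit_transform(X)   # applies z = (x - mean) / std per feature

print(X_std.mean(axis=0))  # approximately 0 for each feature
print(X_std.std(axis=0))   # approximately 1 for each feature
```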

Min Max Scaling

  1. Widely used in deep learning, for example with CNNs (Convolutional Neural Networks).
  2. Min Max Scaling scales the values to the range 0 to 1.
  3. Min Max Scaler formula: X_scaled = (X − X_min) / (X_max − X_min).
  4. Min Max Scaling can be implemented easily with sklearn's MinMaxScaler, as shown below.
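
A minimal sketch of Min Max Scaling with scikit-learn's MinMaxScaler (same kind of illustrative toy data as above):

```python
import numpy as np
from sklearn.preprocessing import MinMaxScaler

# Toy data: two features on very different scales (illustrative values only)
X = np.array([[25.0, 50000.0],
              [32.0, 64000.0],
              [47.0, 120000.0],
              [51.0, 98000.0]])

scaler = MinMaxScaler()             # default feature_range is (0, 1)
X_scaled = scaler.fit_transform(X)  # (x - min) / (max - min) per feature

print(X_scaled.min(axis=0))  # 0.0 for each feature
print(X_scaled.max(axis=0))  # 1.0 for each feature
```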

Robust Scaler

  1. It scales the features using the median and the quartiles.

  2. Scaling using the median and quartiles consists of subtracting the median from all the observations and then dividing by the interquartile range.

  3. The interquartile range (IQR) is the difference between the 75th percentile and the 25th percentile.

  4. IQR = 75th percentile − 25th percentile.

  5. Robust Scaler formula: X_scaled = (X − median) / IQR.

  6. Robust Scaler can be implemented easily with sklearn's RobustScaler, as shown below.
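
A minimal sketch of robust scaling with scikit-learn's RobustScaler (the outlier in the toy data below is made up to show why the median and IQR are used):

```python
import numpy as np
from sklearn.preprocessing import RobustScaler

# Toy data: the second feature contains one extreme outlier (illustrative values)
X = np.array([[25.0, 50000.0],
              [32.0, 64000.0],
              [47.0, 120000.0],
              [51.0, 1000000.0]])

scaler = RobustScaler()              # centers with the median, scales by the IQR
X_robust = scaler.fit_transform(X)   # (x - median) / IQR per feature

print(X_robust)  # the outlier no longer dominates the scale of the feature
```

Because the median and the IQR are barely affected by extreme values, RobustScaler is less sensitive to outliers than StandardScaler or MinMaxScaler.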

Gaussian Transformation

  1. Gaussian transformations are used to convert data into a normal distribution when it is not normally distributed.
  2. Many models, especially linear ones, perform better and give more reliable results when the features are roughly normally distributed.
  3. To check whether data is normally distributed we can use a Q-Q plot.
  4. The different types of Gaussian transformations are plotted in the code below.
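
A minimal sketch of checking normality with a Q-Q plot before and after a logarithmic transformation (the right-skewed data below is synthetic; reciprocal, square root, exponential, and Box-Cox transformations, e.g. via scipy.stats.boxcox, can be checked the same way):

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

# Synthetic right-skewed feature (log-normal), standing in for a real skewed feature
rng = np.random.default_rng(0)
x = rng.lognormal(mean=0.0, sigma=1.0, size=1000)

fig, axes = plt.subplots(1, 2, figsize=(10, 4))

# Q-Q plot of the raw feature: the points bend away from the straight line
stats.probplot(x, dist="norm", plot=axes[0])
axes[0].set_title("Before: raw (skewed) feature")

# Q-Q plot after a logarithmic transformation: the points follow the line closely
stats.probplot(np.log(x), dist="norm", plot=axes[1])
axes[1].set_title("After: log transformation")

plt.tight_layout()
plt.show()
```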

Code for the different types of feature transformation: GitHub Link

The END