
Gradient Descent with RMSE

Optimization Algorithms

Examples: Gradient Descent (GD), Stochastic GD (SGD), Adam, RMSprop, Momentum, Adadelta

Gradient Descent is an optimization algorithm that finds a local minimum of a differentiable function. For a given training data set, it is used to find the coefficients (weights) and intercept (bias) that minimize a cost function: starting from an initial guess, it repeatedly steps in the direction opposite the gradient of the cost.
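As a minimal sketch of that update rule, here is gradient descent on the one-variable function f(x) = (x - 3)**2, whose minimum is at x = 3 (the starting point and learning rate are arbitrary illustrative choices):

```python
# Gradient descent on f(x) = (x - 3)**2; the derivative is 2 * (x - 3).
x = 0.0               # arbitrary starting point
learning_rate = 0.1   # arbitrary illustrative value
for _ in range(100):
    grad = 2 * (x - 3)            # gradient of the cost at the current x
    x -= learning_rate * grad     # step against the gradient
print(round(x, 4))  # converges to 3.0
```

Each step shrinks the distance to the minimum by a constant factor (1 - 2 * learning_rate here), so after enough iterations x is essentially at 3.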

There are three types of gradient descent techniques:

  1. Batch Gradient Descent - Steadily descends the curve along one path towards a minimum; every step computes the cost function over the entire training data set. If the training data is large, this becomes expensive and slow.
  2. Stochastic Gradient Descent (SGD) - Computes the cost for only one (randomly selected) training example per step; it tends to jump around due to the randomness, but that very jumpiness can help it escape shallow local minima.
  3. Mini-Batch Gradient Descent - A middle ground between the two above. Computes the cost for a small random batch of data points at each step.
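To make point 3 concrete, here is a rough sketch of mini-batch updates on the same line-fitting data used below. The batch size, learning rate, and epoch count are arbitrary choices, and this sketch uses the plain MSE gradient rather than RMSE:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([5, 7, 9, 11, 13], dtype=float)

m = b = 0.0
learning_rate = 0.02  # arbitrary illustrative value
batch_size = 2        # hypothetical choice; anything between 1 and n works

for epoch in range(5000):
    idx = rng.permutation(len(x))              # reshuffle every epoch
    for start in range(0, len(x), batch_size):
        batch = idx[start:start + batch_size]  # last batch may be smaller
        diff = y[batch] - (m * x[batch] + b)
        m += learning_rate * 2 * np.mean(x[batch] * diff)  # -d(MSE)/dm
        b += learning_rate * 2 * np.mean(diff)             # -d(MSE)/db

print(round(m, 2), round(b, 2))  # converges close to m = 2, b = 3
```

Each step only touches `batch_size` points, so the cost per step is small like SGD's, while averaging over the batch keeps the updates less noisy than single-sample SGD.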
Here I've coded a gradient descent algorithm from scratch, using the root mean squared error (RMSE) as the cost function.



import numpy as np
import matplotlib.pyplot as plt

# %%

x = np.array([1,2,3,4,5])
y = np.array([5,7,9,11,13])


# %%

def gradient_descent(x, y):
    m = b = 0
    n = len(x)
    learning_rate = 0.06  # this value you have to tune manually
    iterations = 100

    for i in range(iterations):
        y_pred = (m * x) + b

        diff_ = y - y_pred

        mse = np.mean(diff_ ** 2)
        rmse = mse ** 0.5

        # chain rule: d(RMSE)/dm = (1 / (2 * sqrt(MSE))) * d(MSE)/dm
        d_m = -(1 / n) * (mse ** -0.5) * np.sum(x * diff_)
        d_b = -(1 / n) * (mse ** -0.5) * np.sum(diff_)

        m = m - learning_rate * d_m
        b = b - learning_rate * d_b

        print('iteration: {}, weight: {}, bias: {}, cost: {:.2f}'.format(i + 1, round(m, 2), round(b, 2), rmse))

    plt.plot(x, y, color='b')            # actual data
    plt.plot(x, (m * x) + b, color='g')  # final fitted line
    plt.show()

    return m, b


gradient_descent(x,y)
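As a sanity check on the weight and bias printed above, the closed-form least-squares fit can be computed with np.polyfit; the five points lie exactly on the line y = 2x + 3, so gradient descent should end up near those values:

```python
import numpy as np

x = np.array([1, 2, 3, 4, 5])
y = np.array([5, 7, 9, 11, 13])

# Closed-form least-squares fit of a degree-1 polynomial (a line).
slope, intercept = np.polyfit(x, y, 1)
print(round(slope, 2), round(intercept, 2))  # → 2.0 3.0
```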



# %%
