
Posts

Showing posts from September, 2020

Support Vector Classifier

 Gaussian Radial Basis Function Kernel in SVC. The Gaussian Radial Basis Function (RBF) kernel is used to classify data that is not linearly separable, and it is a powerful way to gain high accuracy. Non-linearly separable data is shown in the first image below. In this type of data the points are spread irregularly, which makes them hard to separate in a two-dimensional plot, so we use the RBF kernel. It is a transform function that creates a landmark (y) in an imaginary 3D space (see the second image). For a data point x on the plane, the kernel measures the distance from x to the landmark y, with σ controlling the width of the resulting region of influence. If x is close to the landmark, the distance is small, so the kernel value is high and the point falls inside that region. In this way we can separate the data points with a circular boundary of width σ by projecting back onto the 2D plane. Equation of the Gaussian Radial Basis Function: k(x, y) = exp(−‖x − y‖² / (2σ²))
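The kernel equation above can be sketched directly in NumPy. This is a minimal illustration, not the full SVC training loop; the landmark coordinates and σ value are hypothetical choices for demonstration.

```python
import numpy as np

def rbf_kernel(x, y, sigma=1.0):
    """Gaussian RBF kernel: similarity between point x and landmark y.
    Returns 1.0 when x coincides with the landmark and decays toward 0
    as the squared distance grows."""
    sq_dist = np.sum((np.asarray(x) - np.asarray(y)) ** 2)
    return np.exp(-sq_dist / (2 * sigma ** 2))

landmark = np.array([0.0, 0.0])           # hypothetical landmark y
near = rbf_kernel([0.1, 0.1], landmark)   # close to the landmark -> near 1
far = rbf_kernel([3.0, 3.0], landmark)    # far from the landmark -> near 0
print(near, far)
```

Points with a kernel value above some threshold fall inside the circle of influence, which is exactly the circular decision boundary described above.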

Ensemble Learning

 What is ensemble learning?     Ensemble learning is the process of combining several models — different algorithms, or the same algorithm applied multiple times — to enhance the accuracy of the final merged model. Multiple models, such as classifiers, are strategically generated and combined to solve a particular computational intelligence problem. Ensemble learning is primarily used to improve classification and prediction results beyond what a single model achieves. 
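A minimal sketch of the idea, using scikit-learn's `VotingClassifier` to combine three different algorithms by majority vote; the toy dataset and the choice of base models are assumptions for illustration only.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Toy data standing in for a real dataset
X, y = make_classification(n_samples=200, random_state=0)

# Hard voting: each base model casts one vote per sample,
# and the majority class becomes the ensemble's prediction
ensemble = VotingClassifier(estimators=[
    ('lr', LogisticRegression(max_iter=1000)),
    ('dt', DecisionTreeClassifier(random_state=0)),
    ('knn', KNeighborsClassifier()),
], voting='hard')

ensemble.fit(X, y)
print(ensemble.score(X, y))
```

The same pattern extends to `voting='soft'`, which averages predicted probabilities instead of counting votes.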

Difference between probability and likelihood

Probability and Likelihood Probability   is the more familiar quantity: it deals with predicting new data given a known model. These are 'Posterior Probabilities', because they are determined after the model is known. In short: given the model and features, what are the chances of the data?  Likelihood   deals with fitting models given some known data: the data points are held fixed while the model's distribution (its parameters) is moved around. It indicates how likely the observed data is under each candidate model. In short: given the observed data, which model or parameter values are more plausible?  Priori Probabilities :  exist before we gain any information from the model itself.
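The distinction can be made concrete with a coin-flip sketch: probability holds the model (the coin's bias p) fixed and varies the data, while likelihood holds the data fixed and varies p. The specific numbers here are illustrative assumptions.

```python
from math import comb

def binom_pmf(k, n, p):
    """P(k heads in n flips given bias p) under the binomial model."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Probability: model fixed (fair coin, p = 0.5), ask about the data
prob_7_heads = binom_pmf(7, 10, 0.5)

# Likelihood: data fixed (7 heads in 10 flips), ask about the parameter
likelihoods = {p: binom_pmf(7, 10, p) for p in (0.3, 0.5, 0.7)}
print(prob_7_heads, likelihoods)
```

For this fixed data, p = 0.7 gives the highest likelihood, which is exactly how likelihood tells us which model is more plausible given the observations.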

Classification & Confusion Matrix & Accuracy Paradox

Classification  assigns an object to the class with the highest probability — a vote among the classes.  There are two types of classification : Binary classification : there are exactly two classes, e.g. male-female, cat-dog, yes-no  Multiclass classification :   there are more than two classes, e.g. traffic signs, face recognition, flower species, digit recognition Confusion matrix :  A confusion matrix is a technique to evaluate model accuracy on a classification problem. In this technique we count how many of the positive and negative data points we predict correctly. The main terms derived from it are accuracy, precision and recall. Accuracy is an appealing metric because it is a single number, whereas precision and recall (sensitivity) are two numbers. So to get a single final score for our model we use the F1 score. Here is the F1 score's mathematical formula: F1 = 2 × precision × recall / (precision + recall)
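The precision, recall and F1 definitions above can be sketched from raw confusion-matrix counts. The counts (80 true positives, 20 false positives, 10 false negatives) are hypothetical numbers chosen to illustrate the formula.

```python
def f1_from_counts(tp, fp, fn):
    """Precision, recall and F1 from confusion-matrix counts."""
    precision = tp / (tp + fp)   # of the predicted positives, how many were right
    recall = tp / (tp + fn)      # of the actual positives, how many we found
    return 2 * precision * recall / (precision + recall)

# Hypothetical binary classifier results
score = f1_from_counts(tp=80, fp=20, fn=10)
print(score)
```

Because F1 is the harmonic mean of precision and recall, it stays low if either one is low — which is what guards against the accuracy paradox on imbalanced classes.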

Road & Safety Signs Detection

   Traffic Signs Recognition Download your files from here:  Here I've done the project using pickle files that you can easily download from the link above. Kindly read and follow the comments below to understand the code. For convenience, use Visual Studio Code — it's a free and intelligent IDE.

# %% Importing libraries
import pickle
import numpy as np
import matplotlib.pyplot as plt

# %% Extracting train data
train_file = 'train.p'
train_obj = open(train_file, 'rb')
train_data = pickle.load(train_obj)
print(train_data)

# %% Extracting test data
test_file = 'test.p'
test_obj = open(test_file, 'rb')
test_data = pickle.load(test_obj)
print(test_data)

# %% Extracting valid data
valid_file = 'valid.p'
valid_obj = open(valid_file, 'rb')
valid_da...

Male Female Classification

# %% Importing Libraries
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import os

# %% Importing Keras Libraries
import keras.backend as K
from keras.models import Sequential
from keras.utils import to_categorical
from keras.layers.normalization import BatchNormalization
from keras.layers import Dense, Convolution2D, Activation, MaxPooling2D, AveragePooling2D, Dropout, Flatten
from keras.preprocessing import image
from keras.layers import LeakyReLU

# %% Importing sklearn libraries
from sklearn.metrics import confusion_matrix, classification_report
from skl...