Memory management in Python involves a private heap containing all Python objects and data structures. The management of this private heap is ensured internally by the Python memory manager, which has different components dealing with various dynamic storage management aspects, such as sharing, segmentation, and preallocation. Objects that are no longer referenced by the program are reclaimed automatically by Python's garbage collector.
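A small sketch of this in action: CPython frees most objects as soon as their reference count drops to zero, but objects caught in a reference cycle need the cyclic garbage collector. The `Node` class below is a made-up example, not part of any library.

```python
import gc
import sys

class Node:
    """A tiny object used only to demonstrate reference cycles."""
    def __init__(self):
        self.ref = None

a = Node()
# getrefcount reports one extra reference (the temporary function argument)
print(sys.getrefcount(a))

# Two nodes referencing each other form a cycle that pure
# reference counting cannot reclaim on its own:
x, y = Node(), Node()
x.ref, y.ref = y, x
del x, y                     # both names gone, but the cycle keeps them alive

# The cyclic garbage collector finds and frees them:
unreachable = gc.collect()
print(unreachable)           # number of unreachable objects collected
```

Running `gc.collect()` here reports the cycle's objects as unreachable, which is exactly the kind of cleanup the garbage collector exists for.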
Optimization Algorithms (e.g., Gradient Descent, Stochastic GD, Momentum, RMSProp, Adadelta, Adam)

Gradient Descent is an optimization algorithm that finds a local minimum of a differentiable cost function for a given training data set; in linear regression, it finds the best-fit line. Simply put, it is used to find the coefficients (weights) and intercept (bias) that minimize the cost function as far as possible. There are three types of gradient descent techniques:

Batch Gradient Descent - Studiously descends the curve along one path towards a minimum; every hop computes the cost function over the entire training data set. If the training data is large, this becomes expensive and should be avoided.

Stochastic Gradient Descent (SGD) - Computes the cost function for only one (randomly selected) training example per hop; it tends to jump all over the place due to the randomness, but that same randomness can help it jump across local minima.

Mini-Batch Gradient Descent - Somewhere midway between the above two. Each hop does the calculation for a small random batch of data points, trading off SGD's speed against batch GD's more stable updates.