
Collections in Python

There are two basic categories of data types in Python:


1. Immutable data types : numeric , string , tuple

2. Mutable data types : list , dictionary , set


Immutable : the content cannot be changed or modified in place, for example via the indexing operator in an assignment statement

Mutable : the content can be changed or modified in place, for example via the indexing operator in an assignment statement
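A short sketch of the difference, using a list (mutable) and a string (immutable); the variable names are just for illustration:

```python
# Mutable: item assignment through the indexing operator works on a list
L = [1, 2, 3]
L[0] = 99
print(L)  # [99, 2, 3]

# Immutable: the same operation on a string raises TypeError
s = "abc"
try:
    s[0] = "z"
except TypeError as e:
    print(e)  # str does not support item assignment
```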


Numeric : int , float , complex

Sequence : list, string, tuple, range

Mapping : dictionary

Set : set , frozenset

Boolean : bool ; used in conditions such as 'if' statements, with the two values True and False
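A quick tour of the categories listed above, checking each literal with the built-in type() function:

```python
# Numeric
print(type(3))            # <class 'int'>
print(type(3.5))          # <class 'float'>
print(type(2 + 3j))       # <class 'complex'>

# Sequence
print(type([1, 2]))       # <class 'list'>
print(type("abc"))        # <class 'str'>
print(type((1, 2)))       # <class 'tuple'>
print(type(range(3)))     # <class 'range'>

# Mapping, set, boolean
print(type({'a': 1}))     # <class 'dict'>
print(type({1, 2}))       # <class 'set'>
print(type(frozenset()))  # <class 'frozenset'>
print(type(True))         # <class 'bool'>
```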



Collections in Python :


1) Dictionary :

It is used to map arbitrary keys to values

Only immutable objects ( whose content cannot be changed, e.g. strings, numbers, tuples ) can be used as dictionary keys

Ex :  D = { 'a' : 10 , 'b' : [20, 30] }   
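A small sketch based on the example above, showing key lookup and why a mutable object such as a list cannot be a key:

```python
D = {'a': 10, 'b': [20, 30]}
print(D['a'])       # look up by key -> 10

D['b'].append(40)   # values may be mutable; only keys must be immutable
print(D['b'])       # [20, 30, 40]

# A list is mutable (unhashable), so it cannot be a dictionary key
try:
    D[[1, 2]] = 'x'
except TypeError as e:
    print(e)        # unhashable type: 'list'
```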

           

2)  List :

It can collect multiple values of different types, including other collections, in a single variable.

It is mutable, so operations such as remove, append and insert can be performed on it

Append : adds a single item ( or an entire list as one item ) to the end of an existing list

Extend : adds each item of an iterable, one by one, to the end of an existing list ( it iterates over its argument internally )

Indexing and slicing can be done here

Indexing : referring to an element by its position within an ordered sequence

Slicing : a way of retrieving a range of values from a list ; on a mutable sequence it can also be used to modify or delete items

Ex : L = [1 , 'a' , [ 2 , 'b' ] , ( 3, 4 )]
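A sketch of the operations described above, starting from the example list; the extra variables are just for illustration:

```python
L = [1, 'a', [2, 'b'], (3, 4)]

print(L[0])      # indexing -> 1
print(L[1:3])    # slicing -> ['a', [2, 'b']]

# append adds its argument as ONE item
nums = [1, 2]
nums.append([3, 4])
print(nums)      # [1, 2, [3, 4]]

# extend adds each item of its argument individually
nums2 = [1, 2]
nums2.extend([3, 4])
print(nums2)     # [1, 2, 3, 4]

# slice assignment modifies a mutable list in place
L[1:3] = ['x']
print(L)         # [1, 'x', (3, 4)]
```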


3)  Tuples  : 

It is immutable : its contents cannot be changed after creation

Ex : T = ( 1, 2, 3)   or   T =  1, 2, 3
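A sketch showing both tuple forms from the example and what immutability means in practice:

```python
T = (1, 2, 3)
U = 1, 2, 3        # parentheses are optional; this is the same tuple
print(T == U)      # True

# Tuples do not support item assignment
try:
    T[0] = 99
except TypeError as e:
    print(e)
```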






