machine learning .p16 -Activation Function. [Hindi]

In today’s video, we’ll be talking about activation functions. In an artificial neural network, each neuron computes the sum of products of its inputs (x) and their corresponding weights (w), applies an activation function to that sum to get the output of the layer, and feeds it as input to the next layer. The activation function converts the weighted sum of products into the desired output. In technical terms, a neuron calculates a weighted sum of its inputs, adds a bias, and then the activation function decides whether the neuron should be fired or not.

Now, what is meant by “fired”? If you have watched my video about the sigmoid function, you already know that when we pass a value through the sigmoid function, it maps the value into the range between 0 and 1, with 0.5 as the threshold: if the sigmoid output is less than 0.5 we treat it as 0, otherwise as 1. Here 1 stands for fired and 0 stands for not fired.

Now you must be wondering what the need of the activation function is. For that, take a good look at the neuron: you are finding a weighted sum and adding a bias, but you were doing a similar thing in linear regression, where after finding the coefficients you multiply them with your inputs and finally add a constant value. Right? So why do we need neural networks? Suppose you have to fit a non-linear function to your dataset. A neural network without an activation function would simply be a linear regression model, which has limited power and does not perform well most of the time. We cannot rely on linear functions alone; there are complex, non-linear techniques that can fit your data more accurately, and this is why we use activation functions. They make the neural network act like a universal method that can work with almost everything, such as regression, classification, images, videos, sound, etcetera. That is why neural networks are used in deep learning.
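The idea above can be sketched in a few lines of code. This is a minimal illustration (the inputs, weights, and bias values are made up for the example): the neuron computes a weighted sum plus a bias, squashes it with the sigmoid function, and is considered “fired” when the activation crosses the 0.5 threshold.

```python
import math

def sigmoid(z):
    # Squashes any real number into the range (0, 1)
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    # Weighted sum of inputs and weights, plus the bias
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    a = sigmoid(z)
    # "Fired" if the activation reaches the 0.5 threshold,
    # which happens exactly when z >= 0
    fired = a >= 0.5
    return a, fired

# Example values (chosen only for illustration)
activation, fired = neuron([1.0, 2.0], [0.5, -0.25], 0.1)
print(round(activation, 3), fired)  # z = 0.1, so the neuron fires
```

Without the `sigmoid` call, the function would just return `z`, which is exactly the linear-regression prediction described above; the non-linearity is what the activation function adds.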
