Non-linear Activation Functions

An activation function defines the output of a neuron given its input. The activation function associated with each neuron in…
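As a quick illustration of the idea, here is a minimal sketch of three widely used non-linear activations (sigmoid, ReLU, tanh) in NumPy; the function names and the sample input are chosen for illustration only:

```python
import numpy as np

def sigmoid(x):
    # squashes any real input into the range (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # zero for negative inputs, identity for positive inputs
    return np.maximum(0.0, x)

def tanh(x):
    # squashes input into (-1, 1), centered at zero
    return np.tanh(x)

x = np.array([-2.0, 0.0, 2.0])
print(sigmoid(x))  # values between 0 and 1
print(relu(x))     # negatives clipped to 0
print(tanh(x))     # values between -1 and 1
```

Each of these is non-linear, which is what lets stacked layers represent functions a single linear map cannot.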

Gradient descent is one of the most common methods for training a neural network. It is an optimization algorithm used…

A mathematical approach towards Gradient Descent Algorithm

Gradient descent is a first-order iterative optimization algorithm for finding the minimum of a function (commonly called a loss or cost function)…
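The iterative update the teaser describes can be sketched in a few lines. This is a generic illustration, not any particular article's implementation; the learning rate, step count, and the quadratic example function are assumptions chosen for clarity:

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    # repeatedly step against the gradient: x <- x - lr * grad(x)
    x = x0
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2 * (x - 3).
# The minimum is at x = 3.
x_min = gradient_descent(lambda x: 2.0 * (x - 3.0), x0=0.0)
```

Because the update only uses the first derivative, this is a first-order method; second-order methods like Newton's method also use curvature information.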

Most commonly used activation functions in Deep Learning

An activation function defines the output of a neuron given its input…

Basics of Artificial neural network

A neural network is a network of neurons or, in a contemporary context, an artificial neural…
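To make the "network of neurons" idea concrete, here is a hedged sketch of a forward pass through a tiny network with one hidden layer; the layer sizes, random weights, and ReLU choice are illustrative assumptions, not taken from the article:

```python
import numpy as np

def forward(x, W1, b1, W2, b2):
    # hidden layer: linear map followed by a ReLU non-linearity
    h = np.maximum(0.0, W1 @ x + b1)
    # output layer: plain linear map
    return W2 @ h + b2

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(4, 3)), np.zeros(4)  # 3 inputs -> 4 hidden units
W2, b2 = rng.normal(size=(1, 4)), np.zeros(1)  # 4 hidden units -> 1 output
y = forward(np.array([1.0, -0.5, 0.2]), W1, b1, W2, b2)
```

Training such a network amounts to adjusting `W1`, `b1`, `W2`, `b2` to reduce a loss, typically with the gradient descent procedure described above.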

Introduction To Gradient Descent algorithm and its variants

[gradient descent illustration, source: Imad Dabbura] Gradient descent is a first-order iterative optimization algorithm for finding the minimum of a function…
