MACHINE LEARNING ALGORITHMS
In recent years, we've seen numerous breakthroughs in artificial intelligence. Many of these are modern neural network architectures such as Recurrent Neural Networks, Convolutional Neural Networks, and Radial Basis Function Networks, each designed for specific types of structured or unstructured data. Understanding the underlying principles of a simple neural network will help you break down these newer, more complex algorithms.
Neurons are the basic units of our nervous system. They use electrical impulses and chemical signals to transmit information between different areas of the brain, and between the brain and the rest of the nervous system.
The human brain has an average of 86 billion neurons. Neurons have three basic parts:
cell body: contains the nucleus, which controls the cell's activities and holds the cell's genetic material.
axon: transmits messages from the cell.
dendrite: receives messages for the cell.
In machine learning, a neuron takes input values, multiplies each by a weight, sums the products (a dot product), and passes the sum through an activation function to produce its output.
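As a minimal sketch of that description (the input values, weights and bias below are made-up numbers, and sigmoid is used as the activation function):

```python
import numpy as np

def sigmoid(z):
    # Squashes any real number into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

# A single artificial neuron: weighted sum of inputs plus bias,
# passed through an activation function.
inputs = np.array([0.5, -0.2, 0.1])
weights = np.array([0.4, 0.3, 0.9])
bias = 0.1

output = sigmoid(np.dot(inputs, weights) + bias)
```

Because sigmoid maps every weighted sum into (0, 1), the neuron's output is always a value strictly between 0 and 1.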
Before proceeding, you should be:
familiar with the logistic regression algorithm
able to code in Python
The key concepts covered below are:
Neurons
Layers: a layer is made up of multiple neurons.
Weights
Vectors
Activation functions
Forward and Back Propagation
Cost function
Gradient descent
The following outlines the steps involved in creating an artificial neural network from the ground up. First, import the numpy library.
There are several varieties of ANNs; the number of hidden layers, input parameters and outputs varies across architectures. When creating a neural network, it is important to know how many inputs there are, how many outputs are needed, and how many hidden layers would work best. It is also good practice to use the same number of neurons in every hidden layer.
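As a concrete sketch, here is the numpy import together with one possible architecture: 2 inputs, a single hidden layer of 3 neurons, and 1 output (these sizes are illustrative assumptions, not fixed by anything above):

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed architecture: 2 inputs, one hidden layer of 3 neurons, 1 output.
n_input, n_hidden, n_output = 2, 3, 1

# Randomly initialize the learnable weights; start the biases at zero.
W1 = rng.standard_normal((n_input, n_hidden))
b1 = np.zeros((1, n_hidden))
W2 = rng.standard_normal((n_hidden, n_output))
b2 = np.zeros((1, n_output))
```

Each weight matrix has one row per neuron in the layer feeding into it and one column per neuron in the layer it feeds.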
Weights, commonly denoted 'w', are learnable parameters that represent the strength of the connection between units. If the weight from node 1 to node 2 has a greater magnitude, neuron 1 has a greater influence over neuron 2.
A bias vector, commonly denoted 'b', is an additional set of learnable parameters in a neural network. It requires no input: a bias can be thought of as an extra neuron included with each pre-output layer whose value is fixed at 1, with the bias acting as the weight on that constant input. It lets each neuron shift its output independently of its inputs.
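To see why the bias behaves like a weight on a constant input of 1, here is a tiny check (the input and weight values are made up for illustration):

```python
import numpy as np

x = np.array([0.5, -1.0])   # inputs
w = np.array([0.8, 0.2])    # weights
b = 0.3                     # bias

# Standard form: dot product of inputs and weights, plus the bias.
z1 = np.dot(x, w) + b

# Equivalent form: append a constant input of 1 whose weight is the bias.
x_aug = np.append(x, 1.0)
w_aug = np.append(w, b)
z2 = np.dot(x_aug, w_aug)
```

Both forms compute the same weighted sum, which is why the bias is often described as "an extra weight".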
There are many types of activation functions that can be used in a neural network. The most popular are sigmoid, ReLU and softmax. For the ANN we are building, we will use the sigmoid activation function, which outputs values in the range 0 to 1.
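The sigmoid function is a one-liner with numpy:

```python
import numpy as np

def sigmoid(z):
    # Maps any real number into the range (0, 1);
    # large negative inputs approach 0, large positive inputs approach 1.
    return 1.0 / (1.0 + np.exp(-z))

print(sigmoid(0.0))  # 0.5
```

Because numpy broadcasts elementwise, the same function works on a scalar, a vector of pre-activations, or a whole layer's worth at once.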
Forward propagation is the process of moving from the input layer (left) to the output layer (right) of the neural network. The input data is fed forward through the network: each hidden layer accepts the data from the previous layer, processes it, and passes it on to the successive layer until it reaches the output layer.
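For a network with one hidden layer and sigmoid activations, a forward pass can be sketched like this (the layer sizes and random inputs are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, b1, W2, b2):
    # Hidden layer: weighted sum of the inputs, then activation.
    a1 = sigmoid(X @ W1 + b1)
    # Output layer: weighted sum of the hidden activations, then activation.
    a2 = sigmoid(a1 @ W2 + b2)
    return a1, a2

rng = np.random.default_rng(0)
X = rng.standard_normal((4, 2))           # 4 samples, 2 input features
W1 = rng.standard_normal((2, 3)); b1 = np.zeros((1, 3))
W2 = rng.standard_normal((3, 1)); b2 = np.zeros((1, 1))

a1, a2 = forward(X, W1, b1, W2, b2)
```

Each matrix multiplication pushes a whole batch of samples through one layer at a time, left to right.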
The cost function evaluates the performance of a neural network. It compares the predicted outputs and actual outputs and calculates the error of the model. A cost function is a single value, not a vector. The higher the cost function, the farther away our prediction is from the actual value. This is why it is important to minimize the cost function.
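One common choice (assumed here for simplicity; cross-entropy is another popular option for sigmoid outputs) is the mean squared error, which collapses all per-sample errors into the single value the paragraph describes:

```python
import numpy as np

def mse_cost(y_pred, y_true):
    # Average squared difference between predictions and targets;
    # returns a single scalar, not a vector.
    return float(np.mean((y_pred - y_true) ** 2))

y_true = np.array([1.0, 0.0, 1.0])
print(mse_cost(np.array([0.5, 0.5, 0.5]), y_true))  # 0.25
```

A perfect prediction gives a cost of 0; the worse the prediction, the larger the scalar.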
Backpropagation (also referred to as backprop) is the opposite of forward propagation: it is the process of moving from the output layer (right) to the input layer (left). During backpropagation, the total loss is propagated back through the network to determine how much of the loss each node is responsible for, and the weights are then updated in the direction that reduces that loss.
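For the one-hidden-layer sigmoid network above, with a mean-squared-error cost, the backward pass can be sketched as follows (the data and layer sizes are illustrative):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop(X, y, W1, b1, W2, b2):
    n = X.shape[0]
    # Forward pass; the stored activations are reused by the backward pass.
    a1 = sigmoid(X @ W1 + b1)
    a2 = sigmoid(a1 @ W2 + b2)
    # Gradient of the mean-squared-error cost, pushed back layer by layer.
    dz2 = (2.0 / n) * (a2 - y) * a2 * (1 - a2)  # output-layer error
    dW2 = a1.T @ dz2
    db2 = dz2.sum(axis=0, keepdims=True)
    dz1 = (dz2 @ W2.T) * a1 * (1 - a1)          # hidden-layer error
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0, keepdims=True)
    return dW1, db1, dW2, db2

rng = np.random.default_rng(1)
X = rng.standard_normal((4, 2))
y = np.array([[0.0], [1.0], [1.0], [0.0]])
W1 = rng.standard_normal((2, 3)); b1 = np.zeros((1, 3))
W2 = rng.standard_normal((3, 1)); b2 = np.zeros((1, 1))

dW1, db1, dW2, db2 = backprop(X, y, W1, b1, W2, b2)
```

The `a * (1 - a)` factors are the derivative of the sigmoid; each `dz` term measures how responsible that layer's pre-activations are for the final loss.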
Gradient descent is an iterative optimization algorithm that aims to minimize the cost function by repeatedly stepping against the gradient, ideally toward the global minimum (in practice it may settle in a local minimum). Backpropagation, as stated earlier, supplies the gradients used to update the weights so as to minimize the loss.
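The update rule itself is tiny. A minimal sketch on a one-parameter cost, cost(w) = (w - 3)², whose minimum sits at w = 3:

```python
w = 0.0
learning_rate = 0.1

for _ in range(100):
    grad = 2 * (w - 3)          # derivative of the cost at the current w
    w -= learning_rate * grad   # step against the gradient

print(round(w, 4))  # converges toward 3.0
```

In the full network, the same `parameter -= learning_rate * gradient` step is applied to every entry of W1, b1, W2 and b2 using the gradients returned by backpropagation.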