Deep Learning — Perceptron Model

Part 3: Perceptron Model

Anurag
5 min read · Jan 7, 2023

Welcome back to my deep learning series. In this article, we will discuss the Perceptron model in detail. The previous parts of this series covered the applications of deep learning and the MP neuron, so in case you have not read those articles, you can follow the link below.

Perceptron Model

Previously, we studied the MP neuron and came across some of its limitations. To overcome those challenges, we now introduce a new model known as the Perceptron.
Limitations of the MP neuron:

  1. Inputs are Boolean → each input is either 0 or 1.
  2. Weights and the threshold must be determined analytically, by hand.
  3. The model cannot learn.
  4. The model cannot handle non-binary tasks.

Now let us try to think of solutions to these problems. I urge you to pause before proceeding further and think about how you could overcome these limitations.

Let us discuss the solutions now. The answer to all of these limitations is the Perceptron!

  1. Inputs → In the case of the Perceptron model, inputs can be real values instead of just Boolean.
  2. Outputs → Outputs can be non-Boolean as well.
  3. Weights → Weights can be positive as well as negative.
  4. The activation function can be different.
  5. And most importantly, the weights and threshold can be learned!

Let’s just summarize what we said.

Inputs can be real values, outputs can be non-Boolean, the threshold is removed in favor of a bias, and the weights and bias can be learned.
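That last point, learnability, is the crucial upgrade. The article does not walk through the learning procedure here, but for the curious, below is a minimal sketch of the classic perceptron learning rule. The function name `train_perceptron`, the learning rate, and the epoch count are illustrative assumptions, not part of this article.

```python
# A minimal sketch of the classic perceptron learning rule.
# Names and hyperparameters here are illustrative, not from the article.
def train_perceptron(samples, labels, lr=0.1, epochs=10):
    n = len(samples[0])
    w = [0.0] * n          # weights start at zero
    b = 0.0                # bias replaces the hand-set threshold
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            # step activation: fire (1) if the weighted sum plus bias is positive
            y_hat = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            # nudge the weights and bias only when the prediction is wrong
            err = y - y_hat
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b = b + lr * err
    return w, b

# Example: learn the AND gate instead of hand-picking weights
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 0, 1]
print(train_perceptron(X, y))  # e.g. ([0.2, 0.1], -0.2), which separates AND
```

Because AND is linearly separable, this update rule is guaranteed to converge to some separating line, though the exact weights it finds depend on the learning rate and data order.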

Now, look at the diagram mentioned below.

Perceptron Model

Here in this diagram, we see that a bias (b = −θ) is introduced in place of the threshold. The decision line equation for the MP neuron was w1·X1 + w2·X2 = θ. In the case of the Perceptron model, the decision line equation is w1·X1 + w2·X2 − θ = 0, i.e. w1·X1 + w2·X2 + b = 0. The weights are no longer restricted to unity; they can now be real values too.
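As a quick illustration, here is a minimal Python sketch of this decision rule; the function name `perceptron` and the argument layout are my own choices, not from the article.

```python
def perceptron(x, w, b):
    """Return 1 if the weighted sum of inputs plus bias is positive, else 0."""
    total = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if total > 0 else 0

# Real-valued inputs and real-valued (even negative) weights are now allowed
print(perceptron((0.5, 2.0), (0.8, -0.3), -0.1))  # 0.4 - 0.6 - 0.1 = -0.3 -> 0
```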

Let’s now try to implement the AND gate using the perceptron model.

Implementation of AND Gate with Perceptron Model

See the truth table for the AND gate drawn below on the left side. We take both weights as 1 (w1 = w2 = 1) and the bias as b = −1.5. Now focus on the decision line diagram drawn on the right side. We see that the model can separate the two classes.

Equation of the blue line drawn to separate the two classes: w1·X1 + w2·X2 + b = 0 (in this case, X1 + X2 − 1.5 = 0 with w1 = w2 = 1 and b = −1.5).

The truth table for AND gate (left) and Decision line diagram (right)

When we change the weights w1 and w2, the slope of the decision line (the blue line in the figure above) also changes, moving the line within the 2-D plane.
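To sanity-check these values, here is a short sketch that runs the whole truth table through the decision rule with w1 = w2 = 1 and b = −1.5 (a hand computation in code, not a listing from the article):

```python
def perceptron(x, w, b):  # same step-rule helper as sketched earlier
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# AND gate: weights 1, 1 and bias -1.5, as in the text
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", perceptron(x, (1, 1), -1.5))
# Only (1, 1) clears the -1.5 bias: outputs are 0, 0, 0, 1
```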

Implementation of OR Gate with Perceptron Model

See the truth table for the OR gate mentioned below. Here we take the bias as b = −1 and the weights as w1 = w2 = 1.2.

The truth table for OR gate

Equation of the blue line (in the decision line figure mentioned below) drawn to separate the two classes: w1·X1 + w2·X2 + b = 0 (in this case, 1.2·X1 + 1.2·X2 − 1 = 0 with w1 = w2 = 1.2 and b = −1).

OR gate diagram (left) and Decision line diagram (right)
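The same kind of check works here, assuming the values above are read as b = −1 and w1 = w2 = 1.2:

```python
def perceptron(x, w, b):  # same step-rule helper as before
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# OR gate: weights 1.2, 1.2 and bias -1
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", perceptron(x, (1.2, 1.2), -1))
# A single active input (1.2) already beats the bias: outputs 0, 1, 1, 1
```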

Implementation of NAND Gate with Perceptron Model

Implementation of the NAND gate is similar to the above. See the truth table for the NAND gate mentioned below. Here we take the bias as b = 1 and the weights as w1 = w2 = −0.6.

The truth table for NAND gate

Considering the weights assigned, if h(x, w) = w1·X1 + w2·X2 + b > 0, then the output will be 1; otherwise (h ≤ 0) the output will be 0.

Decision line diagram of NAND Gate
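And once more for NAND, with the values above read as b = 1 and w1 = w2 = −0.6:

```python
def perceptron(x, w, b):  # same step-rule helper as before
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# NAND gate: negative weights -0.6, -0.6 and positive bias 1
for x in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x, "->", perceptron(x, (-0.6, -0.6), 1))
# Only (1, 1) drags the sum below zero (1 - 1.2 = -0.2): outputs 1, 1, 1, 0
```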

Till now we have discussed the linear combination of weighted inputs. See the diagram below to understand the full process: on the left, the inputs are multiplied by their weights; the weighted inputs are summed, a bias is added, and the result is passed through a nonlinear activation function to obtain the output.

These weights and inputs can also be written in matrix form.
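Here is a sketch of that matrix form, using NumPy (the article itself does not use NumPy; the variable names are mine):

```python
import numpy as np

# All four Boolean input pairs, one per row (here: the AND-gate inputs)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
w = np.array([1.0, 1.0])   # weight vector (the AND-gate values from earlier)
b = -1.5                   # bias

# A single matrix-vector product computes every weighted sum at once
h = X @ w + b
outputs = (h > 0).astype(int)
print(outputs)  # [0 0 0 1]
```

Stacking the inputs into a matrix like this is exactly how the computation is batched in practice: one product replaces a loop over individual examples.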

Let’s now work through an example to understand this topic better.

Now try to answer the following questions.

• How many features will be there for the above case?

• How many Classes will be there?

• What does the feature space look like?

• Can a single MP Neuron solve the problem?

• Can a single Perceptron solve the problem?

Let us now reveal the answers.

Can a single Perceptron solve the problem? Yes. Why and how? See the figure and equation below to understand.

In case you have not followed me yet, please do. I hope to see you again for another article in this deep learning series. Stay tuned!
