Rosenblatt’s Perceptron

  • The perceptron is the simplest form of a neural network, used for the classification of patterns said to be linearly separable.
  • Linearly separable patterns are those that lie on opposite sides of a hyperplane.
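As a minimal sketch of linear separability (the AND function and the weights below are illustrative assumptions, not from the notes), a single hyperplane w·x + b = 0 can put every pattern of one class on one side and every pattern of the other class on the opposite side:

```python
import numpy as np

# Hypothetical example: the AND function is linearly separable.
# One hyperplane w.x + b = 0 separates the single positive pattern
# from the three negative patterns.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([-1, -1, -1, +1])        # AND labels in {-1, +1}

w = np.array([1.0, 1.0])              # assumed separating weights
b = -1.5                              # assumed bias

predictions = np.sign(X @ w + b)      # side of the hyperplane for each pattern
print(np.array_equal(predictions, y)) # True: every pattern lies on the correct side
```

Because the patterns are separable, such a weight vector exists; for a non-separable set (e.g. XOR) no choice of w and b makes this check succeed.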
  • In 1958, Rosenblatt was the first to propose the perceptron as a model for learning with a teacher (supervised learning).
  • Structure of the neuron.
  • To adapt the perceptron's weights we may use an error-correction rule known as the perceptron convergence algorithm.
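A minimal sketch of the error-correction rule, w ← w + η(d − y)x, applied until no pattern is misclassified (the OR data, learning rate, and epoch cap below are illustrative assumptions, not from the notes):

```python
import numpy as np

def train_perceptron(X, d, eta=1.0, max_epochs=100):
    """Perceptron convergence algorithm with the error-correction rule."""
    X = np.hstack([X, np.ones((len(X), 1))])  # absorb the bias as an extra input fixed at 1
    w = np.zeros(X.shape[1])
    for _ in range(max_epochs):
        errors = 0
        for x, target in zip(X, d):
            y = 1.0 if w @ x >= 0 else -1.0   # hard-limiter (signum) activation
            if y != target:
                w += eta * (target - y) * x   # error-correction update
                errors += 1
        if errors == 0:                       # converged: all patterns classified correctly
            break
    return w

# OR function: linearly separable, so the theorem guarantees convergence
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
d = np.array([-1, 1, 1, 1])
w = train_perceptron(X, d)

Xb = np.hstack([X, np.ones((4, 1))])
preds = np.where(Xb @ w >= 0, 1, -1)
print(np.array_equal(preds, d))   # True
```

Note that the update is applied only on misclassified patterns; correctly classified patterns leave the weights unchanged, which is exactly what the convergence proof exploits.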
  • Cauchy-Schwarz inequality:
    • Given two vectors x and y, the Cauchy-Schwarz inequality states that |xᵀy|² ≤ ‖x‖² ‖y‖².
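The inequality, which is used to bound the growth of the weight vector in the convergence proof, can be checked numerically (the random vectors below are an illustrative assumption):

```python
import numpy as np

# Numerical check of the Cauchy-Schwarz inequality:
# |x . y|^2 <= ||x||^2 * ||y||^2 for any real vectors x, y.
rng = np.random.default_rng(0)
x = rng.standard_normal(5)
y = rng.standard_normal(5)

lhs = np.dot(x, y) ** 2          # squared inner product
rhs = np.dot(x, x) * np.dot(y, y)  # product of squared norms
print(lhs <= rhs)                # True; equality holds only when x and y are parallel
```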
  • We use the Bayes classifier when we know the statistical parameters of the two-class problem. Otherwise, the perceptron is suitable for any two linearly separable classes without requiring any parameters.
  • Minsky and Papert proved that the perceptron as defined by Rosenblatt is inherently incapable of making some global generalizations on the basis of locally learned examples.
  • Key terms: perceptron convergence theorem.
  • Proof of convergence algorithm.
  • Questions:
    • In the proof of the convergence algorithm, why is equation 1.10 valid?
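Minsky and Papert's limitation can be illustrated on XOR, which is not linearly separable, so the error-correction rule never reaches zero errors (the training loop and epoch cap below are illustrative assumptions):

```python
import numpy as np

# XOR is not linearly separable: no hyperplane puts (0,1) and (1,0)
# on one side and (0,0) and (1,1) on the other, so the perceptron
# cannot converge no matter how long it trains.
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]])  # bias input appended
d = np.array([-1, 1, 1, -1])                                # XOR labels

w = np.zeros(3)
for _ in range(1000):                  # generous epoch cap; convergence never occurs
    for x, target in zip(X, d):
        y = 1.0 if w @ x >= 0 else -1.0
        w += (target - y) * x          # update is zero on correctly classified patterns

preds = np.where(X @ w >= 0, 1, -1)
print(np.array_equal(preds, d))        # False: some pattern is always misclassified
```

This is why a single-layer perceptron cannot make the "global" generalization XOR requires, while a network with one hidden layer can.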
