- The perceptron is the simplest form of a neural network, used for the classification of patterns that are said to be linearly separable.
- Linearly separable patterns are patterns that lie on opposite sides of a hyperplane.
- In 1958, Rosenblatt was the first to propose the perceptron as the first model for learning with a teacher (i.e., supervised learning).
- Structure of a neuron.
- For adapting the perceptron's weights we may use an error-correction rule known as the perceptron convergence algorithm.
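The error-correction rule above can be sketched as follows. This is a minimal illustration, not the exact notation of the source; the function name `train_perceptron`, the learning rate `eta`, and the bipolar labels (+1/-1) are my assumptions:

```python
def train_perceptron(samples, labels, eta=1.0, max_epochs=100):
    """Error-correction rule: on a misclassified sample x with
    desired output d, update w <- w + eta * d * x (bias included)."""
    dim = len(samples[0])
    w = [0.0] * (dim + 1)  # last entry acts as the bias
    for _ in range(max_epochs):
        errors = 0
        for x, d in zip(samples, labels):
            # induced local field: v = w . x + b
            v = sum(wi * xi for wi, xi in zip(w, x)) + w[-1]
            y = 1 if v >= 0 else -1  # hard limiter
            if y != d:
                # correct the weights only on misclassification
                for i in range(dim):
                    w[i] += eta * d * x[i]
                w[-1] += eta * d
                errors += 1
        if errors == 0:
            break  # converged: every sample classified correctly
    return w

def predict(w, x):
    v = sum(wi * xi for wi, xi in zip(w, x)) + w[-1]
    return 1 if v >= 0 else -1
```

For linearly separable data, the convergence theorem guarantees this loop terminates with zero errors after a finite number of updates.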
- Given two vectors $\mathbf{a}$ and $\mathbf{b}$, the Cauchy-Schwarz inequality states that: $\|\mathbf{a}\|^2 \|\mathbf{b}\|^2 \ge (\mathbf{a}^T \mathbf{b})^2$.
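A quick numeric check of the inequality (the helper names `dot` and `cauchy_schwarz_holds` are mine, for illustration only):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def cauchy_schwarz_holds(a, b):
    # ||a||^2 * ||b||^2 >= (a . b)^2
    return dot(a, a) * dot(b, b) >= dot(a, b) ** 2
```

Equality holds exactly when one vector is a scalar multiple of the other, e.g. `[1, 2]` and `[2, 4]`.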
- We use the Bayes classifier when the parameters of the two-class problem are known. Otherwise, the perceptron is suitable for any two linearly separable classes without requiring any parameters.
- Minsky and Papert proved that the perceptron, as defined by Rosenblatt, is inherently incapable of making certain global generalizations on the basis of locally learned examples.
- Key terms: perceptron convergence theorem.
- Proof of the convergence algorithm.
- In the proof of the convergence algorithm, why is equation 1.10 valid?
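A sketch of the standard convergence argument, which may help with the question above. The notation (a solution vector $\mathbf{w}_o$, margin $\alpha$, bound $\beta$) follows the usual textbook presentation and is my assumption, not necessarily the source's equation numbering:

```latex
Assume the classes are linearly separable, so a solution $\mathbf{w}_o$ exists.
Starting from $\mathbf{w}(1)=\mathbf{0}$ and updating only on misclassified
inputs, after $n$ corrections:
\[
  \mathbf{w}_o^T \mathbf{w}(n+1) \ge n\alpha,
  \quad \text{where } \alpha = \min_n \mathbf{w}_o^T \mathbf{x}(n) > 0 .
\]
By the Cauchy-Schwarz inequality,
\[
  \|\mathbf{w}(n+1)\|^2 \ge \frac{\big(\mathbf{w}_o^T \mathbf{w}(n+1)\big)^2}
  {\|\mathbf{w}_o\|^2} \ge \frac{n^2 \alpha^2}{\|\mathbf{w}_o\|^2} .
\]
On the other hand, expanding the update rule term by term gives
\[
  \|\mathbf{w}(n+1)\|^2 \le n\beta,
  \quad \text{where } \beta = \max_n \|\mathbf{x}(n)\|^2 .
\]
The lower bound grows quadratically in $n$ while the upper bound grows only
linearly, which is contradictory unless
\[
  n \le n_{\max} = \frac{\beta \, \|\mathbf{w}_o\|^2}{\alpha^2},
\]
so the number of corrections is finite and the algorithm converges.
```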