The perceptron is a single, solitary processing neuron with supervised learning. It receives signals from various stimuli, weights them by the strengths of its synapses, and emits an output signal.
A perceptron network is a set of several perceptrons side by side, all receiving the same stimuli. Since one perceptron does not interfere with the output of another, each can be understood individually without detriment to the whole.
Not to be confused with the Multilayer Perceptron (MLP), in which the perceptrons are arranged in layers.
The perceptron neuron learns from its mistakes. Yes, literally. And the correction depends on the size of the error: the bigger the error, the faster the perceptron tries to correct itself.
The output of a perceptron is a real-valued function applied to a real number. The stimuli are collapsed into a single real number by taking the scalar product of the stimulus strengths with the synapse weights. In short, with X being the stimulus, p the result of the perceptron's processing, S the weights of its synapses, and f the perceptron's real-valued function:

p = f(y)

y = X . S
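To make this concrete, here is a minimal sketch in Python (the names `step` and `perceptron_output`, and the choice of a step function for f, are illustrative assumptions, not something stated above):

```python
def step(y):
    # One common choice for the perceptron's function f:
    # fire 1 if the weighted sum reaches the threshold, otherwise 0.
    return 1 if y >= 0 else 0

def perceptron_output(X, S):
    # y = X . S : scalar product of the stimuli with the synapse weights
    y = sum(x * s for x, s in zip(X, S))
    # p = f(y) : the perceptron's function applied to that single number
    return step(y)

# Example: two stimuli and their two synapse weights
print(perceptron_output([1.0, 0.5], [0.4, -0.2]))  # -> 1
```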
I mentioned above that the perceptron uses supervised learning; it is not self-sufficient the way Kohonen networks are. Supervised learning here means that for each training input T_i there is an expected result r_i. If p_i != r_i, there was a non-zero error, called e_i.
Using the e_i obtained for input T_i, the synapse weights S are corrected in such a way that this error is eliminated or minimized in that learning step.
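The correction is not written out above; a common form is the classic perceptron rule (an assumption on my part), where each weight moves in proportion to the error and to the stimulus that fed it, scaled by a learning rate:

```python
def update_weights(S, T_i, e_i, learning_rate=0.1):
    # Classic perceptron-style correction (assumed, not stated in the answer):
    # each synapse weight is nudged by learning_rate * error * stimulus.
    return [s + learning_rate * e_i * t for s, t in zip(S, T_i)]
```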
How the learning set is created and how its elements are presented can vary greatly depending on who implements it. Usually the set is presented sequentially and repeatedly until a convergence criterion is reached. The convergence criterion may be the total error accumulated over the set. Another interesting point in training is that the learning rate is usually reduced from one pass to the next.
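Putting the pieces together, here is a rough sketch of that training scheme, reusing `perceptron_output` and `update_weights` from the snippets above; the decay factor, tolerance, epoch limit, and the AND example are illustrative assumptions:

```python
def train(samples, S, learning_rate=0.5, decay=0.9, tolerance=0.0, max_epochs=100):
    # samples: list of (T_i, r_i) pairs; S: initial synapse weights.
    for epoch in range(max_epochs):
        total_error = 0.0
        for T_i, r_i in samples:             # present the set sequentially
            p_i = perceptron_output(T_i, S)
            e_i = r_i - p_i                  # non-zero whenever p_i != r_i
            S = update_weights(S, T_i, e_i, learning_rate)
            total_error += abs(e_i)
        if total_error <= tolerance:         # convergence: cumulative error of the pass
            break
        learning_rate *= decay               # reduce the rate between passes
    return S

# Example: learning logical AND of two stimuli (a third stimulus fixed at 1 acts as a bias)
data = [([0, 0, 1], 0), ([0, 1, 1], 0), ([1, 0, 1], 0), ([1, 1, 1], 1)]
weights = train(data, [0.0, 0.0, 0.0])
```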
Possible duplicate of How an artificial neural network works?
– user28595
@diegofm It is not a duplicate, I am being very specific here, friend: the Perceptron neural network, not neural networks in general. When people search for this type of neural network they find nothing; I tried yesterday and found nothing about it.
– Rogers Corrêa