An
Artificial Neural Network (ANN) is an information-processing paradigm that is
inspired by the way biological nervous systems, such as the brain, process information.
The key element of this paradigm is the novel structure of the information processing
system. It is composed of a large number of highly interconnected processing elements
(neurons) working in unison to solve specific problems. ANNs, like people, learn
by example. An ANN is configured for a specific application, such as pattern recognition
or data classification, through a learning process. Learning in biological systems
involves adjustments to the synaptic connections that exist between the neurons.
This is true of ANNs as well.
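To make the idea of a processing element concrete, the following minimal sketch (in Python, with weights, bias and inputs chosen arbitrarily for illustration rather than taken from any particular network) computes a weighted sum of its inputs and passes the result through a sigmoid activation; the weights play the role of the adjustable synaptic connections described above.

import math

def neuron(inputs, weights, bias):
    """One artificial processing element: a weighted sum of its
    inputs passed through a sigmoid activation function."""
    # Weighted sum of the inputs, analogous to signals arriving over
    # synaptic connections of differing strength.
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Squashing activation maps the sum into the range (0, 1).
    return 1.0 / (1.0 + math.exp(-total))

# Hypothetical weights and inputs, chosen only for illustration.
print(neuron(inputs=[0.5, 0.3], weights=[0.8, -0.2], bias=0.1))

A full network simply wires many such elements together, with the outputs of one group of neurons serving as the inputs of the next.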
Historical background
Neural network
simulations appear to be a recent development. However, this field was established
before the advent of computers, and has survived several eras. Many important
advances have been boosted by the use of inexpensive computer emulations. The
first artificial neuron was produced in 1943 by the neurophysiologist Warren McCulloch
and the logician Walter Pitts.

Neural
networks, with their remarkable ability to derive meaning from complicated or
imprecise data, can be used to extract patterns and detect trends that are too
complex to be noticed by either humans or other computer techniques. A trained
neural network can be thought of as an "expert" in the category of information
it has been given to analyze. This expert can then be used to provide projections
given new situations of interest and answer "what if" questions.
Other advantages include:
1. Adaptive learning: an ability to learn how to perform tasks based on the data given for training or initial experience, as illustrated in the sketch after this list.
2. Self-organisation: an ANN can create its own organisation or representation of the information it receives during learning.
3. Real-time operation: ANN computations may be carried out in parallel, and special hardware devices are being designed and manufactured to take advantage of this capability.
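As a rough illustration of the first point, the sketch below (plain Python, using a toy logical-AND data set invented purely for this example) adjusts a single neuron's weights from training data with the classic perceptron update rule; this is only one of many learning procedures used by ANNs, not the method of any particular system discussed here.

def train_perceptron(samples, epochs=20, lr=0.1):
    """Adaptive learning sketch: adjust weights from example data
    using the classic perceptron update rule."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            # Current prediction with a simple threshold activation.
            total = sum(w * x for w, x in zip(weights, inputs)) + bias
            output = 1 if total > 0 else 0
            # Adjust each connection in proportion to its input and the
            # prediction error: an elementary form of learning by example.
            error = target - output
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

# Toy training set for logical AND, used purely as an illustration.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
print(train_perceptron(data))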