
The science of artificial neural networks

The science of Artificial Neural Networks (ANNs), commonly referred to as Neural Networks, is still a new and promising area of research. The concept of neural networks has existed for many decades; nevertheless, they have become widely known and developed at an international level only in recent years. It is noteworthy that scientists showing interest in neural networks come from diverse scientific areas such as chemistry, medicine, physics, mathematics and engineering, and the list goes on. This shows that neural networks are a new challenge in science: no other science today combines and requires direct knowledge from such diverse areas. Like their biological counterparts, ANNs learn through training and experience, but they follow different rules from conventional computers. A Neural Network is a parallel data-processing system consisting of a multitude of artificial neurons, organized in structures similar to those of the human brain. They function as parallel computing devices made of many highly interconnected simple processors. Artificial neurons are mainly organized in layers. The first of these layers, called the "input layer", is used to feed in the data. The input layer does not perform any computation, as its elements have no input weights or bias (threshold).

The axon: the means by which neural signals are transferred away from the neuron. Its length can be tens of thousands of times the diameter of its body, and it is characterized by high electrical resistance and very large capacitance. Every neuron has only one axon; however, it can branch, enabling communication with many target cells or other neurons.

The dendrite: short, highly branched cell projections (filaments). Most neurons have many dendrites attached to the soma, increasing its surface area. There are approximately 10^3 to 10^4 dendrites per neuron; they receive information from other neurons through the synapses that cover them and transmit electrochemical stimulation to the soma.

The axon terminal: located at the end of the axon, it is responsible for transmitting signals on to other neurons. Attached to the axon terminals are the terminal buttons, which store neurotransmitters in synaptic vesicles and secrete them into the synapse.

As mentioned above, the connection between neurons happens through the synapses. A neural synapse is a silent exchange of information. Electrical nerve impulses travel along a neuron and are transmitted by chemical messengers (neurotransmitters) to the next neuron across a tiny gap, the synapse, located between the neuron and the neighbouring (target) cell. Dendrites therefore come very close to each other but never touch. It is estimated that there are approximately 10 billion neurons in the human cortex, and 60 trillion synapses or connections (Shepherd and Koch, 1990).

A number of neurons and their connections form a neural network. The entire system of neural networks in the human body forms the Central Nervous System. This system runs through the whole human body, with the brain and the spinal cord as its central points. Throughout life, synapses are in constant dynamic equilibrium: new ones are created and old ones are destroyed. New synapses are created as the brain acquires more experience from the surrounding environment, learns, recognizes and understands. Conversely, disease causes the destruction of neurons and therefore of their synapses.

Unlike other cells, neurons are generally not replaced by new ones if destroyed. This means that after the birth of a new individual, its neural system is fully developed within the first few months of its life.

A neuron can be either active or inactive. When it is activated, it produces an electric signal with an intensity of only a few millivolts.

The way these electric signals are produced is quite similar to the way a capacitor works: between the external and internal surfaces of the neuron's cell membrane there is a potential difference.

Although the human brain accounts for only about 2% of body mass, it consumes more than 20% of the oxygen taken in by the organism. The brain's energy consumption is about 20 W, whereas a computer needs far more.

The computational power of the brain can be estimated by three possible approaches:

counting the number of synapses (Kandel, 1985); estimating the computational power of the retina and multiplying it by the brain-to-retina ratio (Moravec, 1998b); and dividing the total useful energy used by the brain per second by the amount of energy used for each basic operation, to give the maximum number of operations per second (Merkle, 1989).

From the three approaches above, it is concluded that the estimated computational power of the human brain is about 10^14 operations per second (Ng, 2009).
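As a back-of-the-envelope illustration of the synapse-count approach, the sketch below (in Python) multiplies the synapse count quoted earlier by an assumed average firing rate; the 2 Hz rate is an illustrative assumption, not a figure from the sources cited above.

```python
# Rough estimate of brain computational power via the synapse-count
# approach (Kandel, 1985). The mean firing rate is an assumption
# chosen for illustration, not a value from the cited sources.

synapses = 60e12        # ~60 trillion synapses (Shepherd and Koch, 1990)
mean_firing_rate = 2.0  # Hz -- assumed average rate per synapse

ops_per_second = synapses * mean_firing_rate
print(f"Estimated operations per second: {ops_per_second:.1e}")
# -> about 1.2e+14, consistent with the ~10^14 ops/s figure (Ng, 2009)
```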

It is interesting to mention how the electric pulses that stimulate neurons are created. Across the cell membrane there appears an electric potential difference between its external and internal surfaces, just like in a capacitor. Most of the time the negative charges are found on the internal surface, as they cannot penetrate the membrane and leave the cell.

The membrane has many openings (channels) that allow ions and atoms to pass through, each species through its own channel. The ends of the channels are guarded by gates that direct the flow of these elements. Proteins that act like pumps force the elements to travel in the direction opposite to their natural gradient, which is why neurons consume large amounts of energy. Eventually, the balanced movement of the elements across the membrane produces an electric current, which is the electrical pulse that stimulates the neuron.

Once the neuron has 'fired', it returns to a state of potential equilibrium, and in this state it cannot fire again until it recovers.

Each neuron has a specific threshold. Incoming electric signals are summed, and if their combined weighted value is equal to or greater than the threshold, the neuron fires. If the sum of the signals is smaller than the required threshold value, the neuron stays inactive.

[Add images.]

Models of artificial neurons

As mentioned earlier, ANNs are parallel data-processing systems, consisting of large numbers of artificial neurons inspired by biological neurons.

"A neuron is an information-processing unit that is fundamental to the operation of a neural network" (Haykin, 1999, pg: 10).

A neuron may have many inputs, and the network may have an internal structure consisting of multiple layers, but a neuron always has a single output.

Every single neuron accepts a number of variable input signals $x_1, x_2, \ldots, x_n$. These correspond to the electric pulses of the biological brain.

Every input signal is multiplied by the corresponding synaptic weight of the neuron, $w_i$, where $i = 1, 2, \ldots, n$ indexes the input nodes. The weights represent the biological synapses and indicate the strength of the bond (the connection) between the neurons.

The value of a weight can be positive or negative, depending on whether the synapse inhibits or propagates (transmits) the stimuli from other neurons, unlike biological synapses, which do not take negative values. In addition, an external bias, $b$, is applied when the weighted inputs are summed.

The bias, or threshold, is the reference value of the internal potential of the neuron that the combined output must reach in order for the activation (or squashing) function to be activated.

An important element of the neuron body is the adder. At the adder, all the input signals, weighted by the corresponding weight vector, are summed together to produce a resultant combined output $u$. Therefore, $u$ is given by the relationship:

$u = \sum_{i=1}^{n} w_i x_i$

The resultant combined output $u$ then passes through the activation function, denoted $\varphi(\cdot)$.

The activation function is a non-linear function through which the resultant combined output $u$ is mapped to its final value $y$.

The calculated activation output signal of the neuron is given by:

$y = \varphi(v)$

where

$v = u + b$

Therefore,

$y = \varphi\!\left(\sum_{i=1}^{n} w_i x_i + b\right)$
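As a minimal sketch, the model above can be written directly in code. The following Python fragment is illustrative only (the function names and the choice of a sigmoid activation are assumptions, not part of any standard library):

```python
import math

def sigmoid(v, a=1.0):
    # Logistic sigmoid activation with slope parameter a
    return 1.0 / (1.0 + math.exp(-a * v))

def neuron_output(x, w, b):
    """Forward pass of a single artificial neuron: y = phi(u + b)."""
    u = sum(wi * xi for wi, xi in zip(w, x))  # the adder: u = sum(w_i * x_i)
    return sigmoid(u + b)                     # activation function phi

# Example: three inputs with arbitrary weights and bias
print(neuron_output([0.5, -1.0, 2.0], [0.8, 0.2, -0.5], b=0.1))
```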

Activation functions

There are several activation functions; three of the most basic types are the following (they vary slightly from book to book):

The threshold activation function, which gives an output of 1 if the adder produces a value that reaches the threshold, and 0 otherwise. With the bias folded into $v = u + b$, this is expressed as:

$\varphi(v) = \begin{cases} 1, & v \geq 0 \\ 0, & v < 0 \end{cases}$

The Piecewise-Linear function, where unity is assumed to be the amplification factor inside the linear region of operation (Haykin, 1999, pg: 14).

The Sigmoid function, which is expressed as:

$\varphi(v) = \dfrac{1}{1 + e^{-av}}$

where $a$ is the slope parameter of the sigmoid function. This function is one of the most important and most commonly used, as it provides non-linearity to the neuron.

Some other activation functions are the ramp function, the bipolar sigmoid function, and the signum function.

The signum function gives a positive or negative output, with values usually ranging from +1 to −1, depending on whether the weighted sum reaches the threshold. Applied to the threshold function above, this gives:

$\varphi(v) = \begin{cases} +1, & v > 0 \\ 0, & v = 0 \\ -1, & v < 0 \end{cases}$
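The basic functions above can be sketched in Python as follows. This is a minimal illustration; the continuous form used for the piecewise-linear function and the default slope value are assumptions:

```python
import math

def threshold(v):
    # Threshold (Heaviside) function: 1 if v >= 0, else 0
    return 1.0 if v >= 0 else 0.0

def piecewise_linear(v):
    # Saturates at 0 and 1, with unity amplification in the linear region
    if v >= 0.5:
        return 1.0
    if v <= -0.5:
        return 0.0
    return v + 0.5

def sigmoid(v, a=1.0):
    # Logistic sigmoid with slope parameter a
    return 1.0 / (1.0 + math.exp(-a * v))

def signum(v):
    # +1, 0 or -1 depending on the sign of v
    return (v > 0) - (v < 0)
```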

[Add images and graphs.]

A simple neural network

In this paragraph, neural networks are introduced, starting from their simplest form. Every neural network consists of hundreds or thousands of tiny units, the neurons. Each neuron has an input where the electric signals are received. A neuron may have more than one input, but no matter how many layers of neurons and synaptic connections lie in between (the body), there is always one output value. The neurons within a layer are not connected to each other; however, each layer is interconnected with the layers of the next and the previous level. In its simplest form, a network has no intermediate layers but is limited only to an input and an output. Every signal that leaves an output and enters an input carries a value, its weight. The weights represent the importance of each signal in reaching the threshold of an input: depending on the value of the weight $w_n$, the contribution of the electric signal to the function of the system can be great or small. A minimal sketch of such a layered network is given below.
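The following Python fragment sketches such a layered network, with two inputs, one hidden layer of three neurons and a single output neuron; all weights, biases and sizes are illustrative assumptions:

```python
import math

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def layer(inputs, weights, biases):
    """One fully connected layer: each neuron sums its weighted inputs,
    adds its bias and applies the activation function."""
    return [sigmoid(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

x = [0.5, -0.2]  # two input signals
hidden = layer(x, weights=[[0.1, 0.4], [-0.3, 0.8], [0.7, -0.6]],
               biases=[0.0, 0.1, -0.1])
output = layer(hidden, weights=[[0.5, -0.4, 0.9]], biases=[0.2])
print(output)  # a single value: no matter the depth, there is one output
```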

Artificial intelligence and neural networks

Historical background

The study of the brain and of biological neurons began thousands of years ago. However, since artificial neural networks only began to develop in the past century, their historical background is not as broad as that of other sciences.

The first union of mathematical logic and neuropsychology commenced in 1943 with the work of Warren S. McCulloch and Walter Pitts.

McCulloch was a pioneering neuroanatomist and psychiatrist; Pitts was a young mathematical prodigy who joined McCulloch in 1942 (Haykin, 1999, pg: 38).

Together they created the first model of a neural network, represented by a great number of interconnected neurons. In their well-known paper, "A logical calculus of the ideas immanent in nervous activity" (1943), they put forward theorems that describe the function of neurons and neural networks. As a result of those theorems, a new era of research into neural networks and artificial intelligence began.

The paper of McCulloch and Pitts triggered the interest of many scientists, such as von Neumann, Wiener and Uttley, in their effort to extract information about the function of biological neurons and create corresponding artificial ones.

In 1949 another idea appeared when D. Hebb published the book "The Organisation of Behavior". Although his book had greater influence on the psychological rather than the engineering community, he introduced the postulate of learning and the synaptic modification rule, which suggests that the connectivity of the brain changes continually throughout its entire life in the process of learning new tasks.
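In its simplest modern mathematical form (a standard textbook formulation rather than Hebb's original wording), the synaptic modification rule can be written as

$\Delta w_{ij} = \eta \, x_i \, y_j$

where $x_i$ is the activity of the presynaptic neuron, $y_j$ the activity of the postsynaptic neuron, and $\eta$ a small learning-rate constant: connections between neurons that are active together are strengthened.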

From 1950 to 1979, a number of remarkable books were written about neural networks, developing ideas about the abilities of neurons, such as learning and memorising.

Some of these books are "Design for a Brain: The Origin of Adaptive Behaviour" (1952) by Ashby, which is still exciting to read nowadays, and "Learning Machines" (1965) by Nilsson, one of the best-written expositions about linearly separable patterns in hypersurfaces (Haykin, 1999, pg: 40).

A novel model, the perceptron, was introduced in 1958 by F. Rosenblatt. The perceptron is a very simple model of supervised learning, built around a single nonlinear neuron (Haykin, 1999, pg: 135). Although this model proved to have many limitations, the idea of training neurons encouraged many scientists to build larger neural networks. A sketch of the perceptron's training rule is given below.
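The following is a minimal sketch of Rosenblatt-style perceptron training in Python; the learning rate, epoch count and the AND-function training data are illustrative assumptions:

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Rosenblatt's rule: w <- w + lr * (target - output) * x."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, target in samples:
            output = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else 0
            error = target - output        # 0 when the prediction is correct
            w = [wi + lr * error * xi for wi, xi in zip(w, x)]
            b += lr * error
    return w, b

# Learn the linearly separable AND function
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(data)
print(w, b)
```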

In 1969, Minsky and Papert, in their book "Perceptrons", made a complete evaluation of the features and uses of perceptrons. They proved mathematically that there are fundamental limitations on the computational ability of single-layer perceptrons, and those limitations were assumed to carry over to multilayer perceptrons.

A period followed in which scientists began losing hope in neural networks and turned to other knowledge-based systems.

In 1982, neural networks made an interesting comeback when John Hopfield proved in a strictly mathematical way that, over time, a neural network adjusts itself to function at minimum energy, just as the human brain does. In addition, Hopfield proved that a simple neural network can be used as a storage device. Such networks are called Hopfield networks.
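A minimal sketch of a Hopfield network used as a storage device follows, assuming bipolar (+1/−1) patterns, Hebbian storage and asynchronous recall; the pattern and network size are illustrative:

```python
import random

def store(patterns):
    """Hebbian weight matrix for bipolar patterns, with zero diagonal."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / n
    return w

def recall(w, state, steps=100):
    """Asynchronous updates drive the state toward a stored
    minimum-energy pattern."""
    state = list(state)
    for _ in range(steps):
        i = random.randrange(len(state))
        state[i] = 1 if sum(w[i][j] * s for j, s in enumerate(state)) >= 0 else -1
    return state

stored = [1, -1, 1, -1, 1, -1]
w = store([stored])
noisy = [1, -1, -1, -1, 1, -1]  # one flipped element
print(recall(w, noisy))         # typically recovers the stored pattern
```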

A very important work was published in 1986 by Rumelhart and McClelland. The two-volume book, "Parallel Distributed Processing: Explorations in the Microstructures of Cognition", shows new methods of training neural networks and introduces the idea of the parallel data processor. This work had a great influence on the use of back-propagation learning and allowed the development of multilayer networks (perceptrons).

The works published by McCulloch and Pitts (1943), Hopfield (1982) and Rumelhart and McClelland (1986) have been the most influential in the revolution of neural networks.

From the 1980s to the present day, Neural Networks have been established as a new independent branch of science. Conferences and journals devoted entirely to artificial neural networks have appeared, while the first commercial companies dedicated to their improvement have been created, supported by thousands of members worldwide, especially in America, Europe and Japan.

Learning processes/ training

Fundamental ideas

The present, looking to the future

ANN application areas

ANNs in civil engineering

Can it be applied in?

Benefits/disadvantages

Program

Observations

Comments

Summary

References
