
Introductory Chapter: Artificial Neural Networks

By Adel El-Shahat

Reviewed: January 5th, 2018. Published: February 28th, 2018.

DOI: 10.5772/intechopen.73530


1. Introduction

An artificial neural network (ANN) is a computational structure inspired by the biological nervous system. It consists of very simple, highly interconnected processors called neurons, which are connected to each other by weighted links over which signals pass. Building an ANN involves data collection, analysis and processing, network structure design (including the number of hidden layers and hidden units), weight initialization, training, network simulation, weight and bias adjustment, and testing. ANNs are used in many different fields to process large data sets, often providing useful analyses that allow prediction and identification of new data. The data these structures handle typically exhibit nonlinear relationships between inputs and outputs, which the network captures through learning and training. ANNs are used in applications such as speech recognition, imaging, control, estimation, and optimization, and they appear in real-world applications in finance, medicine, business, mining, and many other areas [1].
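As a rough illustration of this workflow, the sketch below walks through the same steps (data collection, train/test split, network design, training, and testing) in Python with scikit-learn rather than with any particular toolbox; the synthetic sine data, the single hidden layer of 10 units, and the other settings are assumptions made only for illustration.

```python
# A minimal sketch of the ANN workflow described above: collect data, design the
# network, train it, and test it on data not used for training.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# 1. Data collection: a synthetic nonlinear input/output relationship (assumed).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(500, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(500)

# 2. Split the data so the network can later be tested on unseen samples.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# 3. Network structure design: one hidden layer with 10 units (an assumption).
net = MLPRegressor(hidden_layer_sizes=(10,), activation="tanh",
                   solver="lbfgs", max_iter=2000, random_state=0)

# 4. Training: weights and biases are adjusted to reduce the training error.
net.fit(X_train, y_train)

# 5. Testing: simulate the network on unseen data and measure the error.
print("test MSE:", mean_squared_error(y_test, net.predict(X_test)))
```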

2. ANN basic fundamentals and architectures

An artificial neural network works in a way that resembles the neurons in the brain. Before an ANN can be tested or used, it must be trained. The process starts with the neuron, the basic element of ANNs (see Figure 1). A neuron's inputs are the outputs of several other neurons. Once a neuron has combined its inputs into a single output, that output is compared against a training set, a group of known inputs and outputs. The user sets the maximum error threshold the system allows. If the output value from the neuron does not match the known value in the data set, the system adjusts the weights. This process is repeated until the error falls within the threshold, at which point the system holds the weights at their current values so the network can be tested against data that was not used for training. At that point, the system is able to make decisions and recognize patterns in the nonlinear data.

Figure 1.

Neuron function [1].
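The following minimal sketch, written in Python with NumPy, mirrors the loop described above for the single neuron of Figure 1: combine the weighted inputs into one output, compare it with the known value from the training set, and adjust the weights whenever they disagree, repeating until the training set is reproduced. The logical-OR data, step activation, and learning rate are illustrative assumptions, not taken from the chapter.

```python
# Single neuron with a simple error-driven weight update (perceptron-style).
import numpy as np

inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)   # known inputs
targets = np.array([0, 1, 1, 1], dtype=float)                      # known outputs

weights = np.zeros(2)
bias = 0.0
learning_rate = 0.1

for epoch in range(100):
    errors = 0
    for x, t in zip(inputs, targets):
        output = 1.0 if x @ weights + bias > 0 else 0.0   # weighted sum + threshold
        if output != t:                                   # mismatch with training set
            weights += learning_rate * (t - output) * x   # adjust the weights
            bias += learning_rate * (t - output)
            errors += 1
    if errors == 0:        # every training example is now reproduced correctly
        break

print("converged after", epoch + 1, "epochs; weights =", weights, "bias =", bias)
```

For this linearly separable toy data the loop converges after a handful of epochs; more complex, nonlinear data requires hidden layers, as discussed next.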

An important issue in designing an ANN is deciding how many units to assign to each layer. Assigning too many units to a layer leads to overfitting, because the system cannot process so much information effectively, and it also lengthens training times; with an overload of data, proper training may not be possible in a reasonable time. Assigning too few units to a layer for a complicated data set leads the system to underfit, because the underlying signals are not fully captured. Many factors, such as the inputs and outputs, noise, the complexity of the error function, the network architecture, and the training algorithm, influence the best number of hidden units. The neural networks discussed here are implemented with the MATLAB Neural Network Toolbox and Simulink. The performance measure is the mean squared error, the average of the squared errors. Training of the network stops when the maximum number of epochs is reached, the maximum time is reached, the performance is minimized to the goal, the performance gradient falls below its minimum value, or mu exceeds its maximum value [1].
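The stopping rules listed above can be sketched as a plain Python training loop; the version below checks the maximum number of epochs, the maximum training time, the performance goal, and the minimum gradient. The single-neuron gradient step and all numeric limits are illustrative assumptions, and the mu criterion, which is specific to Levenberg-Marquardt training, is omitted here.

```python
# Training loop with stopping criteria analogous to those described above.
import time
import numpy as np

MAX_EPOCHS = 1000
MAX_SECONDS = 60.0
GOAL_MSE = 1e-3
MIN_GRADIENT = 1e-6

def train_step(weights, X, y):
    """One gradient step on a single linear neuron with an MSE loss (illustrative)."""
    errors = X @ weights - y
    gradient = 2.0 * X.T @ errors / len(y)
    weights = weights - 0.1 * gradient
    return weights, float(np.mean(errors ** 2)), float(np.linalg.norm(gradient))

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.0, -2.0, 0.5])           # target weights the neuron should recover
weights = np.zeros(3)

start = time.time()
for epoch in range(MAX_EPOCHS):              # stop 1: maximum number of epochs
    weights, mse, grad_norm = train_step(weights, X, y)
    if mse <= GOAL_MSE:                      # stop 2: performance goal reached
        reason = "goal MSE reached"
        break
    if grad_norm <= MIN_GRADIENT:            # stop 3: gradient below its minimum
        reason = "minimum gradient reached"
        break
    if time.time() - start >= MAX_SECONDS:   # stop 4: maximum training time
        reason = "maximum time reached"
        break
else:
    reason = "maximum epochs reached"

print(reason, "| epoch:", epoch, "| MSE:", round(mse, 6))
```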

Biological neurons, as shown in Figure 2, have a cell nucleus and receive input from other neurons through a web of dendrites. Their learning is accomplished through small changes to an existing representation: the configuration already contains important information, and learning is guided. The strengths of the connections between neurons, the weights, do not start out random, nor does the structure of the connections. Unlike biological neural networks, artificial neural networks (ANNs) are ordinarily trained from scratch, using a fixed topology chosen for the problem at hand. However, ANNs can also learn on the basis of a prior representation. In the near future, ANNs will begin to perform additional classes of tasks at near-human and even superhuman levels, perhaps becoming architecturally and functionally more similar to biological neural networks [2].

Figure 2.

Neural network (biological and artificial) [2] (Image credit: Wikipedia).

One of the primary issues with neural networks is that they generally have a limited ability to identify causal relationships explicitly. Designers of neural networks feed them huge swathes of data and allow the networks to determine on their own which input variables are most important. Another issue is the tendency to overfit, because the model accounts for anomalies and outliers in the training data that may not be present across real data sets. Overfitting occurs when a data analysis model, such as a neural network, produces good predictions for the training data but worse ones for testing data. Designers can mitigate overfitting in neural networks by penalizing large weights and by limiting the number of neurons in hidden layers. Reducing the number of neurons in hidden layers lessens overfitting, but it also limits the ability of the neural network to model more complex, nonlinear relationships [3].
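The first mitigation mentioned above, penalizing large weights, is commonly implemented as an L2 (weight-decay) term added to the training loss. The short sketch below shows the idea for a single linear neuron; the penalty strength lam, the data, and the step size are illustrative assumptions.

```python
# Weight penalty (L2 / weight decay) as a way to discourage overfitting.
import numpy as np

def penalized_loss(weights, X, y, lam=0.01):
    """Mean squared error plus an L2 penalty that discourages large weights."""
    errors = X @ weights - y
    return np.mean(errors ** 2) + lam * np.sum(weights ** 2)

def gradient(weights, X, y, lam=0.01):
    """Gradient of the penalized loss; the lam term pulls every weight toward zero."""
    errors = X @ weights - y
    return 2.0 * X.T @ errors / len(y) + 2.0 * lam * weights

# Tiny demonstration: a few noisy samples, many gradient steps.
rng = np.random.default_rng(2)
X = rng.normal(size=(30, 5))
y = X @ np.array([1.0, 0.0, 0.0, 0.0, 0.0]) + 0.3 * rng.standard_normal(30)

w = np.zeros(5)
for _ in range(2000):
    w -= 0.05 * gradient(w, X, y, lam=0.1)

print("penalized weights:", np.round(w, 3))   # spurious weights are kept small
```

In practice the same penalty appears as the weight-decay or regularization option of most training toolboxes.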

An artificial neural network contains several layers, as shown in Figure 3: an input layer that receives the inputs for the network's learning and recognition, an output layer that produces the network's response to the task it has learned, and one or more hidden layers in between that transform the input into something the output units can use.

Figure 3.

Multilayer neural network [4].
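A minimal forward-pass sketch of the layered structure in Figure 3 is shown below: the input layer receives the data, the hidden layer transforms it, and the output layer produces the response. The layer sizes, random weights, and tanh activation are illustrative assumptions.

```python
# Forward pass through an input, hidden, and output layer (as in Figure 3).
import numpy as np

rng = np.random.default_rng(3)

n_inputs, n_hidden, n_outputs = 4, 6, 2
W_hidden = rng.normal(scale=0.5, size=(n_inputs, n_hidden))   # input -> hidden weights
b_hidden = np.zeros(n_hidden)
W_output = rng.normal(scale=0.5, size=(n_hidden, n_outputs))  # hidden -> output weights
b_output = np.zeros(n_outputs)

def forward(x):
    hidden = np.tanh(x @ W_hidden + b_hidden)   # hidden layer re-codes the input
    return hidden @ W_output + b_output         # output layer produces the response

x = rng.normal(size=n_inputs)                   # one example presented to the input layer
print("network output:", forward(x))
```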

Moreover, neural networks come in different architectures, such as the perceptron, the radial basis function network, the multilayer perceptron, the recurrent neural network, long short-term memory, the Hopfield network, the Boltzmann machine, the convolutional neural network, the modular neural network, and the physical neural network [4]. These types are depicted in Figure 4.

Figure 4.

Different neural network architectures [4].

Neural networks can also be used for classification, prediction, clustering (competitive networks, adaptive resonance theory networks, and Kohonen self-organizing maps), association, and pattern recognition (supervised and unsupervised classification), as well as for machine learning more broadly [4].

3. Recent applications in brief

The editor himself has used ANNs in different applications, such as smart distributed generation systems [5], photovoltaic module and horizontal axis wind turbine modeling [6], wind energy estimation functions for future homes [7], small-scale hydropower generator electrical system modelling [8], robot energy modeling [9], small-scale wind power dispatchable energy source modeling [10], optimum ANN empirical model of capacitive deionization desalination unit [11], lead acid battery modeling for PV applications [12], solar panel modeling-based design technique for distributed generation applications [13], wind turbine (horizontal and vertical) design and simulation aspects for renewable energy applications [14], neural network storage unit parameter modeling [15], empirical capacitive deionization ANN nonparametric modeling for desalination purpose [16], PV module optimum operation modeling [17], ANN interior PM synchronous machine performance improvement unit [18], DC-DC converter duty cycle ANN estimation for DG applications [19], stand-alone PV system simulation for DG applications, Part I: PV module modeling and inverters [20], stand-alone PV system simulation for DG applications, Part II: DC-DC converter feeding maximum power to resistive load [21], maximum power point genetic identification function for photovoltaic system [22], PV cell module modeling and ANN simulation for smart grid applications [23], a neuro-modelling for new biological technique of water pollution control [24], high fundamental frequency PM synchronous motor design neural regression function [25], PM synchronous motor control strategies with their neural network regression functions [26], DC micro-grid pricing and market models [27], battery degradation model based on ANN regression function for EV applications [28], sizing residential photovoltaic systems in the state of Georgia [29], an artificial neural network model for wind energy estimation [30], site wind energy appraisal function for future Egyptian homes [31], horizontal axis wind turbines modeling [32], wind energy simulation and estimation in Egypt [33], petroleum Archie parameter estimation [34], storage device unit modeling [35], capacitive deionization (CDI) operational condition nonparametric modeling [36], solar photovoltaic module modeling-based design technique [37], high-speed synchronous motor basic sizing neural function for renewable energy applications [38], generating basic sizing design regression neural function for HSPMSM in aircraft [39], neural unit for PM synchronous machine performance improvement used for renewable energy [40], neural unit for PM synchronous machine performance improvement used for renewable energy [41], a neural model for flat-plate collector [42], a neuro-modeling for new biological technique of water pollution control [43], a neural model for new biological technique of water pollution control: experimental project [44], speed sensorless neural controller for induction motor efficiency optimization [45], and neural model of three-phase induction motor [46].

The book chapters that follow reflect advanced ANN applications: modulation recognition for next-generation optical networks using artificial neural networks, hardware ANNs for gait generation of multi-legged robots, high-resolution soil property map production with ANNs, ANN dynamic factor models for combined forecasts, ANN parameter recognition of engineering constants in civil engineering, ANN electricity consumption and generation forecasting, ANNs for advanced process control, ANN breast cancer detection, ANN applications in biofuels, ANN modeling for manufacturing process optimization, spectral interference correction using a large-sized spectrometer and ANN-based deep learning, solar radiation ANN prediction using the NARX model, and ANN data assimilation in an atmospheric general circulation model.

© 2018 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of the Creative Commons Attribution 3.0 License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
