Cite this as: Sadollah A (2020) How do artificial neural networks lead to developing an optimization method? Trends Comput Sci Inf Technol 5(1): 067-069. DOI: 10.17352/tcsit.000026
This concise paper explains how artificial intelligence, in particular artificial neural networks (ANNs), can inspire new metaheuristics. Given the unique concept behind ANNs and their wide application across many fields of study, how can ANNs be utilized to solve real-life and complex optimization problems? This paper briefly links that inspiration to a practical model in order to build an optimization strategy.
A simple look at artificial neural networks (ANNs) reveals the potential hidden in this mathematical model. Many applications and improvements have been inspired by ANNs. The answer to how ANNs can inspire a new optimizer lies in their unique structure. The most common tasks of ANNs are prediction and regression, so why not use the concept of ANNs for optimization? Indeed, some variants of ANNs, such as Hopfield neural networks, are used for optimization; however, they mostly act as local search approaches and are not considered global optimizers. This brief review paper sets out to answer this question.
The main contribution of this brief review paper is to introduce the proposed metaheuristic, named the neural network algorithm (NNA), along with its applications to date; more applications of this optimizer will surely appear in the near future. It is worth mentioning that the NNA differs from ANNs in both concept and application. The idea and inspiration of the NNA are drawn from the unique strategy and structure used in ANNs. Hence, like other metaheuristics, the NNA is a metaheuristic optimization algorithm used for solving optimization problems, unlike ANNs, which are applied for regression and prediction purposes.
Artificial neural networks (ANNs) are computing structures inspired by the structure and/or functional aspects of biological neural networks. ANNs are composed of dense, interlinked, biologically motivated computing units (i.e., artificial neurons). As in nature, the behavior of the network is primarily determined by the interactions between these units.
ANNs can be divided into two groups according to their communication pattern (architecture): a) feed-forward networks, whose architecture contains no loops; and b) recurrent networks, in which loops arise from feedback connections and a time parameter enters the model, making these networks more complex.
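The feed-forward case can be made concrete with a minimal sketch. The function names, layer sizes, and the sigmoid transfer function below are illustrative assumptions, not part of the paper; the sketch only shows the weighted-summation-plus-transfer-function behavior of a loop-free network.

```python
import math
import random

def neuron(inputs, weights, bias):
    """Weighted sum of the input paths plus a bias, passed through a
    sigmoid transfer function (one of several common choices)."""
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-total))

def feed_forward(inputs, layers):
    """Propagate inputs through the layers in order; no loops, so this
    is a feed-forward architecture. Each layer is a list of
    (weights, bias) pairs, one pair per neuron."""
    activations = inputs
    for layer in layers:
        activations = [neuron(activations, w, b) for (w, b) in layer]
    return activations

# Illustrative network: 2 inputs -> 2 hidden neurons -> 1 output
random.seed(0)
layers = [
    [([random.random(), random.random()], 0.1) for _ in range(2)],  # hidden layer
    [([random.random(), random.random()], 0.0)],                    # output layer
]
out = feed_forward([0.5, -0.2], layers)
```

A recurrent network would differ only in feeding some activations back as inputs at the next time step, which is what introduces the time parameter mentioned above.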
Figure 1 shows two standard ANN architectures: feed-forward and recurrent. The question is how to build on ANNs to form an efficient metaheuristic optimizer. First, let us revisit the meaning of a metaheuristic. More details of the ANN processes are given in Section 4 along with the proposed NNA.
In computer science, a metaheuristic is a higher-level procedure or heuristic that attempts to discover, generate, or select a heuristic that can provide a satisfactory solution to an optimization problem, in particular when knowledge of the problem is incomplete or computing power is limited. Metaheuristics sample a subset of a solution space that is too large to be fully enumerated. Because they make relatively few assumptions about the optimization problem at hand, they can be applied to a wide range of problems.
The design of nature-inspired metaheuristics is a very active field of study. Many recent metaheuristics are motivated by natural processes, in particular evolutionary computation. Nature serves as a source of ideas, processes, and principles for designing computational methods that solve complex problems. Such metaheuristics include simulated annealing, genetic algorithms (classified as evolutionary algorithms), ant colony optimization, particle swarm optimization, and the water cycle algorithm.
A recent metaheuristic optimization algorithm, known as the neural network algorithm (NNA), has been developed with the aid of ANN principles and the biological nervous system to solve optimization problems. The NNA incorporates the framework and concepts of ANNs to build new candidate solutions and uses other ANN-inspired operators in its search strategy. The NNA benefits from the dynamic nature of ANNs and their mechanisms to generate new solutions in the search domain.
Like other metaheuristics, the NNA begins with an initial population, called the population of pattern solutions. Inspired by ANNs, at each iteration of the NNA the target data is taken to be the best solution obtained so far, and the goal is to reduce the error between the target data and the other pattern solutions (i.e., to move the other pattern solutions toward the target solution). In effect, the NNA is formulated as a minimizer: it reduces the error between the target and the predicted solutions. It should be stressed that this target solution is updated at every iteration.
An artificial neuron (processing unit) in an ANN may have multiple input paths, corresponding to dendrites. The unit combines the weighted values of these input paths using a straightforward summation, which yields the unit's internal activation level. Initial weights in ANNs are random numbers that are modified according to the measured network error as the iterations proceed. Back in the NNA, the initial weights are defined as a weight matrix (initially random numbers between zero and one) that is altered as the iteration number increases. New solutions are then obtained by multiplying the weight matrix by the current population.
Next, these solutions are added to the current population to form the new positions of the candidate solutions. Note that there is a restriction on the weight values: the summation of the weights feeding each solution must not exceed one, which helps the search escape from local optima. Figure 2 displays a schematic view of this rule in practice.
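The generation rule described above can be sketched in a few lines. This is a hedged reading of the text, not the reference implementation: the function names, population size, and the choice to normalize each column of the weight matrix so it sums exactly to one (satisfying the "does not exceed one" restriction) are assumptions made for illustration.

```python
import random

def normalize_columns(w, n):
    """Rescale each column of the n-by-n weight matrix so it sums to one,
    enforcing the restriction that the weights combining into each new
    solution do not exceed one in total (illustrative assumption)."""
    for j in range(n):
        col_sum = sum(w[i][j] for i in range(n))
        for i in range(n):
            w[i][j] /= col_sum
    return w

def generate_new_solutions(population, w):
    """Build each candidate as a weighted combination of the whole
    population (the matrix product of weights and population), then add
    it to the current position, as described in the text."""
    n, dim = len(population), len(population[0])
    new_pop = []
    for j in range(n):
        combo = [sum(w[i][j] * population[i][d] for i in range(n))
                 for d in range(dim)]
        new_pop.append([population[j][d] + combo[d] for d in range(dim)])
    return new_pop

random.seed(1)
n, dim = 5, 3
population = [[random.uniform(-10, 10) for _ in range(dim)] for _ in range(n)]
weights = normalize_columns(
    [[random.random() for _ in range(n)] for _ in range(n)], n)
new_population = generate_new_solutions(population, weights)
```

Normalizing the weights keeps each candidate a bounded blend of existing pattern solutions, which is one way to read the escape-from-local-optima rationale given above.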
After new solutions are generated, search operators such as the bias and transfer function operators play the main roles. The bias term plays a crucial role in neural network models. In the NNA, the bias operator (acting as noise) modifies a certain portion of the pattern solutions in the current population and of the updated weight matrix.
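A minimal sketch of the bias operator, under the assumption that "modifying a certain portion" means replacing a fraction of a pattern solution's components with fresh random values inside the variable bounds; the fraction parameter `beta`, the function name, and the bounds are all hypothetical. The paper also applies the same kind of perturbation to the weight matrix, which is omitted here for brevity.

```python
import random

def bias_operator(solution, lower, upper, beta):
    """Act as noise: replace a beta-fraction of the solution's components
    with random values drawn inside the search bounds (sketch; the NNA's
    bias operator perturbs the weight matrix in the same spirit)."""
    dim = len(solution)
    n_bias = max(1, int(round(beta * dim)))  # how many components to perturb
    indices = random.sample(range(dim), n_bias)
    biased = solution[:]
    for d in indices:
        biased[d] = random.uniform(lower, upper)
    return biased

random.seed(2)
sol = [1.0, 2.0, 3.0, 4.0]
biased = bias_operator(sol, -10.0, 10.0, beta=0.5)
```

Because it injects fresh random coordinates, this operator keeps the population from collapsing prematurely onto the current target solution.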
In the NNA, unlike in ANNs, the transfer function operator moves the population from its current search positions toward the target solution, so that the updated solutions perform better with respect to the target. A schematic view of the entire NNA process, showing the local and global feedback together with the search operators used, can be seen in Figure 3.
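The move-toward-the-target step can be sketched as follows. The `x + 2 * rand * (target - x)` update is the form reported for the NNA in the literature; treat the exact step size as an assumption of this sketch rather than a claim of this paper.

```python
import random

def transfer_function_operator(solution, target):
    """Move a pattern solution toward the target (best-so-far) solution.
    With rand in [0, 1), the factor 2 * rand lets a step fall short of,
    land on, or overshoot the target, preserving some exploration."""
    return [x + 2.0 * random.random() * (t - x)
            for x, t in zip(solution, target)]

random.seed(3)
sol = [0.0, 10.0]
target = [5.0, 5.0]
moved = transfer_function_operator(sol, target)
```

Applied to every pattern solution each iteration, this is the global pull toward the target that, combined with the bias noise, drives the convergence behavior shown in Figure 3.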
It can be seen that the simple structure of ANNs (i.e., neurons and layers) can lead to the design of a metaheuristic. Numerical results obtained on many optimization problems clearly show the attractiveness and competitiveness of the proposed NNA, compared with recent and widely used optimizers, for handling unconstrained and constrained real-life problems with several design variables [10-14].
It is worth mentioning that developing a new metaheuristic around a new metaphor is not, by itself, a novel contribution to the optimization community. A proposed optimizer should first be examined on well-known, valid mathematical benchmarks and then applied to real-world, practical, and large-scale optimization problems. Only then can a new nature-inspired metaheuristic be accepted, insofar as it improves the accuracy of the obtained optimum solutions and may reduce computational time by converging quickly and reliably to the global optimum.