Open access peer-reviewed chapter

Atomistic Mathematical Theory for Metaheuristic Structures of Global Optimization Algorithms in Evolutionary Machine Learning for Power Systems

By Jonah Lissner

Submitted: July 30th 2019Reviewed: February 8th 2021Published: March 8th 2021

DOI: 10.5772/intechopen.96516

Abstract

Global Optimization in the 4D nonlinear landscape generates kinds and types of particles, waves and extremals of power sets and singletons. In this chapter these are demonstrated for ranges of optimal problem-solving solution algorithms. Here, onts, particles, or atoms, of the ontological blueprint are generated inherently from the fractional optimization algorithms in Metaheuristic structures of computational evolutionary development. These stigmergetics are applicable to incremental machine learning regimes for computational power generation and relay, and information management systems.

Keywords

  • metaheuristic
  • particle swarm optimization
  • global optimization
  • machine learning
  • evolutionary algorithms
  • stigmergetics
  • Arrow’s paradox
  • atomistics
  • Fermat’s last theorem
  • network theory
  • Henselian mathematics

1. Introduction

The evolution of algorithms from a simple route to complexified paths requires maps of zones of optimal utilization that can be solved sufficiently in a given amount of time.

These algorithms are constructed to build and advance a continuity to the next location of optimal utilization, forming workable nodes and circuits that satisfy discrete and exact algorithm criteria on a time basis. A complete network on a nonlinear surface, with its related machine learning epochs, is thereby built.

These criteria are based on Fermat’s theorem, which locates global extrema either at stationary or boundary points, and ultimately upon the Pythagorean theorem, where:

Let $N$ be the set of natural numbers 1, 2, 3, …, let $Z$ be the set of integers 0, ±1, ±2, …, and let $Q$ be the set of rational numbers $a/b$, where $a$ and $b$ are in $Z$ with $b \neq 0$. In what follows we will call a solution to $x^n + y^n = z^n$ where one or more of $x$, $y$, or $z$ is zero a trivial solution. A solution where all three are non-zero will be called a non-trivial solution [1].

2. Materials and methods

2.1 Metaheuristic structural rules for the algorithm building

It follows from the rule of No Unreasonable Effectiveness of Mathematics in any Science [Wigner] [2], and therefore from the notion of No Unreasonable Effectiveness of Axiomation in any Science, that 3 Rules of Information Physics [3IP] by Jonah Lissner exist:

  1. Problem of Demarcation.

  2. Rule of Information Dichotomy [Gestalt-Inverse Gestalt], thereby requiring a:

  3. Context-Restricted Deep Structure [CRDS] for the given topology.

Hypothesized to be commutable terms within this 3IP rulebase are The Three Thermodynamic Rules of Macrodynamics by Jonah Lissner: the 3 General Rules of Macrodynamics [3GRM], which are established to define Global Optimization Algorithm [GOA] challenges:

  1. The Rule of the Continuity of Primaries

  2. The Rule of Perpetuation of Information Inequalities of Primaries

  3. The Rule of Unprovable Ideals, Cardinals or Delimitations of Complex Adaptive Evolutionary Systems [CAES].

3. Algorithm definition

The Metaheuristic Algorithm is defined by the Author [Jonah Lissner] as:

An ontological mechanism to generate or activate decision paths [algorithms] and create decision potential to solve [essentially two-state] paradoxes, forming a computational physical network and topos for practical effort or application. From this, a theoretical or hypothetical guideway can be constructed from objects and particles to advance ontological gradations of relevance and value through a logical progression.

A relevant algorithm to solve for discrete stigmergetics in nonlinear optimization challenges for graphing algorithms of power systems has been demonstrated in Ant Colony Optimization [ACO]:

Here, in general formula,

$$p_{xy}^{k} = \frac{\tau_{xy}^{\alpha}\,\eta_{xy}^{\beta}}{\sum_{z \in \mathrm{allowed}_x} \tau_{xz}^{\alpha}\,\eta_{xz}^{\beta}},$$

with trail update action

$$\tau_{xy} \leftarrow (1 - \rho)\,\tau_{xy} + \sum_{k} \Delta\tau_{xy}^{k},$$

for given function sets

$$f(x) = \lambda x, \quad x \geq 0; \tag{E1}$$

$$f(x) = \lambda x^{2}, \quad x \geq 0; \tag{E2}$$

$$f(x) = \sin\left(\frac{\pi x}{2\lambda}\right), \quad 0 \leq x \leq \lambda;\ 0 \text{ else}; \tag{E3}$$

$$f(x) = \pi x \sin\left(\frac{\pi x}{2\lambda}\right), \quad 0 \leq x \leq \lambda;\ 0 \text{ else}. \tag{E4}$$
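The transition rule and trail update above can be exercised directly; a minimal sketch in Python (the node labels, the `tau`/`eta` edge tables, and the defaults α = 1, β = 2, ρ = 0.5 are hypothetical choices, not values from this chapter):

```python
def transition_probabilities(x, allowed, tau, eta, alpha=1.0, beta=2.0):
    """p_xy^k: pheromone tau and heuristic eta weighted over allowed successors of x."""
    weights = {y: (tau[(x, y)] ** alpha) * (eta[(x, y)] ** beta) for y in allowed}
    total = sum(weights.values())
    return {y: w / total for y, w in weights.items()}

def update_pheromone(tau, deposits, rho=0.5):
    """Trail update: tau_xy <- (1 - rho) * tau_xy + sum_k delta_tau_xy^k."""
    return {edge: (1 - rho) * t + deposits.get(edge, 0.0) for edge, t in tau.items()}

# Two candidate moves from node "A"; equal trails and heuristics give a 50/50 split.
tau = {("A", "B"): 1.0, ("A", "C"): 1.0}
eta = {("A", "B"): 1.0, ("A", "C"): 1.0}
probs = transition_probabilities("A", ["B", "C"], tau, eta)
tau = update_pheromone(tau, {("A", "B"): 1.0})  # one ant deposited on edge A-B
```

After the update, edge A–B carries more trail than A–C, so the next transition-probability computation is biased toward it, which is the stigmergetic feedback loop of ACO.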

Evolutionary Game Theory [EGT] challenges in optimization schedules are therein linked to Ant Colony Optimization [ACO], e.g. the eigenvector centrality formula:

where, for $G = (V, E)$ with $|V|$ vertices, let $A = (a_{v,t})$ be the adjacency matrix, e.g. $a_{v,t} = 1$ if vertex $v$ is linked to vertex $t$ and $a_{v,t} = 0$ otherwise. Therefore

$$x_v = \frac{1}{\lambda} \sum_{t \in M(v)} x_t = \frac{1}{\lambda} \sum_{t \in G} a_{v,t}\, x_t \quad \text{[ibid]}. \tag{E5}$$
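Eq. (E5) is the fixed point of power iteration on the adjacency matrix; a small sketch (the path graph and the A + I shift, used so the iteration converges even on bipartite graphs, are illustrative choices, not from this chapter):

```python
def eigenvector_centrality(adj, iterations=200):
    """Iterate x <- (A + I) x, normalizing by the max entry; the limit satisfies
    x_v = (1/lambda) * sum_t a_{v,t} x_t up to the shared normalization."""
    n = len(adj)
    x = [1.0] * n
    for _ in range(iterations):
        x = [x[v] + sum(adj[v][t] * x[t] for t in range(n)) for v in range(n)]
        top = max(x)
        x = [xi / top for xi in x]
    return x

# Path graph 0-1-2: the middle vertex is the most central.
adj = [[0, 1, 0],
       [1, 0, 1],
       [0, 1, 0]]
centrality = eigenvector_centrality(adj)
```

For this path graph the limit is proportional to (1, √2, 1), so the end vertices converge to 1/√2 of the central vertex's score.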

It is a basis for optimization schedules that there is an asymmetrical velocity, mass and gravity of the given scope of systems. At various times in the computational history, particle optimization on the manifolds evolves at a faster [or slower] rate than before. Hence the given incremental and discrete rate of increase in valleys and peaks accelerates and stabilizes at a higher positive, null or negative value, resulting in extremal mechanics and nonlinear dynamics. An example can be demonstrated utilizing power faults and extremals on electrical circuits [3].

These problems of predicting the probability of choice of one object or particle of a set, for pairwise sets and in algorithms, have been demonstrated in Arrow’s Impossibility Theorem and in Algorithmic Information Theory [AIT], whence we can replace voter with global optimization particle and group with set:

  1. If every voter prefers alternative X over alternative Y, then the group prefers X over Y.

  2. If every voter’s preference between X and Y remains unchanged, then the group’s preference between X and Y will also remain unchanged (even if voters’ preferences between other pairs like X and Z, Y and Z, or Z and W change).

  3. There is no “dictator”: no single voter possesses the power to always determine the group’s preference [4].

Criterion 3 is important, from whence adaptive and efficient algorithms have space to be constructed as particles at run-time for a given Global Optimization Algorithm [GOA].
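The voter-to-particle substitution can be illustrated with the classic Condorcet cycle that motivates Arrow's result; a small sketch (the three preference orders are a standard textbook example, not data from this chapter):

```python
# Three "global optimization particles" (voters) ranking alternatives X, Y, Z.
ballots = [("X", "Y", "Z"), ("Y", "Z", "X"), ("Z", "X", "Y")]

def set_prefers(a, b, ballots):
    """Majority pairwise preference: does the set prefer a over b?"""
    wins = sum(1 for order in ballots if order.index(a) < order.index(b))
    return wins > len(ballots) / 2

# Pairwise majorities form a cycle X > Y > Z > X, so no consistent
# set-level ranking exists without a "dictator" particle.
cycle = [set_prefers("X", "Y", ballots),
         set_prefers("Y", "Z", ballots),
         set_prefers("Z", "X", ballots)]
```

All three pairwise majorities hold simultaneously, which is exactly the intransitivity that Criterion 3 leaves room for in a GOA's pairwise particle comparisons.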

4. Building the algorithm

In a praxeological theory [5] this is proposed because of the inherent general inaccuracy of specific problems, learning rubrics, and Macrodynamic properties of a given performance landscape, and ultimately the inefficiency of any algorithmic system, given isomorphic [atomistic or non-atomistic] qualities of rulebase, algorithmic structure, weights, and variables [6]. These in turn can be represented as information sets, materiel, work, and symbolic representation and/or power in specific qualia of the Historical Rule of Perpetuation of Information Inequalities, set to various scales and models.

Clerc has demonstrated a general Metaheuristic algorithm where, for $f: \mathbb{R}^n \rightarrow \mathbb{R}$, the goal is essentially to find $a$ with $f(a) \leq f(b)$ for all $b$ in the search-space. $S$ is the number of particles in the swarm, each having a specific position and velocity in the search-space:

for each particle i = 1, …, S do
    Initialize the particle’s position with a uniformly distributed random vector: xi ∼ U(blo, bup)
    Initialize the particle’s best known position to its initial position: pi ← xi
    if f(pi) < f(g) then
        update the swarm’s best known position: g ← pi
    Initialize the particle’s velocity: vi ∼ U(−|bup − blo|, |bup − blo|)
while a termination criterion is not met do:
    for each particle i = 1, …, S do
        for each dimension d = 1, …, n do
            Pick random numbers: rp, rg ∼ U(0, 1)
            Update the particle’s velocity: vi,d ← ω vi,d + φp rp (pi,d − xi,d) + φg rg (gd − xi,d)
        Update the particle’s position: xi ← xi + vi
        if f(xi) < f(pi) then
            Update the particle’s best known position: pi ← xi
            if f(pi) < f(g) then
                Update the swarm’s best known position: g ← pi [7] (E6)
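The pseudocode above translates directly into a runnable sketch (the inertia weight w and acceleration coefficients φp, φg below are common constriction-style defaults from the PSO literature, not values given in this chapter):

```python
import random

def pso(f, n_dim, blo, bup, S=30, iters=200, w=0.729, phi_p=1.49445, phi_g=1.49445):
    """Particle Swarm Optimization following the pseudocode above."""
    X = [[random.uniform(blo, bup) for _ in range(n_dim)] for _ in range(S)]
    V = [[random.uniform(-(bup - blo), bup - blo) for _ in range(n_dim)] for _ in range(S)]
    P = [x[:] for x in X]               # each particle's best known position
    g = min(P, key=f)[:]                # swarm's best known position
    for _ in range(iters):              # termination criterion: iteration budget
        for i in range(S):
            for d in range(n_dim):
                rp, rg = random.random(), random.random()
                V[i][d] = (w * V[i][d]
                           + phi_p * rp * (P[i][d] - X[i][d])
                           + phi_g * rg * (g[d] - X[i][d]))
                X[i][d] += V[i][d]
            if f(X[i]) < f(P[i]):
                P[i] = X[i][:]
                if f(P[i]) < f(g):
                    g = P[i][:]
    return g

# Minimize the 2-D sphere function; the swarm converges near the origin.
random.seed(1)
best = pso(lambda x: sum(xi * xi for xi in x), n_dim=2, blo=-5.0, bup=5.0)
```

The sphere function here is only a smoke test; any $f: \mathbb{R}^n \rightarrow \mathbb{R}$, such as a power-flow cost surface, can be substituted.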

5. Results and discussion

5.1 Complex adaptive evolutionary system: thermodynamic landscape

For the purpose of evaluating algorithmic fitness on the given landscape the following ansatz can be utilized for Particle Swarm Optimization [PSO] for a decision tree:

$$\text{For } f: M^{n} \rightarrow {}^{\prime}M, \quad \text{where } {}^{\prime}M = KH_{t}\, h\, x_{iE}, \tag{E7}$$

where

$M$ = Manifold

${}^{\prime}M$ = Algorithmic Landscape/Manifold

$KH$ = potentiated Knowledge History of the Algorithm

$t$ = tradition or procedural process structure [word, string, grammar, memory mapping, rulebase, database] of the algorithm

$h$ = computational multiplicatives and inequalities of the $x_{iE}$

$x_{iE}$ = Adaptive Landscape History of the [Numerical] Object(s) or Particle(s) on the Network.

Løvbjerg and Krink demonstrate the thermodynamic variables on the Particle Swarm Optimization [PSO]:

$$v_i = \chi\left[w v_i + \varphi_{1i}\left(p_i - x_i\right) + \varphi_{2i}\left(p_g - x_i\right)\right] \tag{E8}$$

where χ is the constriction coefficient.

Here the Three General Rules of Macrodynamics [3GRM], as Macrodynamic Automata Rules [MAR], are applied to Information Natural Dynamics [IND] criteria:

Natural Sets, Natural Kinds, Natural Procedures, Natural Strings, Natural Radicals, Natural Binaries, Natural Radices, in Complex Adaptive Evolutionary System [CAES]-Multiagent System [MAS] for 4D model variables.

These variables should each contain criteria:

Time-Complexity, Particle-Value, Particle-Weighting in Fuzzy set theory, Gravity of System, and Nanodynamics of System variables [TC-PV-PW-GS-NDS],

set approximate to

$$f: \Omega \subseteq \mathbb{R}^{n} \rightarrow \mathbb{R}, \tag{E9}$$

with global minimum $f^{*}$ and the set of all global minimizers $X^{*}$ in $\Omega$, to find the minimum best set in the function series of $x$,

for system conditions, system boundaries, number and density of particles in the total Information Natural Dynamics [IND] of the Global Optimization Algorithm [GOA]. These are applied to the algorithmic manifold for the candidate solution on the given search spaces. It can be argued that, given the extremes of information disequilibrium applied to macrodynamic disequilibrium models, extremals of various degrees of power are inevitably generated in the incremental Information Dynamics.

5.2 Complex adaptive evolutionary system: weighting

These differentiable functions can be further defined, cf. dense heterarchy in Complex Systems Algorithms of coupled oscillators, where in general formula

$$\frac{dx}{dt} = P(t) + \mu Q(t) - \mu x + f(t). \tag{E10}$$

Here, in a differential equation, we can demonstrate

$$u^{(n)} = f\left(t, u, u^{\prime}, \ldots, u^{(n-1)}\right), \quad n \geq 2. \tag{E11}$$

These can be demonstrated in Particle Swarm Optimization [PSO], and Macrodynamic models of Meta-optimization of Particle Swarm Optimization [PSO] [7], c.f.

$$v_i(t+1) = w \cdot v_i(t) + n_1 r_1 \left(p_i - x_i(t)\right) + n_2 r_2 \left(p_{best} - x_i(t)\right) \tag{E12}$$

for each set of given epoch or evolutionary landscape scenario prediction in analytical and expectation weighting parameter formula algorithm optimization [Meissner, et al., ibid].

5.3 Complex adaptive evolutionary system: thermodynamics

Regarding bounding definitions, Chaitin demonstrated in Algorithmic Information Theory [AIT] algorithmic decomposition given Boltzmann-Shannon entropy, where in general formula

$$H(X) = \sum_i P(x_i)\, I(x_i) = -\sum_i P(x_i) \log_b P(x_i),$$

and

$$\lim_{p \to 0^{+}} p \log p = 0.$$
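Both the entropy sum and the 0·log 0 convention above can be checked numerically; a minimal sketch:

```python
from math import log

def shannon_entropy(probs, base=2):
    """H(X) = -sum_i P(x_i) log_b P(x_i); terms with p = 0 contribute 0,
    implementing the convention lim_{p->0+} p log p = 0."""
    return -sum(p * log(p, base) for p in probs if p > 0)

# A fair coin carries exactly one bit; a certain outcome carries none.
h_fair = shannon_entropy([0.5, 0.5])
h_sure = shannon_entropy([1.0, 0.0])
```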

For Information Natural Dynamics [IND] pairwise comparison, the genesis and stigmergetic evolutionary dynamics of the A-group agent $H_A$, cf. for

$$f: \mathbb{R}^{n} \rightarrow \mathbb{R}, \quad \text{for } x_i \in \mathbb{R}^{n} \text{ and } v_i \in \mathbb{R}^{n}, \tag{E13}$$

in Particle Swarm Optimization [PSO].

The variables include function space, gradient, vector and weighting and additionally the stigmergy of the given Macrosystem and subsystem autonomics such as in a Pairwise Breakout Model [PBM].

In associated Fuzzy set logic, to determine power extremals and singleton mechanics in atomistic Natural Dynamic System [NDS] operations, border-pairs or extremals are often utilized for multi-pair Multilayer Perceptron Learning Classification Algorithms [MLP-LCM] [7].

Examples of particle Monadicity in Algorithmic Information Theory [AIT], whence the formula

$$F: (L_A)^{N} \rightarrow L_A, \quad \text{where the } n\text{-tuple } (R_1, \ldots, R_N) \in (L_A)^{N}, \tag{E14}$$

indicates the possibilities and types of information physical mechanics for possible variables of the landscape extremals as particles within the min-max parameters [8]. A particle-discrete control function of the node degrees on the evolutionary landscape can therefore be defined where, essentially,

$$f_{t_i}: P_k^{n} \rightarrow K_j \int_{0}^{t_k} t\, K_p \sum_j \frac{d_j(t)}{dt}. \tag{E15}$$

To quantify, Primary extrema of n-arity or n-Adicity [Jonah Lissner] are therefore defined by the Author [Jonah Lissner] for alternatives of pair choices [as monoidal algorithmic circuits in the Complex Adaptive Evolutionary System [CAES]], which can be fractional off the prime polynomial root modulos, from the initial power conditions, and therefore generate the discrete information inequalities. These can be demonstrated in Henselian numbers and, secondly, in derivable fractional functions inherent in any given complex topos of a complex adaptive evolutionary system [9].

Nagata defined it thusly: A local ring $R$ with maximal ideal $m$ is called Henselian if Hensel’s lemma holds. This means that if $P$ is a monic polynomial in $R[x]$, then any factorization of its image $\bar{P}$ in $(R/m)[x]$ into a product of coprime monic polynomials can be lifted to a factorization in $R[x]$ [10].
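Hensel's lemma as quoted can be exercised computationally by lifting a simple root, a special case of the factorization statement; a sketch in Python (the polynomial x² − 2 over the 7-adic integers is a hypothetical example, not one used in this chapter):

```python
def hensel_lift(f, df, root, p, k):
    """Lift a simple root of f mod p (df(root) a unit mod p) to a root mod p^k
    using the Newton step r <- r - f(r) * df(r)^{-1}, doubling precision each pass."""
    m, r = 1, root % p
    while m < k:
        m = min(2 * m, k)
        mod = p ** m
        inv = pow(df(r) % mod, -1, mod)   # modular inverse (Python 3.8+)
        r = (r - f(r) * inv) % mod
    return r

# 3 is a square root of 2 mod 7; lift it to a square root of 2 mod 7^4 = 2401.
r = hensel_lift(lambda x: x * x - 2, lambda x: 2 * x, 3, 7, 4)
```

Each pass doubles the p-adic precision, mirroring how the lemma lifts a factorization from $(R/m)[x]$ back into $R[x]$.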

5.4 Complex adaptive evolutionary system: networks

The network of circuits then forms the basis for Complex Network Systems [CNS], from simple networks with $L \sim \log N$ to adaptive complex or dynamic systems and increasingly complex or quantum probability mechanics.

Scale-free or Barabási–Albert models are utilized to advance the mechanics and hypotheses for Complex Network Systems [CNS], e.g. for

$$P(k) \sim k^{-3}$$

and

$$p_i = \frac{k_i}{\sum_j k_j},$$

which exist for given node degree correlations and links at network computations, for a generic clustering law of

$$C(k) \sim k^{-1}. \tag{E16}$$

It can be determined in scale-free network nodes that

$$L \sim \log \log N \quad \text{for } P(k) \sim k^{-\gamma},$$

where

$$s(G) = \sum_{(u,v) \in E} \deg(u) \cdot \deg(v),$$

and dynamically

$$P(k) \sim k^{-\gamma} \quad \text{with } \gamma = 1 + \mu a,$$

where

$$p(x_i, x_j) = \frac{\delta(x_i, x_j)}{1 + \delta(x_i, x_j)}.$$
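The preferential-attachment probability $p_i = k_i / \sum_j k_j$ can be simulated to reproduce the heavy-tailed degree law; a minimal growth sketch (network size, attachment count m, and the two-node seed graph are hypothetical choices, not from this chapter):

```python
import random

def barabasi_albert(n, m=2, seed=0):
    """Grow a network node by node; each new node attaches to m distinct existing
    nodes chosen with probability proportional to their current degree."""
    random.seed(seed)
    edges = [(0, 1)]
    stubs = [0, 1]                        # each node appears once per unit of degree
    for new in range(2, n):
        targets = set()
        while len(targets) < min(m, new):
            targets.add(random.choice(stubs))   # degree-proportional choice
        for t in targets:
            edges.append((new, t))
            stubs += [new, t]
    return edges

edges = barabasi_albert(2000)
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1
# Hubs emerge: the maximum degree far exceeds the mean, the scale-free signature.
```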

6. Conclusion

These Metaheuristics for Global Optimization Algorithms [GOA] serve the purpose of achieving theoretical completion between two or more nodes on the network landscape, and ultimately the given requirements of the applied electrical grid. This theory can be utilized to derive, add, multiply, subtract, or divide units designated as necessary to accurately define the parameters for control of the electrical grid, and for control of network extremals.

Some theoretical requirements for Power System applications and machine learning algorithm libraries, for solving heuristic challenges of power requirements and control on manifolds, have been demonstrated:

  1. In the definition for innovation in Global Optimization Algorithms [GOA] for Machine Learning in Power Systems the Path-decision or Algorithm is the activated object [ontesis] and Algorithmic network is the kind or type of systemic algorithmic operation of object-getting and technology-building [telesis], due to computational physical plasticity conditions and relevant criteria.

  2. Furthermore the network theory meaning of Path-decision or Algorithm and the computational landscape itself as a network, can be defined discretely in terms of multiple avenues and nodes for algorithms of Boolean systems, [e.g. st-connectivity] whence it has progressed in weight, mass and velocity of the defined Ontology.

  3. Path-decision or Algorithm programmes in Computational Sciences, by modules of Alphanumeric Symbols/Characters as Power Systems of Algorithms [PSA]-Information Natural Dynamics [IND] or, in a macrodynamic method, Systems of Utilization, may develop for the purpose of heuristic advancements based on computational physical references, functions and operations on specific topologies, treated as given Computational sequences.

It is proposed from the 3 Rules of Information Physics [3IP] and The 3 General Rules of Macrodynamics [3GRM] for Unprovable Ideals, Cardinals or Delimitations of Optimization, from origins in The Rule of Perpetuation of Information Inequalities of Primaries. These criteria are the basis for utilizing previous methodologies of reasoning for contemporary and future new evolutionary algorithmic landscapes in the accretive methods.

This Metatheory develops theoretical agreement for the computational physical basis for a General Global Optimization Field Theory [GGOFT], given the algorithmic requirements of minima and maxima of a set of functions for a given computational surface, to determine roots, stationary and turning points, points of inflection, convexity, and concavity for atomistic qualities of evolutionary landscape extremals and their subsequent geometric values and derivations [11].

Therefore in this dialectic, the Onts or Particles in the Complex Adaptive Evolutionary System [CAES] and Dynamic Global Workspace Theory-Intelligent Computational System Organization [DGWT-ICSO] can be understood as network gateways in conjunction with nonlinear surfaces, described by Epistemes, or Semantical value for given Formulae, Algorithms, Landscape. Their purpose is to build and attempt to game-solve more complex, efficient, workable algorithmic structures for the machine learning algorithm challenges to incremental Global Optimization Algorithm [GOA] regimes [12].

© 2021 The Author(s). Licensee IntechOpen. This chapter is distributed under the terms of the Creative Commons Attribution 3.0 License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

How to cite and reference

Jonah Lissner (March 8th 2021). Atomistic Mathematical Theory for Metaheuristic Structures of Global Optimization Algorithms in Evolutionary Machine Learning for Power Systems. In: Computational Optimization Techniques and Applications, Muhammad Sarfraz and Samsul Ariffin Abdul Karim (Eds.), IntechOpen. DOI: 10.5772/intechopen.96516.

