Optimization, Big Data, and AI/ML
A section of Fractal and Fractional (ISSN 2504-3110).
Section Information
“Optimization, Big Data, and AI/ML” is a Section of the journal Fractal and Fractional, which publishes advanced theoretical studies and practical applications on topics related to optimization, machine learning, and big data that involve the use of fractional calculus (FC), the concept of fractals, and general fractional-order thinking (FOT).
Fractional-order thinking (FOT) consists of solving current complex problems in the physical, social, and life sciences using the tools of fractional calculus. Very soon after Mandelbrot introduced the fractal paradigm into the scientific lexicon, it was shown that integer-order calculus (IOC) could not describe the dynamics of fractal processes. A new kind of calculus was required to construct the equations of motion for fractal dynamic processes, which turned out to be fractional-order Hamiltonian equations (FOHEs). Rejecting fractional calculus is equivalent to saying that there are no numbers between neighboring integers.
Optimization and machine learning are intimately related. Due to the fractal nature of the error landscape in optimization and machine learning processes, how to achieve better performance in these processes is not fully understood. Recent initial studies revealed that fractional calculus might help. For example, a fractional-order gradient can be used to improve optimization performance, fractional-order stochasticity can be leveraged to achieve better exploration in various random search processes, and so on. These fractional calculus-empowered methods can likewise be used in machine learning. It is quite natural to expect a “better than the best” performance in optimization and machine learning. As for big data, we now know that they are complex signals from a complex system and thus demand fractional calculus-based tools for a “better than the best” quantification of the variabilities in the data. Fractional-order data analytics is a collection of metrics empowered by fractional calculus ideas such as heavy-tailedness, algebraic tails, fractional lower-order statistics, the fractional Fourier transform (FrFT), etc.
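To make the fractional-order gradient idea concrete, the following is a minimal sketch of fractional-order gradient descent in one dimension. It uses a common first-order approximation of the Caputo fractional derivative, D^α f(x) ≈ f′(x)·|x − c|^(1−α)/Γ(2 − α), where c is a fixed lower terminal; the function names, parameter choices, and test problem are illustrative assumptions, not a prescribed method from this Section.

```python
import math

def fractional_gradient_descent(grad, x0, alpha=0.9, lr=0.1, c=0.0, steps=100):
    """Sketch of fractional-order gradient descent (1-D).

    Replaces the ordinary gradient with a first-order Caputo-type
    approximation:  D^alpha f(x) ~ f'(x) * |x - c|**(1 - alpha) / Gamma(2 - alpha).
    Setting alpha = 1 recovers classical gradient descent.
    """
    coef = 1.0 / math.gamma(2.0 - alpha)  # 1 / Gamma(2 - alpha)
    x = x0
    for _ in range(steps):
        frac_grad = grad(x) * coef * abs(x - c) ** (1.0 - alpha)
        x -= lr * frac_grad
    return x

# Toy problem: minimize f(x) = (x - 3)^2, whose gradient is 2 (x - 3).
x_star = fractional_gradient_descent(lambda x: 2.0 * (x - 3.0), x0=10.0)
```

For 0 < α < 1, the factor |x − c|^(1−α) scales the step by the distance to the terminal c, which is one of the mechanisms studied for accelerating or regularizing convergence relative to the integer-order case.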
The primary aim of the “Optimization, Big Data, and AI/ML” Section is the publication and dissemination of relevant new developments in theoretical studies, numerical treatments, and practical applications with reproducible results. Almost all fields of engineering, physics, economy, mathematics, and other disciplines dealing with optimization, machine learning, and big data are within the scope of this Section.
Keywords
- adaptive and learning algorithms;
- algebraic tail index;
- artificial neural networks;
- big data analytics;
- complex time series;
- complexity matching;
- complexity synchronization;
- complexity-informed machine learning;
- deep learning;
- digital twin behavior-matching algorithms;
- emergence;
- exploration and exploitation;
- fractal calculus or Hausdorff calculus;
- fractals and multifractals;
- fractional calculus of a constant order, multi-order, variable order, or distributed order;
- fractional entropy;
- fractional lower-order moments;
- fractional lower-order statistics;
- fractional-order data analytics;
- fractional-order gradient;
- fractional-order modeling;
- fractional-order random processes;
- fractional-order randomness;
- fractional-order stochasticity;
- heavy-tailedness;
- Hurst parameters;
- information metrics;
- long memory;
- long-range correlations, long-range interactions, and long-range dependence;
- Nabla fractional calculus;
- neural networks;
- non-extensive statistics;
- optimal control;
- optimal machine learning;
- optimization methods;
- outliers;
- random walks;
- reinforcement learning;
- self-optimizing control;
- supervised learning;
- variability quantification
Editorial Board
Special Issue
The following special issue within this Section is currently open for submissions:
- Fractional-Order Learning Systems: Theory, Algorithms, and Emerging Applications (Deadline: 20 January 2025)