**1. Introduction**

Reliable bounding of univariate functions is one of the primary techniques in global optimization, i.e., in finding a solution to the following problem:

$$f(x) \to \min, \ x \in [a, b]. \tag{1}$$

The problem (1) has many practical applications [1–6]. Besides solving problems of one variable, univariate search serves as an auxiliary method in multivariate global optimization. A promising optimization technique known as space-filling curves reduces an optimization [7,8] or approximation [9] problem of multiple variables to a sequence of univariate problems. Univariate optimization techniques are widely used in separable programming [10], where an objective and constraints are sums of functions of one variable. Many univariate optimization methods are directly extended to the multivariate case [11,12].

Univariate global optimization has been intensively studied over the last decades. The first results date back to the early 1970s. Seminal works in this area [13–16] relied on the Lipschitzian property of a function:

$$|f(x) - f(y)| \le L|x - y|, \text{ for any } x, y \in [a, b]. \tag{2}$$

In [14,15], the "saw-tooth cover" lower and upper bounding functions for Lipschitzian objectives were proposed. The lower (upper) bounding function is defined as $\max_{i \in 1,\dots,n} \left( f(x_i) - L|x - x_i| \right)$ (respectively, $\min_{i \in 1,\dots,n} \left( f(x_i) + L|x - x_i| \right)$), where $L$ is a Lipschitz constant and $\{x_1, \dots, x_n\}$ is a set of function evaluation points. Since these functions are piecewise linear, their range can be easily computed. This makes such estimates attractive for bounding an objective from below and/or above. Other approaches exploiting property (2) were studied in numerous papers [17–19]. In papers [20–22], the Lipschitzian first derivatives were used to facilitate the global search.
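
To make the saw-tooth construction concrete, the following minimal Python sketch evaluates these piecewise-linear bounds at a point. The objective ($\sin x$), the Lipschitz constant $L = 1$, and the evaluation points are illustrative assumptions, not taken from the cited works.

```python
import numpy as np

def saw_tooth_bounds(f, pts, L, x):
    """Piecewise-linear lower and upper bounds of a Lipschitz function f at the point x."""
    pts = np.asarray(pts)
    vals = np.array([f(p) for p in pts])
    lower = np.max(vals - L * np.abs(x - pts))  # max_i (f(x_i) - L*|x - x_i|)
    upper = np.min(vals + L * np.abs(x - pts))  # min_i (f(x_i) + L*|x - x_i|)
    return lower, upper

# Illustrative setup: f(x) = sin(x) on [0, 4], with L = 1 since |cos(x)| <= 1.
pts = [0.0, 2.0, 4.0]
for x in np.linspace(0.0, 4.0, 5):
    lo, hi = saw_tooth_bounds(np.sin, pts, 1.0, x)
    assert lo <= np.sin(x) <= hi
    print(f"x = {x:.1f}:  {lo:+.3f} <= f(x) = {np.sin(x):+.3f} <= {hi:+.3f}")
```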


Good surveys on Lipschitzian optimization can be found in [7,23–25].

Interval analysis [26,27] is another powerful technique for global optimization. The goal of interval analysis is to find the tightest enclosing interval for the range of a function. The left end of the enclosing interval provides a lower bound for the function over an interval, which can be used to reduce the search space in global optimization methods. Most approaches are based on interval arithmetic; more advanced techniques rely on interval Taylor expansions [26,28]. Promising approaches combining Lipschitzian optimization and interval analysis ideas were proposed in [29]. Efficient optimization algorithms based on piecewise linear [30,31], piecewise convex [32], and slope [33] techniques, as well as DC-decomposition [34,35], should also be mentioned.
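
As a brief illustration of the interval-arithmetic idea, the toy sketch below evaluates a natural interval extension of $f(x) = x - x^2$ over $[0, 1]$. The `Interval` class is a simplified stand-in for a real interval library (no outward rounding); it only shows how an enclosure is obtained and why it may overestimate the true range.

```python
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __sub__(self, other):
        return Interval(self.lo - other.hi, self.hi - other.lo)

    def __mul__(self, other):
        p = (self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi)
        return Interval(min(p), max(p))

def f_natural(x: Interval) -> Interval:
    """Natural interval extension of f(x) = x - x^2: each occurrence of x becomes the interval."""
    return x - x * x

print(f_natural(Interval(0.0, 1.0)))
# Interval(lo=-1.0, hi=1.0): a valid enclosure, but much wider than the
# exact range of x - x^2 over [0, 1], which is [0, 0.25].
```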

The approaches outlined above apply various methods to obtain bounds on the range of a function. However, they do not analyze the convexity of the objective function. Meanwhile, convexity plays an essential role in global optimization. If the objective is proven to be convex, then efficient local search techniques can be applied to locate its minimum. For example, the univariate convexification technique developed in [36] even sacrifices the dimensionality of the problem to gain convexity.

The convexity test [26] helps to reduce the search region by pruning areas where a function is proved to be nonconvex. Usually, the convexity is checked by analyzing the range of the second derivative. If this range lies above (below) zero, the function is convex (concave). This approach works only for functions with continuous second derivatives.
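
The small sketch below illustrates this classical second-derivative test. The function $f(x) = x^4 - 3x^2$ and the hand-coded enclosure of $f''(x) = 12x^2 - 6$ are illustrative assumptions; in practice the enclosure would be produced by interval arithmetic or automatic differentiation.

```python
def second_derivative_enclosure(a: float, b: float):
    """Interval enclosure of f''(x) = 12*x**2 - 6 for f(x) = x**4 - 3*x**2 on [a, b]."""
    # x**2 over [a, b] attains its minimum at 0 if 0 lies inside, otherwise at an endpoint.
    lo_sq = 0.0 if a <= 0.0 <= b else min(a * a, b * b)
    hi_sq = max(a * a, b * b)
    return 12.0 * lo_sq - 6.0, 12.0 * hi_sq - 6.0

def convexity_test(a: float, b: float) -> str:
    lo, hi = second_derivative_enclosure(a, b)
    if lo >= 0.0:
        return "convex"
    if hi <= 0.0:
        return "concave"
    return "unknown"

print(convexity_test(1.0, 2.0))    # convex:  f'' ranges over [6, 42]
print(convexity_test(-0.5, 0.5))   # concave: f'' ranges over [-6, -3]
print(convexity_test(0.0, 1.0))    # unknown: the enclosure [-6, 6] straddles zero
```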

Checking convexity is, in general, an NP-hard problem (see [37] and references therein). Approaches based on the symbolic proof and the numerical disproof of convexity are described in [38]. In the context of convexity checking, it is necessary to mention disciplined convex programming [39,40], which also relies on a set of rules for proving the convexity of the problem under consideration. However, the authors limit their techniques to proving the convexity of the entire mathematical programming problem for the subsequent use of convex programming methods. As we show below, monotonicity, convexity, and concavity properties can also remarkably improve the accuracy of interval bounds when applied to subexpressions of the function's algebraic representation.
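
A simple numerical illustration of this effect (not the deduction rules themselves, which are the subject of Section 2): suppose the subexpression $g(x) = e^x - x$ is known to be non-decreasing on $[1, 2]$, since $g'(x) = e^x - 1 > 0$ there. Then its exact range is attained at the endpoints and is strictly tighter than the naive interval extension.

```python
import math

a, b = 1.0, 2.0

# Naive interval extension of g(x) = exp(x) - x:
# exp([a, b]) - [a, b] = [exp(a) - b, exp(b) - a].
naive = (math.exp(a) - b, math.exp(b) - a)

# g is increasing on [1, 2], so its exact range is [g(a), g(b)].
exact = (math.exp(a) - a, math.exp(b) - b)

print(f"naive extension : [{naive[0]:.3f}, {naive[1]:.3f}]")  # [0.718, 6.389]
print(f"monotone bound  : [{exact[0]:.3f}, {exact[1]:.3f}]")  # [1.718, 5.389]
```

Any enclosing expression that uses $g$ as a subexpression then inherits this tighter bound.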

The main contribution of our paper is a novel technique for bounding a function's range that accounts for the monotonicity, convexity, or concavity of subexpressions of its algebraic expression. This approach efficiently restricts the objective function's range even if the latter is neither convex nor concave. We show experimentally that the introduced techniques can significantly tighten the bounds on the function's range and remarkably enhance conventional interval global search procedures. A set of rules for deducing the monotonicity, concavity, and convexity of a univariate function from its algebraic expression is clearly and concisely formulated and proved. These rules complement the traditional ways of establishing the properties of the objective function based on evaluating the ranges of its derivatives.

Notation:

R — the set of real numbers;

Z — the set of integers;

N — the set of positive integers (natural numbers);

IR — the set of all intervals in R;

$\mathbf{x} = [\underline{x}, \overline{x}]$ — intervals are denoted with bold font;

$R_f([a, b]) = \{y \in R : y = f(x) \text{ for some } x \in [a, b]\}$ — the range of a function $f : R \to R$ over the interval $[a, b]$;

$\mathbf{f}$ — an interval extension of a function $f : R \to R$, i.e., a mapping $\mathbf{f} : IR \to IR$ such that $R_f([a, b]) \subseteq \mathbf{f}([a, b])$ for any $[a, b] \in IR$; notice that there may be many different interval extensions of a function $f(x)$;

$f(x)\uparrow$ — $f(x)$ is non-decreasing (monotonic) on $R$, or on an interval if additionally specified;

$f(x)\downarrow$ — $f(x)$ is non-increasing (monotonic) on $R$, or on an interval if additionally specified.

By elementary functions we mean commonly used mathematical functions, i.e., power, exponential, logarithmic, and trigonometric functions. We distinguish smooth elementary functions, which have derivatives of any order in their domain of definition, from nonsmooth functions, which are nondifferentiable at some points. The list of elementary functions supported by our method is given in Table 1. Notice that other elementary functions can be expressed as algebraic expressions over the functions listed in the table and are thus omitted. We consider only univariate functions in what follows and therefore do not repeat this assumption in each statement. We restrict our study to the case of continuous functions.

**Table 1.** Supported elementary functions.


The paper is organized as follows. Section 2 describes the deduction techniques for automatically establishing the convexity/concavity of a function. Then, in Section 3, it is shown how this technique is used to bound the range of a function. Section 4 contains the experimental results demonstrating the proposed approach's efficiency. Section 5 concludes the paper and discusses possible future research directions.
