Article

Study on Convex Resource Allocation Scheduling with a Time-Dependent Learning Effect

School of Science, Shenyang Aerospace University, Shenyang 110136, China
* Author to whom correspondence should be addressed.
Mathematics 2023, 11(14), 3179; https://doi.org/10.3390/math11143179
Submission received: 14 June 2023 / Revised: 6 July 2023 / Accepted: 17 July 2023 / Published: 20 July 2023
(This article belongs to the Special Issue Systems Engineering, Control, and Automation)

Abstract

In classical scheduling problems, the actual processing time of a job is a fixed constant; in real production processes, however, the processing time of a job is affected by a variety of factors, two of which are the learning effect and resource allocation. In this paper, single-machine scheduling problems with resource allocation and a time-dependent learning effect are investigated. The actual processing time of a job depends on the sum of the normal processing times of the previously processed jobs and on the amount of a non-renewable resource allocated to it. Under a convex resource consumption function, the goal is to determine the optimal schedule and the optimal resource allocation. Three problems arising from two criteria (i.e., the total resource consumption cost and the scheduling cost) are studied. For some special cases of these problems, we prove that they can be solved in polynomial time. For the general case, we propose exact, heuristic, and intelligent algorithms.

1. Introduction

In many real-world industrial processes, job (task) processing times may be variable due to learning effects or resource allocation. Learning effects appear in, for example, the way that workers’ repeated processing of similar jobs improves their skills (see Azzouz et al. [1], Sun et al. [2], Zhao [3], Wang et al. [4], Chen et al. [5], Ren et al. [6], Wang et al. [7]). The processing times of jobs can be controlled by allocating a common limited resource, such as fuel, the financial budget, energy, or manpower (see Guan et al. [8], Wang and Cheng [9], Shabtay and Steiner [10], Zhang et al. [11], Wang et al. [12], Wang et al. [13], Liu and Wang [14]).
In addition, in many real-life situations, the simultaneous occurrence of learning effects and resource allocation can be found; e.g., in the chemical industry. Recently, Lu et al. [15] explored single-machine scheduling with learning effects, group technology, and resource allocation. The objective was to minimize the makespan subject to limited resource availability. To solve the problem, the authors proposed heuristic and branch-and-bound algorithms. Wang et al. [16] studied the resource allocation scheduling problem with learning and deterioration effects with a single machine. For linear resource allocation, they showed that some regular objective function minimizations can be solved in polynomial time. Liu and Jiang [17] considered due date assignment scheduling problems with learning effects and resource allocation. Zhao [18] addressed due window assignment flow shop scheduling problems with learning effects and resource allocation. Wang et al. [19] considered single-machine resource allocation scheduling with truncated learning effects. For the scheduling cost (i.e., the total weighted completion time) and total resource consumption cost, they provided a bicriteria analysis. They proved that some special cases of the problem are solvable in polynomial time. To solve the problem more generally, they proposed a heuristic and a branch-and-bound algorithm. Yan et al. [20] studied single-machine group scheduling with resource allocation and learning effects. For the minimization of the total completion time subject to limited resource availability, they proposed heuristic, tabu search, and branch-and-bound algorithms.
Biskup [21] considered the position-dependent learning effect, under which the actual processing time of job $\dot{J}_j$ scheduled in position $r$ is $p_{jr}^{A}=\bar{p}_jr^{\alpha}$, where $\bar{p}_j$ is the normal processing time of job $\dot{J}_j$ and $\alpha\leq 0$ is the learning factor. Kuo and Yang [22] studied the time-dependent learning effect, i.e., $p_{jr}^{A}=\bar{p}_j\left(1+\sum_{h=1}^{r-1}\bar{p}_{[h]}\right)^{\alpha}$, where $\alpha\leq 0$ is the learning factor and $[h]$ denotes the job scheduled in the $h$th position. Wang et al. [23] studied the following model: $p_{jr}^{A}(\tilde{u}_j)=\left(\frac{\bar{p}_jr^{a}}{\tilde{u}_j}\right)^{\beta}$, where $\beta>0$ is a given constant and $\tilde{u}_j$ is the amount of resource allocated to job $\dot{J}_j$. The above articles considered resource allocation scheduling with position-dependent learning effects. In general, however, there are two approaches to modeling the learning effect: the position-dependent learning effect and the time-dependent (sum-of-processing-times-based) learning effect (Azzouz et al. [1]). Hence, in this paper, the work on resource allocation scheduling is continued by studying the time-dependent learning effect (see Table 1). This paper’s contributions and novelties are as follows:
  • Single-machine scheduling with convex resource allocation and a time-dependent learning effect is modeled and studied;
  • The solution algorithms for three versions of the total resource consumption cost and the scheduling cost are presented;
  • The computational results of the proposed algorithms are analyzed.
The paper is organized as follows. Section 2 formulates the model. Section 3 gives the basic properties of the problems. Section 4 describes the solution algorithms developed to solve the problems. Section 5 focuses on the computational experiments with the algorithms. Section 6 presents the conclusions.

2. Problem Statement

In this paper, the notations used are listed in Table 2, and we consider the problem of scheduling $\check{n}$ jobs $\dot{J}_1,\dot{J}_2,\ldots,\dot{J}_{\check{n}}$ on a single machine, with all jobs available at time 0. If the job schedule is $(\dot{J}_1,\dot{J}_2,\dot{J}_3,\ldots,\dot{J}_{\check{n}})$, then the Gantt chart of the single-machine scheduling is as in Figure 1:
In this paper, the model we consider is given as follows:
$p_{jr}^{A}(\tilde{u}_j)=\left(\frac{\bar{p}_j\left(1+\sum_{h=1}^{r-1}\bar{p}_{[h]}\right)^{\alpha}}{\tilde{u}_j}\right)^{\beta},$ (1)
where $\alpha\leq 0$ is the learning factor and $\beta>0$ is a given constant. The goal of this article is to determine the optimal schedule and the optimal resource allocation. The first problem of this paper is to minimize
$F(\tilde{u}_{[j]})=\delta\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}+\eta\sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]}.$ (2)
Using the three-field notation, the first problem (denoted by $\bar{P}_1$) can be written as
$1\left|p_{jr}^{A}(\tilde{u}_j)=\left(\frac{\bar{p}_j\left(1+\sum_{h=1}^{r-1}\bar{p}_{[h]}\right)^{\alpha}}{\tilde{u}_j}\right)^{\beta}\right|\delta\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}+\eta\sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]}.$ (3)
The second problem is to minimize $\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}$ subject to the constraint that the total resource consumption cost cannot exceed an upper bound, i.e., $\sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]}\leq\breve{U}$. This problem (denoted by $\bar{P}_2$) is
$1\left|p_{jr}^{A}(\tilde{u}_j)=\left(\frac{\bar{p}_j\left(1+\sum_{h=1}^{r-1}\bar{p}_{[h]}\right)^{\alpha}}{\tilde{u}_j}\right)^{\beta},\ \sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]}\leq\breve{U}\right|\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A},$ (4)
where $\breve{U}>0$ is an upper bound on $\sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]}$. The last problem is the complementary problem of $\bar{P}_2$, i.e., the third problem (denoted by $\bar{P}_3$) is
$1\left|p_{jr}^{A}(\tilde{u}_j)=\left(\frac{\bar{p}_j\left(1+\sum_{h=1}^{r-1}\bar{p}_{[h]}\right)^{\alpha}}{\tilde{u}_j}\right)^{\beta},\ \sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}\leq\breve{V}\right|\sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]},$ (5)
where $\breve{V}>0$ is an upper bound on $\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}$.
Obviously, for the makespan of all jobs, $C_{\max}=\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}$ with $\vartheta_j=1$; for the total completion time, $\sum_{j=1}^{\check{n}}C_j=\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}$ with $\vartheta_j=\check{n}-j+1$; for the total absolute differences in completion times, $TADC=\sum_{i=1}^{\check{n}}\sum_{j=i}^{\check{n}}|C_i-C_j|=\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}$ with $\vartheta_j=(j-1)(\check{n}-j+1)$ (see Kanet [24]); and for the total absolute differences in waiting times, $TADW=\sum_{i=1}^{\check{n}}\sum_{j=i}^{\check{n}}|W_i-W_j|=\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}$ with $\vartheta_j=j(\check{n}-j)$, where $W_j=C_j-p_j^{A}$ is the waiting time of job $\dot{J}_j$ (see Bagchi [25]).
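The reduction of all four criteria to the single weighted form $\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}$ is easy to operationalize. The following minimal sketch (our own illustration, not part of the paper; function names and data are hypothetical) returns the position weights $\vartheta_j$ for each criterion and evaluates the corresponding scheduling cost for actual processing times given in schedule order.

```python
# Position-dependent weights for the four classical scheduling costs of Section 2.
# Given the actual processing times in schedule order, each cost equals sum_j vartheta_j * p_[j]^A.

def position_weights(n, criterion):
    """Return the weights vartheta_1..vartheta_n (positions are 1-indexed) for a criterion."""
    if criterion == "Cmax":                     # makespan
        return [1 for j in range(1, n + 1)]
    if criterion == "total_completion_time":    # sum of C_j
        return [n - j + 1 for j in range(1, n + 1)]
    if criterion == "TADC":                     # total absolute differences in completion times
        return [(j - 1) * (n - j + 1) for j in range(1, n + 1)]
    if criterion == "TADW":                     # total absolute differences in waiting times
        return [j * (n - j) for j in range(1, n + 1)]
    raise ValueError(criterion)

def scheduling_cost(actual_times, criterion):
    """sum_j vartheta_j * p_[j]^A for actual processing times listed in schedule order."""
    w = position_weights(len(actual_times), criterion)
    return sum(wj * pj for wj, pj in zip(w, actual_times))

# Example: three jobs with actual processing times 2, 3, 4 in schedule order.
print(scheduling_cost([2, 3, 4], "total_completion_time"))  # 3*2 + 2*3 + 1*4 = 16
```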

3. Basic Properties

In this section, we derive some lemmas that characterize the optimal resource allocation for the three problems defined above.

3.1. Problem P 1 ¯

Lemma 1.
For the problem $\bar{P}_1$, the optimal resource allocation is a function of the job schedule, i.e.,
$\tilde{u}_{[j]}^{*}=\left(\frac{\delta\beta\vartheta_j}{\eta g_{[j]}}\right)^{\frac{1}{1+\beta}}\left[\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}\right]^{\frac{\beta}{1+\beta}}.$ (6)
Proof. 
For any fixed job schedule $\bar{\psi}=(\dot{J}_{[1]},\dot{J}_{[2]},\ldots,\dot{J}_{[\check{n}]})$, from Equations (1) and (2), we have
$F(\tilde{u}_{[j]})=\delta\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}+\eta\sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]}=\delta\sum_{j=1}^{\check{n}}\vartheta_j\left(\frac{\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}}{\tilde{u}_{[j]}}\right)^{\beta}+\eta\sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]}.$ (7)
Taking the derivative of Equation (7) with respect to $\tilde{u}_{[j]}$ and setting it equal to 0, we have
$\frac{\partial F(\tilde{u}_{[j]})}{\partial\tilde{u}_{[j]}}=-\delta\beta\vartheta_j\frac{\left[\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}\right]^{\beta}}{\tilde{u}_{[j]}^{\beta+1}}+\eta g_{[j]}=0.$ (8)
From Equation (8), we have
$\tilde{u}_{[j]}^{*}=\left(\frac{\delta\beta\vartheta_j}{\eta g_{[j]}}\right)^{\frac{1}{1+\beta}}\left[\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}\right]^{\frac{\beta}{1+\beta}},$
i.e., Equation (6) holds.    □
By substituting Equation (6) into Equation (7), we have
$F(\tilde{u}_{[j]}^{*})=\left(\beta^{-\frac{\beta}{\beta+1}}+\beta^{\frac{1}{\beta+1}}\right)\delta^{\frac{1}{\beta+1}}\eta^{\frac{\beta}{\beta+1}}\sum_{j=1}^{\check{n}}(\vartheta_j)^{\frac{1}{1+\beta}}\left[g_{[j]}\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}\right]^{\frac{\beta}{1+\beta}}.$ (9)
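As an illustration of Lemma 1, the sketch below (our own; all parameter values and names are hypothetical) computes the optimal resource allocation of Equation (6) for a fixed schedule and evaluates the cost of Equation (2) at that allocation; the result can be checked against the closed form of Equation (9).

```python
# Sketch of Lemma 1 for problem P1: optimal resource allocation under a fixed schedule.
# Jobs are listed in schedule order; all parameter values below are hypothetical.

def optimal_allocation_P1(p_bar, g, theta, alpha, beta, delta, eta):
    """Equation (6): u*_[j] for each scheduled position j."""
    u_star, cum = [], 0.0
    for pj, gj, thj in zip(p_bar, g, theta):
        A = pj * (1 + cum) ** alpha                       # p_[j] * (1 + sum of earlier normal times)^alpha
        u_star.append((delta * beta * thj / (eta * gj)) ** (1 / (1 + beta)) * A ** (beta / (1 + beta)))
        cum += pj
    return u_star

def cost_P1(p_bar, g, theta, u, alpha, beta, delta, eta):
    """Equation (2) evaluated with the actual processing times of Equation (1)."""
    total, cum = 0.0, 0.0
    for pj, gj, thj, uj in zip(p_bar, g, theta, u):
        pA = (pj * (1 + cum) ** alpha / uj) ** beta       # Equation (1)
        total += delta * thj * pA + eta * gj * uj
        cum += pj
    return total

p_bar, g, theta = [2.0, 3.0, 4.0], [1.0, 2.0, 1.0], [3.0, 2.0, 1.0]   # hypothetical instance
u = optimal_allocation_P1(p_bar, g, theta, alpha=-0.5, beta=1.0, delta=1.0, eta=1.0)
print(u, cost_P1(p_bar, g, theta, u, alpha=-0.5, beta=1.0, delta=1.0, eta=1.0))
```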

3.2. Problem P 2 ¯

Lemma 2.
For the problem $\bar{P}_2$, the optimal resource allocation is a function of the job schedule, i.e.,
$\tilde{u}_{[j]}^{*}=\frac{\left[\vartheta_j\left(\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}\right)^{\beta}\right]^{\frac{1}{\beta+1}}}{\left(g_{[j]}\right)^{\frac{1}{\beta+1}}\sum_{k=1}^{\check{n}}\left[\vartheta_k\left(g_{[k]}\bar{p}_{[k]}\left(1+\sum_{h=1}^{k-1}\bar{p}_{[h]}\right)^{\alpha}\right)^{\beta}\right]^{\frac{1}{\beta+1}}}\times\breve{U},\quad j=1,2,\ldots,\check{n}.$ (10)
Proof. 
For any fixed job schedule $\bar{\psi}=(\dot{J}_{[1]},\dot{J}_{[2]},\ldots,\dot{J}_{[\check{n}]})$, the Lagrangian function is
$\tilde{L}\left(\tilde{u}_{[j]},\tilde{\lambda}\right)=\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}+\tilde{\lambda}\left(\sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]}-\breve{U}\right)=\sum_{j=1}^{\check{n}}\vartheta_j\left(\frac{\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}}{\tilde{u}_{[j]}}\right)^{\beta}+\tilde{\lambda}\left(\sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]}-\breve{U}\right),$ (11)
where $\tilde{\lambda}\geq 0$ is the Lagrangian multiplier. Differentiating (11) with respect to $\tilde{u}_{[j]}$ and $\tilde{\lambda}$, we have
$\frac{\partial\tilde{L}\left(\tilde{u}_{[j]},\tilde{\lambda}\right)}{\partial\tilde{u}_{[j]}}=\tilde{\lambda}g_{[j]}-\beta\vartheta_j\frac{\left[\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}\right]^{\beta}}{\tilde{u}_{[j]}^{\beta+1}}=0$ (12)
and
$\frac{\partial\tilde{L}\left(\tilde{u}_{[j]},\tilde{\lambda}\right)}{\partial\tilde{\lambda}}=\sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]}-\breve{U}=0.$ (13)
From Equation (12), we have
$\tilde{u}_{[j]}=\left[\frac{\beta\vartheta_j\left(\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}\right)^{\beta}}{\tilde{\lambda}g_{[j]}}\right]^{\frac{1}{\beta+1}}.$ (14)
Substituting Equation (14) into Equation (13), we have
$\tilde{\lambda}^{\frac{1}{\beta+1}}=\frac{\sum_{j=1}^{\check{n}}\left(\beta\vartheta_j\right)^{\frac{1}{\beta+1}}\left[g_{[j]}\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}\right]^{\frac{\beta}{\beta+1}}}{\breve{U}}.$ (15)
From Equations (14) and (15), Equation (10) holds.    □
By substituting Equation (10) into $\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}=\sum_{j=1}^{\check{n}}\vartheta_j\left(\frac{\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}}{\tilde{u}_{[j]}}\right)^{\beta}$, we have
$\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}=\frac{\left[\sum_{j=1}^{\check{n}}(\vartheta_j)^{\frac{1}{1+\beta}}\left(g_{[j]}\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}\right)^{\frac{\beta}{1+\beta}}\right]^{\beta+1}}{\breve{U}^{\beta}}.$ (16)

3.3. Problem P 3 ¯

Lemma 3.
For the problem $\bar{P}_3$, the optimal resource allocation is
$\tilde{u}_{[j]}^{*}=\breve{V}^{-\frac{1}{\beta}}\,\frac{(\vartheta_j)^{\frac{1}{\beta+1}}\left[\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}\right]^{\frac{\beta}{\beta+1}}\left\{\sum_{k=1}^{\check{n}}(\vartheta_k)^{\frac{1}{\beta+1}}\left[g_{[k]}\bar{p}_{[k]}\left(1+\sum_{h=1}^{k-1}\bar{p}_{[h]}\right)^{\alpha}\right]^{\frac{\beta}{\beta+1}}\right\}^{\frac{1}{\beta}}}{(g_{[j]})^{\frac{1}{\beta+1}}},\quad j=1,2,\ldots,\check{n}.$ (17)
Proof. 
Similar to Lemma 2.    □
By substituting Equation (17) into $\sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]}$, we have
$\sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]}=\breve{V}^{-\frac{1}{\beta}}\left[\sum_{j=1}^{\check{n}}(\vartheta_j)^{\frac{1}{1+\beta}}\left(g_{[j]}\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}\right)^{\frac{\beta}{1+\beta}}\right]^{1+\frac{1}{\beta}}.$ (18)

4. Algorithms

Since $\beta$, $\delta$, $\eta$, $\breve{U}$, and $\breve{V}$ are given parameters, it follows from Equations (9), (16), and (18) that solving $\bar{P}_1$, $\bar{P}_2$, and $\bar{P}_3$ is equivalent to minimizing
$M=\sum_{j=1}^{\check{n}}(\vartheta_j)^{\frac{1}{1+\beta}}\left[g_{[j]}\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}\right]^{\frac{\beta}{1+\beta}}.$ (19)
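Since Equations (9), (16), and (18) are all monotone transformations of $M$, a single routine for Equation (19) suffices for all three problems. The sketch below (our own illustration; function names, data, and the parameter values are hypothetical) evaluates $M$ for a schedule and maps it to the three optimal objective values.

```python
# Sketch of Equation (19) and the closed forms of Equations (9), (16), and (18).
# Jobs are given in schedule order; all parameter values here are illustrative.

def objective_M(p_bar, g, theta, alpha, beta):
    """Equation (19): sum_j theta_j^(1/(1+beta)) * [g_[j] p_[j] (1 + sum of earlier p)^alpha]^(beta/(1+beta))."""
    total, cum = 0.0, 0.0
    for pj, gj, thj in zip(p_bar, g, theta):
        total += thj ** (1 / (1 + beta)) * (gj * pj * (1 + cum) ** alpha) ** (beta / (1 + beta))
        cum += pj
    return total

def optimal_costs(M, beta, delta, eta, U, V):
    """Optimal objective values implied by M for P1, P2, and P3."""
    F_P1 = (beta ** (-beta / (beta + 1)) + beta ** (1 / (beta + 1))) \
           * delta ** (1 / (beta + 1)) * eta ** (beta / (beta + 1)) * M        # Equation (9)
    sched_P2 = M ** (beta + 1) / U ** beta                                     # Equation (16)
    resource_P3 = V ** (-1 / beta) * M ** (1 + 1 / beta)                       # Equation (18)
    return F_P1, sched_P2, resource_P3

M = objective_M([2.0, 3.0, 4.0], [1.0, 1.0, 1.0], [1.0, 1.0, 1.0], alpha=-0.5, beta=1.0)
print(M, optimal_costs(M, beta=1.0, delta=1.0, eta=1.0, U=5.0, V=5.0))
```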
If $\vartheta_j=1$ and $g_j=1$ ($j=1,2,\ldots,\check{n}$), we show through the following examples that neither the SPT rule nor the LPT rule is guaranteed to find the optimal schedule for $\bar{P}_1$, $\bar{P}_2$, and $\bar{P}_3$.
Example 1.
Assume that $\check{n}=3$, $\alpha=-0.5$, $\beta=1$, and the normal processing times of the jobs are $\bar{p}_1=2$, $\bar{p}_2=3$, $\bar{p}_3=4$.
According to the SPT order, $M=4.0082$.
If the jobs are scheduled by the LPT rule, $M=3.9992$.
Therefore, SPT does not give an optimal schedule for the case $\vartheta_j=1$, $g_j=1$ ($j=1,2,\ldots,\check{n}$).
Example 2.
Assume that $\check{n}=3$, $\alpha=-0.2$, $\beta=3$, and the normal processing times of the jobs are $\bar{p}_1=2$, $\bar{p}_2=4$, $\bar{p}_3=7$.
According to the LPT rule, $M=7.5325$.
If the jobs are scheduled in the SPT order, $M=7.2946$.
Therefore, LPT does not give an optimal schedule for the case $\vartheta_j=1$, $g_j=1$ ($j=1,2,\ldots,\check{n}$).
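Both counterexamples are easy to reproduce; the short script below (our own, reusing the objective_M helper from the previous sketch, so it is not standalone) evaluates $M$ for the SPT order, the LPT order, and all permutations.

```python
# Reproducing Examples 1 and 2 (theta_j = g_j = 1) with the objective_M helper defined above.
from itertools import permutations

for name, p_bar, alpha, beta in [("Example 1", (2, 3, 4), -0.5, 1), ("Example 2", (2, 4, 7), -0.2, 3)]:
    ones = [1.0] * len(p_bar)
    spt = sorted(p_bar)                      # SPT order of the normal processing times
    lpt = sorted(p_bar, reverse=True)        # LPT order
    best = min(permutations(p_bar), key=lambda o: objective_M(list(o), ones, ones, alpha, beta))
    print(name,
          "SPT:", round(objective_M(spt, ones, ones, alpha, beta), 4),
          "LPT:", round(objective_M(lpt, ones, ones, alpha, beta), 4),
          "best order:", best)
```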

4.1. Polynomial Time Solvable Cases

4.1.1. Case 1

If $\bar{p}_j=\bar{p}$ ($j=1,2,\ldots,\check{n}$), we have the following result.
Theorem 1.
If $\bar{p}_j=\bar{p}$ ($j=1,2,\ldots,\check{n}$), then for $\bar{P}_1$, $\bar{P}_2$, and $\bar{P}_3$ an optimal schedule can be obtained in $O(\check{n}\log\check{n})$ time.
Proof. 
If $\bar{p}_j=\bar{p}$ ($j=1,2,\ldots,\check{n}$), from Equation (19), we have
$M=\sum_{j=1}^{\check{n}}(\vartheta_j)^{\frac{1}{1+\beta}}\left[g_{[j]}\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}\right]^{\frac{\beta}{1+\beta}}=\sum_{j=1}^{\check{n}}(\vartheta_j)^{\frac{1}{1+\beta}}\left[\bar{p}\left(1+(j-1)\bar{p}\right)^{\alpha}\right]^{\frac{\beta}{1+\beta}}\left(g_{[j]}\right)^{\frac{\beta}{1+\beta}}.$ (20)
Let $X_j=(\vartheta_j)^{\frac{1}{1+\beta}}\left[\bar{p}\left(1+(j-1)\bar{p}\right)^{\alpha}\right]^{\frac{\beta}{1+\beta}}$ and $Y_{[j]}=\left(g_{[j]}\right)^{\frac{\beta}{1+\beta}}$. Obviously, Equation (20) can be minimized by the HLP rule (see Hardy et al. [26]) in $O(\check{n}\log\check{n})$ time, i.e., place the largest $Y_j$ with the smallest $X_j$, the second largest $Y_j$ with the second smallest $X_j$, and so forth.    □
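The HLP matching step used here (and again in Theorem 2) can be sketched as follows; the function name and the toy data are our own and purely illustrative.

```python
# HLP rule: minimize sum_j X_j * Y_[j] by pairing the largest Y with the smallest X.
def hlp_assign(X, Y):
    """Return (assignment, value): assignment[pos] is the index in Y placed at position pos."""
    pos_by_X = sorted(range(len(X)), key=lambda i: X[i])                # positions, smallest X first
    jobs_by_Y = sorted(range(len(Y)), key=lambda j: Y[j], reverse=True) # jobs, largest Y first
    assignment = [None] * len(X)
    for pos, job in zip(pos_by_X, jobs_by_Y):
        assignment[pos] = job
    value = sum(X[pos] * Y[assignment[pos]] for pos in range(len(X)))
    return assignment, value

# Toy data (illustrative only): X indexed by position, Y indexed by job.
print(hlp_assign([0.5, 0.8, 1.2], [3.0, 1.0, 2.0]))   # pairs 3.0 with 0.5, 2.0 with 0.8, 1.0 with 1.2
```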

4.1.2. Case 2

If α = 0 , we have:
Theorem 2.
If $\alpha=0$, then for the problems $\bar{P}_1$, $\bar{P}_2$, and $\bar{P}_3$ an optimal schedule can be obtained in $O(\check{n}\log\check{n})$ time.
Proof. 
If $\alpha=0$, from Equation (19), we have
$M=\sum_{j=1}^{\check{n}}(\vartheta_j)^{\frac{1}{1+\beta}}\left[g_{[j]}\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}\right]^{\frac{\beta}{1+\beta}}=\sum_{j=1}^{\check{n}}(\vartheta_j)^{\frac{1}{1+\beta}}\left(g_{[j]}\bar{p}_{[j]}\right)^{\frac{\beta}{1+\beta}}.$ (21)
Let $X_j=(\vartheta_j)^{\frac{1}{1+\beta}}$ and $Y_{[j]}=\left(g_{[j]}\bar{p}_{[j]}\right)^{\frac{\beta}{1+\beta}}$. Obviously, Equation (21) can be minimized by the HLP rule in $O(\check{n}\log\check{n})$ time.    □

4.1.3. Case 3

If $\vartheta_j=\vartheta$ ($j=1,2,\ldots,\check{n}$), and $g_k\bar{p}_k\leq g_j\bar{p}_j$ implies $\bar{p}_k\geq\bar{p}_j$ (equivalently, $g_k\bar{p}_k\geq g_j\bar{p}_j$ implies $\bar{p}_k\leq\bar{p}_j$) for all jobs $\dot{J}_k$ and $\dot{J}_j$, we have the following result.
Theorem 3.
If $\vartheta_j=\vartheta$ ($j=1,2,\ldots,\check{n}$), and $g_k\bar{p}_k\leq g_j\bar{p}_j$ implies $\bar{p}_k\geq\bar{p}_j$ (equivalently, $g_k\bar{p}_k\geq g_j\bar{p}_j$ implies $\bar{p}_k\leq\bar{p}_j$) for all jobs $\dot{J}_k$ and $\dot{J}_j$, then for $\bar{P}_1$, $\bar{P}_2$, and $\bar{P}_3$ an optimal schedule can be obtained in $O(\check{n}\log\check{n})$ time, i.e., by sequencing the jobs in non-decreasing order of $g_j\bar{p}_j$ (equivalently, in non-increasing order of $\bar{p}_j$).
Proof. 
If $\vartheta_j=\vartheta$ ($j=1,2,\ldots,\check{n}$), from Equation (19), we have
$M=\vartheta^{\frac{1}{1+\beta}}\sum_{j=1}^{\check{n}}\left[g_{[j]}\bar{p}_{[j]}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\alpha}\right]^{\frac{\beta}{1+\beta}}.$ (22)
Let $\bar{\psi}=(\dot{J}_{[1]},\dot{J}_{[2]},\ldots,\dot{J}_{[r-1]},\dot{J}_k,\dot{J}_j,\dot{J}_{[r+2]},\ldots,\dot{J}_{[\check{n}]})$ and $\bar{\psi}'=(\dot{J}_{[1]},\dot{J}_{[2]},\ldots,\dot{J}_{[r-1]},\dot{J}_j,\dot{J}_k,\dot{J}_{[r+2]},\ldots,\dot{J}_{[\check{n}]})$ be two job schedules, where $g_k\bar{p}_k\leq g_j\bar{p}_j$ and $\bar{p}_k\geq\bar{p}_j$, and write $W=1+\sum_{h=1}^{r-1}\bar{p}_{[h]}$. To show that $\bar{\psi}$ dominates $\bar{\psi}'$, it suffices to show that the jobs in the $r$th and $(r+1)$th positions satisfy
$(g_k\bar{p}_k)^{\frac{\beta}{1+\beta}}W^{\frac{\alpha\beta}{1+\beta}}+(g_j\bar{p}_j)^{\frac{\beta}{1+\beta}}(W+\bar{p}_k)^{\frac{\alpha\beta}{1+\beta}}\leq(g_j\bar{p}_j)^{\frac{\beta}{1+\beta}}W^{\frac{\alpha\beta}{1+\beta}}+(g_k\bar{p}_k)^{\frac{\beta}{1+\beta}}(W+\bar{p}_j)^{\frac{\alpha\beta}{1+\beta}}.$
Obviously, since $\bar{p}_k\geq\bar{p}_j$ and $\frac{\alpha\beta}{1+\beta}\leq 0$, we have $(W+\bar{p}_k)^{\frac{\alpha\beta}{1+\beta}}\leq(W+\bar{p}_j)^{\frac{\alpha\beta}{1+\beta}}$; then
$(g_k\bar{p}_k)^{\frac{\beta}{1+\beta}}W^{\frac{\alpha\beta}{1+\beta}}+(g_j\bar{p}_j)^{\frac{\beta}{1+\beta}}(W+\bar{p}_k)^{\frac{\alpha\beta}{1+\beta}}-(g_j\bar{p}_j)^{\frac{\beta}{1+\beta}}W^{\frac{\alpha\beta}{1+\beta}}-(g_k\bar{p}_k)^{\frac{\beta}{1+\beta}}(W+\bar{p}_j)^{\frac{\alpha\beta}{1+\beta}}$
$\leq(g_k\bar{p}_k)^{\frac{\beta}{1+\beta}}W^{\frac{\alpha\beta}{1+\beta}}+(g_j\bar{p}_j)^{\frac{\beta}{1+\beta}}(W+\bar{p}_j)^{\frac{\alpha\beta}{1+\beta}}-(g_j\bar{p}_j)^{\frac{\beta}{1+\beta}}W^{\frac{\alpha\beta}{1+\beta}}-(g_k\bar{p}_k)^{\frac{\beta}{1+\beta}}(W+\bar{p}_j)^{\frac{\alpha\beta}{1+\beta}}$
$=\left[(g_k\bar{p}_k)^{\frac{\beta}{1+\beta}}-(g_j\bar{p}_j)^{\frac{\beta}{1+\beta}}\right]\left[W^{\frac{\alpha\beta}{1+\beta}}-(W+\bar{p}_j)^{\frac{\alpha\beta}{1+\beta}}\right]\leq 0,$
because $g_k\bar{p}_k\leq g_j\bar{p}_j$ and $W^{\frac{\alpha\beta}{1+\beta}}\geq(W+\bar{p}_j)^{\frac{\alpha\beta}{1+\beta}}$. Hence the required inequality holds.    □

4.2. Lower Bound

Let $\bar{\psi}=(\bar{\psi}_P,\bar{\psi}_U)$ be a schedule, where $\bar{\psi}_P$ ($\bar{\psi}_U$) is the scheduled (unscheduled) part and there are $\eta$ jobs in $\bar{\psi}_P$. Let $\bar{p}_{\min}=\min\{\bar{p}_j\,|\,j\in\bar{\psi}_U\}$; from Equation (19), we have
$M(\bar{\psi}_P,\bar{\psi}_U)=\sum_{j=1}^{\eta}(\vartheta_j)^{\frac{1}{1+\beta}}\left(g_{[j]}\bar{p}_{[j]}\right)^{\frac{\beta}{1+\beta}}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\frac{\alpha\beta}{1+\beta}}+\sum_{j=\eta+1}^{\check{n}}(\vartheta_j)^{\frac{1}{1+\beta}}\left(g_{[j]}\bar{p}_{[j]}\right)^{\frac{\beta}{1+\beta}}\left(1+\sum_{h=1}^{\eta}\bar{p}_{[h]}+\sum_{h=\eta+1}^{j-1}\bar{p}_{[h]}\right)^{\frac{\alpha\beta}{1+\beta}}$
$\geq\sum_{j=1}^{\eta}(\vartheta_j)^{\frac{1}{1+\beta}}\left(g_{[j]}\bar{p}_{[j]}\right)^{\frac{\beta}{1+\beta}}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\frac{\alpha\beta}{1+\beta}}+\sum_{j=\eta+1}^{\check{n}}(\vartheta_j)^{\frac{1}{1+\beta}}\left(g_{[j]}\right)^{\frac{\beta}{1+\beta}}\left(\bar{p}_{\min}\right)^{\frac{\beta}{1+\beta}}\left(1+\sum_{h=1}^{\eta}\bar{p}_{[h]}+(j-1-\eta)\bar{p}_{\min}\right)^{\frac{\alpha\beta}{1+\beta}}.$ (23)
Observe that the terms $1+\sum_{h=1}^{\eta}\bar{p}_{[h]}$ and $\sum_{j=1}^{\eta}(\vartheta_j)^{\frac{1}{1+\beta}}\left(g_{[j]}\bar{p}_{[j]}\right)^{\frac{\beta}{1+\beta}}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\frac{\alpha\beta}{1+\beta}}$ are known, so a lower bound can be obtained by minimizing $\sum_{j=\eta+1}^{\check{n}}(\vartheta_j)^{\frac{1}{1+\beta}}\left(g_{[j]}\right)^{\frac{\beta}{1+\beta}}\left(\bar{p}_{\min}\right)^{\frac{\beta}{1+\beta}}\left(1+\sum_{h=1}^{\eta}\bar{p}_{[h]}+(j-1-\eta)\bar{p}_{\min}\right)^{\frac{\alpha\beta}{1+\beta}}$. From Theorem 1, we have the first lower bound:
$M(\bar{\psi}_P,\bar{\psi}_U)\geq LB_1=\sum_{j=1}^{\eta}(\vartheta_j)^{\frac{1}{1+\beta}}\left(g_{[j]}\bar{p}_{[j]}\right)^{\frac{\beta}{1+\beta}}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\frac{\alpha\beta}{1+\beta}}+\sum_{j=\eta+1}^{\check{n}}X_jY_{[j]},$ (24)
where $X_j=(\vartheta_j)^{\frac{1}{1+\beta}}\left(\bar{p}_{\min}\right)^{\frac{\beta}{1+\beta}}\left(1+\sum_{h=1}^{\eta}\bar{p}_{[h]}+(j-1-\eta)\bar{p}_{\min}\right)^{\frac{\alpha\beta}{1+\beta}}$, $Y_{[j]}=\left(g_{[j]}\right)^{\frac{\beta}{1+\beta}}$, and $\sum_{j=\eta+1}^{\check{n}}X_jY_{[j]}$ can be minimized by the HLP rule.
Similarly, let $\vartheta_{\min}=\min\{\vartheta_j\,|\,j=\eta+1,\eta+2,\ldots,\check{n}\}$; from Theorem 3, we have the second lower bound
$M(\bar{\psi}_P,\bar{\psi}_U)\geq LB_2=\sum_{j=1}^{\eta}(\vartheta_j)^{\frac{1}{1+\beta}}\left(g_{[j]}\bar{p}_{[j]}\right)^{\frac{\beta}{1+\beta}}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\frac{\alpha\beta}{1+\beta}}+(\vartheta_{\min})^{\frac{1}{1+\beta}}\sum_{j=\eta+1}^{\check{n}}\left(g_{(j)}\bar{p}_{(j)}\right)^{\frac{\beta}{1+\beta}}\left(1+\sum_{h=1}^{\eta}\bar{p}_{[h]}+\sum_{h=\eta+1}^{j-1}\bar{p}_{\langle h\rangle}\right)^{\frac{\alpha\beta}{1+\beta}},$ (25)
where $g_{(\eta+1)}\bar{p}_{(\eta+1)}\leq g_{(\eta+2)}\bar{p}_{(\eta+2)}\leq\cdots\leq g_{(\check{n})}\bar{p}_{(\check{n})}$ and $\bar{p}_{\langle\eta+1\rangle}\geq\bar{p}_{\langle\eta+2\rangle}\geq\cdots\geq\bar{p}_{\langle\check{n}\rangle}$ (note that $g_{(j)}\bar{p}_{(j)}$ and $\bar{p}_{\langle j\rangle}$ do not necessarily correspond to the same job).
Combining Equations (24) and (25), the lower bound for $\bar{P}_1$, $\bar{P}_2$, and $\bar{P}_3$ is
$M(\bar{\psi}_P,\bar{\psi}_U)\geq LB=\max\left\{LB_1,LB_2\right\}.$ (26)

4.3. Upper Bound

Based on Section 4.1, we can propose the following heuristic algorithm to obtain an upper bound (UB) for $\bar{P}_1$, $\bar{P}_2$, and $\bar{P}_3$ (see Algorithm 1; a code sketch follows its step listing).
We then use a branch-and-bound (B&B) algorithm, which is organized around a search tree. The original problem is the root node of the tree, and branching divides a large problem into smaller subproblems: the large problem is the parent node, and each subproblem split from it is a child node of that parent, so the branching process adds children to the tree. Bounding checks the upper and lower bounds of each subproblem generated during branching; if a subproblem cannot produce a better solution than the current best one, its branch is pruned. The algorithm terminates when no subproblem can yield a better solution.
Algorithm 1: (UB)
Step 1.
Sequence the jobs by the HLP rule, where $X_j=(\vartheta_j)^{\frac{1}{1+\beta}}(j)^{\frac{\alpha\beta}{1+\beta}}$ and $Y_{[j]}=\left(g_{[j]}\right)^{\frac{\beta}{1+\beta}}$.
Step 2.
Sequence the jobs by the HLP rule, where $X_j=(\vartheta_j)^{\frac{1}{1+\beta}}$ and $Y_{[j]}=\left(g_{[j]}\bar{p}_{[j]}\right)^{\frac{\beta}{1+\beta}}$.
Step 3.
Sequence the jobs in non-decreasing order of $g_j\bar{p}_j$.
Step 4.
Sequence the jobs in non-increasing order of $\bar{p}_j$.
Step 5.
Choose the schedule with the minimal value of $M=\sum_{j=1}^{\check{n}}(\vartheta_j)^{\frac{1}{1+\beta}}\left(g_{[j]}\bar{p}_{[j]}\right)^{\frac{\beta}{1+\beta}}\left(1+\sum_{h=1}^{j-1}\bar{p}_{[h]}\right)^{\frac{\alpha\beta}{1+\beta}}$ from Steps 1–4.
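A compact sketch of Algorithm 1 is given below (our own illustration; function names and the toy data are hypothetical, and $M$ is evaluated directly from Equation (19)).

```python
# Sketch of Algorithm 1 (UB): build four candidate schedules and keep the one with minimal M.
# p_bar and g are indexed by job, theta by position; all data below are illustrative.

def M_of_order(order, p_bar, g, theta, alpha, beta):
    """Equation (19) for a given job order."""
    total, cum = 0.0, 0.0
    for pos, job in enumerate(order):
        total += theta[pos] ** (1 / (1 + beta)) * \
                 (g[job] * p_bar[job] * (1 + cum) ** alpha) ** (beta / (1 + beta))
        cum += p_bar[job]
    return total

def hlp_order(X, Y):
    """HLP rule: assign the job with the largest Y to the position with the smallest X."""
    positions = sorted(range(len(X)), key=lambda i: X[i])
    jobs = sorted(range(len(Y)), key=lambda j: -Y[j])
    order = [None] * len(X)
    for pos, job in zip(positions, jobs):
        order[pos] = job
    return order

def algorithm_UB(p_bar, g, theta, alpha, beta):
    n = len(p_bar)
    c = beta / (1 + beta)
    candidates = [
        # Step 1: X_j = theta_j^(1/(1+beta)) * j^(alpha*beta/(1+beta)), Y_j = g_j^(beta/(1+beta)).
        hlp_order([theta[j] ** (1 / (1 + beta)) * (j + 1) ** (alpha * c) for j in range(n)],
                  [g[j] ** c for j in range(n)]),
        # Step 2: X_j = theta_j^(1/(1+beta)), Y_j = (g_j * p_j)^(beta/(1+beta)).
        hlp_order([theta[j] ** (1 / (1 + beta)) for j in range(n)],
                  [(g[j] * p_bar[j]) ** c for j in range(n)]),
        # Step 3: non-decreasing g_j * p_j.   Step 4: non-increasing p_j.
        sorted(range(n), key=lambda j: g[j] * p_bar[j]),
        sorted(range(n), key=lambda j: -p_bar[j]),
    ]
    # Step 5: keep the candidate with the minimal value of M.
    return min(candidates, key=lambda o: M_of_order(o, p_bar, g, theta, alpha, beta))

print(algorithm_UB([2.0, 3.0, 4.0], [1.0, 2.0, 1.0], [3.0, 2.0, 1.0], alpha=-0.5, beta=1.0))
```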

4.4. Complex Algorithms

Following Nawaz et al. [27], an NEH heuristic (i.e., Algorithm 2) can be proposed for $\bar{P}_1$, $\bar{P}_2$, and $\bar{P}_3$, and two NEH variants are designed (a code sketch is given after the step listing).
Algorithm 2:(NEH)
Step 1.
  
Step 1.1. An initial solution sorted by the SPT rule of $g_j\bar{p}_j$ (denoted by NEH[SPT]);
Step 1.2. An initial solution sorted by the LPT rule of $\bar{p}_j$ (denoted by NEH[LPT]).
Step 2.
Pick the two jobs from the first and second positions of the list of Step 1, and find the best schedule for these two jobs by calculating M for the two possible schedules. Do not change the relative positions of these two jobs with respect to each other in the remaining steps of the algorithm. Set h = 3 .
Step 3.
Pick the job in the hth position of the list generated in Step 1 and find the best schedule by placing it at all possible h positions in the partial schedule found in the previous step, without changing the positions relative to each other of the already assigned jobs. The number of enumerations at this step equals h.
Step 4.
If n ˇ = h , STOP, otherwise set h = h + 1 and go to Step 3.
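The insertion mechanism of Algorithm 2 can be sketched as follows (our own illustration; it reuses the M_of_order helper from the Algorithm 1 sketch above, and the seed argument selects NEH[SPT] or NEH[LPT]).

```python
# Sketch of Algorithm 2 (NEH): insert jobs one at a time at the position that minimizes M.
def neh(p_bar, g, theta, alpha, beta, seed="SPT"):
    n = len(p_bar)
    # Step 1: NEH[SPT] seeds with non-decreasing g_j * p_j; NEH[LPT] with non-increasing p_j.
    key = (lambda j: g[j] * p_bar[j]) if seed == "SPT" else (lambda j: -p_bar[j])
    jobs = sorted(range(n), key=key)
    # Step 2: best relative order of the first two jobs.
    partial = min([jobs[:2], jobs[1::-1]],
                  key=lambda o: M_of_order(o, p_bar, g, theta, alpha, beta))
    # Steps 3-4: try the h-th job in every slot of the current partial schedule.
    for job in jobs[2:]:
        trials = [partial[:i] + [job] + partial[i:] for i in range(len(partial) + 1)]
        partial = min(trials, key=lambda o: M_of_order(o, p_bar, g, theta, alpha, beta))
    return partial

print(neh([2.0, 3.0, 4.0], [1.0, 2.0, 1.0], [3.0, 2.0, 1.0], alpha=-0.5, beta=1.0, seed="SPT"))
```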
In addition, a tabu search (TS) algorithm can be proposed for $\bar{P}_1$, $\bar{P}_2$, and $\bar{P}_3$. The initial schedule used in the TS algorithm is chosen by the SPT rule of $g_j\bar{p}_j$ (denoted by TS[SPT]) or the LPT rule of $\bar{p}_j$ (denoted by TS[LPT]), and the maximum number of iterations of the TS algorithm is set to $1000\check{n}$. The steps of the TS heuristic are given in Wu et al. [28].
Using the upper bound (i.e., Algorithm 1) and the lower bound (see Equation (26)), a standard branch-and-bound algorithm (denoted by B&B) can be proposed; a depth-first search strategy is used.
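For completeness, a minimal depth-first branch-and-bound sketch is shown below. It is our own simplified illustration: it prunes with the partial sum of Equation (19), which is a valid but weaker bound than Equation (26), and it starts without an incumbent, whereas the paper's B&B uses Algorithm 1 for the upper bound and Equation (26) for the lower bound.

```python
# Minimal depth-first branch-and-bound over job orders for minimizing M (Equation (19)).
# Every term of M is non-negative, so the value of a partial schedule is a lower bound on
# any of its completions; a stronger bound (Equation (26)) would prune more aggressively.

def branch_and_bound(p_bar, g, theta, alpha, beta):
    n = len(p_bar)
    best = {"order": None, "value": float("inf")}   # in practice, seed with Algorithm 1's schedule

    def term(pos, job, cum):
        return theta[pos] ** (1 / (1 + beta)) * \
               (g[job] * p_bar[job] * (1 + cum) ** alpha) ** (beta / (1 + beta))

    def dfs(partial, remaining, cum, value):
        if value >= best["value"]:                  # bounding: prune dominated branches
            return
        if not remaining:
            best["order"], best["value"] = partial[:], value
            return
        pos = len(partial)
        for job in list(remaining):                 # branching: fix the job in the next position
            dfs(partial + [job], remaining - {job}, cum + p_bar[job], value + term(pos, job, cum))

    dfs([], frozenset(range(n)), 0.0, 0.0)
    return best["order"], best["value"]

print(branch_and_bound([2.0, 3.0, 4.0], [1.0, 1.0, 1.0], [1.0, 1.0, 1.0], alpha=-0.5, beta=1.0))
```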

5. Computational Experiments

This section tests the accuracy and efficiency of the proposed algorithms UB, NEH, TS and B&B. Detailed programming and testing configurations are as follows.
  • Java version: Oracle JDK-11.01; the maximum memory allowed was restricted to 64 GB.
  • Testing computer: a desktop computer with an Intel Core i5-10500 CPU @ 3.1 GHz, 8 GB RAM, and the Windows 10 operating system, using VC++ 6.0.
We assume that $\vartheta_j=1$ (i.e., the scheduling cost is the makespan $C_{\max}$), and the following parameters are given:
(1)
α = −0.25, −0.3, −0.35, −0.4;
(2)
β = 1 , 2 , 3 , 4 ;
(3)
$\bar{p}_j$ is uniformly distributed over [1, 100];
(4)
$g_j$ is uniformly distributed over [1, 50];
(5)
$\check{n}$ = 13, 14, 15, 16 (small-scale instances, for which the global optimum can be obtained by B&B);
(6)
$\check{n}$ = 100, 150, 200, 250, 300 (large-scale instances, for which B&B is not applicable).
Twenty instances were generated for each combination of $\check{n}$, $\alpha$, and $\beta$. For the small-scale instances, from Equation (19), the error of an algorithm is calculated by
$\frac{M(\bar{\psi})-M(\bar{\psi}^{*})}{M(\bar{\psi}^{*})}\times 100\%,$ (27)
where $\bar{\psi}$ is the schedule obtained by UB, NEH[SPT], NEH[LPT], TS[SPT], or TS[LPT], and the optimal schedule $\bar{\psi}^{*}$ is obtained by B&B. The running times of UB, NEH[SPT], NEH[LPT], TS[SPT], TS[LPT], and B&B are reported as "CPU time" in milliseconds (ms). Table 3 compares the average and maximal running times of the above algorithms; the maximal running time of B&B was 2,996,279 ms ($\check{n}=16$). Table 4 compares the errors of the above algorithms; NEH[SPT] performs very well, with a maximal error of 9.2% for $\check{n}\leq 16$. For the large-scale instances, the running time (i.e., "CPU time", in ms) is reported, and the error of an algorithm is
$\frac{M(UB)-M(\bar{\psi})}{M(UB)}\times 100\%.$ (28)
From Table 5, Table 6, Table 7 and Table 8, we can see that NEH performs considerably better than TS and UB.
As can be seen from Table 3, Table 4, Table 5, Table 6, Table 7 and Table 8, the efficiency of the algorithms is mainly related to the number of jobs, and the learning factor does not affect their efficiency. As the number of jobs increases, the execution times of the algorithms increase significantly; as the learning factor varies, the execution times fluctuate, and the number of nodes increases when α < −0.25. For the small-scale experiments (Table 3 and Table 4), the errors of NEH[SPT] relative to B&B are the smallest, indicating that NEH[SPT] is highly accurate. The running time of B&B increases significantly with the number of jobs, and the number of explored nodes also grows significantly with the number of jobs. When the number of jobs exceeds 16, the average running time of B&B exceeds 3600 s, which indicates that B&B is only suitable for small-scale instances. For the large-scale experiments (Table 5, Table 6, Table 7 and Table 8), B&B is not applicable. Comparing the polynomial-time algorithm (i.e., NEH) with the intelligent algorithm (i.e., TS), the polynomial-time algorithm is clearly more efficient; NEH[SPT] and NEH[LPT] have similar errors, both smaller than those of TS[SPT] and TS[LPT]. This implies that NEH is more suitable for large-scale instances.

6. Conclusions

In this paper, we studied single-machine scheduling problems with a time-dependent learning effect and convex resource allocation. For three versions of the scheduling cost and the resource consumption cost, we provided a bicriteria analysis. We proved that some special cases of the problems can be solved in polynomial time. For the general case, we proposed a heuristic algorithm, a branch-and-bound algorithm, an NEH algorithm, and a TS algorithm. Future research could address the computational complexity of the problems $\bar{P}_1$, $\bar{P}_2$, and $\bar{P}_3$, explore flow shop or parallel-machine scheduling (see Sterna and Czerniachowska [29]) with a time-dependent learning effect and resource allocation, or consider flow shop scheduling with deteriorating effects (see Sun and Geng [30]).

Author Contributions

Methodology, Y.-C.W. and J.-B.W.; investigation, J.-B.W.; writing—original draft preparation, Y.-C.W.; writing—review and editing, Y.-C.W. and J.-B.W. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the LiaoNing Revitalization Talents Program (grant no. XLYC2002017) and the Science Research Foundation of the Educational Department of Liaoning Province (LJKMZ20220527).

Data Availability Statement

The data used to support the findings of this paper are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflict of interest.

References

  1. Azzouz, A.; Ennigrou, M.; Said, L.B. Scheduling problems under learning effects: Classification and cartography. Int. J. Prod. Res. 2018, 56, 1642–1661.
  2. Sun, X.; Geng, X.N.; Liu, F. Flow shop scheduling with general position weighted learning effects to minimise total weighted completion time. J. Oper. Res. Soc. 2021, 72, 2674–2689.
  3. Zhao, S. Scheduling jobs with general truncated learning effects including proportional setup times. Comput. Appl. Math. 2022, 41, 146.
  4. Wang, J.-B.; Zhang, L.-H.; Lv, Z.-G.; Lv, D.-Y.; Geng, X.-N.; Sun, X. Heuristic and exact algorithms for single-machine scheduling problems with general truncated learning effects. Comput. Appl. Math. 2022, 41, 417.
  5. Chen, K.; Cheng, T.C.E.; Huang, H.; Ji, M.; Yao, D. Single-machine scheduling with autonomous and induced learning to minimize the total weighted number of tardy jobs. Eur. J. Oper. Res. 2023, 309, 24–34.
  6. Ren, N.; Lv, D.Y.; Wang, J.B.; Wang, X.Y. Solution algorithms for single-machine scheduling with learning effects and exponential past-sequence-dependent delivery times. J. Ind. Manag. Optim. 2023, 19, 8429–8450.
  7. Wang, S.-H.; Lv, D.-Y.; Wang, J.-B. Research on position-dependent weights scheduling with delivery times and truncated sum-of-processing-times-based learning effect. J. Ind. Manag. Optim. 2023, 19, 2824–2837.
  8. Guan, X.H.; Zhai, Q.Z.; Lai, F. New Lagrangian relaxation based algorithm for resource scheduling with homogeneous subproblems. J. Optim. Theory Appl. 2002, 113, 65–82.
  9. Wang, X.; Cheng, T.C.E. Single machine scheduling with resource dependent release times and processing times. Eur. J. Oper. Res. 2005, 162, 727–739.
  10. Shabtay, D.; Steiner, G. A survey of scheduling with controllable processing times. Discret. Appl. Math. 2007, 155, 1643–1666.
  11. Zhang, L.-H.; Lv, D.-Y.; Wang, J.-B. Two-agent slack due-date assignment scheduling with resource allocations and deteriorating jobs. Mathematics 2023, 11, 2737.
  12. Wang, Y.-C.; Wang, S.-H.; Wang, J.-B. Resource allocation scheduling with position-dependent weights and generalized earliness-tardiness cost. Mathematics 2023, 11, 222.
  13. Wang, J.B.; Lv, D.Y.; Wang, S.Y.; Jiang, C. Resource allocation scheduling with deteriorating jobs and position-dependent workloads. J. Ind. Manag. Optim. 2023, 19, 1658–1669.
  14. Liu, W.; Wang, X. Group technology scheduling with due-date assignment and controllable processing times. Processes 2023, 11, 1271.
  15. Lu, Y.Y.; Wang, J.B.; Ji, P.; He, H. A note on resource allocation scheduling with group technology and learning effects on a single machine. Eng. Optim. 2017, 49, 1621–1632.
  16. Wang, J.B.; Liu, M.; Yin, N.; Ji, P. Scheduling jobs with controllable processing time, truncated job-dependent learning and deterioration effects. J. Ind. Manag. Optim. 2017, 13, 1025–1039.
  17. Liu, W.W.; Jiang, C. Flow shop resource allocation scheduling with due date assignment, learning effect and position-dependent weights. Asia-Pacific J. Oper. Res. 2020, 37, 2050014.
  18. Zhao, S. Resource allocation flowshop scheduling with learning effect and slack due window assignment. J. Ind. Manag. Optim. 2021, 17, 2817–2835.
  19. Wang, J.B.; Lv, D.Y.; Xu, J.; Ji, P.; Li, F. Bicriterion scheduling with truncated learning effects and convex controllable processing times. Int. Trans. Oper. Res. 2021, 28, 1573–1593.
  20. Yan, J.-X.; Ren, N.; Bei, H.-B.; Bao, H.; Wang, J.-B. Study on resource allocation scheduling problem with learning factors and group technology. J. Ind. Manag. Optim. 2023, 19, 3419–3435.
  21. Biskup, D. Single-machine scheduling with learning considerations. Eur. J. Oper. Res. 1999, 115, 173–178.
  22. Kuo, W.H.; Yang, D.L. Minimizing the total completion time in a single-machine scheduling problem with a time-dependent learning effect. Eur. J. Oper. Res. 2006, 174, 1184–1190.
  23. Wang, D.; Wang, M.Z.; Wang, J.B. Single-machine scheduling with learning effect and resource-dependent processing times. Comput. Ind. Eng. 2010, 59, 458–462.
  24. Kanet, J.J. Minimizing variation of flow time in single machine systems. Manag. Sci. 1981, 27, 1453–1459.
  25. Bagchi, U.B. Simultaneous minimization of mean and variation of flow-time and waiting time in single machine systems. Oper. Res. 1989, 37, 118–125.
  26. Hardy, G.H.; Littlewood, J.E.; Polya, G. Inequalities; Cambridge University Press: Cambridge, UK, 1967.
  27. Nawaz, M.; Enscore, E.E., Jr.; Ham, I. A heuristic algorithm for the m-machine, n-job flow-shop sequencing problem. Omega 1983, 11, 91–95.
  28. Wu, C.C.; Wu, W.H.; Hsu, P.H.; Yin, Y.; Xu, J. A single-machine scheduling with a truncated linear deterioration and ready times. Inf. Sci. 2014, 256, 109–125.
  29. Sterna, M.; Czerniachowska, K. Polynomial time approximation scheme for two parallel machines scheduling with a common due date to maximize early work. J. Optim. Theory Appl. 2017, 174, 927–944.
  30. Sun, X.; Geng, X.N. Single-machine scheduling with deteriorating effects and machine maintenance. Int. J. Prod. Res. 2019, 57, 3186–3199.
Figure 1. Gantt chart of the problem.
Table 1. Models studied.
Reference | Scheduling Problem | Time Complexity
Biskup [21] | $1\mid p_{jr}^{A}=\bar{p}_jr^{\alpha}\mid\sum_{j=1}^{\check{n}}(\delta_1E_j+\delta_2T_j+\delta_3C_j)$ | $O(n^3)$
Biskup [21] | $1\mid p_{jr}^{A}=\bar{p}_jr^{\alpha}\mid\sum_{j=1}^{\check{n}}C_j$ | $O(n\log n)$
Kuo and Yang [22] | $1\mid p_{jr}^{A}=\bar{p}_j(1+\sum_{h=1}^{r-1}\bar{p}_{[h]})^{\alpha}\mid\sum_{j=1}^{\check{n}}C_j$ | $O(n\log n)$
Wang et al. [23] | $1\mid p_{jr}^{A}(\tilde{u}_j)=(\bar{p}_jr^{a}/\tilde{u}_j)^{\beta}\mid\delta_1C_{\max}+\delta_2TC+\delta_3TADC+\delta_4\sum_{j=1}^{\check{n}}G_ju_j$ | $O(n\log n)$
Wang et al. [23] | $1\mid p_{jr}^{A}(\tilde{u}_j)=(\bar{p}_jr^{a}/\tilde{u}_j)^{\beta}\mid\delta_1C_{\max}+\delta_2TW+\delta_3TADW+\delta_4\sum_{j=1}^{\check{n}}G_ju_j$ | $O(n\log n)$
This article | $1\mid p_{jr}^{A}(\tilde{u}_j)=\left(\frac{\bar{p}_j(1+\sum_{h=1}^{r-1}\bar{p}_{[h]})^{\alpha}}{\tilde{u}_j}\right)^{\beta}\mid\delta\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}+\eta\sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]}$ | Conjectured NP-hard
This article | $1\mid p_{jr}^{A}(\tilde{u}_j)=\left(\frac{\bar{p}_j(1+\sum_{h=1}^{r-1}\bar{p}_{[h]})^{\alpha}}{\tilde{u}_j}\right)^{\beta},\ \sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]}\leq\breve{U}\mid\sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}$ | Conjectured NP-hard
This article | $1\mid p_{jr}^{A}(\tilde{u}_j)=\left(\frac{\bar{p}_j(1+\sum_{h=1}^{r-1}\bar{p}_{[h]})^{\alpha}}{\tilde{u}_j}\right)^{\beta},\ \sum_{j=1}^{\check{n}}\vartheta_jp_{[j]}^{A}\leq\breve{V}\mid\sum_{j=1}^{\check{n}}g_{[j]}\tilde{u}_{[j]}$ | Conjectured NP-hard
Table 2. Notations.
Notation | Meaning
$\check{n}$ | the number of jobs
$\dot{J}_j$ | the $j$th job
$\dot{J}_{[j]}$ | the job scheduled in the $j$th position
$\bar{p}_j$ | the normal processing time of job $\dot{J}_j$
$\bar{p}_{[h]}$ | the normal processing time of the job scheduled in the $h$th position
$p_{jr}^{A}$ | the actual processing time of job $\dot{J}_j$ in position $r$
$\alpha$ | the learning factor
$\tilde{u}_j$ ($\tilde{u}_{[j]}$) | the resource allocated to job $\dot{J}_j$ ($\dot{J}_{[j]}$)
$C_j$ ($C_{[j]}$) | the completion time of job $\dot{J}_j$ ($\dot{J}_{[j]}$)
$p_j^{A}$ ($p_{[j]}^{A}$) | the actual processing time of job $\dot{J}_j$ ($\dot{J}_{[j]}$)
$g_j$ ($g_{[j]}$) | the cost of allocating one unit of resource to job $\dot{J}_j$ ($\dot{J}_{[j]}$)
$\vartheta_j$ | the position-dependent (but job-independent) weight (cost) of the $j$th position
$C_{\max}$ | the makespan of all jobs
$TADC$ | the total absolute differences in completion times
$TADW$ | the total absolute differences in waiting times
$\beta$, $\delta$, $\eta$ | given constants
Table 3. CPU time (ms) results for n ˇ = 13 , 14 , 15 , 16 .
UBNEH[SPT]NEH[LPT]TS[SPT]TS[LPT]B&B
n ˇ α β MeanMaxMeanMaxMeanMaxMeanMaxMeanMaxMeanMax
122.224165.8175158.21659045.210,7679045.210,249515,192.2560,546
223.424154.4160149.215610,256.411,2769047.210,495562,112.2610,490
13−0.25324.425147.2159165.21959549.410,07610,042.211,218645,292.2660,729
419.423111.2130120.61459546.412,0499463.212,249741,061.2745,706
12022169.2190164.217210,457.612,94010,784.413,002654,292.4666,172
221.424189.4951751789549.610,24910,259.411,592599,290.4710,286
13−0.3316.821112.2120116.212010,894.213,24910,214.412,049600,053.2620,049
417.823150.4160142.215410,973.812,2469221.610,079524,290.2550,170
121.424151.41691421499449.811,04610,01611,076564,265.2590,049
224.625146.2155145.415610,216.211,04610,794.411,579555,056560,804
13−0.35319.622109.6132105.21109216.212,2169246.610,249651,095.4669,046
420.423129.2136124.81309794.410,8439846.411,791754,294.2780,720
118.420139.2148152.81649549.610,54910,452.811,743549,490.6580,260
215.617173.6189168.817010,549.211,04910,45711,219610,084.6650,480
13−0.4316.619195.2199187.21899249.211,2499249.210,425649,160.2690,482
419.824154.2167148.21659249.412,9729843.410,197777,049.8789,442
12630140.4145135.214010,549.211,04910,846.211,0341,683,7921,724,292
23545156.2160140.214310,046.411,04910,846.212,2491,643,2951,667,176
14−0.25329.434165.417015015610,27911,27910,842.411,7151,642,6261,687,590
43031149.4153150.216010,279.611,24610,94710,9861,707,9001,717,519
132.236156.616416417210,972.812,04910,249.211,0791,689,5201,700,526
229.83715618016016410,873.211,04911,48712,7091,679,5301,820,249
14−0.334046184.219017017810,497.411,27810,45912,2791,702,8791,719,722
43537184.8199189.219410,249.811,27810,857.412,7011,725,4941,745,492
14047188.4190179.618111,249.412,94311,843.413,7201,679,4101,890,782
241.449210.6225220.422510,249.811,97210,926.811,7081,789,2921,806,482
14−0.35329.4636215.4230208.821011,943.212,87310,719.613,0791,762,9361,847,227
435.239220.822521222010,279.411,66711,02911,4021,642,7981,695,492
14046265.6290288.229012,09412,27910,183.611,0451,847,2891,860,416
24145247.6250249.425610,049.811,97210,984.811,2791,792,7951,801,282
14−0.434548264.8268266.426911,279.212,88311,25711,6451,832,8451,856,481
44446247.2249240.425310,943.411,22410,293.812,0191,849,8241,854,270
15561285286295.429911,046.212,34911,203.212,7941,801,9491,839,279
26566245.224925627610,079.811,24911,440.412,2451,897,7931,900,219
15−0.2535965244.2260264.227412,079.413,27911,492.412,8441,858,9751,897,249
460.154267239.625024926112,24812,87011,549.812,1791,940,2491,960,504
16365258.4260254.626411,216.412,94312,206.212,6401,890,1971,903,590
256.236464259.627327727811,467.612,94212,60912,7641,922,7911,930,434
15−0.3365.264676260264254.225912,27912,40011,452.812,7971,945,7931,949,180
45775246.2260244.625411,04912,67511,183.212,2491,894,2851,906,840
166.164669270.828727227412,279.212,97211,159.212,2251,927,1761,953,294
259.25466527827928128510,276.811,24911,12812,4591,896,2491,908,046
15−0.35362.963164291.4294290.229712,04912,81010,843.811,4141,889,0841,927,040
464.256468284.2285284.629612,640.212,79412,01112,3541,962,8741,986,480
16972275.2276277.827912,249.413,24911,281.812,5491,942,2921,952,284
270.162473264.426727828112,842.813,01811,843.612,7541,923,7281,936,490
15−0.4370.00237129029728929012,04912,24610,952.812,9421,967,2491,976,279
469.259373287.4288286.629212,279.412,79411,42512,7911,897,9281,952,046
188.794290310320330.233511,279.212,64910,957.411,5242,500,4632,604,892
279.15978233034532533212,279.812,49410,829.812,0592,502,4662,564,592
16−0.25384.014986325360346.434810,972.611,24911,906.212,2992,706,2042,769,279
470.154979310.4325340.234810,252.411,54211,41111,5182,790,4292,800,572
18489337.2339340.834412,076.612,97311,549.212,8492,794,0472,804,490
28385350.4356354.235912,706.412,79712,49312,5792,790,8982,808,480
16−0.3379.157884367.2376378.638411,076.212,97212,482.412,8642,943,2792,977,750
480.456182364.2368380.238612,94012,98412,216.212,4632,788,1242,851,592
184.587406409410.842012,04612,97211,24911,5902,971,5192,986,079
289.401298410.442043343611,076.812,99810,249.812,0592,679,2892,789,279
16−0.35390.001292452459465.246811,07612,87312,487.412,7642,790,4652,798,279
48484444.4460470.447512,279.612,49712,249.612,8462,792,4992,899,259
184.28845046748748911,249.212,27911,940.612,4202,987,2492,990,279
286.154690465469458.446012,079.412,49111,491.412,1902,920,0482,960,249
16−0.4390.154398472478486.249012,099.812,94211,241.212,1912,894,2492,905,287
4878945646947248012,15712,34911,49511,7302,790,7952,871,287
Table 4. Error results (%) for n ˇ = 13 , 14 , 15 , 16 .
UBNEH[SPT]NEH[LPT]TS[SPT]TS[LPT]
n ˇ α β MeanMaxMeanMaxMeanMaxMeanMaxMeanMax
115.332319.57343.31717.57637.386210.576913.353119.468913.348320.1686
215.977319.86893.38167.72377.589310.901613.780619.728213.763120.2465
13−0.25315.999319.99953.43877.90297.858710.978013.972419.981413.970920.3555
416.000020.00003.76557.97797.962510.996213.995619.997013.995520.3660
116.340620.53194.39498.46427.399710.467413.232719.254113.219720.8075
216.988020.99494.59178.89667.714610.894013.856019.924313.846820.8666
13−0.3317.133021.18374.85838.94877.970310.997813.974420.287914.374921.5649
417.239821.26184.96248.93898.145311.084814.079420.419414.521921.6803
117.519021.36465.39287.82818.487010.009314.434920.379814.473321.9968
217.682021.71835.71447.67148.502210.168914.436620.616614.657621.9995
13−0.35317.887021.82365.83877.76538.508610.458014.545520.919515.061021.2667
417.903722.18385.93828.02129.128110.885414.779621.199615.223221.9199
118.429722.80036.04628.64168.277211.076715.052921.688616.582622.5281
218.573122.95726.09718.75588.298611.378415.199321.846216.658622.0242
13−0.4318.739222.97066.27698.51798.515011.496315.294222.009017.884722.5812
418.979323.17126.69489.18278.571711.549815.649622.086318.287122.6142
115.146120.18873.09016.00467.133410.034014.862318.175914.244319.4956
215.427620.49363.43116.14317.281510.064015.103518.420114.985219.7539
14−0.25315.555620.53383.48246.61097.416610.099115.174918.612515.059220.3064
415.587120.68283.90486.70357.450010.167015.320818.995315.248720.4650
115.924121.15804.00726.75467.576110.241716.060319.238016.121620.7730
216.290321.30944.06116.97147.786410.418817.288119.566416.901921.0787
14−0.3316.311121.67684.16187.07628.043110.724817.664419.569217.175221.1272
416.312421.83724.25097.28998.180410.968617.676519.857317.596121.4696
116.483421.97764.37557.36458.327210.979117.692020.485518.226521.4806
216.593922.29424.53857.54718.454611.104619.831920.592518.243321.7282
14−0.35317.120822.47094.71307.84378.576211.162920.407220.710718.525922.1671
417.194322.60574.92817.84988.591011.229420.538120.947818.578922.3108
117.443022.73405.33197.88548.758311.251920.306321.997218.632622.5012
217.539323.12035.33637.89398.883711.259820.850322.234919.583222.5014
14−0.4317.974523.39575.37748.42969.088811.332821.349622.567220.815622.9743
418.014223.39865.46848.88709.223811.459521.400323.080421.406523.0811
115.143420.09293.13446.36457.058710.085315.780018.151515.184819.0588
215.152720.31613.27526.47767.591210.164115.823518.209915.329119.2560
15−0.25315.839620.36283.36336.83577.648110.211415.905518.315416.081719.5946
417.399621.75194.27847.70298.132210.411017.060620.432117.298520.4187
115.989021.00303.42257.15007.982710.247916.010619.594016.472120.0627
216.199721.18873.64597.40097.992610.293016.366619.740416.575120.3457
15−0.3316.670822.39553.71517.66468.173510.378116.851120.687116.626620.0934
417.449522.50084.30067.80068.184110.412017.624121.161517.932520.6143
117.883621.61474.69158.18648.077710.540617.136421.749618.735821.2691
218.049322.03704.69388.30488.272910.563718.494419.925519.649021.8650
15−0.35318.088322.23454.88378.42928.289810.834118.647820.683119.725422.7423
418.493722.93815.12358.54318.326910.890418.892422.028519.868023.0382
118.575223.28725.39418.59059.050210.980719.777922.391420.883323.2260
218.720422.58995.44388.64789.112111.071619.966022.480820.993923.3771
15−0.4318.895323.76785.55418.64809.209011.077220.431822.624321.117623.6015
419.027423.94125.61788.71619.265211.130020.777523.062221.324323.3566
115.212221.16303.03366.25447.269910.050516.696118.127216.577719.2650
215.298121.64013.34006.42327.335110.108516.887318.292717.471619.3277
16−0.25315.357221.78203.76836.43887.408910.113416.979418.468618.058019.5945
415.417321.85564.12026.47697.464310.200417.138518.609518.318519.6092
115.483721.89074.20356.87477.501810.236317.208718.748618.856919.8253
215.749321.95464.33247.05917.599610.356217.514319.275019.771420.1595
16−0.3315.866922.50734.40507.27237.955610.575717.659919.524619.960320.2609
416.954722.84464.44537.46338.220910.661717.766220.481219.995720.8716
117.269023.08875.20047.79358.485010.719317.958121.739021.196521.3400
217.355423.30365.28717.91798.485510.829018.259422.315121.721021.3451
16−0.35317.502423.69565.59038.09008.543310.892118.820922.381621.778521.5903
418.245024.33176.33928.17899.013811.118120.780923.486222.767522.0077
117.593023.80605.63578.16308.602210.929718.847922.736522.040821.9139
217.608224.22386.18858.17558.764111.080018.857322.786922.436921.9386
16−0.4318.409924.34596.37678.45389.246111.413821.020124.197523.101722.3438
418.612224.49056.38068.55739.306611.527921.685924.293023.151922.7735
Table 5. CPU time (ms) results for n ˇ = 100 , 150 , 200 .
UBNEH[SPT]NEH[LPT]TS[SPT]TS[LPT]
n ˇ α β MeanMaxMeanMaxMeanMaxMeanMaxMeanMax
1991.211607682.293997660.2967229,863.232,46529,792.245,898
2994.62163769091347741.2840229,899.232,06029,947.239,473
100−0.25399610067845.284497790.610,10729,926.432,28729,960.643,350
41000.613947932.687697768.6981329,92734,58530,056.643,906
51003.417938025.287477815.2953730,11333,60730,120.245,675
11002.213438107.410,1247827.2833730,132.633,91730,14737,530
21011.612288153.284817835.4959330,160.641,00530,186.645,007
100−0.33101923038161.485757914.4818530,186.632,98530,197.837,586
41019.422268233.292757940.610,08630,194.233,17130,267.839,263
51020.823708248.288607979.8883930,366.834,82130,281.238,374
11022.611178263.490417968.8968630,459.834,74630,385.445,368
2102212488313.485048043.2867730,465.432,56130,433.245,003
100−0.3531023.819818335.690538049.210,05830,500.434,72330,510.235,320
4102822398364.210,1668105.4947230,536.231,78430,598.439,411
51036.221228390.299858142.6897130,606.231,28730,685.247,017
11028.424438413.490618107.8917630,698.231,60130,709.636,560
21037.811268655.610,2948171.810,27130,717.633,22430,751.444,770
100−0.431038.420348665.210,0588215.2831930,751.433,75830,78238,607
410381676870785338233.210,04530,799.631,71930,79439,168
51049.816648742.285798257.6861130,851.232,19330,812.238,702
1105019038744.485858281.6924630,862.233,23530,82146,098
21048.211258757.699118307.6918730,877.834,50030,90932,314
100−0.4531051.618088800.893148324.4959530,888.834,30431,054.435,641
41061.214678855.294798336.4890930,956.635,49731,079.848,059
51062.824958884.210,2018337985031,03431,79931,083.234,815
1998.4322216,772.219,86716,701.220,88179,740.291,41879,58699,585
21002.8294816,816.619,49316,768.819,34679,754.297,15479,592.288,831
150−0.2531038222216,825.419,29216,796.620,98579,767.695,29079,68388,666
41041.2293116,836.820,28016,800.221,04579,849.289,46979,642.286,716
51043.2278916,855.220,73216,907.419,98980,047.299,30779,77295,116
11058.6307616,896.218,42116,997.620,00580,049.298,21879,971.4101,640
21052.2302516,91418,99917,00521,30080,050.697,13380,047.694,711
150−0.331069.8309316,949.218,66217,014.219,23180,128.693,00280,237.293,021
41075.2381716,963.420,13617,024.619,88880,139.291,49380,41086,450
51076.4284416,975.619,06217,069.419,32280,189.292,02180,783.499,343
11080318416,976.819,34217,190.220,50480,215.811,19280,806.697,428
21091.8252816,977.218,98617,193.220,44980,223.289,37080,817101,491
150−0.3531098.2363317,031.620,49317,199.619,27280,420.292,92380,90597,811
41102.4321217,045.419,54117,265.419,89780,458.296,20381,01192,751
51108.8214117,153.620,51117,368.221,17180,654.492,24881,035.290,361
11103.2218717,049.818,25817,381.219,23680,68899,87481,066.691,534
21109.4250617,227.218,58017,382.821,51180,694.895,86481,068.888,109
150−0.431112.6332817,230.218,87217,383.220,78380,741.693,74081,168.889,118
41112.2367417,304.218,25217,384.619,89980,778.294,88481,24395,724
51130.2238617,34618,57817,415.219,39781,005.293,30181,45395,209
11120.4225717,352.418,57417,430.619,13580,958.293,00581,48294,852
21141.6208317,372.419,20817,454.821,08981,118.497,99081,564.496,995
150−0.4531143.2230517,421.619,98417,463.420,71081,172.6101,51181,595101,448
41147301917,424.818,37417,520.820,52581,190.292,69681,708.898,576
51160238117,428.220,76117,58121,12981,34292,37881,744101,987
11167.2219719,771.224,66719,67725,136149,796.6155,623149,826.2163,834
21173.6287019,823.630,92019,701.223,136149,946.2154,082150,320158,353
200−0.2531181.8297019,825.423,43319,763.628,100150,019.6153,142150,375160,345
41205202419,929.426,49519,815.433,241150,322.2156,925150,638.8166,380
51238253919,937.428,07419,899.828,211150,528158,533150,717.4156,524
11174.4136519,974.231,07919,931.822,282150,775.4160,672150,817169,835
21177.2276020,028.824,84420,03126,057151,014.4166,685150,960165,099
200−0.331206.6223220,052.233,32420,100.222,768151,076.6158,792150,975155,237
41212.2350120,055.825,61220,126.225,777151,106.6160,256151,042.2153,552
51220.8301020,174.222,75720,183.622,920151,128154,817151,091.2158,253
11178351520,299.624,44820,284.630,100151,208153,111151,153.4154,157
21239241020,304.428,08520,311.622,143151,331.2156,617151,196.6156,658
200−0.3531242.2319420,314.424,93020,328.432,950151,505.2166,131151,200.2159,997
41248.4318820,321.430,82320,322.621,847151,578.2161,758151,245.4165,494
51257.6299520,414.432,11620,393.829,292151,608.4156,293151,291166,519
11252.4210320,426.629,42920,43231,466151,635.6155,830151,340171,189
21260.6204320,469.822,82220,47326,081151,677.2156,286151,770.2165,775
200−0.431262.2211120,487.232,80720,546.827,519151,721.2155,154151,786.2163,851
41265.2203721,403.227,02320,682.231,808151,766.6156,972152,068161,410
51267.6254620,666.227,37620,686.226,028151,769.6160,077152,078.8168,555
11243279720,668.828,54420,603.627,825152,147.8159,240152,325164,082
21259318420,717.429,15820,702.624,855152,258.8164,452152,402.2170,569
200−0.4531272.8295120,815.232,26720,707.428,963152,265163,314152,488.8171,036
41278.8269920,819.226,96120,803.827,908152,442.2154,165152,503154,144
51289345820,88521,36120,923.221,238152,546.4155,099152,517170,864
Table 6. CPU time (ms) results for n ˇ = 250 , 300 .
UBNEH[SPT]NEH[LPT]TS[SPT]TS[LPT]
n ˇ α β MeanMaxMeanMaxMeanMaxMeanMaxMeanMax
11259.4393424,897.629,53724,867.227,872199,049.2271,727199,652.2307,055
21263.6345824,898.226,10924,911.228,509199,470270,918199,951281,056
250−0.2531275.4230924,900.427,58124,935.626,577199,593.2297,258200,032.2248,877
41288.8224924,912.230,09425,048.627,997201,043.2300,606202,006.6293,970
51295.2345624,926.627,76225,057.429,513201,094.6292,507202,917261,451
11289.8241724,916.427,15925,067.429,982201,069.4301,345203,562252,366
21301413324,929.426,90925,073.430,159201,430.6236,215203,831248,524
250−0.331306307324,970.625,48325,087.628,392201,894.6257,597204,321311,840
41310.2302324,979.229,45425,122.628,225202,125.2301,120204,512.2299,658
51315.2203624,994.226,31325,171.231,342202,779.2251,543204,611263,096
11307.8218324,958.828,27425,200.231,422202,957285,368204,792.2296,832
21321.4263025,001.828,84625,281.230,586203,025.2271,886204,971239,756
250−0.3531324405325,002.626,22325,313.628,702203,157.4251,617205,692253,573
41356.6367625,009.629,65425,330.229,413203,367.6242,496205,872.2306,401
51352339125,028.625,91525,356.229,122204,275.2261,458206,048.8269,346
11355.2295325,040.628,81625,39028,230204,328.8275,741206,328.8287,909
21359.6220125,052.227,25825,39531,061204,386.2285,809206,495240,077
250−0.431360.4182525,056.227,62925,411.630,428204,681.4266,179206,622306,779
41365.6174825,062.428,17325,49427,175205,119.2285,304206,728.8307,374
51376.8351125,071.429,03325,537.228,694206,184.2263,097206,898.8236,634
11367.4313525,082.425,37025,412.229,107206,513.2271,578207,076.6282,392
21382.6182425,098.229,66025,566.630,750207,717.6239,572207,423253,829
250−0.4531387.8318725,106.225,98925,568.830,927208,261.4285,918208,192.2309,023
41393.4379925,10828,06625,583.826,694208,543.4250,599208,431.2266,308
51397.2290725,125.627,31625,687.431,041208,672.6249,840208,609274,208
11373315229,118.235,43529,850.237,302249,278.6277,613250,038323,384
21376225229,136.633,22129,856.236,892249,389.6316,413250,105.4291,352
300−0.2531378334329,179.235,19129,947.636,878249,969.2296,189250,106292,833
41380.2260029,287.431,99129,949.637,135250,234.2315,731250,198.8318,015
51392.6299829,362.432,38929,965.636,966250,597.4282,927250,199312,524
11390.2356528,697.631,95930,03536,284250,713.4293,675250,211281,524
21399.4396429,394.236,74830,13233,879250,739.6307,021250,213291,397
300−0.331400.8205929,417.233,60830,063.232,347250,797.4316,809250,258.8292,576
41406.8264929,424.833,96630,118.233,273250,924.4281,671250,278.8297,647
51403.4459529,436.832,11730,173.433,552251,213.2317,061250,292.2285,731
11405.6421429,461.235,89530,257.433,713251,082.8304,255250,367283,863
21407.6409329,510.234,65030,301.238,007251,301.2287,881250,379297,499
300−0.3531409.2475529,636.437,26530,338.632,165251,379314,073250,429282,229
41411.4371330,054.635,62130,505.232,271251,550.2298,918250,446302,785
51416.4340830,299.637,08130,616.437,183251,688.8281,932250,501276,826
11413.2453230,254.437,24730,555.234,036251,663294,545250,456.6278,968
21423312230,304.237,08230,630.636,829251,715287,171250,531321,642
300−0.431424.6207230,461.234,76330,680.832,779251,741287,903250,572284,786
41425.8372030,674.836,68630,712.237,945252,002.2308,071250,576.6298,461
51431.2477730,737.832,88530,734.435,883252,074.4288,835250,687291,994
11426.8442530,819.834,67030,913.633,138252,254.8287,392250,698305,158
21434.4361530,845.634,91130,974.237,733252,332.2299,953250,738.8307,373
300−0.4531442.4356830,849.435,41531,060.234,681253,160.8308,104250,775319,728
41446.8414030,874.434,00531,114.635,173253,313.4304,044250,876.6324,061
51447312530,883.235,38431,116.237,990253,416.2315,289250,880283,537
Table 7. Error results (%) for n ˇ = 100 , 150 , 200 .
NEH[SPT]NEH[LPT]TS[SPT]TS[LPT]
n ˇ α β MeanMaxMeanMaxMeanMaxMeanMax
17.859112.55116.964715.816221.172232.795626.211944.5464
27.960112.96237.628417.943121.384532.283027.263342.7257
100−0.2538.073014.58817.722716.230023.655434.801328.489239.5713
48.399511.78918.079615.377623.883636.211030.663344.8234
59.081511.88508.259317.878824.270336.619430.689044.0691
17.741512.32267.385118.138920.461535.456925.549140.8982
27.871413.99047.580116.260122.234932.895525.824142.4867
100−0.338.722514.27737.748015.121624.402636.066626.743845.2058
48.776513.98567.822815.632024.894733.002727.514843.5470
58.960713.30198.693416.155325.639533.050029.855044.3070
17.843113.63576.881916.802019.947532.833525.617741.8062
28.066612.76496.896415.646420.768533.096928.897542.0365
100−0.3538.244814.32247.723116.829625.679333.255929.681142.5918
48.754612.39207.849017.206126.101033.425732.438340.7943
58.842014.12128.745415.505927.885934.160832.945043.7525
17.921012.37307.628815.143522.250834.154426.002842.2416
27.967213.01568.124515.766122.678433.554929.610743.1225
100−0.438.071613.90878.589015.842924.993733.757931.200041.0325
48.285014.44588.607116.209027.446537.650333.263339.9907
58.895412.01748.742016.499729.594636.499934.065242.0670
17.808814.96397.167515.032622.091135.605625.624241.3698
28.350514.43017.321015.647426.383233.354026.418443.9935
100−0.4538.458913.42658.079017.518028.368433.521429.935943.9453
49.065813.24968.768414.837128.463432.704631.637743.2563
59.127813.28758.843617.962129.734537.776934.222739.9585
17.197212.53917.713017.272520.756636.521128.649039.2685
27.886514.48667.772416.432821.595135.618031.459742.5865
150−0.2537.950514.44548.151916.745222.437734.136231.548340.9898
48.135411.80868.324015.559825.954633.243533.158844.9261
58.607313.44168.942316.329526.158436.010435.024445.1819
17.709713.29887.558018.081020.453438.226427.627240.9023
28.019413.10897.618216.635023.107233.269127.683244.0719
150−0.338.322714.61477.773116.545825.048833.798928.835344.6592
48.406211.84678.412815.540125.509034.641830.631842.8188
58.645812.09058.507116.433827.233132.684332.770944.5847
17.718411.18096.850616.903323.075536.384027.234844.9528
28.320314.93446.985317.094625.231834.675729.267242.5206
150−0.3538.347313.58847.626316.109527.878838.195529.649343.6254
48.526912.82877.784916.011928.296334.674431.020542.6907
58.894713.55988.085318.167528.589635.999433.930439.2950
17.102513.12716.957914.866722.179033.171727.069541.8880
27.311713.59646.958717.810326.967534.548129.914443.1194
150−0.437.928213.12337.168717.908027.228033.212731.605242.3483
48.084113.93378.145217.501228.587736.832832.329741.4305
58.296913.02558.301615.078529.122837.518034.244644.9121
16.878515.18087.049015.645220.753034.362727.824444.2488
27.786111.63587.071515.900523.952336.392730.177144.3694
150−0.4537.971711.11957.964817.096725.707834.019333.523441.4319
48.184111.13748.722515.209928.099035.453333.673842.7920
58.653310.92658.819017.240929.911637.283434.683944.5140
16.845612.48617.016815.106420.954335.858827.014044.8897
27.148312.68657.944817.006520.955534.268927.372043.2560
200−0.2538.057412.30888.114716.452222.586734.050128.822940.4102
48.510214.12798.280717.441724.445334.980129.509943.1659
58.689413.50768.501917.219427.503734.798532.698039.5797
17.055714.16667.209217.874821.782534.416326.587641.6427
27.428414.90257.798017.830322.741935.621229.038843.2466
200−0.337.938915.08497.964815.896323.846036.738429.403944.8911
48.838111.51398.125317.162824.346734.808830.739544.1343
58.991311.27088.885615.422728.273334.839233.928842.1224
17.822713.82037.358314.841720.739432.992826.713143.8002
27.999011.06477.637617.320222.438832.383827.513641.7063
200−0.3538.268213.03888.045416.472526.352333.995329.721845.1257
48.669713.06148.100116.402728.504034.161030.004945.2255
58.892914.57458.321017.878229.128836.199634.613544.4622
17.081612.85337.025216.854020.457738.038525.760041.5327
27.491312.43537.141916.881125.795337.909926.741741.9386
200−0.437.724113.70677.397617.721025.796135.012227.740440.6562
47.805014.02617.403817.533528.380633.693931.302443.9708
58.729113.01437.773516.738929.500436.867934.848844.5774
17.096012.22606.843015.371022.806736.840227.764744.7677
27.485511.32176.997415.569023.014636.726930.649242.5769
200−0.4537.763613.31647.367617.815025.183636.745332.705942.8270
47.916911.83477.695914.835227.077432.877934.844340.0533
58.378510.83898.550416.437329.750636.368634.951144.6814
Table 8. Error results (%) for n ˇ = 250 , 300 .
NEH[SPT]NEH[LPT]TS[SPT]TS[LPT]
n ˇ α β MeanMaxMeanMaxMeanMaxMeanMax
16.838414.08877.138415.318921.340835.044826.029141.9118
27.146811.74617.217618.135121.566233.522230.606840.4034
250−0.2537.248812.65918.143717.211224.688732.833031.311244.6810
47.317613.78168.803716.474025.656237.229832.797343.8362
57.421112.27878.887416.372026.680433.296933.759544.5752
16.892614.00366.851714.942721.199633.227526.456240.8920
26.984912.44107.776117.104521.951936.274127.528643.2854
250−0.337.215313.76158.320914.883023.078337.659227.639543.2302
47.788613.85598.500914.983826.735535.368028.661539.8926
58.920812.65878.935816.547628.261536.496832.707741.6463
17.312010.72517.333715.071620.171433.167026.740040.8325
28.042012.14897.645617.577520.384738.017431.511043.5532
250−0.3538.244712.57647.714617.575425.464435.515532.691740.8824
48.377511.87188.706117.245128.042336.357533.178744.6598
58.522611.53698.781615.256229.092632.457333.425944.2306
17.530214.39426.908117.026823.549137.142628.861841.5397
28.127812.60217.562016.537023.736236.775228.956042.2047
250−0.438.165014.69637.801918.115324.436832.964430.777943.4184
48.821312.42498.698916.989925.262335.419534.101744.2787
58.823614.15358.810917.515627.885134.211534.722642.8934
17.227212.45056.942116.311923.186235.549325.538942.6783
27.245614.33377.538816.237626.462834.654429.464641.1453
250−0.4537.570714.08937.717717.602427.090134.752730.724941.9490
47.998512.36188.033815.025528.719333.331634.453443.5354
58.400611.62378.042915.198229.772933.784335.017444.5871
17.219614.25096.838216.910220.610032.360125.776943.5790
27.622714.97777.248016.123922.525137.836829.059539.2503
300−0.2538.426312.13397.526716.909722.986936.199629.511843.2949
48.623813.70608.201316.015524.054737.891032.552941.8386
58.879312.64208.363715.777125.819633.227133.032341.8343
17.463414.44807.104717.825321.564637.821126.779039.8570
27.794514.15238.093817.078522.453737.054427.695044.1573
300−0.337.931311.40068.245618.028422.677535.736927.863241.1380
48.021814.57838.468617.428823.598034.904032.682140.6534
58.499015.16338.500917.135726.581533.797834.096341.2481
16.856912.98867.445817.734421.709736.795426.027941.4514
27.396514.68037.693317.898224.911633.622328.290042.5046
300−0.3537.778113.32527.998218.181125.114432.624828.893742.5993
48.304011.34348.815515.636925.532136.888731.221841.5754
58.352311.54988.902417.863227.620336.305834.434241.5897
16.889912.49707.819217.212320.582236.572627.098542.3123
26.960014.06028.107818.183723.469836.129127.143243.1886
300−0.438.302214.41188.255416.993825.860034.776730.353944.9970
48.664114.24898.335416.869928.082334.605233.579543.5882
58.957512.09258.407017.698329.113937.184735.123541.6017
16.961513.07847.243216.222121.737534.160527.738544.2633
27.089211.04707.460416.917721.831537.175029.737639.9637
300−0.4537.652211.14658.140917.954427.570337.020532.090339.5083
48.105911.25908.946417.114827.798437.403734.517039.6549
58.423313.73988.985017.811129.980635.301834.766440.1459