**1. Introduction**

Scheduling problems arise widely in manufacturing, logistics, and other practical applications. For a real-world example of the scheduling problems studied here, consider a processing enterprise that has no inventory capacity. As production proceeds, the processing technology improves, so the processing time of a product becomes shorter. The pick-up time of each product is determined by the customer. If a product is produced before or after its pick-up time, an additional delivery fee is incurred. The delivery price of each early (tardy) job is a fixed charge.

The following three forms of pick-up time (due date) are often considered: a common due date shared by all jobs (CON), a slack due date equal to a job's processing time plus a common slack (SLK), and a distinct due date for each job (DIF).


The scheduling problem is a classic discrete combinatorial optimization problem. Methods for solving it fall into two types: exact algorithms and approximate algorithms. Exact algorithms mainly include mathematical programming, dynamic programming, and branch-and-bound; approximate algorithms mainly include heuristic algorithms and intelligent algorithms. For large-scale scheduling problems that cannot be solved in polynomial time, intelligent algorithms and machine learning algorithms can be used. In this paper, a single-machine scheduling problem with due dates, delivery times and a learning effect is considered. The actual processing time of a job is a learning function of the processing times of the previously scheduled jobs. The objective is to minimize a weighted sum of the number of early jobs, the number of tardy jobs and the due date. Under the common due date, slack due date and different due date models, three polynomial-time algorithms are proposed to obtain the optimal sequence.

#### **2. Literature Review**

**Citation:** Qian, J.; Zhan, Y. The Due Date Assignment Scheduling Problem with Delivery Times and Truncated Sum-of-Processing-Times-Based Learning Effect. *Mathematics* **2021**, *9*, 3085. https://doi.org/10.3390/math9233085

Academic Editor: Javier Alcaraz

Received: 28 October 2021; Accepted: 26 November 2021; Published: 30 November 2021

**Publisher's Note:** MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

**Copyright:** © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

In traditional scheduling problems, the processing time of jobs is considered to be constant. However, in reality, the processing time is often reduced with the increase in workers' skills and abilities, which means the processing time is no longer a constant. In 2011, Cheng et al. developed a branch-and-bound algorithm and a simulated annealing algorithm to study the single-machine scheduling problem with a learning effect and truncated processing times [1]. In 2013, Li et al. analyzed a polynomial-time algorithm for the single-machine scheduling problem with truncated processing times [2]. In 2013, Cheng et al. used a genetic algorithm and a branch-and-bound algorithm to solve the two-machine flow-shop scheduling problem with a truncated learning function [3]. In 2016, Wu and Wang studied a single-machine scheduling problem with a learning effect and delivery times [4]. In 2017, Wang et al. solved the single-machine scheduling problem with resource allocation and deterioration effects using a polynomial-time algorithm [5]. In 2018, Wu et al. studied a two-stage scheduling problem with a position-based learning effect [6]. In 2018, Yin studied a single-machine scheduling problem with resource allocation and a learning effect [7]. In 2020, Zhang studied the scheduling problem with the sum-of-processing-times-based learning effect [8]. In 2020, Qian et al. designed a heuristic algorithm for the single-machine scheduling problem with release times and a learning factor [9]. In 2020, Zou et al. studied a multi-machine scheduling problem with the sum-of-processing-times-based learning effect [10]. In 2021, Wu et al. studied a flow-shop scheduling problem with a truncated learning function [11].

In the field of scheduling, the delivery time has attracted extensive attention. The extra time required for a completed job to be delivered to the customer is called the *past-sequence-dependent (psd)* delivery time. In 2011, Yang et al. studied a single-machine scheduling problem with delivery times and a learning effect [12]. In 2012, Yang et al. studied a single-machine scheduling problem with delivery times and position-dependent processing times [13]. In 2013, Liu studied a scheduling problem with delivery times and deteriorating jobs [14]. In 2014, Zhao et al. studied a single-machine scheduling problem with delivery times and general position-dependent processing times [15]. In 2021, Qian et al. studied a single-machine scheduling problem with delivery times and deteriorating jobs [16].

In actual production scheduling, the jobs often have due dates. If a job is completed ahead of the due date, it will have an earliness cost; if a job is completed behind the due date, it will have a tardiness cost. In 2013, Yin et al. studied a single-machine scheduling problem with a due date, delivery times and learning effect [17]. In 2014, Lu et al. studied a single-machine scheduling problem with a due date, learning effect and resource allocation [18]. In 2015, Li et al. studied a single-machine scheduling problem with a slack due window, learning effect and resource allocation [19]. In 2016, Sun et al. studied a single-machine scheduling problem with a due date and convex resource allocation [20]. In 2019, Geng et al. studied a flow-shop scheduling problem with a common due date, resource allocation and learning effect [21]. In 2020, Liu et al. studied a single-machine scheduling problem with a due date, learning effect and resource allocation [22]. In 2021, Tian studied a single-machine scheduling problem with resource allocation and generalized earliness–tardiness penalties [23]. In 2021, Wang studied a single-machine scheduling problem with proportional setup times and earliness–tardiness penalties [24]. In 1996, Lann et al. studied a single-machine scheduling problem whose goal was to minimize the number of early and tardy jobs [25]. In 2017, Yuan studied a single-machine scheduling problem to minimize the number of tardy jobs [26]. In 2021, Hermelin studied a single-machine scheduling problem to minimize the weighted number of tardy jobs [27].

The remainder of this paper is organized as follows. The problem is described in Section 3. The research methods are presented in Section 4. The results are discussed in Section 5. The conclusion is given in Section 6.

#### **3. Notation and Problem Statement**

Some notations used in this paper are introduced in Table 1.


**Table 1.** Symbol definition.

Suppose there were $n$ independent jobs $J = \{J_1, \cdots, J_n\}$ continuously processed on a single machine. The machine could handle only one job at a time. The actual processing time of $J_j$ scheduled at the $k$th position was:

$$p\_{j[k]}^A = p\_j \max \{ (1 + \sum\_{i=1}^{k-1} p\_{[i]})^a, \beta \}. \tag{1}$$

The delivery time $q_{[j]}$ of $J_{[j]}$ was:

$$q_{[j]} = r w_{[j]} = r \sum_{i=1}^{j-1} p_{[i]}^A \tag{2}$$

where $w_{[j]} = \sum_{i=1}^{j-1} p_{[i]}^A$ was the waiting time of $J_{[j]}$. The completion time of $J_{[j]}$ was:

$$\mathcal{C}\_{[j]} = w\_{[j]} + p\_{[j]}^A + q\_{[j]}.\tag{3}$$
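As an illustration, the quantities in Equations (1)–(3) can be computed in a single pass over a fixed job sequence. The following Python sketch is ours, not part of the paper; the function name and variable names are our own.

```python
# Sketch of the timing model in Equations (1)-(3); the function name and
# variable names are ours, not the paper's.
def schedule_times(p, a, beta, r):
    """For normal processing times p (listed in processing order), learning
    index a, truncation parameter beta, and delivery factor r, return the
    actual processing times, waiting times, delivery times, and completion
    times of the jobs."""
    pA, w, q, C = [], [], [], []
    s = 0.0     # sum of normal processing times of earlier jobs
    wait = 0.0  # sum of actual processing times of earlier jobs, w_[j]
    for pj in p:
        actual = pj * max((1 + s) ** a, beta)  # Equation (1)
        pA.append(actual)
        w.append(wait)
        q.append(r * wait)                     # Equation (2)
        C.append(wait + actual + r * wait)     # Equation (3)
        s += pj
        wait += actual
    return pA, w, q, C
```

For example, with $a = 0$ (no learning), $\beta = 0.5$ and $r = 0.1$, two jobs of lengths 1 and 2 complete at times 1 and 3.1.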

The common due date, slack due date and different due date were considered in this paper. For the CON model, the due date of each job was the same. For the SLK model, the due date of a job was the sum of its processing time and a common slack parameter $q$. For the DIF model, each job had its own due date. The due date was a decision variable. If $J_j$ was an early job, $U_j = 1$, $V_j = 0$; if $J_j$ was a tardy job, $U_j = 0$, $V_j = 1$. Using the three-field notation [28], the models could be defined as:

$$1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}, CON|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta d), \tag{4}$$

$$1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}, SLK|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta q), \tag{5}$$

$$1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}, DIF|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta d_j), \tag{6}$$

where *qpsd* represents the past-sequence-dependent delivery times. The following Figure 1 shows the just-in-time common due date scheduling model.

**Figure 1.** The just-in-time CON scheduling model.

#### **4. Research Method**

**Lemma 1.** *For the* $1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}|C_{\max}$ *problem, an optimal schedule could be obtained by the SPT (shortest processing time first) rule* [4]*.*

**Lemma 2.** *For the* $1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}|C_{\max}$ *problem, an optimal schedule could be obtained by the SPT rule* [4]*.*

#### *4.1. The Problem* $1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}, CON|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta d)$

**Lemma 3.** *For any job sequence, the due date d of the optimal scheduling was the completion time of some job.*

**Proof.** Suppose that the due date $d$ of the optimal scheduling was not equal to the completion time of any job, i.e., $C_{[h]} < d < C_{[h+1]}$, $0 \le h < n$, $C_{[0]} = 0$. The objective function was:

$$Z = h\alpha + (n - h)\delta + n\eta d. \tag{7}$$

When $d$ was equal to $C_{[h]}$, the objective function was:

$$Z_1 = (h - 1)\alpha + (n - h)\delta + n\eta C_{[h]}, \tag{8}$$

$$Z - Z_1 = \alpha + n\eta(d - C_{[h]}) > 0. \tag{9}$$

Therefore, *d* was the completion time of some job.
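The argument can be checked numerically; the completion times and cost parameters below are hypothetical values of our choosing, not from the paper.

```python
# Numeric illustration of Lemma 3 (values are hypothetical, chosen by us):
# moving d from strictly between C_[2] and C_[3] down to C_[2] lowers Z.
alpha, delta, eta, n = 1.0, 2.0, 0.2, 3
C = [1.0, 2.5, 4.0]   # completion times C_[1], C_[2], C_[3]
h = 2                 # suppose C_[2] < d < C_[3]
d = 3.0
Z = h * alpha + (n - h) * delta + n * eta * d                 # Equation (7)
Z1 = (h - 1) * alpha + (n - h) * delta + n * eta * C[h - 1]   # Equation (8)
# Z - Z1 equals alpha + n*eta*(d - C_[h]) > 0, as in Equation (9)
```

Here $Z = 5.8 > Z_1 = 4.5$, so lowering $d$ to the nearest completion time strictly reduced the objective.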

**Lemma 4.** *When α* ≥ *δ, the due date d was equal to 0.*

**Proof.** When the due date $d$ was equal to $C_{[h]}$, the objective function was:

$$Z = (h - 1)\alpha + (n - h)\delta + n\eta C_{[h]}. \tag{10}$$

(1) When $d$ was equal to $C_{[h-1]}$, the objective function was:

$$Z_1 = (h - 2)\alpha + (n - h + 1)\delta + n\eta C_{[h-1]}. \tag{11}$$

(2) When $d$ was equal to $C_{[h+1]}$, the objective function was:

$$Z_2 = h\alpha + (n - h - 1)\delta + n\eta C_{[h+1]}.$$

When $\alpha \ge \delta$,

$$Z - Z_1 = \alpha - \delta + n\eta(C_{[h]} - C_{[h-1]}) = \alpha - \delta + n\eta(p_{[h]}^A + r p_{[h-1]}^A) > 0, \tag{12}$$

$$Z - Z_2 = -\alpha + \delta + n\eta(C_{[h]} - C_{[h+1]}) = -\alpha + \delta - n\eta(p_{[h+1]}^A + r p_{[h]}^A) < 0, \tag{13}$$

i.e., $Z_2 > Z > Z_1$. Applying this argument repeatedly, the objective decreased as $d$ moved to an earlier completion time; therefore, the due date $d$ was equal to the start time of the first job, i.e., $d = 0$.

For the convenience of proof, we defined two sets: $G_1 = \{J_j \mid 1 \le j \le h\}$, $G_2 = \{J_j \mid h + 1 \le j \le n\}$, $d = C_{[h]}$.

**Lemma 5.** *In the optimal scheduling, the jobs of set G*<sup>1</sup> *were arranged in an ascending order of normal processing time.*

**Proof.** There were two adjacent jobs $J_u$ and $J_v$ in $G_1$, where $J_u$ was at the $k$th position and $J_v$ was at the $(k+1)$th position, $S_1 = \{J_1, \cdots, J_u, J_v, \cdots, J_n\}$. Suppose that the starting time of $J_1$ was 0, $d = C_{[h]}$, $1 \le k < h \le n$. The objective function of $S_1$ was:

$$Z_1 = (h - 1)\alpha + (n - h)\delta + n\eta C_{[h]}(S_1). \tag{14}$$

When $J_u$ and $J_v$ were swapped, the sequence of jobs was $S_2 = \{J_1, \cdots, J_v, J_u, \cdots, J_n\}$. The objective function of $S_2$ was:

$$Z_2 = (h - 1)\alpha + (n - h)\delta + n\eta C_{[h]}(S_2). \tag{15}$$

$$Z_1 - Z_2 = n\eta(C_{[h]}(S_1) - C_{[h]}(S_2)). \tag{16}$$

From $p_u \le p_v$ and Lemma 2, $C_{[h]}(S_1) \le C_{[h]}(S_2)$, so $Z_1 \le Z_2$, i.e., the jobs of set $G_1$ were arranged in an ascending order of normal processing time.

**Lemma 6.** *In the optimal scheduling, the jobs of set G*<sup>2</sup> *were arranged in any order of normal processing time.*

**Proof.** There were two adjacent jobs $J_u$ and $J_v$ in $G_2$, where $J_u$ was at the $k$th position and $J_v$ was at the $(k+1)$th position, $S_1 = \{J_1, \cdots, J_u, J_v, \cdots, J_n\}$, $h < k < n$. The objective function of $S_1$ was:

$$Z_1 = (h - 1)\alpha + (n - h)\delta + n\eta C_{[h]}(S_1). \tag{17}$$

When $J_u$ and $J_v$ were swapped, the sequence of jobs was $S_2 = \{J_1, \cdots, J_v, J_u, \cdots, J_n\}$. The objective function of $S_2$ was:

$$Z_2 = (h - 1)\alpha + (n - h)\delta + n\eta C_{[h]}(S_2). \tag{18}$$

Since both jobs were scheduled after position $h$, the swap did not change $C_{[h]}$, so

$$Z_1 = Z_2. \tag{19}$$

Therefore, the jobs of set $G_2$ could be arranged in any order of normal processing time.

**Lemma 7.** *In the optimal scheduling, the processing time of any job in the G*<sup>1</sup> *was less than the processing time of any job in the G*2*.*

**Proof.** There were two adjacent jobs $J_u$ and $J_v$, where $J_u$ was at the $h$th position in $G_1$ and $J_v$ was at the $(h+1)$th position in $G_2$, $S_1 = \{J_1, \cdots, J_u, J_v, \cdots, J_n\}$. The objective function of $S_1$ was:

$$Z_1 = (h - 1)\alpha + (n - h)\delta + n\eta C_{[h]}(S_1). \tag{20}$$

When $J_u$ and $J_v$ were swapped, the sequence of jobs was $S_2 = \{J_1, \cdots, J_v, J_u, \cdots, J_n\}$. The objective function of $S_2$ was:

$$Z_2 = (h - 1)\alpha + (n - h)\delta + n\eta C_{[h]}(S_2). \tag{21}$$

$$Z_1 - Z_2 = n\eta(p_u - p_v)\max\{(1 + \sum_{k=1}^{h-1} p_{[k]})^a, \beta\}. \tag{22}$$

If $p_u \le p_v$, then $Z_1 \le Z_2$, i.e., the processing time of any job in $G_1$ was not greater than the processing time of any job in $G_2$.

Algorithm 1 was summarized as follows:

**Algorithm 1** $1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}, CON|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta d)$

**Require:** *α*, *β*, *δ*, *η*, *a*, *r*, *p<sup>j</sup>* , *n*

**Ensure:** The optimal sequence, *d*

1: Arrange the jobs in an ascending order of normal processing time (the SPT rule).

2: For each $h = 0, 1, \cdots, n$, compute the objective value $Z(h) = (h-1)\alpha + (n-h)\delta + n\eta C_{[h]}$ with the candidate due date $d = C_{[h]}$, where $C_{[0]} = 0$ and $Z(0) = n\delta$.

3: Output the sequence and set $d = C_{[h]}$ for the $h$ with the minimum $Z(h)$.


**Theorem 1.** *For the problem* $1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}, CON|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta d)$*, the complexity of the algorithm was* $O(n \log n)$*.*

**Proof.** The first step required $O(n \log n)$ time. The second step required $O(n)$ time. The third step was completed in constant time. Therefore, the complexity of the algorithm was $O(n \log n)$.
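Since the numbered steps of Algorithm 1 do not survive in this copy, the following Python sketch reconstructs them from Lemmas 3–7 and the proof of Theorem 1; the function and variable names are ours, and we assume the candidate $d = 0$ is checked alongside the job completion times.

```python
# A sketch of Algorithm 1 (CON model), reconstructed from Lemmas 3-7;
# the function and variable names are ours, not the paper's.
def algorithm1_con(p, alpha, delta, eta, a, beta, r):
    """Return (sequence, d, Z): an SPT job sequence, an optimal common due
    date d, and the objective value sum(alpha*U_j + delta*V_j + eta*d)."""
    n = len(p)
    order = sorted(range(n), key=lambda j: p[j])  # Step 1: the SPT rule
    # Step 2: completion times under Equations (1)-(3)
    C, s, wait = [], 0.0, 0.0
    for j in order:
        actual = p[j] * max((1 + s) ** a, beta)
        C.append((1 + r) * wait + actual)
        s += p[j]
        wait += actual
    # Step 3: by Lemma 3, d is 0 or the completion time of some job
    best_Z, best_d = n * delta, 0.0               # candidate d = 0
    for h in range(1, n + 1):
        Z = (h - 1) * alpha + (n - h) * delta + n * eta * C[h - 1]
        if Z < best_Z:
            best_Z, best_d = Z, C[h - 1]
    return order, best_d, best_Z
```

With $\alpha \ge \delta$ the search returns $d = 0$, in line with Lemma 4; with $\alpha < \delta$ and a small $\eta$, a job completion time is chosen instead.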

#### *4.2. The Problem* $1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}, SLK|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta q)$

**Lemma 8.** *For the optimal scheduling, q was equal to* $(1 + r)$ *times the sum of the actual processing times of the first several jobs, i.e.,* $q = (1 + r)\sum_{j=1}^{h-1} p_{[j]}^A$ *for some* $h$*.*

**Proof.** Suppose that $q$ was not equal to $(1 + r)$ times the sum of the actual processing times of some jobs, i.e., $(1 + r)\sum_{j=1}^{h-1} p_{[j]}^A < q < (1 + r)\sum_{j=1}^{h} p_{[j]}^A$, $1 \le h \le n$, $p_{[0]} = 0$. The objective function was:

$$Z = h\alpha + (n - h)\delta + n\eta q. \tag{23}$$

When $q = (1 + r)\sum_{j=1}^{h-1} p_{[j]}^A$, the objective function was:

$$Z_1 = (h - 1)\alpha + (n - h)\delta + n\eta(1 + r)\sum_{j=1}^{h-1} p_{[j]}^A, \tag{24}$$

$$Z - Z_1 = \alpha + n\eta[q - (1 + r)\sum_{j=1}^{h-1} p_{[j]}^A] > 0. \tag{25}$$

Therefore, $q$ was equal to $(1 + r)$ times the sum of the actual processing times of some jobs.

**Lemma 9.** *When α* ≥ *δ, q was equal to 0.*

**Proof.** When $q$ was equal to $(1 + r)\sum_{j=1}^{h-1} p_{[j]}^A$ for the optimal scheduling, the objective function was:

$$Z = (h - 1)\alpha + (n - h)\delta + n\eta(1 + r)\sum_{j=1}^{h-1} p_{[j]}^A. \tag{26}$$

(1) When $q = (1 + r)\sum_{j=1}^{h-2} p_{[j]}^A$, the objective function was:

$$Z_1 = (h - 2)\alpha + (n - h + 1)\delta + n\eta(1 + r)\sum_{j=1}^{h-2} p_{[j]}^A. \tag{27}$$

(2) When $q = (1 + r)\sum_{j=1}^{h} p_{[j]}^A$, the objective function was:

$$Z_2 = h\alpha + (n - h - 1)\delta + n\eta(1 + r)\sum_{j=1}^{h} p_{[j]}^A. \tag{28}$$

$$Z - Z_1 = \alpha - \delta + n\eta(1 + r)p_{[h-1]}^A, \tag{29}$$

$$Z - Z_2 = -\alpha + \delta - n\eta(1 + r)p_{[h]}^A. \tag{30}$$

When $\alpha \ge \delta$, $Z - Z_1 > 0$ and $Z - Z_2 < 0$, i.e., $Z_2 > Z > Z_1$. Therefore, $q$ was equal to 0.

For the convenience of proof, we defined two sets: $G_3 = \{J_j \mid 1 \le j \le h - 1\}$, $G_4 = \{J_j \mid h \le j \le n\}$, $q = (1 + r)\sum_{j=1}^{h-1} p_{[j]}^A$.

**Lemma 10.** *In the optimal scheduling, the jobs of set G*<sup>3</sup> *were arranged in an ascending order of normal processing time.*

**Proof.** There were two adjacent jobs $J_u$ and $J_v$ in $G_3$, where $J_u$ was at the $k$th position and $J_v$ was at the $(k+1)$th position, $S_1 = \{J_1, \cdots, J_u, J_v, \cdots, J_n\}$. Suppose that the starting time of $J_1$ was 0, $q = (1 + r)\sum_{j=1}^{h-1} p_{[j]}^A$, $1 \le k \le h - 2$. The objective function of $S_1$ was:

$$Z_1 = (h - 1)\alpha + (n - h)\delta + n\eta(1 + r)\sum_{j=1}^{h-1} p_{[j]}^A(S_1). \tag{31}$$

When $J_u$ and $J_v$ were swapped, the sequence of jobs was $S_2 = \{J_1, \cdots, J_v, J_u, \cdots, J_n\}$. The objective function of $S_2$ was:

$$Z_2 = (h - 1)\alpha + (n - h)\delta + n\eta(1 + r)\sum_{j=1}^{h-1} p_{[j]}^A(S_2). \tag{32}$$

$$Z_1 - Z_2 = n\eta(1 + r)\left(\sum_{j=1}^{h-1} p_{[j]}^A(S_1) - \sum_{j=1}^{h-1} p_{[j]}^A(S_2)\right). \tag{33}$$

From $p_u \le p_v$ and Lemma 1, $Z_1 \le Z_2$, i.e., the jobs of set $G_3$ were arranged in an ascending order of normal processing time.

**Lemma 11.** *In the optimal scheduling, the jobs of set* $G_4$ *could be arranged in any order of normal processing time.*

**Proof.** There were two adjacent jobs $J_u$ and $J_v$ in $G_4$, where $J_u$ was at the $k$th position and $J_v$ was at the $(k+1)$th position, $S_1 = \{J_1, \cdots, J_u, J_v, \cdots, J_n\}$, $h \le k < n$. The objective function of $S_1$ was:

$$Z_1 = (h - 1)\alpha + (n - h)\delta + n\eta(1 + r)\sum_{j=1}^{h-1} p_{[j]}^A(S_1). \tag{34}$$

When $J_u$ and $J_v$ were swapped, the sequence of jobs was $S_2 = \{J_1, \cdots, J_v, J_u, \cdots, J_n\}$. The objective function of $S_2$ was:

$$Z_2 = (h - 1)\alpha + (n - h)\delta + n\eta(1 + r)\sum_{j=1}^{h-1} p_{[j]}^A(S_2). \tag{35}$$

Since both jobs were scheduled at position $h$ or later, the swap did not change $\sum_{j=1}^{h-1} p_{[j]}^A$, so

$$Z_1 = Z_2. \tag{36}$$

Therefore, the jobs of set $G_4$ could be arranged in any order of normal processing time.

**Lemma 12.** *In the optimal scheduling, the processing time of any job in the G*<sup>3</sup> *was less than the processing time of any job in the G*4*.*

**Proof.** There were two adjacent jobs $J_u$ and $J_v$, where $J_u$ was at the $(h-1)$th position in $G_3$ and $J_v$ was at the $h$th position in $G_4$, $S_1 = \{J_1, \cdots, J_u, J_v, \cdots, J_n\}$. The objective function of $S_1$ was:

$$Z_1 = (h - 1)\alpha + (n - h)\delta + n\eta(1 + r)\sum_{j=1}^{h-1} p_{[j]}^A(S_1). \tag{37}$$

When $J_u$ and $J_v$ were swapped, the sequence of jobs was $S_2 = \{J_1, \cdots, J_v, J_u, \cdots, J_n\}$. The objective function of $S_2$ was:

$$Z_2 = (h - 1)\alpha + (n - h)\delta + n\eta(1 + r)\sum_{j=1}^{h-1} p_{[j]}^A(S_2). \tag{38}$$

$$Z_1 - Z_2 = n\eta(1 + r)(p_u - p_v)\max\{(1 + \sum_{k=1}^{h-2} p_{[k]})^a, \beta\}. \tag{39}$$

If $p_u \le p_v$, then $Z_1 \le Z_2$, i.e., the processing time of any job in $G_3$ was not greater than the processing time of any job in $G_4$.

Algorithm 2 was summarized as follows:

**Algorithm 2** $1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}, SLK|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta q)$

**Require:** *α*, *β*, *δ*, *η*, *a*, *r*, *p<sup>j</sup>* , *n*

**Ensure:** The optimal sequence, *q*

1: Arrange the jobs in an ascending order of normal processing time (the SPT rule).

2: For each $h = 1, \cdots, n$, compute the objective value $Z(h) = (h-1)\alpha + (n-h)\delta + n\eta(1+r)\sum_{j=1}^{h-1} p_{[j]}^A$.

3: Output the sequence and set $q = (1+r)\sum_{j=1}^{h-1} p_{[j]}^A$ for the $h$ with the minimum $Z(h)$.


**Theorem 2.** *For the problem* $1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}, SLK|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta q)$*, the complexity of the algorithm was* $O(n \log n)$*.*

**Proof.** The first step required $O(n \log n)$ time. The second step required $O(n)$ time. The third step was completed in constant time. Therefore, the complexity of the algorithm was $O(n \log n)$.
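Algorithm 2's listed steps are likewise missing from this copy; the sketch below reconstructs them from Lemmas 8–12 under the same assumptions as before, with names of our own.

```python
# A sketch of Algorithm 2 (SLK model), reconstructed from Lemmas 8-12;
# the function and variable names are ours, not the paper's.
def algorithm2_slk(p, alpha, delta, eta, a, beta, r):
    """Return (sequence, q, Z): an SPT job sequence, an optimal slack q,
    and the objective value sum(alpha*U_j + delta*V_j + eta*q)."""
    n = len(p)
    order = sorted(range(n), key=lambda j: p[j])   # Step 1: the SPT rule
    # waiting times: w[h-1] = sum of the first h-1 actual processing times
    w, s, wait = [], 0.0, 0.0
    for j in order:
        w.append(wait)
        actual = p[j] * max((1 + s) ** a, beta)    # Equation (1)
        s += p[j]
        wait += actual
    # Steps 2-3: by Lemma 8, q = (1+r)*w[h-1] for some position h
    best_Z, best_q = float("inf"), 0.0
    for h in range(1, n + 1):
        qh = (1 + r) * w[h - 1]
        Z = (h - 1) * alpha + (n - h) * delta + n * eta * qh
        if Z < best_Z:
            best_Z, best_q = Z, qh
    return order, best_q, best_Z
```

The candidate $h = 1$ gives $q = 0$, so the case of Lemma 9 ($\alpha \ge \delta$) is covered by the same loop.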

#### *4.3. The Problem* $1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}, DIF|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta d_j)$

**Lemma 13.** *In the optimal scheduling, if* $\eta C_j \ge \delta$*, the due date* $d_j$ *of* $J_j$ *was equal to 0; otherwise,* $d_j$ *was equal to the completion time of* $J_j$*.*

**Proof.** The objective function was:

$$Z = \sum_{j=1}^{n} Z_j = \sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta d_j), \tag{40}$$

$$Z_j = \alpha U_j + \delta V_j + \eta d_j. \tag{41}$$

(1) When $C_j > d_j$, $J_j$ was tardy and

$$Z_j = \delta + \eta d_j, \tag{42}$$

which was minimized by $d_j = 0$, giving $Z_j = \delta$.

(2) When $C_j = d_j$,

$$Z_j = \eta C_j. \tag{43}$$

(3) When $C_j < d_j$, $J_j$ was early and

$$Z_j = \alpha + \eta d_j > \eta C_j. \tag{44}$$

Hence, the minimum cost of $J_j$ was

$$Z_j = \min\{\delta, \eta C_j\}. \tag{45}$$

When $\eta C_j \ge \delta$, $d_j$ was equal to 0; otherwise, $d_j$ was equal to $C_j$.

**Lemma 14.** *In the optimal scheduling, the jobs were sequenced in an increasing order of normal processing time.*

**Proof.** We considered the job sequence $S_1 = \{J_1, \cdots, J_u, J_v, \cdots, J_n\}$ with two adjacent jobs $J_u$ and $J_v$, where $J_u$ was at the $k$th position and $J_v$ was at the $(k+1)$th position, $1 \le k < n$. $Z_1$ was the objective function of $S_1$. When $J_u$ and $J_v$ were swapped, the sequence of jobs was $S_2 = \{J_1, \cdots, J_v, J_u, \cdots, J_n\}$, and $Z_2$ was the objective function of $S_2$.

$$Z_1 - Z_2 = \min\{\delta, \eta C_{[k]}(S_1)\} + \min\{\delta, \eta C_{[k+1]}(S_1)\} - \min\{\delta, \eta C_{[k]}(S_2)\} - \min\{\delta, \eta C_{[k+1]}(S_2)\}. \tag{46}$$

$$C_{[k]}(S_1) = (1 + r)\sum_{j=1}^{k-1} p_{[j]}^A + p_u \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, \tag{47}$$

$$C_{[k]}(S_2) = (1 + r)\sum_{j=1}^{k-1} p_{[j]}^A + p_v \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, \tag{48}$$

$$C_{[k+1]}(S_1) = (1 + r)\sum_{j=1}^{k-1} p_{[j]}^A + (1 + r)p_u \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\} + p_v \max\{(1 + \sum_{i=1}^{k-1} p_{[i]} + p_u)^a, \beta\}, \tag{49}$$

$$C_{[k+1]}(S_2) = (1 + r)\sum_{j=1}^{k-1} p_{[j]}^A + (1 + r)p_v \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\} + p_u \max\{(1 + \sum_{i=1}^{k-1} p_{[i]} + p_v)^a, \beta\}. \tag{50}$$

From $p_u \le p_v$ and Lemma 1, $C_{[k]}(S_1) \le C_{[k]}(S_2)$ and $C_{[k+1]}(S_1) \le C_{[k+1]}(S_2)$, so $Z_1 \le Z_2$. Therefore, the jobs were sequenced in an increasing order of normal processing time in the optimal scheduling.
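As a sanity check of Lemma 14, one can enumerate all sequences of a small instance and compare the SPT sequence against the best one. The job times below are our own; the cost parameters follow Example 1 in Section 5 ($\delta = 2$, $\eta = 0.2$, $a = -1$, $\beta = 0.5$, $r = 0.1$).

```python
# Brute-force check of Lemma 14 on a small instance; the job times are
# ours, the cost parameters follow Example 1 of the paper.
from itertools import permutations

def dif_cost(seq, p, delta, eta, a, beta, r):
    """Objective sum_j min(delta, eta*C_j) for a given sequence (Lemma 13)."""
    s, wait, total = 0.0, 0.0, 0.0
    for j in seq:
        actual = p[j] * max((1 + s) ** a, beta)  # Equation (1)
        Cj = (1 + r) * wait + actual             # Equation (3)
        total += min(delta, eta * Cj)            # Equation (45)
        s += p[j]
        wait += actual
    return total

p = [3.0, 1.0, 2.0]
params = dict(delta=2.0, eta=0.2, a=-1.0, beta=0.5, r=0.1)
best = min(dif_cost(seq, p, **params) for seq in permutations(range(3)))
spt = tuple(sorted(range(3), key=lambda j: p[j]))
# the SPT sequence attains the minimum over all 3! sequences
```

On this instance the SPT sequence matches the exhaustive minimum, as the lemma predicts.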

Algorithm 3 was summarized as follows:

**Algorithm 3** $1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}, DIF|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta d_j)$

**Require:** *α*, *β*, *δ*, *η*, *a*, *r*, *p<sup>j</sup>* , *n*

**Ensure:** The optimal sequence, $d_j$

1: Arrange the jobs in an ascending order of normal processing time (the SPT rule).

2: Compute the completion time $C_j$ of each job; if $\eta C_j \ge \delta$, set $d_j = 0$, otherwise set $d_j = C_j$.


**Theorem 3.** *For the problem* $1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}, DIF|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta d_j)$*, the complexity of the algorithm was* $O(n \log n)$*.*

**Proof.** The first step required $O(n \log n)$ time. The second step required $O(n)$ time. Therefore, the complexity of the algorithm was $O(n \log n)$.
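Algorithm 3 can be sketched in the same way, directly from Lemmas 13 and 14; names are ours. Note that $\alpha$ is not needed here, since by Lemma 13 no job is left early in the optimum.

```python
# A sketch of Algorithm 3 (DIF model), reconstructed from Lemmas 13-14;
# the function and variable names are ours, not the paper's.
def algorithm3_dif(p, delta, eta, a, beta, r):
    """Return (sequence, d, Z): an SPT job sequence, the per-job due dates,
    and the objective value sum_j min(delta, eta*C_j)."""
    n = len(p)
    order = sorted(range(n), key=lambda j: p[j])   # Step 1: the SPT rule
    d, Z, s, wait = [], 0.0, 0.0, 0.0
    for j in order:                                # Step 2: apply Lemma 13
        actual = p[j] * max((1 + s) ** a, beta)    # Equation (1)
        Cj = (1 + r) * wait + actual               # Equation (3)
        d.append(Cj if eta * Cj < delta else 0.0)
        Z += min(delta, eta * Cj)
        s += p[j]
        wait += actual
    return order, d, Z
```

Each job's due date is either its own completion time (paying $\eta C_j$) or 0 (paying the tardiness charge $\delta$), whichever is cheaper.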

#### **5. Discussion of Results**

*5.1. Numerical Discussion*

In this section, we used an example to show the calculation process for three different due dates.

**Example 1.** *There were five jobs processed sequentially on the same machine. The normal processing time of each job is shown in Table 2; the calculation results are shown in Tables 3–20 below:*

*α* = 1*, δ* = 2*, η* = 0.2*, a* = −1*, β* = 0.5*, r* = 0.1*.*

**Table 2.** Normal processing time.


**Solution 1:** $1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}, CON|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta d)$

First step: $p_5 < p_4 < p_2 < p_1 < p_3$. The processing sequence of jobs: $J_5 \to J_4 \to J_2 \to J_1 \to J_3$.

Second step:


Third step: The optimal due date was 1.

**Table 3.** Actual processing time.


**Table 4.** Waiting time.


**Table 5.** Delivery time.


**Table 6.** Completion time.


**Solution 2:** $1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}, SLK|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta q)$

First step: $p_5 < p_4 < p_2 < p_1 < p_3$. The processing sequence of jobs: $J_5 \to J_4 \to J_2 \to J_1 \to J_3$.

Second step:


Third step: The optimal *q* was 0.

**Table 7.** Actual processing time.


**Table 13.** Due date.


**Solution 3:** $1|p_{j[k]}^A = p_j \max\{(1 + \sum_{i=1}^{k-1} p_{[i]})^a, \beta\}, q_{psd}, DIF|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta d_j)$

First step: $p_5 < p_4 < p_2 < p_1 < p_3$. The processing sequence of jobs: $J_5 \to J_4 \to J_2 \to J_1 \to J_3$.

Second step: $Z = 2.12$.

**Table 16.** Due date.

Value 7.05 7.05 7.55 8.05 8.55


#### *5.2. Extension*

In the learning effect scheduling model, the learning index *a* was less than 0. If *a* > 0, it became the forgetting effect scheduling model.

$$1|p_{j[k]}^A = p_j(1 + \sum_{i=1}^{k-1} p_{[i]})^a, q_{psd}, CON(SLK, DIF)|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta d), \tag{51}$$

where $a > 0$. The same method could prove the following conclusions. When $0 < a \le 1$, the optimal sequence was obtained by the longest processing time (LPT) order. When $a > 1$, the optimal sequence was obtained by the shortest processing time order. Example 1 above is used to show the algorithmic process of the forgetting effect scheduling model (CON).

**Solution 4:** $1|p_{j[k]}^A = p_j(1 + \sum_{i=1}^{k-1} p_{[i]})^a, q_{psd}, CON|\sum_{j=1}^{n}(\alpha U_j + \delta V_j + \eta d)$, where $a = 0.5$.

First step: $p_3 > p_1 > p_2 > p_4 > p_5$. The processing sequence of jobs: $J_3 \to J_1 \to J_2 \to J_4 \to J_5$.

Second step:


Third step: The optimal due date was 0.

**Table 17.** Actual processing time.


#### **6. Conclusions**

Under the common due date, slack due date and different due date, a single-machine scheduling problem with delivery times and the truncated sum-of-processing-times-based learning effect was studied in this paper. The goal was to minimize the total cost comprising the number of early jobs, the number of tardy jobs and the due date. Under the three due date models, three polynomial-time algorithms with complexity $O(n \log n)$ were proposed to obtain the optimal sequence and due dates. The optimal sequence was arranged in an ascending order of normal processing time. Three examples were given to show the calculation process of the algorithms. In the future, the research could be extended to multi-machine environments; for example, it could be investigated whether polynomial-time algorithms exist for a flow-shop scheduling problem with delivery times, a truncated sum-of-processing-times-based learning effect and due dates. The truncated sum-of-processing-times-based forgetting effect was also studied in the single-machine scheduling environment.

**Author Contributions:** The work presented here was performed in collaboration among all authors. J.Q. designed, analyzed and wrote the paper. Y.Z. analyzed and reviewed the paper. All authors have read and agreed to the published version of the manuscript.

**Funding:** This study was supported by the Natural Science Foundation of Liaoning Province Project (grant no. 2021-MS-102) and the Fundamental Research Funds for the Central Universities (grant no. N2105021 and N2105020).

**Data Availability Statement:** Not applicable.

**Acknowledgments:** We thank the anonymous reviewers for their comments and insights that significantly improved our paper.

**Conflicts of Interest:** The authors declare no conflict of interest.

#### **References**

