Open access peer-reviewed chapter

# Polynomial Algorithm for Constructing Pareto-Optimal Schedules for Problem 1|r_j|L_max, C_max

Written By

Alexander A. Lazarev and Nikolay Pravdivets

Submitted: July 1st, 2020 Reviewed: August 21st, 2020 Published: November 2nd, 2020

DOI: 10.5772/intechopen.93677

From the Edited Volume

## Multicriteria Optimization

Edited by Nodari Vakhania and Frank Werner


## Abstract

In this chapter, we consider the single machine scheduling problem with given release dates, processing times, and due dates, under two objective functions. The first is to minimize the maximum lateness, that is, the maximum difference between each job's completion time and its due date. The second is to minimize the maximum completion time, that is, to complete all jobs as soon as possible. The problem is NP-hard in the strong sense. We provide a polynomial time algorithm for constructing a Pareto-optimal set of schedules with respect to the criteria of maximum lateness and maximum completion time, that is, for problem $1|r_j|L_{\max},C_{\max}$, in the subcase where $d_1\le d_2\le\dots\le d_n$ and $d_1-r_1-p_1\ge d_2-r_2-p_2\ge\dots\ge d_n-r_n-p_n$.

### Keywords

• single machine scheduling
• two-criteria scheduling
• Pareto-set
• Pareto-optimality
• minimization of maximum lateness
• minimization of maximum completion time
• polynomial time algorithm

## 1. Introduction

We consider a classical scheduling problem on a single machine. The release time of each job is given in advance and represents the minimum possible start time of that job. When constructing schedules, we consider two objective functions. The first is to minimize the maximum lateness, that is, the maximum difference between each job's completion time and its due date. The second is to minimize the maximum completion time, that is, to complete all jobs as soon as possible. The problem is NP-hard in the strong sense [1]. We provide a polynomial time algorithm for constructing a Pareto-optimal set of schedules with respect to the criteria of maximum lateness and maximum completion time, that is, for problem $1|r_j|L_{\max},C_{\max}$, in the subcase where $d_1\le d_2\le\dots\le d_n$ and $d_1-r_1-p_1\ge d_2-r_2-p_2\ge\dots\ge d_n-r_n-p_n$. An example of an instance satisfying these constraints is the case when every job has the same slack between its due date and its earliest possible completion time, that is, $d_j=r_j+p_j+z$ for a constant $z$.

## 2. Statement of the problem 1|r_j|L_max, C_max

We consider a single machine scheduling problem where a set of $n$ jobs $N=\{1,2,\dots,n\}$ has to be processed on a single machine. Each job is numbered, that is, the entry "job $j$" is equivalent to the entry "job numbered $j$." Simultaneous execution of jobs and preemption of a job in process are prohibited. For each job $j\in N$, the value $r_j$ is its minimum possible start time, $p_j>0$ is its processing time, and $d_j$ is its due date.

A schedule is represented by a set $\pi=\{s_j\}_{j\in N}$ of start times of the jobs. By $\tau=(j_1,\dots,j_n)$ we denote a permutation of the elements of the set $N$. The set of all different schedules of the jobs from $N$ is denoted by $\Pi(N)$. A schedule $\pi$ is called *feasible* if $s_j(\pi)\ge r_j$ for all $j\in N$. We denote the completion time of job $j\in N$ in schedule $\pi$ by $C_j(\pi)$. The difference $L_j(\pi)=C_j(\pi)-d_j$, $j\in N$, is the lateness of job $j$ in schedule $\pi$. The maximum lateness of the jobs of the set $N$ under schedule $\pi$ is

$$L_{\max}(\pi)=\max_{j\in N}\bigl(C_j(\pi)-d_j\bigr).\tag{1}$$

We denote the completion time of all jobs of the set Nin schedule πas

$$C_{\max}(\pi)=\max_{j\in N}C_j(\pi).$$

The problem is to find an optimal schedule $\pi^*$ with the smallest value of the maximum lateness:

$$L_{\max}^*=\min_{\pi\in\Pi(N)}L_{\max}(\pi)=L_{\max}(\pi^*).\tag{2}$$

For an arbitrary subset of jobs $M\subseteq N$ we also denote:

$$r(M)=\min_{j\in M}r_j,\quad d(M)=\max_{j\in M}d_j,\quad p(M)=\sum_{j\in M}p_j.\tag{3}$$

In the standard notation of Graham et al. [2], this problem is denoted by $1|r_j|L_{\max}$. Intensive work on this problem has continued since the early 1950s. Lenstra et al. [1] showed that the general case of the problem $1|r_j|L_{\max}$ is NP-hard in the strong sense.

Potts [3] introduced an iterative version of the extended Jackson rule (IJ) [4] and proved that $L_{\max}(\pi_{IJ})/L_{\max}^*\le 3/2$. Hall and Shmoys [5] modified the iterative version and created an algorithm (MIJ) that guarantees the bound $L_{\max}(\pi_{MIJ})/L_{\max}^*\le 4/3$. They also presented two approximation schemes that find an $\varepsilon$-approximate solution in $O(n\log n+n(1/\varepsilon)^{O(1/\varepsilon^2)})$ and $O((n/\varepsilon)^{O(1/\varepsilon)})$ operations. Mastrolilli [6] introduced an improved approximation scheme with a complexity of $O(n+(1/\varepsilon)^{O(1/\varepsilon)})$ operations.

A number of polynomially solvable cases of the problem have been found, starting with Jackson's early result [4] for the case $r_j=0$, $j\in N$, where a schedule in which the jobs are ordered by nondecreasing due dates (the EDD rule) is optimal. Such a schedule is also optimal when the release times and due dates are similarly ordered ($r_i\le r_j\Rightarrow d_i\le d_j$ for all $i,j\in N$).

A schedule constructed according to the extended Jackson rule (a Schrage schedule) is built as follows: for the next place in the schedule we select, among the already released unscheduled jobs, one with the minimum due date; if no unscheduled job has been released yet, we select the job with the minimum release time among the unscheduled jobs.
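The rule above can be sketched in Python; this is a minimal illustration only (the `jobs` dict mapping a job id to its $(r_j,p_j,d_j)$ triple is an assumed input format, not notation from the chapter):

```python
import heapq

def schrage(jobs):
    """Extended Jackson rule (Schrage schedule).

    jobs: dict job_id -> (r, p, d).  Returns the job sequence.
    At each decision point, schedule the released job with the smallest
    due date; if no job is released yet, jump to the earliest release.
    """
    by_release = sorted(jobs, key=lambda j: jobs[j][0])  # unscheduled, by r_j
    ready = []                      # heap of (d_j, j) over released jobs
    t, i, order = 0, 0, []
    while i < len(by_release) or ready:
        # release every job with r_j <= t
        while i < len(by_release) and jobs[by_release[i]][0] <= t:
            heapq.heappush(ready, (jobs[by_release[i]][2], by_release[i]))
            i += 1
        if not ready:               # machine idles until the next release
            t = jobs[by_release[i]][0]
            continue
        _, j = heapq.heappop(ready)
        order.append(j)
        t = max(t, jobs[j][0]) + jobs[j][1]
    return order
```

For instance, `schrage({1: (0, 4, 9), 2: (0, 1, 2)})` returns `[2, 1]`, since both jobs are released at time 0 and job 2 has the smaller due date.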

If the processing times of all jobs are equal, the complexity can be reduced to $O(n\log n)$ [7]. Vakhania [8] generalized this result to the case when the processing time of each job is restricted to either $p$ or $2p$, suggesting an algorithm with a complexity of $O(n^2\log n\log p)$.

The case when the job processing times are mutually divisible is considered in [9]. The author suggests a polynomial-time algorithm with a complexity of $O(n^3\log n\log^2 p_{\max})$ operations for this case.

Special cases $1|prec;r_j|C_{\max}$, $1|prec;p_j=p;r_j|L_{\max}$ and $1|prec;r_j;pmtn|L_{\max}$ with precedence constraints on the jobs have been addressed in the works of Lawler [10], Simons [11], and Baker et al. [12]. Hoogeveen [13] proposed a polynomial algorithm for the special case when the job parameters satisfy the constraints $d_j-p_j-A\le r_j\le d_j-A$, $j\in N$, for some constant $A$. A pseudo-polynomial algorithm for the NP-hard case when the release times and due dates are oppositely ordered ($d_1\ge\dots\ge d_n$ and $r_1\le\dots\le r_n$) was developed in [14].

We denote by $L_j^A(\pi)$ and $C_j^A(\pi)$ the lateness and the completion time of job $j\in N$ in schedule $\pi$ for an instance $A$ with job parameters $r_j^A$, $p_j^A$, $d_j^A$, $j\in N$. Respectively, $L_{\max}^A(\pi)=\max_{j\in N}L_j^A(\pi)$ is the maximum lateness of schedule $\pi$ for instance $A$.

This chapter deals with a problem with the two objective functions $L_{\max}$ and $C_{\max}$, which in the general case can be referred to as $1|r_j|L_{\max},C_{\max}$. This problem was considered in [15], where the authors study some dominance properties and conditions under which the Pareto-optimal set can be formed in polynomial time.

Definition 1.1. For any instance $A$ of the problem, each permutation $\tau$ of the jobs of the set $N$ uniquely defines an *early schedule* $\pi_\tau^A$. In the early schedule, each job $j\in N$ starts immediately after the completion of the previous job in the corresponding permutation; if the completion time of the previous job is less than the release time of the current job, then the start of the current job is equal to its release time. That is, if $\tau=(j_1,j_2,\dots,j_n)$, then $\pi_\tau^A=(s_{j_1},s_{j_2},\dots,s_{j_n})$, where

$$s_{j_1}=r_{j_1}^A,\qquad s_{j_k}=\max\{s_{j_{k-1}}+p_{j_{k-1}}^A,\;r_{j_k}^A\},\quad k=2,\dots,n.\tag{4}$$

Early schedules play an important role in our construction, since it is sufficient to check all early schedules to find the optimal schedule of any problem instance.
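Definition 1.1 translates directly into code. A minimal sketch, assuming the jobs are given as dicts of release times, processing times, and due dates keyed by job id, with nonnegative release times:

```python
def early_schedule(perm, r, p):
    """Start times of the early schedule pi_tau for permutation perm (Eq. (4)):
    each job starts as soon as the previous one finishes, but never before
    its own release time (release times assumed nonnegative)."""
    s, t = {}, 0
    for j in perm:
        t = max(t, r[j])   # s_{j_1} = r_{j_1}; s_{j_k} = max(C_{j_{k-1}}, r_{j_k})
        s[j] = t
        t += p[j]
    return s

def lmax_cmax(perm, r, p, d):
    """Maximum lateness and maximum completion time of the early schedule."""
    s = early_schedule(perm, r, p)
    completions = {j: s[j] + p[j] for j in perm}
    return (max(completions[j] - d[j] for j in perm),
            max(completions.values()))
```

For example, with $r=(0,4)$, $p=(2,1)$, $d=(3,6)$ and the permutation $(1,2)$, job 2 waits for its release at time 4 and the pair $(L_{\max},C_{\max})$ equals $(-1,5)$.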

By $\tau^A$ we denote an optimal permutation and by $\pi^A$ an optimal schedule for instance $A$. Only early optimal schedules are considered, that is, $\pi^A=\pi_{\tau^A}^A$.

We denote by $\Pi(N)$ the set of all permutations of the jobs of the set $N$, and by $\Pi^A$ the set of feasible schedules for instance $A$.

## 3. Problem 1|d_i ≤ d_j, d_i − r_i − p_i ≥ d_j − r_j − p_j|L_max, C_max

This section deals with the problem of constructing a Pareto-optimal set with respect to the criteria $C_{\max}$ and $L_{\max}$, that is, problem $1|r_j|L_{\max},C_{\max}$. We suggest an algorithm for constructing a set of schedules $\Phi(N,t)=\{\pi_1,\pi_2,\dots,\pi_m\}$ for which

$$C_{\max}(\pi_1)<C_{\max}(\pi_2)<\dots<C_{\max}(\pi_m),\tag{5}$$
$$L_{\max}(\pi_1)>L_{\max}(\pi_2)>\dots>L_{\max}(\pi_m).\tag{6}$$

There is no schedule $\pi$ such that $C_{\max}(\pi)\le C_{\max}(\pi_i)$ and $L_{\max}(\pi)\le L_{\max}(\pi_i)$, with at least one of the inequalities strict, for some $i$, $i=1,\dots,m$. It is shown that $m\le n$.
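For tiny instances, the Pareto set described by (5)–(6) can also be obtained by exhaustive search, which is useful for checking any polynomial construction. A brute-force sketch (exponential in $n$, for illustration only; the input format is an assumption):

```python
from itertools import permutations

def evaluate(perm, r, p, d):
    """(C_max, L_max) of the early schedule for the permutation perm."""
    t, lmax = 0, float("-inf")
    for j in perm:
        t = max(t, r[j]) + p[j]          # completion time of job j
        lmax = max(lmax, t - d[j])
    return t, lmax

def pareto_bruteforce(r, p, d):
    """All non-dominated (C_max, L_max) pairs, by exhaustive search
    over every permutation.  Sorted by C_max, so L_max decreases,
    matching conditions (5)-(6)."""
    points = {evaluate(perm, r, p, d) for perm in permutations(r)}
    return sorted(pt for pt in points
                  if not any(o != pt and o[0] <= pt[0] and o[1] <= pt[1]
                             for o in points))
```

For $r=(0,2)$, $p=(5,1)$, $d=(10,3)$ the two orders give the incomparable points $(6,3)$ and $(8,0)$, so both belong to the Pareto set.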

### 3.1 Problem properties

We denote the precedence of jobs $i$ and $j$ in schedule $\pi$ by $i\to j(\pi)$. We also introduce

$$r_j(t)=\max\{r_j,t\};\tag{7}$$
$$r(N,t)=\min_{j\in N}r_j(t).\tag{8}$$

When it is obvious which set of jobs we mean, we write $r(t)$ instead of $r(N,t)$.

We assume that the job parameters satisfy the following constraints:

$$d_1\le\dots\le d_n,\qquad d_1-r_1-p_1\ge\dots\ge d_n-r_n-p_n.\tag{9}$$

For example, these constraints hold in the case $d_j=r_j+p_j+z$, $j=1,\dots,n$, where $z$ is a constant, that is, when all jobs have the same slack between release and due date. A problem with similar constraints but with a single objective function ($L_{\max}$) is considered in [16].
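A quick way to test whether a concrete instance falls into this subcase is to renumber the jobs by nondecreasing due dates and check that the slacks $d_j-r_j-p_j$ are nonincreasing. A small sketch (the `jobs` dict of $(r_j,p_j,d_j)$ triples is an assumed input format):

```python
def satisfies_conditions_9(jobs):
    """Check conditions (9): after renumbering by nondecreasing due dates,
    the slacks d_j - r_j - p_j must be nonincreasing."""
    seq = sorted(jobs.values(), key=lambda x: x[2])      # (r, p, d) by d_j
    slacks = [d - r - p for (r, p, d) in seq]
    return all(a >= b for a, b in zip(slacks, slacks[1:]))

# With d_j = r_j + p_j + z every slack equals z, so (9) always holds:
jobs = {j: (r, p, r + p + 7) for j, (r, p) in enumerate([(0, 3), (2, 1), (5, 4)])}
assert satisfies_conditions_9(jobs)
```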

We assume that $|N|>1$ and $t$ is the time when the machine becomes ready. From the set $N$, we find two jobs $f=f(N,t)$ and $s=s(N,t)$ in the following way:

$$f(N,t)=\arg\min\{d_j:\ j\in N,\ r_j(t)=r(N,t)\},\tag{10}$$
$$s(N,t)=\arg\min\{d_j:\ j\in N\setminus\{f\},\ r_j(t)=r(N\setminus\{f\},t)\},\tag{11}$$

where $f=f(N,t)$. If $N=\{i\}$, then we set $f(N,t)=i$ and $s(N,t)=0$ for all $t$. We also define $d_0=+\infty$, $f(\varnothing,t)=0$, $s(\varnothing,t)=0$ for all $t$. For the jobs $f$ and $s$ the following properties hold.
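Definitions (10)–(11) can be sketched as follows. Ties among jobs attaining the minimum due date are not resolved in the chapter, so this sketch simply takes the first such job; the fictitious job $0$ with $d_0=+\infty$ is represented by `None` (input format assumed):

```python
def f_and_s(jobs, N, t):
    """Jobs f(N,t) and s(N,t) of definitions (10)-(11).

    jobs: dict j -> (r_j, p_j, d_j); N: set of job ids; t: current time.
    r_j(t) = max(r_j, t); among the jobs attaining r(N,t) = min r_j(t),
    f has the smallest due date, and s is chosen the same way in N \ {f}.
    """
    def pick(M):
        if not M:
            return None                      # stands for the fictitious job 0
        r_min = min(max(jobs[j][0], t) for j in M)
        return min((j for j in M if max(jobs[j][0], t) == r_min),
                   key=lambda j: jobs[j][2])
    f = pick(set(N))
    s = pick(set(N) - {f})
    return f, s
```

For instance, at $t=0$ with jobs released at $0,0,4$, both jobs 1 and 2 attain $r(N,t)=0$, and the one with the smaller due date becomes $f$.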

Lemma 1.1. If the jobs of the set $N$ satisfy (9), then for any schedule $\pi\in\Pi(N)$ and every $j\in N\setminus\{f\}$ with $j\to f(\pi)$,

$$L_j(\pi)<L_f(\pi)\tag{12}$$

holds, and for every $j\in N\setminus\{f,s\}$ satisfying the condition $j\to s(\pi)$,

$$L_j(\pi)<L_s(\pi),\tag{13}$$

where $f=f(N,t)$ and $s=s(N,t)$, also holds.

Proof. For each job $j$ such that $j\to f(\pi)$, the completion time satisfies $C_j(\pi)<C_f(\pi)$. If $d_j\ge d_f$, then obviously

$$L_j(\pi)=C_j(\pi)-d_j<C_f(\pi)-d_f=L_f(\pi),\tag{14}$$

so (12) is valid.

Now let $d_j<d_f$ for a job $j\in N$ with $j\to f(\pi)$. We claim that $r_j>r_f$. Indeed, if $r_j\le r_f$, then $r_j(t)\le r_f(t)$ and $r_f(t)=r(t)$, as follows from (7) and (10). Then $r_j(t)=r_f(t)=r(t)$ and $d_j<d_f$, which contradicts the definition (10) of job $f$. Therefore, $r_j>r_f$. Obviously $C_j(\pi)-p_j<C_f(\pi)-p_f$ and, since $r_j>r_f$,

$$C_j(\pi)-p_j-r_j<C_f(\pi)-p_f-r_f,\tag{15}$$
$$C_j(\pi)-C_f(\pi)<p_j+r_j-p_f-r_f.\tag{16}$$

Since $d_j<d_f$, from (9) we get $d_j-r_j-p_j\ge d_f-r_f-p_f$, that is, $r_j+p_j-r_f-p_f\le d_j-d_f$, so $C_j(\pi)-C_f(\pi)<p_j+r_j-p_f-r_f\le d_j-d_f$. Hence, $L_j(\pi)<L_f(\pi)$ for each job $j$, $j\to f(\pi)$.

The inequality (13) can be proved in a similar way.

For each job $j$ satisfying the condition $j\to s(\pi)$, we have $C_j(\pi)<C_s(\pi)$. If $d_j\ge d_s$, then $L_j(\pi)=C_j(\pi)-d_j<C_s(\pi)-d_s=L_s(\pi)$, so (13) is true.

Let $d_j<d_s$ for a job $j\in N\setminus\{f\}$ with $j\to s(\pi)$; then $r_j>r_s$. Indeed, suppose $r_j\le r_s$; then $r_j(t)\le r_s(t)$ (it follows from (7)). In addition, $r_s(t)\ge r(t)$ by the definitions (8) and (11). If $r_s(t)=r(t)$, then for the jobs $j$ and $s$ we can write $r_j(t)=r_s(t)=r(t)$ and $d_j<d_s$, which contradicts the definition (11) of job $s(N,t)$. If $r_s(t)>r(t)$, that is, $r_s>r(t)$, then there is no job $i\in N\setminus\{f,s\}$ for which $r_s>r_i>r(t)$. Therefore, for the jobs $j$ and $s$ we again get $r_j(t)=r_s(t)$ and $d_j<d_s$, which contradicts the definition (11) of job $s(N,t)$. Therefore, $r_j>r_s$.

Since $C_j(\pi)\le C_s(\pi)-p_s$ and $p_j>0$, we have $C_j(\pi)-p_j<C_s(\pi)-p_s$, and since $r_j>r_s$, therefore $C_j(\pi)-p_j-r_j<C_s(\pi)-p_s-r_s$ and

$$C_j(\pi)-C_s(\pi)<p_j+r_j-p_s-r_s.\tag{17}$$

Since $d_j<d_s$, from (9) we have $d_j-r_j-p_j\ge d_s-r_s-p_s$, so

$$C_j(\pi)-C_s(\pi)<p_j+r_j-p_s-r_s\le d_j-d_s.\tag{18}$$

Hence, $L_j(\pi)<L_s(\pi)$ for each job $j\in N\setminus\{f\}$, $j\to s(\pi)$.

Theorem 1.1. If conditions (9) hold for the jobs of a subset $N'\subseteq N$, then for any time $t'\ge t$ and any early schedule $\pi\in\Pi(N')$ there exists $\pi'\in\Pi(N')$ such that

$$L_{\max}(\pi',t')\le L_{\max}(\pi,t')\quad\text{and}\quad C_{\max}(\pi',t')\le C_{\max}(\pi,t')\tag{19}$$

and one of the jobs $f=f(N',t')$ or $s=s(N',t')$ is in the first position of schedule $\pi'$. If $d_f\le d_s$, then job $f$ is in the first position of schedule $\pi'$.

Proof. Let $\pi=(\pi^1,f,\pi^2,s,\pi^3)$, where $\pi^1,\pi^2$ and $\pi^3$ are partial schedules of $\pi$. We construct the schedule $\pi'=(f,\pi^1,\pi^2,s,\pi^3)$. From the definitions (7), (8), (10) we get $r_f(t')\le r_j(t')$, $j\in N'$, hence $C_{\max}((f,\pi^1),t')\le C_{\max}((\pi^1,f),t')$ and

$$C_{\max}(\pi',t')\le C_{\max}(\pi,t'),\tag{20}$$
$$L_j(\pi',t')\le L_j(\pi,t'),\quad j\in(\pi^2,s,\pi^3).\tag{21}$$

From Lemma 1.1 we have

$$L_j(\pi',t')<L_s(\pi',t'),\quad j\in(\pi^1,\pi^2).\tag{22}$$

Obviously, the following inequality is true for job $f$:

$$L_f(\pi',t')\le L_f(\pi,t').\tag{23}$$

From (20)–(23) we get $C_{\max}(\pi',t')\le C_{\max}(\pi,t')$ and $L_{\max}(\pi',t')\le L_{\max}(\pi,t')$.

Now let $\pi=(\pi^1,s,\pi^2,f,\pi^3)$, that is, job $s$ precedes job $f$. Construct the schedule $\pi'=(s,\pi^1,\pi^2,f,\pi^3)$. The proof is then repeated as for job $f$. The first part of the theorem is proved.

Let us assume $d_f\le d_s$ and $\pi=(\pi^1,s,\pi^2,f,\pi^3)$. We construct the schedule $\pi'=(f,\pi^{11},\pi^{12},\pi^3)$, where $\pi^{11}$ and $\pi^{12}$ are schedules of the jobs of the sets $\{j\in N':j\in(\pi^1,s,\pi^2),\ d_j<d_f\}$ and $\{j\in N':j\in(\pi^1,s,\pi^2),\ d_j\ge d_f\}$, respectively. The jobs in $\pi^{11}$ and $\pi^{12}$ are ordered by nondecreasing release times $r_j$. From $d_s\ge d_f$ we conclude that $s\in\pi^{12}$.

For each job $j\in\pi^{11}$ we have $d_j<d_f$. From (9) we get $d_j-r_j-p_j\ge d_f-r_f-p_f$, hence $r_j+p_j<r_f+p_f$, $j\in\pi^{11}$, and $C_{\max}((f,\pi^{11}),t')=r_f(t')+p_f+\sum_{j\in\pi^{11}}p_j$. Since the jobs in $\pi^{12}$ are sorted by nondecreasing release times, $C_{\max}((f,\pi^{11},\pi^{12}),t')\le C_{\max}((\pi^1,s,\pi^2,f),t')$. As a result,

$$C_{\max}(\pi',t')\le C_{\max}(\pi,t'),\tag{24}$$
$$L_j(\pi',t')\le L_j(\pi,t'),\quad j\in\pi^3.\tag{25}$$

Each job $j\in\pi^{12}$ satisfies $d_j\ge d_f$ and $C_j(\pi',t')\le C_f(\pi,t')$, which means

$$L_j(\pi',t')\le L_f(\pi,t'),\quad j\in\pi^{12}.\tag{26}$$

Since $s\in\pi^{12}$,

$$L_s(\pi',t')\le L_f(\pi,t').\tag{27}$$

From Lemma 1.1,

$$L_j(\pi',t')\le L_s(\pi',t'),\quad j\in\pi^{11}.\tag{28}$$

Moreover, it is obvious that

$$L_f(\pi',t')\le L_f(\pi,t').\tag{29}$$

From inequalities (24)–(29) it follows that $C_{\max}(\pi',t')\le C_{\max}(\pi,t')$ and $L_{\max}(\pi',t')\le L_{\max}(\pi,t')$. The theorem is proved.

We call a schedule $\pi'\in\Pi(N)$ *effective* if there is no schedule $\pi\in\Pi(N)$ such that $L_{\max}(\pi)\le L_{\max}(\pi')$ and $C_{\max}(\pi)\le C_{\max}(\pi')$ with at least one of the inequalities strict.

Thus, when constraints (9) are satisfied for the jobs of the set $N$, there is an effective schedule $\pi$ that starts with either the job $f=f(N,t)$ or the job $s=s(N,t)$. Moreover, if $d_f\le d_s$, then there is an effective schedule $\pi$ starting with job $f$.

We define the set of schedules $\Omega(N,t)$ as a subset of $\Pi(N)$. A schedule $\pi=(i_1,i_2,\dots,i_n)$ belongs to $\Omega(N,t)$ if each job $i_k$, $k=1,2,\dots,n$, is chosen as $f_k=f(N_{k-1},C_{i_{k-1}})$ or $s_k=s(N_{k-1},C_{i_{k-1}})$, where $N_{k-1}=N\setminus\{i_1,i_2,\dots,i_{k-1}\}$, $C_{i_{k-1}}=C_{i_{k-1}}(\pi)$, and $N_0=N$, $C_{i_0}=t$. If $d_{f_k}\le d_{s_k}$, then $i_k=f_k$; if $d_{f_k}>d_{s_k}$, then $i_k=f_k$ or $i_k=s_k$. It is obvious that the set $\Omega(N,t)$ contains at most $2^n$ schedules.

Example 1.1.

$$n=2m,\qquad t\le r_1<r_2<\dots<r_n,$$
$$r_{2i-1}<r_{2i}+p_{2i}<r_{2i-1}+p_{2i-1},\quad 1\le i\le m,$$
$$r_{2i-1}+p_{2i-1}+p_{2i}<r_{2i+1}<r_{2i}+p_{2i}+p_{2i-1}<r_{2i+2},\quad 1\le i\le m-1,$$
$$r_{2i-1}+p_{2i-1}+p_{2i}-d_{2i-1}>y,\qquad r_{2i}+p_{2i}+p_{2i-1}-d_{2i}\le y,\quad 1\le i\le m.$$

The set $\Omega(N,t)$ contains $2^m$ schedules. The value of $y$ is used below in the text. The optimal solution of the problem $1|r_j,\ d_j=r_j+p_j;\ L_{\max}\le y|C_{\max}$ is $\pi=(2,1,4,3,\dots,n,n-1)$.

Theorem 1.2. If (9) holds for the jobs of a subset $N'\subseteq N$, $|N'|=n'$, then for any time $t'\ge t$ and any schedule $\pi\in\Pi(N')$ there exists a schedule $\pi'\in\Omega(N',t')$ such that

$$L_{\max}(\pi',t')\le L_{\max}(\pi,t')\quad\text{and}\quad C_{\max}(\pi',t')\le C_{\max}(\pi,t').\tag{30}$$

Proof. Let $\pi=(j_1,j_2,\dots,j_{n'})$ be an arbitrary schedule. We denote the first $l$ jobs of the schedule $\pi$ by $\pi_l$, $l=0,1,2,\dots,n'$, where $\pi_0$ is the empty schedule, and $\bar\pi_l=(j_{l+1},\dots,j_{n'})$, so that $\pi=(\pi_l,\bar\pi_l)$. We introduce $N_l=N'\setminus\pi_l$ and $C_l=C_{\max}(\pi_l,t')$. Suppose that for some $l$, $0\le l<n'$, $\pi_l$ is the largest initial partial schedule of some schedule from $\Omega(N',t')$. If $j_1\ne f(N',t')$ and $j_1\ne s(N',t')$, then $\pi_l=\pi_0$, $l=0$, that is, the largest such partial schedule is empty. Let $f=f(N_l,C_l)$ and $s=s(N_l,C_l)$. If $d_f>d_s$, then $j_{l+1}\ne f$ and $j_{l+1}\ne s$; moreover, when $d_f\le d_s$, then $j_{l+1}\ne f$, since $\pi_{l+1}$ is not an initial partial schedule of any schedule from $\Omega(N',t')$.

According to Theorem 1.1, for the jobs of the set $\bar\pi_l$, $\bar\pi_l\in\Pi(N_l)$, there is a schedule $\bar\pi'_l$ starting at time $C_l$ for which $L_{\max}(\bar\pi'_l,C_l)\le L_{\max}(\bar\pi_l,C_l)$, $C_{\max}(\bar\pi'_l,C_l)\le C_{\max}(\bar\pi_l,C_l)$, and $\bar\pi'_l[1]=f$ or $s$; moreover, when $d_f\le d_s$, $\bar\pi'_l[1]=f$, where $\sigma[k]$ denotes the job in the $k$-th place of schedule $\sigma$. Hence, $L_{\max}((\pi_l,\bar\pi'_l),t')\le L_{\max}((\pi_l,\bar\pi_l),t')$ and $C_{\max}((\pi_l,\bar\pi'_l),t')\le C_{\max}((\pi_l,\bar\pi_l),t')$.

Let us denote $\pi'=(\pi_l,\bar\pi'_l)$. A feature of the schedule $\pi'$ is that its first $l+1$ jobs coincide with the first $l+1$ jobs of some schedule from the set $\Omega(N',t')$, and $L_{\max}(\pi',t')\le L_{\max}(\pi,t')$, $C_{\max}(\pi',t')\le C_{\max}(\pi,t')$.

After at most $n'$ such sequential transformations (since the schedule length is $n'\le n$) of the originally chosen schedule $\pi$, we arrive at a schedule $\pi'\in\Omega(N',t')$ for which $L_{\max}(\pi',t')\le L_{\max}(\pi,t')$ and $C_{\max}(\pi',t')\le C_{\max}(\pi,t')$. The theorem is proved.

We form the following partial schedule $\omega(N,t)=(i_1,i_2,\dots,i_l)$. For each job $i_k$, $k=1,2,\dots,l$, we have $i_k=f_k$ and $d_{f_k}\le d_{s_k}$, where $f_k=f(N_{k-1},C_{k-1})$ and $s_k=s(N_{k-1},C_{k-1})$. For $f=f(N_l,C_l)$ and $s=s(N_l,C_l)$ the inequality $d_f>d_s$ holds. In the case when $d_f>d_s$ already for $f=f(N,t)$ and $s=s(N,t)$, we set $\omega(N,t)=\varnothing$. Thus $\omega(N,t)$ is the "maximal" partial schedule during whose construction the job (namely $f$) for the next place of the schedule can be chosen uniquely. We can construct the schedule $\omega(N,t)$ for the set of jobs $N$ starting at time $t$ using Algorithm 1.1.

Algorithm 1.1 for constructing the schedule $\omega(N,t)$.

1: Initial step. Let $\omega\leftarrow\varnothing$.

2: Main step. Find the jobs $f\leftarrow f(N,t)$ and $s\leftarrow s(N,t)$;

3: if $d_f\le d_s$ then

4:  $\omega\leftarrow(\omega,f)$;

5: else

6:  the algorithm stops;

7: end if

8: Let $N\leftarrow N\setminus\{f\}$, $t\leftarrow r_f(t)+p_f$ and go to the next main step.
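Algorithm 1.1 can be sketched as follows; `pick` implements the selection of $f$ and $s$ by (10)–(11), with the fictitious job $0$ ($d_0=+\infty$) represented by `None` (the `jobs` dict of $(r_j,p_j,d_j)$ triples is an assumed input format):

```python
def build_omega(jobs, N, t):
    """Algorithm 1.1: build the partial schedule omega(N, t).

    jobs: dict j -> (r_j, p_j, d_j).  Repeatedly appends f = f(N,t)
    while d_f <= d_s; stops at the first ambiguous choice (d_f > d_s).
    Returns (omega, remaining_jobs, completion_time).
    """
    def pick(M, tau):
        # job with minimum due date among those attaining r(M, tau)
        if not M:
            return None
        r_min = min(max(jobs[j][0], tau) for j in M)
        return min((j for j in M if max(jobs[j][0], tau) == r_min),
                   key=lambda j: jobs[j][2])

    N, omega = set(N), []
    while N:
        f = pick(N, t)
        s = pick(N - {f}, t)
        if s is not None and jobs[f][2] > jobs[s][2]:
            break                               # d_f > d_s: algorithm stops
        omega.append(f)
        t = max(jobs[f][0], t) + jobs[f][1]     # t <- r_f(t) + p_f
        N.remove(f)
    return omega, N, t
```

When $d_f>d_s$ already at the first step, the function returns the empty prefix, matching the convention $\omega(N,t)=\varnothing$.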

Lemma 1.2. The complexity of Algorithm 1.1 for finding the schedule $\omega(N,t)$ is at most $O(n\log n)$ operations for any $N$ and any $t$.

Proof. At each iteration of Algorithm 1.1 we find two jobs, $f=f(N,t)$ and $s=s(N,t)$. If the jobs are kept ordered by release times $r_j$ (so that the time $r(t)$ is found in $O(1)$ operations), then finding the two jobs $f$ and $s$ requires $O(\log n)$ operations. The total number of iterations is at most $n$. Thus, constructing the schedule $\omega(N,t)$ requires $O(n\log n)$ operations.

The main step of Algorithm 1.1 is finding the jobs $f$ and $s$, which requires at least $O(\log n)$ operations. Obviously, the number of iterations of the algorithm is $O(n)$; therefore, the complexity $O(n\log n)$ of Algorithm 1.1 is the minimum possible for constructing the schedule $\omega(N,t)$.

Lemma 1.3. If conditions (9) hold for the jobs of the set $N$, then every schedule $\pi\in\Omega(N,t)$ starts with the schedule $\omega(N,t)$.

Proof. If $\omega(N,t)=\varnothing$, that is, $d_f>d_s$, where $f=f(N,t)$, $s=s(N,t)$, the statement of the lemma is true, since any schedule starts with the empty one.

Let $\omega(N,t)=(i_1,i_2,\dots,i_l)$, $l>0$, so that for each $i_k$, $k=1,2,\dots,l$, we have $i_k=f_k$ and $d_{f_k}\le d_{s_k}$, where $f_k=f(N_{k-1},C_{k-1})$ and $s_k=s(N_{k-1},C_{k-1})$. For $f=f(N_l,C_l)$ and $s=s(N_l,C_l)$ it holds that $d_f>d_s$. As seen from the definition of the set $\Omega(N,t)$, all schedules in this set start with the partial schedule $\omega(N,t)$.

Let us use the notation $\omega^1(N,t)=(f,\omega(N',t'))$ and $\omega^2(N,t)=(s,\omega(N'',t''))$, where $f=f(N,t)$, $s=s(N,t)$, $N'=N\setminus\{f\}$, $N''=N\setminus\{s\}$, $t'=r_f(t)+p_f$, $t''=r_s(t)+p_s$. Obviously, the algorithm for finding $\omega^1$ (as well as $\omega^2$) requires $O(n\log n)$ operations, the same as the algorithm for constructing $\omega(N,t)$.

Corollary 1.1 (from Lemma 1.3). If the jobs of the set $N$ satisfy conditions (9), then each schedule $\pi\in\Omega(N,t)$ starts either with $\omega^1(N,t)$ or with $\omega^2(N,t)$.

Theorem 1.3. If the jobs of the set $N$ satisfy conditions (9), then for any schedule $\pi\in\Omega(N,t)$ it holds that $i\to j(\pi)$ for any $i\in\omega^1(N,t)$ and $j\in N\setminus\omega^1(N,t)$.

Proof. In the case $\omega^1(N,t)=N$ the statement of the theorem is obviously true. Let $\omega^1(N,t)\ne N$. Further in the proof we use the notation $\omega^1=\omega^1(N,t)$.

If $f=f(N,t)$ and $s=s(N,t)$ are such that $d_f\le d_s$, then all schedules from the set $\Omega(N,t)$ begin with the partial schedule $\omega(N,t)=\omega^1$, so the statement of the theorem is also true.

Consider the case $d_f>d_s$. All schedules from the set $\Omega(N,t)$ starting with job $f$ have the partial schedule $\omega^1$ as a prefix.

Let us choose an arbitrary schedule $\pi\in\Omega(N,t)$ in which job $s$ comes first, $\pi[1]=s$, and let $|\omega^1|=l$, $l<n$. Let $\pi_l=(j_1,j_2,\dots,j_l)$ be the partial schedule of $\pi$ containing its first $l$ jobs, where $j_1=s$. We need to prove that $\pi_l=\omega^1$. Assume the contrary: there is a job $j$ such that $j\in\pi_l$ but $j\notin\omega^1$.

For the case $j\to f(\pi)$ we need to check two subcases. If $d_j<d_f$, then from (9) we have $d_j-r_j-p_j\ge d_f-r_f-p_f$, therefore $r_j+p_j<r_f+p_f$. Then job $j$ is included in the schedule $\omega^1$ according to the definitions of $\omega(N,t)$ and $\omega^1$, contrary to our assumption $j\notin\omega^1$. If $d_j\ge d_f$, then from $\pi\in\Omega(N,t)$ it follows that $f\to j(\pi)$, which contradicts $j\to f(\pi)$. Therefore, $j\in\omega^1$.

The other case is $f\to j(\pi)$. Then for each job $i\in\omega^1$ with $i\notin\pi_l$, the conditions $r_i<r_i+p_i\le C_{\max}(\omega^1)<r_{s_{l+1}}\le r_j$ hold, because $j\notin\omega^1$, where $s_{l+1}=s(N\setminus\omega^1,C_{\max}(\omega^1))$. The jobs $s_{l+1}$ and $j$ were not included in the schedule $\omega^1$, therefore $C_{\max}(\omega^1)<r_{s_{l+1}}\le r_j$. Besides, $d_i\le d_j$: if $d_i>d_j$, then by (9) $r_i+p_i\ge r_j+p_j$, but $r_i+p_i<r_j$ holds. Hence $i\to j(\pi_l)$, since $\pi=(\pi_l,\bar\pi_l)\in\Omega(N,t)$, which contradicts our assumption that $i\notin\pi_l$ and $j\in\pi_l$.

Therefore, our assumption is false and $\omega^1=\pi_l$. The theorem is proved.

Therefore, the jobs of the set $\omega^1(N,t)$ precede the jobs of the set $N\setminus\omega^1(N,t)$ in any schedule from the set $\Omega(N,t)$, including the optimal one.

### 3.2 Makespan problem with a constraint on the maximum lateness

The problem $1|d_i\le d_j,\ d_i-r_i-p_i\ge d_j-r_j-p_j;\ L_{\max}\le y|C_{\max}$ consists of the following. For a given $y$, we need to find a schedule $\theta$ with $C_{\max}(\theta)=\min\{C_{\max}(\pi):\ L_{\max}(\pi)\le y\}$. If $L_{\max}(\pi)>y$ for every $\pi\in\Pi(N)$, then $\theta=\varnothing$.

Lemma 1.4. The complexity of Algorithm 1.2 does not exceed $O(n^2\log n)$ operations.

Proof. At each iteration of the main step of Algorithm 1.2 we find the schedules $\omega^1$ and, if necessary, $\omega^2$ in $O(n\log n)$ operations. Since $\omega^1$ and $\omega^2$ consist of at least one job, at each iteration of the algorithm we either add one or more jobs to the schedule $\theta$, or set $\theta=\varnothing$ and stop. Therefore, the total number of iterations of the algorithm is at most $n$. Thus, Algorithm 1.2 requires $O(n^2\log n)$ operations.

Algorithm 1.2 for solving the problem $1|d_i\le d_j,\ d_i-r_i-p_i\ge d_j-r_j-p_j;\ L_{\max}\le y|C_{\max}$.

1: Initial step. Let $\theta\leftarrow\omega(N,t)$;

2: Main step.

3: if $L_{\max}(\theta,t)>y$ then

4:  $\theta\leftarrow\varnothing$ and the algorithm stops.

5: end if

6: Find $N\leftarrow N\setminus\theta$, $t\leftarrow C_{\max}(\theta)$ and $\omega^1(N,t)$, $\omega^2(N,t)$.

7: if $N=\varnothing$ then

8:  the algorithm stops.

9: else

10: if $L_{\max}(\omega^1,t)\le y$ then

11:  $\theta\leftarrow(\theta,\omega^1)$ and go to the next main step;

12: end if

13: if $L_{\max}(\omega^1,t)>y$ and $L_{\max}(\omega^2,t)\le y$ then

14:  $\theta\leftarrow(\theta,\omega^2)$ and go to the next main step;

15: end if

16: if $L_{\max}(\omega^1,t)>y$ and $L_{\max}(\omega^2,t)>y$ then

17:  $\theta\leftarrow\varnothing$ and the algorithm stops.

18: end if

19: end if
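Algorithm 1.2 can be sketched by combining the pieces above into one self-contained function. This is an illustrative implementation under the assumed input format (a dict of $(r_j,p_j,d_j)$ triples), using plain list scans instead of the $O(\log n)$ data structures on which the complexity bound relies:

```python
def theta(jobs, N, t, y):
    """Algorithm 1.2 (sketch): a fastest schedule with L_max <= y, or None.

    Greedily extends the schedule with omega1 = (f, omega(...)) while that
    keeps L_max <= y, falling back to omega2 = (s, omega(...)); returns
    None if neither extension fits (theta = empty set in the chapter).
    """
    def pick(M, tau):                       # f / s selection of (10)-(11)
        if not M:
            return None
        r_min = min(max(jobs[j][0], tau) for j in M)
        return min((j for j in M if max(jobs[j][0], tau) == r_min),
                   key=lambda j: jobs[j][2])

    def omega(M, tau):                      # Algorithm 1.1
        M, part = set(M), []
        while M:
            f = pick(M, tau)
            s = pick(M - {f}, tau)
            if s is not None and jobs[f][2] > jobs[s][2]:
                break                       # d_f > d_s: stop
            part.append(f)
            tau = max(jobs[f][0], tau) + jobs[f][1]
            M.remove(f)
        return part, tau

    def omega_k(first, M, tau):             # omega^1 / omega^2
        tau = max(jobs[first][0], tau) + jobs[first][1]
        part, tau = omega(set(M) - {first}, tau)
        return [first] + part, tau

    def lmax(part, tau):                    # lateness of an early sub-schedule
        worst = float("-inf")
        for j in part:
            tau = max(jobs[j][0], tau) + jobs[j][1]
            worst = max(worst, tau - jobs[j][2])
        return worst

    sched, tcur = omega(set(N), t)          # initial step: theta <- omega(N,t)
    if sched and lmax(sched, t) > y:
        return None
    remaining = set(N) - set(sched)
    while remaining:
        f = pick(remaining, tcur)
        s = pick(remaining - {f}, tcur)
        w1, t1 = omega_k(f, remaining, tcur)
        if lmax(w1, tcur) <= y:
            sched, tcur = sched + w1, t1
        elif s is not None:
            w2, t2 = omega_k(s, remaining, tcur)
            if lmax(w2, tcur) > y:
                return None                 # neither omega^1 nor omega^2 fits
            sched, tcur = sched + w2, t2
        else:
            return None
        remaining = set(N) - set(sched)
    return sched
```

Tightening the bound $y$ changes which of the two extensions is taken, which is exactly the mechanism the Pareto-set algorithm of the next subsection exploits.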

The problem $1|d_i\le d_j,\ d_i-r_i-p_i\ge d_j-r_j-p_j;\ L_{\max}\le y|C_{\max}$ cannot be solved in fewer than $O(n^2\log n)$ operations, as Example 1.1 shows. The optimal schedule for this example is $\pi=(2,1,4,3,\dots,n,n-1)$, and finding it requires $O(n^2\log n)$ operations.

We denote by $\theta(N,t,y)$ the schedule constructed by Algorithm 1.2, starting at time $t$, from the jobs of the set $N$, with maximum lateness at most $y$. If $N=\varnothing$, then $\theta(\varnothing,t,y)=\varnothing$ for any $t$ and $y$.

Theorem 1.4. Let the jobs of the set $N$ satisfy conditions (9). If Algorithm 1.2 constructs a schedule $\theta=\theta(N,t,y)$, then $C_{\max}(\theta)=\min\{C_{\max}(\pi):\ L_{\max}(\pi)\le y,\ \pi\in\Pi(N)\}$. If, as a result of Algorithm 1.2, no schedule is generated, that is, $\theta(N,t,y)=\varnothing$, then $L_{\max}(\pi)>y$ for every $\pi\in\Pi(N)$.

Proof. If a schedule $\pi\in\Pi(N)$ satisfies the condition $L_{\max}(\pi)\le y$, then according to Theorem 1.2 there is a schedule $\pi'\in\Omega(N,t)$ such that $L_{\max}(\pi')\le L_{\max}(\pi)\le y$ and $C_{\max}(\pi')\le C_{\max}(\pi)$. Therefore, the required schedule $\theta$ is contained in the set $\Omega(N,t)$.

According to Lemma 1.3, all schedules of the set $\Omega(N,t)$ start with $\omega(N,t)$. Let us take $\theta_0=\omega(N,t)$.

After $k$, $k\ge 0$, main steps of Algorithm 1.2 we have the schedule $\theta_k$ and $N'=N\setminus\theta_k$, $t'=C_{\max}(\theta_k)$. Assume that there is a schedule $\theta$, optimal by the criterion of maximum completion time ($C_{\max}$), starting with $\theta_k$. According to Theorem 1.2, there is an optimal extension of the schedule $\theta_k$ among the schedules of the set $\Omega(N',t')$.

Let $\theta_{k+1}=(\theta_k,\omega^1(N',t'))$, that is, $L_{\max}(\theta_{k+1})\le y$. According to Theorem 1.3, the schedule $\omega^1=\omega^1(N',t')$ contains no artificial idle times of the machine, and all schedules of the set $\Omega(N',t')$ start with the jobs of the set $\omega^1(N',t')$. Therefore, $\omega^1(N',t')$ is the best, by the criterion $C_{\max}$, among all extensions of the partial schedule $\theta_k$ that are feasible with respect to the maximum lateness ($L_{\max}$).

If $\theta_{k+1}=(\theta_k,\omega^2(N',t'))$, then $L_{\max}(\omega^1,t')>y$ and $L_{\max}(\omega^2,t')\le y$. All schedules of the set $\Omega(N',t')$ start with either the schedule $\omega^1(N',t')$ or $\omega^2(N',t')$. Since $L_{\max}(\omega^1,t')>y$, the only suitable extension is $\omega^2(N',t')$.

Thus, at each main step of the algorithm, we choose the fastest continuation of the partial schedule $\theta_k$ among all those allowed by the maximum lateness. After at most $n$ main steps of the algorithm, the required schedule is constructed.

Now assume that after $k+1$ steps of the algorithm $L_{\max}(\omega^1,t')>y$ and $L_{\max}(\omega^2,t')>y$. If a schedule $\theta$ existed, that is, $\theta\ne\varnothing$, then $\theta$ would start with $\theta_k$. Then for any schedule $\pi\in\Pi(N')$ there would exist a schedule $\pi'\in\Omega(N',t')$ such that $L_{\max}(\pi,t')\ge L_{\max}(\pi',t')\ge L_{\max}(\omega^1,t')>y$ or $L_{\max}(\pi,t')\ge L_{\max}(\pi',t')\ge L_{\max}(\omega^2,t')>y$. Therefore $\theta=\varnothing$.

Repeating this argument for each main step of Algorithm 1.2 (at most $n$ times), we arrive at the statement of the theorem.

### 3.3 Algorithm for constructing a set of Pareto schedules by the criteria C_max and L_max

Let us develop an algorithm for constructing a set of Pareto schedules $\Phi(N,t)=\{\pi_1,\pi_2,\dots,\pi_m\}$, $m\le n$, by the criteria $C_{\max}$ and $L_{\max}$ according to conditions (5)–(6).

The schedule $\pi_m$ is a solution of the problem $1|r_j|L_{\max}$ if (9) holds.

Algorithm 1.3 for constructing a set of Pareto schedules by the criteria $C_{\max}$ and $L_{\max}$.

1: Initial step. $y\leftarrow+\infty$, $\pi\leftarrow\omega(N,t)$, $\Phi\leftarrow\varnothing$, $m\leftarrow 0$, $N\leftarrow N\setminus\pi$ and $t\leftarrow C_{\max}(\pi)$.

2: if $N=\varnothing$ then

3:  $\Phi\leftarrow\Phi\cup\{\pi\}$, $m\leftarrow 1$ and the algorithm stops.

4: end if

5: Main step.

6: if $L_{\max}(\omega^1,t)\le L_{\max}(\pi)$ then

7:  $\pi\leftarrow(\pi,\omega^1)$, where $\omega^1=\omega^1(N,t)$, and go to the next main step;

8: end if

9: if $L_{\max}(\omega^1,t)>L_{\max}(\pi)$ then

10: if $L_{\max}(\omega^1,t)<y$ then

11:  find $\theta=\theta(N,t,y)$ using Algorithm 1.2, where $y=L_{\max}(\omega^1,t)$;

12:  if $\theta=\varnothing$ then

13:   $\pi\leftarrow(\pi,\omega^1)$ and go to the next main step;

14:  else

15:   $\pi\leftarrow(\pi,\theta)$;

16:   if $C_{\max}(\pi_m)<C_{\max}(\pi)$ then

17:    $m\leftarrow m+1$, $\pi_m\leftarrow\pi$, $\Phi\leftarrow\Phi\cup\{\pi_m\}$, $y\leftarrow L_{\max}(\pi_m)$;

18:   else

19:    $\pi_m\leftarrow\pi$ and go to the next main step;

20:   end if

21:  end if

22: if $L_{\max}(\omega^1,t)\ge y$ then

23:  find $\omega^2=\omega^2(N,t)$;

24:  if $L_{\max}(\omega^2,t)<y$ then

25:   $\pi\leftarrow(\pi,\omega^2)$ and go to the next main step;

26:  else

27:   $\pi\leftarrow\pi_m$ and the algorithm stops.

28:  end if

29: end if

30: end if

31: end if

As a result of Algorithm 1.3, a set of schedules $\Phi(N,t)$ is constructed for the set of jobs $N$ starting at time $t$, for which the inequality $1\le|\Phi(N,t)|\le n$ holds. We should note that the set $\Phi(N,t)$ for Example 1.1 consists of only two schedules, although the set $\Omega(N,t)$ consists of $2^{n/2}$ schedules:

$$\pi_1=(1,2,3,4,\dots,n-1,n),\tag{31}$$
$$\pi_2=(2,1,4,3,\dots,n,n-1).\tag{32}$$

Lemma 1.5. The complexity of Algorithm 1.3 does not exceed $O(n^3\log n)$ operations.

Proof. At each iteration of the main step of Algorithm 1.3 we find the schedules $\omega^1$ and, if necessary, $\omega^2$, which requires $O(n\log n)$ operations according to Lemma 1.2, and also the schedule $\theta$ in $O(n^2\log n)$ operations. Since $\omega^1$ and $\omega^2$ consist of at least one job, at each iteration of the algorithm one or more jobs are added to the schedule $\pi$, or the algorithm stops at the last schedule $\pi$. Therefore, the total number of iterations is at most $n$. Thus, Algorithm 1.3 requires no more than $O(n^3\log n)$ operations.

Theorem 1.5. If (9) holds for the jobs of the set $N$, then the schedule $\pi$ constructed by Algorithm 1.3 is optimal by the criterion $L_{\max}$. Moreover, for any schedule $\pi'\in\Pi(N)$ there exists a schedule $\pi''\in\Phi(N,t)$ such that $L_{\max}(\pi'')\le L_{\max}(\pi')$ and $C_{\max}(\pi'')\le C_{\max}(\pi')$.

Proof. According to Theorem 1.2, there exists an optimal (by $L_{\max}$) schedule in the set $\Omega(N,t)$. According to Lemma 1.3, all schedules of the set $\Omega(N,t)$ start with the partial schedule $\omega(N,t)$.

Let $\pi_0=\omega(N,t)$. After $k$, $k\ge 0$, main steps of Algorithm 1.3 we have a partial schedule $\pi_k$. Suppose there is an optimal (by $L_{\max}$) schedule starting with $\pi_k$. We denote $N'=N\setminus\pi_k$ and $t'=C_{\max}(\pi_k)$.

If $\pi_{k+1}=(\pi_k,\omega^1)$, where $\omega^1=\omega^1(N',t')$, then either $L_{\max}(\omega^1,t')\le L_{\max}(\pi_k)$, or $L_{\max}(\pi_k)<L_{\max}(\omega^1,t')<y$ and $\theta(N',t',y')=\varnothing$ for $y'=L_{\max}(\omega^1,t')$, that is, the current value of the criterion and the maximum lateness will "appear" at subsequent steps of Algorithm 1.3. If $\theta=\theta(N',t',y')\ne\varnothing$, then we improve the current maximum lateness value: $\pi=(\pi_k,\theta)$ and $y=L_{\max}(\pi)=L_{\max}(\omega^1,t')$; the schedule $\pi$ is added to the set of schedules $\Phi(N,t)$. Moreover, according to Theorem 1.3, the jobs of the set $\omega^1$ precede the jobs of the set $N'\setminus\omega^1$. Thus, the schedule $\omega^1$ (without artificial idle times of the machine) is the best continuation of $\pi_k$.

If $\pi_{k+1}=(\pi_k,\omega^2)$, where $\omega^2=\omega^2(N',t')$, then according to Algorithm 1.3 $L_{\max}(\omega^2,t')<L_{\max}(\pi)\le L_{\max}(\omega^1,t')$. In this case the continuation $\omega^2$ is "better" than $\omega^1$. Hence, the partial schedule $\pi_{k+1}$ is a part of some optimal schedule.

Repeating this argument at most $n$ times, we arrive at the optimality (by $L_{\max}$) of the schedule $\pi$.

The set of schedules $\Phi(N,t)$ contains at most $n$ schedules, since at each main step of the algorithm at most one schedule is added to the set $\Phi(N,t)$, and this step is executed at most $n$ times.

Suppose there is a schedule $\pi'\in\Pi(N)$, $\pi'\notin\Phi(N,t)$, such that $C_{\max}(\pi')\le C_{\max}(\pi'')$ and $L_{\max}(\pi')\le L_{\max}(\pi'')$, with at least one of the inequalities strict, for each schedule $\pi''\in\Phi(N,t)$. According to Theorem 1.1, there is a schedule $\pi^*\in\Omega(N,t)$ such that $L_{\max}(\pi^*)\le L_{\max}(\pi')$ and $C_{\max}(\pi^*)\le C_{\max}(\pi')$. If $\pi^*\in\Phi(N,t)$, it becomes obvious that our assumption is not correct. So let $\pi^*\in\Omega(N,t)\setminus\Phi(N,t)$. Algorithm 1.3 shows that the structure of each schedule $\pi''\in\Phi(N,t)$ can be represented as a sequence of partial schedules $\pi''=(\omega_0,\omega_1,\omega_2,\dots,\omega_k)$, where $\omega_0=\omega(N,t)$, and $\omega_i$ is either $\omega^1(N_i,C_i)$ or $\omega^2(N_i,C_i)$, with $N_i=N\setminus(\omega_0,\dots,\omega_{i-1})$, $C_i=C_{\max}((\omega_0,\dots,\omega_{i-1}),t)$, $i=1,2,\dots,k$. The schedule $\pi^*$ has the same structure according to the definition of the set $\Omega(N,t)$, that is, $\pi^*=(\omega_0^*,\omega_1^*,\omega_2^*,\dots,\omega_{k^*}^*)$, possibly with $k^*\ne k$, where $\omega_0^*=\omega_0=\omega(N,t)$, $\omega_i^*$ is equal to either $\omega^1(N_i^*,C_i^*)$ or $\omega^2(N_i^*,C_i^*)$, with $N_i^*=N\setminus(\omega_0^*,\dots,\omega_{i-1}^*)$, $C_i^*=C_{\max}((\omega_0^*,\dots,\omega_{i-1}^*),t)$, $i=1,2,\dots,k^*$.

Assume that the first $\kappa$ partial schedules of $\pi''$ and $\pi^*$ coincide, that is, $\omega_i=\omega_i^*$, $i=0,1,\dots,\kappa-1$, and $\omega_\kappa\ne\omega_\kappa^*$. Let $y=L_{\max}((\omega_0,\dots,\omega_{\kappa-1}))$ and let us construct the schedule $\theta=\theta(N_\kappa,C_\kappa,y)$ using Algorithm 1.2. If $\theta=\varnothing$, then according to Algorithm 1.3, $\omega_\kappa=\omega^1(N_\kappa,C_\kappa)$; because $\omega_\kappa\ne\omega_\kappa^*$, the schedule $\omega_\kappa^*=\omega^2(N_\kappa,C_\kappa)$. The objective function value ($L_{\max}$) is attained on a job of the set $N_\kappa$, since $\theta=\varnothing$. The whole structure of Algorithm 1.3 is built in such a way that, up to the "critical" job (with respect to $L_{\max}$), the jobs are ordered as "tightly" as possible; therefore the schedule is completed with $\omega^1$, after which $C_{\max}(\pi'')\le C_{\max}(\pi^*)$ and $L_{\max}(\pi'')\le L_{\max}(\pi^*)$. If $\theta\ne\varnothing$, then for the schedules $\pi''$ and $\pi^*$ we have $C_{\max}(\pi'')\le C_{\max}(\pi^*)$ and $L_{\max}(\pi'')=L_{\max}(\pi^*)$. Thus, for any schedule $\pi^*\in\Omega(N,t)\setminus\Phi(N,t)$ there exists a schedule $\pi''\in\Phi(N,t)$ such that $C_{\max}(\pi'')\le C_{\max}(\pi^*)$ and $L_{\max}(\pi'')\le L_{\max}(\pi^*)$. Hence, for any schedule $\pi'\in\Pi(N)$ there exists a schedule $\pi''\in\Phi(N,t)$ such that $L_{\max}(\pi'')\le L_{\max}(\pi')$ and $C_{\max}(\pi'')\le C_{\max}(\pi')$. The theorem is proved.

Figure 1 schematically shows the considered schedule.

For the set of schedules $\Phi(N,t)=\{\pi_1,\pi_2,\dots,\pi_m\}$, $m\le n$, conditions (5)–(6) are true.

The schedule $\pi_1$ is optimal in terms of the makespan ($C_{\max}$), and $\pi_m$ is optimal in terms of the maximum lateness ($L_{\max}$) if the jobs of the set $N$ satisfy conditions (9).

## 4. Conclusions

A single machine scheduling problem with given release dates and two objective functions, which is NP-hard in the strong sense, has been considered in this chapter. A number of new polynomially and pseudo-polynomially solvable subcases of the problem have been found. For the case when

$$d_1\le\dots\le d_n,\qquad d_1-r_1-p_1\ge\dots\ge d_n-r_n-p_n,\tag{33}$$

an algorithm for constructing a Pareto-optimal set of schedules by the criteria $C_{\max}$ and $L_{\max}$ has been developed. It is proved that the complexity of the algorithm does not exceed $O(n^3\log n)$ operations.

An experimental study of the algorithm showed that it can be used to construct optimal schedules (by $L_{\max}$) even for instances that do not satisfy conditions (33).

## Acknowledgments

The research was supported by RFBR (project 20-58-S52006).

## References

1. Lenstra JK, Rinnooy Kan AHG, Brucker P. Complexity of machine scheduling problems. Annals of Discrete Mathematics. 1977;1:343-362
2. Graham RL, Lawler EL, Lenstra JK, Rinnooy Kan AHG. Optimization and approximation in deterministic sequencing and scheduling: A survey. Annals of Discrete Mathematics. 1979;5:287-326
3. Potts CN. Analysis of a heuristic for one machine sequencing with release dates and delivery times. Operations Research. 1980;28:1436-1441
4. Jackson JR. Scheduling a Production Line to Minimize Maximum Tardiness. Los Angeles, CA: University of California; 1955. Manag. Sci. Res. Project. Research Report N 43
5. Hall LA, Shmoys DB. Jackson's rule for one-machine scheduling: Making a good heuristic better. Mathematics of Operations Research. 1992;17:22-35
6. Mastrolilli M. Efficient approximation schemes for scheduling problems with release dates and delivery times. Journal of Scheduling. 2003;6(6):521-531
7. Garey MR, Johnson DS, Simons BB, Tarjan RE. Scheduling unit-time tasks with arbitrary release times and deadlines. SIAM Journal on Computing. 1981;10:256-269
8. Vakhania N. Single-machine scheduling with release times and tails. Annals of Operations Research. 2004;129(1–4):253-271
9. Vakhania N. Dynamic restructuring framework for scheduling with release times and due-dates. Mathematics. 2019;7(11):1104
10. Lawler EL. Optimal sequencing of a single machine subject to precedence constraints. Management Science. 1973;19(5):544-546
11. Simons BB. A fast algorithm for single processor scheduling. In: Proceedings of the 19th IEEE Annual Symposium on Foundations of Computer Science. Ann Arbor, MI; 1978. pp. 246-252
12. Baker KR, Lawler EL, Lenstra JK, Rinnooy Kan AHG. Preemptive scheduling of a single machine to minimize maximum cost subject to release dates and precedence constraints. Operations Research. 1983;31(2):381-386
13. Hoogeveen JA. Minimizing maximum promptness and maximum lateness on a single machine. Mathematics of Operations Research. 1996;21:100-114
14. Lazarev AA, Shulgina ON. Polynomially solvable subcases of the problem of minimizing maximum lateness. Izvestiya VUZov. Mathematics. 2000 (in Russian)
15. Vakhania N. Scheduling a single machine with primary and secondary objectives. Algorithms. 2018;11(6):80
16. Vakhania N. Fast solution of single-machine scheduling problem with embedded jobs. Theoretical Computer Science. 2019;782:91-106
