In this chapter, we consider the single machine scheduling problem with given release dates, processing times, and due dates under two objective functions. The first is to minimize the maximum lateness, that is, the maximum difference between each job's completion time and its due date. The second is to minimize the maximum completion time, that is, to complete all jobs as soon as possible. The problem is NP-hard in the strong sense. We provide a polynomial time algorithm for constructing a Pareto-optimal set of schedules on the criteria of maximum lateness and maximum completion time, that is, problem 1 | r_j | L_max, C_max, for the subcase of the problem in which d_1 ≤ d_2 ≤ … ≤ d_n and d_1 − r_1 − p_1 ≥ d_2 − r_2 − p_2 ≥ … ≥ d_n − r_n − p_n.
- single machine scheduling
- two-criteria scheduling
- minimization of maximum lateness
- minimization of maximum completion time
- polynomial time algorithm
We consider a classical scheduling problem on a single machine. The release time of each job is predefined and represents the minimum possible start time of the job. When constructing schedules, we consider two objective functions. The first is to minimize the maximum lateness, that is, the maximum difference between each job's completion time and its due date. The second is to minimize the maximum completion time, that is, to complete all jobs as soon as possible. The problem is NP-hard in the strong sense. We provide a polynomial time algorithm for constructing a Pareto-optimal set of schedules on the criteria of maximum lateness and maximum completion time, that is, problem 1 | r_j | L_max, C_max, for the subcase of the problem when d_1 ≤ d_2 ≤ … ≤ d_n and d_1 − r_1 − p_1 ≥ d_2 − r_2 − p_2 ≥ … ≥ d_n − r_n − p_n. An example of a case that meets these constraints is when all jobs have the same amount of time available for processing before their due dates.
2. Statement of the problem
We consider the single machine scheduling problem, where a set of n jobs has to be processed on a single machine. Each job is numbered, that is, the entry "job j" is equivalent to the entry "job numbered j." Simultaneous execution of jobs and preemption of the processing of a job are prohibited. For each job j, the value r_j is the minimum possible start time, p_j is the processing time, and d_j is the due date of job j.
A schedule is represented by the set of start times S_j of the jobs. By π we denote a permutation of the jobs of the set. A set of all different schedules of the n jobs is denoted accordingly. A schedule is called feasible if S_j ≥ r_j for each job j. We denote the completion time of job j in schedule π as C_j(π) = S_j(π) + p_j. The difference L_j(π) = C_j(π) − d_j denotes the lateness of job j in schedule π. The maximum lateness of the jobs under schedule π is

L_max(π) = max_j L_j(π).

We denote the completion time of all jobs in schedule π as

C_max(π) = max_j C_j(π).

The problem is to find an optimal schedule with the smallest value of the maximum lateness:

L_max(π*) = min_π L_max(π).
For an arbitrary subset of jobs, we also denote:
In the standard notation of Graham et al., this problem is denoted as 1 | r_j | L_max. Intensive work on this problem has continued since the early 1950s. Lenstra et al. showed that the general case of the problem is NP-hard in the strong sense.
Potts introduced an iterative version of the extended Jackson rule (IJ) and proved that it guarantees a worst-case performance ratio of 3/2. Hall and Shmoys modified the iterative version and created an algorithm (MIJ) that guarantees the ratio 4/3. They also presented two approximation schemes that find a (1+ε)-approximate solution in polynomial time. Mastrolilli introduced an approximation scheme with improved complexity.
A number of polynomially solvable cases of the problem have been found, starting with Jackson's early result for the case r_j = 0, j = 1, …, n, when the solution is a schedule in which jobs are ordered by nondecreasing due dates (the EDD rule). Such a schedule is also optimal for the case when the release times and due dates are similarly ordered (r_i ≤ r_j implies d_i ≤ d_j).
The Schrage schedule is constructed according to the extended Jackson rule: for the next position in the schedule, we select a released unscheduled job with the minimum due date; if no unscheduled job has been released, we select the job with the minimum release time among the unscheduled jobs.
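As an illustrative sketch, the extended Jackson rule can be coded as follows (Python; the function name `schrage` and the job data are our own, not from the original text):

```python
import heapq

def schrage(r, p, d):
    """Extended Jackson (Schrage) rule: for the next position, take a
    released unscheduled job with minimum due date; if none is released,
    jump forward to the minimum release time of the remaining jobs."""
    order = sorted(r, key=lambda j: r[j])   # unscheduled jobs by release time
    ready, perm = [], []                    # ready: heap of (due date, job)
    t, i, n = 0, 0, len(order)
    while len(perm) < n:
        while i < n and r[order[i]] <= t:   # release all jobs with r_j <= t
            heapq.heappush(ready, (d[order[i]], order[i]))
            i += 1
        if not ready:                       # machine idle: jump to next release
            t = r[order[i]]
            continue
        _, j = heapq.heappop(ready)         # released job with minimum due date
        perm.append(j)
        t += p[j]                           # process job j without preemption
    return perm

# Hypothetical instance: job 'b' goes first (smallest due date among released).
r = {'a': 0, 'b': 0, 'c': 1}
p = {'a': 2, 'b': 2, 'c': 2}
d = {'a': 9, 'b': 3, 'c': 5}
print(schrage(r, p, d))  # → ['b', 'c', 'a']
```

The heap keeps the released jobs ordered by due date, so each selection step is logarithmic and the whole construction runs in O(n log n).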
If the processing times of all jobs are equal, the complexity can be reduced further. Vakhania generalized this result by considering the case when the processing time of each job is restricted to one of two allowed values, and suggested a polynomial algorithm for this case.
The case when job processing times are mutually divisible has also been considered; the authors suggest a polynomial-time algorithm for solving this case.
Special cases, including cases with precedence constraints on jobs, have been addressed in the works of Lawler, Simons, and Baker et al. Hoogeveen proposed a polynomial algorithm for a special case in which the job parameters satisfy additional constraints involving a constant. A pseudo-polynomial algorithm was also developed for the NP-hard case when the release times and due dates are in reversed order.
We denote by L_j(π, I) and C_j(π, I) the lateness and the completion time of job j in schedule π for an instance I with job parameters {r_j, p_j, d_j}. Respectively, L_max(π, I) is the maximum lateness of schedule π for instance I.
This chapter deals with a problem with two objective functions, L_max and C_max, which in the general case can be referred to as 1 | r_j | L_max, C_max. This problem has been considered before, where the authors establish some dominance properties and conditions under which the Pareto-optimal set can be formed in polynomial time.
Definition 1.1 For any instance of the problem, each permutation π = (j_1, …, j_n) of the jobs of the set uniquely defines an early schedule. In the early schedule, each job starts immediately after the end of the previous job in the corresponding permutation. If the completion time of the previous job is less than the release time of the current job, then the start of the current job is equal to its release time. That is, S_{j_1} = r_{j_1} and S_{j_k} = max{r_{j_k}, C_{j_{k−1}}} for k = 2, …, n.
Early schedules play an important role in our construction, since it is sufficient to check all early schedules to find the optimal schedule of any problem instance.
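A minimal sketch of Definition 1.1 in Python (the instance below is hypothetical; `early_schedule` and `objectives` are our own names):

```python
def early_schedule(perm, r, p):
    """Return start times of the early schedule for permutation perm:
    each job starts at max(its release time, completion of previous job)."""
    start, t = {}, 0
    for j in perm:
        t = max(t, r[j])   # idle until the job is released, if necessary
        start[j] = t
        t += p[j]          # job completes p[j] time units after it starts
    return start

def objectives(perm, r, p, d):
    """Return (L_max, C_max) of the early schedule for perm."""
    start = early_schedule(perm, r, p)
    completion = {j: start[j] + p[j] for j in perm}
    L_max = max(completion[j] - d[j] for j in perm)  # lateness L_j = C_j - d_j
    C_max = max(completion.values())
    return L_max, C_max

r = {1: 0, 2: 2, 3: 4}
p = {1: 3, 2: 2, 3: 1}
d = {1: 4, 2: 6, 3: 7}
print(objectives((1, 2, 3), r, p, d))  # → (-1, 6)
```

Since an early schedule is fully determined by its permutation, searching over permutations suffices to find an optimal schedule, as the text notes.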
By π* we denote an optimal permutation, and the corresponding early schedule is the optimal schedule for the instance. Only early optimal schedules are considered.
We denote by the set of all permutations of jobs of the set , and by the set of feasible schedules for instance .
This section deals with the problem of constructing a Pareto-optimal set by the criteria L_max and C_max, that is, problem 1 | r_j | L_max, C_max. We suggest an algorithm for constructing a set of schedules for which there is no schedule π such that L_max(π) ≤ L_max(π_k) and C_max(π) ≤ C_max(π_k), with at least one of the inequalities strict, for some schedule π_k of the constructed set. A bound on the cardinality of the constructed set is also shown.
3.1 Problem properties
We denote the precedence of the jobs and in schedule as . We also introduce
In cases when it is obvious how many jobs we mean, we write the shorter notation instead.
We assume that the job parameters satisfy the following constraints: d_1 ≤ d_2 ≤ … ≤ d_n and d_1 − r_1 − p_1 ≥ d_2 − r_2 − p_2 ≥ … ≥ d_n − r_n − p_n.
For example, these constraints correspond to the case when d_j = r_j + p_j + K, j = 1, …, n, where K is a constant, that is, when all jobs have the same slack available for processing before their due dates. A problem with similar constraints but for a single objective function (L_max) has been considered in the literature.
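The subcase conditions are straightforward to verify directly; a sketch (our own helper, hypothetical data):

```python
def satisfies_subcase(r, p, d, jobs):
    """True iff due dates are nondecreasing and the slacks
    d_j - r_j - p_j are nonincreasing along the given job order."""
    dues  = [d[j] for j in jobs]
    slack = [d[j] - r[j] - p[j] for j in jobs]
    return (all(a <= b for a, b in zip(dues, dues[1:])) and
            all(a >= b for a, b in zip(slack, slack[1:])))

# Constant-slack example from the text: d_j = r_j + p_j + K.
r = {1: 0, 2: 2, 3: 4}
p = {1: 3, 2: 2, 3: 1}
K = 5
d = {j: r[j] + p[j] + K for j in r}
print(satisfies_subcase(r, p, d, [1, 2, 3]))  # → True
```

With constant slack K, the second chain of inequalities holds with equality, so only the due-date ordering needs to be checked.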
We assume that and is the time when the machine is ready. From the set , we find two jobs and in the following way:
where . If , then we set , , . We also define , , , . For jobs and the following properties are true.
Lemma 1.1 If the jobs of the set satisfy (4), then for any schedule for all for which
is true, and for all satisfying the condition ,
where and , is also true.
Proof: For each job such that , completion time . If , then obviously
therefore (12) is valid.
Since , then (from (9)) or , so . Then, for each job .
The inequality (13) can be proved in a similar way.
For each job satisfying the condition , we have . If , then , therefore (13) is true.
Let for the job , , , then . Indeed, if we assume that , then (it follows from (7)). In addition, for any job according to definitions (8) and (11). If , then for the jobs and we can write and , which contradicts the definition (11) of job . If , that is, , then there is no job , for which . Therefore, for the jobs and we get and , which contradicts the definition (11) of job . Therefore,
Since and , then and since , therefore and
Since , then from (9) we have or
Hence, for each job .
Theorem 1.1 If conditions (9) are true for jobs in the subset , then at any time and any early schedule there exists such that
and one of the jobs or is at the first position in schedule . If , then job is at the first position in schedule .
From Lemma 1.1 we have
Obviously, the following inequality is true for job
Let , that is, job is before job . Construct a schedule . The proof can then be repeated as for job . The first part of the theorem is proved.
Let us assume and the schedule . Then, we construct a schedule , where are schedules of jobs of the sets and . Jobs in and are ordered according to nondecreasing release times . From we can conclude that .
For each job we have . From (9) we get , hence , , and . Since the jobs in schedule are sorted by nondecreasing release times, then . As a result
Job satisfies and , which means
Since , then
From Lemma 1.1
Moreover, it is obvious that
We call a schedule effective if there is no schedule such that and , where at least one of the inequalities is strict.
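The effectiveness (Pareto nondominance) test on pairs of criterion values (L_max, C_max) can be sketched as follows; the points below are hypothetical:

```python
def dominates(b, a):
    """Pair b = (L_max, C_max) dominates pair a if b is no worse in both
    criteria and strictly better in at least one of them."""
    return b[0] <= a[0] and b[1] <= a[1] and (b[0] < a[0] or b[1] < a[1])

def effective(points):
    """Keep only the effective (nondominated) criterion pairs."""
    return [a for a in points if not any(dominates(b, a) for b in points)]

print(effective([(3, 10), (1, 12), (3, 11), (2, 12)]))  # → [(3, 10), (1, 12)]
```

Here (3, 11) is dropped because (3, 10) is no worse on lateness and strictly faster, and (2, 12) is dropped because of (1, 12).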
Thus, when constraints (9) are satisfied for jobs from the set , then there is an effective schedule , in which either the job , or is present. Moreover, if , then there is an effective schedule with a priority of job .
We define the set of schedules as a subset of consisting of schedules. A schedule belongs to this set if we choose job as or , where , and , . For it is true that , so if , then or . It is obvious that the set contains at most schedules, that is, .
The set contains schedules. The value of is used below in the text. The optimal solution of the problem is
Theorem 1.2 If (9) is true for the jobs of the subset , then for any time and any schedule there exists a schedule such that
Proof: Let be an arbitrary schedule. We denote the first jobs of the schedule as , where is an empty schedule, and , then . We introduce and . Suppose that for some , is the largest initial partial schedule of some schedule from . If and , then , , and the largest partial schedule is empty. Let and . If , then and ; moreover, when , then , since is not an initial schedule of some schedule from .
According to Theorem 1.1, for the jobs of the set there is a schedule starting at time for which , , and ; moreover, with , is true, where is the job in the -th place in schedule . Hence, and .
Let us denote . A feature of schedule is that the first jobs are the same as first jobs of some schedule from the set , and , .
After no more than sequential transformations (since the schedule length is ) of the originally chosen arbitrary schedule, we arrive at a schedule for which and . The theorem is proved.
We form the following partial schedule . For each job , we have and , where and . For and the inequality is true. In the case when for and , then . So is the "maximal" schedule, during the construction of which the job (like ) for the next position of the schedule can be uniquely selected. We can construct a schedule for a set of jobs starting at time using Algorithm 1.1.
Algorithm 1.1 for constructing schedule .
1: Initial step. Let .
2: Main step. Find the jobs and ;
6: algorithm stops;
7: end if
8: Let , and go to the next main step.
Lemma 1.2 The complexity of Algorithm 1.1 for finding the schedule is at most operations for any and any .
Proof: At each iteration of Algorithm 1.1 we find two jobs: and . If the jobs are ordered by release times (which takes operations), then finding the two jobs (and ) requires operations. The total number of iterations is not more than . Thus, constructing a schedule requires operations.
The main step of Algorithm 1.1 is finding the jobs and , which requires at least operations. Obviously, the number of iterations of the algorithm is O(n); therefore, the complexity of Algorithm 1.1 is the minimum possible for constructing the schedule .
Lemma 1.3 If for jobs of the set conditions (9) are true, then any schedule starts with the schedule .
Proof: If , that is, , where , , the statement of the lemma is true, since any schedule starts with the empty schedule.
Let , , and so for each , , we have and , where and . For and it is true that . As seen from the definition of the set of schedules all schedules in this subset start with a partial schedule .
Let us use the following notation and , where . Obviously, the algorithm for finding (as well as ) requires operations, as much as the algorithm for constructing .
Corollary 1.1 (of Lemma 1.3). If the jobs of the set satisfy conditions (9), then each schedule starts either with , or with .
Theorem 1.3 If the jobs of the set satisfy conditions (9), then for any schedule it is true that for any and .
Proof: In the case , the statement of the theorem is obviously true. Let . Further in the proof we use the notation .
If and are such that , then all schedules from the set begin with a partial schedule , therefore the statement of the theorem is also true.
Consider the case of . All schedules from set starting with job have partial schedule .
Let us choose an arbitrary schedule in which job comes first, , and any schedule containing jobs. Let be a partial schedule of schedule containing jobs, where . We need to prove that . Let us assume the contrary: there is a job , but .
For case we need to check two subcases. If , then from (9) we have , therefore . Then job is included in schedule according to the definition of and , but by our assumption . If , then from the fact that follows , but this contradicts . Therefore, .
The other case is . Then for each job for which , the conditions are true, because , where . Jobs and were not ordered in schedule ; therefore, . Besides, . If , then , but is true. Hence , since , which contradicts our assumption that and .
Therefore, our assumption is not true and . The theorem is proved.
Therefore, jobs of the set precede jobs of the set for any schedule from the set , including the optimal schedule.
3.2 Performance problem with constraint on maximum lateness
The problem consists of the following: we need to find a schedule for any with . If for any , then .
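For intuition, the constrained problem can be stated directly in code: among early schedules with maximum lateness at most y, pick one minimizing C_max. The brute-force sketch below (our own, exponential in n) only illustrates the problem statement, not Algorithm 1.2:

```python
from itertools import permutations

def best_cmax_under_lmax(r, p, d, y):
    """Return a permutation whose early schedule has L_max <= y and the
    smallest C_max, or None if no such schedule exists."""
    best_cmax, best_perm = None, None
    for perm in permutations(r):
        t, lmax = 0, float('-inf')
        for j in perm:
            t = max(t, r[j]) + p[j]       # early completion of job j
            lmax = max(lmax, t - d[j])    # lateness L_j = C_j - d_j
        if lmax <= y and (best_cmax is None or t < best_cmax):
            best_cmax, best_perm = t, perm
    return best_perm

r = {1: 0, 2: 2, 3: 4}
p = {1: 3, 2: 2, 3: 1}
d = {1: 4, 2: 6, 3: 7}
print(best_cmax_under_lmax(r, p, d, -1))   # → (1, 2, 3)
print(best_cmax_under_lmax(r, p, d, -10))  # → None
```

The point of Algorithm 1.2 in the text is to solve the same question in polynomial time under conditions (9), rather than by enumeration.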
Lemma 1.4 The complexity of algorithm 1.2 does not exceed operations.
Proof: At each iteration of the main step of Algorithm 1.2 we find the schedules and, if necessary, , in operations. Since and consist of at least one job, at each iteration of the algorithm we either add one or more jobs to the schedule , or set and stop. Therefore, the total number of steps of the algorithm is at most . Thus, Algorithm 1.2 requires operations.
Algorithm 1.2 for solving the problem .
1: Initial step. Let
2: Main step.
4: and the algorithm stops.
5: end if
6: Find , and .
8: the algorithm stops.
11: and go to next step;
12: end if
13: if and then
14: and go to next step;
15: end if
16: if and then
17: and the algorithm stops.
18: end if
19: end if
The problem cannot be solved in fewer than operations, because there exists an instance (Example 1.1). The optimal schedule for this example is . To find this schedule, we need operations.
We denote by the schedule constructed by algorithm 1.2 starting at time from the jobs of the set with the maximum lateness not more than . If , then for any and .
Theorem 1.4 Let the jobs of the set satisfy conditions (9). If Algorithm 1.2 constructs the schedule , then . If, as a result of Algorithm 1.2, the schedule is not generated, that is, , then for each .
Proof: If condition is true for schedule , then according to Theorem 1.2 there is a schedule such that and . Therefore, the required schedule is contained in the set .
According to Lemma 1.3, all schedules of the set start with . Let us take .
After main steps of Algorithm 1.2 we obtain the schedule and , . Let us assume that there is a schedule, optimal by the criterion of maximum completion time (), starting with . According to Theorem 1.2, there is an optimal extension of the schedule among the schedules from the set .
Let , that is, . According to Theorem 1.3, for schedule , , there are no artificial idle times of the machine, and all schedules from the set start with jobs of the set . Therefore, is the best, by the criterion , among all extensions of the partial schedule that are feasible with respect to the maximum lateness ().
If , that is, , and . All schedules of the set start with either schedule or . As , then the only suitable extension is .
Thus, at each main step of the algorithm, we choose the fastest continuation of the partial schedule among all those allowed by the maximum lateness. After no more than main steps of the algorithm, the required schedule is constructed.
Let us assume that after the steps of the algorithm and . If schedule could exist, that is, , then would start with . Then for any schedule there would exist a schedule such that or . Therefore .
Repeating this argument for each main step of Algorithm 1.2 (no more than times), we arrive at the statement of the theorem.
3.3 Algorithm for constructing a set of Pareto schedules by criteria and
Let us develop an algorithm for constructing a set of Pareto schedules , , by criteria and according to conditions (5)–(6).
Schedule is a solution to problem if (9) is true.
Algorithm 1.3 for constructing a set of Pareto schedules by criteria and .
1: Initial step. Let , , , , and .
3: and the algorithm stops.
4: end if
5: Main step.
7: , where and go to the next step;
8: end if
11: find using algorithm 1.2, where ;
13: and go to the next step;
17: , , , ;
19: and go to next step;
20: end if
21: end if
23: find ;
25: and go to the next step;
27: and the algorithm stops.
28: end if
29: end if
30: end if
31: end if
As a result of Algorithm 1.3, a set of schedules is constructed for the set of jobs starting at time , for which the inequality is true. We should note that the set for Example 1.1 consists of two schedules, although the set consists of schedules:
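On very small instances, the output of such an algorithm can be checked against brute-force enumeration of the Pareto front over all early schedules. The sketch below (our own reference code, factorial-time, with hypothetical data) returns the nondominated (L_max, C_max) pairs, each with one witness permutation:

```python
from itertools import permutations

def pareto_front(r, p, d):
    """Brute-force Pareto set of (L_max, C_max) over all early schedules."""
    pts = {}
    for perm in permutations(r):
        t, lmax = 0, float('-inf')
        for j in perm:
            t = max(t, r[j]) + p[j]       # early completion of job j
            lmax = max(lmax, t - d[j])    # lateness L_j = C_j - d_j
        pts.setdefault((lmax, t), perm)   # keep one witness per value pair
    front = [a for a in pts
             if not any(b[0] <= a[0] and b[1] <= a[1] and b != a for b in pts)]
    return sorted((pt, pts[pt]) for pt in front)

# Two-job instance with a genuine trade-off between the two criteria:
r = {1: 0, 2: 1}
p = {1: 3, 2: 1}
d = {1: 10, 2: 2}
print(pareto_front(r, p, d))  # → [((0, 5), (2, 1)), ((2, 4), (1, 2))]
```

In this instance, processing job 2 first is best for lateness but slower overall, while processing job 1 first is faster but later, so the front contains two points.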
Lemma 1.5 The complexity of the algorithm 1.3 does not exceed operations.
Proof: At each iteration of the main step of Algorithm 1.3 we find the schedules and, if necessary, , which requires operations according to Lemma 1.2, and also the schedule in operations. Since and consist of at least one job, at any iteration of the algorithm one or more jobs are added to the schedule , or the algorithm stops at the last schedule . Therefore, the total number of iterations is at most . Thus, it takes no more than operations to execute Algorithm 1.3.
Theorem 1.5 If (9) is true for each job of the set , then the schedule constructed by Algorithm 1.3 is optimal according to the criterion . Moreover, for any schedule there exists a schedule such that and .
Proof: According to Theorem 1.2, there exists an optimal (by ) schedule from set . According to Lemma 1.3, all schedules of the set start with a partial schedule .
Let . After , , main steps of algorithm 1.3 we have a partial schedule . Suppose there is an optimal (by ) schedule starting with . We denote and .
If , where , then either , or , that is, the current value of the criterion and the maximum lateness will "appear" at subsequent steps of Algorithm 1.3. That is, , where . If , then we improve the current maximum lateness value: and . The schedule is added to the set of schedules . Moreover, according to Theorem 1.3, the jobs of the set precede the jobs of the set . Thus, the schedule (without artificial idle times of the machine) would be the best continuation for .
If , where , that is, according to algorithm 1.3 . In this case the continuation is “better” than . Hence, the partial schedule is a part of some optimal schedule.
Repeating our proof no more than times, we come to optimality (for ) of the schedule .
The set of schedules contains at most schedules, since at each main step of the algorithm in the set at most one schedule is “added,” and this step is executed no more than times.
Suppose there is a schedule such that either and , or and for each schedule . Moreover, in each pair of inequalities at least one inequality is strict. According to Theorem 1.1, there is a schedule such that and . If . Thus, it becomes obvious that our assumption is not correct. Let . Algorithm 1.3 shows that the structure of each schedule can be represented as a sequence of partial schedules where , and is either or and , . The schedule has the same structure according to the definition of the set , that is, possibly where is equal to either or a , .
We assume that the first partial schedules and are equal, that is, . If , let us construct a schedule using Algorithm 1.2. If , then according to Algorithm 1.3, . Because of schedule , the objective function value () can be reached on a job from the set , since . The whole structure of Algorithm 1.3 is built in such a way that, up to the "critical" job (according to ), the jobs are ordered as "tightly" as possible; therefore, we complete the schedule , after which and . If , then for schedules and we have and . Thus, for any schedule there exists a schedule such that and . The theorem is proved.
Figure 1 schematically shows the considered schedule.
For the set of schedules , , conditions (5)–(6) are true.
The schedule is optimal in terms of speed (), and is optimal in terms of the maximum lateness (by ) if the jobs of the set satisfy the conditions (9).
The single machine scheduling problem with given release dates and two objective functions, which is NP-hard in the strong sense, is considered in this chapter. A number of new polynomially and pseudo-polynomially solvable subcases of the problem were found. For the case when

d_1 ≤ d_2 ≤ … ≤ d_n,   d_1 − r_1 − p_1 ≥ d_2 − r_2 − p_2 ≥ … ≥ d_n − r_n − p_n,   (33)
an algorithm for constructing a Pareto-optimal set of schedules by criteria and is developed. It is proved that the complexity of the algorithm does not exceed operations.
An experimental study of the algorithm showed that it can be used to construct optimal schedules (by ) even for instances not satisfying the conditions (33).
The research was supported by RFBR (project 20-58-S52006).