Job shop scheduling

Job-shop scheduling, the job-shop problem (JSP) or job-shop scheduling problem (JSSP) is an optimization problem in computer science and operations research. It is a variant of optimal job scheduling. In a general job scheduling problem, we are given ''n'' jobs ''J''1, ''J''2, ..., ''Jn'' of varying processing times, which need to be scheduled on ''m'' machines with varying processing power, while trying to minimize the makespan – the total length of the schedule (that is, when all the jobs have finished processing). In the specific variant known as ''job-shop scheduling'', each job consists of a set of ''operations'' ''O''1, ''O''2, ..., ''On'' which need to be processed in a specific order (known as ''precedence constraints''). Each operation has a ''specific machine'' that it needs to be processed on, and only one operation in a job can be processed at a given time. A common relaxation is the flexible job shop, where each operation can be processed on any machine of a given ''set'' (the machines in each set are identical). The name originally came from the scheduling of jobs in a job shop, but the theme has wide applications beyond that type of instance. It is a well-known combinatorial optimization problem and was the first to undergo competitive analysis, introduced by Graham in 1966. The best problem instances for a basic model with a makespan objective are due to Taillard. In the standard three-field notation for optimal job scheduling problems, the job-shop variant is denoted by ''J'' in the first field. For example, the problem denoted by "J_3 \mid p_{ij} = 1 \mid C_\max" is a 3-machine job-shop problem with unit processing times, where the goal is to minimize the maximum completion time.
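To make the setting concrete, a small instance can be written down as, for each job, the ordered list of (machine, processing time) pairs that make up its operations. The following Python sketch is purely illustrative (the data layout and variable names are choices made here, not a standard format); it also computes two elementary lower bounds on the makespan: no schedule can finish before the longest job completes, nor before the most heavily loaded machine has worked through its total load.

# Hypothetical 3-job, 3-machine instance: each job is an ordered list of
# (machine index, processing time) operations that must run in this order.
jobs = [
    [(0, 3), (1, 2), (2, 2)],   # J1
    [(0, 2), (2, 1), (1, 4)],   # J2
    [(1, 4), (2, 3)],           # J3
]
n_machines = 3

# Elementary lower bounds on the optimal makespan.
job_bound = max(sum(p for _, p in ops) for ops in jobs)   # longest single job
machine_load = [0] * n_machines
for ops in jobs:
    for m, p in ops:
        machine_load[m] += p
machine_bound = max(machine_load)                         # busiest machine
print(max(job_bound, machine_bound))                      # 10 for this instance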


Problem variations

Many variations of the problem exist, including the following:
* Machines can have duplicates (flexible job shop with duplicate machines) or belong to groups of identical machines (flexible job shop).
* Machines can require a certain gap between jobs or no idle-time.
* Machines can have sequence-dependent setups.
* The objective function can be to minimize the makespan, the ''Lp'' norm, tardiness, maximum lateness, etc.; it can also be a multi-objective optimization problem (a small sketch of these objectives follows this list).
* Certain jobs must be completed before others can start (see workflow), and objectives may involve multiple criteria.
* The set of jobs can relate to different sets of machines.
* Processing times may be deterministic (fixed) or probabilistic.
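To make the different objectives concrete, the Python sketch below computes several of them from a vector of job completion times and due dates (the numbers and variable names are hypothetical, chosen only for illustration).

# Completion times C_j and due dates d_j for a hypothetical 4-job schedule.
C = [7.0, 11.0, 15.0, 20.0]
d = [10.0, 9.0, 18.0, 16.0]

makespan = max(C)                              # C_max
lateness = [c - dd for c, dd in zip(C, d)]     # L_j = C_j - d_j (may be negative)
max_lateness = max(lateness)                   # 4.0 here, from the last job
tardiness = [max(0.0, l) for l in lateness]    # T_j = max(0, L_j)
total_tardiness = sum(tardiness)               # 2.0 + 4.0 = 6.0
p = 2
lp_norm = sum(c ** p for c in C) ** (1 / p)    # L_p norm of the completion times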


NP-hardness

Since the traveling salesman problem is NP-hard, the job-shop problem with sequence-dependent setup is also NP-hard, since the TSP is a special case of the JSP with a single job (the travelling salesman corresponds to the single job and the cities to the machines).


Problem representation

The disjunctive graph is one of the popular models used for describing job-shop scheduling problem instances. A mathematical statement of the problem can be made as follows:

Let M = \{ M_1, M_2, \dots, M_m \} and J = \{ J_1, J_2, \dots, J_n \} be two finite sets. On account of the industrial origins of the problem, the M_i are called machines and the J_j are called jobs.

Let \mathcal{X} denote the set of all sequential assignments of jobs to machines, such that every job is done by every machine exactly once; elements x \in \mathcal{X} may be written as n \times m matrices, in which column i lists the jobs that machine M_i will do, in order. For example, the matrix

: x = \begin{pmatrix} 1 & 2 \\ 2 & 3 \\ 3 & 1 \end{pmatrix}

means that machine M_1 will do the three jobs J_1, J_2, J_3 in the order J_1, J_2, J_3, while machine M_2 will do the jobs in the order J_2, J_3, J_1.

Suppose also that there is some cost function C : \mathcal{X} \to [0, +\infty]. The cost function may be interpreted as a "total processing time", and may have some expression in terms of times C_{ij} : M \times J \to [0, +\infty], the cost/time for machine M_i to do job J_j.

The job-shop problem is to find an assignment of jobs x \in \mathcal{X} such that C(x) is a minimum, that is, there is no y \in \mathcal{X} such that C(x) > C(y).
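Concretely, an assignment x together with each job's own operation order determines a schedule whose cost can be taken to be its makespan. The following Python sketch is a minimal evaluator under that interpretation (the function name and data layout are assumptions made here, not a standard API); each machine order corresponds to a column of x, and the evaluator returns infinity when the chosen orders are mutually inconsistent, a situation discussed under "The problem of infinite cost" below.

import math

def evaluate(jobs, machine_order):
    """Makespan of the earliest-start schedule induced by fixed machine orders,
    or math.inf when the chosen orders are cyclic (no finite schedule exists).

    jobs[j]          -- ordered list of (machine, processing_time) pairs for job j
    machine_order[m] -- order (list of job indices) in which machine m serves
                        the jobs routed through it (column m of the matrix x)
    """
    next_op = [0] * len(jobs)                 # next unscheduled operation of each job
    next_pos = [0] * len(machine_order)       # position reached in each machine's order
    job_ready = [0.0] * len(jobs)             # finish time of the job's previous operation
    mach_ready = [0.0] * len(machine_order)   # time at which each machine becomes free
    remaining = sum(len(ops) for ops in jobs)
    makespan = 0.0
    while remaining:
        progressed = False
        for j, ops in enumerate(jobs):
            if next_op[j] == len(ops):
                continue
            m, p = ops[next_op[j]]
            order = machine_order[m]
            # schedulable only if this job is the next one in machine m's order
            if next_pos[m] < len(order) and order[next_pos[m]] == j:
                finish = max(job_ready[j], mach_ready[m]) + p
                job_ready[j] = mach_ready[m] = finish
                makespan = max(makespan, finish)
                next_op[j] += 1
                next_pos[m] += 1
                remaining -= 1
                progressed = True
        if not progressed:
            return math.inf                   # circular wait: infeasible selection
    return makespan

# A feasible choice of machine orders for a 2-job, 2-machine instance:
jobs = [[(0, 3), (1, 2)],            # J1 visits M1 then M2
        [(1, 2), (0, 1)]]            # J2 visits M2 then M1
x_columns = [[0, 1], [1, 0]]         # M1 serves J1 then J2; M2 serves J2 then J1
print(evaluate(jobs, x_columns))     # 5.0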


Scheduling efficiency

Scheduling efficiency can be defined for a schedule through the ratio of total machine idle time to the total processing time, as below:

: C' = 1 + \frac{\sum_i l_i}{mC - \sum_i l_i} = \frac{mC}{mC - \sum_i l_i}

where l_i is the idle time of machine i, C is the makespan, m is the number of machines, and mC - \sum_i l_i is therefore the total processing time. This formulation normalizes the makespan by the number of machines and the total processing time, allowing for the comparison of resource utilization across job-shop scheduling (JSP) instances of varying sizes.
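A small numerical check of this definition (hypothetical numbers, and assuming the reading of the formula given above):

# Hypothetical 3-machine schedule with makespan C = 10 and idle times l = (2, 0, 4).
m, C = 3, 10.0
idle = [2.0, 0.0, 4.0]
total_processing = m * C - sum(idle)    # 24 time units of useful work
efficiency = m * C / total_processing   # equals 1 + sum(idle) / total_processing
print(efficiency)                       # 1.25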


The problem of infinite cost

One of the first problems that must be dealt with in the JSP is that many proposed solutions have infinite cost: i.e., there exists x_\infty \in \mathcal{X} such that C(x_\infty) = +\infty. In fact, it is quite simple to concoct examples of such x_\infty by ensuring that two machines will deadlock, so that each waits for the output of the other's next step.
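In disjunctive-graph terms, such a deadlock is a directed cycle formed by the jobs' own operation orders together with the chosen machine orders. The Python sketch below illustrates this on a two-job, two-machine example (the node labels and function name are chosen here for illustration only).

# Two jobs, two machines.  Job routes: J1 runs on M1 then M2, J2 on M2 then M1.
# Chosen machine orders: M1 serves J2 before J1, M2 serves J1 before J2.
# Each arc (u, v) means operation u must finish before operation v starts.
edges = {
    ("J1", "M1"): [("J1", "M2")],   # J1: M1 before M2
    ("J2", "M2"): [("J2", "M1")],   # J2: M2 before M1
    ("J2", "M1"): [("J1", "M1")],   # M1: J2 before J1
    ("J1", "M2"): [("J2", "M2")],   # M2: J1 before J2
}

def has_cycle(graph):
    """Depth-first search for a directed cycle; a cycle means deadlock."""
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {v: WHITE for v in graph}
    def visit(v):
        colour[v] = GREY
        for w in graph.get(v, []):
            c = colour.get(w, WHITE)
            if c == GREY or (c == WHITE and visit(w)):
                return True
        colour[v] = BLACK
        return False
    return any(colour[v] == WHITE and visit(v) for v in graph)

print(has_cycle(edges))   # True: J1/M1 -> J1/M2 -> J2/M2 -> J2/M1 -> J1/M1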


Major results

Graham introduced the list scheduling algorithm in 1966, which is (2 − 1/m)-competitive, where ''m'' is the number of machines. It was later proven to be the optimal online algorithm for two and three machines. The Coffman–Graham algorithm (1972) for uniform-length jobs is also optimal for two machines and is (2 − 2/m)-competitive. In 1992, Bartal, Fiat, Karloff, and Vohra presented a 1.986-competitive algorithm, followed by a 1.945-competitive algorithm by Karger, Phillips, and Torng in 1994. That same year, Albers introduced a different, 1.923-competitive algorithm. The best known result is by Fleischer and Wahl, achieving a competitive ratio of 1.9201. Albers also established a lower bound of 1.852. Taillard instances play a key role in developing job-shop scheduling with a makespan objective. In 1976, Garey proved that this problem is NP-complete for ''m'' > 2, meaning that no optimal solution can be computed in polynomial time unless P = NP. In 2011, Xin Chen et al. provided optimal algorithms for online scheduling on two related machines, improving previous results.
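Graham's list scheduling rule itself is very simple: take the jobs in the order given by the list and always place the next job on the machine that is currently least loaded. The following Python sketch covers the setting of atomic jobs on identical machines (the function and variable names are choices made here for illustration).

import heapq

def list_schedule(processing_times, m):
    """Greedy list scheduling: place each job, in the given list order, on the
    machine that is currently least loaded; returns the resulting makespan."""
    loads = [0.0] * m
    heapq.heapify(loads)
    for p in processing_times:
        lightest = heapq.heappop(loads)
        heapq.heappush(loads, lightest + p)
    return max(loads)

print(list_schedule([2, 3, 7, 1, 4, 6], m=3))   # 9.0; an optimal offline schedule achieves 8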


Offline makespan minimization


Atomic jobs

The simplest form of the offline makespan minimisation problem deals with atomic jobs, that is, jobs that are not subdivided into multiple operations. It is equivalent to packing a number of items of various different sizes into a fixed number of bins, such that the maximum bin size needed is as small as possible. (If instead the number of bins is to be minimised and the bin size is fixed, the problem becomes a different problem, known as the bin packing problem.) Dorit S. Hochbaum and David Shmoys presented a polynomial-time approximation scheme in 1987 that finds an approximate solution to the offline makespan minimisation problem with atomic jobs to any desired degree of accuracy.
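The Hochbaum–Shmoys scheme is considerably more involved than what fits here; a much simpler and widely taught heuristic for the same atomic-job problem is Graham's longest-processing-time (LPT) rule, sketched below in Python. This is explicitly not the approximation scheme from the text, only an illustrative baseline.

import heapq

def lpt(processing_times, m):
    """Longest Processing Time first: sort the jobs in decreasing length, then
    greedily place each on the least-loaded machine.  A simple classical
    heuristic, not the Hochbaum-Shmoys approximation scheme."""
    loads = [0.0] * m
    heapq.heapify(loads)
    for p in sorted(processing_times, reverse=True):
        lightest = heapq.heappop(loads)
        heapq.heappush(loads, lightest + p)
    return max(loads)

print(lpt([2, 3, 7, 1, 4, 6], m=3))   # 8.0, which is optimal for this instance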


Jobs consisting of multiple operations

The basic form of the problem of scheduling jobs with multiple (M) operations over M machines, such that all of the first operations must be done on the first machine, all of the second operations on the second, etc., and a single job cannot be performed in parallel, is known as the flow-shop scheduling problem. Various algorithms exist, including genetic algorithms.


Johnson's algorithm

A heuristic algorithm by S. M. Johnson can be used to solve the case of a 2-machine, N-job problem when all jobs are to be processed in the same order (S. M. Johnson, Optimal two- and three-stage production schedules with setup times included, Naval Res. Log. Quart. 1 (1954), 61–68). Job Pi has two operations, of duration Pi1 and Pi2, to be done on machines M1 and M2 in that sequence. The steps of the algorithm are as follows:
*''Step 1.'' List A = {1, 2, ..., N}, List L1 = {}, List L2 = {}.
*''Step 2.'' From all available operation durations, pick the minimum. If the minimum belongs to Pk1, remove K from List A and add K to the end of List L1. If the minimum belongs to Pk2, remove K from List A and add K to the beginning of List L2.
*''Step 3.'' Repeat Step 2 until List A is empty.
*''Step 4.'' Join List L1 and List L2. This is the optimum sequence.
Johnson's method only works optimally for two machines. However, since it is optimal and easy to compute, some researchers have tried to adapt it to M machines (''M'' > 2). The idea is as follows: imagine that each job requires m operations in sequence, on M1, M2, ..., Mm. We combine the first ''m''/2 machines into an (imaginary) machining centre MC1, and the remaining machines into a machining centre MC2. Then the total processing time for a job P on MC1 is the sum of its operation times on the first ''m''/2 machines, and its processing time on MC2 is the sum of its operation times on the last ''m''/2 machines. By doing so, we have reduced the m-machine problem into a two-machining-centre scheduling problem, which we can solve using Johnson's method.
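A direct Python transcription of the four steps above (the list and function names are chosen here; ties in Step 2 are broken arbitrarily):

def johnson_two_machine(p1, p2):
    """Johnson's rule for the two-machine case.

    p1[i], p2[i] -- processing times of job i on machine 1 and machine 2.
    Returns a job sequence minimising the makespan."""
    A = set(range(len(p1)))          # Step 1: all jobs unscheduled
    L1, L2 = [], []                  # front and back of the final sequence
    while A:                         # Steps 2-3: repeatedly pick the smallest duration
        k = min(A, key=lambda i: min(p1[i], p2[i]))
        if p1[k] <= p2[k]:
            L1.append(k)             # smallest time is on machine 1: schedule early
        else:
            L2.insert(0, k)          # smallest time is on machine 2: schedule late
        A.remove(k)
    return L1 + L2                   # Step 4: the optimum sequence

# Example: 4 jobs with (machine 1, machine 2) durations (3,2), (5,4), (1,7), (6,3).
p1, p2 = [3, 5, 1, 6], [2, 4, 7, 3]
print(johnson_two_machine(p1, p2))   # [2, 1, 3, 0]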


Makespan prediction

Machine learning has recently been used to ''predict'' the optimal makespan of a JSP instance without actually producing the optimal schedule. Preliminary results show around 80% accuracy in classifying small randomly generated JSP instances by optimal scheduling efficiency using supervised learning.


Example

Here is an example of a job-shop scheduling problem formulated in AMPL as a mixed-integer programming problem with indicator constraints:

param N_JOBS;
param N_MACHINES;

set JOBS ordered = 1..N_JOBS;
set MACHINES ordered = 1..N_MACHINES;

param ProcessingTime {JOBS, MACHINES} > 0;

param CumulativeTime {i in JOBS, j in MACHINES} =
   sum {jj in MACHINES: ord(jj) <= ord(j)} ProcessingTime[i,jj];

param TimeOffset {i1 in JOBS, i2 in JOBS: i1 <> i2} =
   max {j in MACHINES}
      (CumulativeTime[i1,j] - CumulativeTime[i2,j] + ProcessingTime[i2,j]);

var end >= 0;
var start {JOBS} >= 0;
var precedes {i1 in JOBS, i2 in JOBS: ord(i1) < ord(i2)} binary;

minimize makespan: end;

subj to makespan_def {i in JOBS}:
   end >= start[i] + sum {j in MACHINES} ProcessingTime[i,j];

subj to no12_conflict {i1 in JOBS, i2 in JOBS: ord(i1) < ord(i2)}:
   precedes[i1,i2] ==> start[i2] >= start[i1] + TimeOffset[i1,i2];

subj to no21_conflict {i1 in JOBS, i2 in JOBS: ord(i1) < ord(i2)}:
   !precedes[i1,i2] ==> start[i1] >= start[i2] + TimeOffset[i2,i1];

data;

param N_JOBS := 4;
param N_MACHINES := 4;

param ProcessingTime:
        1   2   3   4 :=
   1    5   4   2   1
   2    8   3   6   2
   3    9   7   2   3
   4    3   1   5   8;
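As a usage note (an assumption about tooling, not part of the original example): the ==> indicator constraints are only handled by solvers that support them, such as CPLEX or Gurobi. After loading the model and data in AMPL and selecting such a solver, solving the model and then displaying start, end and precedes yields a concrete schedule for the four-job instance above.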


Related problems

* Flow-shop scheduling is a similar problem but without the constraint that each operation must be done on a specific machine (only the order constraint is kept).
* Open-shop scheduling is a similar problem but also without the order constraint.


See also

* Disjunctive graph
* Dynamic programming
* Genetic algorithm scheduling
* List of NP-complete problems
* Optimal control
* Scheduling (production processes)


References


External links


* University of Vienna: Directory of methodologies, systems and software for dynamic optimization.
* Taillard instances
* Brucker P., ''Scheduling Algorithms'', 5th ed., Springer, Heidelberg.