Title

A new generation of optimization algorithms within the Lagrangian relaxation approach for job shop scheduling

Date of Completion

January 1999

Keywords

Engineering, Electronics and Electrical|Engineering, Industrial|Operations Research

Degree

Ph.D.

Abstract

The subgradient method and bundle methods are frequently used in Lagrangian relaxation for integer optimization problems. This dissertation develops a new generation of optimization algorithms that provide better performance.

In the subgradient method, the relaxed problem must be solved optimally to obtain a subgradient direction. The first part of the dissertation develops a surrogate subgradient method, in which a proper direction can be obtained with only approximate optimization of the relaxed problem. This method obtains directions with much less effort and provides a new approach that is especially powerful for problems of very large size. In applying the method to job shop scheduling, a simplified dynamic programming procedure is developed, leading to much reduced computational complexity and faster convergence.

In bundle methods, quadratic programming, line search, and null steps are required to obtain good search directions. The second part of the dissertation improves bundle methods by making good use of the information obtained from minimizing the relaxed problem: not just the minimum solution, but also near-minimum solutions. The bundle information is thus enriched, leading to better search directions and fewer null steps. Furthermore, a simplified bundle method is developed in which quadratic programming, line search, and null steps are no longer necessary. The simplified bundle method is then applied to job shop scheduling, and fuzzy dynamic programming is developed to obtain a search direction efficiently.

The third part of the dissertation presents a novel Lagrangian Relaxation Neural Network (LRNN) for optimization problems, combining recurrent neural network optimization ideas with Lagrangian relaxation for constraint handling. The convergence of the network is proved, and a general framework for neural implementation is established, allowing creative variations. When the network is applied to job shop scheduling, neuron-based dynamic programming is developed, making innovative use of the subproblem structure. Architectural design issues for the hardware implementation of such neural networks are also explored. A digital circuit with a microcontroller and an optimization chip is designed, with both a parallel architecture and a pipeline architecture developed for the optimization chip.
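For illustration only (the code below is not taken from the dissertation), the following Python sketch shows the basic idea behind the surrogate subgradient method described above: the multiplier is updated using a direction obtained by re-optimizing only part of the relaxed problem. The problem data, variable names, and acceptance test are assumptions chosen to keep the example small.

# Illustrative sketch (not code from the dissertation): a surrogate
# subgradient update for the Lagrangian relaxation of a tiny 0-1
# covering problem,
#     min  sum(c[j] * x[j])   s.t.  sum(a[j] * x[j]) >= b,  x[j] in {0, 1}.
# The coupling constraint is relaxed with a multiplier lam >= 0, so the
# relaxed problem separates into one trivial subproblem per variable.
# Only ONE subproblem is re-optimized per iteration; the resulting
# "surrogate subgradient" is still used to update the multiplier.

c = [4.0, 3.0, 6.0, 5.0]   # hypothetical item costs
a = [2.0, 1.0, 3.0, 2.0]   # hypothetical contributions to the coupling constraint
b = 5.0                    # required total contribution
n = len(c)

def solve_subproblem(j, lam):
    # Minimize (c[j] - lam * a[j]) * x[j] over x[j] in {0, 1}.
    return 1 if c[j] - lam * a[j] < 0 else 0

def lagrangian(x, lam):
    # Lagrangian value of a (possibly non-optimal) primal point x.
    return sum(c[j] * x[j] for j in range(n)) + lam * (b - sum(a[j] * x[j] for j in range(n)))

lam, x = 0.0, [0] * n
for k in range(50):
    j = k % n                          # re-optimize a single subproblem
    x_trial = list(x)
    x_trial[j] = solve_subproblem(j, lam)
    # Keep the trial point only if it does not worsen the Lagrangian at
    # the current multiplier, an acceptance test of the surrogate method.
    if lagrangian(x_trial, lam) <= lagrangian(x, lam):
        x = x_trial
    g = b - sum(a[j] * x[j] for j in range(n))   # surrogate subgradient
    lam = max(0.0, lam + g / (k + 1))            # projected, diminishing step

print("multiplier:", round(lam, 3), "relaxed solution:", x)

In the dissertation, the same idea is applied to job shop scheduling, where the subproblems are solved by simplified dynamic programming rather than the trivial per-variable minimization used in this sketch.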