Manufacturing process modeling and optimization based on Multi-Layer Perceptron network

Cited by: 42
Authors
Liao, TW [1 ]
Chen, LJ
Affiliations
[1] Louisiana State Univ, Dept Ind & Mfg Syst Engn, Baton Rouge, LA 70803 USA
[2] TA Instruments Inc, New Castle, DE 19720 USA
Source
JOURNAL OF MANUFACTURING SCIENCE AND ENGINEERING-TRANSACTIONS OF THE ASME | 1998, Vol. 120, No. 1
Keywords
DOI
10.1115/1.2830086
CLC Number
T [Industrial Technology];
Discipline Code
08;
Abstract
It has been shown that a manufacturing process can be modeled (learned) using a Multi-Layer Perceptron (MLP) neural network and then optimized directly using the learned network. This paper extends that work by examining several different MLP training algorithms for manufacturing process modeling and three methods for process optimization. The transformation method is used to convert a constrained objective function into an unconstrained one, which then serves as the error function in the process optimization stage. The simulation results indicate that: (i) the conjugate gradient algorithms with backtracking line search outperform the standard BP algorithm in convergence speed; (ii) the neural network approaches can yield more accurate process models than the regression method; (iii) BP with simulated annealing is the most reliable optimization method for generating the best optimal solution; and (iv) process optimization performed directly on the neural network is possible but cannot easily be fully automated, especially when the process concerned is a mixed-integer problem.
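The abstract's core idea — train an MLP as a process model, then optimize the process by gradient descent on the network's *inputs*, folding constraints into an unconstrained error function via a penalty transformation — can be sketched as follows. Everything here (the toy process data, network size, learning rates, and the constraint) is hypothetical and chosen only for illustration; it is not the paper's actual experimental setup.

```python
import numpy as np

# --- Hypothetical process data: settings x -> response y (illustrative only) ---
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 2))
y = ((X[:, 0] - 0.3) ** 2 + (X[:, 1] - 0.7) ** 2).reshape(-1, 1)  # toy "process"

# --- One-hidden-layer MLP trained with plain gradient descent (standard BP) ---
n_h = 8
W1 = rng.normal(0, 0.5, (2, n_h)); b1 = np.zeros(n_h)
W2 = rng.normal(0, 0.5, (n_h, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

lr = 0.1
for _ in range(5000):
    h, out = forward(X)
    err = out - y                        # dE/d(out) for squared error
    gW2 = h.T @ err / len(X); gb2 = err.mean(0)
    dh = (err @ W2.T) * (1 - h ** 2)     # backprop through tanh
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2; W1 -= lr * gW1; b1 -= lr * gb1

# --- Process optimization on the learned network ---
# Penalty "transformation": minimize net(x) + mu * violation(x)^2, i.e. the
# constrained objective becomes an unconstrained error function over inputs x.
mu = 10.0
x = np.array([0.9, 0.1])                 # starting process setting
for _ in range(2000):
    h = np.tanh(x @ W1 + b1)
    # d net / d x: backpropagate to the *inputs* instead of the weights
    dnet_dx = W1 @ ((1 - h ** 2) * W2[:, 0])
    viol = max(0.0, 0.2 - x[0])          # illustrative constraint: x0 >= 0.2
    dpen_dx = np.array([-2 * mu * viol, 0.0])
    x -= 0.01 * (dnet_dx + dpen_dx)
    x = np.clip(x, 0.0, 1.0)             # keep settings in the feasible box

print("optimized setting:", x)
```

The same input-gradient machinery underlies the paper's comparison of optimization methods; swapping the plain descent step for simulated annealing moves or a conjugate-gradient direction changes only the update rule, not the penalty-transformed error function.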
Pages: 109-119
Page count: 11