Optimum maintenance policy using semi-Markov decision processes

Cited by: 43
Authors
Tomasevicz, Curtis L. [1 ]
Asgarpoor, Sohrab [1 ]
Affiliations
[1] Univ Nebraska, Dept Elect Engn, WSEC, Lincoln, NE 68588 USA
Funding
U.S. National Science Foundation;
Keywords
Equipment availability; Maintenance; Power system maintenance; Semi-Markov process; Semi-Markov decision process;
DOI
10.1016/j.epsr.2009.03.008
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline codes
0808; 0809;
Abstract
A method is presented to solve for the optimum maintenance policy of repairable power equipment. A semi-Markov decision process (SMDP) is utilized to determine whether maintenance should be performed in each deterioration state and, if so, what type of maintenance. The approach uses a model that assumes equipment can fail due to both deterioration and random occurrences. Preventive maintenance can be performed from each working state to prevent deterioration failure. Minor maintenance will most likely return the equipment to the immediate preceding working state, while major maintenance can return the equipment to any of the previous working states. An example is used to demonstrate the method using MATLAB software. (c) 2009 Elsevier B.V. All rights reserved.
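The optimization the abstract describes — choosing, in each deterioration state, among continued operation, minor maintenance, and major maintenance — can be sketched with discounted value iteration on a small SMDP. The sketch below is in Python rather than the paper's MATLAB, and every number in it (transition matrices, sojourn times, rewards, discount rate) is an invented placeholder, not data from the paper:

```python
import numpy as np

# Illustrative 4-state SMDP for the maintenance problem in the abstract.
# States: 0 = like new, 1 = minor deterioration, 2 = major deterioration,
#         3 = failed.
# Actions: 0 = keep running, 1 = minor maintenance, 2 = major maintenance.
STATES, ACTIONS = 4, 3
beta = 0.001  # continuous-time discount rate (assumed)

# P[a, s, s']: embedded-chain transition probabilities (all invented).
P = np.zeros((ACTIONS, STATES, STATES))
# Action 0 (run): deterioration plus a random-failure chance; from the
# failed state the only option is repair back to like-new.
P[0] = [[0.00, 0.90, 0.00, 0.10],
        [0.00, 0.00, 0.80, 0.20],
        [0.00, 0.00, 0.00, 1.00],
        [1.00, 0.00, 0.00, 0.00]]
# Action 1 (minor maintenance): most likely returns the equipment to the
# immediately preceding working state, as the abstract describes.
P[1] = [[1.00, 0.00, 0.00, 0.00],
        [0.90, 0.10, 0.00, 0.00],
        [0.00, 0.90, 0.10, 0.00],
        [1.00, 0.00, 0.00, 0.00]]
# Action 2 (major maintenance): can return to any previous working state.
P[2] = [[1.00, 0.00, 0.00, 0.00],
        [0.95, 0.05, 0.00, 0.00],
        [0.70, 0.25, 0.05, 0.00],
        [1.00, 0.00, 0.00, 0.00]]

# tau[a, s]: expected sojourn time (hours); R[a, s]: expected lump reward
# (operating revenue minus maintenance/repair cost). Both invented.
tau = np.array([[2000., 1500., 1000., 200.],
                [  50.,   50.,   50., 200.],
                [ 150.,  150.,  150., 200.]])
R = np.array([[ 400.,  250.,  100., -500.],
              [ -50.,  -50.,  -50., -500.],
              [-150., -150., -150., -500.]])

# Discounted-reward SMDP value iteration: the effective discount factor
# for (state, action) is exp(-beta * tau), reflecting the random sojourn.
gamma = np.exp(-beta * tau)              # shape (ACTIONS, STATES)
V = np.zeros(STATES)
for _ in range(10_000):
    Q = R + gamma * np.einsum('ast,t->as', P, V)
    V_new = Q.max(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-9:
        V = V_new
        break
    V = V_new

policy = Q.argmax(axis=0)                # best action in each state
print("optimal actions per state:", policy)
```

With these placeholder numbers the iteration converges because every effective discount factor is below one; the resulting policy keeps new equipment running and schedules maintenance once deterioration appears, which is the qualitative behavior the abstract's example illustrates.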
Pages: 1286-1291 (6 pages)
References (13)
[1] Anonymous, 1996, Stochastic Processes.
[2] Chan, G.K., 2001, Proceedings of the 2001 North American Power Symposium, p. 510.
[3] Endrenyi, J., 2006, IEEE Power & Energy Magazine, 4, p. 59, DOI 10.1109/MPAE.2006.1632455.
[4] Endrenyi, J., Anders, G.J., da Silva, A.M.L., 1998, Probabilistic evaluation of the effect of maintenance on reliability - An application - Discussion, IEEE Transactions on Power Systems, 13(2), p. 583.
[5] Tijms, H.C., 1986, Stochastic Modelling and Analysis.
[6] Howard, R., 1971, Dynamic Probabilistic Systems, Vol. II: Semi-Markov and Decision Processes.
[7] Howard, R.A., 1960, Dynamic Programming and Markov Processes.
[8] Lewis, M.E., Puterman, M.L., 2001, A probabilistic analysis of bias optimality in unichain Markov decision processes, IEEE Transactions on Automatic Control, 46(1), pp. 96-100.
[9] Limnios, N., 2001, Semi-Markov Processes and Reliability.
[10] Puterman, M.L., 1994, Markov Decision Processes: Discrete Stochastic Dynamic Programming.