Markov Decision Process to Optimise Long-term Asset Maintenance and Technologies Investment in Chemical Industry
Result identifiers
Result code in IS VaVaI
<a href="https://www.isvavai.cz/riv?ss=detail&h=RIV%2F00216305%3A26210%2F21%3APU141356" target="_blank" >RIV/00216305:26210/21:PU141356 - isvavai.cz</a>
Result on the web
<a href="https://www.sciencedirect.com/science/article/abs/pii/B9780323885065502874?via%3Dihub" target="_blank" >https://www.sciencedirect.com/science/article/abs/pii/B9780323885065502874?via%3Dihub</a>
DOI - Digital Object Identifier
<a href="http://dx.doi.org/10.1016/B978-0-323-88506-5.50287-4" target="_blank" >10.1016/B978-0-323-88506-5.50287-4</a>
Alternative languages
Result language
English
Title in the original language
Markov Decision Process to Optimise Long-term Asset Maintenance and Technologies Investment in Chemical Industry
Description in the original language
Decisions on synthesising a process network often aim to optimise the payback period based on investment cost. In addition to the core investment and the cost of the resources used, the long-term reliable operation of the process is also crucial. Given the available states and technologies of the assets, this study aims to identify the long-term optimal asset planning policy. A Markov Decision Process (MDP) is a promising tool for identifying the optimal policy under different states of the assets or equipment. The failure probability of a unit is modelled with the 'bathtub' model, and each of the condition states is incorporated into the MDP. Decisions to implement redundant units in the process with a variety of technologies are allowed. This paper reformulates the MDP as an equivalent Mixed-Integer Non-linear Programming (MINLP) problem to solve for the optimal long-term asset decisions and the maintenance policy. The applicability of the method is tested on a real case study from a Sinopec petrochemical plant. The capital cost and the expected operational cost, which accounts for equipment maintenance over an infinite time horizon, are determined.
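The 'bathtub' failure model mentioned in the description can be sketched as follows. This is an illustrative sketch only: the functional form (a decaying early-life term, a constant term, and a linear wear-out term) and all parameter values are assumptions for illustration, not the chapter's actual model.

```python
import math

def bathtub_hazard(t, a=0.10, b=0.5, c=0.01, d=0.002):
    """Hazard rate: early-life (decaying), constant, and wear-out terms.
    Parameters a, b, c, d are illustrative assumptions."""
    return a * math.exp(-b * t) + c + d * t

def failure_prob(t):
    """Per-period failure probability derived from the hazard rate."""
    return 1.0 - math.exp(-bathtub_hazard(t))

hazards = [bathtub_hazard(t) for t in range(40)]
# The hazard falls during burn-in, then rises again in the wear-out
# phase, so the minimum sits in the flat mid-life region of the curve.
t_min = min(range(40), key=lambda t: hazards[t])
print(t_min, round(hazards[t_min], 4))
```

Each condition state of a unit can then be assigned a failure probability from this curve when building the MDP's transition matrix.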
Title in English
Markov Decision Process to Optimise Long-term Asset Maintenance and Technologies Investment in Chemical Industry
Description in English
Decisions on synthesising a process network often aim to optimise the payback period based on investment cost. In addition to the core investment and the cost of the resources used, the long-term reliable operation of the process is also crucial. Given the available states and technologies of the assets, this study aims to identify the long-term optimal asset planning policy. A Markov Decision Process (MDP) is a promising tool for identifying the optimal policy under different states of the assets or equipment. The failure probability of a unit is modelled with the 'bathtub' model, and each of the condition states is incorporated into the MDP. Decisions to implement redundant units in the process with a variety of technologies are allowed. This paper reformulates the MDP as an equivalent Mixed-Integer Non-linear Programming (MINLP) problem to solve for the optimal long-term asset decisions and the maintenance policy. The applicability of the method is tested on a real case study from a Sinopec petrochemical plant. The capital cost and the expected operational cost, which accounts for equipment maintenance over an infinite time horizon, are determined.
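The MDP formulation described above can be illustrated with a toy example: a single asset with three condition states, operate/maintain/replace actions, and an infinite-horizon discounted cost, solved here by value iteration. All states, actions, transition probabilities, and costs below are invented for illustration; the chapter itself solves the problem via an equivalent MINLP reformulation rather than value iteration.

```python
import numpy as np

states = ["good", "degraded", "failed"]
actions = ["operate", "maintain", "replace"]

# P[a][s][s'] = transition probability from state s to s' under action a
# (illustrative numbers, not the chapter's data)
P = np.array([
    # operate: the asset degrades stochastically
    [[0.80, 0.15, 0.05],
     [0.00, 0.70, 0.30],
     [0.00, 0.00, 1.00]],
    # maintain: restores degraded units toward 'good'
    [[0.95, 0.04, 0.01],
     [0.60, 0.35, 0.05],
     [0.00, 0.00, 1.00]],
    # replace: a new unit, back to 'good'
    [[1.00, 0.00, 0.00],
     [1.00, 0.00, 0.00],
     [1.00, 0.00, 0.00]],
])

# C[a][s] = immediate cost of taking action a in state s
C = np.array([
    [0.0,  2.0, 50.0],   # operating a failed unit incurs downtime cost
    [1.0,  3.0, 50.0],   # maintenance cost
    [10.0, 10.0, 12.0],  # capital cost of replacement
])

gamma = 0.95  # discount factor for the infinite horizon

# Value iteration: minimise expected discounted cost
V = np.zeros(len(states))
for _ in range(1000):
    Q = C + gamma * P @ V      # Q[a, s]: cost-to-go of action a in state s
    V_new = Q.min(axis=0)
    if np.max(np.abs(V_new - V)) < 1e-9:
        V = V_new
        break
    V = V_new

policy = [actions[a] for a in Q.argmin(axis=0)]
print(dict(zip(states, policy)))
```

With these assumed numbers the optimal policy keeps good units running, maintains degraded ones, and replaces failed ones; the chapter's MINLP additionally chooses among redundant units and technologies.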
Classification
Type
C - Chapter in a scholarly book
CEP field
—
OECD FORD field
20704 - Energy and fuels
Result linkages
Project
<a href="/cs/project/EF15_003%2F0000456" target="_blank" >EF15_003/0000456: Laboratoř integrace procesů pro trvalou udržitelnost</a><br>
Linkages
P - Research and development project financed from public funds (with a link to CEP)
Others
Year of implementation
2021
Data confidentiality code
S - Complete and true data on the project are not subject to protection under special legal regulations
Data specific to the result type
Name of the book or proceedings
31st European Symposium on Computer Aided Process Engineering
ISBN
978-0-323-88506-5
Number of pages of the result
6
Pages from-to
1853-1858
Number of book pages
2140
Publisher name
Elsevier Ltd.
Place of publication
Not specified
UT WoS code of the chapter
—