Resource allocation and load-shedding policies based on Markov decision processes for renewable energy generation and storage
Title: | Resource allocation and load-shedding policies based on Markov decision processes for renewable energy generation and storage. | |
---|---|---|
Name(s): | Jimenez, Edwards, Author; Atia, George, Committee Chair; Richie, Samuel, Committee Member; Pazour, Jennifer, Committee Member; University of Central Florida, Degree Grantor | |
Type of Resource: | text | |
Date Issued: | 2015 | |
Publisher: | University of Central Florida | |
Language(s): | English | |
Abstract/Description: | In modern power systems, renewable energy has become an increasingly popular form of energy generation as a result of the rules and regulations being implemented worldwide to achieve clean energy. However, clean energy can have drawbacks in several forms. Wind energy, for example, can introduce intermittency. In this thesis, we discuss a method to deal with this intermittency. In particular, by shedding a specific amount of load we can avoid a total breakdown of the power plant. The load-shedding method discussed in this thesis utilizes a Markov decision process (MDP) with backward policy iteration. This is a probabilistic method that chooses the load-shedding path that minimizes the total expected cost while ensuring no power failure. We compare our results with two control policies: a load-balancing policy and a less-load-shedding policy. It is shown that the proposed MDP policy outperforms the other control policies and achieves the minimum total expected cost. | |
Identifier: | CFE0005635 (IID), ucf:50222 (fedora) | |
Note(s): | 2015-05-01; M.S.E.E.; Engineering and Computer Science, Electrical Engr and Comp Sci; Masters; This record was generated from author submitted information. | |
Subject(s): | natural energy -- markov decision process -- MDP -- load-shedding -- energy storage -- intermittency -- expected cost | |
Persistent Link to This Record: | http://purl.flvc.org/ucf/fd/CFE0005635 | |
Restrictions on Access: | public 2015-05-15 | |
Host Institution: | UCF |
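
The abstract describes a finite-horizon Markov decision process solved by backward policy iteration to choose load-shedding actions that minimize the total expected cost. As a rough illustration of that general idea, not the thesis's actual model, the Python sketch below runs backward induction over a small, hypothetical load-shedding MDP; the state space, action space, transition probabilities, costs, and horizon are all assumed placeholders.

```python
import numpy as np

# Minimal sketch of finite-horizon backward induction for a load-shedding
# MDP, loosely following the idea in the abstract. All model ingredients
# (state/action spaces, transition probabilities, costs, horizon) are
# made-up placeholders, not the thesis's actual formulation.

n_states = 5      # assumed: discretized storage/net-demand levels
n_actions = 3     # assumed: amount of load to shed (none, partial, large)
horizon = 24      # assumed: number of decision stages

rng = np.random.default_rng(0)

# P[a, s, s']: probability of moving from state s to s' under action a
P = rng.dirichlet(np.ones(n_states), size=(n_actions, n_states))
# C[s, a]: immediate cost of taking action a in state s
# (e.g. cost of shed load plus risk of power failure)
C = rng.uniform(0.0, 1.0, size=(n_states, n_actions))

# Backward induction: sweep from the final stage to the first,
# keeping the action that minimizes the total expected cost-to-go.
V = np.zeros((horizon + 1, n_states))            # terminal cost assumed zero
policy = np.zeros((horizon, n_states), dtype=int)

for t in range(horizon - 1, -1, -1):
    # Q[s, a] = C[s, a] + E[ V_{t+1}(s') | s, a ]
    Q = C + np.stack([P[a] @ V[t + 1] for a in range(n_actions)], axis=1)
    policy[t] = Q.argmin(axis=1)                 # best shedding action per state
    V[t] = Q.min(axis=1)                         # minimum expected cost-to-go

print("Minimum total expected cost from each initial state:", V[0])
```

Under these assumptions, `policy[t][s]` gives the shedding action for state `s` at stage `t`, and `V[0]` is the minimum total expected cost the abstract refers to; the load-balancing and less-load-shedding baseline policies mentioned in the abstract are not modeled here.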