A. Friebe, A. Marchetti-Spaccamela, T. Cucinotta, A. V. Papadopoulos, T. Nolte, and S. Baruah, "Resource Management for Stochastic Parallel Synchronous Tasks: Bandits to the Rescue," to appear in Real-Time Systems (Springer), 2025.
In scheduling real-time tasks, we face the challenge of meeting hard deadlines while optimizing for some other objective, such as minimizing energy consumption. Formulating the optimization as a Multi-Armed Bandit (MAB) problem allows us to use MAB strategies to balance the exploitation of good choices, based on observed data, with the exploration of potentially better options. In this paper, we integrate hard real-time constraints with MAB strategies for resource management of a Stochastic Parallel Synchronous Task. On a platform with M cores available for the task, m ≤ M cores are initially assigned. Prior work has shown how to compute a virtual deadline such that, if the task has not completed by this virtual deadline, assigning all M cores to it guarantees that the hard deadline is met. An MAB strategy is used to select the value of m. A Dynamic Power Management (DPM) energy model that accounts for CPU sockets and sleep states is described. Experimental evaluation shows that MAB strategies consistently learn suitable values of m, and perform well compared to binary exponential search and greedy methods.
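To make the scheme concrete, here is a minimal sketch (not the authors' implementation) of the idea described above: a UCB1 bandit picks the initial core count m for each job of a stochastic parallel task, and if the job is still running at a precomputed virtual deadline, all M cores are assigned, which, per the prior-work guarantee cited in the abstract, ensures the hard deadline is met. The workload distribution, the linear speedup, the wake-up energy cost, and all parameter values below are illustrative assumptions, not the paper's DPM model.

```python
import math
import random

M = 8                 # cores on the platform (assumed)
DEADLINE = 10.0       # relative hard deadline (assumed)
W_MAX = 32.0          # assumed worst-case work
VIRTUAL_DEADLINE = DEADLINE - W_MAX / M   # = 6.0: M cores can always finish in time
E_WAKE = 3.0          # assumed energy overhead per core woken from sleep


def run_job(m):
    """Simulate one job on m cores; escalate to M cores at the virtual deadline."""
    work = random.uniform(16.0, W_MAX)        # stochastic demand (assumed model)
    if work / m <= VIRTUAL_DEADLINE:          # m cores finish before the VD
        t, woken = work / m, m
    else:                                     # escalate: wake the remaining cores
        remaining = work - m * VIRTUAL_DEADLINE
        t, woken = VIRTUAL_DEADLINE + remaining / M, M
    energy = work + E_WAKE * woken            # active energy + wake-up overhead
    return t, energy


counts = [0] * (M + 1)       # pulls per arm m = 1..M
mean_cost = [0.0] * (M + 1)  # running mean energy per arm

for job in range(1, 2001):
    untried = [a for a in range(1, M + 1) if counts[a] == 0]
    if untried:                               # play every arm once first
        m = untried[0]
    else:                                     # UCB1, adapted to cost minimization
        m = min(range(1, M + 1),
                key=lambda a: mean_cost[a]
                - math.sqrt(2.0 * math.log(job) / counts[a]))
    t, energy = run_job(m)
    assert t <= DEADLINE + 1e-9               # the hard deadline is never missed
    counts[m] += 1
    mean_cost[m] += (energy - mean_cost[m]) / counts[m]

best = min(range(1, M + 1), key=lambda a: mean_cost[a])
print(f"learned m = {best}, mean energy = {mean_cost[best]:.2f}")
```

Under these toy assumptions the bandit faces a genuine trade-off: a small m risks missing the virtual deadline and paying the wake-up cost for all M cores, while a large m pays that cost up front on every job, so the learner converges to the smallest m that usually completes before the virtual deadline.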