by Mike Wise, Data Scientist / Solution Architect – MCS Incubation Services
Building energy use is a huge component of overall energy consumption. In the USA, for example, around 40 percent of the energy consumed goes into buildings, and as the USA consumes something like 25 percent of the world's energy, it is clear we are talking about a multi-trillion-dollar segment with a huge environmental footprint. So it makes sense to look for ways to cut it back. And as more and more of our energy comes from renewable but intermittent sources like solar and wind, we will need to introduce variable pricing that reflects the current availability of energy, facilitating the move to sustainable solutions.
Load shaping describes the process of shifting the timing of your energy consumption to take advantage of low energy prices when they occur, or to avoid times of high prices. There are many ways to do this, and I was fortunate enough to be tasked with implementing a form of load shaping to reduce wintertime peak energy usage on the Microsoft campus, in cooperation with Microsoft RE&F (Real Estate and Finance). R was the language we used for it.
Business Problem and Solution
While load shaping is surely in our future, variably priced energy for consumers is still very much the exception, due in part to the relative rarity of smart meter deployments.
However, one form that is very common in the commercial marketplace is “demand charging”: a demand surcharge proportionate to the worst 15-minute period in a month. It is rarely seen in the retail market, but it is pretty common worldwide in the commercial space because it is easy to implement without smart meters. For example, note the following figure, a typical bill for a rather large building on campus. The demand charge is a power charge (kW), whereas all of the other consumption-based charges are for energy (kWh).
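To make the mechanics concrete, here is a small sketch of how a demand charge is computed from 15-minute interval data. The readings, the rate, and the numbers are all hypothetical; the point is only that a single bad interval sets the charge for the entire month.

```python
# Sketch: computing a monthly demand charge from 15-minute interval data.
# All readings and the rate below are hypothetical, for illustration only.

def demand_charge(interval_kwh, rate_per_kw):
    """interval_kwh: energy (kWh) used in each 15-minute interval of the month.
    A 15-minute interval is 0.25 h, so average power (kW) = kWh / 0.25."""
    peak_kw = max(kwh / 0.25 for kwh in interval_kwh)
    return peak_kw, peak_kw * rate_per_kw

# A toy month: a steady 100 kWh per interval (400 kW), plus one
# cold-morning spike of 200 kWh (800 kW) that sets the whole month's charge.
intervals = [100.0] * 2975 + [200.0]
peak, charge = demand_charge(intervals, rate_per_kw=10.0)
print(peak, charge)  # one spike -> 800 kW billed peak for the month
```

Note that shaving that one spike back to the baseline would halve the demand charge, which is exactly why a small amount of well-timed preheating can pay off.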
On this campus the worst peaks occur on some cold winter mornings, when the buildings are electrically heated and have been allowed to cool down overnight. This means that they have to be heated up aggressively again, just as everyone is coming in, powering up their computers, brewing coffee, and turning on the lights. This is sometimes referred to as “sharktoothing” because of the shape of the energy usage profile.
So what we want to do is get rid of the peaks by spending a relatively small amount of energy to preheat the buildings on cold nights.
While this had been attempted manually in the past, it was abandoned for several reasons:
- It was hard to predict the size of the peaks, as they were sensitive to the weather, so it was unclear whether preheating would even be necessary.
- It was difficult to manually guess the optimal pre-heating profile that avoids the peak while also minimizing energy usage.
So this was seen as a natural fit for an automated machine learning approach.
Model Design: Going from Predictive to Prescriptive Modeling
In a previous project, we had created weather- and time-based energy prediction models for these buildings, and we anticipated this would be an easy extension of those models, using much of the same data and code. This proved not to be true, which is typical of what can happen when a purely predictive solution is adapted into a prescriptive solution, where the algorithms not only have to predict the future, but also prescribe an optimal course forward. Several things needed to be done that were not necessary in the predictive model:
- The control variable (the thermostat settings) was not in the original model, as that behavior did not vary with the weather and thus did not affect the energy prediction. As these settings are building- and time-dependent, we needed an entirely new model to predict them as well.
- There was no tracking of the internal temperature (i.e. the stored energy) in the original models. Obviously we would need this if we were going to optimize the preheating values.
- There are thousands of thermostat and temperature measurements in our biggest buildings, so these needed to be aggregated and cleansed as well.
So we had to cleanse new data and develop two new models. We used random forests for all of them, as that approach had worked well in the prediction project, using the caret package to tune the models and compare different techniques with each other. Our plots were all in ggplot2 (and a bit of rgl), but we also made extensive use of many other elements of the “tidyverse” like dplyr and reshape2. Our data resided in a Hive database running under Azure HDInsight.
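To illustrate the shape of the modeling step: a random-forest regressor trained on weather and time-of-day features to predict building energy use. The project itself used R (random forests tuned with caret); the sketch below is a scikit-learn analogue on entirely synthetic data, not the actual models or data.

```python
# Illustrative analogue of the energy prediction model: a random forest
# over weather and time features. Synthetic data only; the real project
# used R's randomForest tuned with caret on measured building data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 2000
outside_temp = rng.uniform(-5, 25, n)   # deg C
hour_of_day = rng.integers(0, 24, n)
is_workday = rng.integers(0, 2, n)

# Synthetic target: heating load grows as it gets colder, plus an
# occupancy-driven daytime component on workdays, plus noise.
energy_kwh = (
    np.maximum(0, 18 - outside_temp) * 5
    + is_workday * np.where((hour_of_day >= 7) & (hour_of_day <= 18), 40, 0)
    + rng.normal(0, 5, n)
)

X = np.column_stack([outside_temp, hour_of_day, is_workday])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, energy_kwh)

# A cold workday morning should predict a much higher load than a mild night.
cold_morning = model.predict([[-5.0, 8, 1]])[0]
mild_night = model.predict([[20.0, 2, 0]])[0]
print(round(cold_morning), round(mild_night))
```

The prescriptive extension then adds the thermostat-setting model and the internal-temperature state on top of a predictor like this one.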
Also, as in the preceding project, we made heavy use of R Markdown to document our various approaches and find the best cleansing techniques and models. This phase was mostly before the advent of RTVS (R Tools for Visual Studio), so we used RStudio for most of it, though we did use these results in “alpha trials” of RTVS.
Of course a dynamic model is not guaranteed to converge (it is not subject to the same physical constraints that reality is), and we did indeed have a number of convergence problems, which we eventually overcame by adjusting various parameters and choosing different sets of explanatory variables.
Originally we had intended to trial this in the winter of 2015/2016, but model development took longer than planned, and the Pacific Northwest experienced an unusually warm winter anyway. So we turned to simulations to try out our model. For deployment we went with an R Shiny application, developed in RStudio and later RTVS, as that allows the engineers to try out various optimization parameters and see how well the load shaping works.
We used a simple brute-force optimization (simply trying out various parameter values laid out in a 2D grid), as at this point we were only interested in seeing whether the model worked. A more sophisticated optimization algorithm will be developed before deployment.
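The brute-force step can be sketched as follows: score every combination of pre-heating level and duration on a small 2D grid and keep the cheapest. The toy cost model here is invented for illustration (hypothetical rates, and a crude "preheating shaves the peak" rule); the real project scored candidates with the trained building models instead.

```python
# Sketch of the brute-force grid optimization over pre-heating level and
# duration. The cost model is a stand-in: the actual project evaluated
# each grid point with its trained random-forest building models.
import itertools

DEMAND_RATE = 10.0   # $ per kW of monthly peak (hypothetical)
ENERGY_RATE = 0.08   # $ per kWh (hypothetical)

def total_cost(level, hours):
    """Toy model: preheating spends energy overnight but shaves the
    morning peak roughly in proportion to level * hours."""
    preheat_kwh = level * hours * 300.0                   # energy spent preheating
    peak_kw = max(400.0, 800.0 - level * hours * 120.0)   # shaved morning peak
    return peak_kw * DEMAND_RATE + preheat_kwh * ENERGY_RATE

# The same 3x3 grid as in the figure: levels 0.1/0.5/0.9, durations 2/3/4 h.
grid = itertools.product([0.1, 0.5, 0.9], [2, 3, 4])
best = min(grid, key=lambda p: total_cost(*p))
print(best, round(total_cost(*best), 2))  # (0.9, 4) 4086.4
```

Even in this toy version, the demand charge dominates the modest preheating energy cost, so the heaviest preheating wins, mirroring the result in the figure below.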
In the four plot grids (each with nine panels) in the figure below, we see how varying the pre-heating level (0.1, 0.5, and 0.9, in the X direction) over 2, 3, and 4 hours (in the Y direction) before 7:45 AM results in different energy use profiles and internal air temperatures (IAT) in the building. In this case the optimal result is the maximum preheating (0.9 for 4 hours), which is not unexpected, as this was our biggest recorded peak in the data and thus required a large amount of pre-heating. The peak was reduced by about 25% in this case (the blue area).
While we have developed convincing models that fit our measured data, it is the nature of prescriptive solutions that they predict behavior outside of the measured envelope. Thus we need to try this out in a real-life situation to see whether the building actually behaves in this manner when so controlled. We are aiming for a deployment in the winter of 2016/17, though we will of course also need the winter weather to cooperate and deliver some cold days for us to test with.
And while this particular solution is clearly only relevant for electrically heated buildings in cold climates, the general approach of applying load shaping to HVAC systems to achieve optimal-cost energy usage under a variable energy price regime will clearly see wide-scale deployment as we move to sustainable energy solutions.