Some observations and tips for tackling major problems in multi-label classification of time-series data.
Sequence-to-point Neural Networks for NILM
This paper presents a sequence-to-point architecture that slides windows over the mains signal to capture patterns for a disaggregation result.
NILM (Non-Intrusive Load Monitoring) has become one of the important processes in understanding customer behavior and building scalable infrastructure solutions. The most basic use case of NILM is to give end-users a better understanding of their own household’s energy consumption. This in turn helps users develop better consumption habits, which results in a 5–10% reduction in consumption. So far, we have seen many sequence-modelling methods, from HMMs to deep learning architectures such as RNNs, LSTMs, GRUs, and Transformers, that predominantly depend upon sliding windows to capture patterns for a disaggregation result.
Key Applications of NILM
Identifying energy-saving opportunities: By providing detailed information about the electrical consumption of individual appliances, NILM can help to identify opportunities for reducing energy usage. For example, if a particular appliance is using more energy than expected, it may be possible to switch to a more energy-efficient model or to use the appliance less frequently.
Supporting demand-side management: NILM can be used to support demand-side management, which involves controlling the demand for electricity in order to improve the efficiency of the power grid. By providing real-time information about the electrical consumption of individual appliances, NILM can help utilities to better manage demand and avoid overloading the grid.
Enhancing energy audits: Energy audits are used to assess the energy efficiency of a building or other facility. By incorporating NILM data, energy audits can provide a more detailed and accurate analysis of energy usage, which can help to identify specific areas where improvements can be made.
Seq2Seq, one of the most popular sequence-modelling architectures, dominates the majority of tasks in this domain. In a Seq2Seq-style architecture for NILM, sliding windows of the input mains power sequence are mapped to the corresponding sliding windows of the appliance's output power. The problem here is threefold:
- Each output element is computed many times, once per sliding window that covers it, and the overlapping predictions must be averaged
- Only a few of the overlapping windows may actually yield a good estimate of a given element
- The redundant computation is expensive
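The first point can be made concrete with a toy calculation. The snippet below (an illustration, not from the paper) counts how many stride-1 windows cover each timestamp: every interior point is predicted W times under Seq2Seq, whereas Seq2Point predicts it exactly once.

```python
import numpy as np

# Toy illustration of the Seq2Seq overlap problem: with stride 1, every
# interior timestamp falls inside W different windows, so Seq2Seq must
# predict it W times (and average), while Seq2Point predicts it once.
T, W = 10, 3                       # toy sequence length and window length
coverage = np.zeros(T, dtype=int)
for start in range(T - W + 1):     # one window per start position
    coverage[start:start + W] += 1

print(coverage)  # interior timestamps are covered W times: [1 2 3 3 3 3 3 3 2 1]
```

Only the endpoints escape the redundancy; everything in between is recomputed W times for no extra information.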
Seq2Point set out to solve these bottlenecks by outputting a single value (the midpoint of the output window) for each input window. Let’s do a deep dive into how the Seq2Point architecture reduced the error rate by around 83% relative to the previous work.
Idea and Architecture
The idea behind Seq2Point is that, for an input mains window, the network outputs the midpoint element of the corresponding target-appliance window. There is an important underlying assumption: the midpoint element can be represented as a non-linear regression of the mains input window. In other words, the state of the appliance at the midpoint should be inferable from the surrounding information, i.e. the mains activity both before and after it.
For training, the mains sequence is padded with zeros at both ends so that windows can be formed for the endpoints of the sequence. The key differentiator of Seq2Point is that it emits a single output value x(t) per window, rather than a smoothed (averaged) prediction assembled from many overlapping windows.
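As a concrete picture of the network, here is a minimal TensorFlow/Keras sketch of a Seq2Point-style model. The layer sizes follow the five-convolution-plus-dense design described in the paper, but details such as the `same` padding and the Adam/MSE training setup are assumptions here; consult the linked reference implementation for the exact configuration.

```python
import tensorflow as tf

WINDOW_LENGTH = 599  # input window length used in the paper


def build_seq2point(window_length: int = WINDOW_LENGTH) -> tf.keras.Model:
    """Sketch of the Seq2Point CNN: five 1-D convolutions over the mains
    window, then a dense layer regressing the single midpoint value x(t)."""
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(window_length, 1)),
        tf.keras.layers.Conv1D(30, 10, activation="relu", padding="same"),
        tf.keras.layers.Conv1D(30, 8, activation="relu", padding="same"),
        tf.keras.layers.Conv1D(40, 6, activation="relu", padding="same"),
        tf.keras.layers.Conv1D(50, 5, activation="relu", padding="same"),
        tf.keras.layers.Conv1D(50, 5, activation="relu", padding="same"),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(1024, activation="relu"),
        tf.keras.layers.Dense(1),  # single midpoint prediction, not a window
    ])
    model.compile(optimizer="adam", loss="mse")  # assumed training setup
    return model
```

The final `Dense(1)` is the whole point: the network collapses a 599-sample window into one scalar, so no averaging over overlapping output windows is needed.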
Results and Benchmarks
UK-DALE and REDD were the datasets chosen for training, and Seq2Point was compared with Seq2Seq for benchmarking and analysis.
Data has to be prepared carefully for NILM-based neural networks. Input mains windows of a fixed sequence length (599 timestamps) were paired with the midpoint element of the target appliance sequence, and the values within the sequence windows were normalized.
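The windowing and pairing described above can be sketched as follows. This is an illustrative data-preparation routine, not the paper's code: the per-series standardisation here stands in for the fixed per-appliance normalisation constants a real pipeline would use, and `make_windows` is a hypothetical helper name.

```python
import numpy as np

SEQ_LEN = 599        # input window length from the paper
HALF = SEQ_LEN // 2  # offset of the midpoint element


def make_windows(mains, appliance, seq_len=SEQ_LEN):
    """Pad the mains series, slide a window of `seq_len` over it, and pair
    each window with the appliance reading at the window's midpoint."""
    mains = (mains - mains.mean()) / (mains.std() + 1e-8)  # illustrative normalisation
    padded = np.pad(mains, (HALF, HALF))                   # zeros at both ends
    X = np.stack([padded[i:i + seq_len] for i in range(len(mains))])
    y = np.asarray(appliance, dtype=float)
    return X[..., None], y                                 # add channel axis

mains = np.random.rand(1000)
appliance = np.random.rand(1000)
X, y = make_windows(mains, appliance)
print(X.shape, y.shape)  # (1000, 599, 1) (1000,)
```

Note that the zero padding yields exactly one window per timestamp, so even the first and last readings get a midpoint-centred window.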
Coming to the evaluation metrics, MAE (Mean Absolute Error) was used for calculating the error at every timestamp t, and normalised SAE (Signal Aggregate Error) for the total error in energy over a longer period.
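Both metrics are short enough to write out. The sketch below gives minimal implementations: MAE averages the per-timestamp error, while SAE compares total predicted energy against total ground-truth energy over the whole period.

```python
import numpy as np


def mae(pred, true):
    """Mean Absolute Error: average per-timestamp prediction error."""
    return np.mean(np.abs(pred - true))


def sae(pred, true):
    """Normalised Signal Aggregate Error: relative error of the total
    energy over the evaluation period, |r_hat - r| / r."""
    return abs(pred.sum() - true.sum()) / true.sum()

true = np.array([0.0, 100.0, 100.0, 0.0])
pred = np.array([10.0, 90.0, 110.0, 0.0])
print(mae(pred, true))  # 7.5
print(sae(pred, true))  # 0.05
```

The two metrics are complementary: a model can have a poor MAE (errors at individual timestamps) while the over- and under-predictions cancel out to a small SAE, so both are reported.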
As the benchmark tables in the paper show, Seq2Point outperforms Seq2Seq and AFHMM by a considerable margin on the majority of the selected appliances.
When I went through the feature maps and the ablation study that examine how the network learns its features, I was taken aback by how well the features were learned. Convolutional neural networks have once again shown that they are among the best feature extractors/pattern finders out there. Some sets of filters picked up amplitude changes, while others focused on state changes. Surprisingly, the network was even able to learn the duration of state changes for each appliance!
Head over to the Visualization of latent features section of the Seq2Point paper.
Real-world examples of NILM
The Open Energy Monitor is an open-source NILM system that is designed for use in both residential and commercial settings. It uses sensors to monitor the electrical consumption of individual appliances, and provides users with detailed information about their energy usage. This allows users to identify ways to reduce their energy consumption, and to monitor their progress over time.
The Energy Detective (TED) is a NILM system that is designed for residential use. It uses sensors to monitor the electrical consumption of individual appliances, and provides users with real-time information about their energy usage. This allows users to identify which appliances are using the most energy, and to take steps to reduce their energy consumption.
Seq2Point is a great addition to the NILM research space by virtue of a simplified architecture that delivers big on results. The datasets used were real-world, hinting at possible deployment in real-world scenarios and tangible business value. It outperforms Seq2Seq (the de facto standard in sequence modelling), and the paper also explains the learned features and the behavior of the network.
Here is an implementation of Seq2Point in TensorFlow: https://github.com/MingjunZhong/seq2point-nilm
Seq2Seq: https://arxiv.org/abs/1409.3215
Seq2Point: https://arxiv.org/abs/1612.09106
UK-DALE: https://jack-kelly.com/data/