Abstract

Due to the sweeping digitalization of processes, increasingly vast amounts of time series data are being produced. Accurate classification of such time series facilitates decision making in multiple domains. State-of-the-art classification accuracy is often achieved by ensemble learning, where results are synthesized from multiple base models. This characteristic implies that ensemble learning requires substantial computing resources, preventing its use in resource-limited environments such as edge devices. To extend the applicability of ensemble learning, we propose the LightTS framework, which compresses large ensembles into lightweight models while ensuring competitive accuracy. First, we propose adaptive ensemble distillation, which assigns adaptive weights to different base models so that their varying classification capabilities contribute purposefully to the training of the lightweight model. Second, we propose means of identifying Pareto optimal settings w.r.t. model accuracy and model size, thus enabling users with a space budget to select the most accurate lightweight model. We report on experiments using 128 real-world time series datasets and different types of base models that justify key decisions in the design of LightTS and offer evidence that LightTS outperforms competitors.
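To make the two ideas in the abstract concrete, the following is a minimal illustrative sketch, not the paper's actual method: it assumes the per-model weights are already given (in LightTS they are learned adaptively), blends base-model predictions into a soft distillation target, and filters candidate configurations to the Pareto front over (model size, accuracy). All function names and inputs here are hypothetical.

```python
import numpy as np

def weighted_soft_target(base_probs, weights):
    """Blend class-probability outputs of several base models into one
    soft target for distilling a lightweight student.

    base_probs: (n_models, n_classes) softmax outputs for one time series.
    weights:    (n_models,) non-negative importance weights; assumed given
                here, whereas LightTS assigns them adaptively.
    """
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()  # normalize so the target is a valid distribution
    return w @ np.asarray(base_probs, dtype=float)

def pareto_optimal(settings):
    """Return the settings not dominated w.r.t. model size and accuracy.

    A setting dominates another if it is no larger and no less accurate,
    and strictly better on at least one of the two criteria.

    settings: list of (size, accuracy) tuples.
    """
    front = []
    for i, (s_i, a_i) in enumerate(settings):
        dominated = any(
            s_j <= s_i and a_j >= a_i and (s_j < s_i or a_j > a_i)
            for j, (s_j, a_j) in enumerate(settings) if j != i
        )
        if not dominated:
            front.append((s_i, a_i))
    return front
```

A user with a space budget would scan the returned front for the largest model that still fits, which by construction is also the most accurate admissible one.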
Original language: English
Publisher: arXiv
Number of pages: 15
Publication status: Published - 2023

Bibliographical note

An extended version of "LightTS: Lightweight Time Series Classification with Adaptive Ensemble Distillation" accepted at SIGMOD 2023
