Computer Science
Ensemble Learning (80%)
Extended Version (80%)
Training Data (80%)
Time Series Data (46%)
Digitalization (46%)
Model Compression (40%)
Voice Activity Detection (40%)
Open Source (40%)
Outlier Detection (40%)
Data Streaming (36%)
Model Accuracy (26%)
Computing Resource (26%)
Multiple Domain (26%)
Decision-Making (26%)
Optimal Setting (26%)
Classification Accuracy (26%)
Knowledge Distillation (20%)
Deep Neural Network (20%)
Memory Consumption (20%)
Multivariate Time Series (20%)
Medical Process (20%)
Parallelism (20%)
Autoencoder (20%)
Scientific Process (20%)
Relative Importance (20%)
Memory Requirement (20%)
Machine Learning (18%)
Baseline Method (18%)
Learning System (18%)
Static Environment (18%)
Deployment Model (18%)
Convolutional Neural Network (13%)
Neural Network Model (13%)
General-Purpose Computer (6%)
Preprocessing Step (6%)
Computation Load (6%)
Evaluation Metric (6%)
Keyphrases
Time Series Classification (40%)
Ensemble Distillation (40%)
Adaptive Ensemble (40%)
Quantized Model (40%)
Lightweight Model (40%)
Training Data (40%)
On-device (40%)
Edge Devices (23%)
Backpropagation (23%)
Full Training (23%)
Streaming Data (16%)
Pareto Optimal (13%)
Large Ensemble (13%)
Resource-limited Environment (13%)
Space-time Budget (13%)
Process Sensing (13%)
Medical Processes (13%)
Ensemble Accuracy (13%)
Peer Models (8%)
Distillation Method (8%)
Channel Pruning (8%)
Bit-flipping (8%)
Early Training (8%)
Model Deployment (8%)
Bit-width (8%)
Dynamic Edge (8%)
Static Environment (8%)
Learning Settings (8%)
Precision Parameters (8%)
Machine Learning Models (8%)
Storage Capability (8%)
Edge Habitat (8%)
Limited Storage Capacity (8%)
Adaptive Adjustment (8%)
Continual Learning (8%)
Compress (8%)
Computational Capability (8%)
Time Calibration (8%)
Backpropagation Training (8%)
General-purpose Computer (6%)
Model Improving (6%)
Load Computation (6%)
Engineering
Model Parameter (100%)
Learning System (80%)
Smaller Subset (80%)
Distillation (40%)
Simplifies (40%)
Deep Neural Network (40%)
Memory Requirement (40%)
Relative Importance (40%)
Outlier Detection (40%)
Improve Efficiency (20%)
Design Choice (20%)
Data Series (20%)
Autoencoder (20%)
Parallelism (20%)