
RAT-CC: A Recurrent Autoencoder for Time-Series Compression and Classification

Chiarot, Giacomo; Vascon, Sebastiano; Silvestri, Claudio
2025-01-01

Abstract

The growth of interconnected devices has led to an enormous volume of temporal data that requires specialized compression models for efficient storage. Moreover, most applications need to classify these data efficiently, and reconstructing the original data from the compressed representation before classifying them is suboptimal. For this reason, we propose a Recurrent Autoencoder for Time-series Compression and Classification, termed RAT-CC, which makes it possible to perform any classification task directly on the compressed representation, without reconstructing the original time-series data. RAT-CC leverages a Long Short-Term Memory (LSTM) recurrent autoencoder with a dual-loss function: the standard reconstruction loss, which minimizes reconstruction error, and an embedding loss, which preserves relative distances in the compressed embedding space. This combined loss ensures that the learned embeddings remain meaningful for classification tasks while preserving the information necessary for reconstruction. We assess the compression and classification performance of RAT-CC on four datasets from different domains. RAT-CC is implemented in Keras and is freely available at https://github.com/ChJ4m3s/RAT-CC.
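The dual-loss idea described above can be sketched in a few lines. The function names, the weighting factor `alpha`, and the exact distance-preservation formulation below are illustrative assumptions, not the paper's definitions; the sketch only shows how a reconstruction MSE can be combined with a term that penalises distortion of pairwise distances between inputs and their embeddings:

```python
import numpy as np

def pairwise_dists(X):
    """Euclidean distance matrix between the rows of X."""
    diff = X[:, None, :] - X[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

def dual_loss(x, x_hat, z, alpha=0.5):
    """Illustrative combined loss: reconstruction MSE plus an embedding
    term that penalises distortion of relative pairwise distances.
    `alpha` is a hypothetical weighting factor, not taken from the paper."""
    rec = np.mean((x - x_hat) ** 2)
    d_in = pairwise_dists(x.reshape(len(x), -1))
    d_emb = pairwise_dists(z)
    # Normalise both distance matrices so only *relative* distances matter.
    d_in = d_in / (d_in.max() + 1e-12)
    d_emb = d_emb / (d_emb.max() + 1e-12)
    emb = np.mean((d_in - d_emb) ** 2)
    return rec + alpha * emb
```

With perfect reconstruction and an embedding that is a uniform rescaling of the input, both terms vanish; any distortion of reconstructions or of relative distances increases the loss.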
Files in this item:
File: RAT-CC_A_Recurrent_Autoencoder_for_Time-Series_Compression_and_Classification.pdf
Access: open access
Type: Publisher's version
License: Free access (view only)
Size: 3.57 MB
Format: Adobe PDF

Documents in ARCA are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10278/5097768
Citations
  • PMC: not available
  • Scopus: 0
  • Web of Science: 0