Transformer Meets Gated Residual Networks to Enhance PICU’s PPG Artifact Detection Informed by Mutual Information Neural Estimation

  • Thanh Dung Le
  • Clara Macabiau
  • Kevin Albert
  • Symeon Chatzinotas
  • Philippe Jouvet
  • Rita Noumeir
Research output: Contribution to journal › Journal Article › peer-review

Abstract

This study delves into the effectiveness of various learning methods in improving Transformer models, focusing mainly on the Gated Residual Network (GRN) Transformer in the context of pediatric intensive care units (PICUs) with limited data availability. Our findings indicate that Transformers trained via supervised learning are less effective than MLP, CNN, and LSTM networks in such environments. Yet, leveraging unsupervised and self-supervised learning (SSL) on unannotated data, with subsequent fine-tuning on annotated data, notably enhances Transformer performance, although not to the level of the GRN–Transformer. Central to our research is analyzing different activation functions for the gated linear unit (GLU), a crucial element of the GRN structure. We also employ Mutual Information Neural Estimation (MINE) to evaluate the GRN's contribution. Additionally, the study examines the effects of integrating GRN within the Transformer's attention mechanism versus using it as a separate intermediary layer. Our results highlight that GLU with sigmoid activation stands out, achieving 0.98 accuracy, 0.91 precision, 0.96 recall, and 0.94 F1-score. The MINE analysis supports the hypothesis that GRN enhances the mutual information (MI) between the hidden representations and the output. Moreover, using GRN as an intermediate filter layer proves more beneficial than incorporating it within the attention mechanism. This study clarifies how GRN boosts the GRN–Transformer's performance beyond other techniques. These findings offer a promising avenue for adopting sophisticated models like Transformers in data-constrained environments, such as PPG artifact detection in PICU settings.
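The GRN block the abstract describes can be illustrated in outline. The following is a minimal NumPy sketch, not the authors' implementation: layer sizes, weight names, and the dense → ELU → dense → GLU ordering follow the common GRN formulation, with the sigmoid-gated GLU that the abstract reports performing best, a residual connection, and layer normalization; dropout and context inputs are omitted.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def glu(x, W_a, b_a, W_g, b_g):
    # Gated linear unit: a linear branch multiplied elementwise by a
    # sigmoid gate (the activation variant the abstract finds best).
    return (x @ W_a + b_a) * sigmoid(x @ W_g + b_g)

def layer_norm(x, eps=1e-5):
    mu = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps)

def grn(x, p):
    # Gated Residual Network sketch: dense -> ELU -> dense -> GLU,
    # then a residual connection and layer normalization.
    h = x @ p["W1"] + p["b1"]
    h = np.where(h > 0, h, np.exp(np.minimum(h, 0)) - 1.0)  # ELU
    h = h @ p["W2"] + p["b2"]
    gated = glu(h, p["W_a"], p["b_a"], p["W_g"], p["b_g"])
    return layer_norm(x + gated)

rng = np.random.default_rng(0)
d = 8  # illustrative feature width
params = {k: rng.normal(scale=0.1, size=(d, d)) for k in ("W1", "W2", "W_a", "W_g")}
params.update({k: np.zeros(d) for k in ("b1", "b2", "b_a", "b_g")})

x = rng.normal(size=(4, d))  # batch of 4 hidden representations
y = grn(x, params)
print(y.shape)  # (4, 8)
```

Used as an intermediate filter layer, such a block would sit between the Transformer encoder and the classification head, gating which features of the hidden representation pass through; the paper's MINE analysis argues this gating raises the mutual information between those representations and the artifact label.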

Original language: English
Journal: IEEE Transactions on Neural Networks and Learning Systems
DOIs
Publication status: In press - 2026
Externally published: Yes

Keywords

  • Clinical PPG signals
  • Gated Residual Networks (GRNs)
  • Imbalanced classes
  • Mutual information (MI)
  • Transformers

