BRACETS: Bimodal Repository of Auscultation Coupled with Electrical Impedance Thoracic Signals
Description
Background and Objective: Respiratory diseases are among the most significant causes of morbidity and mortality worldwide, placing substantial strain on society and health systems. Over the last few decades, there has been growing interest in the automatic analysis of respiratory sounds and electrical impedance tomography (EIT). Nevertheless, no publicly available database combines respiratory sound and EIT data.

Methods: In this work, we assembled the first open-access bimodal database focused on the differential diagnosis of respiratory diseases (BRACETS: Bimodal Repository of Auscultation Coupled with Electrical Impedance Thoracic Signals). It includes simultaneous recordings of single- and multi-channel respiratory sounds and EIT. Furthermore, we propose several machine-learning baseline systems for automatically classifying respiratory diseases in six distinct evaluation tasks using respiratory sound and EIT (A1, A2, A3, B1, B2, B3), covering classification at both sample and subject levels. The performance of the classification models was evaluated using a 5-fold cross-validation scheme with subject isolation between folds.

Results: The resulting database consists of 1097 respiratory sound recordings and 795 EIT recordings acquired from 78 adult subjects in two countries (Portugal and Greece). In the task of automatically classifying respiratory diseases, the baseline models achieved the following average balanced accuracy: Task A1 - 77.9±13.1%; Task A2 - 51.6±9.7%; Task A3 - 38.6±13.1%; Task B1 - 90.0±22.4%; Task B2 - 61.4±11.8%; Task B3 - 50.8±10.6%.

Conclusion: The creation and public release of this database will help the research community develop automated methodologies to assess and monitor respiratory function, and it may serve as a benchmark in the field of digital medicine for managing respiratory diseases. Moreover, it could pave the way for robust multi-modal approaches to the same purpose.
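As a minimal sketch of the evaluation protocol described above (5-fold cross-validation with subject isolation between folds, scored with balanced accuracy), the Python snippet below shows how such a scheme can be run with scikit-learn. The feature matrix, labels, and the random-forest classifier are placeholders assumed for illustration only; they are not the authors' exact baseline pipeline.

    # Sketch only: assumes features X, labels y, and per-sample subject IDs
    # have already been extracted from the BRACETS recordings.
    import numpy as np
    from sklearn.model_selection import GroupKFold
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import balanced_accuracy_score

    def evaluate(X, y, subject_ids, n_splits=5):
        """X: (n_samples, n_features) array; y: labels; subject_ids: one ID per sample."""
        scores = []
        cv = GroupKFold(n_splits=n_splits)  # keeps all samples of a subject in a single fold
        for train_idx, test_idx in cv.split(X, y, groups=subject_ids):
            clf = RandomForestClassifier(n_estimators=200, random_state=0)  # placeholder model
            clf.fit(X[train_idx], y[train_idx])
            y_pred = clf.predict(X[test_idx])
            scores.append(balanced_accuracy_score(y[test_idx], y_pred))
        return np.mean(scores), np.std(scores)

Grouping the folds by subject ID prevents recordings from the same person appearing in both training and test sets, which is what the paper refers to as subject isolation.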
Files
Steps to reproduce
If you use these data, please cite the following article:

Pessoa, D., Rocha, B. M., Strodthoff, C., Gomes, M., Rodrigues, G., Petmezas, G., Cheimariotis, G., Kilintzis, V., Kaimakamis, E., Maglaveras, N., Marques, A., Frerichs, I., Carvalho, P. D., & Paiva, R. P. (2023). BRACETS: Bimodal Repository of Auscultation Coupled with Electrical Impedance Thoracic Signals. Computer Methods and Programs in Biomedicine, 107720. https://doi.org/10.1016/j.cmpb.2023.107720

For more information about this database, please refer to https://doi.org/10.1016/j.cmpb.2023.107720 and https://github.com/DiogoMPessoa/BRACETS-Bimodal-Repository-of-Auscultation-Coupled-with-Electrical-Impedance-Thoracic-Signals
Institutions
Categories
Funding
Fundação para a Ciência e a Tecnologia
DFA/BD/4927/2020
Fundação para a Ciência e a Tecnologia
SFRH/BD/135686/2018
Fundação para a Ciência e a Tecnologia
UIDB/00326/2020
Horizon 2020 Framework Programme
825572