Machine Hearing for Industrial Acoustic Monitoring using Cochleagram and Spiking Neural Network

Yu Zhang, Shirin Dora, Miguel Martinez-Garcia, Saugat Bhattacharyya

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Citations (Scopus)

Abstract

This paper presents a bio-inspired machine learning framework for industrial acoustic monitoring that aims to mimic human hearing. The framework first models the functionality of the cochlea, an essential part of the inner ear, by extracting salient time-frequency information from acoustic signals in the form of cochleagrams. To emulate more closely the neural activity of the brain when processing information, a biologically plausible Spiking Neural Network (SNN) is then applied for pattern recognition. Finally, the proposed method is verified on acoustic data collected from machine bearings in healthy and faulty conditions. This initial feasibility study demonstrates the viability and efficacy of the proposed "machine hearing" approach for industrial acoustic monitoring applications.
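The paper's implementation is not included on this page, so the following is only a minimal Python sketch of the kind of pipeline the abstract describes: an ERB-spaced band-pass filterbank as a crude stand-in for a gammatone cochleagram, Poisson rate coding of the cochleagram into spike trains, and an untrained leaky integrate-and-fire readout. All function names, parameter values, and the synthetic test signal are assumptions made for illustration; they are not taken from the paper.

```python
# Illustrative sketch (not the authors' code): crude cochleagram, Poisson rate
# coding, and an untrained leaky integrate-and-fire (LIF) readout layer.
import numpy as np
from scipy.signal import butter, lfilter


def cochleagram(signal, fs, n_bands=32, frame_len=0.02):
    """Band energies over time on an ERB-spaced frequency axis (assumed parameters)."""
    erb = lambda f: 21.4 * np.log10(4.37e-3 * f + 1.0)          # Glasberg & Moore ERB scale
    inv_erb = lambda e: (10.0 ** (e / 21.4) - 1.0) / 4.37e-3
    centres = inv_erb(np.linspace(erb(50.0), erb(0.9 * fs / 2), n_bands))
    hop = int(frame_len * fs)
    n_frames = len(signal) // hop
    coch = np.zeros((n_bands, n_frames))
    for i, fc in enumerate(centres):
        bw = 24.7 * (4.37e-3 * fc + 1.0)                        # ERB bandwidth at fc
        lo = max(fc - bw, 1.0) / (fs / 2)
        hi = min(fc + bw, fs / 2 - 1.0) / (fs / 2)
        b, a = butter(2, [lo, hi], btype="band")                # band-pass stand-in for a gammatone filter
        env = np.abs(lfilter(b, a, signal))                     # rectified band output
        coch[i] = env[: n_frames * hop].reshape(n_frames, hop).mean(axis=1)
    return coch


def rate_encode(coch, max_rate=100.0, duration=0.1, dt=1e-3, seed=0):
    """Poisson rate coding: brighter cochleagram cells emit more spikes."""
    rng = np.random.default_rng(seed)
    p = (coch / (coch.max() + 1e-12)).reshape(-1, 1) * max_rate * dt
    return rng.random((p.size, int(duration / dt))) < p         # boolean spike trains


def lif_readout(spikes, weights, tau=20e-3, dt=1e-3, v_th=1.0):
    """Leaky integrate-and-fire layer; returns output spike counts per neuron."""
    v = np.zeros(weights.shape[0])
    counts = np.zeros(weights.shape[0])
    decay = np.exp(-dt / tau)
    for t in range(spikes.shape[1]):
        v = v * decay + weights @ spikes[:, t]
        fired = v >= v_th
        counts += fired
        v[fired] = 0.0                                          # reset after a spike
    return counts


if __name__ == "__main__":
    # Synthetic stand-in for a bearing recording; real healthy/faulty data would replace this.
    fs = 16000
    t = np.arange(0, 1.0, 1.0 / fs)
    x = np.sin(2 * np.pi * 120 * t) + 0.05 * np.random.default_rng(1).standard_normal(len(t))
    spikes = rate_encode(cochleagram(x, fs))
    w = np.random.default_rng(2).normal(0.0, 0.05, size=(2, spikes.shape[0]))  # untrained 2-class readout
    print("output spike counts per class:", lif_readout(spikes, w))
```

In this sketch, classification would follow from comparing output spike counts (or training the readout weights), whereas the paper's actual cochleagram construction and SNN architecture may differ.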
Original language: English
Title of host publication: 2022 IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM)
Publisher: IEEE
Pages: 1047-1051
Number of pages: 5
ISBN (Print): 9781665413091
Publication status: Published (in print/issue) - 11 Jul 2022

Publication series

Name: IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM
Volume: 2022-July

Bibliographical note

Publisher Copyright:
© 2022 IEEE.

