Predictions for sound events and soundscape impressions from environmental sound using deep neural networks

Type
Publication
Authors
Masanobu Abe
Sunao Hara
Publication Year
2023 
URL
[ private ] 
Abstract
In this study, we investigate methods for quantifying soundscape impressions, namely pleasantness and eventfulness, from environmental sounds. From the viewpoint of machine learning (ML) research, acoustic scene classification (ASC) and sound event classification (SEC) tasks have been intensively studied, and their results are helpful for considering soundscape impressions. However, while most ASC and SEC systems use only sound for classification, human beings perceive a soundscape impression not from sound alone but from sound together with the surrounding landscape. Therefore, automatic quantification of soundscape impressions should use additional information, such as the landscape, alongside sound. First, we tackle the prediction of the two soundscape impressions using sound data collected by a cloud-sensing method; for this purpose, we have proposed a prediction method that uses environmental sounds and aerial photographs. Second, we tackle environmental sound classification using a feature extractor trained with a Variational Autoencoder (VAE). The VAE feature extractor can be trained in an unsupervised manner, so it is a promising approach for a growing dataset such as the one produced by our cloud-sensing data collection scheme. Finally, we discuss the integration of these methods.
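The VAE-based feature extraction described in the abstract can be sketched roughly as follows. This is a minimal illustrative example: the input/latent dimensions, network shape, and (untrained, random) weights are assumptions for demonstration, not the authors' implementation, which would be trained on the cloud-sensed recordings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper):
# a flattened log-mel spectrogram frame of 128 values, a 16-dim latent code.
INPUT_DIM, HIDDEN_DIM, LATENT_DIM = 128, 64, 16

# Randomly initialized encoder weights stand in for trained VAE parameters.
W1 = rng.standard_normal((INPUT_DIM, HIDDEN_DIM)) * 0.05
W_mu = rng.standard_normal((HIDDEN_DIM, LATENT_DIM)) * 0.05
W_logvar = rng.standard_normal((HIDDEN_DIM, LATENT_DIM)) * 0.05

def encode(x):
    """Map an input frame to the latent mean and log-variance."""
    h = np.tanh(x @ W1)
    return h @ W_mu, h @ W_logvar

def reparameterize(mu, logvar):
    """Sample z = mu + sigma * eps (the VAE reparameterization trick,
    used during training so gradients flow through the sampling step)."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def extract_features(x):
    """After training, the latent mean serves as a deterministic feature
    vector for a downstream sound-event or impression classifier."""
    mu, _ = encode(x)
    return mu

frame = rng.standard_normal(INPUT_DIM)
features = extract_features(frame)
print(features.shape)  # (16,)
```

Because the encoder needs no labels, such a feature extractor can keep being refined as new unlabeled recordings arrive from cloud sensing, while only the small downstream classifier requires annotated data.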
Description
https://doi.org/10.3397/IN_2023_0739 