Time-Contrastive Learning Based DNN Bottleneck Features for Text-Dependent Speaker Verification

Achintya Kumar Sarkar, Zheng-Hua Tan

Publication: Contribution to book/anthology/report/conference proceeding › Conference article in proceeding › Research › peer-reviewed

Abstract

In this paper, we present a time-contrastive learning (TCL) based bottleneck (BN) feature extraction method for speech signals, with an application to text-dependent (TD) speaker verification (SV). It is well known that speech signals exhibit quasi-stationary behavior only over short intervals, and the TCL method aims to exploit this temporal structure. More specifically, it trains deep neural networks (DNNs) to discriminate temporal events obtained by uniformly segmenting speech signals, in contrast to existing DNN based BN feature extraction methods that train DNNs on labeled data to discriminate speakers, pass-phrases, phones, or a combination of them. In the context of speaker verification, speech data of fixed pass-phrases are used for TCL-BN training, while those pass-phrases are excluded from SV evaluation, so that the learned features can be considered generic. The method is evaluated on the RedDots Challenge 2016 database. Experimental results show that TCL-BN is superior to the existing speaker and pass-phrase discriminant BN features and to the Mel-frequency cepstral coefficient (MFCC) feature for text-dependent speaker verification.
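The core idea of TCL label generation described above can be sketched as follows: frames of an utterance are grouped into uniform segments, and each segment is assigned a class index that the DNN then learns to discriminate. This is an illustrative sketch only; the function name, segment length, and number of classes are assumptions for the example, not the settings used in the paper's experiments.

```python
import numpy as np

def tcl_labels(num_frames, segment_len=10, num_classes=6):
    """Assign each frame a time-contrastive class label.

    Consecutive frames are grouped into uniform segments of
    `segment_len` frames; segment i is given class i mod num_classes.
    A DNN with a bottleneck layer would then be trained to classify
    frames into these segment-derived classes, requiring no manual
    speaker, pass-phrase, or phone labels.
    (segment_len and num_classes are illustrative values.)
    """
    segment_ids = np.arange(num_frames) // segment_len
    return segment_ids % num_classes

# Example: a 35-frame utterance with 10-frame segments
labels = tcl_labels(35)
# frames 0-9 get class 0, 10-19 class 1, 20-29 class 2, 30-34 class 3
```

In this sketch the supervision comes entirely from the temporal position of each segment, which is what distinguishes TCL from speaker- or phone-discriminant BN training.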
Original language: English
Title: NIPS Time Series Workshop 2017, Long Beach, CA, USA
Number of pages: 6
Publication date: Dec 2017
Status: Published - Dec 2017
Event: NIPS Time Series Workshop 2017 - Long Beach, USA
Duration: 4 Dec 2017 - 9 Dec 2017

Conference

Conference: NIPS Time Series Workshop 2017
Country/Territory: USA
City: Long Beach
Period: 04/12/2017 - 09/12/2017
Series: Proceedings of Machine Learning Research (PMLR)

