Title: | Detection of Prosodic Boundaries in Speech Using Wav2Vec 2.0 |
Authors: | Kunešová, Marie; Řezáčková, Markéta |
Citation: | KUNEŠOVÁ, M.; ŘEZÁČKOVÁ, M. Detection of Prosodic Boundaries in Speech Using Wav2Vec 2.0. In: Text, Speech, and Dialogue. 25th International Conference, TSD 2022, Brno, Czech Republic, September 6–9, 2022, Proceedings. Cham: Springer International Publishing, 2022, pp. 377-388. ISBN: 978-3-031-16269-5, ISSN: 0302-9743 |
Issue Date: | 2022 |
Publisher: | Springer International Publishing |
Document type: | Conference paper (ConferenceObject) |
URI: | http://hdl.handle.net/11025/50925 (Scopus ID: 2-s2.0-85139029982) |
ISBN: | 978-3-031-16269-5 |
ISSN: | 0302-9743 |
Keywords in different language: | Phrasing;Prosodic boundaries;Phrase boundaries;Phrase boundary detection;wav2vec |
Abstract in different language: | Prosodic boundaries in speech are of great relevance to both speech synthesis and audio annotation. In this paper, we apply the wav2vec 2.0 framework to the task of detecting these boundaries in the speech signal, using only acoustic information. We test the approach on a set of recordings of Czech broadcast news, labeled by phonetic experts, and compare it to an existing text-based predictor, which uses the transcripts of the same data. Despite using a relatively small amount of labeled data, the wav2vec2 model achieves an accuracy of 94% and an F1 measure of 83% on within-sentence prosodic boundaries (or 95% and 89% on all prosodic boundaries), outperforming the text-based approach. However, by combining the outputs of the two different models, we can improve the results even further. |
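The abstract mentions that combining the outputs of the acoustic (wav2vec 2.0) model and the text-based predictor improves results further. The paper's actual combination method is not specified in this record, so the following is only a minimal illustrative sketch, assuming a simple late-fusion rule (averaging per-position boundary probabilities from the two models and thresholding). The function name and the fusion rule are assumptions for illustration, not the authors' method.

```python
# Hypothetical late fusion of prosodic-boundary predictions from two models:
# an acoustic model (e.g. fine-tuned wav2vec 2.0) and a text-based predictor.
# Probability averaging is an illustrative assumption, not the paper's method.

def fuse_boundary_predictions(acoustic_probs, text_probs, threshold=0.5):
    """Average aligned per-position boundary probabilities from two models
    and apply a decision threshold; returns a list of 0/1 boundary labels."""
    if len(acoustic_probs) != len(text_probs):
        raise ValueError("prediction sequences must be aligned")
    fused = [(a + t) / 2 for a, t in zip(acoustic_probs, text_probs)]
    return [1 if p >= threshold else 0 for p in fused]

# Example: per-word boundary probabilities for a short utterance.
acoustic = [0.1, 0.8, 0.2, 0.6]
text = [0.2, 0.9, 0.1, 0.3]
print(fuse_boundary_predictions(acoustic, text))  # [0, 1, 0, 0]
```

In practice the two models would first need their outputs aligned to the same word (or frame) positions; the weighting between the models and the threshold could then be tuned on a development set.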
Rights: | The full text is accessible within the university to logged-in users. © Springer Nature Switzerland AG |
Appears in Collections: | Konferenční příspěvky / Conference papers (NTIS) Konferenční příspěvky / Conference Papers (KKY) OBD |
Files in This Item:
File | Size | Format | |
---|---|---|---|
Kunesova_Rezackova-Detection_of_Prosodic_Boundaries_TSD_2022.pdf | 281.21 kB | Adobe PDF | View/Open |