Intonation contour realisation for Standard Yorùbá text-to-speech synthesis: a fuzzy computational approach

Ọdẹ́túnjí A. Ọdẹ́jọbí, Anthony J. Beaumont, Shun Ha Sylvia Wong

Research output: Contribution to journal › Article

Abstract

This paper presents a novel intonation modelling approach and demonstrates its applicability using the Standard Yorùbá language. Our approach is motivated by the theory that abstract and realised forms of intonation and other dimensions of prosody should be modelled within a modular and unified framework. In our model, this framework is implemented using the Relational Tree (R-Tree) technique. The R-Tree is a sophisticated data structure for representing a multi-dimensional waveform in the form of a tree. Our R-Tree for an utterance is generated in two steps. First, the abstract structure of the waveform, called the Skeletal Tree (S-Tree), is generated using tone phonological rules for the target language. Second, the numerical values of the perceptually significant peaks and valleys on the S-Tree are computed using a fuzzy logic based model. The resulting points are then joined by applying interpolation techniques. The actual intonation contour is synthesised by the Pitch Synchronous Overlap and Add (PSOLA) technique using the Praat software. We performed both quantitative and qualitative evaluations of our model. The preliminary results suggest that, although the model does not predict the numerical speech data as accurately as contemporary data-driven approaches, it produces synthetic speech with comparable intelligibility and naturalness. Furthermore, our model is easy to implement, interpret and adapt to other tone languages.
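The two-step pipeline the abstract describes can be sketched in miniature: map a tone sequence to abstract peaks and valleys (the S-Tree step), assign numeric F0 targets to those nodes (where the paper uses a fuzzy logic model; crisp lookup values stand in for it here), and join the targets by interpolation. All function names, tone labels, and F0 values below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the contour-generation pipeline from the abstract.
# The real system derives the S-Tree from tone phonological rules and computes
# targets with fuzzy membership functions; this toy version uses fixed levels.

def skeletal_tree(tones):
    """Step 1: map Yorùbá tone marks (H/M/L) to abstract contour nodes."""
    shape = {"H": "peak", "M": "mid", "L": "valley"}
    return [shape[t] for t in tones]

def realise_f0(s_tree, base=120.0, step=20.0):
    """Step 2: assign numeric F0 targets (Hz) to each abstract node.
    A fuzzy logic model would replace this crisp lookup."""
    level = {"peak": base + step, "mid": base, "valley": base - step}
    return [level[node] for node in s_tree]

def interpolate(targets, points_per_segment=4):
    """Join the perceptually significant points by linear interpolation."""
    contour = []
    for a, b in zip(targets, targets[1:]):
        for i in range(points_per_segment):
            contour.append(a + (b - a) * i / points_per_segment)
    contour.append(targets[-1])
    return contour

tones = ["H", "L", "M", "H"]                      # toy utterance tone marks
targets = realise_f0(skeletal_tree(tones))        # [140.0, 100.0, 120.0, 140.0]
contour = interpolate(targets)                    # dense F0 contour
```

In the paper itself the resulting contour is handed to Praat, which resynthesises the utterance with PSOLA; this sketch stops at the contour.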
Original language: English
Pages (from-to): 563-588
Number of pages: 26
Journal: Computer Speech and Language
Volume: 20
Issue number: 4
DOIs
Publication status: Published - Oct 2006

