Authors: Jonathan Shen, Ruoming Pang, Ron J. Weiss, Mike Schuster, Navdeep Jaitly, Zongheng Yang, Zhifeng Chen, Yu Zhang, Yuxuan Wang, RJ Skerry-Ryan, Rif A. Saurous, Yannis Agiomyrgiannakis, Yonghui Wu
Abstract: This paper describes Tacotron 2, a neural network architecture for
speech synthesis directly from text. The system is composed of
a recurrent sequence-to-sequence feature prediction network that
maps character embeddings to mel-scale spectrograms, followed by
a modified WaveNet model acting as a vocoder to synthesize time-domain
waveforms from those spectrograms. Our model achieves a
mean opinion score (MOS) of 4.53, comparable to a MOS of 4.58 for
professionally recorded speech. To validate our design choices, we
present ablation studies of key components of our system and evaluate
the impact of using mel spectrograms as the input to WaveNet instead
of linguistic, duration, and F0 features. We further demonstrate
that using a compact acoustic intermediate representation enables
significant simplification of the WaveNet architecture.
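The mel-scale spectrograms used as the intermediate representation compress the linear frequency axis to match human pitch perception. The page itself does not give the conversion formula, but a common definition (the HTK-style mel formula) can be sketched as:

```python
import math

def hz_to_mel(f_hz: float) -> float:
    """Convert a frequency in Hz to mels (HTK-style formula)."""
    return 2595.0 * math.log10(1.0 + f_hz / 700.0)

def mel_to_hz(m: float) -> float:
    """Inverse mapping: mels back to Hz."""
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)
```

Because the mapping is roughly logarithmic at high frequencies, equal steps in mel pack more resolution into the low-frequency range where speech carries most of its perceptually important detail; this is what makes the mel spectrogram a compact acoustic representation for the WaveNet vocoder.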
None of the phrases below were seen by Tacotron 2 during training.
Tacotron 2 works well on out-of-domain and complex words.
“Generative adversarial network or variational auto-encoder.”
“Basilar membrane and otolaryngology are not auto-correlations.”
Tacotron 2 learns pronunciations based on phrase semantics.
(Note how Tacotron 2 pronounces "read" in the first two phrases.)
“He has read the whole thing.”
“He reads books.”
“Don't desert me here in the desert!”
“He thought it was time to present the present.”
Tacotron 2 is somewhat robust to spelling errors.
“Thisss isrealy awhsome.”
Tacotron 2 is sensitive to punctuation.
(Note how the comma in the first phrase changes prosody.)
“This is your personal assistant, Google Home.”
“This is your personal assistant Google Home.”
Tacotron 2 learns stress and intonation.
(In our training set, the speaker was instructed to stress capitalized
words, so simply capitalizing some words changes the overall prosody.)
“The buses aren't the problem, they actually provide a solution.”
“The buses aren't the PROBLEM, they actually provide a SOLUTION.”
Tacotron 2's prosody changes when turning a statement into a question.
“The quick brown fox jumps over the lazy dog.”
“Does the quick brown fox jump over the lazy dog?”
Tacotron 2 is good at tongue twisters.
“Peter Piper picked a peck of pickled peppers. How many pickled peppers did Peter Piper pick?”
“She sells sea-shells on the sea-shore. The shells she sells are sea-shells I'm sure.”
“Talib Kweli confirmed to AllHipHop that he will be releasing an album in the next year.”
“The blue lagoon is a nineteen eighty American romance adventure film.”
“Tajima Airport serves Toyooka.”
Tacotron 2 or Human?
In each of the following examples, one clip is generated by Tacotron 2 and the other is a recording of a human speaker; which is which?
“That girl did a video about Star Wars lipstick.”
“She earned a doctorate in sociology at Columbia University.”
“George Washington was the first President of the United States.”