Article Details

International Journal of Computational Linguistics And Chinese Language Processing THCI

Title: 使用長短期記憶類神經網路建構中文語音辨識器之研究
Volume/Issue: 23:2
English Title: A Study on Mandarin Speech Recognition Using Long Short-Term Memory Neural Network
Authors: 賴建宏, 王逸如
Pages: 001-018
Keywords: Recurrent Neural Networks (RNNs); Long Short-Term Memory (LSTM); Gradient Vanishing; Acoustic Model; Mandarin LVCSR; Convolutional Neural Networks (CNNs); Deep Neural Networks (DNNs)
Publication Date: 2018-12

Chinese Abstract

In recent years, neural networks have been widely applied to speech recognition. This paper uses recurrent neural networks (RNNs) to train acoustic models and builds a Mandarin large-vocabulary continuous speech recognition (LVCSR) system. Because RNNs contain cyclic connections, they are better suited to modeling time-series signals than conventional fully connected deep neural networks. However, when a plain RNN is trained and its weights are updated by backpropagation through time, it suffers from gradient vanishing and gradient exploding, which can force training to halt and prevent the network from effectively capturing long-term dependencies. Long Short-Term Memory (LSTM) is a model proposed to solve this problem. Based on the LSTM architecture, this study combines convolutional neural networks (CNNs) and deep neural networks (DNNs) to construct a CLDNN model. For training data, this study uses TCC300 (24 hours), AIShell (162 hours), and NER (111 hours), and incorporates a language model to build the LVCSR system. To evaluate the robustness of the system, three test sets recorded under different conditions are used: TCC300 (2.4 hours, read speech), NER-clean (1.9 hours, fast speech, no noise), and NER-other (9 hours, fast speech, with noise).
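The gradient-vanishing and gradient-exploding behavior described above can be illustrated numerically: in backpropagation through time, the gradient is multiplied by a per-timestep Jacobian factor at every step, so factors with magnitude below one shrink it exponentially while factors above one blow it up. A minimal scalar sketch (the 0.9 and 1.1 factors are illustrative, not values from the paper):

```python
def bptt_gradient_magnitude(factor, steps):
    """Magnitude of a gradient after being multiplied by the same
    per-timestep Jacobian factor `steps` times -- a scalar caricature
    of backpropagation through time in a plain RNN."""
    grad = 1.0
    for _ in range(steps):
        grad *= factor
    return grad

vanishing = bptt_gradient_magnitude(0.9, 100)  # |factor| < 1: shrinks exponentially
exploding = bptt_gradient_magnitude(1.1, 100)  # |factor| > 1: grows exponentially
print(f"{vanishing:.2e}")  # ~2.66e-05
print(f"{exploding:.2e}")  # ~1.38e+04
```

After only 100 timesteps the gradient is either five orders of magnitude too small to drive learning or large enough to destabilize the weight update, which is why plain RNNs fail to learn long-term dependencies.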

English Abstract

In recent years, neural networks have been widely used in the field of speech recognition. This paper uses recurrent neural networks (RNNs) to train acoustic models and establishes a Mandarin speech recognition system. Since RNNs have cyclic connections, they are better suited to modeling temporal signals than fully connected deep neural networks. However, plain RNNs suffer from gradient vanishing and gradient exploding during backpropagation, which can halt training and prevents the network from effectively capturing long-term dependencies. Long Short-Term Memory (LSTM) is a model proposed to solve this problem. Based on this architecture, this study combines convolutional neural networks and deep neural networks to construct CLDNN models.
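The LSTM's resistance to gradient vanishing comes from its gated, additive cell-state update: when the forget gate stays near one, the cell state (and its gradient) passes from one timestep to the next almost unchanged instead of being squashed by a nonlinearity at every step. A minimal sketch of a single scalar LSTM step, with illustrative weights that are assumptions rather than values from the paper:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One scalar LSTM step. `w` maps each gate name to (w_x, w_h, b)."""
    def gate(name, act):
        w_x, w_h, b = w[name]
        return act(w_x * x + w_h * h_prev + b)
    i = gate("input", sigmoid)    # how much new content to write
    f = gate("forget", sigmoid)   # how much old cell state to keep
    o = gate("output", sigmoid)   # how much of the cell to expose
    g = gate("cell", math.tanh)   # candidate content
    c = f * c_prev + i * g        # additive update: key to gradient flow
    h = o * math.tanh(c)          # hidden state passed onward
    return h, c

# Illustrative weights (not from the paper): a positive forget-gate bias
# keeps the forget gate near 1, so the cell state decays only slowly.
w = {k: (0.5, 0.5, 2.0 if k == "forget" else 0.0)
     for k in ("input", "forget", "cell", "output")}
h, c = 0.0, 1.0
for t in range(50):               # feed zero inputs for 50 steps
    h, c = lstm_step(0.0, h, c, w)
print(round(c, 3))                # cell state remains large, not ~0
```

A plain RNN's hidden state after 50 such steps would be repeatedly pushed through tanh and shrink toward a fixed point determined only by the recent input; here the cell state retains information from the initial timestep, which is exactly the long-term memory behavior the abstract describes.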

Related Literature