Title | A Fast Learning Multilayer Neural Model and Its Array Processor Implementation |
---|---|
Authors | 江政欽; 傅心家 |
Journal | Journal of Information Science and Engineering |
Publication date | 1992.06 |
Volume/Issue | 8:2 1992.06 [民81.06] |
Pages | pp. 283-304 |
Classification | 310.15 |
Language | eng |
Keywords | Neural network; Multilayer perceptron; Backpropagation learning algorithm; Ring array processor; Transputer |
Abstract | This paper advocates a novel method for achieving faster learning of Boolean functions in a multilayer neural network. The key new ingredient is the concept of floating positive/negative thresholds used to determine the states of the output neurons. In a traditional multilayer perceptron (MLP), an output state is determined to be 1 or 0 according to whether its activation value exceeds a fixed target threshold. In the proposed approach, the state of an output neuron is determined by comparing the differences between its output activation and its two floating positive/negative thresholds: when the output activation is closer to the positive threshold (or the negative threshold), the output state is interpreted as 1 (or 0). During the learning phase, the output activation and the two thresholds are adjusted as the weights are updated. Simulation results show that the number of learning iterations required for successful training by our model is significantly smaller than that required by a traditional multilayer perceptron on many different problems. In addition, we have mapped this multilayer neural model onto a ring systolic array, which maximizes the strength of VLSI in terms of intensive and pipelined computing while circumventing the limitation on communications. |
The abstract shown in this system is based primarily on the abstract published with the journal article.
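The decision rule described in the abstract, comparing an output activation against two floating thresholds rather than one fixed cutoff, can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function name, the tie-breaking toward 1, and the example threshold values are assumptions.

```python
def output_state(activation, pos_threshold, neg_threshold):
    """Interpret an output neuron's state via floating thresholds.

    The state is 1 when the activation lies closer to the positive
    threshold, and 0 when it lies closer to the negative threshold.
    (Ties are broken toward 1 here; the paper does not specify this.)
    """
    dist_pos = abs(activation - pos_threshold)
    dist_neg = abs(activation - neg_threshold)
    return 1 if dist_pos <= dist_neg else 0


# A fixed-threshold MLP would instead compare against one target value,
# e.g. state = 1 if activation > 0.5 else 0. With floating thresholds,
# both reference points move during learning along with the weights.
print(output_state(0.8, pos_threshold=1.0, neg_threshold=0.0))  # closer to 1.0
print(output_state(0.2, pos_threshold=1.0, neg_threshold=0.0))  # closer to 0.0
```

Because the thresholds are adjusted during training, the boundary between the two classes of activations shifts per neuron instead of staying at a fixed global value.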