Masato TAJIMA Keiji TAKIDA Zenshiro KAWASAKI
Both Viterbi decoding with labels (i.e., the Yamamoto-Itoh scheme) and the soft-output Viterbi algorithm (SOVA) evaluate the metric difference between the maximum-likelihood (ML) path and the discarded path at each level in the trellis. Noting this fact, we show that the former scheme also provides reliability information for the decoded information bits.
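As a rough illustration (not taken from the paper), the following Python sketch runs a hard-decision Viterbi decoder for the standard (7,5)_8 rate-1/2 code and records, at each trellis level, the smallest metric difference between a surviving path and the path discarded at a state merge; the Yamamoto-Itoh scheme thresholds such differences into reliable/unreliable labels, while SOVA propagates them as per-bit soft outputs. All names and parameters are illustrative.

```python
# Minimal sketch, not from the paper: hard-decision Viterbi decoding for the
# standard (7,5)_8 rate-1/2 convolutional code, recording at each level the
# smallest metric difference between a surviving path and a discarded path.

def encode(bits, taps=(0b111, 0b101)):
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state                    # current input bit plus 2-bit state
        out += [bin(reg & t).count("1") & 1 for t in taps]
        state = reg >> 1
    return out

def viterbi_with_deltas(received, taps=(0b111, 0b101)):
    n_states, INF = 4, 10**9
    metric = [0] + [INF] * (n_states - 1)         # decoding starts in the zero state
    survivors = [[] for _ in range(n_states)]
    deltas = []                                   # per-level minimum metric difference
    for k in range(0, len(received), 2):
        r = received[k:k + 2]
        new_metric = [INF] * n_states
        new_surv = [[] for _ in range(n_states)]
        level_delta = INF                         # INF = no genuine merge at this level
        for s in range(n_states):
            for b in (0, 1):
                reg = (b << 2) | s
                nxt = reg >> 1
                expected = [bin(reg & t).count("1") & 1 for t in taps]
                m = metric[s] + sum(x != y for x, y in zip(expected, r))
                if m < new_metric[nxt]:
                    if new_metric[nxt] < INF:     # a competitor path is discarded here
                        level_delta = min(level_delta, new_metric[nxt] - m)
                    new_metric[nxt], new_surv[nxt] = m, survivors[s] + [b]
                elif metric[s] < INF:             # this branch itself is discarded
                    level_delta = min(level_delta, m - new_metric[nxt])
        metric, survivors = new_metric, new_surv
        deltas.append(level_delta)
    best = min(range(n_states), key=lambda s: metric[s])
    return survivors[best], deltas                # deltas act as crude reliability values

info = [1, 0, 1, 1, 0, 0, 1]
rx = encode(info)
rx[3] ^= 1                                        # introduce one channel error
decoded, deltas = viterbi_with_deltas(rx)
print(decoded, deltas)
```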
Masato TAJIMA Keiji TAKIDA Zenshiro KAWASAKI
The structure of bidirectional syndrome decoding for binary rate (n-1)/n convolutional codes is investigated. It is shown that for backward decoding based on the trellis of a syndrome former H^T, the syndrome sequence must be generated in time-reversed order using an extra syndrome former H*^T, where H* is a generator matrix of the reciprocal dual code of the original code. It is also shown that if the syndrome bits are generated once and only once using H^T and H*^T, then the corresponding two error sequences overlap in νn error symbols, where ν is the memory length of H^T.
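A minimal numerical sketch of the time-reversal relation, assuming the simplest case n = 2 (a rate-1/2 code) with arbitrary illustrative parity-check taps; it is not the paper's construction, but it shows why the reciprocal syndrome former generates the same syndrome sequence in time-reversed order.

```python
# Minimal sketch under the assumption n = 2 (rate-1/2 code); the parity-check
# taps below are arbitrary illustrative values, not from the paper.  The
# syndrome former H^T computes s(D) = r1(D)h1(D) + r2(D)h2(D) over GF(2);
# reversing each tap polynomial gives the syndrome former H*^T of the
# reciprocal dual code.  Feeding the time-reversed received streams into
# H*^T reproduces the syndrome sequence in time-reversed order, which is
# what backward decoding on the syndrome trellis requires.

import numpy as np

def gf2_conv(x, h):
    """One tap line of a syndrome former: convolution over GF(2)."""
    return np.convolve(x, h) % 2

h1 = np.array([1, 1, 0, 1])              # h1(D) = 1 + D + D^3   (memory length 3)
h2 = np.array([1, 0, 1, 1])              # h2(D) = 1 + D^2 + D^3
h1_star, h2_star = h1[::-1], h2[::-1]    # reciprocal polynomials D^3 h(1/D)

rng = np.random.default_rng(0)
r1 = rng.integers(0, 2, 20)              # received stream 1 (code bits plus errors)
r2 = rng.integers(0, 2, 20)              # received stream 2

s_fwd = (gf2_conv(r1, h1) + gf2_conv(r2, h2)) % 2                        # via H^T
s_bwd = (gf2_conv(r1[::-1], h1_star) + gf2_conv(r2[::-1], h2_star)) % 2  # via H*^T

assert np.array_equal(s_bwd, s_fwd[::-1])   # same syndrome, generated in reverse
print(s_fwd)
```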
Masato TAJIMA Keiji TAKIDA Zenshiro KAWASAKI
In this paper, we present some noteworthy facts concerning simplification of the BCJR algorithm using the bidirectional Viterbi algorithm (BIVA). Specifically, we clarify the necessity of metric correction when the BIVA is applied to reliability estimation and the information symbols u_j obey non-uniform probability distributions.
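The following hedged sketch, which is not the paper's formulation, illustrates the kind of correction at issue: with non-uniform priors, a MAP-oriented branch metric acquires an additive log-prior term for u_j, and omitting it is harmless only when the inputs are equiprobable. The channel model (BPSK over AWGN) and all names are assumptions for illustration.

```python
# Hedged sketch, not the paper's formulation: with BPSK over AWGN (noise
# variance sigma2) and a non-uniform prior P(u_j = 1) = p_one, a MAP-oriented
# branch metric is the Gaussian log-likelihood of the received symbols plus a
# log-prior term for the information bit on that branch.  Dropping the prior
# term amounts to assuming p_one = 0.5, which fails for non-uniform sources.

import math

def branch_metric(received, branch_symbols, u_bit, p_one, sigma2):
    """Log-domain branch metric: channel term plus prior term for u_bit."""
    channel = -sum((r - c) ** 2 for r, c in zip(received, branch_symbols)) / (2 * sigma2)
    prior = math.log(p_one if u_bit == 1 else 1.0 - p_one)
    return channel + prior

r = [0.8, -1.1]          # noisy BPSK observations on one branch
c = [+1.0, -1.0]         # code symbols labelling that branch (u_bit = 1)
print(branch_metric(r, c, u_bit=1, p_one=0.5, sigma2=0.5))   # uniform prior
print(branch_metric(r, c, u_bit=1, p_one=0.9, sigma2=0.5))   # biased prior
```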
Tiansheng XU Zenshiro KAWASAKI Keiji TAKIDA Zheng TANG
This paper presents a child verb-learning model based mainly on syntactic bootstrapping. The model automatically learns 4-5-year-old children's linguistic knowledge of verbs, including subcategorization frames and thematic roles, from text in dialogue format. Subcategorization frame acquisition is guided by the assumption that nine verb prototypes exist; these prototypes are extracted on the basis of syntactic bootstrapping and psycholinguistic studies. Thematic roles are assigned using syntactic bootstrapping and other psycholinguistic hypotheses. The experiments are performed on data from the CHILDES database. The results show that the learning model successfully acquires linguistic knowledge of verbs and also suggest that psycholinguistic studies of child verb learning may provide useful hints for linguistic knowledge acquisition in natural language processing (NLP).
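As a toy illustration of syntactic bootstrapping only, and not the model described in the paper, the sketch below tallies the frames each verb appears in and maps it to whichever illustrative prototype frame dominates; the frame inventory and prototype labels are invented here and do not reproduce the paper's nine verb prototypes or its CHILDES preprocessing.

```python
# Toy sketch only, not the paper's model: syntactic bootstrapping in its
# simplest form infers something about a verb from the frames it occurs in.
# Utterances already tagged with a crude frame label are tallied per verb,
# and each verb is matched to an illustrative prototype.  The frame inventory
# and prototype names below are made up for illustration.

from collections import Counter, defaultdict

PROTOTYPES = {                      # illustrative only
    "NP V": "intransitive-like",
    "NP V NP": "transitive-like",
    "NP V NP NP": "ditransitive-like",
}

def learn_frames(tagged_utterances):
    """tagged_utterances: iterable of (verb, frame_label) pairs."""
    frames = defaultdict(Counter)
    for verb, frame in tagged_utterances:
        frames[verb][frame] += 1
    return {v: PROTOTYPES.get(c.most_common(1)[0][0], "unknown")
            for v, c in frames.items()}

data = [("sleep", "NP V"), ("sleep", "NP V"),
        ("push", "NP V NP"), ("push", "NP V NP"), ("push", "NP V"),
        ("give", "NP V NP NP")]
print(learn_frames(data))
# {'sleep': 'intransitive-like', 'push': 'transitive-like', 'give': 'ditransitive-like'}
```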