Keyword Search Result

[Keyword] log-likelihood ratio (12 hits)

1-12 of 12 hits
  • Adaptive Resource Allocation Based on Factor Graphs in Non-Orthogonal Multiple Access Open Access

    Taichi YAMAGAMI  Satoshi DENNO  Yafei HOU  

     
    PAPER-Wireless Communication Technologies

    Publicized: 2022/04/15  Vol: E105-B No:10  Page(s): 1258-1267

    In this paper, we propose non-orthogonal multiple access with adaptive resource allocation. The proposed non-orthogonal multiple access assigns multiple frequency resources to each device for packet transmission. Even if the number of devices exceeds the number of available frequency resources, the proposed non-orthogonal access allows all the devices to transmit their packets simultaneously, enabling high-capacity massive machine-type communications (mMTC). Furthermore, this paper proposes two adaptive resource allocation algorithms based on factor graphs that allocate the frequency resources to the devices to improve transmission performance. This paper shows that the proposed non-orthogonal multiple access achieves superior transmission performance even without adaptive resource allocation when the number of devices is 50% greater than the number of available resources, i.e., at an overloading ratio of 1.5. The adaptive resource allocation enables the proposed non-orthogonal access to attain a gain of about 5 dB at a BER of 10⁻⁴.
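
    As a rough illustration of the overloaded allocation described above, the sketch below builds a sparse 0/1 allocation matrix in which 6 devices share 4 frequency resources (an overloading ratio of 1.5), with each device assigned 2 resources. The matrix plays the role of the factor graph's edge set; the particular pattern is an illustrative assumption, not the paper's optimized allocation.

    ```python
    # Minimal sketch (not the paper's algorithm): an overloaded, factor-graph-style
    # resource allocation in which 6 devices share 4 frequency resources
    # (overloading ratio 1.5) and each device is assigned 2 resources.
    import numpy as np

    num_resources, num_devices = 4, 6
    allocation = np.array([[1, 1, 0, 0, 1, 0],   # rows: frequency resources
                           [1, 0, 1, 0, 0, 1],   # columns: devices
                           [0, 1, 0, 1, 1, 0],
                           [0, 0, 1, 1, 0, 1]])  # illustrative sparse pattern

    print("overloading ratio:", num_devices / num_resources)   # 1.5
    print("devices per resource:", allocation.sum(axis=1))     # factor-node degrees
    print("resources per device:", allocation.sum(axis=0))     # variable-node degrees
    ```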

  • Non-Orthogonal Packet Access Based on Low Density Signature With Phase Only Adaptive Precoding

    Satoshi DENNO  Ryoko SASAKI  Yafei HOU  

     
    PAPER-Wireless Communication Technologies

    Publicized: 2020/09/15  Vol: E104-B No:3  Page(s): 328-337

    This paper proposes non-orthogonal packet access based on a low-density signature with phase-only adaptive precoding. The proposed access allows multiple user terminals to send their packets simultaneously to realize massive connectivity, even though only one antenna is placed on every terminal and on the access point. This paper proposes a criterion that defines the optimum rotation angles for the phase-only precoding, and an algorithm based on the steepest descent method to approach the optimum rotation angles. Moreover, this paper proposes two complexity-reduced algorithms that converge much faster than the original algorithm. When 6 packets are transmitted in 4 time slots, i.e., at an overloading ratio of 1.5, the proposed adaptive precoding attains a gain of about 4 dB at a BER of 10⁻⁴ in Rician fading channels with all of the proposed algorithms.

  • Low Complexity Log-Likelihood Ratio Calculation Scheme with Bit Shifts and Summations

    Takayoshi AOKI  Keita MATSUGI  Yukitoshi SANADA  

     
    PAPER-Transmission Systems and Transmission Equipment for Communications

    Publicized: 2017/09/19  Vol: E101-B No:3  Page(s): 731-739

    This paper presents an approximated log-likelihood ratio calculation scheme that uses bit shifts and summations. Our previous work yielded a metric calculation scheme that replaces multiplications with bit shifts and summations in the selection of candidate signal points for joint maximum likelihood detection (MLD). Log-likelihood ratio calculation for turbo decoding generally uses multiplications, and replacing them with bit shifts and summations makes it possible to reduce the number of logic operations under specific transmission parameters. In this paper, an approximated log-likelihood ratio calculation scheme that substitutes bit shifts and summations for multiplications is proposed. In the proposed scheme, additions are used only for higher-order bits. Numerical results obtained through computer simulation show that this scheme eliminates multiplications in turbo decoding at the cost of just 0.2 dB performance degradation at a BER of 10⁻⁴.
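
    The core idea of replacing multiplications with bit shifts and summations can be illustrated by approximating one operand with its few highest-order set bits, so that each product term becomes a shift. The sketch below is a minimal, generic example of this shift-and-add substitution; it is not the authors' exact LLR metric, and the parameter kept_bits is purely illustrative.

    ```python
    # Minimal sketch, not the authors' scheme: approximate a fixed-point product
    # with bit shifts and summations by keeping only the highest-order set bits
    # of one operand. `kept_bits` is a purely illustrative parameter.
    def shift_add_multiply(a: int, b: int, kept_bits: int = 2) -> int:
        """Approximate a * b using at most `kept_bits` shift-and-add terms."""
        sign = -1 if (a < 0) != (b < 0) else 1
        a, b = abs(a), abs(b)
        result = 0
        for _ in range(kept_bits):
            if b == 0:
                break
            msb = b.bit_length() - 1   # position of the highest-order set bit
            result += a << msb         # this product term becomes a single shift
            b -= 1 << msb              # drop that bit and continue with the rest
        return sign * result

    # Example: 13 * 27 = 351 exactly; the two-term approximation keeps 16 + 8 of 27.
    print(shift_add_multiply(13, 27))  # 13*16 + 13*8 = 312 (approximation)
    print(13 * 27)                     # 351 (exact)
    ```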

  • An Efficient Weighted Bit-Flipping Algorithm for Decoding LDPC Codes Based on Log-Likelihood Ratio of Bit Error Probability

    Tso-Cho CHEN  Erl-Huei LU  Chia-Jung LI  Kuo-Tsang HUANG  

     
    PAPER-Fundamental Theories for Communications

    Publicized: 2017/05/29  Vol: E100-B No:12  Page(s): 2095-2103

    In this paper, a weighted multiple bit-flipping (WMBF) algorithm for decoding low-density parity-check (LDPC) codes is proposed first. Then an improved WMBF algorithm, which we call the efficient weighted bit-flipping (EWBF) algorithm, is developed. The EWBF algorithm can dynamically choose either multiple bit-flipping or single bit-flipping in each iteration according to the log-likelihood ratio of the error probability of the received bits. Thus, it can efficiently increase the convergence speed of decoding and prevent the decoding process from falling into loop traps. Compared with the parallel weighted bit-flipping (PWBF) algorithm, the EWBF algorithm achieves significantly lower computational complexity without performance degradation when Euclidean geometry (EG)-LDPC codes are decoded. Furthermore, the flipping criterion does not require any parameter adjustment.
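
    A minimal sketch of a generic weighted bit-flipping decoder may help make the idea concrete: each parity check is weighted by the reliability (smallest |LLR|) of the bits it involves, and the bit accumulating the largest weighted vote from failed checks is flipped. This is only a single-bit-flipping illustration on a toy parity-check matrix; the EWBF switch between single and multiple flips is not reproduced, and the matrix and LLR values are assumptions.

    ```python
    # Minimal sketch of a generic weighted bit-flipping decoder on a toy
    # parity-check matrix (assumption); it flips one bit per iteration and does
    # not reproduce the EWBF single/multiple-flip switch.
    import numpy as np

    H = np.array([[1, 1, 0, 1, 0, 0],   # toy parity-check matrix
                  [0, 1, 1, 0, 1, 0],
                  [1, 0, 0, 0, 1, 1],
                  [0, 0, 1, 1, 0, 1]])

    def weighted_bit_flip(llr, H, max_iter=20):
        hard = (llr < 0).astype(int)             # hard decisions from channel LLRs
        for _ in range(max_iter):
            syndrome = (H @ hard) % 2
            if not syndrome.any():
                return hard                      # all parity checks satisfied
            # reliability of each check = smallest |LLR| among its bits
            check_rel = np.array([np.abs(llr[H[m] == 1]).min() for m in range(H.shape[0])])
            # failed checks vote for their bits, weighted by reliability
            metric = ((2 * syndrome - 1) * check_rel) @ H
            hard[np.argmax(metric)] ^= 1         # flip the bit with the largest metric
        return hard

    received_llr = np.array([2.1, -1.8, 0.3, 2.5, -2.2, 1.9])  # illustrative LLRs
    print(weighted_bit_flip(received_llr, H))
    ```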

  • Node Selection for Belief Propagation Based Channel Equalization

    Mitsuyoshi HAGIWARA  Toshihiko NISHIMURA  Takeo OHGANE  Yasutaka OGAWA  

     
    PAPER-Wireless Communication Technologies

    Publicized: 2017/02/08  Vol: E100-B No:8  Page(s): 1285-1292

    Recently, much progress has been made in the study of belief propagation (BP) based signal detection with large-scale factor graphs. When we apply the BP algorithm to equalization in a SISO multipath channel, the corresponding factor graph has many short loops and regular patterns in its edge connections and strengths. Thus, proper convergence may not be achieved. In general, the log-likelihood ratio (LLR) oscillates in ill-converged cases. Therefore, avoiding LLR oscillation is important for BP-based equalization. In this paper, we propose applying node selection (NS) to prevent the LLR from oscillating. NS virtually extends the loop length through serial LLR updates. Thus, some performance improvement is expected. Simulation results show that the error floor is significantly reduced by NS in the uncoded case and that NS works very well in the coded case.

  • Information-Theoretic Performance Evaluation of Multibiometric Fusion under Modality Selection Attacks

    Takao MURAKAMI  Yosuke KAGA  Kenta TAKAHASHI  

     
    PAPER-Cryptography and Information Security

    Vol: E99-A No:5  Page(s): 929-942

    The likelihood-ratio based score level fusion (LR-based fusion) scheme has attracted much attention, since it maximizes accuracy if the log-likelihood ratio (LLR) is accurately estimated. In reality, a user may be unable to input some query samples due to temporary physical conditions such as injuries and illness. It can also happen that some modalities tend to cause false rejection (i.e., the user is a “goat” for these modalities). The LR-based fusion scheme can handle these situations by setting the LLRs corresponding to missing query samples to 0. In this paper, we refer to such a mode as a “modality selection mode” and address the issue of accuracy in this mode. Specifically, we provide the following contributions: (1) We first propose a “modality selection attack”, in which an impostor inputs only query samples whose LLRs are greater than 0 (i.e., takes an optimal strategy) to impersonate others. We also show that the impostor can perform this attack against the SPRT (Sequential Probability Ratio Test)-based fusion scheme, which is an extension of the LR-based fusion scheme to a sequential fusion scenario. (2) We then consider the case when both genuine users and impostors take this optimal strategy, and show that the overall accuracy in this case is worse than when they input all query samples. More specifically, we prove that the KL (Kullback-Leibler) divergence between the genuine distribution of integrated scores and the impostor's one, which can be compared with password entropy, is smaller in the former case. We also show to what extent the KL divergence decreases for each modality. (3) Finally, we evaluate to what extent the overall accuracy becomes worse using the NIST BSSR1 Set 2 and Set 3 datasets, and discuss directions for multibiometric applications based on the experimental results.
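
    The modality selection attack can be summarized in a few lines: missing modalities contribute an LLR of 0 to the fused score, so a rational impostor submits only modalities whose LLR is positive. The sketch below illustrates this with purely hypothetical per-modality LLR values; it is not the paper's evaluation procedure.

    ```python
    # Minimal sketch of LR-based score-level fusion and the modality selection
    # attack: skipped modalities contribute an LLR of 0, so an impostor's optimal
    # strategy is to submit only modalities with positive LLR. All values are
    # hypothetical.
    def fuse(llrs):
        """LR-based fusion: the integrated score is the sum of per-modality LLRs."""
        return sum(llrs)

    def fuse_with_selection(llrs):
        """Modality selection mode: a skipped modality counts as LLR = 0, so a
        rational impostor keeps only modalities with positive LLR."""
        return sum(max(llr, 0.0) for llr in llrs)

    impostor_llrs = [-2.3, 0.8, -0.5, 1.1]     # hypothetical per-modality LLRs
    print(fuse(impostor_llrs))                 # -0.9: rejected when all are input
    print(fuse_with_selection(impostor_llrs))  #  1.9: higher score under the attack
    ```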

  • Soft-Output Decoding Approach of 2D Modulation Codes in Bit-Patterned Media Recording Systems

    Chanon WARISARN  Piya KOVINTAVEWAT  

     
    PAPER-Storage Technology

    Vol: E98-C No:12  Page(s): 1187-1192

    Two-dimensional (2D) interference is one of the major impairments in bit-patterned media recording (BPMR) systems due to small bit and track pitches, especially at high recording densities. To alleviate this problem, we introduced a rate-4/5 constructive inter-track interference (CITI) coding scheme that prevents destructive data patterns from being written onto the magnetic medium in an uncoded BPMR system, i.e., one without error-correction codes. Because the CITI code produces only hard decisions, it cannot be employed in a coded BPMR system that uses a low-density parity-check (LDPC) code. To utilize it in an iterative decoding scheme, we propose a soft CITI coding scheme based on applying log-likelihood ratio algebra to the Boolean logic mappings, so that the soft CITI coding scheme, a modified 2D soft-output Viterbi algorithm (SOVA) detector, and an LDPC decoder jointly perform iterative decoding. Simulation results show that the proposed scheme provides a significant performance improvement, in particular when the areal density (AD) is high and/or the position jitter is large. Specifically, at a bit-error rate of 10⁻⁴ and no position jitter, the proposed system provides approximately 1.8 and 3.5 dB gain over the conventional coded system without the CITI code at ADs of 2.5 and 3.0 Terabits per square inch (Tb/in²), respectively.
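
    A standard building block for log-likelihood ratio algebra over Boolean mappings is the “box-plus” rule, which gives the LLR of the XOR of two bits from their individual LLRs (with LLR defined as log P(b=0)/P(b=1)). The sketch below shows the exact rule and its common min-sum approximation; the paper's specific soft CITI mapping is not reproduced here.

    ```python
    # Generic LLR algebra for Boolean mappings: the box-plus rule gives the LLR of
    # (a XOR b) from the LLRs of a and b, with LLR = log P(bit=0)/P(bit=1). This is
    # a standard building block, not the paper's specific soft CITI mapping.
    import math

    def llr_xor(la: float, lb: float) -> float:
        """Exact box-plus rule for the LLR of the XOR of two independent bits."""
        return 2.0 * math.atanh(math.tanh(la / 2.0) * math.tanh(lb / 2.0))

    def llr_xor_minsum(la: float, lb: float) -> float:
        """Common hardware-friendly min-sum approximation of the same rule."""
        return math.copysign(min(abs(la), abs(lb)), la * lb)

    print(llr_xor(1.5, -2.0))         # exact soft XOR
    print(llr_xor_minsum(1.5, -2.0))  # min-sum approximation
    ```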

  • A Chaos MIMO Transmission Scheme Using Turbo Principle for Secure Channel-Coded Transmission

    Eiji OKAMOTO  Yuma INABA  

     
    PAPER

    Vol: E98-B No:8  Page(s): 1482-1491

    Physical layer security is effective in wireless communications because it secures a transmission from the very beginning of the protocol. We have proposed a chaos multiple-input multiple-output (C-MIMO) transmission scheme that achieves both physical layer security and a channel coding gain using chaos signals. C-MIMO is a type of encryption modulation, and it obtains the coding gain in conjunction with encryption without a decrease in transmission efficiency. Thus, the error rate performance is improved in C-MIMO. However, the decoding complexity increases exponentially with the code length because of the use of maximum likelihood sequence estimation (MLSE), which restricts the code length of C-MIMO and thus the channel coding gain. Therefore, in this paper, we consider outer channel code concatenation instead of code length expansion for C-MIMO, and propose an iterative turbo decoding scheme that improves performance by introducing a log-likelihood ratio (LLR) into C-MIMO and utilizing the turbo principle. Computer simulations show that the proposed scheme outperforms the conventional scheme when the outer channel code is a convolutional code or a low-density parity-check (LDPC) code.

  • Multi-Stage Decoding Scheme with Post-Processing for LDPC Codes to Lower the Error Floors

    Beomkyu SHIN  Hosung PARK  Jong-Seon NO  Habong CHUNG  

     
    LETTER-Fundamental Theories for Communications

    Vol: E94-B No:8  Page(s): 2375-2377

    In this letter, we propose a multi-stage decoding scheme with post-processing for low-density parity-check (LDPC) codes, which remedies the rapid performance degradation in the high signal-to-noise ratio (SNR) range known as the error floor. In the proposed scheme, the unsuccessfully decoded words of the previous decoding stage are re-decoded by manipulating the received log-likelihood ratios (LLRs) of properly selected variable nodes. Two effective criteria for selecting the probably erroneous variable nodes are also presented. Numerical results show that the proposed scheme can correct most of the unsuccessfully decoded words of the first stage that exhibit oscillatory behavior, which is regarded as a main cause of the error floor.
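
    The post-processing idea can be sketched generically: when a decoding stage fails, the LLRs of a few suspect variable nodes are manipulated (here simply erased to 0) and decoding is retried. In the sketch below, the inner decoder is a trivial hard-decision stand-in, and the selection criterion (bits involved in the most unsatisfied checks) is one simple choice, not necessarily either of the letter's two criteria.

    ```python
    # Minimal sketch of the post-processing idea with a trivial stand-in decoder;
    # the parity-check matrix, LLRs, and the selection rule (bits touching the most
    # unsatisfied checks) are illustrative assumptions.
    import numpy as np

    H = np.array([[1, 1, 0, 1, 0, 0],
                  [0, 1, 1, 0, 1, 0],
                  [1, 0, 0, 0, 1, 1],
                  [0, 0, 1, 1, 0, 1]])

    def hard_decision_decode(llr, H):
        """Stand-in for an LLR-BP decoder: hard decision plus a syndrome check."""
        hard = (llr < 0).astype(int)
        return hard, not ((H @ hard) % 2).any()

    def multi_stage_decode(llr, H, inner_decode, num_erase=2, stages=3):
        llr = llr.astype(float)                          # work on a copy of the LLRs
        for _ in range(stages):
            hard, success = inner_decode(llr, H)
            if success:
                return hard                              # this stage succeeded
            unsat = (H @ hard) % 2                       # unsatisfied parity checks
            suspicion = unsat @ H                        # failed checks per bit
            worst = np.argsort(-suspicion)[:num_erase]   # most suspicious variable nodes
            llr[worst] = 0.0                             # manipulate (erase) their LLRs
        return hard

    received_llr = np.array([2.1, -1.8, 0.3, 2.5, -2.2, 1.9])
    print(multi_stage_decode(received_llr, H, hard_decision_decode))
    ```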

  • A New LLR Approximation for BICM Systems with HARQ

    Jin Whan KANG  Sang-Hyo KIM  Seokho YOON  Tae Hee HAN  Hyoung Kee CHOI  

     
    LETTER-Wireless Communication Technologies

    Vol: E93-B No:6  Page(s): 1628-1632

    In this letter, a new approximation of the log-likelihood ratio (LLR) for soft-input channel decoding is proposed. The conventional simplified LLR based on the log-sum approximation can degrade the performance of bit-interleaved coded modulation (BICM) systems employing hybrid automatic repeat request (HARQ) at low SNR. The proposed LLR performs as well as the exact LLR and, at the same time, requires only a small number of elementary operations.
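
    For context, the sketch below contrasts the exact bit LLR with the common max-log (log-sum) approximation for one bit of a Gray-labeled 4-PAM constellation; the letter's new approximation itself is not reproduced, and the labeling, received sample, and noise variance are illustrative assumptions.

    ```python
    # Contrast of the exact bit LLR and the max-log approximation for bit 0 of a
    # Gray-labeled 4-PAM constellation; labeling, received sample, and noise
    # variance are illustrative assumptions, and the letter's new approximation
    # is not reproduced here.
    import math

    CONST = {-3: (0, 0), -1: (0, 1), 1: (1, 1), 3: (1, 0)}  # symbol -> (bit0, bit1)

    def exact_llr(y, bit_idx, sigma2):
        num = sum(math.exp(-(y - x) ** 2 / (2 * sigma2)) for x, b in CONST.items() if b[bit_idx] == 0)
        den = sum(math.exp(-(y - x) ** 2 / (2 * sigma2)) for x, b in CONST.items() if b[bit_idx] == 1)
        return math.log(num / den)

    def maxlog_llr(y, bit_idx, sigma2):
        d0 = min((y - x) ** 2 for x, b in CONST.items() if b[bit_idx] == 0)
        d1 = min((y - x) ** 2 for x, b in CONST.items() if b[bit_idx] == 1)
        return (d1 - d0) / (2 * sigma2)

    y, sigma2 = 0.4, 1.0                                    # noisy sample, noise variance
    print(exact_llr(y, 0, sigma2), maxlog_llr(y, 0, sigma2))
    ```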

  • Utterance Verification Using State-Level Log-Likelihood Ratio with Frame and State Selection

    Suk-Bong KWON  Hoirin KIM  

     
    LETTER-Speech and Hearing

    Vol: E93-D No:3  Page(s): 647-650

    This paper suggests an utterance verification system using state-level log-likelihood ratios with frame and state selection. We use hidden Markov models as the acoustic models and anti-phone models for speech recognition and utterance verification. The hidden Markov models have three states, and each state represents different characteristics of a phone. Thus we propose an algorithm that computes state-level log-likelihood ratios and assigns weights to states to obtain a more reliable confidence measure for recognized phones. Additionally, we propose a frame selection algorithm that computes the confidence measure only on frames of the input speech that contain proper speech. In general, phone segmentation information obtained from a speaker-independent speech recognition system is not accurate because triphone-based acoustic models are difficult to train effectively to cover diverse pronunciations and coarticulation effects. Consequently, it is even more difficult to find correctly matched states when obtaining state segmentation information, so a state selection algorithm is suggested for finding valid states. The proposed method using state-level log-likelihood ratios with frame and state selection achieves a relative reduction in equal error rate of 18.1% compared to the baseline system using simple phone-level log-likelihood ratios.
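
    A minimal sketch of a state-level LLR confidence measure under simple frame and state selection is shown below: per-frame LLRs (target minus anti-model log-likelihood) are averaged within each state, unreliable frames and empty states are skipped, and state-level scores are combined with weights. All scores, weights, and the threshold are illustrative assumptions, not the paper's trained values.

    ```python
    # Minimal sketch of a state-level LLR confidence measure with simple frame and
    # state selection; all scores, weights, and the threshold are illustrative.
    def state_level_llr(frame_scores, state_weights, frame_floor=-5.0):
        """frame_scores: per state, a list of (log p_target, log p_anti) per frame."""
        confidence = 0.0
        for state_idx, frames in enumerate(frame_scores):
            # frame selection: drop frames whose target log-likelihood is too low
            kept = [t - a for t, a in frames if t > frame_floor]
            if not kept:                        # state selection: skip empty states
                continue
            state_llr = sum(kept) / len(kept)   # average frame LLR within the state
            confidence += state_weights[state_idx] * state_llr
        return confidence

    scores = [[(-1.2, -2.0), (-0.8, -1.1)],     # state 0: two frames kept
              [(-6.3, -6.0), (-1.0, -2.4)],     # state 1: first frame rejected
              [(-0.9, -1.5)]]                   # state 2: one frame
    print(state_level_llr(scores, state_weights=[0.3, 0.4, 0.3]))
    ```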

  • Lowering Error Floor of Irregular LDPC Codes by CRC and OSD Algorithm

    Satoshi GOUNAI  Tomoaki OHTSUKI  

     
    PAPER-Fundamental Theories for Communications

    Vol: E89-B No:1  Page(s): 1-10

    Irregular low-density parity-check (LDPC) codes generally achieve better performance than regular LDPC codes at low Eb/N0 values. They have, however, higher error floors than regular LDPC codes. The construction of an irregular LDPC code therefore involves a trade-off between the performance degradation in the low Eb/N0 region and the height of the error floor. It is known that a decoding algorithm can achieve very good performance if it combines the ordered statistic decoding (OSD) algorithm and the log-likelihood ratio belief propagation (LLR-BP) decoding algorithm. Unfortunately, all the codewords obtained by the OSD algorithm satisfy the parity check equations of the LDPC code. As a result, we cannot use the parity check equations of the LDPC code to stop the decoding process, and a wrong codeword that satisfies them raises the error floor. Once a codeword that satisfies the parity check equations is generated by the LLR-BP decoding algorithm, we regard that codeword as the final estimate and halt decoding; the OSD algorithm is not performed. In this paper, we propose a new encoding/decoding scheme to lower the error floor of irregular LDPC codes. The proposed encoding scheme encodes the information bits with a cyclic redundancy check (CRC) and an LDPC code. The proposed decoding scheme, which consists of LLR-BP decoding, a CRC check, and OSD decoding, detects errors in the codewords obtained by the LLR-BP decoding algorithm and the OSD decoding algorithm using both the parity check equations of the LDPC code and the CRC. Computer simulations show that the proposed encoding/decoding scheme lowers the error floor of irregular LDPC codes.
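
    The decoding flow described above (LLR-BP decoding, a CRC check to screen out wrong codewords that still satisfy the LDPC parity checks, and OSD as a fallback) can be sketched as control flow. In the sketch below, llr_bp_decode and osd_decode are placeholders for the actual decoders, and a standard CRC-32 stands in for whatever CRC the paper uses.

    ```python
    # Control-flow sketch only: `llr_bp_decode` and `osd_decode` are placeholders
    # for the actual decoders, and a standard CRC-32 stands in for the paper's CRC.
    import zlib

    def crc_ok(bits, crc_len=32):
        """Check that the trailing crc_len bits equal the CRC-32 of the payload."""
        payload, tail = bits[:-crc_len], bits[-crc_len:]
        data = bytes(payload)                   # one payload bit per byte, for simplicity
        expected = [int(c) for c in format(zlib.crc32(data), "032b")]
        return list(tail) == expected

    def decode_with_crc(llr, llr_bp_decode, osd_decode):
        bits, parity_ok = llr_bp_decode(llr)    # stage 1: LLR-BP decoding
        if parity_ok and crc_ok(bits):          # stage 2: CRC screens wrong codewords
            return bits
        bits = osd_decode(llr)                  # stage 3: OSD when BP fails the checks
        return bits if crc_ok(bits) else None   # declare a decoding failure otherwise
    ```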
