Industrial applications
Polar codes have some limitations when used in industrial applications. Primarily, the original design of polar codes achieves capacity only when block sizes are asymptotically large and decoded with a successive cancellation decoder. At the block sizes used in industry, however, the performance of successive cancellation decoding is poor compared to well-established and well-implemented coding schemes such as low-density parity-check (LDPC) codes and turbo codes. Polar performance can be improved with successive cancellation list decoding, but its usability in real applications is still questionable due to the very poor implementation efficiency of the iterative approach. In October 2016, Huawei announced that it had achieved 27 Gbit/s in 5G field trial tests using polar codes for channel coding.
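For context, the encoder underlying all of these schemes is the Arıkan polar transform x = uF⊗n over GF(2). The following minimal Python sketch implements the transform with the usual butterfly recursion; the block length and information set in the usage example are illustrative choices, not values from any standard.

```python
import numpy as np

def polar_encode(u):
    """Apply the Arikan polar transform x = u F^(kron n) over GF(2).

    u is a length-2^n bit vector with frozen positions already set to 0.
    This is the non-bit-reversed ordering; some presentations include an
    additional bit-reversal permutation.
    """
    x = np.array(u, dtype=np.uint8)          # work on a copy
    n = x.size
    assert n & (n - 1) == 0, "block length must be a power of two"
    step = 1
    while step < n:
        for i in range(0, n, 2 * step):
            # butterfly over F = [[1,0],[1,1]]: (a, b) -> (a XOR b, b)
            x[i:i + step] ^= x[i + step:i + 2 * step]
        step *= 2
    return x

# usage: N = 8, with a hypothetical information set (real designs pick
# the most reliable synthetic channels); frozen bits stay 0
u = np.zeros(8, dtype=np.uint8)
u[[3, 5, 6, 7]] = [1, 0, 1, 1]
print(polar_encode(u))
```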
PAC codes
In 2019, Arıkan suggested employing a convolutional pre-transformation before polar coding. These pre-transformed variants of polar codes were dubbed polarization-adjusted convolutional (PAC) codes. It was shown that the pre-transformation can effectively improve the distance properties of polar codes by reducing the number of minimum-weight and, more generally, small-weight codewords, improving the block error rate under near-maximum-likelihood (ML) decoding algorithms such as Fano decoding and list decoding. Fano decoding is a sequential tree-search algorithm that determines the transmitted codeword by utilizing an optimal metric function to efficiently guide the search process. PAC codes are also equivalent to post-transforming polar codes with certain cyclic codes. At short blocklengths, such codes outperform both convolutional codes and CRC-aided list decoding of conventional polar codes.
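To make the structure concrete, here is a minimal Python sketch of PAC encoding: data bits are scattered according to a rate profile, pre-transformed by a binary convolution, and then passed through the polar transform. The generator (octal 133) is the one commonly used in the PAC literature; the function names and the (8, 4) rate profile below are illustrative assumptions, and the sketch reuses polar_encode from the previous example.

```python
import numpy as np

def pac_encode(data_bits, rate_profile, n, conv_gen=(1, 0, 1, 1, 0, 1, 1)):
    """PAC encoding sketch: rate-profile, convolve over GF(2), polarize.

    rate_profile lists the data-carrying indices (the rest are frozen
    to 0); conv_gen is the impulse response of the pre-transform
    (octal 133). Reuses polar_encode from the sketch above.
    """
    # 1) rate profiling: scatter the data bits, freeze everything else
    v = np.zeros(n, dtype=np.uint8)
    v[list(rate_profile)] = data_bits
    # 2) convolutional pre-transform: u_i = XOR_j conv_gen[j] * v_{i-j}
    u = np.zeros(n, dtype=np.uint8)
    for i in range(n):
        for j, c in enumerate(conv_gen):
            if c and i - j >= 0:
                u[i] ^= v[i - j]
    # 3) ordinary polar transform of the pre-transformed bits
    return polar_encode(u)

# hypothetical (8, 4) example; real PAC designs derive the rate profile
# from, e.g., Reed-Muller scores or channel reliabilities
print(pac_encode([1, 0, 1, 1], rate_profile=[3, 5, 6, 7], n=8))
```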
Neural Polar Decoders
Neural polar decoders (NPDs) are an advancement in channel coding that combines neural networks (NNs) with polar codes, providing unified decoding for channels with or without memory, without requiring an explicit channel model. They use four neural networks to approximate the functions of polar decoding: the embedding (E) NN, the check-node (F) NN, the bit-node (G) NN, and the embedding-to-LLR (H) NN. The weights of these NNs are determined by estimating the mutual information of the synthetic channels. By the end of training, the weights of the NPD are fixed and can then be used for decoding. The computational complexity of NPDs is determined by the parameterization of the neural networks, unlike successive cancellation (SC) trellis decoders, whose complexity is determined by the channel model and which are typically used for finite-state channels (FSCs). The computational complexity of NPDs is O(hd²N log N), where h is the number of hidden units in the neural networks, d is the dimension of the embedding, and N is the block length. In contrast, the computational complexity of SC trellis decoders is O(|S|³N log N), where |S| is the state space of the channel model. NPDs can be integrated into SC decoding schemes such as SC list decoding and CRC-aided SC decoding. They are also compatible with non-uniform and i.i.d. input distributions by integrating them into the Honda–Yamamoto scheme.{{Cite journal |last1=Honda |first1=Junya |last2=Yamamoto |first2=Hirosuke |date=2013 |title=Polar Coding Without Alphabet Extension for Asymmetric Models |url=https://ieeexplore.ieee.org/document/6601656 |journal=IEEE Transactions on Information Theory |volume=59 |issue=12 |pages=7829–7838 |doi=10.1109/TIT.2013.2282305 |issn=0018-9448 |url-access=subscription}} This flexibility allows NPDs to be used in various decoding scenarios, improving error-correction performance while maintaining manageable computational complexity.
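As a rough illustration of how the four networks slot into the standard SC recursion, the following Python sketch replaces the usual LLR check-node and bit-node formulas with the E, F, G, and H networks. The toy two-layer MLPs, their shapes, the rand_params helper, and the untrained random weights are all illustrative assumptions, not the published architectures; in an actual NPD the weights would be trained via mutual-information estimation and then frozen. The sketch reuses polar_encode from the first example to compute partial sums.

```python
import numpy as np

def mlp(params, x):
    """Two-layer perceptron: relu(x @ W1 + b1) @ W2 + b2."""
    W1, b1, W2, b2 = params
    return np.maximum(x @ W1 + b1, 0.0) @ W2 + b2

def npd_sc_decode(y, frozen, E, F, G, H):
    """SC decoding with NN-parameterized node functions.

    E: channel output -> d-dim embedding (replaces the channel LLR)
    F: two embeddings -> embedding (check-node step)
    G: two embeddings + decided bit -> embedding (bit-node step)
    H: embedding -> scalar LLR (read out the bit decision)
    """
    idx = 0
    u_hat = np.zeros(len(y), dtype=np.uint8)

    def recurse(emb):
        nonlocal idx
        m = len(emb)
        if m == 1:                                # leaf: decide one bit
            llr = mlp(H, emb[0]).item()
            bit = 0 if frozen[idx] else int(llr < 0)
            u_hat[idx] = bit
            idx += 1
            return np.array([bit], dtype=np.uint8)
        a, b = emb[: m // 2], emb[m // 2 :]
        # check-node step on paired embeddings
        fe = np.stack([mlp(F, np.concatenate([a[k], b[k]]))
                       for k in range(m // 2)])
        left = recurse(fe)
        s = polar_encode(left)                    # partial sums of left half
        # bit-node step, conditioned on the re-encoded left half
        ge = np.stack([mlp(G, np.concatenate([a[k], b[k], [float(s[k])]]))
                       for k in range(m // 2)])
        right = recurse(ge)
        return np.concatenate([left, right])

    emb0 = np.stack([mlp(E, np.atleast_1d(float(yk))) for yk in y])
    recurse(emb0)
    return u_hat

def rand_params(d_in, d_out, h, rng):
    """Random (untrained) MLP weights, for illustration only."""
    return (rng.normal(size=(d_in, h)), np.zeros(h),
            rng.normal(size=(h, d_out)), np.zeros(d_out))

rng = np.random.default_rng(0)
d = 4                                    # embedding dimension (illustrative)
E = rand_params(1, d, 8, rng)            # y_i -> embedding
F = rand_params(2 * d, d, 8, rng)        # check node
G = rand_params(2 * d + 1, d, 8, rng)    # bit node (+ decided bit)
H = rand_params(d, 1, 8, rng)            # embedding -> LLR
y = np.array([0.9, -1.1, 0.3, -0.2, 1.4, -0.7, 0.8, -1.0])
frozen = [True, True, True, False, True, False, False, False]
print(npd_sc_decode(y, frozen, E, F, G, H))
```

In this sketch F and G play the roles of the classical f (check-node) and g (bit-node) LLR updates, while E and H translate between channel outputs, embeddings, and LLRs, which is why the same recursion also supports list and CRC-aided variants.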