Noisy network coding

Sung Hoon Lim, Young-Han Kim, Abbas El Gamal, and Sae-Young Chung

A noisy network coding scheme for communicating messages between multiple sources and destinations over a general noisy network is presented. For multi-message multicast networks, the scheme naturally generalizes network coding over noiseless networks by Ahlswede, Cai, Li, and Yeung, and compress-forward coding for the relay channel by Cover and El Gamal to discrete memoryless and Gaussian networks. The scheme also extends the results on coding for wireless relay networks and deterministic networks by Avestimehr, Diggavi, and Tse, and coding for wireless erasure networks by Dana, Gowaikar, Palanki, Hassibi, and Effros. The scheme involves lossy compression by the relay as in the compress-forward coding scheme for the relay channel. However, unlike previous compress-forward schemes in which independent messages are sent over multiple blocks, the same message is sent multiple times using independent codebooks as in the network coding scheme for cyclic networks. Furthermore, the relays do not use Wyner-Ziv binning as in previous compress-forward schemes, and each decoder performs simultaneous decoding of the received signals from all the blocks without uniquely decoding the compression indices. A consequence of this new scheme is that achievability is proved simply and more generally without resorting to time expansion to extend results for acyclic networks to networks with cycles. The noisy network coding scheme is then extended to general multi-message networks by combining it with decoding techniques for the interference channel. For the Gaussian multicast network, noisy network coding improves the previously established gap to the cutset bound. We also demonstrate through two popular Gaussian network examples that noisy network coding can outperform conventional compress-forward, amplify-forward, and hash-forward coding schemes.
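For orientation only, the multicast lower bound established by a scheme of this type can be sketched in the following cut-set-like form; the notation (cuts S separating source node 1 from destination d, shorthand X(S), Ŷ(S), and the product form of the distribution) is our assumed paraphrase, not a verbatim quotation of the paper's theorem:

\[
C \;\ge\; \max \;\min_{d \in \mathcal{D}} \;\min_{S:\, 1 \in S,\; d \in S^{c}}
\Big[\, I\big(X(S);\, \hat{Y}(S^{c}), Y_d \,\big|\, X(S^{c})\big)
\;-\; I\big(Y(S);\, \hat{Y}(S) \,\big|\, X^{N}, \hat{Y}(S^{c}), Y_d\big) \Big],
\]
where the maximum is taken over product distributions of the form \(\prod_{k=1}^{N} p(x_k)\, p(\hat{y}_k \mid y_k, x_k)\). The first mutual information term mirrors the cutset bound across the cut, while the subtracted term reflects the rate cost of the relays' lossy compression.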