The BITS Lab conducts research on coding theory and information theory, and their applications to reliable communications. Particular research topics include:
For prospective Master's students in 2018, the Laboratory Introduction was held on April 10 in the School of Information Science. This video is the 2017 version, which includes greetings from students.
Information Theory

Information theory deals with the fundamental limits of information transmission and compression. Remarkably, information can be transmitted reliably over a communications channel, even if the channel itself is unreliable. Claude Shannon showed that the rate R of reliable transmission can be no greater than the channel capacity: R < log(1 + SNR) for a channel with signal-to-noise ratio SNR. An error-correcting code is a concrete way to correct errors, and can even achieve the channel capacity. One such code can be represented by three circles, as shown in the figure. The code consists of seven bits, each either a 0 or a 1, and the number of 1's inside each circle must be even. But some bits have been erased to an unknown "?". Can you recover the original bits?

Codes for Data Storage

Data storage is at the core of the information technology revolution, from the smartphones in our hands to data centers in the cloud. Flash memory, hard disk drives and distributed storage networks combine to provide ubiquitous access to data. But these exciting new systems pose challenging new problems of storage density, reliability and efficiency. Coding theory can provide an answer.

Efficient Decoding Algorithms

The decoding algorithms for error-correcting codes are fairly complicated, yet your smartphone, internet connection and solid-state drive all perform decoding. The design of decoder circuits is generally separated by a wall: LSI engineers design circuits, while coding theorists design codes. What is the most efficient circuit that can be designed? Recently, we made a remarkable discovery: the design of efficient decoders should be guided by information theory, breaking down the wall between theory and practice. Using tools from machine learning and information theory, we have in one case designed the best-known decoders.

Cooperative Wireless Communications

With the arrival of the smartphone, demand for wireless network communications has exploded.
But new electromagnetic spectrum is scarce. To increase future data rates, cooperative wireless communication is the way forward: users, relays and base stations work together to increase data rates through bandwidth efficiency, as shown in the figure. Lattices are codes that use the same real-number algebra as the channel itself, in which electromagnetic signals are superimposed. Lattice codes correct errors introduced by channel noise and satisfy transmission power constraints, while simultaneously possessing the group-theoretic properties needed for network coding. We are developing lattice code theory to enable next-generation cooperative wireless communications.

Lattice Codes

A linear code defined over the real numbers is called a lattice. A two-dimensional hexagonal lattice is shown in the figure. Lattices also exist in three, four, five and higher dimensions, and become increasingly powerful, even though they can no longer be visualized. Lattices are important for communication systems, particularly wireless networks, which have multiple transmitters and receivers. Our lab is also pioneering the use of lattices in signal processing and multimedia security.
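The core operation behind lattice codes is decoding: mapping a noisy received signal to the nearest lattice point. The following is a minimal sketch for the two-dimensional hexagonal lattice; the basis vectors are a standard textbook choice (not taken from the figure), and the brute-force search is only illustrative, not a practical decoding algorithm.

```python
import math

# Generator (basis) vectors for the hexagonal lattice A2: every lattice
# point is an integer combination a*B[0] + b*B[1]. Standard choice,
# assumed here since the figure is not reproduced.
B = [(1.0, 0.0), (0.5, math.sqrt(3) / 2)]

def nearest_lattice_point(x, y, search=3):
    """Return the hexagonal-lattice point closest to the received point (x, y).

    Brute-force search over integer coefficients in a small window; real
    decoders use far smarter algorithms, but in two dimensions this suffices
    for points near the origin.
    """
    best, best_dist = None, float("inf")
    for a in range(-search, search + 1):
        for b in range(-search, search + 1):
            px = a * B[0][0] + b * B[1][0]
            py = a * B[0][1] + b * B[1][1]
            dist = (px - x) ** 2 + (py - y) ** 2
            if dist < best_dist:
                best, best_dist = (px, py), dist
    return best

# A noisy signal near the lattice point 1*B[0] + 1*B[1] = (1.5, sqrt(3)/2)
# decodes back to that point, correcting the error added by the channel:
# nearest_lattice_point(1.6, 0.8) -> (1.5, 0.8660...)
```

In a channel with additive noise, this nearest-point rule is exactly how a lattice code corrects errors: the transmitted codeword is a lattice point, and the receiver snaps the perturbed signal back to the closest one.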
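The three-circle erasure puzzle in the Information Theory section above is the classic (7,4) Hamming code viewed as a Venn diagram. A small sketch of how the even-parity constraint recovers erased bits follows; the bit indexing and circle membership are assumptions, since the figure itself is not reproduced here.

```python
# Each list is one circle of the Venn diagram: the bits it contains must
# XOR to 0 (an even number of 1's). Indices 0-3 are data bits, 4-6 are
# parity bits -- an assumed layout consistent with the (7,4) Hamming code.
CIRCLES = [
    [0, 1, 2, 4],
    [0, 1, 3, 5],
    [0, 2, 3, 6],
]

def recover(bits):
    """Fill in erasures (None) using the even-parity constraint of each circle.

    Repeatedly find a circle with exactly one erased bit; parity then
    determines that bit. Returns the completed 7-bit list, or None if some
    erasure cannot be resolved by this simple peeling decoder.
    """
    bits = list(bits)
    progress = True
    while progress:
        progress = False
        for circle in CIRCLES:
            erased = [i for i in circle if bits[i] is None]
            if len(erased) == 1:  # one unknown: even parity fixes its value
                known_xor = 0
                for i in circle:
                    if bits[i] is not None:
                        known_xor ^= bits[i]
                bits[erased[0]] = known_xor
                progress = True
    return bits if None not in bits else None

# Two bits erased to "?" are recovered from the parity constraints:
# recover([None, 0, 1, 1, None, 0, 1]) -> [1, 0, 1, 1, 0, 0, 1]
```

This peeling procedure is the same idea, at toy scale, that makes erasure decoding of modern codes efficient: each parity constraint with a single unknown resolves one erased symbol, which may in turn unlock further constraints.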