Course Overview

Some reading!

In the last 50 years, Information and Communication Technology (ICT) has had a great impact on our society. Its most profound and accelerated impact can be seen in the last decade in the form of cell phones, connected computers and the Internet. We even have virtual currencies. ICT is an interdisciplinary field combining IT (Information Technology) and CT (Communication Technology). IT has its roots in computer science, and CT has its roots in the theory of communication. The two fields can now be seen as two sides of the same coin. Both deal with information: in IT we store information (send it from now to then) and manipulate it, while in CT we send information from here to there (communicate). The mathematical principles of ICT lie in theoretical computer science (the Turing machine) and in information and coding theory (the work of Shannon and Hamming). ICT is realized through logic gates and circuits, giving birth to the areas of Electronics and VLSI. 
Coding theory is at the heart of ICT, with roots in mathematics, origins in electrical engineering, and applications to computer science. Whenever you want to send information from one point to another (communication), or from now to then (storage), you require error-correcting codes. Richard W. Hamming devised the first error-correcting codes in 1947 (published in 1950) out of frustration while working on the Bell Model V computer. Every weekend the machine used to stop because of errors, and Hamming asked, "Why can't the computer detect and correct the errors itself?" This led to his invention of the Hamming codes, which can correct a single-bit error. 
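Hamming's single-error-correcting idea can be sketched in a few lines. Below is a minimal (7,4) Hamming code in Python; the function names and bit layout are my own illustration, not taken from Hamming's paper:

```python
# A minimal sketch of the (7,4) Hamming code: 4 data bits are encoded into a
# 7-bit codeword, and any single flipped bit can be located and corrected.
# Bit positions run 1..7; positions 1, 2 and 4 hold parity bits.

def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4                 # parity over positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4                 # parity over positions 3, 6, 7
    p3 = d2 ^ d3 ^ d4                 # parity over positions 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]   # codeword, positions 1..7

def hamming74_correct(c):
    """Return a corrected copy of codeword c, fixing at most one bit error."""
    c = list(c)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]    # recheck parity over positions 1, 3, 5, 7
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]    # positions 2, 3, 6, 7
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]    # positions 4, 5, 6, 7
    syndrome = s1 + 2 * s2 + 4 * s3   # binary position of the error (0 = none)
    if syndrome:
        c[syndrome - 1] ^= 1          # flip the erroneous bit back
    return c

codeword = hamming74_encode([1, 0, 1, 1])
received = list(codeword)
received[4] ^= 1                      # simulate a single-bit channel error
assert hamming74_correct(received) == codeword
```

The elegance of the construction is that the three parity checks, read as a binary number, spell out the position of the flipped bit directly.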
Around the same time, in 1948, Shannon published his famous paper on information theory, "A Mathematical Theory of Communication". Information theory answers two fundamental questions about digital information: how much can you compress it? (Answer: the entropy H.) And what is the ultimate transmission rate of digital communication? (Answer: the channel capacity C.) While information theory sets the limits of data storage, communication and so on, coding theory tells us how to achieve those limits; it is more about algorithms and the construction of codes. Thus there are two aspects of coding theory: source coding (for data compression) and channel coding (for error correction). We will focus mainly on error correction. Information theory is an interdisciplinary field with connections to statistical physics (thermodynamics), computer science (Kolmogorov complexity: the complexity of a string of data is the length of the shortest binary program that computes the string), communications, economics, networks, and even biology and chemistry. 
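Shannon's two answers are easy to compute in the simplest setting: a biased coin and the binary symmetric channel. The small sketch below (my own illustration of the standard formulas) shows the binary entropy H(p) and the resulting channel capacity C = 1 - H(p):

```python
# Shannon's two answers in the simplest binary setting.
import math

def binary_entropy(p):
    """Entropy H(p) in bits of a biased coin with Pr[heads] = p."""
    if p in (0.0, 1.0):
        return 0.0                    # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H(p) of a binary symmetric channel with crossover p."""
    return 1.0 - binary_entropy(p)

print(binary_entropy(0.5))   # 1.0 bit: a fair coin is incompressible
print(bsc_capacity(0.11))    # about 0.5 bit of information per channel use
```

Note how both questions are answered by the same function H: compression is bounded below by the entropy of the source, while reliable transmission is bounded above by how much entropy the channel noise destroys.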
Error-control coding (ECC) was known before Hamming, but very efficient codes were not, and after Hamming's discovery it became a field of research for mathematicians, computer scientists and electrical engineers, which it has remained for about 50 years now. In fact, Von Neumann wrote that error control is an integral part of all information processing. So wherever there is information processing, there is error-control coding. We can now see its importance in new computing paradigms such as quantum computers and biomolecular computers. Network coding is another emerging area, relevant to all kinds of networks. Many new applications of coding theory have appeared, such as cloud computing (cloud data storage and cloud security). All ICT applications use some form of coding: CDs/DVDs, hard-disk storage, deep-space communications, wireless communications, power-line communications, cell phones, networks, sensor networks, data compression, VLSI and so on. The applications are endless. People are even trying to decipher what kind of error-control coding is used in biological information processing; this is one of the great challenges for ICT in the 21st century. We ourselves use a crude form of error-control coding in our day-to-day conversations (without knowing that it is ECC): can you guess how? In almost 50 years, error-control coding has found many deep connections with diverse areas such as the theory of computation, complexity, algorithms, algebra (finite fields and finite rings), linear algebra, cryptography, number theory, algebraic geometry, discrete mathematics and statistical physics. In this course we will study the basics of coding theory, with the main focus on codes that are optimal in the sense of Shannon's results and various bounds. 
More in the course. So fasten your seatbelt.

This course is designed for 3rd-year BTech students and is open to MTech and PhD students. 4th-year BTech students who have missed the boat can also take it. 

You may want to read more at the following wikis:

1. Coding Theory Wiki 
2. Error-correcting codes Wiki 
3. Information Theory Wiki 

Read about Hamming at Richard Hamming 
Read about Shannon at Claude E. Shannon 
Read general info about coding theory at Coding theory: first 50 years 

Last modified: Saturday, 13 December 2014, 2:33 PM