
Diploma in Information Coding Theory

Course Description

Reliable communication requires methods that automatically detect and correct the errors a channel introduces. This course covers the foundations of coding theory, with a focus on its key developments and algorithms. It begins with variable-length and prefix-free codes, and shows how the Kraft–McMillan inequality characterises the codeword lengths for which a prefix code exists. The course then surveys the main techniques of lossless data compression. You will study the basic properties of Shannon and Huffman codes and how they achieve compression, and see how symbol frequencies determine the best compression achievable for a given source. Following this, you will be taught about arithmetic coding, an entropy-encoding technique that maps an entire sequence of file symbols to a single number, and learn why it can outperform Huffman coding by effectively allocating fractions of a bit to codewords.
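As an illustration of the ideas above (not material from the course itself), the following sketch builds a Huffman code for a short string and checks that the resulting codeword lengths satisfy the Kraft–McMillan inequality:

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code for the symbols in `text`.

    Returns a dict mapping each symbol to its binary codeword string.
    """
    freq = Counter(text)
    # Heap entries: (weight, tiebreaker, {symbol: codeword-so-far}).
    heap = [(w, i, {s: ""}) for i, (s, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Repeatedly merge the two lowest-weight subtrees,
        # prefixing their codewords with 0 and 1 respectively.
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]

code = huffman_code("abracadabra")
# Kraft-McMillan: any prefix code satisfies sum over codewords of 2^-length <= 1;
# a full Huffman tree meets it with equality.
kraft_sum = sum(2 ** -len(cw) for cw in code.values())
assert kraft_sum <= 1
```

Because Huffman coding assigns shorter codewords to more frequent symbols, the frequent `a` in the example ends up with a shorter codeword than the rare `c`.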

Next, the course explores the main approaches to database compression, contrasting entropy-based and hash-based compression schemes. It then introduces channel coding, explaining how it protects data from corruption in the transmission channel and how error-free communication is possible over a noisy channel. You will see how the sphere-packing bound limits the size of any block code, and how the random-coding bound gives an upper bound on the probability of decoding error for codes of a given block length. Subsequently, the course works through the proof of the channel-coding theorem, presenting the matching achievability and converse results that establish tight bounds on the rate of reliable communication over a noisy channel.
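To make the sphere-packing bound concrete, here is a small sketch (an illustration, not course material) that computes the bound for binary codes: the number of codewords cannot exceed the space size divided by the volume of a Hamming ball of the code's error-correction radius.

```python
from math import comb

def hamming_bound(n, d):
    """Sphere-packing (Hamming) bound: an upper limit on the number of
    codewords in a binary code of block length n and minimum distance d.
    """
    t = (d - 1) // 2  # number of errors the code can correct
    # Volume of a Hamming ball of radius t in {0,1}^n.
    volume = sum(comb(n, i) for i in range(t + 1))
    return 2 ** n // volume

# The (7,4) Hamming code has 16 codewords and meets the bound exactly,
# so it is a "perfect" code: 2^7 / (1 + 7) = 16.
limit = hamming_bound(7, 3)
```

Codes that meet this bound with equality, such as the Hamming and Golay codes, are called perfect codes.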

Finally, the course examines the capacity of the Gaussian channel. You will compare practical transmission systems with the Shannon limit and learn how to model various transmission channels. Next, measures of information for continuous random variables are described: you will study differential entropy and mutual information for continuous probability distributions, along with the mathematical tools required to prove the formula for the capacity of the Gaussian channel. Following this, the channel-coding theorem for the Gaussian channel is derived with examples, and you will explore both the achievability and converse proofs. Lastly, the course explains how the water-filling algorithm allocates power optimally across parallel channels to maximise total capacity. 'Diploma in Information Coding Theory' equips you to reason about building reliable systems for transmitting information over noisy channels.
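The water-filling idea can be sketched in a few lines (again an illustrative example, not course code): given parallel Gaussian channels with known noise variances and a total power budget, find a common "water level" by bisection, give each channel the power above its noise floor, and sum the resulting Shannon capacities.

```python
from math import log2

def water_filling(noise, total_power, iters=100):
    """Allocate `total_power` across parallel Gaussian channels with
    noise variances `noise` using the water-filling rule, and return
    the total capacity in bits per channel use.
    """
    # Bisect for the water level mu such that sum(max(0, mu - N_i)) = P.
    lo, hi = min(noise), max(noise) + total_power
    for _ in range(iters):
        mu = (lo + hi) / 2
        used = sum(max(0.0, mu - n) for n in noise)
        if used > total_power:
            hi = mu
        else:
            lo = mu
    powers = [max(0.0, mu - n) for n in noise]
    # Capacity of each sub-channel: (1/2) * log2(1 + P_i / N_i).
    return sum(0.5 * log2(1 + p / n) for p, n in zip(powers, noise))

# Two channels: the noisier one may receive little or no power.
cap = water_filling([1.0, 4.0], total_power=2.0)
```

With noise variances 1 and 4 and a power budget of 2, the water level settles at 3, so all power goes to the quieter channel and the noisier one is left dry.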

What you'll learn in this course

  • Technology

  • Computer Science

  • Algorithms

  • Databases

  • Coding

  • Mathematics

NPTEL

India

Reviews

Rating: 3.0
Course includes:
  • Duration: 10-15 hrs
  • Modules: 7
  • Certification: Diploma