Coding theory

Essay by sadeeq7468 (University, Bachelor's, A+), November 2014


Coding theory, sometimes called algebraic coding theory, deals with the design of error-correcting codes for the reliable transmission of information across noisy channels. It makes use of classical and modern algebraic techniques involving finite fields, group theory, and polynomial algebra, and it has connections with other areas of discrete mathematics, especially number theory and the theory of experimental designs. Three areas commonly associated with coding theory are Data Compression, Cryptology, and Error Correcting Codes.

Data Compression

Data Compression is the efficient encoding of source information so that it occupies the smallest amount of space possible. This is accomplished by removing redundant bits of data. Data Compression is a signal processing operation and part of the broader study of quantifying information, called Information Theory.
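One simple illustration of removing redundancy is run-length encoding, which replaces a repeated run of the same symbol with the symbol and a count. This is a minimal sketch, not any specific compression standard; the function name is chosen here for illustration:

```python
def rle_encode(data: str) -> str:
    """Run-length encode a string: 'AAAABBB' -> 'A4B3'."""
    out = []
    i = 0
    while i < len(data):
        j = i
        # advance j past the run of identical characters starting at i
        while j < len(data) and data[j] == data[i]:
            j += 1
        out.append(f"{data[i]}{j - i}")
        i = j
    return "".join(out)

print(rle_encode("AAAABBBCC"))  # A4B3C2
```

Real compressors (Huffman coding, Lempel-Ziv) are far more sophisticated, but the principle is the same: redundant structure in the source is exploited to shorten the encoding.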

Cryptology

Cryptology, as defined in the Encyclopedia Britannica, is the "science concerned with data communication and storage in secure and usually secret form" (Simmons, n.d.). Through the use of encryption, data can be securely transmitted over insecure channels.
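The idea of transforming data so that only a key holder can recover it can be sketched with a toy XOR cipher. This is purely illustrative (a repeating-key XOR is not secure in practice); the names below are assumptions, not a real library API:

```python
def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte with the repeating key.
    # Because XOR is its own inverse, applying the same
    # function with the same key decrypts the ciphertext.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

message = b"attack at dawn"
key = b"secret"
ciphertext = xor_cipher(message, key)
recovered = xor_cipher(ciphertext, key)
assert recovered == message
```

The essential property shown here, that encryption and decryption are inverse operations parameterized by a secret key, is what lets data travel safely over channels an eavesdropper can observe.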

Error Correcting Codes

Error Correcting Codes increase the dependability of information that is being transmitted. Noisy communication channels, electrical interference, equipment malfunctions, long-term storage on magnetic tape, and human mistakes are all causes of information corruption. Error Correcting Codes add redundant bits to the data being transmitted in order to ensure accurate delivery. Error Correcting Codes can be found in the applications used to transmit images from space, communicate wirelessly, improve sound quality in CDs, correct quantum errors, and accurately validate ISBNs. New uses for Error Correcting Codes are frequently being discovered.
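The ISBN example mentioned above is easy to make concrete: an ISBN-10 appends a check digit chosen so that the weighted sum of all ten digits is divisible by 11, which lets a decoder detect any single-digit error. A short sketch (the function name is chosen here for illustration):

```python
def isbn10_valid(isbn: str) -> bool:
    """Check an ISBN-10: the weighted digit sum must be divisible by 11."""
    # Strip hyphens/spaces; 'X' stands for the value 10 in the check position.
    digits = [10 if c in "Xx" else int(c) for c in isbn if c not in "- "]
    if len(digits) != 10:
        return False
    # Weights run from 10 down to 1 across the ten digits.
    return sum(w * d for w, d in zip(range(10, 0, -1), digits)) % 11 == 0

print(isbn10_valid("0-306-40615-2"))  # True
print(isbn10_valid("0-306-40615-3"))  # False: one corrupted digit is detected
```

This code only detects errors; codes such as Hamming or Reed-Solomon codes add enough redundancy to locate and correct them as well.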

There are three types of errors that occur in data: Single Bit, Multiple Bit, and Burst errors. A Single Bit error is the corruption of one bit of data; Multiple Bit errors involve the corruption of two or more unconnected data bits; and Burst Errors involve the corruption...
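The three error types above differ only in which bit positions of a codeword get corrupted, which a small helper can demonstrate (a minimal sketch; the codeword and helper name are invented for illustration):

```python
def flip_bits(word: list, positions) -> list:
    """Return a copy of the codeword with the given bit positions flipped."""
    out = word[:]
    for p in positions:
        out[p] ^= 1  # XOR with 1 flips a single bit
    return out

codeword = [1, 0, 1, 1, 0, 0, 1, 0]
single   = flip_bits(codeword, [3])        # single-bit error
multiple = flip_bits(codeword, [1, 6])     # multiple-bit error, unconnected positions
burst    = flip_bits(codeword, [2, 3, 4])  # burst error, consecutive positions
```

Burst errors matter in practice because physical causes (a scratch on a CD, a lightning spike on a line) tend to corrupt runs of adjacent bits rather than isolated ones, and codes are often designed with that pattern in mind.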