The Math of Today -- April 30
Claude Shannon and Information Theory
Claude Shannon, “the father of information theory”, was born on April 30, 1916. Shannon was an American mathematician, electrical engineer, and cryptographer. His most famous contribution, information theory, is now used widely in data compression, cryptography, and quantum computing.
You may use the word “bit” a lot and wonder where this strange word comes from. Well, it comes from Shannon! In 1948, Shannon published a famous paper, A Mathematical Theory of Communication, in which he used the “bit”, short for “binary digit” (a term he credited to his colleague John Tukey), as the basic unit of information. After that, this unit of information was adopted and used all over the world.
Thanks to Shannon, we can now use math to measure information. The word “entropy” describes the uncertainty of a message: the higher the entropy, the more possible combinations there are, and the harder it is to guess the “real” one. This concept is also used in the study of languages. English has an estimated entropy of about 4.03 bits per letter, while Chinese has about 9.65 bits per character; since Chinese draws on thousands of distinct characters rather than 26 letters, each symbol carries far more information. That might be one reason why Chinese is harder to learn.
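To make “uncertainty” concrete: Shannon defined the entropy of a source as H = −Σ pᵢ log₂(pᵢ), where pᵢ is the probability of each symbol. Here is a minimal Python sketch (not from Shannon’s paper; the function name and sample strings are just for illustration) that estimates entropy from the symbol frequencies in a string:

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Estimate the average information per symbol, in bits,
    using the observed frequency of each symbol as its probability."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Two equally likely symbols (like a fair coin flip) give exactly 1 bit.
print(shannon_entropy("HT"))  # 1.0

# Real English text is far from uniform, so its per-letter entropy falls
# well below log2(26) ≈ 4.7 bits; estimates like the 4.03 bits mentioned
# above come from frequency counts over large corpora.
print(shannon_entropy("the quick brown fox jumps over the lazy dog"))
```

Note that this frequency-count estimate only captures how unevenly individual symbols are used; measuring a language’s true entropy also has to account for patterns across symbols, which is why corpus-based figures take much more work to compute.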