A 1948 paper by Claude Shannon SM ’37, PhD ’40 created the field of information theory — and set its research agenda for the next 50 years.
Larry Hardesty, MIT News Office
January 19, 2010
It’s the early 1980s, and you’re an equipment manufacturer for the fledgling personal-computer market. For years, modems that send data over telephone lines have been stuck at a maximum rate of 9.6 kilobits per second: if you try to push the rate any higher, an intolerable number of errors creeps into the data.
Then a group of engineers demonstrates that newly devised error-correcting codes can boost a modem’s transmission rate by 25 percent. You sense a business opportunity. Are there codes that can drive the data rate even higher? If so, how much higher? And what are those codes?
In fact, by the early 1980s, the answers to the first two questions were more than 30 years old. They’d been supplied in 1948 by Claude Shannon SM ’37, PhD ’40 in a groundbreaking paper that essentially created the discipline of information theory. “People who know Shannon’s work throughout science think it’s just one of the most brilliant things they’ve ever seen,” says David Forney, an adjunct professor in MIT’s Laboratory for Information and Decision Systems.
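Shannon’s answer to “how much higher?” is his channel-capacity theorem: a channel with bandwidth B and signal-to-noise ratio S/N can carry at most B·log2(1 + S/N) bits per second with an arbitrarily low error rate. The sketch below is a back-of-the-envelope illustration, not part of the original article; the 3 kHz bandwidth and 30 dB signal-to-noise figures are assumed as rough values for an analog voice-band phone line.

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon-Hartley channel capacity in bits per second.

    C = B * log2(1 + S/N), where S/N is the linear signal-to-noise ratio.
    """
    snr_linear = 10 ** (snr_db / 10)  # convert dB to a linear ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Assumed, illustrative figures for a voice-band telephone channel:
# about 3 kHz of usable bandwidth and roughly 30 dB signal-to-noise ratio.
capacity_bps = shannon_capacity(bandwidth_hz=3000, snr_db=30)
print(f"Approximate capacity: {capacity_bps / 1000:.1f} kbit/s")
# Prints roughly 30 kbit/s -- which suggests why voice-band modems could
# climb well past 9.6 kbit/s, but only so far.
```

Under these assumed numbers, the capacity works out to about 30 kilobits per second, so the engineers’ 25 percent improvement still left plenty of headroom below Shannon’s limit.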
http://web.mit.edu/newsoffice/2010/explained-shannon-0115.html