With a 1948 research paper, "A Mathematical Theory of Communication," Claude Shannon almost single-handedly invented the idea of information. Before then, engineers would design logical circuits, or work to reduce noise in telephone transmission lines, on a case-by-case basis, not realizing their separate efforts could be unified under a single theory, around which an entire body of scientific work would be built.
Shannon was the first to realize that all the various systems for transmitting intelligence — telegraphy, radio, television — had the same fundamental characteristics, no matter the sender, receiver, or content.
"Building circuits in those days was an art, with all the mess and false starts and indefinable intuition that 'art' implies," wrote Jimmy Soni and Rob Goodman in their 2017 biography "A Mind at Play: How Claude Shannon Invented the Information Age." After Shannon, "designing circuits was no longer an exercise in intuition; it was a science of equations and shortcut rules."
Fortuitously, his work proved instrumental to the then-emerging science of building computational machines.
At 13, Claude Shannon won first place in a Boy Scout contest for Morse code signaling using only the body. He "had something of the machine" in him, the biographers wrote. His time as a student at the Massachusetts Institute of Technology seemed to cement this tendency. There, "math and engineering were an extension of the metal shop and the wood shop," the biographers wrote. "A man learns to use the calculus as he learns to use the chisel or the file."
Given a pipe wrench, an engineer should be able to write the words that would recreate that wrench exactly; and given those words, an engineer should be able to build that pipe wrench correctly. That was the introduction to engineering offered by computing pioneer Vannevar Bush, under whom Shannon did his graduate work.