Has there been a more influential technology company than Intel Corporation? Founded in 1968 by Robert Noyce and Gordon Moore (of Moore's Law fame), and built up by early employee Andrew Grove, Intel has produced CPUs central to nearly every major advance in computing. Let's take a look at eight of Intel's greatest hits.
Intel 8080 puts microprocessors in everyday items
Until the release of the Intel 8080 in 1974, manufacturers designed and produced whole computers from system-specific components. After the 8080, they began designing interchangeable components around the CPU, spawning the hobbyist computer-building scene that laid the groundwork for the PC boom of the '80s. The 8080 was at the center of the MITS Altair 8800, the machine for which Microsoft's Bill Gates and Paul Allen wrote BASIC. The rest is history.
MCS-48 microcontroller computerizes toys and games
Introduced in 1977, the MCS-48 microcontroller series was used in consumer products like television remote controls, musical instruments, toys, and games well past the turn of the century. For two decades it was among the most cost-effective ways to put a CPU, RAM, and ROM together on a single chip. Notable items driven by the MCS-48 include the original IBM PC keyboard, the Roland Jupiter-4 synthesizer, and the original Donkey Kong arcade game.
2910 single-chip codec revolutionizes the telephone
This silicon chip made obsolete the giant analog telephone switchboards familiar from old black-and-white films. Intel engineers developed a system of chips that converted analog telephone signals into digital data using a technique called pulse-code modulation (PCM), eventually packing the circuitry into a single device: the 2910 single-chip codec, introduced in 1977. Once calls were digital, a single trunk line could carry thousands of multiplexed calls instead of the one call per line of the analog era.
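To make the idea concrete, here is a minimal Python sketch of pulse-code modulation (our own illustration, not Intel's circuit design): sample the analog waveform at the telephone rate of 8 kHz and quantize each sample to 8 bits. Real telephone codecs such as the 2910 used logarithmic (mu-law or A-law) companding rather than this simple linear quantization.

```python
import math

SAMPLE_RATE = 8000  # telephone-grade PCM: 8,000 samples per second
BITS = 8            # 8 bits per sample -> 64 kbit/s per voice channel

def pcm_encode(signal, duration):
    """Sample a continuous signal and quantize each sample to 8 bits.

    `signal` maps time in seconds to an amplitude in [-1.0, 1.0];
    returns integer codes in [0, 255]. Linear quantization is used
    here purely for illustration.
    """
    levels = 2 ** BITS
    codes = []
    for n in range(int(duration * SAMPLE_RATE)):
        t = n / SAMPLE_RATE
        amp = max(-1.0, min(1.0, signal(t)))   # clip to the valid range
        codes.append(int((amp + 1.0) / 2.0 * (levels - 1)))  # map to 0..255
    return codes

# Encode 1 millisecond of a 1 kHz test tone (8 samples at 8 kHz).
print(pcm_encode(lambda t: math.sin(2 * math.pi * 1000 * t), 0.001))
```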
Intel 8086—the most important architecture in computer history
The 8086 is most notable as the first chip in what is arguably the most important design in modern computing: the x86 CPU family. Its near-twin, the 8088 (an 8086 with an 8-bit external bus), powered the original IBM PC introduced in 1981, and the x86 instruction set endures at the core of modern PCs and servers. NASA used original 8086 CPUs in ground-based Space Shuttle maintenance equipment from the shuttle's 1981 rollout until the end of the program in 2011.
Moore’s Law defines how we think about components
While technically not a product, Moore's Law continues to shape how the industry forecasts R&D and plans business technology buying cycles. Gordon Moore first observed in 1965 (and refined in 1975) that the number of transistors in an integrated circuit doubles approximately every two years, and the semiconductor development cycle has tracked that observation ever since.
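As a back-of-the-envelope illustration (our own arithmetic, not from the article), the observation can be written as N(t) = N0 * 2^(t/T), where T is the doubling period of roughly two years:

```python
def projected_transistors(base_count, years, doubling_period=2.0):
    """Moore's Law as a formula: N(t) = N0 * 2 ** (t / T), T ~= 2 years."""
    return base_count * 2 ** (years / doubling_period)

# Starting from the 8086's roughly 29,000 transistors in 1978 and
# projecting forward 20 years gives about 29.7 million, the right
# order of magnitude for CPUs of the late 1990s.
print(round(projected_transistors(29_000, 20)))  # 29696000
```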
Intel Xeon processors drive the Internet
What x86 is to consumer PCs, the Xeon family is to high-performance servers. Since the first Xeon launched in 1998, the line has changed the way we think about server performance, bringing multi-socket configurations, ever-higher core counts, ECC memory support, and hardware-assisted virtualization to mainstream machines. Xeon processors now provide the computing backbone for data centers across the world; it is often said that the Internet lives on servers driven by Intel Xeon CPUs.
Intel Atom spawns low-cost, lightweight computing
Enthusiasts panned this low-power, low-voltage chip when it launched in 2008. Many of those critics have since eaten crow, especially those who came to realize that Atom chips pack all the power they need to be productive into convenient, lightweight 2-in-1 hybrid devices. And the Atom's legacy is far from fully realized: the chip sits at the heart of mobile technology, robotics, and the Internet of Things (IoT) movement.
Intel gives Stephen Hawking a better voice
Perhaps the greatest mind of his generation, Stephen Hawking, who suffers from ALS, has relied on computers to speak since 1985. By 2011 the facial-cue technology he used to communicate had become too cumbersome and slow, so he asked Intel CTO Justin Rattner whether anything could be done. Intel's engineers developed a new user interface that still operated on facial cues but added contextual menus and a host of shortcuts. Check out this WIRED article for an in-depth look at its development.
What have we left off? Which Intel innovations would you say are the most influential in the history of computers?
