At the beginning of the computer era, before the term “personal computer” was even born, you pretty much had to be a programmer working in a command-line operating system (OS) even to turn a computer on and get the lights flickering. High-tech trailblazers started bringing computer technology to people’s desks in the mid-1970s, most famously the “two Steves” (Jobs and Wozniak), whose Apple II became the first successful “personal computer.”
By 1985, the PC boom was going like gangbusters, and the ongoing refinement of the graphical user interface (GUI), first on the Macintosh, later in Windows, and now in Linux, too, put amazing amounts of power at every user’s fingertips. This so-called “democratization of computers” brought millions of people into the new, exciting, and seemingly limitless Internet Age.
Peripherals matter

Despite the advent of easy-to-use computers (perhaps “easier to use” is more accurate), some elements of the user experience stayed with the “geeks,” those engineers and programmers who got the whole ball rolling in the first place. Makers of multifunction office devices, computer-aided machining and design (CAM/CAD) systems, RAID backup systems, and disc duplicators have had a somewhat harder time “democratizing” their product areas, a struggle that was common throughout the 1990s.