Following Your Nose
It was Bill Gates who first mapped out the future of Apple. He did so on January 5, 2000, at the Consumer Electronics Show in Las Vegas, Nevada. Of course, he’d intended to lay out a game plan for Microsoft, not Apple. But that’s not the way things worked out.
CES was an up-and-coming trade show back then. For years it had been the gathering point for people making everything from car speakers to stereo systems to televisions, from electronic football games that beeped as you pushed the buttons to video cameras to home security systems. The arrival of computer companies transformed the event, and within a few years it would become the largest digital technology exposition of them all, drawing audiences of more than 150,000 and all but paralyzing Sin City for a week each January. Apple didn’t attend CES. Steve preferred to announce his products in an environment he controlled.
Microsoft didn’t control CES, but it certainly overshadowed everyone else. Chairman Gates, who relinquished his CEO title to Steve Ballmer in 2000, gave the keynote speech eight years running. Gates was a natural choice as the show’s semipermanent celebrity speaker, and he used the dais as a bully pulpit. In 2000, Microsoft really was the computer industry. Some 90 percent of the world’s personal computers ran its Windows operating system. Its software managed not only desktop and laptop PCs but also the servers that stored and organized the data of the world’s biggest corporations, and that undergirded the information technology of most governmental bureaucracies. Inside ATMs and cash registers, at airline check-in counters, and on the decks of aircraft carriers, Microsoft software made the world’s most sophisticated technologies hum. If the consumer electronics universe was about to be thrown into turmoil, who better to hear from than the leader of the industry doing the disrupting?
That evening, Gates spoke to a standing-room-only crowd of more than three thousand people at the Las Vegas Hilton Theater, where he revealed how Microsoft would “usher in the ‘consumer-electronics-plus’ era.” PCs running the Windows operating system would become the central component of “home media centers” that would harness the Internet and interact with consumer devices and even household appliances, all loaded with Microsoft software. This would be a bonanza for consumers, he explained, because they would now get “personalized, convenient access to their favorite music, news, entertainment, family photos and email through an array of consumer electronics, including televisions, telephones, home and car stereos, and Pocket PCs.”
The speech was a forecast, a warning, and a blueprint. Gates posited a vision of what the home would look like after the realization and interweaving of a set of trends. There would be much more connectivity among devices, access to a new range of digital content and programming via the Internet, newly interactive video games played at home, and gizmos with responsive screens and software smarts to replace mere electronic gadgets with push buttons. This is what we are going to do to your world, Gates was telling the manufacturers of consumer electronics. It is coming whether you like it or not, because this is what digital technology does to an industry. So get on board, you old-timers tinkering with microwave ovens and car stereos and televisions and headphones. Here’s how you can fit into your own future, which actually belongs to us!
Such was Microsoft’s power, at that moment, as the unquestioned ruler of the empire of computing. The company had so thoroughly infiltrated and then controlled every aspect of the world’s defining digital technology that it seemed obvious to most everyone attending CES that if this was the future Microsoft wanted, this was the future. The obvious implication that Gates left unsaid was that this would be an enormous bonanza for Microsoft, which, by establishing the specifications that all kinds of hardware manufacturers would need to follow, would ensure its own dominance in the next brave new world.
Ruling the market for new consumer electronics devices might have solved Gates’s biggest problem: the fact that Microsoft was no longer growing at the galloping 25-plus percent pace investors like to see in a tech company. Remember that when Bill and Steve got into the business, computing still belonged to the IBMs and DECs of the world, with their big, expensive machines sold into a market consisting of a few hundred corporations, governments, and universities. As Moore’s law drove prices down, PC manufacturers sold their wares to a galaxy of other businesses, both big and small, that could now afford powerful computing that would make them more efficient. But numerically speaking, the biggest potential audience of all was relatively untapped. Once you can sell computing to consumers directly, and once you get computing into products that become part of their everyday lives, the volumes become transformative. Consider this: According to researchers at the Gartner Group, 355 million personal computers—servers, desktop PCs, and laptops—were sold around the world in 2011. Some 1.8 billion cellphones were sold the same year. And that’s a number that doesn’t include all the other kinds of computing-based or networkable devices that might become part of a consumer’s life, including video game consoles, audio players, radios, thermostats, car navigation systems, and anything else that can become smarter through the power of connected computing.
Gates, who is perhaps the world’s shrewdest business strategist, saw this future coming. And he expected Microsoft to garner the same slice of this world that it had of the computing world. After all, who else could possibly define the standards for digital interaction between devices? This had been Gates’s game: envisioning and delivering the future. The scale of his concerns and ambitions dwarfed Steve’s. He wanted Microsoft software on billions of devices; Steve just wanted anything that would help him sell a few thousand more Macs each month. Gates was the only one who could reasonably think about dominating his awkwardly named but clearly inevitable “consumer-electronics-plus” era. He was powerful, and very, very smart: despite his penchant for dense verbiage, he had done a wonderful job describing the future of computing as we now have it, some fifteen years later. All he and Steve Ballmer had to do was execute the strategy. If they could, they would steer the company through its transition to this future, and in so doing return Microsoft to the kind of growth that investors wanted to see.
No one knew it at the time, but Gates’s speech that January evening in Las Vegas marked the apex of Microsoft’s hegemony. On December 31, 1999, the company had been worth $619.3 billion, with a share price of $58.38. It would never be worth more.
Instead, a company still struggling to survive on the fringes of computing would execute Gates’s vision. It would do so by moving incrementally, by following its nose where the technology led, and by being opportunistic. Over the next few years, Steve Jobs would steer Apple toward a whole new rhythm of doing business. No one would have guessed it then, but the future belonged to Apple, not Microsoft.


WHEN WORD GOT back to Cupertino of Bill’s ambitious CES presentation, Avie Tevanian and Jon Rubinstein persuaded Steve to convene an emergency off-site executive staff meeting at the Garden Court Hotel in downtown Palo Alto to rethink where Apple was headed. “Bill Gates was already talking about what we would end up calling our ‘digital hub’ strategy,” recalls Mike Slade. “So I just cribbed his talk and pitched it to Steve at the off-site meeting. I said, ‘Shouldn’t we be doing this? We can’t let Microsoft do it. They’ll just screw it up!’ ”
Apple employees had never had much respect for Microsoft’s ability to create anything but ungainly, confusing, and half-baked technologies for consumers. The animus went back decades. Even though Microsoft Word, Excel, and PowerPoint were instrumental in the early success of the Mac, Microsoft’s unforgivable sin, from the vantage point of Cupertino, was its derivative creation of Windows. Steve was being expedient when he offered to abandon Apple’s long-standing lawsuit against Microsoft to seal the deal with Gates upon his return in 1997. But folks at Apple still considered Windows a rip-off of Apple’s ideas, pure and simple. Worse yet, they saw it as an inelegant theft, and one that got imposed on the world by a kind of bullying that Apple both despised and envied.
Steve’s team sincerely believed that a world defined by Microsoft’s “consumer-electronics-plus” vision would be as ugly as that godforsaken name. In 2000, if anyone needed evidence of how ham-handed Microsoft could be when it tried to befriend actual humans, as opposed to the corporate buyers it had always really cared about, all they had to do was open up Word or Excel or PowerPoint on a PC, where they would be greeted by an animated digital “concierge” called “Clippit.” An anthropomorphized talking paper clip that was intended to be an informal help center for users of the Office suite of productivity applications, Clippit was, in the minds of many users, a patronizing, useless abomination that was frustratingly difficult to banish from your PC screen. Time magazine would eventually call it one of the fifty worst inventions ever, right up there with Agent Orange, subprime mortgages, and the Ford Pinto.
The team at Apple could not abide the idea of letting the creators of Clippit establish the look and feel of whatever new world of consumer computing, communications, and digital media was emerging. They wanted the new consumer digital technologies to be held to the highest standards of elegance, beauty, and simplicity. Apple had always displayed a sense of style and design that was unmatched by anyone in the computing business. All you had to do was compare an iMac to the average PC.
Gates always knew that he could never hope to approximate Steve’s aesthetic sensibility. “He had an expectation of superlative things in his own work and in the products they would create,” he says. “Steve had a design mind-set. When I get to a hotel room, I don’t go, ‘Oh, this bedside table is so poorly designed, look at this, this could have been so much better.’ When I look at a car, I don’t say, ‘Oh, if I had designed this car I would have done this and this.’ People like Jony Ive and Steve Jobs are always looking at stuff that way. You know, I look at code and say, ‘Okay, this is architected well,’ but it’s just a different way of understanding the world. His most natural, innate sense was a world-class instinct about whether this or that object met certain standards. He had extremely high standards of what was shit, and what was not shit.” By those standards, Steve’s executive team was right: Microsoft and Apple had dramatically different notions of what constituted acceptable design, much less great design. If these applications and devices were to become as ubiquitous as Gates proclaimed, this was a rare opportunity to establish a benchmark for the functional and stylistic aesthetics of how the average person would deal with digital technology.
Apple had already dipped its toe into this emerging market with a well-designed but ill-chosen application called iMovie. It was introduced at precisely the moment when affordable digital video cameras from Japanese manufacturers like Sony, JVC, and Panasonic were beginning to hit the market. Steve had thought that an elegant and simple movie-editing application was just what the buyers of those cameras would need. iMovie was sophisticated software that radically simplified the tedious process of editing jerky amateur video into slick home movies with almost professional-quality production values. But if iMovie was proof that Apple could create cool consumer software, it was also proof that the consumer market could be diabolically hard to predict. iMovie was an elegant solution to a problem consumers weren’t yet dying to have solved.
In October 1999, Steve introduced iMovie as part of the rollout of a new generation of juiced-up iMacs. But sales were sluggish. Steve blamed himself for not explaining it well enough. So at an executive team meeting in December 1999, Steve gave early prototypes of new Sony digital camcorders to six of his top execs, asking each to shoot and edit his own four-minute home movie, with the finished productions to be shown in a week. He would pick the best of the bunch to show during his appearance at the January 2000 Macworld in an effort to demonstrate how iMovie was something anyone could master over a weekend.
“Fred [Anderson], Ruby [Jon Rubinstein], Avie [Tevanian], Tim [Cook], Sina [Tamaddon], Steve, and me all made four-minute movies. I’ll be honest, it was a painfully cumbersome process, even for geeks like us,” remembers Slade. “You had to shoot the movie, then spool the video into the iMac, edit it, add music and credits, and then spool it back out onto the camcorder because the hard disk wasn’t big enough to hold both the original video clips and the finished movie, and we didn’t yet have recordable DVD drives. Many of us thought it was a pretty worthless strategy.
“But the movies were pretty funny,” he allows. “I had little kids back then, so I showed them playing in the leaves on a fall day with Van Morrison’s ‘Tupelo Honey’ as the background music. Steve’s was about his kids, too. And Fred, well, apparently his life was so boring that all he could do was make a movie about his goddamn cat. Tim Cook made one about trying to buy a house in Palo Alto, and how overpriced they were. I thought Ruby’s was the best, though. He had been on a business trip to Dallas on his birthday that week, so he made this totally deadpan movie of the highlights of his day, where he had scenes sitting alone in his hotel room, and in conference rooms, and other boring places showing himself saying ‘Happy Birthday, Jon. Woohoo!’ everywhere he went. And Sina made a beautiful one about his kids playing with their pets and jumping on the bed to a Green Day song.” (That’s the one Steve chose for Macworld.)
The short little movies may have been fun to watch, but most of them had taken many hours to create. Movie editing, even when simplified by iMovie, was a process that required time, dedication, and skill. It was the kind of thing that a parent might do once in a while, but only in rare cases when he or she had a lot of free time over the weekend. It wasn’t until after the Garden Court off-site convened by Avie and Ruby that Steve acknowledged that Apple needed to create a much simpler consumer application than iMovie, something that users could engage with easily every day. The consensus at the meeting was that a digital music management application seemed like a good possibility. Rather than dig in his heels and insist on greater effort to make iMovie a hit, Steve chose to follow his team into the world of digital music. The big question now was whether Apple could move fast enough to make up for arriving so late to that party.


IT’S NOT SURPRISING that Steve had been so attracted to iMovie, since it was a piece of software designed primarily for parents. He and Laurene now had three children, after the birth of Eve in 1998, and by the turn of the century had settled into a relatively predictable and normal domestic routine.
Steve’s ability to compartmentalize and focus, qualities that were helping him turn around Apple, also shaped the way he balanced his work and family life. Back when he had been leading the Mac team or driving NeXT, Steve had spent many a late night at the office as part of a small team trying to deliver the next great thing. But now his role at Apple was so different: heading up a company with thousands of employees, Steve managed everything through his small team of senior executives. Rather than hover over the shoulders of star engineers and programmers, he could do much of his work via email. So he would make it home for dinner almost every night, spend time with Laurene and the kids, and then work at his computer late into the night. He and I were iChat buddies at the time, and I would regularly see the green light next to his name on my screen in the wee hours, an indication that he was logged in to his Mac. (iChat was Apple’s video chat application, and there were times when we used it to talk about business, although sometimes his son, Reed, then an early teen, would sneak up behind Steve and make faces at me as we talked.)
On a spectrum plotting how much time parents spend with their kids versus time they spend focused on their job, Steve would land far toward the latter end. Both he and Laurene knew Steve would always work very, very hard—it had been a basic assumption when they’d gotten married. “Neither of us had much of a social life,” says Laurene. “It was never that important to us.” Laurene often worked beside him at night, at first on Terravera, a small health food business she eventually sold, and then on College Track, her first philanthropic venture. They had adjoining studies; she’d run ideas past him, and on many nights he’d spend an hour or two talking over Apple business with her. They’d often catch a TV show before falling asleep, mostly The Daily Show with Jon Stewart after it launched in 1999. The bulk of the parenting did fall on Laurene, but they scheduled their lives to ensure that Steve was involved. The Christmas holidays were often spent in Hawaii, mostly at a bungalow at the Kona Village Resort on the Big Island.
Besides creating a schedule that accommodated Steve’s heavy workload, the couple did everything they could to try to give their children what Steve himself defined as a “normal” life. He and Laurene created an environment that hewed to what can best be described as upper-middle-class norms. Over the years, their neighborhood was increasingly populated by the rich and famous (Larry Page of Google lived nearby, and Steve Young, the famous San Francisco 49ers quarterback, was a neighbor), but Steve and Laurene did everything they could to make their house feel as homey as possible. It was not a walled compound. The front door opened right to the street. The children roamed the neighborhood. The family biked around the area together.
Very slowly, Steve and Laurene even added furniture. “Those stories are true,” Laurene sighs, albeit with a chuckle. “He truly could take forever to decide on stuff like that, but then so could I.” While you could see the telltale signs of children around, it usually was far neater than my own house—having a staff can help with that. As lovely as it was inside, I always thought the heart of the place was the rambling vegetable and flower garden outside the kitchen door. It was the property’s most distinct feature and completely unlike the landscaping that graced other homes in the area. When I visited I’d sometimes catch Steve having just finished up in the garden, or Laurene walking in with one of the kids and a basket of freshly picked veggies and flowers.
This was his refuge. Although colleagues occasionally would visit him there, he tried to keep his home and family life completely sequestered from the press. As he did with other journalists who knew him well, it was understood that any discussions we ever had about his family were off the record—when I wrote in Fortune about my own kids coming over to see Toy Story with his son, Reed, I cleared it with him first.
But Steve and Laurene didn’t make any effort to hide away from their neighbors. They were regulars in downtown Palo Alto. Fortune had its Silicon Valley bureau offices on Emerson Street, just up the street from a building Steve had bought to use as an office closer to home. He didn’t use it all that much, but when he did it wasn’t unusual to see him out for a walk with a colleague, or running a personal errand by himself. (When Fortune eventually closed the bureau as part of a series of cost-cutting moves, I told Laurene about it, and she leased the space as the headquarters of a nonprofit she was starting, which she called the Emerson Collective.) Once when I ran into Steve we wound up shopping for a new bicycle for Laurene’s upcoming birthday. Steve had done his research, so it didn’t take long. We were in and out of Palo Alto Bicycles on University Avenue in ten minutes. He said, “I’d never have Andrea do something like this,” referring to his longtime administrative assistant. “I like buying presents for my family myself.”
These normal encounters with someone who, to repeat Catmull’s pithy phrase, “veered so far from the mean,” were memorable enough that dozens of people, after Steve’s death, wrote about such meetings on Quora, an online question-and-answer site popular with Silicon Valley types. A designer named Tim Smith described the time his old Sunbeam Alpine sports car stalled in front of the Jobs driveway. Laurene came out and brought him a beer while he tried to figure out what to do, and then she offered to call a friend of theirs who knew a lot about Sunbeams. When the friend came by—dressed in a tuxedo for a night out—Steve emerged from the house with Reed. Steve got in the car and tried to crank it while his friend worked under the hood trying to get the thing going, but nothing worked. As Smith writes online: “I have to stop here—it’s a Kodak moment—something you want to remember. It’s a beautiful Fall evening in Palo Alto. Your car’s broken. A formally dressed close friend of Steve Jobs is under the hood working on your engine. You are talking with Steve’s absolutely lovely and down-to-earth wife. Steve is in the car, with his kid, trying to crank it. You don’t often get close to people like Jobs, much less in a ridiculous situation like this, where you realize that they are just really good people. They’re normal, funny, charitable, real people. Not the people the press talks about. Steve is not the maniacal business and design despot the media loves to portray—well he is, but not always.”
This was a side of Steve’s life that was seldom seen, and he made no attempt to publicize it. The general myth of Steve as a brilliant and driven egotist, who would sacrifice or shove aside anything or anyone for his career, carried the unfortunate corollary that he must have been a bad father and friend, and a man incapable of caring and love. It was a stereotype that never came close to jibing with my own experience of him.
Contrary to that caricature, and unlike most other CEOs I had interviewed at Fortune and the Wall Street Journal, Steve always seemed human and spontaneous, with a penchant for honesty that stung and yet rang true. True, some of this could turn negative: he could be scathing when he disagreed with something Fortune had published, and more than once I heard him sneer condescendingly at certain colleagues of mine with unreserved arrogance. But he could also be goofy: Once, when he was telling me that a new software interface was “good enough to lick,” he actually leaned forward and licked the screen of the 27-inch cinema-display monitor in front of a whole room full of engineers. And he could be deeply funny in the most disarming ways: One time I came to interview him wearing a loud silk shirt with wavy vertical navy blue stripes separating rows of dozens of large, blossomy, bloodred figures, each about three inches across. Those splotches really leaped from that shirt. When I walked into the conference room, Steve looked me up and down and quipped, “Did you have a meeting with a firing squad before you came to see me?” He paused for effect and then cackled. He would cut loose with a good belly laugh when he was truly amused; according to Laurene, she heard it most when he was cracking wise with the kids around the house.
It isn’t that I looked at Steve and saw a model father. I knew how hard he worked, and that his relentless drive carried a personal cost. But I had been given a look inside his home life over the years, and it seemed every bit as authentic as that of my own friends and colleagues. These stories on Quora, and the moments I experienced with him around Palo Alto or at his house, are mundane. But as time went on I came to realize that this was exactly the point: he craved a certain normalcy in his life, and he was able to get that most at home. With his family. They provided a therapeutic—and very human—outlet that he needed, especially in contrast to Apple, where he was gearing up to dive head-first into an uncertain future.


IF IMOVIE HAD been a sort of exploratory mission into the world of digital applications for consumers, iTunes would prove to be the expedition itself. Armed with a leadership team he trusted more and more, his keen aesthetic sensibility, a belief that the intersection of the arts and technology could lead to amazing things, and the growing understanding that great ideas develop in fits and starts, Steve was ready to see what Apple could bring to the world of music. In hindsight, of course, this seems like such an obvious course of action. But as in all of the most challenging and eventually rewarding journeys, there was little certainty at the outset of where they would end up. Steve would just have to follow his nose.
He had always loved music, but like many people in their forties, the playlist he returned to was pretty well established. Steve and I talked about the Beatles and Dylan, and sometimes one or the other of us would carp about something new we didn’t like so much. You can come to seem like an old fogey pretty fast when it comes to music, and in this way Steve was no different from anyone else.
This might help explain why Steve did not react earlier to the explosion, in the late 1990s, of digital sound formats for storing and playing music on a personal computer. During that period, several startups began dabbling with “jukebox” applications to manage MP3s—the shorthand term for digital files that contained, in compressed form, recorded music that had been “ripped” (in other words, copied) from an audio CD onto a PC’s hard disk. Others developed their own encrypted compression algorithms in hopes of convincing the recording industry to adopt their technology and build a new business model for music to be sold directly to consumers online. Two, in fact, had been started or financed by Microsoft alumni—RealNetworks and Liquid Audio.
Then there was Napster, the brainchild of a Massachusetts teenager named Shawn Fanning. Napster was the software application that really blew the lid off things. In the summer of 1999, Fanning concocted a “peer-to-peer” file-sharing service that allowed individuals around the world—conceivably anyone with a computer and an Internet connection—to upload and download MP3s, creating a way for people to share their own music collections with one another. Since the files were in digital form, the free copies were practically indistinguishable from the originals. It was one of the first truly “viral” Internet applications, a genuine killer app, that attracted tens of millions of users within months. It also was illegal. Napster facilitated the widespread piracy of recorded music, triggering a wholesale behavioral shift among music consumers that would eventually all but wreck the recording industry’s traditional business model. The courts would shut down Napster in 2001, but not before it had become a cultural sensation, and Shawn Fanning a celebrity worthy of the cover of Time magazine.
All this had gotten rolling while Steve had been busy stabilizing Apple. He had been preoccupied with the problems directly in front of him: rationalizing inventory, stabilizing cash flow, trimming head count, assembling a new management team, and reviving advertising and marketing, not to mention supervising the design of new products. Steve’s intense focus had been on Apple’s internal needs and issues. Music hovered on the periphery of his narrowed field of vision. But now he realized that Apple had to move into music, and fast.
The story of Apple’s move into digital music is the tale of a man, and a team, learning how to adapt over and over again on the fly. Steve had solidified the company by narrowing its product lines so that Apple could once again produce distinctive computers. He had reaffirmed the company’s mission, for employees and for customers, with ingenious marketing and respectable financial results. But Apple’s product portfolio was still built around computers. Now that Steve was beginning to sense that the merger of consumer electronics and computers was emerging as a critical growth market, Apple’s metabolism, and many of Steve’s old habits, would have to shift. Starting with the creation of iTunes, Apple had to become a far more nimble company than it had ever been in the past. Steve had displayed a newfound openness by agreeing that the company had to move quickly past iMovie and into digital music. Now he’d have to maintain that same willingness to be flexible, and to follow his nose, wherever it led.
Historically, Steve had always preferred that Apple create its own software from scratch—he didn’t trust anyone as much as he trusted his own people. But since Apple was coming so late to digital music, it would not have time to develop a music management program on its own. So Steve decided to go shopping for an existing jukebox app that Apple could adapt to its own style.
Three independent developers had already created jukeboxes for the Macintosh. The best of the bunch was a forty-dollar application called SoundJam, which happened to be developed by two former Apple software engineers. SoundJam was also of interest to Steve because at its heart was a sophisticated database program that would allow music to be cataloged by more than a dozen attributes. It was a favorite of so-called power users who had large libraries of thousands of music tracks to manage. It was simple to navigate and operate, and it could import music files directly from audio CDs and compress them in a variety of formats into smaller chunks of digital data.
In March 2000, Apple bought SoundJam and attached some unusual terms: the authors of SoundJam would come to work for Apple, but their software distributor could continue selling the existing SoundJam product until Apple had reengineered it into iTunes. The other catch was that the whole transaction would be kept secret for two years. There would be no public indication that anything had changed at SoundJam, the distributor and the SoundJam programmers would continue to make money, and Apple could keep its designs on building a jukebox application under wraps. Secrecy was key, since so many parties—studios, consumer electronics manufacturers, tech companies, broadcasters—were trying to find a way to lead digital music. Apple had been a leaky ship during its early years and throughout the Sculley/Spindler/Amelio era. But Steve had eradicated that problem by making it more than clear that anyone caught leaking company information or plans would be fired immediately. So the transaction stayed a secret, as he wished.
Tamaddon’s applications division, which had learned a lot from the development of iMovie, moved with speed and a minimum of fuss. The SoundJam team was integrated in seamlessly. Its developers worked directly with Avie and Sina to improve some attributes of the old program, including Steve’s favorite—a psychedelic “visualizer” feature that generated trippy, colorful, abstract moving full-screen images derived from whatever music was playing. More important, they simplified the software, eliminating options and complexity whenever they could. This, too, it turns out, would become a hallmark of the new Apple that Steve was creating: saying no—to software features, new projects, new hires, boondoggle conferences, all kinds of press queries, even to Wall Street’s desire for better guidance on future earnings, and anything else deemed extraneous or distracting. Above all, saying no became a crucial way of keeping everyone, including himself, focused on what really mattered. The sheer simplicity of the quadrant strategy had laid the foundation for an organization that would say no again and again—until it said yes, at which point it would attack the new project with fierce determination.
The iTunes team moved remarkably fast. A mere nine months after having purchased SoundJam, and just a year after Bill Gates’s public christening of the concept of a world of connected computers, consumer electronics devices, and applications, Steve was able to unveil iTunes at the MacWorld trade show in San Francisco on January 9, 2001. He had a strong set of products to show off besides iTunes, including the Titanium PowerBook, the first of what would become Apple’s exceedingly popular laptops to be clad in metal rather than plastic, and OS X, which would finally ship as a finished product in March.
But iTunes turned out to be the real star of the show, because it was something that practically everyone in the room knew that they wanted. Steve demonstrated how the software would allow you to rip an entire library of music CDs into a digital archive on your Mac’s hard drive, and how the iTunes database would help you easily find and play particular tracks. You could mix tracks into a personal playlist of music that could be stored in the app or burned back onto a recordable, portable CD. And unlike OS X, which wouldn’t ship until late March, iTunes was available for download immediately, for free. Steve then showed a television ad with a stage full of recognizable pop music stars that concluded with the slogan that would soon show up on billboards across the country: Rip. Mix. Burn. He may have been in his forties, but the campaign was totally cool.
For the first time in public, Steve also took steps on the path to co-opting Gates’s promised future. In classic Apple style, he began by reworking the language of Gates’s vision, trading “consumer-electronics-plus” for the much more felicitous “digital hub.” Energetically pacing the stage, he walked the audience through an enormous screen shot showing a Mac in the middle of six spokes extending to a digital still camera, a PDA, a DVD player, a CD Walkman, a video camcorder, and something called a digital music player. It was an image that updated his old principle of a computer as a “bicycle for the mind.” The Mac, Jobs explained, would be the ideal tool for managing, editing, and organizing content from all these devices, as well as a central repository for software updates, contacts, music and video files, and anything else you needed on your mobile devices. The computer industry’s P. T. Barnum made it all seem so much friendlier than the intimidating future Gates had painted. He made it seem accessible and human and simple. Apple promised to deliver software and hardware that you could manage and bend to your will. That was the power of the “i” in iTunes. You ruled this future, not Microsoft, or even Apple. Such was the power of Steve’s elocution.
Two days earlier, Gates had once again waxed on about what he was now calling the “digital living room” at CES. Microsoft’s booth there was tricked out to resemble a series of rooms in a typical home. Nothing about it was very realistic. When it came to the consumer’s future, Gates was the one offering airy visions of a breakthrough era, while Steve was inching ahead with real products. It was almost as if the two men had reversed roles from that interview I’d conducted at Steve’s house a decade before.
During the first week after iTunes was introduced and made available online for free, 275,000 copies were downloaded. That was just a slice of the 20 million Macs installed around the world, but it already exceeded the number of actual users of iMovie, which had been available for download for fifteen months. There was just one problem: other than the iMac sitting in the center of the octopus-like digital hub diagram Steve had shown at MacWorld, none of the connected devices had been made by Apple. That had to change.


EARLY IN 2001, toward the end of a meeting with Steve, Eddy Cue, a young software engineer with a good head for business who would come to play a key role in Steve’s executive team, bellyached. “We can’t make things better than we’re making them,” he said. “Yet we’re at the same place we were at back in 1997.” Indeed, while annual sales had reached $7.9 billion in 2000, they were projected to drop well below $6 billion in 2001. “You’ve just got to hang on,” Steve told him. “People will come around.” His patience was admirable, but then again, Steve had believed since the 1980s that the world would eventually come to recognize the superiority of Apple products. Here he was in a new millennium, still waiting on humanity. His company was stable, but it wasn’t yet strong. It needed something to get it growing. It needed a new kind of product.
The desire to create a portable digital music player arose directly from the development of iTunes: as more and more Apple execs and engineers started listening to MP3s on their computers, it was only a matter of time before they wanted to take their digital music with them in some sort of digital successor to the old Sony Walkman. The few pocket-sized MP3 players on the market were poorly designed and clumsy to use. It wasn’t so much that the sound was bad as that the procedures for loading them with music and then finding what you wanted to hear were hopelessly opaque. Steve was proud of iTunes, and especially of how easy it made it for someone to organize and manage large libraries of recorded music. Not one of the existing devices could make the most of his nifty piece of software.
The only solution, the team decided, would be for Apple itself to make something better. It was a gambit that would push the company further out of its comfort zone: the only mass-market consumer electronics product it had ever manufactured was a long-forgotten Apple-branded digital still camera from the Sculley years. Steve himself had been involved in nothing like this since the illegal “blue box” long-distance telephone dialer he and Woz built and sold back in the 1970s. Computers were Apple’s focus and raison d’être. But this group was starting to function at such a high level that they welcomed the challenge of making a new kind of device. And none of them thought a portable music player alone would be transformative, so it seemed like a low-risk gamble. The terminology they used suggested the limits of their ambitions: many of them saw a music player primarily as a “computer peripheral,” like a printer or a Wi-Fi router.
As the head of hardware engineering, Jon Rubinstein always kept his eyes open for new electronic components—processors, disk drives, memory chips, graphics technologies—that might pique Steve’s interest or give Apple a competitive edge. In late 2000, during a trip to Japan, Ruby stopped by Toshiba, the electronics giant that, among other things, made hard drives for personal computers. The Toshiba engineers told Ruby that they wanted to show him the next “big” thing in laptop hard drives—the prototype of a miniature, 5-gigabyte disk drive that wasn’t even two inches in diameter. It could fit into a cigarette pack with plenty of room to spare, and yet was capacious enough to hold thousands of digital files, whether these were images, documents, or, say, songs. Ruby couldn’t believe his eyes. This was the first thing he’d seen that had enough capacity at a small enough size to form the heart of an Apple music player. Unlike the tapes or CDs that you played in Sony’s Walkman or Discman, this hard drive would have enough disk storage to hold copies of perhaps a thousand tracks, rather than just a dozen. And its “random access” capabilities distanced it even more from the likes of a Discman, since it gave you the potential to find a particular song out of that enormous trove almost instantly.
In January 2001, Ruby asked some former Newton engineers to begin work in earnest on some sort of portable audio device around the Toshiba micro-drive. In March he put an engineer he’d hired from Philips NV, Tony Fadell, in charge of the group. Fadell, an energetic entrepreneur with the build of a college wrestler and the intensity of a high school football coach, had worked at General Magic back in the early 1990s, with Bill Atkinson, Andy Hertzfeld, and Susan Kare, veterans of the original Macintosh team, who had told him horror stories about Steve in his early days. “I expected an overbearing tyrant,” he says, “but he wasn’t like that at all. On the things he cared about he could be very intense, but in general, he was much softer, much more considerate. He wasn’t a crazy micromanager. He trusted his guys.”
No one had any idea what the end product would look like, or how users would control it, or how much it would have to function like a tiny computer itself, or how exactly it would interact with iTunes song libraries on the iMac, or even when it possibly could be shipped. All they knew were the basic requirements: that it would somehow pack the tiny hard drive, an audio amplifier powerful enough to drive headphones, a small screen to display and navigate through the music it contained, a microprocessor or microcontroller to give it enough smarts, software to make it programmable and to help it interact directly with iTunes, and a high-speed FireWire port to let it mate via a cable with a Macintosh, in the space of something that you could easily slip into a front pocket of your Levi’s. Of course it had to look cool and of course Steve wanted it as soon as possible.
In this way, Steve had not changed at all: he still presented his team with outrageous goals that seemed impossibly out of reach. But there were two things that had changed, things that improved the odds that his team could live up to his stretch targets. Steve himself was more willing to reshape his goals as the development process revealed either limitations or new opportunities. And the group he had assembled was the most talented collection of people he had ever worked with, a naturally ambitious crew that knew that Steve encouraged their spirit of constant inquisitiveness and willingness to push boundaries. “What I loved about working for Steve,” says Cue, “is that you learned that you could accomplish the impossible. Again and again.”
Another reason that Steve felt confident that Apple could create a great consumer device was that a successful music player could only be the result of a holistic mix of great hardware and software. The iPod was truly a “whole widget” challenge, as Steve described it. With a crash schedule in hand, Fadell led the group building the iPod, but contributions came from everyone on the executive team, as well as from engineers who worked elsewhere in the company. Turning Ruby’s Toshiba microdrive into the heart of a pocket-sized piece of functioning hardware was not, by any means, the biggest challenge. The hard part was creating a usable device, one that would make those thousand tracks accessible with a click or two of a switch, and that would pair simply and directly with a Mac so its owner could import copies of his iTunes digital music files, along with his custom playlists. It also would be nice to be able to display some information about each track and to take full advantage of iTunes’ ability to sort them by artist, album title, and even genre. To make all that happen, the music player would need enough smarts to host a rudimentary computer database program. The iPod, in other words, would actually be a tiny, special-purpose computer.
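That “rudimentary computer database program” amounts to a sortable table of track metadata. A minimal Python sketch of the idea, assuming hypothetical field and function names purely for illustration (this is not Apple’s code):

```python
from dataclasses import dataclass


@dataclass
class Track:
    # Metadata fields of the kind iTunes tracked and the iPod browsed by
    title: str
    artist: str
    album: str
    genre: str


def sort_tracks(tracks, field):
    """Return the library ordered by one metadata field
    (e.g. artist, album, or genre), case-insensitively."""
    return sorted(tracks, key=lambda t: getattr(t, field).lower())
```

Sorting a small library by artist or genre with a function like this mirrors the menu views the iPod would present on its screen.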
But that was just the beginning. Out of all the various aspects of computing, Steve was always most fascinated with the contact point between a person and a computer. It was the user interface that had made the Macintosh seem the epitome of a personal computer in its time. There were good reasons that Steve found this point of interaction so critical. If the point at which a person interacted with a machine was complicated, he or she would likely never unlock its secrets. Most people don’t care about the innards of their computer—they care only about what’s on the screen, and what they can get to through that screen. Steve understood the profound importance of this from the very beginning of his career. It was part of what distinguished him from so many other computer makers, most of whom were engineers who believed that a rational customer would of course care deeply about the insides of his or her computer. This bias held true nearly two decades after the introduction of the Mac. So if Apple could make its portable music device a cinch to interact with, users would revel in portable, programmable music in a way they’d never imagined possible. If Apple couldn’t do so, its machine would be a clunker like all the rest.
Getting the interface right meant blending the right software with the right hardware. Some of the software work was already done, of course: the iTunes application on the Mac was the perfect tool to create the database of music tracks and information to be loaded onto the iPod. But the portable device itself needed its own miniature operating system to provide the software underpinnings of the user interface that would be presented on the screen, much like the Mac OS established the graphical user interface that Mac users operated with a mouse and a keyboard. To accomplish this, the software team mashed up repurposed operating system code from the old Newton with the rudimentary file management system that Apple had quietly licensed from a tiny startup company called PortalPlayer and some elements from Mac OS X.
Getting the hardware right was harder. This is where Jony Ive and his team of designers really showed their mettle. They created something known as a “thumb-wheel,” which functioned in some ways like the “scroll-wheel” on many computer mice. The iPod’s thumb-wheel was basically a flat disk that you could rotate clockwise or counterclockwise with your thumb to rapidly navigate up and down the long lists displayed on the screen. But Ive and his team customized it for the iPod with a series of little touches that made it truly intuitive. The faster you spun the wheel, the quicker you moved up or down the list. In the middle of the wheel was a button you clicked to make a choice, just as you clicked the button of a Mac’s mouse. Situated around the perimeter of the thumb-wheel like a rim were other buttons that let you jump forward to the next track, restart a track from the beginning, or jump back to the previous track without having to locate it on the screen.
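The acceleration behavior described above, in which faster rotation skips more list entries per tick, can be sketched in a few lines of Python. The function names and speed thresholds here are hypothetical illustrations, not Apple’s actual firmware logic:

```python
def scroll_step(ticks_per_second):
    """Map wheel rotation speed to how many list rows to move per tick.
    Slow turns give precise row-by-row movement; fast spins race
    through a long library. Thresholds are illustrative only."""
    if ticks_per_second < 5:
        return 1      # slow, precise navigation
    elif ticks_per_second < 15:
        return 4      # moderate spin skips a few rows
    else:
        return 10     # fast spin jumps through the list


def navigate(position, length, ticks_per_second, direction):
    """Move the highlighted row by an accelerated step,
    clamped to the bounds of the list."""
    step = scroll_step(ticks_per_second) * direction
    return max(0, min(length - 1, position + step))
```

Under this sketch, a fast spin through a thousand-track list moves ten rows per tick, while a slow turn still lands on exactly the next row.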
The breakthrough on the iPod user interface is what ultimately made the product seem so magical and unique. There were plenty of other important software innovations, like the software that enabled easy synchronization of the device with a user’s iTunes music collection. But if the team had not cracked the usability problem of navigating a pocket library of hundreds or thousands of tracks, the iPod would never have gotten off the ground. It was a solution that came with ancillary benefits as well. The iPod interface was so well designed that it was able to grow and become even more useful as other technologies in the device improved and became cheaper. And since the thumb-wheel technology was half hardware and half software, it was much easier for Apple to lock in this design advantage with patents and copyrights so tough that no competitor dared try to copy it. Were it primarily a software feature, it would’ve been far more vulnerable to being aped. Once again, Apple had found a beautifully intuitive way to control a complex, intelligent device hidden underneath a gleaming, minimalist exterior. This is where Ive first showed that he could design far more than the shapes of things. He could help design the user experience, too. There was nothing that mattered more to Steve.


BEFITTING THE MEASURED ambitions for the new product, the iPod was introduced at an event held in the tiny Town Hall auditorium at Apple headquarters on October 23, 2001. Reaction from the assembled journalists was anything but measured, however. Following the technology where it led had allowed Steve to create a product with a blend of features that made so much intuitive sense that it would change consumer behavior. The iPod was spectacular and totally unexpected.
To use one was to fall in love with it. Apple gave an iPod to every journalist who attended the October introduction, something it had never done before. These technology writers and reviewers and other cognoscenti wound up raving in print about features Apple hadn’t even touted. The showstopper for many was the iPod’s random-play capability, something Steve initially considered to be of marginal interest. This so-called “shuffle mode” turned the device into the equivalent of a personal radio station that would play only your own music, in a totally unpredictable sequence. If you had a large library, your iPod operating in shuffle mode was a wonderful way to stumble upon music you had forgotten that you even owned. In that way, the iPod helped people rediscover the pleasures of the music itself.
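What made shuffle mode feel like a personal radio station is that it behaved like a single random permutation of the whole library, so every track eventually played exactly once, rather than a series of independent random picks, which would repeat some songs and skip others. A minimal sketch of that distinction in Python (the function and its seed parameter are hypothetical, added only to make the behavior reproducible):

```python
import random


def shuffle_queue(library, seed=None):
    """Build a play queue that is a random permutation of the whole
    library: every track appears exactly once, in unpredictable order.
    The library itself is left untouched."""
    rng = random.Random(seed)   # seeding makes the order reproducible
    queue = list(library)       # copy so the source list is unchanged
    rng.shuffle(queue)          # Fisher-Yates shuffle under the hood
    return queue
```

A real player would draw from a fresh random source on each shuffle; the seed here simply makes the sketch testable.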
The iPod gave Apple a new jolt of cool and expanded the appeal of its products to a much broader universe of consumers, especially younger buyers. In time, it would prove to be the Walkman, and then some, of the early twenty-first century. It was also the first new hardware link in a chain of successive innovative and self-reinforcing software and hardware and network products that started pouring forth once Apple got serious about making the Macintosh a genuine digital hub. Slowly, the iPod proved to be the product that would begin to turn Apple back into a growth company. “We followed where our own desires led us,” Steve explained, recalling how much his team had hated the existing music players on the market, “and we ended up ahead.”
Even the iPod tested Steve’s faith in consumers, however. It took them a while to fully warm to the device. It presented an unfamiliar method of interacting with music, and its $399 price was a significant impediment, especially when you could buy a Sony Discman CD player for under $100. Sales started out on the slow side: Apple sold just 150,000 iPods during the first quarter they were available. One year later, Steve cut the price of that first iPod by $100 and introduced a second version with twice as much capacity and a new “touch-wheel” that was a wheel in shape only—it was actually a circular touch-pad that moved users through their music even more smoothly than the mechanical thumb-wheel, and wasn’t nearly as prone to break. That second introduction was the first clear outward signal that iPod had transformed more than just the experience of listening to music—it had revitalized Apple’s capabilities as a manufacturer as well. The iPod had accelerated Apple’s creative metabolism, instilling a new organizational discipline that would make the promise of frequent, market-churning, incremental improvements—the kind that Bill Gates had lectured Steve about in that joint interview in Palo Alto a decade before—into a breathtaking new kind of rapid-fire technological innovation.
The iPod had led Apple to a newfound ability to keep outdoing itself almost like clockwork. Some of this required execution at a very high level. The iPod’s low price (at least compared to Apple’s computers) forced Apple to learn how to ensure high-quality manufacturing at higher unit volumes than it had ever delivered before. These new demands on manufacturing were exacerbated by the competitive dynamics of the consumer electronics market, which expected Apple to refresh the iPod product line far more frequently than its computers. To churn out iPods this way, Apple had to develop disciplines that would fundamentally transform the company into a much more capable enterprise. Tim Cook had to build up an extensive international supply chain, and he and Ruby had to develop relationships with a set of Asian factories capable of delivering lots of high-quality machines in record times. The iPod had quickened the company’s metabolism in a way that would pay off for years to come.
But outdoing itself also required Apple’s top execs—and Steve himself—to think about the future in a new way, with a willingness to follow the technology wherever it might lead. “Learning about new technologies and markets is what makes this fun for me and for everyone at Apple,” Steve once told me, a few years after the iPod’s debut. “By definition, it’s just what we do, and there are lots of ways to do it. Five or six years ago we didn’t know anything about video editing, so we bought a company to learn how to do that. Then we didn’t know anything about MP3 players, but our people are smart. They went out and figured it out by looking at what was already out there with a very critical eye, and then they combined that with what we already knew about design, user interface, materials, and digital electronics. The truth is, we’d get bored otherwise.” In another interview, Steve said, “Who cares where the good ideas come from? If you’re paying attention you’ll notice them.” When his focus had been directed entirely on fixing Apple’s own problems, Steve had almost missed the digital music revolution. Now that Apple was on more solid footing, he was focused outward again, and paying attention very carefully. “When I came back, Apple was like a person who was ill and couldn’t go out and do or learn anything,” Steve explained. “But we made it healthy again, and have increased its strength. Now, figuring out new things to do is what keeps us going.”
