The Rise of Big Data: How It's Changing the Way We Think About the World
Kenneth Neil Cukier and Viktor Mayer-Schoenberger
KENNETH CUKIER is Data Editor of The Economist. VIKTOR MAYER-SCHOENBERGER is Professor of Internet Governance and Regulation at the Oxford Internet Institute. They are the authors of Big Data: A Revolution That Will Transform How We Live, Work, and Think [1] (Houghton Mifflin Harcourt, 2013), from which this essay is adapted. Copyright © by Kenneth Cukier and Viktor Mayer-Schoenberger. Reprinted by permission of Houghton Mifflin Harcourt.
Everyone knows that the Internet has changed how businesses operate, governments function, and people live. But a new, less visible technological trend is just as transformative: “big data.” Big data starts with the fact that there is a lot more information floating around these days than ever before, and it is being put to extraordinary new uses. Big data is distinct from the Internet, although the Web makes it much easier to collect and share data. Big data is about more than just communication: the idea is that we can learn from a large body of information things that we could not comprehend when we used only smaller amounts.
In the third century BC, the Library of Alexandria was believed to house the sum of human knowledge. Today, there is enough information in the world to give every person alive 320 times as much of it as historians think was stored in Alexandria’s entire collection -- an estimated 1,200 exabytes’ worth. If all this information were placed on CDs and they were stacked up, the CDs would form five separate piles that would all reach to the moon.
This explosion of data is relatively new. As recently as the year 2000, only one-quarter of all the world’s stored information was digital. The rest was preserved on paper, film, and other analog media. But because the amount of digital data expands so quickly -- doubling around every three years -- that situation was swiftly inverted. Today, less than two percent of all stored information is nondigital.
Given this massive scale, it is tempting to understand big data solely in terms of size. But that would be misleading. Big data is also characterized by the ability to render into data many aspects of the world that have never been quantified before; call it “datafication.” For example, location has been datafied, first with the invention of longitude and latitude, and more recently with GPS satellite systems. Words are treated as data when computers mine centuries’ worth of books. Even friendships and “likes” are datafied, via Facebook.
This kind of data is being put to incredible new uses with the assistance of inexpensive computer memory, powerful processors, smart algorithms, clever software, and math that borrows from basic statistics. Instead of trying to “teach” a computer how to do things, such as drive a car or translate between languages, which artificial-intelligence experts have tried unsuccessfully to do for decades, the new approach is to feed enough data into a computer so that it can infer the probability that, say, a traffic light is green and not red or that, in a certain context, lumière is a more appropriate substitute for “light” than léger.
Using great volumes of information in this way requires three profound changes in how we approach data. The first is to collect and use a lot of data rather than settle for small amounts or samples, as statisticians have done for well over a century.
The second is to shed our preference for highly curated and pristine data and instead accept messiness: in an increasing number of situations, a bit of inaccuracy can be tolerated, because the benefits of using vastly more data of variable quality outweigh the costs of using smaller amounts of very exact data.
Third, in many instances, we will need to give up our quest to discover the cause of things, in return for accepting correlations. With big data, instead of trying to understand precisely why an engine breaks down or why a drug’s side effect disappears, researchers can instead collect and analyze massive quantities of information about such events and everything that is associated with them, looking for patterns that might help predict future occurrences. Big data helps answer what, not why, and often that’s good enough.
The Internet has reshaped how humanity communicates. Big data is different: it marks a transformation in how society processes information. In time, big data might change our way of thinking about the world. As we tap ever more data to understand events and make decisions, we are likely to discover that many aspects of life are probabilistic, rather than certain.
APPROACHING "N=ALL"
For most of history, people have worked with relatively small amounts of data because the tools for collecting, organizing, storing, and analyzing information were poor. People winnowed the information they relied on to the barest minimum so that they could examine it more easily. This was the genius of modern-day statistics, which first came to the fore in the late nineteenth century and enabled society to understand complex realities even when little data existed. Today, the technical environment has shifted 179 degrees. There still is, and always will be, a constraint on how much data we can manage, but it is far less limiting than it used to be and will become even less so as time goes on.
The way people handled the problem of capturing information in the past was through sampling. When collecting data was costly and processing it was difficult and time consuming, the sample was a savior. Modern sampling is based on the idea that, within a certain margin of error, one can infer something about the total population from a small subset, as long as the sample is chosen at random. Hence, exit polls on election night query a randomly selected group of several hundred people to predict the voting behavior of an entire state. For straightforward questions, this process works well. But it falls apart when we want to drill down into subgroups within the sample. What if a pollster wants to know which candidate single women under 30 are most likely to vote for? How about university-educated, single Asian American women under 30? Suddenly, the random sample is largely useless, since there may be only a couple of people with those characteristics in the sample, too few to make a meaningful assessment of how the entire subpopulation will vote. But if we collect all the data -- “n = all,” to use the terminology of statistics -- the problem disappears.
This example raises another shortcoming of using some data rather than all of it. In the past, when people collected only a little data, they often had to decide at the outset what to collect and how it would be used. Today, when we gather all the data, we do not need to know beforehand what we plan to use it for.
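The pollster’s problem described above can be made concrete with a minimal simulation (ours, not from the book): in a synthetic electorate where a narrowly defined group makes up half a percent of voters, a 1,000-person random sample typically catches only a handful of its members, so any estimate of that group’s preference swings wildly from sample to sample, while tallying every record gives a stable answer. All of the numbers below are invented for illustration.
```python
# A minimal sketch of why random samples break down for narrow subgroups:
# compare the subgroup estimate from a 1,000-person sample with the figure
# computed from the full population ("n = all"). All numbers are hypothetical.
import random

random.seed(0)

POPULATION = 1_000_000
SUBGROUP_SHARE = 0.005          # a narrowly defined demographic: 0.5% of voters
TRUE_SUBGROUP_SUPPORT = 0.70    # 70% of that subgroup favors candidate A

# Each person is a pair: (is_in_subgroup, supports_candidate_a)
people = []
for _ in range(POPULATION):
    in_subgroup = random.random() < SUBGROUP_SHARE
    support = random.random() < (TRUE_SUBGROUP_SUPPORT if in_subgroup else 0.50)
    people.append((in_subgroup, support))

def subgroup_support(group):
    """Return (estimated support, number of subgroup members) within a group."""
    members = [s for in_sub, s in group if in_sub]
    if not members:
        return None, 0
    return sum(members) / len(members), len(members)

# Classic approach: a random sample of 1,000 respondents.
est, n_members = subgroup_support(random.sample(people, 1_000))
print(f"sample: {n_members} subgroup members, estimated support = {est}")

# "n = all": use every record.
true_est, n_all = subgroup_support(people)
print(f"population: {n_all} subgroup members, support = {true_est:.3f}")
```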
Of course, it might not always be possible to collect all the data, but it is getting much more feasible to capture vastly more of a phenomenon than simply a sample and to aim for all of it. Big data is a matter not just of creating somewhat larger samples but of harnessing as much of the existing data as possible about what is being studied. We still need statistics; we just no longer need to rely on small samples.
There is a tradeoff to make, however. When we increase the scale by orders of magnitude, we might have to give up on clean, carefully curated data and tolerate some messiness. This idea runs counter to how people have tried to work with data for centuries. Yet the obsession with accuracy and precision is in some ways an artifact of an information-constrained environment. When there was not that much data around, researchers had to make sure that the figures they bothered to collect were as exact as possible. Tapping vastly more data means that we can now allow some inaccuracies to slip in (provided the data set is not completely incorrect), in return for benefiting from the insights that a massive body of data provides.
Consider language translation. It might seem obvious that computers would translate well, since they can store lots of information and retrieve it quickly. But if one were to simply substitute words from a French-English dictionary, the translation would be atrocious. Language is complex.
A breakthrough came in the 1990s, when IBM delved into statistical machine translation. It fed Canadian parliamentary transcripts in both French and English into a computer and programmed it to infer which word in one language is the best alternative for another. This process changed the task of translation into a giant problem of probability and math. But after this initial improvement, progress stalled. Then Google barged in. Instead of using a relatively small number of high-quality translations, the search giant harnessed more data, but from the less orderly Internet -- “data in the wild,” so to speak. Google inhaled translations from corporate websites, documents in every language from the European Union, even translations from its giant book-scanning project. Instead of millions of pages of texts, Google analyzed billions. The result is that its translations are quite good -- better than IBM’s were -- and cover 65 languages. Large amounts of messy data trumped small amounts of cleaner data.
FROM CAUSATION TO CORRELATION
These two shifts in how we think about data -- from some to all and from clean to messy -- give rise to a third change: from causation to correlation. This represents a move away from always trying to understand the deeper reasons behind how the world works to simply learning about an association among phenomena and using that to get things done.
Of course, knowing the causes behind things is desirable. The problem is that causes are often extremely hard to figure out, and many times, when we think we have identified them, it is nothing more than a self-congratulatory illusion. Behavioral economics has shown that humans are conditioned to see causes even where none exist. So we need to be particularly on guard to prevent our cognitive biases from deluding us; sometimes, we just have to let the data speak.
Take UPS, the delivery company. It places sensors on vehicle parts to identify certain heat or vibrational patterns that in the past have been associated with failures in those parts. In this way, the company can predict a breakdown before it happens and replace the part when it is convenient, instead of on the side of the road. The data do not reveal the exact relationship between the heat or the vibrational patterns and the part’s failure. They do not tell UPS why the part is in trouble. But they reveal enough for the company to know what to do in the near term and guide its investigation into any underlying problem that might exist with the part in question or with the vehicle.
A similar approach is being used to treat breakdowns of the human machine. Researchers in Canada are developing a big-data approach to spot infections in premature babies before overt symptoms appear. By converting 16 vital signs, including heartbeat, blood pressure, respiration, and blood-oxygen levels, into an information flow of more than 1,000 data points per second, they have been able to find correlations between very minor changes and more serious problems. Eventually, this technique will enable doctors to act earlier to save lives. Over time, recording these observations might also allow doctors to understand what actually causes such problems. But when a newborn’s health is at risk, simply knowing that something is likely to occur can be far more important than understanding exactly why.
Medicine provides another good example of why, with big data, seeing correlations can be enormously valuable, even when the underlying causes remain obscure. In February 2009, Google created a stir in health-care circles. Researchers at the company published a paper in Nature that showed how it was possible to track outbreaks of the seasonal flu using nothing more than the archived records of Google searches. Google handles more than a billion searches in the United States every day and stores them all. The company took the 50 million most commonly searched terms between 2003 and 2008 and compared them against historical influenza data from the Centers for Disease Control and Prevention. The idea was to discover whether the incidence of certain searches coincided with outbreaks of the flu -- in other words, to see whether an increase in the frequency of certain Google searches conducted in a particular geographic area correlated with the CDC’s data on outbreaks of flu there. The CDC tracks actual patient visits to hospitals and clinics across the country, but the information it releases suffers from a reporting lag of a week or two -- an eternity in the case of a pandemic. Google’s system, by contrast, would work in near-real time.
Google did not presume to know which queries would prove to be the best indicators. Instead, it ran all the terms through an algorithm that ranked how well they correlated with flu outbreaks. Then, the system tried combining the terms to see if that improved the model. Finally, after running nearly half a billion calculations against the data, Google identified 45 terms -- words such as “headache” and “runny nose” -- that had a strong correlation with the CDC’s data on flu outbreaks. All 45 terms related in some way to influenza. But with a billion searches a day, it would have been impossible for a person to guess which ones might work best and test only those.
Moreover, the data were imperfect. Since the data were never intended to be used in this way, misspellings and incomplete phrases were common. But the sheer size of the data set more than compensated for its messiness.
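The term-ranking step can be pictured with a short sketch. This is an illustrative reconstruction, not Google’s actual system: it simply scores each hypothetical search term by how closely its weekly query volume tracks a series of official flu counts and keeps the best performers. The data here are invented.
```python
# Illustrative sketch (not Google's code): rank candidate search terms by how
# strongly their weekly query volumes correlate with official flu counts, then
# keep the top performers as the basis of a simple predictive model.
from statistics import correlation  # Python 3.10+

# Hypothetical weekly data: official flu cases and query volumes per term.
cdc_flu_cases = [120, 150, 210, 340, 500, 460, 300, 180]
query_volumes = {
    "headache":    [900, 980, 1200, 1700, 2300, 2100, 1500, 1000],
    "runny nose":  [400, 430, 600, 900, 1300, 1200, 800, 500],
    "movie times": [700, 720, 690, 710, 705, 715, 700, 690],
}

# Score every term by its Pearson correlation with the CDC series.
ranked = sorted(
    ((correlation(volumes, cdc_flu_cases), term)
     for term, volumes in query_volumes.items()),
    reverse=True,
)

for score, term in ranked:
    print(f"{term!r}: correlation with flu counts = {score:.2f}")

# In a real system, the top-ranked terms (and combinations of them) would feed
# a regression model that estimates current flu activity from this week's queries.
```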
The result, of course, was simply a correlation. It said nothing about the reasons why someone performed any particular search. Was it because the person felt ill, or heard sneezing in the next cubicle, or felt anxious after reading the news? Google’s system doesn’t know, and it doesn’t care. Indeed, last December, it seems that Google’s system may have overestimated the number of flu cases in the United States. This serves as a reminder that predictions are only probabilities and are not always correct, especially when the basis for the prediction -- Internet searches -- is in a constant state of change and vulnerable to outside influences, such as media reports. Still, big data can hint at the general direction of an ongoing development, and Google’s system did just that.
BACK-END OPERATIONS
Many technologists believe that big data traces its lineage back to the digital revolution of the 1980s, when advances in microprocessors and computer memory made it possible to analyze and store ever more information. That is only superficially the case. Computers and the Internet certainly aid big data by lowering the cost of collecting, storing, processing, and sharing information. But at its heart, big data is only the latest step in humanity’s quest to understand and quantify the world. To appreciate how this is the case, it helps to take a quick look behind us.
Appreciating people’s posteriors is the art and science of Shigeomi Koshimizu, a professor at the Advanced Institute of Industrial Technology in Tokyo. Few would think that the way a person sits constitutes information, but it can. When a person is seated, the contours of the body, its posture, and its weight distribution can all be quantified and tabulated. Koshimizu and his team of engineers convert backsides into data by measuring the pressure they exert at 360 different points with sensors placed in a car seat and by indexing each point on a scale of zero to 256. The result is a digital code that is unique to each individual. In a trial, the system was able to distinguish among a handful of people with 98 percent accuracy.
The research is not asinine. Koshimizu’s plan is to adapt the technology as an antitheft system for cars. A vehicle equipped with it could recognize when someone other than an approved driver sat down behind the wheel and could demand a password to allow the car to function. Transforming sitting positions into data creates a viable service and a potentially lucrative business. And its usefulness may go far beyond deterring auto theft. For instance, the aggregated data might reveal clues about a relationship between drivers’ posture and road safety, such as telltale shifts in position prior to accidents. The system might also be able to sense when a driver slumps slightly from fatigue and send an alert or automatically apply the brakes.
Koshimizu took something that had never been treated as data -- or even imagined to have an informational quality -- and transformed it into a numerically quantified format. There is no good term yet for this sort of transformation, but “datafication” seems apt. Datafication is not the same as digitization, which takes analog content -- books, films, photographs -- and converts it into digital information, a sequence of ones and zeros that computers can read. Datafication is a far broader activity: taking all aspects of life and turning them into data. Google’s augmented-reality glasses datafy the gaze. Twitter datafies stray thoughts. LinkedIn datafies professional networks.
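As a toy illustration of datafication in the spirit of Koshimizu’s seat sensors, with invented readings since the actual system’s details are not public, one can represent each sitting posture as a vector of pressure values and identify the occupant by finding the closest enrolled profile.
```python
# A toy sketch of "datafication": represent each sitting posture as a vector of
# pressure readings and identify the driver by the nearest enrolled profile.
# Sensor layout, readings, and names here are invented for illustration.
import math
import random

NUM_SENSORS = 360          # pressure points in the seat, per the article
MAX_PRESSURE = 256         # each point indexed on a scale of zero to 256

random.seed(1)

def fake_posture_profile():
    """Generate a hypothetical pressure signature for one person."""
    return [random.randint(0, MAX_PRESSURE) for _ in range(NUM_SENSORS)]

def distance(a, b):
    """Euclidean distance between two pressure signatures."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Enrolled drivers and their stored signatures.
profiles = {"driver_a": fake_posture_profile(), "driver_b": fake_posture_profile()}

# A live reading: driver_a sits down, with a little measurement noise.
live = [min(MAX_PRESSURE, max(0, p + random.randint(-10, 10)))
        for p in profiles["driver_a"]]

# Identify whoever is closest; an antitheft system would demand a password
# if even the nearest enrolled profile is too far away.
best = min(profiles, key=lambda name: distance(profiles[name], live))
print("Seat occupant identified as:", best)
```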
Once we datafy things, we can transform their purpose and turn the information into new forms of value. For example, IBM was granted a U.S. patent in 2012 for “securing premises using surface-based computing technology” -- a technical way of describing a touch-sensitive floor covering, somewhat like a giant smartphone screen. Datafying the floor can open up all kinds of possibilities. The floor would be able to identify the objects on it, so that it might know to turn on lights in a room or open doors when a person entered. Moreover, it might identify individuals by their weight or by the way they stand and walk. It could tell if someone fell and did not get back up, an important feature for the elderly. Retailers could track the flow of customers through their stores. Once it becomes possible to turn activities of this kind into data that can be stored and analyzed, we can learn more about the world -- things we could never know before because we could not measure them easily and cheaply.
BIG DATA IN THE BIG APPLE
Big data will have implications far beyond medicine and consumer goods: it will profoundly change how governments work and alter the nature of politics. When it comes to generating economic growth, providing public services, or fighting wars, those who can harness big data effectively will enjoy a significant edge over others. So far, the most exciting work is happening at the municipal level, where it is easier to access data and to experiment with the information. In an effort spearheaded by New York City Mayor Michael Bloomberg (who made a fortune in the data business), the city is using big data to improve public services and lower costs. One example is a new fire-prevention strategy.
Illegally subdivided buildings are far more likely than other buildings to go up in flames. The city gets 25,000 complaints about overcrowded buildings a year, but it has only 200 inspectors to respond. A small team of analytics specialists in the mayor’s office reckoned that big data could help resolve this imbalance between needs and resources. The team created a database of all 900,000 buildings in the city and augmented it with troves of data collected by 19 city agencies: records of tax liens, anomalies in utility usage, service cuts, missed payments, ambulance visits, local crime rates, rodent complaints, and more. Then, they compared this database to records of building fires from the past five years, ranked by severity, hoping to uncover correlations. Not surprisingly, among the predictors of a fire were the type of building and the year it was built. Less expected, however, was the finding that buildings obtaining permits for exterior brickwork correlated with lower risks of severe fire.
Using all this data allowed the team to create a system that could help them determine which overcrowding complaints needed urgent attention. None of the buildings’ characteristics they recorded caused fires; rather, they correlated with an increased or decreased risk of fire. That knowledge has proved immensely valuable: in the past, building inspectors issued vacate orders in 13 percent of their visits; using the new method, that figure rose to 70 percent -- a huge efficiency gain.
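The triage logic can be sketched in miniature. The risk factors below are drawn from the article (building age, tax liens, utility anomalies, exterior-brickwork permits), but the scoring weights and addresses are invented, and the city’s actual model was learned from five years of fire records rather than hand-coded.
```python
# Illustrative sketch of the complaint-triage idea (not the city's actual model):
# score each overcrowding complaint by a few risk factors and send inspectors to
# the highest-scoring buildings first. Weights and addresses are hypothetical.
from dataclasses import dataclass

@dataclass
class Building:
    address: str
    year_built: int
    tax_liens: int
    utility_anomalies: int
    has_brickwork_permit: bool

def risk_score(b: Building) -> float:
    score = 0.0
    score += 2.0 if b.year_built < 1940 else 0.5     # older buildings scored higher
    score += 1.5 * b.tax_liens
    score += 1.0 * b.utility_anomalies
    if b.has_brickwork_permit:
        score -= 2.0                                  # permits correlated with lower risk
    return score

complaints = [
    Building("123 Example St", 1925, tax_liens=2, utility_anomalies=1, has_brickwork_permit=False),
    Building("456 Sample Ave", 1985, tax_liens=0, utility_anomalies=0, has_brickwork_permit=True),
    Building("789 Test Blvd", 1910, tax_liens=1, utility_anomalies=3, has_brickwork_permit=False),
]

# Inspect the riskiest complaints first.
for b in sorted(complaints, key=risk_score, reverse=True):
    print(f"{b.address}: risk score {risk_score(b):.1f}")
```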
Of course, insurance companies have long used similar methods to estimate fire risks, but they mainly rely on only a handful of attributes and usually ones that intuitively correspond with fires. By contrast, New York City’s big-data approach was able to examine many more variables, including ones that would not at first seem to have any relation to fire risk. And the city’s model was cheaper and faster, since it made use of existing data. Most important, the big-data predictions are probably more on target, too.
Big data is also helping increase the transparency of democratic governance. A movement has grown up around the idea of “open data,” which goes beyond the freedom-of-information laws that are now commonplace in developed democracies. Supporters call on governments to make the vast amounts of innocuous data that they hold easily available to the public. The United States has been at the forefront, with its Data.gov website, and many other countries have followed.
At the same time as governments promote the use of big data, they will also need to protect citizens against unhealthy market dominance. Companies such as Google, Amazon, and Facebook -- as well as lesser-known “data brokers,” such as Acxiom and Experian -- are amassing vast amounts of information on everyone and everything. Antitrust laws protect against the monopolization of markets for goods and services such as software or media outlets, because the sizes of the markets for those goods are relatively easy to estimate. But how should governments apply antitrust rules to big data, a market that is hard to define and that is constantly changing form? Meanwhile, privacy will become an even bigger worry, since more data will almost certainly lead to more compromised private information, a downside of big data that current technologies and laws seem unlikely to prevent.
Regulations governing big data might even emerge as a battleground among countries. European governments are already scrutinizing Google over a raft of antitrust and privacy concerns, in a scenario reminiscent of the antitrust enforcement actions the European Commission took against Microsoft beginning a decade ago. Facebook might become a target for similar actions all over the world, because it holds so much data about individuals. Diplomats should brace for fights over whether to treat information flows as similar to free trade: in the future, when China censors Internet searches, it might face complaints not only about unjustly muzzling speech but also about unfairly restraining commerce.
BIG DATA OR BIG BROTHER?
States will need to help protect their citizens and their markets from new vulnerabilities caused by big data. But there is another potential dark side: big data could become Big Brother. In all countries, but particularly in nondemocratic ones, big data exacerbates the existing asymmetry of power between the state and the people.
The asymmetry could well become so great that it leads to big-data authoritarianism, a possibility vividly imagined in science-fiction movies such as Minority Report. That 2002 film took place in a near-future dystopia in which the character played by Tom Cruise headed a “Precrime” police unit that relied on clairvoyants whose visions identified people who were about to commit crimes. The plot revolves around the system’s obvious potential for error and, worse yet, its denial of free will. Although the idea of identifying potential wrongdoers before they have committed a crime seems fanciful, big data has allowed some authorities to take it seriously.
In 2007, the Department of Homeland Security launched a research project called FAST (Future Attribute Screening Technology), aimed at identifying potential terrorists by analyzing data about individuals’ vital signs, body language, and other physiological patterns. Police forces in many cities, including Los Angeles, Memphis, Richmond, and Santa Cruz, have adopted “predictive policing” software, which analyzes data on previous crimes to identify where and when the next ones might be committed. For the moment, these systems do not identify specific individuals as suspects. But that is the direction in which things seem to be heading. Perhaps such systems would identify which young people are most likely to shoplift. There might be decent reasons to get so specific, especially when it comes to preventing negative social outcomes other than crime. For example, if social workers could tell with 95 percent accuracy which teenage girls would get pregnant or which high school boys would drop out of school, wouldn’t they be remiss if they did not step in to help?
It sounds tempting. Prevention is better than punishment, after all. But even an intervention that did not admonish and instead provided assistance could be construed as a penalty -- at the very least, one might be stigmatized in the eyes of others. In this case, the state’s actions would take the form of a penalty before any act were committed, obliterating the sanctity of free will.
Another worry is what could happen when governments put too much trust in the power of data. In his 1998 book, Seeing Like a State, the anthropologist James Scott documented the ways in which governments, in their zeal for quantification and data collection, sometimes end up making people’s lives miserable. They use maps to determine how to reorganize communities without first learning anything about the people who live there. They use long tables of data about harvests to decide to collectivize agriculture without knowing a whit about farming. They take all the imperfect, organic ways in which people have interacted over time and bend them to their needs, sometimes just to satisfy a desire for quantifiable order.
This misplaced trust in data can come back to bite. Organizations can be beguiled by data’s false charms and endow the numbers with more meaning than they deserve. That is one of the lessons of the Vietnam War. U.S. Secretary of Defense Robert McNamara became obsessed with using statistics as a way to measure the war’s progress. He and his colleagues fixated on the number of enemy fighters killed. Relied on by commanders and published daily in newspapers, the body count became the data point that defined an era. To the war’s supporters, it was proof of progress; to critics, it was evidence of the war’s immorality. Yet the statistics revealed very little about the complex reality of the conflict. The figures were frequently inaccurate and were of little value as a way to measure success. Although it is important to learn from data to improve lives, common sense must be permitted to override the spreadsheets.
HUMAN TOUCH
Big data is poised to reshape the way we live, work, and think. A worldview built on the importance of causation is being challenged by a preponderance of correlations. The possession of knowledge, which once meant an understanding of the past, is coming to mean an ability to predict the future. The challenges posed by big data will not be easy to resolve.
Rather, they are simply the next step in the timeless debate over how to best understand the world. Still, big data will become integral to addressing many of the world’s pressing problems. Tackling climate change will require analyzing pollution data to understand where best to focus efforts and find ways to mitigate problems. The sensors being placed all over the world, including those embedded in smartphones, provide a wealth of data that will allow climatologists to more accurately model global warming. Meanwhile, improving and lowering the cost of health care, especially for the world’s poor, will make it necessary to automate some tasks that currently require human judgment but could be done by a computer, such as examining biopsies for cancerous cells or detecting infections before symptoms fully emerge.
Ultimately, big data marks the moment when the “information society” finally fulfills the promise implied by its name. The data take center stage. All those digital bits that have been gathered can now be harnessed in novel ways to serve new purposes and unlock new forms of value. But this requires a new way of thinking and will challenge institutions and identities. In a world where data shape decisions more and more, what purpose will remain for people, or for intuition, or for going against the facts?
If everyone appeals to the data and harnesses big-data tools, perhaps what will become the central point of differentiation is unpredictability: the human element of instinct, risk taking, accidents, and even error. If so, then there will be a special need to carve out a place for the human: to reserve space for intuition, common sense, and serendipity to ensure that they are not crowded out by data and machine-made answers.
This has important implications for the notion of progress in society. Big data enables us to experiment faster and explore more leads. These advantages should produce more innovation. But at times, the spark of invention becomes what the data do not say. That is something that no amount of data can ever confirm or corroborate, since it has yet to exist. If Henry Ford had queried big-data algorithms to discover what his customers wanted, they would have come back with “a faster horse,” to recast his famous line. In a world of big data, it is the most human traits that will need to be fostered -- creativity, intuition, and intellectual ambition -- since human ingenuity is the source of progress.
Big data is a resource and a tool. It is meant to inform, rather than explain; it points toward understanding, but it can still lead to misunderstanding, depending on how well it is wielded. And however dazzling the power of big data appears, its seductive glimmer must never blind us to its inherent imperfections. Rather, we must adopt this technology with an appreciation not just of its power but also of its limitations.
Links:
[1] http://www.amazon.com/Big-Data-Revolution-Transform-Think/dp/0544002695
His saga is the entrepreneurial creation myth writ large: Steve Jobs cofounded Apple in his parents’ garage in 1976, was ousted in 1985, returned to rescue it from near bankruptcy in 1997, and by the time he died, in October 2011, had built it into the world’s most valuable company. Along the way he helped to transform seven industries: personal computing, animated movies, music, phones, tablet computing, retail stores, and digital publishing. He thus belongs in the pantheon of America’s great innovators, along with Thomas Edison, Henry Ford, and Walt Disney. None of these men was a saint, but long after their personalities are forgotten, history will remember how they applied imagination to technology and business.
In the months since my biography of Jobs came out, countless commentators have tried to draw management lessons from it. Some of those readers have been insightful, but I think that many of them (especially those with no experience in entrepreneurship) fixate too much on the rough edges of his personality. The essence of Jobs, I think, is that his personality was integral to his way of doing business. He acted as if the normal rules didn’t apply to him, and the passion, intensity, and extreme emotionalism he brought to everyday life were things he also poured into the products he made. His petulance and impatience were part and parcel of his perfectionism.
One of the last times I saw him, after I had finished writing most of the book, I asked him again about his tendency to be rough on people. “Look at the results,” he replied. “These are all smart people I work with, and any of them could get a top job at another place if they were truly feeling brutalized. But they don’t.” Then he paused for a few moments and said, almost wistfully, “And we got some amazing things done.” Indeed, he and Apple had had a string of hits over the past dozen years that was greater than that of any other innovative company in modern times: iMac, iPod, iPod nano, iTunes Store, Apple Stores, MacBook, iPhone, iPad, App Store, OS X Lion—not to mention every Pixar film. And as he battled his final illness, Jobs was surrounded by an intensely loyal cadre of colleagues who had been inspired by him for years and a very loving wife, sister, and four children.
So I think the real lessons from Steve Jobs have to be drawn from looking at what he actually accomplished. I once asked him what he thought was his most important creation, thinking he would answer the iPad or the Macintosh. Instead he said it was Apple the company. Making an enduring company, he said, was both far harder and more important than making a great product. How did he do it? Business schools will be studying that question a century from now. Here are what I consider the keys to his success.
Focus
When Jobs returned to Apple in 1997, it was producing a random array of computers and peripherals, including a dozen different versions of the Macintosh. After a few weeks of product review sessions, he’d finally had enough. “Stop!” he shouted. “This is crazy.” He grabbed a Magic Marker, padded in his bare feet to a whiteboard, and drew a two-by-two grid. “Here’s what we need,” he declared. Atop the two columns, he wrote “Consumer” and “Pro.” He labeled the two rows “Desktop” and “Portable.” Their job, he told his team members, was to focus on four great products, one for each quadrant. All other products should be canceled. There was a stunned silence. But by getting Apple to focus on making just four computers, he saved the company. “Deciding what not to do is as important as deciding what to do,” he told me. “That’s true for companies, and it’s true for products.”
After he righted the company, Jobs began taking his “top 100” people on a retreat each year. On the last day, he would stand in front of a whiteboard (he loved whiteboards, because they gave him complete control of a situation and they engendered focus) and ask, “What are the 10 things we should be doing next?” People would fight to get their suggestions on the list. Jobs would write them down—and then cross off the ones he decreed dumb. After much jockeying, the group would come up with a list of 10. Then Jobs would slash the bottom seven and announce, “We can only do three.”
Focus was ingrained in Jobs’s personality and had been honed by his Zen training. He relentlessly filtered out what he considered distractions. Colleagues and family members would at times be exasperated as they tried to get him to deal with issues—a legal problem, a medical diagnosis—they considered important. But he would give a cold stare and refuse to shift his laserlike focus until he was ready.
Near the end of his life, Jobs was visited at home by Larry Page, who was about to resume control of Google, the company he had cofounded. Even though their companies were feuding, Jobs was willing to give some advice. “The main thing I stressed was focus,” he recalled. Figure out what Google wants to be when it grows up, he told Page. “It’s now all over the map. What are the five products you want to focus on? Get rid of the rest, because they’re dragging you down. They’re turning you into Microsoft. They’re causing you to turn out products that are adequate but not great.” Page followed the advice. In January 2012 he told employees to focus on just a few priorities, such as Android and Google+, and to make them “beautiful,” the way Jobs would have done.
Simplify
Jobs’s Zenlike ability to focus was accompanied by the related instinct to simplify things by zeroing in on their essence and eliminating unnecessary components. “Simplicity is the ultimate sophistication,” declared Apple’s first marketing brochure. To see what that means, compare any Apple software with, say, Microsoft Word, which keeps getting uglier and more cluttered with nonintuitive navigational ribbons and intrusive features. It is a reminder of the glory of Apple’s quest for simplicity.
Jobs learned to admire simplicity when he was working the night shift at Atari as a college dropout. Atari’s games came with no manual and needed to be uncomplicated enough that a stoned freshman could figure them out. The only instructions for its Star Trek game were: “1. Insert quarter. 2. Avoid Klingons.” His love of simplicity in design was refined at design conferences he attended at the Aspen Institute in the late 1970s on a campus built in the Bauhaus style, which emphasized clean lines and functional design devoid of frills or distractions.
When Jobs visited Xerox’s Palo Alto Research Center and saw the plans for a computer that had a graphical user interface and a mouse, he set about making the design both more intuitive (his team enabled the user to drag and drop documents and folders on a virtual desktop) and simpler. For example, the Xerox mouse had three buttons and cost $300; Jobs went to a local industrial design firm and told one of its founders, Dean Hovey, that he wanted a simple, single-button model that cost $15. Hovey complied.
Jobs aimed for the simplicity that comes from conquering, rather than merely ignoring, complexity. Achieving this depth of simplicity, he realized, would produce a machine that felt as if it deferred to users in a friendly way, rather than challenging them. “It takes a lot of hard work,” he said, “to make something simple, to truly understand the underlying challenges and come up with elegant solutions.”
In Jony Ive, Apple’s industrial designer, Jobs met his soul mate in the quest for deep rather than superficial simplicity. They knew that simplicity is not merely a minimalist style or the removal of clutter. In order to eliminate screws, buttons, or excess navigational screens, it was necessary to understand profoundly the role each element played. “To be truly simple, you have to go really deep,” Ive explained. “For example, to have no screws on something, you can end up having a product that is so convoluted and so complex. The better way is to go deeper with the simplicity, to understand everything about it and how it’s manufactured.”
During the design of the iPod interface, Jobs tried at every meeting to find ways to cut clutter. He insisted on being able to get to whatever he wanted in three clicks. One navigation screen, for example, asked users whether they wanted to search by song, album, or artist. “Why do we need that screen?” Jobs demanded. The designers realized they didn’t. “There would be times when we’d rack our brains on a user interface problem, and he would go, ‘Did you think of this?’” says Tony Fadell, who led the iPod team. “And then we’d all go, ‘Holy shit.’ He’d redefine the problem or approach, and our little problem would go away.” At one point Jobs made the simplest of all suggestions: Let’s get rid of the on/off button. At first the team members were taken aback, but then they realized the button was unnecessary. The device would gradually power down if it wasn’t being used and would spring to life when reengaged.
Likewise, when Jobs was shown a cluttered set of proposed navigation screens for iDVD, which allowed users to burn video onto a disk, he jumped up and drew a simple rectangle on a whiteboard. “Here’s the new application,” he said. “It’s got one window. You drag your video into the window. Then you click the button that says ‘Burn.’ That’s it. That’s what we’re going to make.”
In looking for industries or categories ripe for disruption, Jobs always asked who was making products more complicated than they should be. In 2001 portable music players and ways to acquire songs online fit that description, leading to the iPod and the iTunes Store. Mobile phones were next. Jobs would grab a phone at a meeting and rant (correctly) that nobody could possibly figure out how to navigate half the features, including the address book. At the end of his career he was setting his sights on the television industry, which had made it almost impossible for people to click on a simple device to watch what they wanted when they wanted.
Take Responsibility End to End
Jobs knew that the best way to achieve simplicity was to make sure that hardware, software, and peripheral devices were seamlessly integrated. An Apple ecosystem—an iPod connected to a Mac with iTunes software, for example—allowed devices to be simpler, syncing to be smoother, and glitches to be rarer. The more complex tasks, such as making new playlists, could be done on the computer, allowing the iPod to have fewer functions and buttons.
Jobs and Apple took end-to-end responsibility for the user experience—something too few companies do. From the performance of the ARM microprocessor in the iPhone to the act of buying that phone in an Apple Store, every aspect of the customer experience was tightly linked together. Both Microsoft in the 1980s and Google in the past few years have taken a more open approach that allows their operating systems and software to be used by various hardware manufacturers. That has sometimes proved the better business model. But Jobs fervently believed that it was a recipe for (to use his technical term) crappier products. “People are busy,” he said. “They have other things to do than think about how to integrate their computers and devices.”
Part of Jobs’s compulsion to take responsibility for what he called “the whole widget” stemmed from his personality, which was very controlling. But it was also driven by his passion for perfection and making elegant products. He got hives, or worse, when contemplating the use of great Apple software on another company’s uninspired hardware, and he was equally allergic to the thought that unapproved apps or content might pollute the perfection of an Apple device. It was an approach that did not always maximize short-term profits, but in a world filled with junky devices, inscrutable error messages, and annoying interfaces, it led to astonishing products marked by delightful user experiences. Being in the Apple ecosystem could be as sublime as walking in one of the Zen gardens of Kyoto that Jobs loved, and neither experience was created by worshipping at the altar of openness or by letting a thousand flowers bloom. Sometimes it’s nice to be in the hands of a control freak.
When Behind, Leapfrog
The mark of an innovative company is not only that it comes up with new ideas first. It also knows how to leapfrog when it finds itself behind. That happened when Jobs built the original iMac. He focused on making it useful for managing a user’s photos and videos, but it was left behind when dealing with music. People with PCs were downloading and swapping music and then ripping and burning their own CDs. The iMac’s slot drive couldn’t burn CDs. “I felt like a dope,” he said. “I thought we had missed it.”
But instead of merely catching up by upgrading the iMac’s CD drive, he decided to create an integrated system that would transform the music industry. The result was the combination of iTunes, the iTunes Store, and the iPod, which allowed users to buy, share, manage, store, and play music better than they could with any other devices.
After the iPod became a huge success, Jobs spent little time relishing it. Instead he began to worry about what might endanger it. One possibility was that mobile phone makers would start adding music players to their handsets. So he cannibalized iPod sales by creating the iPhone. “If we don’t cannibalize ourselves, someone else will,” he said.
Put Products Before Profits
When Jobs and his small team designed the original Macintosh, in the early 1980s, his injunction was to make it “insanely great.” He never spoke of profit maximization or cost trade-offs. “Don’t worry about price, just specify the computer’s abilities,” he told the original team leader. At his first retreat with the Macintosh team, he began by writing a maxim on his whiteboard: “Don’t compromise.” The machine that resulted cost too much and led to Jobs’s ouster from Apple. But the Macintosh also “put a dent in the universe,” as he said, by accelerating the home computer revolution. And in the long run he got the balance right: Focus on making the product great and the profits will follow.
John Sculley, who ran Apple from 1983 to 1993, was a marketing and sales executive from Pepsi. He focused more on profit maximization than on product design after Jobs left, and Apple gradually declined. “I have my own theory about why decline happens at companies,” Jobs told me: They make some great products, but then the sales and marketing people take over the company, because they are the ones who can juice up profits. “When the sales guys run the company, the product guys don’t matter so much, and a lot of them just turn off. It happened at Apple when Sculley came in, which was my fault, and it happened when Ballmer took over at Microsoft.”
When Jobs returned, he shifted Apple’s focus back to making innovative products: the sprightly iMac, the PowerBook, and then the iPod, the iPhone, and the iPad. As he explained, “My passion has been to build an enduring company where people were motivated to make great products. Everything else was secondary. Sure, it was great to make a profit, because that was what allowed you to make great products. But the products, not the profits, were the motivation. Sculley flipped these priorities to where the goal was to make money. It’s a subtle difference, but it ends up meaning everything—the people you hire, who gets promoted, what you discuss in meetings.”
Don’t Be a Slave to Focus Groups
When Jobs took his original Macintosh team on its first retreat, one member asked whether they should do some market research to see what customers wanted. “No,” Jobs replied, “because customers don’t know what they want until we’ve shown them.” He invoked Henry Ford’s line “If I’d asked customers what they wanted, they would have told me, ‘A faster horse!’”
Caring deeply about what customers want is much different from continually asking them what they want; it requires intuition and instinct about desires that have not yet formed. “Our task is to read things that are not yet on the page,” Jobs explained. Instead of relying on market research, he honed his version of empathy—an intimate intuition about the desires of his customers. He developed his appreciation for intuition—feelings that are based on accumulated experiential wisdom—while he was studying Buddhism in India as a college dropout. “The people in the Indian countryside don’t use their intellect like we do; they use their intuition instead,” he recalled. “Intuition is a very powerful thing—more powerful than intellect, in my opinion.”
Sometimes that meant that Jobs used a one-person focus group: himself. He made products that he and his friends wanted. For example, there were many portable music players around in 2000, but Jobs felt they were all lame, and as a music fanatic he wanted a simple device that would allow him to carry a thousand songs in his pocket. “We made the iPod for ourselves,” he said, “and when you’re doing something for yourself, or your best friend or family, you’re not going to cheese out.”
Bend Reality
Jobs’s (in)famous ability to push people to do the impossible was dubbed by colleagues his Reality Distortion Field, after an episode of Star Trek in which aliens create a convincing alternative reality through sheer mental force. An early example was when Jobs was on the night shift at Atari and pushed Steve Wozniak to create a game called Breakout. Woz said it would take months, but Jobs stared at him and insisted he could do it in four days. Woz knew that was impossible, but he ended up doing it.
Those who did not know Jobs interpreted the Reality Distortion Field as a euphemism for bullying and lying. But those who worked with him admitted that the trait, infuriating as it might be, led them to perform extraordinary feats. Because Jobs felt that life’s ordinary rules didn’t apply to him, he could inspire his team to change the course of computer history with a small fraction of the resources that Xerox or IBM had. “It was a self-fulfilling distortion,” recalls Debi Coleman, a member of the original Mac team who won an award one year for being the employee who best stood up to Jobs. “You did the impossible because you didn’t realize it was impossible.”
One day Jobs marched into the cubicle of Larry Kenyon, the engineer who was working on the Macintosh operating system, and complained that it was taking too long to boot up. Kenyon started to explain why reducing the boot-up time wasn’t possible, but Jobs cut him off. “If it would save a person’s life, could you find a way to shave 10 seconds off the boot time?” he asked. Kenyon allowed that he probably could. Jobs went to a whiteboard and showed that if five million people were using the Mac and it took 10 seconds extra to turn it on every day, that added up to 300 million or so hours a year—the equivalent of at least 100 lifetimes a year. After a few weeks Kenyon had the machine booting up 28 seconds faster.
When Jobs was designing the iPhone, he decided that he wanted its face to be a tough, scratchproof glass, rather than plastic. He met with Wendell Weeks, the CEO of Corning, who told him that Corning had developed a chemical exchange process in the 1960s that led to what it dubbed “Gorilla glass.” Jobs replied that he wanted a major shipment of Gorilla glass in six months. Weeks said that Corning was not making the glass and didn’t have that capacity. “Don’t be afraid,” Jobs replied. This stunned Weeks, who was unfamiliar with Jobs’s Reality Distortion Field. He tried to explain that a false sense of confidence would not overcome engineering challenges, but Jobs had repeatedly shown that he didn’t accept that premise. He stared unblinking at Weeks. “Yes, you can do it,” he said. “Get your mind around it. You can do it.” Weeks recalls that he shook his head in astonishment and then called the managers of Corning’s facility in Harrodsburg, Kentucky, which had been making LCD displays, and told them to convert immediately to making Gorilla glass full-time. “We did it in under six months,” he says. “We put our best scientists and engineers on it, and we just made it work.” As a result, every piece of glass on an iPhone or an iPad is made in America by Corning.
Impute
Jobs’s early mentor Mike Markkula wrote him a memo in 1979 that urged three principles. The first two were “empathy” and “focus.” The third was an awkward word, “impute,” but it became one of Jobs’s key doctrines. He knew that people form an opinion about a product or a company on the basis of how it is presented and packaged. “Mike taught me that people do judge a book by its cover,” he told me.
When he was getting ready to ship the Macintosh in 1984, he obsessed over the colors and design of the box. Similarly, he personally spent time designing and redesigning the jewellike boxes that cradle the iPod and the iPhone and listed himself on the patents for them. He and Ive believed that unpacking was a ritual like theater and heralded the glory of the product. “When you open the box of an iPhone or iPad, we want that tactile experience to set the tone for how you perceive the product,” Jobs said.
Sometimes Jobs used the design of a machine to “impute” a signal rather than to be merely functional. For example, when he was creating the new and playful iMac, after his return to Apple, he was shown a design by Ive that had a little recessed handle nestled in the top. It was more semiotic than useful. This was a desktop computer. Not many people were really going to carry it around. But Jobs and Ive realized that a lot of people were still intimidated by computers. If it had a handle, the new machine would seem friendly, deferential, and at one’s service. The handle signaled permission to touch the iMac. The manufacturing team was opposed to the extra cost, but Jobs simply announced, “No, we’re doing this.” He didn’t even try to explain.
Push for Perfection
During the development of almost every product he ever created, Jobs at a certain point “hit the pause button” and went back to the drawing board because he felt it wasn’t perfect. That happened even with the movie Toy Story. After Jeff Katzenberg and the team at Disney, which had bought the rights to the movie, pushed the Pixar team to make it edgier and darker, Jobs and the director, John Lasseter, finally stopped production and rewrote the story to make it friendlier. When he was about to launch Apple Stores, he and his store guru, Ron Johnson, suddenly decided to delay everything a few months so that the stores’ layouts could be reorganized around activities and not just product categories.
The same was true for the iPhone. The initial design had the glass screen set into an aluminum case. One Monday morning Jobs went over to see Ive. “I didn’t sleep last night,” he said, “because I realized that I just don’t love it.” Ive, to his dismay, instantly saw that Jobs was right. “I remember feeling absolutely embarrassed that he had to make the observation,” he says. The problem was that the iPhone should have been all about the display, but in its current design the case competed with the display instead of getting out of the way. The whole device felt too masculine, task-driven, efficient. “Guys, you’ve killed yourselves over this design for the last nine months, but we’re going to change it,” Jobs told Ive’s team. “We’re all going to have to work nights and weekends, and if you want, we can hand out some guns so you can kill us now.” Instead of balking, the team agreed. “It was one of my proudest moments at Apple,” Jobs recalled.
A similar thing happened as Jobs and Ive were finishing the iPad. At one point Jobs looked at the model and felt slightly dissatisfied. It didn’t seem casual and friendly enough to scoop up and whisk away. They needed to signal that you could grab it with one hand, on impulse. They decided that the bottom edge should be slightly rounded, so that a user would feel comfortable just snatching it up rather than lifting it carefully. That meant engineering had to design the necessary connection ports and buttons in a thin, simple lip that sloped away gently underneath. Jobs delayed the product until the change could be made.
Jobs’s perfectionism extended even to the parts unseen. As a young boy, he had helped his father build a fence around their backyard, and he was told they had to use just as much care on the back of the fence as on the front. “Nobody will ever know,” Steve said. His father replied, “But you will know.” A true craftsman uses a good piece of wood even for the back of a cabinet against the wall, his father explained, and they should do the same for the back of the fence. It was the mark of an artist to have such a passion for perfection. In overseeing the Apple II and the Macintosh, Jobs applied this lesson to the circuit board inside the machine. In both instances he sent the engineers back to make the chips line up neatly so the board would look nice. This seemed particularly odd to the engineers of the Macintosh, because Jobs had decreed that the machine be tightly sealed. “Nobody is going to see the PC board,” one of them protested. Jobs reacted as his father had: “I want it to be as beautiful as possible, even if it’s inside the box. A great carpenter isn’t going to use lousy wood for the back of a cabinet, even though nobody’s going to see it.” They were true artists, he said, and should act that way. And once the board was redesigned, he had the engineers and other members of the Macintosh team sign their names so that they could be engraved inside the case. “Real artists sign their work,” he said.
Tolerate Only “A” Players
Jobs was famously impatient, petulant, and tough with the people around him. But his treatment of people, though not laudable, emanated from his passion for perfection and his desire to work with only the best. It was his way of preventing what he called “the bozo explosion,” in which managers are so polite that mediocre people feel comfortable sticking around. “I don’t think I run roughshod over people,” he said, “but if something sucks, I tell people to their face. It’s my job to be honest.” When I pressed him on whether he could have gotten the same results while being nicer, he said perhaps so. “But it’s not who I am,” he said. “Maybe there’s a better way—a gentlemen’s club where we all wear ties and speak in this Brahmin language and velvet code words—but I don’t know that way, because I am middle-class from California.”
Was all his stormy and abusive behavior necessary? Probably not. There were other ways he could have motivated his team. “Steve’s contributions could have been made without so many stories about him terrorizing folks,” Apple cofounder Steve Wozniak said. “I like being more patient and not having so many conflicts. I think a company can be a good family.” But then he added something that is undeniably true: “If the Macintosh project had been run my way, things probably would have been a mess.”
It’s important to appreciate that Jobs’s rudeness and roughness were accompanied by an ability to be inspirational. He infused Apple employees with an abiding passion to create groundbreaking products and a belief that they could accomplish what seemed impossible. And we have to judge him by the outcome. Jobs had a close-knit family, and so it was at Apple: His top players tended to stick around longer and be more loyal than those at other companies, including ones led by bosses who were kinder and gentler. CEOs who study Jobs and decide to emulate his roughness without understanding his ability to generate loyalty make a dangerous mistake.
“I’ve learned over the years that when you have really good people, you don’t have to baby them,” Jobs told me. “By expecting them to do great things, you can get them to do great things. Ask any member of that Mac team. They will tell you it was worth the pain.” Most of them do. “He would shout at a meeting, ‘You asshole, you never do anything right,’” Debi Coleman recalls. “Yet I consider myself the absolute luckiest person in the world to have worked with him.”
Engage Face-to-Face Despite being a denizen of the digital world, or maybe because he knew all too well its potential to be isolating, Jobs was a strong believer in face-to-face meetings. “There’s a temptation in our networked age to think that ideas can be developed by e-mail and iChat,” he told me. “That’s crazy. Creativity comes from spontaneous meetings, from random discussions. You run into someone, you ask what they’re doing, you say ‘Wow,’ and soon you’re cooking up all sorts of ideas.”
He had the Pixar building designed to promote unplanned encounters and collaborations. “If a building doesn’t encourage that, you’ll lose a lot of innovation and the magic that’s sparked by serendipity,” he said. “So we designed the building to make people get out of their offices and mingle in the central atrium with people they might not otherwise see.” The front doors and main stairs and corridors all led to the atrium; the café and the mailboxes were there; the conference rooms had windows that looked out onto it; and the 600-seat theater and two smaller screening rooms all spilled into it. “Steve’s theory worked from day one,” Lasseter recalls. “I kept running into people I hadn’t seen for months. I’ve never seen a building that promoted collaboration and creativity as well as this one.”
Jobs hated formal presentations, but he loved freewheeling face-to-face meetings. He gathered his executive team every week to kick around ideas without a formal agenda, and he spent every Wednesday afternoon doing the same with his marketing and advertising team. Slide shows were banned. “I hate the way people use slide presentations instead of thinking,” Jobs recalled. “People would confront a problem by creating a presentation. I wanted them to engage, to hash things out at the table, rather than show a bunch of slides. People who know what they’re talking about don’t need PowerPoint.”
Know Both the Big Picture and the Details Jobs’s passion was applied to issues both large and minuscule. Some CEOs are great at vision; others are managers who know that God is in the details. Jobs was both. Time Warner CEO Jeff Bewkes says that one of Jobs’s salient traits was his ability and desire to envision overarching strategy while also focusing on the tiniest aspects of design. For example, in 2000 he came up with the grand vision that the personal computer should become a “digital hub” for managing all of a user’s music, videos, photos, and content, and thus got Apple into the personal-device business with the iPod and then the iPad. In 2010 he came up with the successor strategy—the “hub” would move to the cloud—and Apple began building a huge server farm so that all a user’s content could be uploaded and then seamlessly synced to other personal devices. But even as he was laying out these grand visions, he was fretting over the shape and color of the screws inside the iMac.
Combine the Humanities with the Sciences “I always thought of myself as a humanities person as a kid, but I liked electronics,” Jobs told me on the day he decided to cooperate on a biography. “Then I read something that one of my heroes, Edwin Land of Polaroid, said about the importance of people who could stand at the intersection of humanities and sciences, and I decided that’s what I wanted to do.” It was as if he were describing the theme of his life, and the more I studied him, the more I realized that this was, indeed, the essence of his tale.
He connected the humanities to the sciences, creativity to technology, arts to engineering. There were greater technologists (Wozniak, Gates), and certainly better designers and artists. But no one else in our era could better firewire together poetry and processors in a way that jolted innovation. And he did it with an intuitive feel for business strategy. At almost every product launch over the past decade, Jobs ended with a slide that showed a sign at the intersection of Liberal Arts and Technology Streets.
The creativity that can occur when a feel for both the humanities and the sciences exists in one strong personality was what most interested me in my biographies of Franklin and Einstein, and I believe that it will be a key to building innovative economies in the 21st century. It is the essence of applied imagination, and it’s why both the humanities and the sciences are critical for any society that is to have a creative edge in the future.
Even when he was dying, Jobs set his sights on disrupting more industries. He had a vision for turning textbooks into artistic creations that anyone with a Mac could fashion and craft—something that Apple announced in January 2012. He also dreamed of producing magical tools for digital photography and ways to make television simple and personal. Those, no doubt, will come as well. And even though he will not be around to see them to fruition, his rules for success helped him build a company that not only will create these and other disruptive products, but will stand at the intersection of creativity and technology as long as Jobs’s DNA persists at its core.
Stay Hungry, Stay Foolish Steve Jobs was a product of the two great social movements that emanated from the San Francisco Bay Area in the late 1960s. The first was the counterculture of hippies and antiwar activists, which was marked by psychedelic drugs, rock music, and antiauthoritarianism. The second was the high-tech and hacker culture of Silicon Valley, filled with engineers, geeks, wireheads, phreakers, cyberpunks, hobbyists, and garage entrepreneurs. Overlying both were various paths to personal enlightenment—Zen and Hinduism, meditation and yoga, primal scream therapy and sensory deprivation, Esalen and est.
An admixture of these cultures was found in publications such as Stewart Brand’s Whole Earth Catalog. On its first cover was the famous picture of Earth taken from space, and its subtitle was “access to tools.” The underlying philosophy was that technology could be our friend. Jobs—who became a hippie, a rebel, a spiritual seeker, a phone phreaker, and an electronic hobbyist all wrapped into one—was a fan. He was particularly taken by the final issue, which came out in 1971, when he was still in high school. He took it with him to college and then to the apple farm commune where he lived after dropping out. He later recalled: “On the back cover of their final issue was a photograph of an early morning country road, the kind you might find yourself hitchhiking on if you were so adventurous. Beneath it were the words: ‘Stay Hungry. Stay Foolish.’” Jobs stayed hungry and foolish throughout his career by making sure that the business and engineering aspect of his personality was always complemented by a hippie nonconformist side from his days as an artistic, acid-dropping, enlightenment-seeking rebel. In every aspect of his life—the women he dated, the way he dealt with his cancer diagnosis, the way he ran his business—his behavior reflected the contradictions, confluence, and eventual synthesis of all these varying strands.
Even as Apple became corporate, Jobs asserted his rebel and counterculture streak in its ads, as if to proclaim that he was still a hacker and a hippie at heart. The famous “1984” ad showed a renegade woman outrunning the thought police to sling a sledgehammer at the screen of an Orwellian Big Brother. And when he returned to Apple, Jobs helped write the text for the “Think Different” ads: “Here’s to the crazy ones. The misfits. The rebels. The troublemakers. The round pegs in the square holes…” If there was any doubt that, consciously or not, he was describing himself, he dispelled it with the last lines: “While some see them as the crazy ones, we see genius. Because the people who are crazy enough to think they can change the world are the ones who do.”
http://hbr.org/2012/04/the-real-leadership-lessons-of-steve-jobs/ar/pr