
How to Fly a Horse

The Secret History of Creation, Invention, and Discovery

To create is human. Technology pioneer Kevin Ashton has experienced firsthand the all-consuming challenge of creating something new. Now, in a tour-de-force narrative twenty years in the making, Ashton demystifies the sacred act, leading us on a journey through humanity’s greatest creations to uncover the surprising truth behind who creates and how they do it. From the crystallographer’s laboratory where the secrets of DNA were first revealed by a long-forgotten woman, to the Ohio bicycle shop where the Wright brothers set out to “fly a horse,” Ashton showcases the seemingly unremarkable individuals, gradual steps, multiple failures, and countless ordinary and usually uncredited acts that lead to our most astounding breakthroughs. Drawing on examples from Mozart to the Muppets, Archimedes to Apple, Kandinsky to a can of Coke, How to Fly a Horse is essential reading for would-be creators and innovators, and also a passionate and immensely rewarding exploration of how “new” comes to be.

Selected for Common Reading at the following colleges and universities:

High Point University
Northern Arizona University
Siena Heights University
University of North Carolina – Asheville
CHAPTER 1
Creating Is Ordinary

1 | Edmond

In the Indian Ocean, fifteen hundred miles east of Africa and four thousand miles west of Australia, lies an island that the Portuguese knew as Santa Apolónia, the British as Bourbon, and the French, for a time, as Île Bonaparte. Today it is called Réunion. A bronze statue stands in Sainte-­Suzanne, one of Réunion’s oldest towns. It shows an African boy in 1841, dressed as if for church, in a single-­breasted jacket, bow tie, and flat-­front pants that gather on the ground. He wears no shoes. He holds out his right hand, not in greeting but with his thumb and fingers coiled against his palm, perhaps about to flip a coin. He is twelve years old, an orphan and a slave, and his name is Edmond.

The world has few statues of Africa’s enslaved children. To understand why Edmond stands here, on this lonely ocean speck, his hand held just so, we must travel west and back, thousands of miles and hundreds of years.

On Mexico’s Gulf Coast, the people of Papantla have dried the fruit of a vinelike orchid and used it as a spice for more millennia than they remember. In 1400, the Aztecs took it as tax and called it “black flower.” In 1519, the Spanish introduced it to Europe and called it “little pod,” or vainilla. In 1703, French botanist Charles Plumier renamed it “vanilla.”

Vanilla is hard to farm. Vanilla orchids are great creeping plants, not at all like the Phalaenopsis flowers we put in our homes. They can live for centuries and grow large, sometimes covering thousands of square feet or climbing five stories high. It has been said that lady’s slippers are the tallest orchids and tiger orchids the most massive, but vanilla dwarfs them both. For thousands of years, its flower was a secret known only to the people who grew it. It is not black, as the Aztecs were led to believe, but a pale tube that blooms once a year and dies in a morning. If a flower is pollinated, it produces a long, green, beanlike capsule that takes nine months to ripen. It must be picked at precisely the right time. Too soon and it will be too small; too late and it will split and spoil. Picked beans are left in the sun for days, until they stop ripening. They do not smell of vanilla yet. That aroma develops during curing: two weeks on wool blankets outdoors each day before being wrapped to sweat each night. Then the beans are dried for four months and finished by hand with straightening and massage. The result is oily black lashes worth their weight in silver or gold.

Vanilla captivated the Europeans. Anne of Austria, daughter of Spain’s King Philip III, drank it in hot chocolate. Queen Elizabeth I of England ate it in puddings. King Henry IV of France made adulterating it a criminal offense punishable by a beating. Thomas Jefferson discovered it in Paris and wrote America’s first recipe for vanilla ice cream.

But no one outside Mexico could make it grow. For three hundred years, vines transported to Europe would not flower. It was only in 1806 that vanilla first bloomed in a London greenhouse, and three more decades passed before a plant in Belgium bore Europe’s first fruit.

The missing ingredient was whatever pollinated the orchid in the wild. The flower in London was a chance occurrence. The fruit in Belgium came from complicated artificial pollination. It was not until late in the nineteenth century that Charles Darwin inferred that a Mexican insect must be vanilla’s pollinator, and not until late in the twentieth century that the insect was identified as a glossy green bee called Euglossa viridissima. Without the pollinator, Europe had a problem. Demand for vanilla was increasing, but Mexico was producing only one or two tons a year. The Europeans needed another source of supply. The Spanish hoped vanilla would thrive in the Philippines. The Dutch planted it in Java. The British sent it to India. All attempts failed.

This is where Edmond enters. He was born in Sainte-­Suzanne in 1829. At that time Réunion was called Bourbon. His mother, Mélise, died in childbirth. He did not know his father. Slaves did not have last names—­he was simply “Edmond.” When Edmond was a few years old, his owner, Elvire Bellier-­Beaumont, gave him to her brother Ferréol in nearby Belle-­Vue. Ferréol owned a plantation. Edmond grew up following Ferréol Bellier-­Beaumont around the estate, learning about its fruits, vegetables, and flowers, including one of its oddities—­a vanilla vine Ferréol had kept alive since 1822.

Like all the vanilla on Réunion, Ferréol’s vine was sterile. French colonists had been trying to grow the plant on the island since 1819. After a few false starts—­some orchids were the wrong species, some soon died—­they eventually had a hundred live vines. But Réunion saw no more success with vanilla than Europe’s other colonies had. The orchids seldom flowered and never bore fruit.

Then, one morning late in 1841, as the spring of the Southern Hemisphere came to the island, Ferréol took his customary walk with Edmond and was surprised to find two green capsules hanging from the vine. His orchid, barren for twenty years, had fruit. What came next surprised him even more. Twelve-­year-­old Edmond said he had pollinated the plant himself.

To this day there are people in Réunion who do not believe it. It seems impossible to them that a child, a slave and, above all, an African, could have solved the problem that beat Europe for hundreds of years. They say it was an accident—­that he was trying to damage the flowers after an argument with Ferréol or he was busy seducing a girl in the gardens when it happened.

Ferréol did not believe the boy at first. But when more fruit appeared, days later, he asked for a demonstration. Edmond pulled back the lip of a vanilla flower and, using a toothpick-­sized piece of bamboo to lift the part that prevents self-­fertilization, he gently pinched its pollen-­bearing anther and pollen-­receiving stigma together. Today the French call this le geste d’Edmond—­Edmond’s gesture. Ferréol called the other plantation owners together, and soon Edmond was traveling the island teaching other slaves how to pollinate vanilla orchids. After seven years, Réunion’s annual production was a hundred pounds of dried vanilla pods. After ten years, it was two tons. By the end of the century, it was two hundred tons and had surpassed the output of Mexico.

Ferréol freed Edmond in June 1848, six months before most of Réunion’s other slaves. Edmond was given the last name Albius, the Latin word for “whiter.” Some suspect this was a compliment in racially charged Réunion. Others think it was an insult from the naming registry. Whatever the intention, things went badly. Edmond left the plantation for the city and was imprisoned for theft. Ferréol was unable to prevent the incarceration but succeeded in getting Edmond released after three years instead of five. Edmond died in 1880, at the age of fifty-­one. A small story in a Réunion newspaper, Le Moniteur, described it as a “destitute and miserable end.”

Edmond’s innovation spread to Mauritius, the Seychelles, and the huge island to Réunion’s west, Madagascar. Madagascar has a perfect environment for vanilla. By the twentieth century, it was producing most of the world’s vanilla, with a crop that in some years was worth more than $100 million.

The demand for vanilla increased with the supply. Today it is the world’s most popular spice and, after saffron, the second most expensive. It has become an ingredient in thousands of things, some obvious, some not. Over a third of the world’s ice cream is Jefferson’s original flavor, vanilla. Vanilla is the principal flavoring in Coke, and the Coca-­Cola Company is said to be the world’s largest vanilla buyer. The fine fragrances Chanel No. 5, Opium, and Angel use the world’s most expensive vanilla, worth $10,000 a pound. Most chocolate contains vanilla. So do many cleaning products, beauty products, and candles. In 1841, on the day of Edmond’s demonstration to Ferréol, the world produced fewer than two thousand vanilla beans, all in Mexico, all the result of pollination by bees. On the same day in 2010, the world produced more than five million vanilla beans, in countries including Indonesia, China, and Kenya, almost all of them—­including the ones grown in Mexico—­the result of le geste d’Edmond.


2 | Counting Creators

What is unusual about Edmond’s story is not that a young slave created something important but that he got the credit for it. Ferréol worked hard to ensure that Edmond was remembered. He told Réunion’s plantation owners that it was Edmond who first pollinated vanilla. He lobbied on Edmond’s behalf, saying, “This young negro deserves recognition from this country. It owes him a debt, for starting up a new industry with a fabulous product.” When Jean Michel Claude Richard, director of Réunion’s botanical gardens, said he had developed the technique and shown it to Edmond, Ferréol intervened. “Through old age, faulty memory or some other cause,” he wrote, “Mr. Richard now imagines that he himself discovered the secret of how to pollinate vanilla, and imagines that he taught the technique to the person who discovered it! Let us leave him to his fantasies.” Without Ferréol’s great effort, the truth would have been lost.

In most cases, the truth has been lost. We do not know, for example, who first realized that the fruit of an orchid could be cured until it tastes good. Vanilla is an innovation inherited from people long forgotten. This is not exceptional; it is normal. Most of our world is made of innovations inherited from people long forgotten—­not people who were rare but people who were common.

Before the Renaissance, concepts like authorship, inventorship, or claiming credit barely existed. Until the early fifteenth century, “author” meant “father,” from the Latin word for “master,” auctor. Auctor-­ship implied authority, something that, in most of the world, had been the divine right of kings and religious leaders since Gilgamesh ruled Uruk four thousand years earlier. It was not to be shared with mere mortals. An “inventor,” from invenire, “find,” was a discoverer, not a creator, until the 1550s. “Credit,” from credo, “trust,” did not mean “acknowledgment” until the late sixteenth century.

This is one reason we know so little about who made what before the late 1300s. It is not that no records were made—­writing has been around for millennia. Nor is it that there was no creation—­everything we use today has roots stretching back to the beginning of humanity. The problem is that, until the Renaissance, people who created things didn’t matter much. The idea that at least some people who create things should be recognized was a big step forward. It is why we know that Johannes Gutenberg invented printing in Germany in 1440 but not who invented windmills in England in 1185, and that Giunta Pisano painted the crucifix in Bologna’s Basilica of San Domenico in 1250 but not who made the mosaic of Saint Demetrios in Kiev’s Golden-­Domed Monastery in 1110.

There are exceptions. We know the names of hundreds of ancient Greek philosophers, from Acrion to Zeno, as well as a few Greek engineers of the same period, such as Eupalinos, Philo, and Ctesibius. We also know of a number of Chinese artists from around 400 CE onward, including the calligrapher Wei Shuo and her student Wang Xizhi. But the general principle holds. Broadly speaking, our knowledge of who created what started around the middle of the thirteenth century, increased during the European Renaissance of the fourteenth to seventeenth centuries, and has kept increasing ever since. The reasons for the change are complicated and the subject of debate among historians—they include power struggles within the churches of Europe, the rise of science, and the rediscovery of ancient philosophy—but there is little doubt that most creators started getting credit for their creations only after the year 1200.

One way this happened was through patents, which give credit within rigorous constraints. The first patents were issued in Italy in the fifteenth century, in Britain and the United States in the seventeenth century, and in France in the eighteenth century. The modern U.S. Patent and Trademark Office granted its first patent on July 31, 1790. It granted its eight millionth patent on August 16, 2011. The patent office does not keep records of how many different people have been granted patents, but economist Manuel Trajtenberg developed a way of working it out. He analyzed names phonetically and compared matches with zip codes, coinventors, and other information to identify each unique inventor. Trajtenberg’s data suggests that more than six million distinct individuals had received U.S. patents by the end of 2011.
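Trajtenberg’s actual matching rules are more elaborate than the outline above, but the idea can be sketched. Below is a minimal, hypothetical Python illustration: key each surname with a simple Soundex-style phonetic code, then accept a match only when a supporting field (a zip code or a shared coinventor) agrees. The records, field names, and matching rule are invented for illustration, not taken from his study.

```python
# Illustrative sketch of phonetic inventor matching, loosely in the spirit
# of Trajtenberg's method. The Soundex-style keying, the matching rule, and
# the records below are assumptions, not his actual algorithm or data.

def soundex(name: str) -> str:
    """Four-character Soundex-style key: first letter plus consonant digits."""
    codes = {**dict.fromkeys("BFPV", "1"), **dict.fromkeys("CGJKQSXZ", "2"),
             **dict.fromkeys("DT", "3"), "L": "4",
             **dict.fromkeys("MN", "5"), "R": "6"}
    name = name.upper()
    digits, prev = [], codes.get(name[0], "")
    for ch in name[1:]:
        code = codes.get(ch, "")
        if code and code != prev:  # skip vowels, collapse adjacent repeats
            digits.append(code)
        prev = code
    return (name[0] + "".join(digits) + "000")[:4]

def same_inventor(a: dict, b: dict) -> bool:
    """Treat two patent records as one inventor when the surnames match
    phonetically and a supporting field agrees."""
    if soundex(a["surname"]) != soundex(b["surname"]):
        return False
    return a["zip"] == b["zip"] or bool(set(a["coinventors"]) & set(b["coinventors"]))

# Two spellings of one surname, filed from the same zip code:
r1 = {"surname": "Trajtenberg", "zip": "02139", "coinventors": ["Jaffe"]}
r2 = {"surname": "Trachtenberg", "zip": "02139", "coinventors": []}
print(soundex(r1["surname"]), soundex(r2["surname"]))  # T623 T623
print(same_inventor(r1, r2))                           # True
```

Counting unique inventors is then a matter of clustering all the records that this kind of test links together.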

The inventors are not distributed evenly across the years. Their numbers are increasing. The first million inventors took 130 years to get their patents, the second million 35 years, the third million 22 years, the fourth million 17 years, the fifth million 10 years, and the sixth million 8 years. Even with foreign inventors removed and adjustments for population increase, the trend is unmistakable. In 1800, about one in every 175,000 Americans was granted a first patent. In 2000, one in every 4,000 Americans received one.

Not all creations get a patent. Books, songs, plays, movies, and other works of art are protected by copyright instead, which in the United States is managed by the Copyright Office, part of the Library of Congress. Copyrights show the same growth as patents. In 1870, 5,600 works were registered for copyright. In 1886, the number grew to more than 31,000, and Ainsworth Spofford, the librarian of Congress, had to plead for more space. “Again it becomes necessary to refer to the difficulty and embarrassment of prosecuting the annual enumeration of the books and pamphlets recently completed,” he wrote in a report to Congress. “Each year and each month adds to the painfully overcrowded condition of the collections, and although many rooms have been filled with the overflow from the main Library, the difficulty of handling so large an accumulation of unshelved books is constantly growing.” This became a refrain. In 1946, register of copyrights Sam Bass Warner reported that “the number of registrations of copyright claims rose to 202,144, the greatest number in the history of the Copyright Office, and a number so far beyond the capacities of the existing staff that Congress, responding to the need, generously provided for additional personnel.” In 1991, copyright registrations reached a peak of more than 600,000. As with patents, the increase exceeded population growth. In 1870, there was one copyright registration for every 7,000 U.S. citizens. In 1991, there was one copyright registration for every 400 U.S. citizens.

More credit is given for creation in science, too. The Science Citation Index tracks the world’s leading peer-reviewed journals in science and technology. For 1955, the index lists 125,000 new scientific papers—about one for every 1,350 U.S. citizens. For 2005, it lists more than 1,250,000 scientific papers—one for every 250 U.S. citizens.
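Those per-capita figures are easy to sanity-check. A quick back-of-envelope calculation, using assumed U.S. census estimates (the text itself states only the ratios), lands close to the registration and paper counts quoted above:

```python
# Rough cross-check of the per-capita ratios quoted above. The population
# figures are assumed census estimates, not taken from the text.
claims = [
    # (year, item, assumed U.S. population, "one per N citizens")
    (1870, "copyright registrations", 38_600_000, 7_000),
    (1991, "copyright registrations", 252_000_000, 400),
    (1955, "new scientific papers", 165_000_000, 1_350),
    (2005, "new scientific papers", 295_000_000, 250),
]
for year, item, population, per_citizen in claims:
    print(f"{year}: ~{population // per_citizen:,} {item} implied")
# 1870: ~5,514      (quoted: 5,600)
# 1991: ~630,000    (quoted: more than 600,000)
# 1955: ~122,222    (quoted: 125,000)
# 2005: ~1,180,000  (quoted: more than 1,250,000)
```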

Patents, copyrights, and peer-­reviewed papers are imperfect proxies. Their growth is driven by money as well as knowledge. Not all work that gets this recognition is necessarily good. And, as we shall see later, giving credit to individuals is misleading. Creation is a chain reaction: thousands of people contribute, most of them anonymous, all of them creative. But, with numbers so big, and even though we miscount and undercount, the point is hard to miss: over the last few centuries, more people from more fields have been getting more credit as creators.

We have not become more creative. The people of the Renaissance were born into a world enriched by tens of thousands of years of human invention: clothes, cathedrals, mathematics, writing, art, agriculture, ships, roads, pets, houses, bread, and beer, to name a fraction. The second half of the twentieth century and the first decades of the twenty-­first century may appear to be a time of unprecedented innovation, but there are other reasons for this, and we will discuss them later. What the numbers show is something else: when we start counting creators, we find that a lot of people create. In 2011, almost as many Americans received their first patent as attended a typical NASCAR race. Creating is not for an elite few. It is not even close to being for an elite few.

The question is not whether invention is the sole province of a tiny minority but the opposite: how many of us are creative? The answer, hidden in plain sight, is all of us. Resistance to the possibility that Edmond, a boy with no formal education, could create something important is grounded in the myth that creating is extraordinary. Creating is not extraordinary, even if its results sometimes are. Creation is human. It is all of us. It is everybody.


3 | The Species of New

Even without numbers, it is easy to see that creation is not the exclusive domain of rare geniuses with occasional inspiration. Creation surrounds us. Everything we see and feel is a result of it or has been touched by it. There is too much creation for creating to be infrequent.

This book is creation. You probably heard about it via creation, or the person who told you about it did. It was written using creation, and creation is one reason you can understand it. You are either lit by creation now or you will be, come sundown. You are heated or cooled or at least insulated by creation—­by clothes and walls and windows. The sky above you is softened by fumes and smog in the day and polluted by electric light at night—­all results of creation. Watch, and it will be crossed by an airplane or a satellite or the slow dissolve of a vapor trail. Apples, cows, and all other things agricultural, apparently natural, are also creation: the result of tens of thousands of years of innovation in trading, breeding, feeding, farming, and—­unless you live on the farm—­preservation and transportation.

You are a result of creation. It helped your parents meet. It likely assisted your birth, gestation, and maybe conception. Before you were born, it eradicated diseases and dangers that could have killed you. After, it inoculated and protected you against others. It treated the illnesses you caught. It helps heal your wounds and relieve your pain. It did the same for your parents and their parents. It recently cleaned you, fed you, and quenched your thirst. It is why you are where you are. Cars, shoes, saddles, or ships transported you, your parents, or your grandparents to the place you now call home, which was less habitable before creation—­too hot in the summer or too cold in the winter or too wet or too swampy or too far from potable water or freely growing food or prowled by predators or all of the above.

Listen, and you hear creation. It is in the sound of passing sirens, distant music, church bells, cell phones, lawn mowers and snow blowers, basketballs and bicycles, waves on breakers, hammers and saws, the creak and crackle of melting ice cubes, even the bark of a dog—­a wolf changed by millennia of selective breeding by humans—­or the purr of a cat, the descendant of one of just five African wildcats humans have been selectively breeding for ten thousand years. Anything that is as it is due to conscious human intervention is invention, creation, new.

Creation is so around and inside us that we cannot look without seeing it or listen without hearing it. As a result, we do not notice it at all. We live in symbiosis with new. It is not something we do; it is something we are. It affects our life expectancy, our height and weight and gait, our way of life, where we live, and the things we think and do. We change our technology, and our technology changes us. This is true for every human being on the planet. It has been true for two thousand generations, ever since the moment our species started thinking about improving its tools.

Anything we create is a tool—­a fabrication with purpose. There is nothing special about species with tools. Beavers make dams. Birds build nests. Dolphins use sponges to hunt for fish. Chimpanzees use sticks to dig for roots and stone hammers to open hard-­shelled food. Otters use rocks to break open crabs. Elephants repel flies by making branches into switches they wave with their trunks. Clearly our tools are better. The Hoover Dam beats the beaver dam. But why?

Our tools have not been better for long. Six million years ago, evolution forked. One path led to chimpanzees—­distant relatives, but the closest living ones we have. The other path led to us. Unknown numbers of human species emerged. There was Homo habilis, Homo heidelbergensis, Homo ergaster, Homo rudolfensis, and many others, some whose status is still controversial, some still to be discovered. All human. None us.

Like other species, these humans used tools. The earliest were pointed stones used to cut nuts, fruit, and maybe meat. Later, some human species made two-­sided hand axes requiring careful masonry and nearly perfect symmetry. But apart from minor adjustments, human tools were monotonous for a million years, unchanged no matter when or where they were used, passed through twenty-­five thousand generations without modification. Despite the mental focus needed to make it, the design of that early human hand ax, like the design of a beaver dam or bird’s nest, came from instinct, not thought.

Humans that looked like us first appeared 200,000 years ago. This was the species called Homo sapiens. Members of Homo sapiens did not act like us in one important way: their tools were simple and did not change. We do not know why. Their brains were the same size as ours. They had our opposable thumbs, our senses, and our strength. Yet for 150,000 years, like the other human species of their time, they made nothing new.

Then, 50,000 years ago, something happened. The crude, barely recognizable stone tools Homo sapiens had been using began to change—­and change quickly. Until this moment, this species, like all other animals, did not innovate. Their tools were the same as their parents’ tools and their grandparents’ tools and their great-­grandparents’ tools. They made them, but they didn’t make them better. The tools were inherited, instinctive, and immutable—­products of evolution, not conscious creation.

Then came by far the most important moment in human history—­the day one member of the species looked at a tool and thought, “I can make this better.” The descendants of this individual are called Homo sapiens sapiens. They are our ancestors. They are us. What the human race created was creation itself.

The ability to change anything was the change that changed everything. The urge to make better tools gave us a massive advantage over all other species, including rival species of humans. Within a few tens of thousands of years, all other humans were extinct, displaced by an anatomically similar species with only one important difference: ever-­improving technology.

What makes our species different and dominant is innovation. What is special about us is not the size of our brains, speech, or the mere fact that we use tools. It is that each of us is in our own way driven to make things better. We occupy the evolutionary niche of new. The niche of new is not the property of a privileged few. It is what makes humans human.

We do not know exactly what evolutionary spark caused the ignition of innovation 50,000 years ago. It left no trace in the fossil record. We do know that our bodies, including our brain size, did not change—­our immediate pre-innovation ancestor, Homo sapiens, looked exactly like us. That makes the prime suspect our mind: the precise arrangement of, and connections between, our brain cells. Something structural seems to have changed there—­perhaps as a result of 150,000 years of fine-­tuning. Whatever it was, it had profound implications, and today it lives on in everyone. Behavioral neurologist Richard Caselli says, “Despite great qualitative and quantitative differences between individuals, the neurobiologic principles of creative behavior are the same from the least to the most creative among us.” Put simply, we all have creative minds.

This is one reason the creativity myth is so terribly wrong. Creating is not rare. We are all born to do it. If it seems magical, it is because it is innate. If it seems like some of us are better at it than others, that is because it is part of being human, like talking or walking. We are not all equally creative, just as we are not all equally gifted orators or athletes. But we can all create.

The human race’s creative power is distributed in all of us, not concentrated in some of us. Our creations are too great and too numerous to come from a few steps by a few people. They must come from many steps by many people. Invention is incremental—­a series of slight and constant changes. Some changes open doors to new worlds of opportunity and we call them breakthroughs. Others are marginal. But when we look carefully, we will always find one small change leading to another, sometimes within one mind, often among several, sometimes across continents or between generations, sometimes taking hours or days and occasionally centuries, the baton of innovation passing in an endless relay of renewal. Creating accretes and compounds, and as a consequence, every day, each human life is made possible by the sum of all previous human creations. Every object in our life, however old or new, however apparently humble or simple, holds the stories, thoughts, and courage of thousands of people, some living, most dead—­the accumulated new of fifty thousand years. Our tools and art are our humanity, our inheritance, and the everlasting legacy of our ancestors. The things we make are the speech of our species: stories of triumph, courage, and creation, of optimism, adaptation, and hope; tales not of one person here and there but of one people everywhere; written in a common language, not African, American, Asian, or European but human.

There are many beautiful things about creating being human and innate. One is that we all create in more or less the same way. Our individual strengths and tendencies of course cause differences, but they are small and few relative to the similarities, which are great and many. We are more like Leonardo, Mozart, and Einstein than not.


4 | An End to Genius

The Renaissance belief that creating is reserved for genius survived through the Enlightenment of the seventeenth century, the Romanticism of the eighteenth century, and the Industrial Revolution of the nineteenth century. It was not until the middle of the twentieth century that the alternative position—­that everyone is capable of creation—­first emerged from early studies of the brain.

In the 1940s, the brain was an enigma. The body’s secrets had been revealed by several centuries of medicine, but the brain, producing consciousness without moving parts, remained a puzzle. Here is one reason theories of creation resorted to magic: the brain, throne of creation, was three pounds of gray and impenetrable mystery.

As the West recovered from World War II, new technologies appeared. One was the computer. This mechanical mind made understanding the brain seem possible for the first time. In 1952, Ross Ashby synthesized the excitement in a book called Design for a Brain. He summarized the new thinking elegantly:

The most fundamental facts are that the earth is over 2,000,000,000 years old and that natural selection has been winnowing the living organisms incessantly. As a result they are today highly specialized in the arts of survival, and among these arts has been the development of a brain, an organ that has been developed in evolution as a specialized means to survival. The nervous system, and living matter in general, will be assumed to be essentially similar to all other matter. No deus ex machina will be invoked.

Put simply: brains don’t need magic.

A San Franciscan named Allen Newell came of academic age during this period. Drawn by the energy of the era, he abandoned his plan to become a forest ranger (in part because his first job was feeding gangrenous calves’ livers to fingerling trout), became a scientist instead, and then, one Friday afternoon in November 1954, experienced what he would later call a “conversion experience” during a seminar on mechanical pattern recognition. He decided to devote his life to a single scientific question: “How can the human mind occur in the physical universe?”

“We now know that the world is governed by physics,” he explained, “and we now understand the way biology nestles comfortably within that. The issue is how does the mind do that as well? The answer must have the details. I’ve got to know how the gears clank, how the pistons go and all of that.”

As he embarked on this work, Newell became one of the first people to realize that creating did not require genius. In a 1959 paper called “The Processes of Creative Thinking,” he reviewed what little psychological data there was about creative work, then set out his radical idea: “Creative thinking is simply a special kind of problem-­solving behavior.” He made the point in the understated language academics use when they know they are on to something:

The data currently available about the processes involved in creative and non-­creative thinking show no particular differences between the two. It is impossible to distinguish, by looking at the statistics describing the processes, the highly skilled practitioner from the rank amateur. Creative activity appears simply to be a special class of problem-­solving activity characterized by novelty, unconventionality, persistence, and difficulty in problem formulation.

It was the beginning of the end for genius and creation. Making intelligent machines forced new rigor on the study of thought. The capacity to create was starting to look more and more like an innate function of the human brain—­possible with standard equipment, no genius necessary.

Newell did not claim that everyone was equally creative. Creating, like any human ability, comes in a spectrum of competence. But everybody can do it. There is no electric fence between those who can create and those who cannot, with genius on one side and the general population on the other.

Newell’s work, along with the work of others in the artificial intelligence community, undermined the myth of creativity. As a result, some of the next generation of scientists started to think about creation differently. One of the most important of these was Robert Weisberg, a cognitive psychologist at Philadelphia’s Temple University.

Weisberg was an undergraduate during the first years of the artificial intelligence revolution, spending the early 1960s in New York before getting his PhD from Princeton and joining the faculty at Temple in 1967. He spent his career proving that creating is innate, ordinary, and for everybody.

Weisberg’s view is simple. He builds on Newell’s contention that creative thinking is the same as problem solving, then extends it to say that creative thinking is the same as thinking in general but with a creative result. In Weisberg’s words, “when one says of someone that he or she is ‘thinking creatively,’ one is commenting on the outcome of the process, not on the process itself. Although the impact of creative ideas and products can sometimes be profound, the mechanisms through which an innovation comes about can be very ordinary.”

Said another way, normal thinking is rich and complex—­so rich and complex that it can sometimes yield extraordinary—­or “creative”—­results. We do not need other processes. Weisberg shows this in two ways: with carefully designed experiments and detailed case studies of creative acts—­from the painting of Picasso’s Guernica to the discovery of DNA and the music of Billie Holiday. In each example, by using a combination of experiment and history, Weisberg demonstrates how creating can be explained without resorting to genius and great leaps of the imagination.

Weisberg has not written about Edmond, but his theory works for Edmond’s story. At first, Edmond’s discovery of how to pollinate vanilla came from nowhere and seemed miraculous. But toward the end of his life, Ferréol Bellier-­Beaumont revealed how the young slave solved the mystery of the black flower.

Ferréol began his story in 1793, when German naturalist Konrad Sprengel discovered that plants reproduced sexually. Sprengel called it “the secret of nature.” The secret was not well received. Sprengel’s peers did not want to hear that flowers had a sex life. His findings spread anyway, especially among botanists and farmers who were more interested in growing good plants than in judging floral morality. And so Ferréol knew how to manually fertilize watermelon, by “marrying the male and female parts together.” He showed this to Edmond, who, as Ferréol described it, later “realized that the vanilla flower also had male and female elements, and worked out for himself how to join them together.” Edmond’s discovery, despite its huge economic impact, was an incremental step. It is no less creative as a result. All great discoveries, even ones that look like transforming leaps, are short hops.

Weisberg’s work, with subtitles like Genius and Other Myths and Beyond the Myth of Genius, did not eliminate the magical view of creation or the idea that people who create are a breed apart. It is easier to sell secrets. Titles available in today’s bookstores include 10 Things Nobody Told You About Being Creative, 39 Keys to Creativity, 52 Ways to Get and Keep Your Creativity Flowing, 62 Exercises to Unlock Your Most Creative Ideas, 100 What-Ifs of Creativity, and 250 Exercises to Wake Up Your Brain. Weisberg’s books are out of print. The myth of creativity does not die easily.

But it is becoming less fashionable, and Weisberg is not the only expert advocating for an epiphany-­free, everybody-­can theory of creation. Ken Robinson was awarded a knighthood for his work on creation and education and is known for the moving, funny talks he gives at an annual conference in California called TED (for technology, entertainment, and design). One of his themes is how education suppresses creation. He describes “the really extraordinary capacity that children have, their capacity for innovation,” and says that “all kids have tremendous talents and we squander them, pretty ruthlessly.” Robinson’s conclusion is that “creativity now is as important in education as literacy, and we should treat it with the same status.” Cartoonist Hugh MacLeod makes the same point more colorfully: “Everyone is born creative; everyone is given a box of crayons in kindergarten. Being suddenly hit years later with the ‘creative bug’ is just a wee voice telling you, ‘I’d like my crayons back, please.’ ”

“Entertaining. . . . [E]nlightening. . . . Might be the genre’s be all and end all. . . . If you want to tap your creative potential, buy this book. It’s the last one you’ll ever need to read.”
—Toronto Star

“One of the most creative books on creativity I have ever read, a genuinely inspiring journey through the worlds of art, science, business and culture that will forever change how you think about where new ideas come from.”
—William C. Taylor, cofounder and editor of Fast Company and author of Practically Radical
 
“[Ashton’s] is a democratic idea—a scientific version of the American dream. . . . [A]n approachable, thought-provoking book that encourages everyone to be the best they can be.”
—The Guardian (London)
 
“[How to Fly a Horse] takes on creation’s most pernicious clichés. . . . [Ashton] arrives at his theories by dint of his own hard work. . . . Being a genius is hard work. But that spark is in all of us.” 
—The Washington Post
 
“An inspiring vision of creativity that’s littered with practical advice, and is a cracking read to boot.”
—BBC Focus
 
“[An] entertaining and inspiring meditation on the nature of creative innovation. . . . Fans of Malcolm Gladwell and Steven Levitt will enjoy Ashton’s hybrid nonfiction style, which builds a compelling cultural treatise from a coalescence of engaging anecdotes.”
—Booklist

“Ashton’s beautifully written exploration of creativity explodes so many myths and opens so many doors that readers, like me, will be left reeling with possibilities. We can all create, we can all innovate. Move over, Malcolm Gladwell; Ashton has done you one better.”
—Larry Downes, author of the New York Times bestseller Unleashing the Killer App and co-author of Big Bang Disruption
 
“If you have ever wondered what it takes to create something, read this inspiring and insightful book. Using examples ranging from Mozart to the Muppets, Kevin Ashton shows how to tap the creative abilities that lurk in us all. There are no secrets, no shortcuts; just ordinary steps we can all take to bring something new into the world. Ashton’s message is direct and hopeful: creativity isn’t just for geniuses—it’s for everybody.” 
—Joseph T. Hallinan, author of Why We Make Mistakes

“A detailed and persuasive argument for how creativity actually works—not through magical bursts of inspiration but with careful thought, dogged problem-solving, and hard-won insight. Ashton draws on a wealth of illuminating and entertaining stories from the annals of business, science, and the arts to show how any of us can apply this process to our own work.”
—Mason Currey, author of Daily Rituals: How Artists Work

“If you consider yourself a curious person then you will love this book. Ashton shares so many delightful stories of where things come from and how things came to be, I seriously believe that it will make anyone who reads it smarter.”
—Simon Sinek, New York Times bestselling author of Start With Why and Leaders Eat Last
 
“How to Fly a Horse solves the mysteries of invention. Kevin Ashton, the innovator who coined the ‘internet of things,’ shows that creativity is more often the result of ordinary steps than extraordinary leaps. With engrossing stories, provocative studies, and lucid writing, this book is not to be missed.”
—Adam Grant, professor of management at the Wharton School and New York Times bestselling author of Give and Take
  
“Kevin Ashton’s new book How to Fly a Horse is all about the creative sorcery and motivational magic necessary to make impossible things happen in teams or as individuals. Through numerous examples of creative genius ranging from Einstein to the creators of South Park to the invention of jet planes and concertos, Ashton reveals the secrets of the great scientists, artists, and industrialists of the last few centuries.”
—John Maeda, author of The Laws of Simplicity and founder of the SIMPLICITY Consortium at the MIT Media Lab


KEVIN ASHTON led pioneering work on RFID (radio frequency identification) networks, for which he coined the term “the Internet of Things,” and cofounded the Auto-ID Center at MIT. His writing about innovation and technology has appeared in Quartz, Medium, The Atlantic, and The New York Times.


About

To create is human. Technology pioneer Kevin Ashton has experienced firsthand the all-consuming challenge of creating something new. Now, in a tour-de-force narrative twenty years in the making, Ashton demystifies the sacred act, leading us on a journey through humanity’s greatest creations to uncover the surprising truth behind who creates and how they do it. From the crystallographer’s laboratory where the secrets of DNA were first revealed by a long forgotten woman, to the Ohio bicycle shop where the Wright brothers set out to “fly a horse,” Ashton showcases the seemingly unremarkable individuals, gradual steps, multiple failures, and countless ordinary and usually uncredited acts that lead to our most astounding breakthroughs. Drawing on examples from Mozart to the Muppets, Archimedes to Apple, Kandinsky to a can of Coke, How to Fly a Horse is essential reading for would-be creators and innovators, and also a passionate and immensely rewarding exploration of how “new” comes to be.

Selected for Common Reading at the following colleges and universities:

High Point University
Northern Arizona University
Siena Heights University
University of North Carolina – Asheville

Excerpt

CHAPTER 1
Creating Is Ordinary

1 | Edmond

In the Indian Ocean, fifteen hundred miles east of Africa and four thousand miles west of Australia, lies an island that the Portuguese knew as Santa Apolónia, the British as Bourbon, and the French, for a time, as Île Bonaparte. Today it is called Réunion. A bronze statue stands in Sainte-­Suzanne, one of Réunion’s oldest towns. It shows an African boy in 1841, dressed as if for church, in a single-­breasted jacket, bow tie, and flat-­front pants that gather on the ground. He wears no shoes. He holds out his right hand, not in greeting but with his thumb and fingers coiled against his palm, perhaps about to flip a coin. He is twelve years old, an orphan and a slave, and his name is Edmond.

The world has few statues of Africa’s enslaved children. To understand why Edmond stands here, on this lonely ocean speck, his hand held just so, we must travel west and back, thousands of miles and hundreds of years.

On Mexico’s Gulf Coast, the people of Papantla have dried the fruit of a vinelike orchid and used it as a spice for more millennia than they remember. In 1400, the Aztecs took it as tax and called it “black flower.” In 1519, the Spanish introduced it to Europe and called it “little pod,” or vainilla. In 1703, French botanist Charles Plumier renamed it “vanilla.”

Vanilla is hard to farm. Vanilla orchids are great creeping plants, not at all like the Phalaenopsis flowers we put in our homes. They can live for centuries and grow large, sometimes covering thousands of square feet or climbing five stories high. It has been said that lady’s slippers are the tallest orchids and tigers the most massive, but vanilla dwarfs them both. For thousands of years, its flower was a secret known only to the people who grew it. It is not black, as the Aztecs were led to believe, but a pale tube that blooms once a year and dies in a morning. If a flower is pollinated, it produces a long, green, beanlike capsule that takes nine months to ripen. It must be picked at precisely the right time. Too soon and it will be too small; too late and it will split and spoil. Picked beans are left in the sun for days, until they stop ripening. They do not smell of vanilla yet. That aroma develops during curing: two weeks on wool blankets outdoors each day before being wrapped to sweat each night. Then the beans are dried for four months and finished by hand with straightening and massage. The result is oily black lashes worth their weight in silver or gold.

Vanilla captivated the Europeans. Anne of Austria, daughter of Spain’s King Philip III, drank it in hot chocolate. Queen Elizabeth I of England ate it in puddings. King Henry IV of France made adulterating it a criminal offense punishable by a beating. Thomas Jefferson discovered it in Paris and wrote America’s first recipe for vanilla ice cream.

But no one outside Mexico could make it grow. For three hundred years, vines transported to Europe would not flower. It was only in 1806 that vanilla first bloomed in a London greenhouse and three more decades before a plant in Belgium bore Europe’s first fruit.

The missing ingredient was whatever pollinated the orchid in the wild. The flower in London was a chance occurrence. The fruit in Belgium came from complicated artificial pollination. It was not until late in the nineteenth century that Charles Darwin inferred that a Mexican insect must be vanilla’s pollinator, and not until late in the twentieth century that the insect was identified as a glossy green bee called Euglossa viridissima. Without the pollinator, Europe had a problem. Demand for vanilla was increasing, but Mexico was producing only one or two tons a year. The Europeans needed another source of supply. The Spanish hoped vanilla would thrive in the Philippines. The Dutch planted it in Java. The British sent it to India. All attempts failed.

This is where Edmond enters. He was born in Sainte-­Suzanne in 1829. At that time Réunion was called Bourbon. His mother, Mélise, died in childbirth. He did not know his father. Slaves did not have last names—­he was simply “Edmond.” When Edmond was a few years old, his owner, Elvire Bellier-­Beaumont, gave him to her brother Ferréol in nearby Belle-­Vue. Ferréol owned a plantation. Edmond grew up following Ferréol Bellier-­Beaumont around the estate, learning about its fruits, vegetables, and flowers, including one of its oddities—­a vanilla vine Ferréol had kept alive since 1822.

Like all the vanilla on Réunion, Ferréol’s vine was sterile. French colonists had been trying to grow the plant on the island since 1819. After a few false starts—­some orchids were the wrong species, some soon died—­they eventually had a hundred live vines. But Réunion saw no more success with vanilla than Europe’s other colonies had. The orchids seldom flowered and never bore fruit.

Then, one morning late in 1841, as the spring of the Southern Hemisphere came to the island, Ferréol took his customary walk with Edmond and was surprised to find two green capsules hanging from the vine. His orchid, barren for twenty years, had fruit. What came next surprised him even more. Twelve-­year-­old Edmond said he had pollinated the plant himself.

To this day there are people in Réunion who do not believe it. It seems impossible to them that a child, a slave and, above all, an African, could have solved the problem that beat Europe for hundreds of years. They say it was an accident—­that he was trying to damage the flowers after an argument with Ferréol or he was busy seducing a girl in the gardens when it happened.

Ferréol did not believe the boy at first. But when more fruit appeared, days later, he asked for a demonstration. Edmond pulled back the lip of a vanilla flower and, using a toothpick-­sized piece of bamboo to lift the part that prevents self-­fertilization, he gently pinched its pollen-­bearing anther and pollen-­receiving stigma together. Today the French call this le geste d’Edmond—­Edmond’s gesture. Ferréol called the other plantation owners together, and soon Edmond was traveling the island teaching other slaves how to pollinate vanilla orchids. After seven years, Réunion’s annual production was a hundred pounds of dried vanilla pods. After ten years, it was two tons. By the end of the century, it was two hundred tons and had surpassed the output of Mexico.

Ferréol freed Edmond in June 1848, six months before most of Réunion’s other slaves. Edmond was given the last name Albius, the Latin word for “whiter.” Some suspect this was a compliment in racially charged Réunion. Others think it was an insult from the naming registry. Whatever the intention, things went badly. Edmond left the plantation for the city and was imprisoned for theft. Ferréol was unable to prevent the incarceration but succeeded in getting Edmond released after three years instead of five. Edmond died in 1880, at the age of fifty-­one. A small story in a Réunion newspaper, Le Moniteur, described it as a “destitute and miserable end.”

Edmond’s innovation spread to Mauritius, the Seychelles, and the huge island to Réunion’s west, Madagascar. Madagascar has a perfect environment for vanilla. By the twentieth century, it was producing most of the world’s vanilla, with a crop that in some years was worth more than $100 million.

The demand for vanilla increased with the supply. Today it is the world’s most popular spice and, after saffron, the second most expensive. It has become an ingredient in thousands of things, some obvious, some not. Over a third of the world’s ice cream is Jefferson’s original flavor, vanilla. Vanilla is the principal flavoring in Coke, and the Coca-­Cola Company is said to be the world’s largest vanilla buyer. The fine fragrances Chanel No. 5, Opium, and Angel use the world’s most expensive vanilla, worth $10,000 a pound. Most chocolate contains vanilla. So do many cleaning products, beauty products, and candles. In 1841, on the day of Edmond’s demonstration to Ferréol, the world produced fewer than two thousand vanilla beans, all in Mexico, all the result of pollination by bees. On the same day in 2010, the world produced more than five million vanilla beans, in countries including Indonesia, China, and Kenya, almost all of them—­including the ones grown in Mexico—­the result of le geste d’Edmond.


2 | Counting Creators

What is unusual about Edmond’s story is not that a young slave created something important but that he got the credit for it. Ferréol worked hard to ensure that Edmond was remembered. He told Réunion’s plantation owners that it was Edmond who first pollinated vanilla. He lobbied on Edmond’s behalf, saying, “This young negro deserves recognition from this country. It owes him a debt, for starting up a new industry with a fabulous product.” When Jean Michel Claude Richard, director of Réunion’s botanical gardens, said he had developed the technique and shown it to Edmond, Ferréol intervened. “Through old age, faulty memory or some other cause,” he wrote, “Mr. Richard now imagines that he himself discovered the secret of how to pollinate vanilla, and imagines that he taught the technique to the person who discovered it! Let us leave him to his fantasies.” Without Ferréol’s great effort, the truth would have been lost.

In most cases, the truth has been lost. We do not know, for example, who first realized that the fruit of an orchid could be cured until it tastes good. Vanilla is an innovation inherited from people long forgotten. This is not exceptional; it is normal. Most of our world is made of innovations inherited from people long forgotten—­not people who were rare but people who were common.

Before the Renaissance, concepts like authorship, inventorship, or claiming credit barely existed. Until the early fifteenth century, “author” meant “father,” from the Latin word for “master,” auctor. Auctor-­ship implied authority, something that, in most of the world, had been the divine right of kings and religious leaders since Gilgamesh ruled Uruk four thousand years earlier. It was not to be shared with mere mortals. An “inventor,” from invenire, “find,” was a discoverer, not a creator, until the 1550s. “Credit,” from credo, “trust,” did not mean “acknowledgment” until the late sixteenth century.

This is one reason we know so little about who made what before the late 1300s. It is not that no records were made—­writing has been around for millennia. Nor is it that there was no creation—­everything we use today has roots stretching back to the beginning of humanity. The problem is that, until the Renaissance, people who created things didn’t matter much. The idea that at least some people who create things should be recognized was a big step forward. It is why we know that Johannes Gutenberg invented printing in Germany in 1440 but not who invented windmills in England in 1185, and that Giunta Pisano painted the crucifix in Bologna’s Basilica of San Domenico in 1250 but not who made the mosaic of Saint Demetrios in Kiev’s Golden-­Domed Monastery in 1110.

There are exceptions. We know the names of hundreds of ancient Greek philosophers, from Acrion to Zeno, as well as a few Greek engineers of the same period, such as Eupalinos, Philo, and Ctesibius. We also know of a number of Chinese artists from around 400 c.e. onward, including the calligrapher Wei Shuo and her student Wang Xizhi. But the general principle holds. Broadly speaking, our knowledge of who created what started around the middle of the thirteenth century, increased during the European Renaissance of the fourteenth to seventeenth centuries, and has kept increasing ever since. The reasons for the change are complicated and the subject of debate among historians—­they include power struggles within the churches of Europe, the rise of science, and the rediscovery of ancient philosophy—­but there is little doubt that most creators started getting credit for their creations only after the year 1200.

One way this happened was through patents, which give credit within rigorous constraints. The first patents were issued in Italy in the fifteenth century, in Britain and the United States in the seventeenth century, and in France in the eighteenth century. The modern U.S. Patent and Trademark Office granted its first patent on July 31, 1790. It granted its eight millionth patent on August 16, 2011. The patent office does not keep records of how many different people have been granted patents, but economist Manuel Trajtenberg developed a way of working it out. He analyzed names phonetically and compared matches with zip codes, coinventors, and other information to identify each unique inventor. Trajtenberg’s data suggests that more than six million distinct individuals had received U.S. patents by the end of 2011.

The inventors are not distributed evenly across the years. Their numbers are increasing. The first million inventors took 130 years to get their patents, the second million 35 years, the third million 22 years, the fourth million 17 years, the fifth million 10 years, and the sixth million inventors took 8 years. Even with foreign inventors removed and adjustments for population increase, the trend is unmistakable. In 1800, about one in every 175,000 Americans was granted a first patent. In 2000, one in every 4,000 Americans received one.

Not all creations get a patent. Books, songs, plays, movies, and other works of art are protected by copyright instead, which in the United States is managed by the Copyright Office, part of the Library of Congress. Copyrights show the same growth as patents. In 1870, 5,600 works were registered for copyright. In 1886, the number grew to more than 31,000, and Ainsworth Spofford, the librarian of Congress, had to plead for more space. “Again it becomes necessary to refer to the difficulty and embarrassment of prosecuting the annual enumeration of the books and pamphlets recently completed,” he wrote in a report to Congress. “Each year and each month adds to the painfully overcrowded condition of the collections, and although many rooms have been filled with the overflow from the main Library, the difficulty of handling so large an accumulation of unshelved books is constantly growing.” This became a refrain. In 1946, register of copyrights Sam Bass Warner reported that “the number of registrations of copyright claims rose to 202,144 the greatest number in the history of the Copyright Office, and a number so far beyond the capacities of the existing staff that Congress, responding to the need, generously provided for additional personnel.” In 1991, copyright registrations reached a peak of more than 600,000. As with patents, the increase exceeded population growth. In 1870, there was 1 copyright registration for every 7,000 U.S. citizens. In 1991, there was one copyright registration for every 400 U.S. citizens.

More credit is given for creation in science, too. The Science Citation Index tracks the world’s leading peer-reviewed journals in science and technology. For 1955, the index lists 125,000 new scientific papers—about one for every 1,350 U.S. citizens. For 2005, it lists more than 1,250,000 scientific papers—one for every 250 U.S. citizens.

Patents, copyrights, and peer-reviewed papers are imperfect proxies. Their growth is driven by money as well as knowledge. Not all work that gets this recognition is necessarily good. And, as we shall see later, giving credit to individuals is misleading. Creation is a chain reaction: thousands of people contribute, most of them anonymous, all of them creative. But, with numbers so big, and even though we miscount and undercount, the point is hard to miss: over the last few centuries, more people from more fields have been getting more credit as creators.

We have not become more creative. The people of the Renaissance were born into a world enriched by tens of thousands of years of human invention: clothes, cathedrals, mathematics, writing, art, agriculture, ships, roads, pets, houses, bread, and beer, to name a fraction. The second half of the twentieth century and the first decades of the twenty-first century may appear to be a time of unprecedented innovation, but there are other reasons for this, and we will discuss them later. What the numbers show is something else: when we start counting creators, we find that a lot of people create. In 2011, almost as many Americans received their first patent as attended a typical NASCAR race. Creating is not for an elite few. It is not even close to being for an elite few.

The question is not whether invention is the sole province of a tiny minority but its opposite: how many of us are creative? The answer, hidden in plain sight, is all of us. Resistance to the possibility that Edmond, a boy with no formal education, could create something important is grounded in the myth that creating is extraordinary. Creating is not extraordinary, even if its results sometimes are. Creation is human. It is all of us. It is everybody.


3 | The Species of New

Even without numbers, it is easy to see that creation is not the exclusive domain of rare geniuses with occasional inspiration. Creation surrounds us. Everything we see and feel is a result of it or has been touched by it. There is too much creation for creating to be infrequent.

This book is creation. You probably heard about it via creation, or the person who told you about it did. It was written using creation, and creation is one reason you can understand it. You are either lit by creation now or you will be, come sundown. You are heated or cooled or at least insulated by creation—by clothes and walls and windows. The sky above you is softened by fumes and smog in the day and polluted by electric light at night—all results of creation. Watch, and it will be crossed by an airplane or a satellite or the slow dissolve of a vapor trail. Apples, cows, and all other things agricultural, apparently natural, are also creation: the result of tens of thousands of years of innovation in trading, breeding, feeding, farming, and—unless you live on the farm—preservation and transportation.

You are a result of creation. It helped your parents meet. It likely assisted your birth, gestation, and maybe conception. Before you were born, it eradicated diseases and dangers that could have killed you. After, it inoculated and protected you against others. It treated the illnesses you caught. It helps heal your wounds and relieve your pain. It did the same for your parents and their parents. It recently cleaned you, fed you, and quenched your thirst. It is why you are where you are. Cars, shoes, saddles, or ships transported you, your parents, or your grandparents to the place you now call home, which was less habitable before creation—too hot in the summer or too cold in the winter or too wet or too swampy or too far from potable water or freely growing food or prowled by predators or all of the above.

Listen, and you hear creation. It is in the sound of passing sirens, distant music, church bells, cell phones, lawn mowers and snow blowers, basketballs and bicycles, waves on breakers, hammers and saws, the creak and crackle of melting ice cubes, even the bark of a dog—a wolf changed by millennia of selective breeding by humans—or the purr of a cat, the descendant of one of just five African wildcats humans have been selectively breeding for ten thousand years. Anything that is as it is due to conscious human intervention is invention, creation, new.

Creation is so around and inside us that we cannot look without seeing it or listen without hearing it. As a result, we do not notice it at all. We live in symbiosis with new. It is not something we do; it is something we are. It affects our life expectancy, our height and weight and gait, our way of life, where we live, and the things we think and do. We change our technology, and our technology changes us. This is true for every human being on the planet. It has been true for two thousand generations, ever since the moment our species started thinking about improving its tools.

Anything we create is a tool—a fabrication with purpose. There is nothing special about species with tools. Beavers make dams. Birds build nests. Dolphins use sponges to hunt for fish. Chimpanzees use sticks to dig for roots and stone hammers to open hard-shelled food. Otters use rocks to break open crabs. Elephants repel flies by making branches into switches they wave with their trunks. Clearly our tools are better. The Hoover Dam beats the beaver dam. But why?

Our tools have not been better for long. Six million years ago, evolution forked. One path led to chimpanzees—distant relatives, but the closest living ones we have. The other path led to us. Unknown numbers of human species emerged. There were Homo habilis, Homo heidelbergensis, Homo ergaster, Homo rudolfensis, and many others, some whose status is still controversial, some still to be discovered. All human. None us.

Like other species, these humans used tools. The earliest were pointed stones used to cut nuts, fruit, and maybe meat. Later, some human species made two-sided hand axes requiring careful stonework and nearly perfect symmetry. But apart from minor adjustments, human tools were monotonous for a million years, the same no matter when or where they were used, passed through twenty-five thousand generations without modification. Despite the mental focus needed to make it, the design of that early human hand ax, like the design of a beaver dam or bird’s nest, came from instinct, not thought.

Humans that looked like us first appeared 200,000 years ago. This was the species called Homo sapiens. Members of Homo sapiens did not act like us in one important way: their tools were simple and did not change. We do not know why. Their brains were the same size as ours. They had our opposable thumbs, our senses, and our strength. Yet for 150,000 years, like the other human species of their time, they made nothing new.

Then, 50,000 years ago, something happened. The crude, barely recognizable stone tools Homo sapiens had been using began to change—and change quickly. Until this moment, this species, like all other animals, did not innovate. Their tools were the same as their parents’ tools and their grandparents’ tools and their great-grandparents’ tools. They made them, but they didn’t make them better. The tools were inherited, instinctive, and immutable—products of evolution, not conscious creation.

Then came by far the most important moment in human history—the day one member of the species looked at a tool and thought, “I can make this better.” The descendants of this individual are called Homo sapiens sapiens. They are our ancestors. They are us. What the human race created was creation itself.

The ability to change anything was the change that changed everything. The urge to make better tools gave us a massive advantage over all other species, including rival species of humans. Within a few tens of thousands of years, all other humans were extinct, displaced by an anatomically similar species with only one important difference: ever-improving technology.

What makes our species different and dominant is innovation. What is special about us is not the size of our brains, speech, or the mere fact that we use tools. It is that each of us is in our own way driven to make things better. We occupy the evolutionary niche of new. The niche of new is not the property of a privileged few. It is what makes humans human.

We do not know exactly what evolutionary spark ignited innovation 50,000 years ago. It left no trace in the fossil record. We do know that our bodies, including our brain size, did not change—our immediate pre-innovation ancestor, Homo sapiens, looked exactly like us. That makes the prime suspect our mind: the precise arrangement of, and connections between, our brain cells. Something structural seems to have changed there—perhaps as a result of 150,000 years of fine-tuning. Whatever it was, it had profound implications, and today it lives on in everyone. Behavioral neurologist Richard Caselli says, “Despite great qualitative and quantitative differences between individuals, the neurobiologic principles of creative behavior are the same from the least to the most creative among us.” Put simply, we all have creative minds.

This is one reason the creativity myth is so terribly wrong. Creating is not rare. We are all born to do it. If it seems magical, it is because it is innate. If it seems like some of us are better at it than others, that is because it is part of being human, like talking or walking. We are not all equally creative, just as we are not all equally gifted orators or athletes. But we can all create.

The human race’s creative power is distributed in all of us, not concentrated in some of us. Our creations are too great and too numerous to come from a few steps by a few people. They must come from many steps by many people. Invention is incremental—a series of slight and constant changes. Some changes open doors to new worlds of opportunity and we call them breakthroughs. Others are marginal. But when we look carefully, we will always find one small change leading to another, sometimes within one mind, often among several, sometimes across continents or between generations, sometimes taking hours or days and occasionally centuries, the baton of innovation passing in an endless relay of renewal. Creating accretes and compounds, and as a consequence, every day, each human life is made possible by the sum of all previous human creations. Every object in our life, however old or new, however apparently humble or simple, holds the stories, thoughts, and courage of thousands of people, some living, most dead—the accumulated new of fifty thousand years. Our tools and art are our humanity, our inheritance, and the everlasting legacy of our ancestors. The things we make are the speech of our species: stories of triumph, courage, and creation, of optimism, adaptation, and hope; tales not of one person here and there but of one people everywhere; written in a common language, not African, American, Asian, or European but human.

There are many beautiful things about creating being human and innate. One is that we all create in more or less the same way. Our individual strengths and tendencies of course cause differences, but they are small and few relative to the similarities, which are great and many. We are more like Leonardo, Mozart, and Einstein than not.


4 | An End to Genius

The Renaissance belief that creating is reserved for genius survived through the Enlightenment of the seventeenth and eighteenth centuries, the Romanticism of the late eighteenth and nineteenth centuries, and the Industrial Revolution of the nineteenth century. It was not until the middle of the twentieth century that the alternative position—that everyone is capable of creation—first emerged from early studies of the brain.

In the 1940s, the brain was an enigma. The body’s secrets had been revealed by several centuries of medicine, but the brain, producing consciousness without moving parts, remained a puzzle. Here is one reason theories of creation resorted to magic: the brain, throne of creation, was three pounds of gray and impenetrable mystery.

As the West recovered from World War II, new technologies appeared. One was the computer. This mechanical mind made understanding the brain seem possible for the first time. In 1952, Ross Ashby synthesized the excitement in a book called Design for a Brain. He summarized the new thinking elegantly:

The most fundamental facts are that the earth is over 2,000,000,000 years old and that natural selection has been winnowing the living organisms incessantly. As a result they are today highly specialized in the arts of survival, and among these arts has been the development of a brain, an organ that has been developed in evolution as a specialized means to survival. The nervous system, and living matter in general, will be assumed to be essentially similar to all other matter. No deus ex machina will be invoked.

Put simply: brains don’t need magic.

A San Franciscan named Allen Newell came of academic age during this period. Drawn by the energy of the era, he abandoned his plan to become a forest ranger (in part because his first job was feeding gangrenous calves’ livers to fingerling trout), became a scientist instead, and then, one Friday afternoon in November 1954, experienced what he would later call a “conversion experience” during a seminar on mechanical pattern recognition. He decided to devote his life to a single scientific question: “How can the human mind occur in the physical universe?”

“We now know that the world is governed by physics,” he explained, “and we now understand the way biology nestles comfortably within that. The issue is how does the mind do that as well? The answer must have the details. I’ve got to know how the gears clank, how the pistons go and all of that.”

As he embarked on this work, Newell became one of the first people to realize that creating did not require genius. In a 1959 paper called “The Processes of Creative Thinking,” he reviewed what little psychological data there was about creative work, then set out his radical idea: “Creative thinking is simply a special kind of problem-solving behavior.” He made the point in the understated language academics use when they know they are on to something:

The data currently available about the processes involved in creative and non-creative thinking show no particular differences between the two. It is impossible to distinguish, by looking at the statistics describing the processes, the highly skilled practitioner from the rank amateur. Creative activity appears simply to be a special class of problem-solving activity characterized by novelty, unconventionality, persistence, and difficulty in problem formulation.

It was the beginning of the end for genius and creation. Making intelligent machines forced new rigor on the study of thought. The capacity to create was starting to look more and more like an innate function of the human brain—­possible with standard equipment, no genius necessary.

Newell did not claim that everyone was equally creative. Creating, like any human ability, comes in a spectrum of competence. But everybody can do it. There is no electric fence between those who can create and those who cannot, with genius on one side and the general population on the other.

Newell’s work, along with the work of others in the artificial intelligence community, undermined the myth of creativity. As a result, some of the next generation of scientists started to think about creation differently. One of the most important of these was Robert Weisberg, a cognitive psychologist at Philadelphia’s Temple University.

Weisberg was an undergraduate during the first years of the artificial intelligence revolution, spending the early 1960s in New York before getting his PhD from Princeton and joining the faculty at Temple in 1967. He spent his career proving that creating is innate, ordinary, and for everybody.

Weisberg’s view is simple. He builds on Newell’s contention that creative thinking is the same as problem solving, then extends it to say that creative thinking is the same as thinking in general but with a creative result. In Weisberg’s words, “when one says of someone that he or she is ‘thinking creatively,’ one is commenting on the outcome of the process, not on the process itself. Although the impact of creative ideas and products can sometimes be profound, the mechanisms through which an innovation comes about can be very ordinary.”

Said another way, normal thinking is rich and complex—so rich and complex that it can sometimes yield extraordinary—or “creative”—results. We do not need other processes. Weisberg shows this in two ways: with carefully designed experiments and detailed case studies of creative acts—from the painting of Picasso’s Guernica to the discovery of DNA and the music of Billie Holiday. In each example, by using a combination of experiment and history, Weisberg demonstrates how creating can be explained without resorting to genius and great leaps of the imagination.

Weisberg has not written about Edmond, but his theory works for Edmond’s story. At first, Edmond’s discovery of how to pollinate vanilla came from nowhere and seemed miraculous. But toward the end of his life, Ferréol Bellier-Beaumont revealed how the young slave solved the mystery of the black flower.

Ferréol began his story in 1793, when German naturalist Konrad Sprengel discovered that plants reproduced sexually. Sprengel called it “the secret of nature.” The secret was not well received. Sprengel’s peers did not want to hear that flowers had a sex life. His findings spread anyway, especially among botanists and farmers who were more interested in growing good plants than in judging floral morality. And so Ferréol knew how to manually fertilize watermelon, by “marrying the male and female parts together.” He showed this to Edmond, who, as Ferréol described it, later “realized that the vanilla flower also had male and female elements, and worked out for himself how to join them together.” Edmond’s discovery, despite its huge economic impact, was an incremental step. It is no less creative as a result. All great discoveries, even ones that look like transforming leaps, are short hops.

Weisberg’s work, with subtitles like Genius and Other Myths and Beyond the Myth of Genius, did not eliminate the magical view of creation or the idea that people who create are a breed apart. It is easier to sell secrets. Titles available in today’s bookstores include 10 Things Nobody Told You About Being Creative, 39 Keys to Creativity, 52 Ways to Get and Keep Your Creativity Flowing, 62 Exercises to Unlock Your Most Creative Ideas, 100 What-Ifs of Creativity, and 250 Exercises to Wake Up Your Brain. Weisberg’s books are out of print. The myth of creativity does not die easily.

But it is becoming less fashionable, and Weisberg is not the only expert advocating for an epiphany-free, everybody-can theory of creation. Ken Robinson was awarded a knighthood for his work on creation and education and is known for the moving, funny talks he gives at an annual conference in California called TED (for technology, entertainment, and design). One of his themes is how education suppresses creation. He describes “the really extraordinary capacity that children have, their capacity for innovation,” and says that “all kids have tremendous talents and we squander them, pretty ruthlessly.” Robinson’s conclusion is that “creativity now is as important in education as literacy, and we should treat it with the same status.” Cartoonist Hugh MacLeod makes the same point more colorfully: “Everyone is born creative; everyone is given a box of crayons in kindergarten. Being suddenly hit years later with the ‘creative bug’ is just a wee voice telling you, ‘I’d like my crayons back, please.’ ”

Praise

“Entertaining. . . . [E]nlightening. . . . Might be the genre’s be-all and end-all. . . . If you want to tap your creative potential, buy this book. It’s the last one you’ll ever need to read.”
—Toronto Star

“One of the most creative books on creativity I have ever read, a genuinely inspiring journey through the worlds of art, science, business and culture that will forever change how you think about where new ideas come from.”
—William C. Taylor, cofounder and editor of Fast Company and author of Practically Radical
 
“[Ashton’s] is a democratic idea—a scientific version of the American dream. . . . [A]n approachable, thought-provoking book that encourages everyone to be the best they can be.”
—The Guardian (London)
 
“[How to Fly a Horse] takes on creation’s most pernicious clichés. . . . [Ashton] arrives at his theories by dint of his own hard work. . . . Being a genius is hard work. But that spark is in all of us.” 
—The Washington Post
 
“An inspiring vision of creativity that’s littered with practical advice, and is a cracking read to boot.”
—BBC Focus
 
“[An] entertaining and inspiring meditation on the nature of creative innovation. . . . Fans of Malcolm Gladwell and Steven Levitt will enjoy Ashton’s hybrid nonfiction style, which builds a compelling cultural treatise from a coalescence of engaging anecdotes.”
—Booklist

“Ashton’s beautifully written exploration of creativity explodes so many myths and opens so many doors that readers, like me, will be left reeling with possibilities. We can all create, we can all innovate. Move over, Malcolm Gladwell; Ashton has done you one better.”
—Larry Downes, author of the New York Times bestseller Unleashing the Killer App and co-author of Big Bang Disruption
 
“If you have ever wondered what it takes to create something, read this inspiring and insightful book. Using examples ranging from Mozart to the Muppets, Kevin Ashton shows how to tap the creative abilities that lurk in us all. There are no secrets, no shortcuts; just ordinary steps we can all take to bring something new into the world. Ashton’s message is direct and hopeful: creativity isn’t just for geniuses—it’s for everybody.” 
—Joseph T. Hallinan, author of Why We Make Mistakes

“A detailed and persuasive argument for how creativity actually works—not through magical bursts of inspiration but with careful thought, dogged problem-solving, and hard-won insight. Ashton draws on a wealth of illuminating and entertaining stories from the annals of business, science, and the arts to show how any of us can apply this process to our own work.”
—Mason Currey, author of Daily Rituals: How Artists Work

“If you consider yourself a curious person then you will love this book. Ashton shares so many delightful stories of where things come from and how things came to be, I seriously believe that it will make anyone who reads it smarter.”
—Simon Sinek, New York Times bestselling author of Start With Why and Leaders Eat Last
 
“How to Fly a Horse solves the mysteries of invention. Kevin Ashton, the innovator who coined the ‘internet of things,’ shows that creativity is more often the result of ordinary steps than extraordinary leaps. With engrossing stories, provocative studies, and lucid writing, this book is not to be missed.”
—Adam Grant, professor of management at the Wharton School and New York Times bestselling author of Give and Take
  
“Kevin Ashton’s new book How to Fly a Horse is all about the creative sorcery and motivational magic necessary to make impossible things happen in teams or as individuals. Through numerous examples of creative genius ranging from Einstein to the creators of South Park to the invention of jet planes and concertos, Ashton reveals the secrets of the great scientists, artists, and industrialists of the last few centuries.”
—John Maeda, author of The Laws of Simplicity and founder of the SIMPLICITY Consortium at the MIT Media Lab

Author


KEVIN ASHTON led pioneering work on RFID (radio frequency identification) networks, for which he coined the term “the Internet of Things,” and cofounded the Auto-ID Center at MIT. His writing about innovation and technology has appeared in Quartz, Medium, The Atlantic, and The New York Times.
