The Power of Positive Genius
Before I was born, my father, who was a neuroscientist at UCLA at the time, made me an unwilling subject of one of the very first EEG experiments conducted on an unborn child. He and his colleagues hooked up electrodes to the belly of my very pregnant (and clearly very patient) mom to see if they could detect and analyze my brain wave patterns. The tests failed (I’m not sure what that says about my brain), but some influences in our lives run deep. Even before birth, I was wired for a love of psychology and science.
A mere six years later, I willingly volunteered for another neuroscience experiment, which, though of course I had no way of knowing it at the time, would ultimately lead to the writing of this book. By that point my father was a professor at Baylor University. All of my babysitters happened to be students from his introductory psychology classes, and I was in love with all of them. But as I slowly started realizing that my relationships with them weren’t going as well as I’d hoped (for instance, my parents had to pay the girls at the end of the date), I decided—after observing the successes of Ariel in The Little Mermaid—that I would need to become part of their world. So I asked my dad if I could be part of one of his classroom demonstrations. He was so excited that his son might be following in his footsteps that he didn’t stop to wonder if I had ulterior motives—as indeed I did.
Regardless, he brought me to Baylor University for one of his famous lectures. I remember sitting in front of the class, hooked up to the bulky, brown brain wave machine, as he attached electrode after electrode to my scalp with conductive jelly. I didn’t care; I was just happy because all of my girlfriends’ eyes were on me.
But in his excitement about having his son in class, my dad made a simple mistake. He forgot to ground the wire and left it lying across a copper strip on the floor. When he turned on the machine, the current passed right through me—it was as though I had stuck my finger in a socket. To this day, I don’t blame my dad for shocking me. I do blame him for laughing along with the entire class as I angrily pulled off all my electrodes and strode off with as much indignation as a six-year-old could muster.
Not surprisingly, I never did get to date any of his students. But I am grateful to my dad nonetheless for hooking me up to that torture machine, because his experiments gave me a lifelong fascination with studying how the brain perceives the world. That evil instrument was a primitive evoked potential machine, a device that records electrical activity along the scalp, allowing neuroscientists to measure levels of activity in the brain as it processes stimuli from the external world.
Look around at the people in your office, on the subway, sitting across from you at the cafe. Have you ever wondered if the world you see is the same one they see? Have you worked with a stressed manager who constantly points out only the flaws and none of the good, or spent time with a relative during the holidays who complains about everything despite being surrounded by love, and thought to yourself: How could they possibly see the world that way?
The reason some people see the world so differently from others is that the human brain doesn’t just take a picture of the external world like a camera; it is constantly interpreting and processing the information it receives. Every time the world provides us with information, whether the report of a down stock market, a stressful e-mail, or a smiling coworker, our brains expend energy creating our understanding of this information. This energy is called “evoked potential,” and EEGs were some of the first instruments that allowed us to peek behind the curtain and better understand this process.
While your brain receives eleven million pieces of information every second from your environment, it can process only forty bits per second, which means it has to choose what tiny percentage of this input to process and attend to, and what huge chunk to dismiss or ignore.1 Thus your reality is a choice; what you choose to focus on shapes how you perceive and interpret your world.
Today, using EEGs, fMRIs, and eye-tracking machines, we have the ability to measure and study those energy patterns. And more important, we are now learning how to change these energy patterns to help us create a more positive interpretation of the world around us.
This is key, because the better your brain is at using its energy to focus on the positives, the greater your chances of success.
This book is all about how to evoke your potential by changing your mindset.
The goal of science is prediction. If you take vitamin C, doctors want to be able to predict whether it will lower your chances of getting a cold. If you drop a bowling ball from a height of one hundred feet, physicists want to predict how hard it will hit the ground.
The goal of business is to build revenue and create sustainable, growing income. Since a business can only be as successful as the people working in it, companies have long sought a way to use science to predict high performance in individuals. Yet for all the research that has been done on the topic, no theory has ever been able to fully explain the science of human potential—until now.
Back in the nineteenth century, Sir Francis Galton was among the first to study how our brains’ energy patterns predicted performance. Without the aid of EEGs, of course, he posited that intelligence could be quantified and predicted by the speed of the brain’s processing system.2 The faster your brain is at discerning sensory stimuli and reacting, he hypothesized, the smarter you are. But of course, reaction time is only one small piece of the complex equation of human intelligence.
From the 1920s through the 1980s, scientists thought potential could be measured by IQ, which was basically just a measure of one’s verbal and math skills. So businesses and governments poured money into pumping up math and reading in public schools and shut down the arts and music programs. HR departments designed tests based upon IQ, then hired everyone from salespeople to CEOs using those same yardsticks of intelligence.
Problem was, they had it all wrong. As it turns out, IQ and technical skills combined predict only 20 to 25 percent of job success.3 That means that over 75 percent of your career outcome has nothing to do with your intelligence and training—which is a huge problem because in a down economy companies spend a majority of their training budgets attempting to raise employee intelligence and technical skills. This money, scientifically speaking, is irresponsibly spent.
So how else can we predict professional success? If IQ is a bad predictor, maybe SAT scores, a more modern testing tool, would be better? Not the case. As a matter of fact, they are much worse. SAT scores predict only 8 to 15 percent of college freshmen’s GPA, which means that for around 88.5 percent of college students SAT scores are no better at predicting academic success than a pair of dice.4 (Again, it is a shame that we waste hours and hours preparing for predictive tests that are not actually predictive.)
The next metric businesses tried to use to predict prospective employees’ performance was grades. High school grades are twice as predictive of college success as SAT scores. Great, grades must predict potential for future success in the workplace too, right? Thomas J. Stanley, PhD, author of The Millionaire Mind, begs to differ. After a decade of research, he found no correlation between grades and professional success: a coin flip would be as predictive of greatness as grades.5 This explains the oft-cited paradox that so many C students in business school end up running companies and so many A students end up working for them.
Enter researchers like Howard Gardner and Peter Salovey. Gardner was the first to argue that the ability to understand one’s own feelings as well as the feelings of others was more important than IQ. In 1990, two psychologists, Peter Salovey at Yale (whom you will read more about later) and John D. Mayer at the University of New Hampshire, published an earth-shattering paper arguing that IQ was worthless and that the ability to understand feelings was a far greater predictor of human potential.6 They dubbed this emotional intelligence.
Most of you are probably familiar with emotional intelligence. It refers to your ability to regulate your emotions, and for the past two decades it has been thought to be the key to succeeding in the often stressful and volatile world of business. Spurred on by Daniel Goleman’s internationally bestselling book Emotional Intelligence, which popularized research like Salovey’s, companies all over the world began testing employees’ and potential employees’ emotional intelligence quotient (EQ) instead of IQ. The big debate among academics and at companies became, Which is more important, IQ or EQ? This is where society and science took a major wrong turn. Now, please do not misunderstand, I think emotional intelligence was one of the best theories to come out of psychology labs in the 1990s. But the question of which kind of intelligence was more important was the wrong one.
Soon, Gardner introduced his second main category of intelligence, the ability to understand and relate to other people. He called it “social intelligence,” and again, Goleman introduced it to the business world with his bestselling book Social Intelligence. Again, the science was valid, but its value as a predictor of potential was undercut by the misguided “which is most important” debate.
Companies and researchers have been arguing this question ever since. Which is most important: IQ, emotional intelligence, or social intelligence? This is talking in circles. It’s like asking which is more important in sports, offense or defense, or who is more important to a business, clients or employees. To be truly successful, instead of thinking about intelligence in isolation, we need to focus on how to combine all our intelligences.
Once I immersed myself in the research, it couldn’t have been clearer. Yes, all these intelligences matter, but what matters most is how your brain knits them together. Thus the question should not be which intelligence is most important, but how we can learn to harness and amplify all of them. And how can we?
Copyright © 2013 by Shawn Achor. All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.