I have always been intrigued by the paradox of expertise. It seems that the more expert one becomes in an area of specialization, the less creative and innovative that person becomes. The paradox is that people who know more see less, and people who know less see more. Apple Computer Inc. founder Steve Jobs attempted, without success, to get Atari and Hewlett-Packard interested in his and Steve Wozniak’s personal computer. As Jobs recounts, “So we went to Atari and said, ‘Hey, we’ve got this amazing thing, even built with some of your parts, and what do you think about funding us? Or we’ll give it to you. We just want to do it. Pay our salary; we’ll come work for you.’ And their experts laughed and said, ‘No.’ So then we went to Hewlett-Packard, and they said, ‘Hey, we don’t need you. You haven’t got through college yet.’” What is it that freezes the expert’s thought and makes it difficult to consider new things that deviate from established theories?
The figure below illustrates a series of progressively modified drawings that change almost imperceptibly from a man into a woman. When test subjects are shown the series one drawing at a time, their perception of the intermediate drawings is biased according to which end of the series they started from. Test subjects who start by viewing a picture that is clearly a man are biased in favor of continuing to see a man long after an “objective observer” (one who has seen only a single picture) recognizes that the man is now a woman. Similarly, test subjects who start at the woman end of the series are biased in favor of continuing to see a woman. Once an observer has formed an image, that is, once he or she has developed an expectation concerning the subject being observed, this expectation influences future perceptions of the subject.
Ken Olson, president, chairman, and founder of Digital Equipment Corp., thought the idea of a personal computer absurd: “There is no reason anyone would want a computer in their home.” Robert Goddard, the father of modern rocketry, was widely ridiculed by scientists for his revolutionary liquid-fueled rockets. Even the New York Times chimed in with a 1920 editorial claiming that Goddard lacked even the basic knowledge ladled out daily in high school science classes. Pierre Pachet, a renowned physiology professor, declared, “Louis Pasteur’s theory of germs is ridiculous fiction.” If we experience any strain in imagining a possibility, we quickly conclude it’s impossible. This principle also helps explain why evolutionary change often goes unnoticed by the expert. The greater the expert’s commitment to an established view, the harder it is for the expert to do anything but keep repeating that view. It also explains the phenomenon of a beginner who comes up with the breakthrough insight or idea that was overlooked by the experts who worked on the same problem for years. There is also a tendency to assimilate new data into pre-existing images.
In the early 1900s, psychologist Cheves W. Perky demonstrated this principle in several experiments. She would ask a group of subjects to form a mental image of a banana and to mentally project it onto a blank wall. She would then surreptitiously project a very dim slide of a banana. Anyone coming into the room would see the slide immediately, but the subjects did not. Perky concluded that the subjects had incorporated the slide into their mental image of a banana. Modern experiments have borne out what is now called the Perky effect: holding a mental image interferes with perception and understanding. This is why experts tend to assimilate new insights, ideas, and concepts into their existing view. Their mental image of the established view interferes with their perception and understanding of new ideas and concepts. In the Perky experiment, the subjects did not see the slide; in real life, many physicists could not see Einstein’s theory of relativity because of their established, accepted view. For years, they tried to fit his theory into the established framework, without success.
What happened in this experiment is what happens in real life: faced with ambiguous stimuli, people form some sort of tentative hypothesis about what they see. The longer they are exposed to the ambiguous image, the greater the confidence they develop in this initial, and perhaps erroneous, impression, and the greater the impact this initial hypothesis has on subsequent perceptions.
Suppose an expert has an established theory about the danger of boxes and their effect on human life and the environment. The theory is that boxes might be harmful and that their use should be regulated. Now suppose that I leave a box on the floor, and my wife trips on it, falling against my son, who is carrying a carton of eggs, which then fall and break. The expert’s conclusion about an event like this would be that the best way to prevent broken eggs is to outlaw leaving boxes on the floor. As silly as this example is, it is analogous to what is happening in the world of global warming. The chief difference is that in the case of atmospheric CO2 and climate catastrophe, the chain of inference is longer and less plausible than in my example.
If you survey the history of science, it is apparent that most individuals who created radical innovations did not do so simply because they knew more than others. One of the most important experiences Nobel laureate Richard Feynman had in his life was reading James Watson’s typescript of what was to become his famous memoir, The Double Helix, about his discovery, together with Francis Crick, of the structure of DNA. At the time, Feynman had become unproductive and had begun to believe he had run out of ideas. What Feynman discovered was that Watson had been involved in making a fundamental advance in science while being completely out of touch with what everybody else in his field was doing. The Double Helix is a tale of boundless ambition, impatience with authority, and disdain, if not contempt, for received opinion. “A goodly number of scientists,” Watson explained, “are not only narrow-minded and dull but also just stupid.” When Feynman read that, he wrote one word on his notepad, in capitals: DISREGARD. The word became his motto. That, he said, was the whole point. That was what he had forgotten, and why he had been making so little progress. The way for thinkers like himself to make a breakthrough was to be ignorant of what everybody else was doing and to make their own interpretations and guesses.
So Feynman “stopped trying to keep up with what others were doing and went back to his roots, comparing experiment with theory, making guesses that were all his own.” Thus he became creative again, as he had been before becoming a famous academic physicist, when he had simply been working things out for himself. While this is an important lesson for science, it is a supreme lesson for any discipline where “current knowledge” can be dominated by views that are simply incoherent yet go unquestioned by the experts who have worked on the same problem for years.
Make your own interpretations of your experiences to shape your own beliefs and concepts about your world. This is the lesson Feynman called the most important of his life.