I’ve been trying to figure out if this article, written as a speech to West Point cadets by William Deresiewicz, a noted American writer and former academic, might have some insights for us. This is a very long quote from the article:
What can solitude have to do with leadership? Solitude means being alone, and leadership necessitates the presence of others—the people you’re leading. When we think about leadership in American history we are likely to think of Washington, at the head of an army, or Lincoln, at the head of a nation, or King, at the head of a movement—people with multitudes behind them, looking to them for direction. When we think of solitude, we are apt to think of Thoreau, a man alone in the woods, keeping a journal and communing with nature in silence.
Things have changed since I went to college in the ’80s. Everything has gotten much more intense. You have to do much more now to get into a top school like Yale or West Point, and you have to start a lot earlier. My peers and I didn’t begin thinking about college until we were juniors, and maybe we each did a couple of extracurriculars. But I know what it’s like for you guys now. It’s an endless series of hoops that you have to jump through, starting from way back, maybe as early as junior high school: classes and standardized tests, extracurriculars in school and extracurriculars outside of school; test prep courses, admissions coaches, private tutors.
I sat on the Yale College admissions committee a couple of years ago. The first thing the admissions officer would do in presenting a case to the rest of the committee was read what they call in admissions lingo the “brag,” the list of the student’s extracurriculars. Well, it turned out that a student who had 6 or 7 extracurriculars was already in trouble, because the students who got in usually had—in addition to perfect grades and top scores—10 or 12.
What I saw around me were great kids who had been trained to be world-class hoop jumpers. Any goal you set them, they could achieve. Any test you gave them, they could pass with flying colors. They were, as one of them put it herself, “excellent sheep.” I had no doubt that they would continue to jump through hoops and ace tests and go on to Harvard Business School, or Michigan Law School, or Johns Hopkins medical school, or Goldman Sachs, or McKinsey consulting, or whatever. And this approach would indeed take them far in life. They would come back for their 25th reunion as partners at White & Case, or attending physicians at Mass General, or assistant secretaries in the Department of State.
That is exactly what places like Yale mean when they talk about training leaders: educating people who make a big name for themselves in the world, people with impressive titles that the university can brag about; people who make it to the top by climbing the greasy pole of whatever hierarchy they decide to attach themselves to.
Why is it that the best people so often are stuck in the middle and the people who are running things—the leaders—are the mediocrities? Because excellence isn’t usually what gets you up the greasy pole.
What gets you up is a talent for maneuvering—kissing up to the people above you, kicking down to the people below you. Pleasing your teachers, pleasing your superiors, picking powerful mentors and riding their coattails until it’s time to stab them in the back. Jumping through hoops. Getting along by going along. Being whatever other people want you to be, so that it finally comes to seem that, like the manager of the Central Station, you have nothing inside you at all. Not taking stupid risks like trying to change how things are done or question why they’re done.
We have a crisis of leadership in America because our overwhelming power and wealth, earned under earlier generations of leaders, made us complacent, and for too long we have been training leaders who only know how to keep the routine going. Who can answer questions, but don’t know how to ask them. Who can fulfill goals, but don’t know how to set them. Who think about how to get things done, but not whether they’re worth doing in the first place. What we have now are the greatest technocrats the world has ever seen, people who have been trained to be incredibly good at one specific thing, but who have no interest in anything beyond their area of expertise.
What we don’t have are people who can think for themselves; people who can formulate a new way of doing things, a new way of looking at things; people with vision.
A team of researchers at Stanford wanted to figure out how today’s college students were able to multitask so much more effectively than adults. How do they manage to do it, the researchers asked? The answer, they discovered—and this is by no means what they expected—is that they don’t. The enhanced cognitive abilities the investigators expected to find, the mental faculties that enable people to multitask effectively, were simply not there. In other words, people do not multitask effectively. And here’s the really surprising finding: The more people multitask, the worse they are not just at other mental abilities, but at multitasking itself.
One thing that made the study different from others is that the researchers didn’t test people’s cognitive functions while they were multitasking. They separated the subject group into high multitaskers and low multitaskers, and then used a different set of tests to measure the kinds of cognitive abilities involved in multitasking. They found that in every case the high multitaskers scored worse. They were worse at distinguishing between relevant and irrelevant information and ignoring the latter. They were more easily distracted. They were more disorganized, unable to keep information in the right conceptual boxes and retrieve it quickly. And they were even worse at the very thing that defines multitasking: switching between tasks.
Multitasking, in short, impairs your ability to think. Thinking isn’t about learning other people’s ideas, or memorizing a body of information. It requires concentrating on one thing long enough to develop an idea of your own. You simply cannot do that in bursts of 20 seconds at a time, constantly interrupted by Facebook messages or Twitter tweets, or fiddling with your iPod, or watching something on YouTube.
My first thought is never my best thought. My first thought is always someone else’s; it’s always what I’ve already heard about the subject, always the conventional wisdom. It’s only by concentrating, sticking to the question, being patient, letting all the parts of my mind come into play that I arrive at an original idea. By giving my brain a chance to make associations, draw connections, take me by surprise. And often even that idea doesn’t turn out to be very good. I need time to make mistakes and recognize them, to make false starts and correct them, to outlast my impulses, to defeat my desire to declare the job done and move on to the next thing.
I used to have students who bragged to me about how fast they wrote their papers. I would tell them that the great German novelist Thomas Mann said that a writer is someone for whom writing is more difficult than it is for other people. The best writers write much more slowly than everyone else, and the better they are, the slower they write. James Joyce wrote Ulysses, the greatest novel of the 20th century, at the rate of about a hundred words a day for seven years. T.S. Eliot, one of the greatest poets our country has ever produced, wrote about 150 pages of poetry over the course of his entire 25-year career. That’s half a page a month. So it is with any other form of thought. You do your best thinking by slowing down and concentrating.
Concentration. Think about what the word means. It means gathering yourself together into a single point rather than letting yourself be dispersed everywhere into a cloud of electronic and social input. It seems to me that Facebook and Twitter and YouTube—and just so you don’t think this is a generational thing, television and radio and magazines and even newspapers—are all ultimately just an elaborate excuse to run away from yourself. To avoid the difficult and troubling questions that being human throws in your way: Am I doing the right thing with my life? Do I believe the things I was taught as a child? What do the words I live by—words like duty, honor, and country—really mean? Am I happy?
What caused me to wonder about the relevance of this to our field is this: learning how to play an instrument is one of the very few human endeavors left which is self-evidently resistant to multitasking. It is virtually impossible to practice and do anything else. It is absolutely impossible to practice productively and do anything else, and most musicians know it and don’t even try. And, of course, practicing is inherently a solitary activity; anyone who has learned to play an instrument to a professional standard has spent thousands upon thousands of hours in self-imposed solitude. Do these two facts have implications for musicians? Do they have implications for the culture of orchestras?
They might. For one thing, orchestras are far more resistant than most other forms of “entertainment” to the idea of letting the audience multitask, certainly more resistant than sports. Given that most people involved in the operation of orchestras are, or once were, serious musicians, there might be a deep bias against letting the audience attend to anything but the performance, combined with a belief, one that sports teams obviously don’t share, that the performance should be enough, all by itself, to keep the audience engaged.
But I wonder if musicians’ deep experience of solitary concentration affects them in other ways as well. Are musicians becoming different from everyone else, since almost no one else ever stops multitasking their way through life, or does much of anything alone? Does that put musicians at an advantage, or a disadvantage, or some of both?