Saturday, July 21, 2012

"Pete Carroll’s Approach to Practice" 7.18.2012

“‘Practice is Everything’ is one of the core tenets of the Win Forever philosophy. We want to create an environment that will permit each of our players to reach his maximum potential, and one of the ways we do that is by practicing with great focus. A player who is fully prepared on the practice field will feel ready to meet whatever comes his way on game day and thus feel more confident and able to minimize distractions of fear or doubt.

As coaches, we want to run a practice regimen that continually covers all the fundamentals of sound football but varies enough to prepare the team for all contingencies and keep the players’ attention. This is not easy to accomplish, but it is a challenge that our staff embraces. We want to develop an environment that fosters learning and develops confidence.

Our goal is to consistently be the most effective football team we can be. When game day comes along, we want to be fully prepared. We don’t want to be worried about anything. We just want to cut loose, let it rip, and be ourselves. Having a routine can be very powerful in this regard. If you compete day in and day out to excel at something in a systematic way, you can’t help but improve. While we are always making small adjustments according to what we need to work on at a particular time, the basic structure and routines of our practice are totally consistent. Beginning with the team meeting and ending with the final play of practice, the details of each day’s work are accounted for down to the minute.

At USC, we began each practice day with a team meeting. There, I always attempted to set the focus and tone for the day, always with enthusiasm. I was simply demonstrating the energy I wished to see from the coaches and players as we approached practice that day. We tried to keep these meetings short so that when the team broke up into position groups, the coaches had time to cover the assignments for the day in more detail. A primary job of the position coaches, however, was to reinforce the level of excitement and enthusiasm. The energy and the spirit of the day were paramount.

During the first few minutes of most team meetings we would talk about daily events going on around the world, both in sports and outside them. We would make general announcements and try to capture everyone’s attention as we began to focus on the day. My goal was to create a close-knit environment, with our coaches and players sharing responsibility for the day’s outcome.

We also showed highlights from the previous day’s practice. The players didn’t know which plays were going to go up on the big screen, but they did know that if they had been dogging it, that play would surely be shown, accompanied by hooting and hollering. The coaches had a blast with it, and there were always funny plays that we showed back and forth in slow motion. But the very serious belief remained: “Practice is Everything.” By beginning each meeting with highlights, we energized the atmosphere, got the juices flowing, and had some fun jump-starting the day.

Our topics for the day might include areas we needed to work on or notable moments from practice the day before. We might call out someone’s birthday or point out a notable academic accomplishment of one of our players. I loved to talk about sporting events and current national and world issues, especially if they could serve as educational moments. Another one of my favorite activities was to acknowledge and introduce notable visitors or former players who were on campus visiting the Trojans.

From a leadership perspective, these meetings were a great opportunity to connect. As a leader, I don’t see any benefit in maintaining a reserve or keeping a distance, the way some other coaches did when I was growing up. I wanted our players to feel my enthusiasm and the coaching staff’s enthusiasm and get geared up for the day. I wanted them to know that we cared and that the task ahead in practice was as much a chance for them to shine as any conference game. We spared no effort to make sure that our guys approached every practice as an opportunity and a challenge. I wanted them to see practice as something to look forward to with excitement and focus. When we did that properly, our practices were as competitive and fiery as a game.

Regardless of whatever we discussed or did in the team meeting, by the time we reached the practice field we were into serious business. Our players needed to channel the energy from the team meeting into an unshakably competitive state of mind so that they could take advantage of the practice opportunities.”

 ___

Saturday, June 23, 2012

"Failure and Rescue" by Atul Gawande, 6.4.2012, New Yorker

The following was delivered as the commencement address at Williams College on Sunday, June 3rd.
   ...

When I was nearing the end of medical school, I decided to go into surgery. I had become enthralled by surgeons, especially by their competence. The source of their success, I believed, was their physical skill—their hand-eye coördination and fine-motor control. But it wasn’t, I learned in residency training. Getting the physical skills is important, and they take some time to practice and master, but they turn out to be no more difficult to learn than those that Mrs. C. mastered as a seamstress. Instead, the critical skills of the best surgeons I saw involved the ability to handle complexity and uncertainty. They had developed judgment, mastery of teamwork, and willingness to accept responsibility for the consequences of their choices. In this respect, I realized, surgery turns out to be no different than a life in teaching, public service, business, or almost anything you may decide to pursue. We all face complexity and uncertainty no matter where our path takes us. That means we all face the risk of failure. So along the way, we all are forced to develop these critical capacities—of judgment, teamwork, and acceptance of responsibility. 

In commencement addresses like this, people admonish us: take risks; be willing to fail. But this has always puzzled me. Do you want a surgeon whose motto is “I like taking risks”? We do in fact want people to take risks, to strive for difficult goals even when the possibility of failure looms. Progress cannot happen otherwise. But how they do it is what seems to matter. The key to reducing death after surgery was the introduction of ways to reduce the risk of things going wrong—through specialization, better planning, and technology...

...

Researchers at the University of Michigan discovered the answer recently, and it has a twist I didn’t expect. I thought that the best places simply did a better job at controlling and minimizing risks—that they did a better job of preventing things from going wrong. But, to my surprise, they didn’t. Their complication rates after surgery were almost the same as others. Instead, what they proved to be really great at was rescuing people when they had a complication, preventing failures from becoming a catastrophe. 

Scientists have given a new name to the deaths that occur in surgery after something goes wrong—whether it is an infection or some bizarre twist of the stomach. They call it a “failure to rescue.” More than anything, this is what distinguished the great from the mediocre. They didn’t fail less. They rescued more.

This may in fact be the real story of human and societal improvement. We talk a lot about “risk management”—a nice hygienic phrase. But in the end, risk is necessary. Things can and will go wrong. Yet some have a better capacity to prepare for the possibility, to limit the damage, and to sometimes even retrieve success from failure. 

When things go wrong, there seem to be three main pitfalls to avoid, three ways to fail to rescue. You could choose a wrong plan, an inadequate plan, or no plan at all. Say you’re cooking and you inadvertently set a grease pan on fire. Throwing gasoline on the fire would be a completely wrong plan. Trying to blow the fire out would be inadequate. And ignoring it—“Fire? What fire?”—would be no plan at all.

...

There was, as I said, every type of error. But the key one was the delay in accepting that something serious was wrong. We see this in national policy, too. All policies court failure—our war in Iraq, for instance, or the effort to stimulate our struggling economy. But when you refuse to even acknowledge that things aren’t going as expected, failure can become a humanitarian disaster. The sooner you’re able to see clearly that your best hopes and intentions have gone awry, the better. You have more room to pivot and adjust. You have more of a chance to rescue.

But recognizing that your expectations are proving wrong—accepting that you need a new plan—is commonly the hardest thing to do. We have this problem called confidence. To take a risk, you must have confidence in yourself. In surgery, you learn early how essential that is. You are imperfect. Your knowledge is never complete. The science is never certain. Your skills are never infallible. Yet you must act. You cannot let yourself become paralyzed by fear.

Yet you cannot blind yourself to failure, either. Indeed, you must prepare for it. For, strangely enough, only then is success possible.

...


Nothing went exactly perfectly. There was still a good deal of fumbling around as they tried to sort out what was really going on and what would need to be done. For a time, they hoped for a small, short procedure, using just a scope and avoiding a big operation. It would have been an inadequate plan—perhaps even the completely wrong one. But they avoided the worst mistake—which was to have no plan at all. They’d acted early enough to buy themselves time for trial and error, to figure out all the steps required to get her through this calamity. They gave her and themselves the chance to rescue success from failure. 

...

As you embark on your path from here, you are going to take chances—on a relationship, a job, a new line of study. You will have great hopes. But things won’t always go right.

When I graduated from college, I went abroad to study philosophy. I hoped to become a philosopher, but I proved to be profoundly mediocre in the field. I tried starting a rock band. You don’t want to know how awful the songs I wrote were. I wrote one song, for example, comparing my love for a girl to the decline of Marxism. After this, I worked in government on health-care legislation that not only went nowhere, it set the prospect of health reform back almost two decades.

But the only failure is the failure to rescue something. I took away ideas and experiences and relationships with people that profoundly changed what I was able to do when I finally found the place that was for me, which was in medicine. 

So you will take risks, and you will have failures. But it’s what happens afterward that is defining. A failure often does not have to be a failure at all. However, you have to be ready for it—will you admit when things go wrong? Will you take steps to set them right?—because the difference between triumph and defeat, you’ll find, isn’t about willingness to take risks. It’s about mastery of rescue.

____
"Failure and Rescue" by Atul Gawande, 6.4.2012, New Yorker

Sunday, October 9, 2011

"Playboy Interview: Steven Jobs" by David Sheff [Excerpt] 2.1.85

Jobs: ...We went off and built the most insanely great computer in the world.

Playboy: Does it take insane people to make insanely great things?

Jobs: Actually, making an insanely great product has a lot to do with the process of making the product, how you learn things and adopt new ideas and throw out old ideas. But, yeah, the people who made Mac are sort of on the edge.

Playboy: What’s the difference between the people who have insanely great ideas and the people who pull off those insanely great ideas?

Jobs: Let me compare it with IBM. How come the Mac group produced Mac and the people at IBM produced the PCjr? We think the Mac will sell zillions, but we didn’t build Mac for anybody else. We built it for ourselves. We were the group of people who were going to judge whether it was great or not. We weren’t going to go out and do market research. We just wanted to build the best thing we could build. When you’re a carpenter making a beautiful chest of drawers, you’re not going to use a piece of plywood on the back, even though it faces the wall and nobody will ever see it. You’ll know it’s there, so you’re going to use a beautiful piece of wood on the back. For you to sleep well at night, the aesthetic, the quality, has to be carried all the way through.

Playboy: Are you saying that the people who made the PC don’t have that kind of pride in the product?

Jobs: If they did, they wouldn’t have turned out the PC. It seems clear to me that they were designing that on the basis of market research for a specific market segment, for a specific demographic type of customer, and they hoped that if they built this, lots of people would buy them and they’d make lots of money. Those are different motivations. The people in the Mac group wanted to build the greatest computer that has ever been seen.


...


Playboy: Why is the computer field dominated by people so young? The average age of Apple employees is 29.

Jobs: It’s often the same with any new, revolutionary thing. People get stuck as they get older. Our minds are sort of electrochemical computers. Your thoughts construct patterns like scaffolding in your mind. You are really etching chemical patterns. In most cases, people get stuck in those patterns, just like grooves in a record, and they never get out of them. It’s a rare person who etches grooves that are other than a specific way of looking at things, a specific way of questioning things. It’s rare that you see an artist in his 30s or 40s able to really contribute something amazing. Of course, there are some people who are innately curious, forever little kids in their awe of life, but they’re rare.


...


Playboy: A lot of guys in their 40s are going to be real pleased with you. Let’s move on to the other thing that people talk about when they mention Apple—the company, not the computer. You feel a similar sense of mission about the way things are run at Apple, don’t you?

Jobs: I do feel there is another way we have an effect on society besides our computers. I think Apple has a chance to be the model of a Fortune 500 company in the late Eighties and early Nineties. Ten to 15 years ago, if you asked people to make a list of the five most exciting companies in America, Polaroid and Xerox would have been on everyone’s list. Where are they now? They would be on no one’s list today. What happened? Companies, as they grow to become multibillion-dollar entities, somehow lose their vision. They insert lots of layers of middle management between the people running the company and the people doing the work. They no longer have an inherent feel or a passion about the products. The creative people, who are the ones who care passionately, have to persuade five layers of management to do what they know is the right thing to do. What happens in most companies is that you can’t keep great people in working environments where individual accomplishment is discouraged rather than encouraged. The great people leave and you end up with mediocrity. I know, because that’s how Apple was built. Apple is an Ellis Island company. Apple is built on refugees from other companies. These are the extremely bright individual contributors who were troublemakers at other companies. You know, Dr. Edwin Land was a troublemaker. He dropped out of Harvard and founded Polaroid. Not only was he one of the great inventors of our time but, more important, he saw the intersection of art and science and business and built an organization to reflect that. Polaroid did that for some years, but eventually Dr. Land, one of those brilliant troublemakers, was asked to leave his own company—which is one of the dumbest things I’ve ever heard of. So Land, at 75, went off to spend the remainder of his life doing pure science, trying to crack the code of color vision. The man is a national treasure.
I don’t understand why people like that can’t be held up as models: This is the most incredible thing to be—not an astronaut, not a football player—but this. Anyway, one of our biggest challenges, and the one I think John Sculley and I should be judged on in five to ten years, is making Apple an incredibly great ten- or 20-billion-dollar company. Will it still have the spirit it does today? We’re charting new territory. There are no models that we can look to for our high growth, for some of the new management concepts we have. So we’re having to find our own way.


...


Playboy: You take great pride in having Apple keep ahead. How do you feel about the older companies that have to play catch-up with the younger companies—or perish?

Jobs: That’s inevitably what happens. That’s why I think death is the most wonderful invention of life. It purges the system of these old models that are obsolete. I think that’s one of Apple’s challenges, really. When two young people walk in with the next thing, are we going to embrace it and say this is fantastic? Are we going to be willing to drop our models, or are we going to explain it away? I think we’ll do better, because we’re completely aware of it and we make it a priority.


...


Playboy: Do you know what you want to do with the rest of this lifetime?

Jobs: There’s an old Hindu saying that comes into my mind occasionally: “For the first 30 years of your life, you make your habits. For the last 30 years of your life, your habits make you.” As I’m going to be 30 in February, the thought has crossed my mind.

Playboy: And?

Jobs: And I’m not sure. I’ll always stay connected with Apple. I hope that throughout my life I’ll sort of have the thread of my life and the thread of Apple weave in and out of each other, like a tapestry. There may be a few years when I’m not there, but I’ll always come back. And that’s what I may try to do. The key thing to remember about me is that I’m still a student. I’m still in boot camp. If anyone is reading any of my thoughts, I’d keep that in mind. Don’t take it all too seriously. If you want to live your life in a creative way, as an artist, you have to not look back too much. You have to be willing to take whatever you’ve done and whoever you were and throw them away. What are we, anyway? Most of what we think we are is just a collection of likes and dislikes, habits, patterns. At the core of what we are is our values, and what decisions and actions we make reflect those values. That is why it’s hard doing interviews and being visible: As you are growing and changing, the more the outside world tries to reinforce an image of you that it thinks you are, the harder it is to continue to be an artist, which is why a lot of times, artists have to go, “Bye. I have to go. I’m going crazy and I’m getting out of here.” And they go and hibernate somewhere. Maybe later they re-emerge a little differently.


...


Playboy: With your wealth and past accomplishments, you have the ability to pursue dreams as few others do. Does that freedom frighten you?

Jobs: The minute you have the means to take responsibility for your own dreams and can be held accountable for whether they come true or not, life is a lot tougher. It’s easy to have wonderful thoughts when the chance to implement them is remote. When you’ve gotten to a place where you at least have a chance of implementing your ideas, there’s a lot more responsibility in that.


____

"Playboy Interview: Steven Jobs" by David Sheff, 2.1.85

Thursday, August 11, 2011

"The Auteur Myth" by Jonah Lehrer, 7.27.11

It’s a provocative analogy, but I think we tend to overemphasize the singular impact of auteurs, at least in the film business. (I’ll refrain from speculating on the internal workings of the uber-secretive Apple.) Consider the career of Alfred Hitchcock. Although the director is often cited as the quintessential auteur – every Hitchcock film overflows with “Hitchcockian” elements – his films were also a testament to his artistic collaborations. This helps explain why Hitchcock flourished under the studio system, as the studios helped make such collaborations possible, signing the talent to long-term contracts. (In the late 1940s, Hitchcock actually experimented with independent cinema, and set up his own production company. He folded the company after his first two films flopped.) At first glance, this seems surprising: Why would a genius like Hitchcock need the constraints of the studio system? Shouldn’t all the other people and the feedback of executives have held him back? Auteurs aren’t supposed to need collaborators.

The reason the studios were so important for Hitchcock is that they allowed him to cultivate the right kind of creative team. While the director relied on many longstanding partners, such as his decade-long relationship with the editor George Tomasini and cinematographer Robert Burks, he also routinely brought in new talent, including John Steinbeck, Raymond Chandler and Salvador Dali. For instance, on North by Northwest, a classic Cary Grant thriller, Hitchcock insisted on working with Ernest Lehman, a screenwriter best known for Sabrina. It was, at first glance, a peculiar choice: Sabrina was a romantic comedy, and Hitchcock had been hired to create a dark suspense movie. But Hitchcock knew what he was doing. In fact, he gave Lehman a tremendous amount of creative freedom. (Hitchcock’s only requirement was that the plot contain three elements: a case of mistaken identity, the United Nations building and a chase scene across the face of Mt. Rushmore.) Although it took Lehman more than a year to write the script, the wait was worth it. “I wanted to write the Hitchcock picture to end all Hitchcock pictures,” Lehman said. And that’s exactly what he did.

Interestingly, the collapse of the studio system in the late 1950s led to a marked decline in Hitchcock’s creative output; the auteur began making mediocre movies. As Thomas Schatz writes in The Genius of the System, an illuminating history of the Golden Age of Hollywood: “Now that Hitchcock could write his own ticket” – he was no longer forced to work within a single studio – “both the quantity and quality of his work fell sharply…The decline of his output suggested that in order to turn out quality pictures with any consistency, even a distinctive stylist and inveterate independent like Hitchcock required a base of filmmaking operations.” By the early 1960s, each Hitchcock movie was an utterly independent venture, so that the director was often the only point of continuity from one film to the next. The end result was a series of financial and critical failures, such as Torn Curtain and Topaz.

I certainly don’t mean to disparage the genius of Hitchcock or Steve Jobs or to defend uninspired data-driven design. But it’s also important to remember that nobody creates Vertigo or the iPad by themselves; even auteurs need the support of a vast system. When you look closely at auteurs, what you often find is that their real genius is for the assembly of creative teams, trusting the right people with the right tasks at the right time. Sure, they make the final decisions, but they are choosing between alternatives created by others. When we frame auteurs as engaging in the opposite of collaboration, when we obsess over Hitchcock’s narrative flair but neglect Lehman’s script, or think about Jobs’ aesthetic but not Ive’s design (or the design of those working for Ive), we are indulging in a romantic vision of creativity that rarely exists. Even geniuses need a little help.

PS. One of my favorite Jobs stories comes from Andy Hertzfeld, a lead engineer on the Apple team that developed the first Macintosh Computer. In his book Revolution in the Valley, Hertzfeld describes Jobs as constantly challenging and inspiring his design team with a series of strange ideas. First, Jobs wanted the Mac to look like a Porsche, to “have a classic look that won’t go out of style.” (Jobs was the proud owner of a Porsche 928.) The following month, after a trip to Macy’s, Jobs insisted that the computer should look like a Cuisinart food-processor – he liked the transparency of the kitchen appliance – and so that became the new template for the Mac. Although these concepts didn’t pan out, Jobs never stopped insisting that “It’s got to be different, different from everything else…” The point, though, is that although Jobs was performing an essential function, he wasn’t inventing the new machine by himself. Rather, he was acting a lot like Hitchcock, telling Lehman that he needed to incorporate a chase scene across the face of Mt. Rushmore.


______

"The Auteur Myth" by Jonah Lehrer, Wired: Weird Science, 7.27.11


"The Auteur vs. the Committee" by Randall Stross, 7.23.11

AT Apple, one is the magic number.

One person is the Decider for final design choices. Not focus groups. Not data crunchers. Not committee consensus-builders. The decisions reflect the sensibility of just one person: Steven P. Jobs, the C.E.O.

By contrast, Google has followed the conventional approach, with lots of people playing a role. That group prefers to rely on experimental data, not designers, to guide its decisions.

The contest is not even close. The company that has a single arbiter of taste has been producing superior products, showing that you don’t need multiple teams and dozens or hundreds or thousands of voices.

Two years ago, the technology blogger John Gruber presented a talk, “The Auteur Theory of Design,” at the Macworld Expo. Mr. Gruber suggested how filmmaking could be a helpful model in guiding creative collaboration in other realms, like software.

The auteur, a film director who both has a distinctive vision for a work and exercises creative control, works with many other creative people. “What the director is doing, nonstop, from the beginning of signing on until the movie is done, is making decisions,” Mr. Gruber said. “And just simply making decisions, one after another, can be a form of art.”

“The quality of any collaborative creative endeavor tends to approach the level of taste of whoever is in charge,” Mr. Gruber pointed out.

Two years after he outlined his theory, it is still a touchstone in design circles for discussing Apple and its rivals.

Garry Tan, designer in residence and a venture partner at Y Combinator, an investor in start-ups, says: “Steve Jobs is not always right—MobileMe would be an example. But we do know that all major design decisions have to pass his muster. That is what an auteur does.”

Mr. Jobs has acquired a reputation as a great designer, Mr. Tan says, not because he personally makes the designs but because “he’s got the eye.” He has also hired classically trained designers like Jonathan Ive. “Design excellence also attracts design talent,” Mr. Tan explains.

Google has what it calls a “creative lab,” a group that had originally worked on advertising to promote its brand. More recently, the lab has been asked to supply a design vision to the engineering and user-experience groups that work on all of Google’s products. Chris L. Wiggins, the lab’s creative director, whose own background is in advertising, describes design as a collaborative process among groups “with really fruitful back-and-forth.”

“There’s only one Steve Jobs, and he’s a genius,” says Mr. Wiggins. “But it’s important to distinguish that we’re discussing the design of Web applications, not hardware or desktop software. And for that we take a different approach to design than Apple,” he says. Google, he says, utilizes the Web to pull feedback from users and make constant improvements.

Mr. Wiggins’s argument that Apple’s apples should not be compared to Google’s oranges does not explain, however, why Apple’s smartphone software gets much higher marks than Google’s.

GOOGLE’S ability to attract and retain design talent has not been helped by the departure of designers who felt their expertise was not fully appreciated. “Google is an engineering company, and as a researcher or designer, it’s very difficult to have your voice heard at a strategic level,” writes Paul Adams on his blog, “Think Outside In.” Mr. Adams was a senior user-experience researcher at Google until last year; he is now at Facebook.

Douglas Bowman is another example. He was hired as Google’s first visual designer in 2006, when the company was already seven years old. “Seven years is a long time to run a company without a classically trained designer,” he wrote in his blog Stopdesign in 2009. He complained that there was no one at or near the helm of Google who “thoroughly understands the principles and elements of design.” “I had a recent debate over whether a border should be 3, 4 or 5 pixels wide,” Mr. Bowman wrote, adding, “I can’t operate in an environment like that.” His post was titled, “Goodbye, Google.”

Mr. Bowman’s departure spurred other designers with experience at either Google or Apple to comment on differences between the two companies. Mr. Gruber, at his Daring Fireball blog, concisely summarized one account under the headline “Apple Is a Design Company With Engineers; Google Is an Engineering Company With Designers.”

In May, Google, ever the engineering company, showed an unwillingness to notice design expertise when it tried to recruit Pablo Villalba Villar, the chief executive of Teambox, an online project management company. Mr. Villalba later wrote that he had no intention of leaving Teambox and cooperated to experience Google’s hiring process for himself. He tried to call attention to his main expertise in user interaction and product design. But he said that what the recruiter wanted to know was his mastery of 14 programming languages.

Mr. Villalba was dismayed that Google did not appear to have changed since Mr. Bowman left. “Design can’t be done by committee,” he said.

Recently, as Larry Page, the company co-founder, began his tenure as C.E.O., Google rolled out Google+ and a new look for the Google home page, Gmail and its calendar. More redesigns have been promised. But they will be produced, as before, within a very crowded and noisy editing booth. Google does not have a true auteur who unilaterally decides on the final cut.

_________________

"The Auteur vs. the Committee" by Randall Stross, New York Times 7.23.11


Wednesday, June 15, 2011

"David Foster Wallace: In His Own Words" Commencement Speech to Kenyon College, 2005

the most obvious, important realities are often the ones that are hardest to see and talk about. Stated as an English sentence, of course, this is just a banal platitude, but the fact is that in the day to day trenches of adult existence, banal platitudes can have a life or death importance, or so I wish to suggest to you on this dry and lovely morning.

...

blind certainty, a close-mindedness that amounts to an imprisonment so total that the prisoner doesn't even know he's locked up.

...

The point here is that I think this is one part of what teaching me how to think is really supposed to mean. To be just a little less arrogant. To have just a little critical awareness about myself and my certainties. Because a huge percentage of the stuff that I tend to be automatically certain of is, it turns out, totally wrong and deluded.

...

Here is just one example of the total wrongness of something I tend to be automatically sure of: everything in my own immediate experience supports my deep belief that I am the absolute centre of the universe; the realest, most vivid and important person in existence. We rarely think about this sort of natural, basic self-centredness because it's so socially repulsive. But it's pretty much the same for all of us. It is our default setting, hard-wired into our boards at birth. Think about it: there is no experience you have had that you are not the absolute centre of. The world as you experience it is there in front of YOU or behind YOU, to the left or right of YOU, on YOUR TV or YOUR monitor. And so on. Other people's thoughts and feelings have to be communicated to you somehow, but your own are so immediate, urgent, real.

...

It's a matter of my choosing to do the work of somehow altering or getting free of my natural, hard-wired default setting which is to be deeply and literally self-centered and to see and interpret everything through this lens of self.

...

Probably the most dangerous thing about an academic education--at least in my own case--is that it enables my tendency to over-intellectualise stuff, to get lost in abstract argument inside my head, instead of simply paying attention to what is going on right in front of me, paying attention to what is going on inside me.

...

learning how to think really means learning how to exercise some control over how and what you think. It means being conscious and aware enough to choose what you pay attention to and to choose how you construct meaning from experience. Because if you cannot exercise this kind of choice in adult life, you will be totally hosed. Think of the old cliché about "the mind being an excellent servant but a terrible master".

...

The point is that petty, frustrating crap like this is exactly where the work of choosing is gonna come in. Because the traffic jams and crowded aisles and long checkout lines give me time to think, and if I don't make a conscious decision about how to think and what to pay attention to, I'm gonna be pissed and miserable every time I have to shop. Because my natural default setting is the certainty that situations like this are really all about me. About MY hungriness and MY fatigue and MY desire to just get home, and it's going to seem for all the world like everybody else is just in my way. And who are all these people in my way? And look at how repulsive most of them are, and how stupid and cow-like and dead-eyed and nonhuman they seem in the checkout line, or at how annoying and rude it is that people are talking loudly on cell phones in the middle of the line. And look at how deeply and personally unfair this is.

...

Again, please don't think that I'm giving you moral advice, or that I'm saying you are supposed to think this way, or that anyone expects you to just automatically do it. Because it's hard. It takes will and effort, and if you are like me, some days you won't be able to do it, or you just flat out won't want to.

...

The only thing that's capital-T True is that you get to decide how you're gonna try to see it.
This, I submit, is the freedom of a real education, of learning how to be well-adjusted. You get to consciously decide what has meaning and what doesn't. You get to decide what to worship.

...

There is no such thing as not worshipping. Everybody worships. The only choice we get is what to worship.

...

there are all different kinds of freedom, and the kind that is most precious you will not hear much talked about in the great outside world of wanting and achieving.... The really important kind of freedom involves attention and awareness and discipline, and being able truly to care about other people and to sacrifice for them over and over in myriad petty, unsexy ways every day. That is real freedom. That is being educated, and understanding how to think. The alternative is unconsciousness, the default setting, the rat race, the constant gnawing sense of having had, and lost, some infinite thing.

...

It is unimaginably hard to do this, to stay conscious and alive in the adult world day in and day out.
_____
David Foster Wallace: In His Own Words, Commencement Speech to Kenyon College, 2005


Thursday, June 9, 2011

"Extreme Heat Slowing You Down? It’s All in Your Head" Erik Malinowski, WIRED 5.20.2011

It’s one of the basic principles of human physiology that when extreme heat sends your core temperature off the charts, your body slows down. Maybe it’s regular ol’ fatigue, or it could be the lactic acid building up in your muscle tissue, but at least it’s something tangible.

Or is it?

A joint study conducted by researchers at two British universities has opened up the possibility that when coaches yell at athletes to just “play through the pain,” it’s far more achievable than we thought.

The study, recently published in the European Journal of Applied Physiology, involved seven male cyclists engaging in various 30-minute stationary trials. Subjects were allowed to ride their own bikes, thanks to the use of a KingCycle ergometer, but what the study ultimately hinged on was that the temperature of the environment was displayed for the cyclists.

The control trial was conducted in a room kept at 71.2 degrees Fahrenheit. A second “hot” trial was held in a room at 88.5 degrees. The final one was a “deception” trial, in which the temperature was displayed as 78.8 degrees but was actually 88.8 degrees, the hottest of the three. The trials were administered in a randomized order, and all seven subjects performed all three. (Rectal thermometers used to measure each cyclist’s core body temperature were also displayed as being slightly lower than they actually were.)

What researchers found was that while cyclists performed better in the control trial (10.33 miles) than the hot trial (9.87), they actually traveled a greater distance on average in the deception trial (10.4) than the other two. And the mean power output — the wattage pumped up by all that exertion and cycling — was actually higher in the deception trial (184.4 watts) as opposed to the hot trial (168.1). There was no discernible output difference between control and deception, even though one was conducted in a setting 17 degrees hotter than the other.
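To put the reported averages in perspective, here is a minimal sketch (my own arithmetic, not from the article) of the relative gains the deception trial showed over the honest hot trial. The figures are the article's reported means; the control trial's wattage was not given in the excerpt, so it is left out.

```python
# Reported mean results from the three 30-minute trials in the study.
trials = {
    "control":   {"miles": 10.33},                   # 71.2 °F, actual and displayed
    "hot":       {"miles": 9.87,  "watts": 168.1},   # 88.5 °F, actual and displayed
    "deception": {"miles": 10.4,  "watts": 184.4},   # 88.8 °F actual, shown as 78.8 °F
}

def pct_gain(new, old):
    """Percent improvement of `new` over `old`."""
    return 100.0 * (new - old) / old

miles_gain = pct_gain(trials["deception"]["miles"], trials["hot"]["miles"])
watts_gain = pct_gain(trials["deception"]["watts"], trials["hot"]["watts"])
print(f"deception vs. hot: +{miles_gain:.1f}% distance, +{watts_gain:.1f}% power")
# → deception vs. hot: +5.4% distance, +9.7% power
```

So simply believing the room was ten degrees cooler was worth roughly a five percent gain in distance and nearly ten percent in power output over the same half hour.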

It’ll be fascinating to see where the research goes from here, but it’s noteworthy that exercise output, even in difficult conditions, can be so directly impacted solely by visual cues like the readout on a temperature gauge. The results could have a significant effect on any sport or activity that combines intense physical exertion with high temperatures. (Aside from competitive cycling, soccer and marathon-running are the two most obvious candidates.)

And with all sorts of training gadgets that can help you keep track of heat both internal and external, it sure would be great if you could hack one to knock the temperature down a few clicks.

_____

"Extreme Heat Slowing You Down? It’s All in Your Head" Erik Malinowski, WIRED 5.20.2011