Saturday, December 08, 2012

The Science of Storytelling: Why Telling a Story is the Most Powerful Way to Activate Our Brains

 

A good story can make or break a presentation, article, or conversation. But why is that? When Buffer co-founder Leo Widrich started to market his product through stories instead of benefits and bullet points, sign-ups went through the roof. Here he shares the science of why storytelling is so uniquely powerful.

In 1748, the British politician and aristocrat John Montagu, the 4th Earl of Sandwich, spent a lot of his free time playing cards. He greatly enjoyed eating a snack while still keeping one hand free for the cards. So he came up with the idea of eating beef between slices of toast, which would allow him to finally eat and play cards at the same time. His newly invented "sandwich," the name given to two slices of bread with meat in between, became one of the most popular meal inventions in the Western world.

What's interesting about this is that you are now unlikely ever to forget the story of who invented the sandwich. Or at least, you are far less likely to forget it than if it had been presented to you as bullet points or in some other purely information-based form.

For over 27,000 years, since the first cave paintings were created, telling stories has been one of our most fundamental communication methods. Recently a good friend of mine gave me an introduction to the power of storytelling, and I wanted to learn more.

Here is the science around storytelling and how we can use it to make better decisions every day:
Our brain on stories: How our brains become more active when we tell stories.
 
We all enjoy a good story, whether it's a novel, a movie, or simply something one of our friends is explaining to us. But why do we feel so much more engaged when we hear a narrative about events?
 
It's in fact quite simple. If we listen to a PowerPoint presentation with boring bullet points, only certain parts of the brain get activated: scientists call these Broca's area and Wernicke's area. In other words, such a presentation reaches only the language-processing parts of the brain, where we decode words into meaning. And that's it; nothing else happens.

When we are being told a story, things change dramatically. Not only are the language-processing parts of our brain activated, but so is every other area we would use if we were actually experiencing the events of the story.

If someone tells us about how delicious certain foods were, our sensory cortex lights up. If it's about motion, our motor cortex gets active:
"Metaphors like "The singer had a velvet voice" and "He had leathery hands" roused the sensory cortex. […] Then, the brains of participants were scanned as they read sentences like "John grasped the object" and "Pablo kicked the ball." The scans revealed activity in the motor cortex, which coordinates the body's movements."
A story can put your whole brain to work. And it gets better:
When we tell others stories that have really helped shape our thinking and way of life, we can have the same effect on them. The brains of the person telling a story and the person listening to it can synchronize, says Uri Hasson of Princeton:
"When the woman spoke English, the volunteers understood her story, and their brains synchronized. When she had activity in her insula, an emotional brain region, the listeners did too. When her frontal cortex lit up, so did theirs. By simply telling a story, the woman could plant ideas, thoughts and emotions into the listeners' brains."
Anything you've experienced, you can get others to experience as well; or at least, you can activate in their brains the same areas that the experience activated in yours.

Evolution has wired our brains for storytelling—how to make use of it


Now all this is interesting. We know that stories engage our brains far more fully than plain information does. The still unanswered question is: Why? Why does the format of a story, where events unfold one after the other, have such a profound impact on our learning?

The simple answer is this: We are wired that way. A story, broken down into its simplest form, is a connection of cause and effect. And that is exactly how we think. We think in narratives all day long, whether we are thinking about buying groceries, about work, or about our spouse at home. We make up (short) stories in our heads for every action and conversation. In fact, Jeremy Hsu found that "personal stories and gossip make up 65% of our conversations."

Now, whenever we hear a story, we want to relate it to one of our existing experiences. That's why metaphors work so well on us. While we are busy searching for a similar experience in our brains, we activate a part called the insula, which helps us relate to that same experience of pain, joy, or disgust.


In a great experiment, John Bargh at Yale found the following:
"Volunteers would meet one of the experimenters, believing that they would be starting the experiment shortly. In reality, the experiment began when the experimenter, seemingly struggling with an armful of folders, asks the volunteer to briefly hold their coffee. As the key experimental manipulation, the coffee was either hot or iced. Subjects then read a description of some individual, and those who had held the warmer cup tended to rate the individual as having a warmer personality, with no change in ratings of other attributes."
We link up metaphors and literal happenings automatically. Our brains are constantly looking for a cause-and-effect relationship that matches something we've previously experienced.

Let's dig into some hands-on tips for making use of this:

 

Exchange giving suggestions for telling stories

Do you know the feeling when a good friend tells you a story and then, two weeks later, you mention the same story back to him as if it were your own idea? This is totally normal, and at the same time it is one of the most powerful ways to get people on board with your ideas and thoughts. According to Uri Hasson of Princeton, a story is the only way to activate parts of the brain so that a listener turns the story into their own idea and experience.

The next time you struggle to get people on board with your projects and ideas, simply tell them a story in which the outcome is that doing what you had in mind is the best thing to do.

Write more persuasively—bring in stories from yourself or an expert

This is something that took me a long time to understand. When you start out writing, it's only natural to think, "I don't have a lot of experience with this; how can I make my post believable if I use personal stories?" The best way around this is simply to exchange your own stories for those of experts. When this blog used to be a social media blog, I would ask for quotes from the top folks in the industry or simply find great passages they had written online. It's a great way to add credibility and tell a story at the same time.

The simple story is more successful than the complicated one

When we think of stories, it is often easy to convince ourselves that they have to be complex and detailed to be interesting. The truth, however, is that the simpler a story, the more likely it is to stick. Using simple language and keeping complexity low is the best way to activate the brain regions that let us truly relate to the events of a story. (It is for a similar reason that multitasking is so hard for us.) Try, for example, reducing the number of adjectives or complicated nouns in a presentation or article and exchanging them for simpler yet more heartfelt language.

One quick last fact: our brains learn to ignore certain overused words and phrases that once made stories powerful. Researchers studying storytelling have found that such figures of speech have lost all storytelling power:
"Some scientists have contended that figures of speech like "a rough day" are so familiar that they are treated simply as words and no more."
This means that the frontal cortex, the area of the brain responsible for experiencing emotions, can't be activated by these phrases. It's something worth remembering when crafting your next story.

Monday, December 03, 2012

Is Zero Even A Number?

By Laura Gray
BBC News

Superstorm Sandy had many consequences, some easier to foresee than others. Millions experienced floods and power cuts, the New York marathon was cancelled, and pictures of sharks in the city appeared on the internet. Another outcome was to draw attention to the unique position of the number zero.

To deal with fuel shortages after the storm, New York Mayor Michael Bloomberg introduced rationing on 8 November.

"Drivers in New York City who have licence plates that end in an odd number or end in a letter or other character will be able to gas or diesel only on odd-numbered days such as tomorrow which happens to be the 9th," he said.

"Those with licence plates ending in an even number, or the number zero, will be able to buy gas or diesel only on even number days such as Saturday November 10th."

The use of the phrase "even number, or the number zero" implies that zero is not even.

On the other hand, the mayor is lumping zero together with the even numbers, so he certainly doesn't think it's odd.

So what is it - odd, even or neither?

For mathematicians the answer is easy: zero is an even number. The rest of us may not feel completely sure. According to Dr James Grime of the Millennium Maths Project at Cambridge University, reaction-time experiments in the 1990s revealed that people are 10% slower at deciding whether zero is odd or even than they are for other numbers.

Children find it particularly difficult to recognise if zero is odd or even. "A survey of primary school children in the 1990s showed that about 50% thought zero is even, about 20% thought it was odd and the remaining 30% thought it was neither, both, or that they didn't know," explains Dr Grime. "It appears that we may file numbers mentally into lists, such as the even numbers two, four, six, eight, or the powers of two, which would include two, four, eight, 16. Zero is not on these lists, so it takes us longer to work out."

So why, mathematically, is zero an even number?

Because any number that can be divided by two to create another whole number is even. Zero passes this test because if you halve zero you get zero.

Zero also has odd numbers on either side of it - minus one and one - which is another test it passes to be classified as an even number.

In fact, there is an argument that zero is the most even number of all.

A number which is "doubly even" can be divided by two and then divided by two again. Zero can be divided by two forever and the result will always be a whole number - zero.
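For readers who want to check the arithmetic, here is a minimal Python sketch of both tests (our illustration; the function names are invented, not from the article):

```python
def is_even(n: int) -> bool:
    """Even means: halving the number gives a whole number."""
    return n % 2 == 0

def halvings(n: int, cap: int = 10) -> int:
    """Count how many times n can be halved and still give a whole number.
    Zero never stops halving (0 / 2 = 0), so we cap the count."""
    count = 0
    while n % 2 == 0 and count < cap:
        n //= 2
        count += 1
    return count

print(is_even(0))    # True: half of zero is zero, a whole number
print(is_even(7))    # False: 7 / 2 is not a whole number
print(halvings(12))  # 2: 12 -> 6 -> 3, so 12 is "doubly even"
print(halvings(0))   # 10: hits the cap; zero can be halved forever
```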

It's not just the public who have struggled to recognise zero as an even number. During the Paris smog of 1977, car use was restricted so that people with licence plates ending in odd or even numbers drove on alternate days.

"The police did not know whether to stop the zero-numbered licence plates and so they just let them pass because they didn't know whether it was odd or even," says Dr Grime.

It even took mathematicians some time to agree on the question.

To begin with, zero was not recognised as a number at all. The Babylonians and the Ancient Greeks used it only as a placeholder, to differentiate between numbers such as 26 and 206. Before this, people could tell that one number was bigger than another only from the context in which it was used.

In the 13th Century, the Italian mathematician Fibonacci was the first to popularise Arabic numerals, the numbers that we use today, in Europe. He classified one to nine as numbers but zero as a "sign".

The struggle they had was this: if zero is nothing, is it even a number at all, and can it have the properties of a number, such as oddness or evenness?

"It wasn't until the 1600s that zero was truly accepted as an even number - after resistance and debate," says Grime. For more than 1,000 years mathematicians had difficulties with the number zero and non-mathematicians are still often uncertain about how to classify it.

So Bloomberg had every reason to spell out to New Yorkers in black and white that he was lumping zero in with the (other) even numbers.

Tuesday, July 27, 2010

How facts backfire

Researchers discover a surprising threat to democracy: our brains
By Joe Keohane

It’s one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. “Whenever the people are well-informed, they can be trusted with their own government,” Thomas Jefferson wrote in 1789. This notion, carried down through the years, underlies everything from humble political pamphlets to presidential debates to the very notion of a free press. Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it’s an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.

In the end, truth will out. Won’t it?

Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

This bodes ill for a democracy, because most voters — the people making decisions about how the country runs — aren’t blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

These findings open a long-running argument about the political ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our democratic ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we're right, and even less likely to listen to any new information. And then we vote.

This effect is only heightened by the information glut, which offers — alongside an unprecedented amount of good information — endless rumors, misinformation, and questionable variations on the truth. In other words, it’s never been easier for people to be wrong, and at the same time feel more certain that they’re right.

“Area Man Passionate Defender Of What He Imagines Constitution To Be,” read a recent Onion headline. Like the best satire, this nasty little gem elicits a laugh, which is then promptly muffled by the queasy feeling of recognition. The last five decades of political science have definitively established that most modern-day Americans lack even a basic understanding of how their country works. In 1996, Princeton University’s Larry M. Bartels argued, “the political ignorance of the American voter is one of the best documented data in political science.”

On its own, this might not be a problem: People ignorant of the facts could simply choose not to vote. But instead, it appears that misinformed people often have some of the strongest political opinions. A striking example is an influential study done in 2000 by James Kuklinski of the University of Illinois at Urbana-Champaign, in which more than 1,000 Illinois residents were asked questions about welfare — the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident their answers were correct — but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)

Studies by other researchers have observed similar phenomena when addressing education, health care reform, immigration, affirmative action, gun control, and other issues that tend to attract strong partisan opinion. Kuklinski calls this sort of response the “I know I’m right” syndrome, and considers it a “potentially formidable problem” in a democratic system. “It implies not only that most people will resist correcting their factual beliefs,” he wrote, “but also that the very people who most need to correct them will be least likely to do so.”

What’s going on? How can we have things so wrong, and be so sure that we’re right? Part of the answer lies in the way our brains are wired. Generally, people tend to seek consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn’t. This is known as “motivated reasoning.” Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them.

New research, published in the journal Political Behavior last month, suggests that once those facts — or “facts” — are internalized, they are very difficult to budge. In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Michigan’s Nyhan and a colleague devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren’t), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted). Nyhan inserted a clear, direct correction after each piece of misinformation, and then measured the study participants to see if the correction took.

For the most part, it didn’t. The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic — a factor known as salience — the stronger the backfire. The effect was slightly different on self-identified liberals: When they read corrected stories about stem cells, the corrections didn’t backfire, but the readers did still ignore the inconvenient fact that the Bush administration’s restrictions weren’t total.

It’s unclear what is driving the behavior — it could range from simple defensiveness, to people working harder to defend their initial beliefs — but as Nyhan dryly put it, “It’s hard to be optimistic about the effectiveness of fact-checking.”

It would be reassuring to think that political scientists and psychologists have come up with a way to counter this problem, but that would be getting ahead of ourselves. The persistence of political misperceptions remains a young field of inquiry. “It’s very much up in the air,” says Nyhan.

But researchers are working on it. One avenue may involve self-esteem. Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you’ll listen — and if you feel insecure or threatened, you won’t. This would also explain why demagogues benefit from keeping people agitated. The more threatened people feel, the less likely they are to listen to dissenting opinions, and the more easily controlled they are.

There are also some cases where directness works. Kuklinski’s welfare study suggested that people will actually update their beliefs if you hit them “between the eyes” with bluntly presented, objective facts that contradict their preconceived ideas. He asked one group of participants what percentage of its budget they believed the federal government spent on welfare, and what percentage they believed the government should spend. Another group was given the same questions, but the second group was immediately told the correct percentage the government spends on welfare (1 percent). They were then asked, with that in mind, what the government should spend. Regardless of how wrong they had been before receiving the information, the second group indeed adjusted their answer to reflect the correct fact.

Kuklinski’s study, however, involved people getting information directly from researchers in a highly interactive way. When Nyhan attempted to deliver the correction in a more real-world fashion, via a news article, it backfired. Even if people do accept the new information, it might not stick over the long term, or it may just have no effect on their opinions. In 2007 John Sides of George Washington University and Jack Citrin of the University of California at Berkeley studied whether providing misled people with correct information about the proportion of immigrants in the US population would affect their views on immigration. It did not.

And if you harbor the notion — popular on both sides of the aisle — that the solution is more education and a higher level of political sophistication in voters overall, well, that’s a start, but not the solution. A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong. Taber and Lodge found this alarming, because engaged, sophisticated thinkers are “the very folks on whom democratic theory relies most heavily.”

In an ideal world, citizens would be able to maintain constant vigilance, monitoring both the information they receive and the way their brains are processing it. But keeping atop the news takes time and effort. And relentless self-questioning, as centuries of philosophers have shown, can be exhausting. Our brains are designed to create cognitive shortcuts — inference, intuition, and so forth — to avoid precisely that sort of discomfort while coping with the rush of information we receive on a daily basis. Without those shortcuts, few things would ever get done. Unfortunately, with them, we’re easily suckered by political falsehoods.

Nyhan ultimately recommends a supply-side approach. Instead of focusing on citizens and consumers of misinformation, he suggests looking at the sources. If you increase the “reputational costs” of peddling bad info, he suggests, you might discourage people from doing it so often. “So if you go on ‘Meet the Press’ and you get hammered for saying something misleading,” he says, “you’d think twice before you go and do it again.”

Unfortunately, this shame-based solution may be as implausible as it is sensible. Fast-talking political pundits have ascended to the realm of highly lucrative popular entertainment, while professional fact-checking operations languish in the dungeons of wonkery. Getting a politician or pundit to argue straight-faced that George W. Bush ordered 9/11, or that Barack Obama is the culmination of a five-decade plot by the government of Kenya to destroy the United States — that’s easy. Getting him to register shame? That isn’t.

© Copyright 2010 The New York Times Company

Wednesday, June 02, 2010

China breaks into Supercomputing league

BEIJING — A Chinese supercomputer has been ranked the world's second-fastest machine in a list issued by U.S. and European researchers, highlighting China's ambitions to become a global technology center.

The Nebulae system at the National Supercomputing Centre in Shenzhen in southern China came in behind the U.S. Department of Energy's Jaguar in Oak Ridge, Tennessee, according to the list released Monday.

Supercomputers are used for complex work such as modeling weather systems, simulating nuclear explosions and designing jetliners.

The semiannual TOP500 list highlighted Beijing's efforts to join the United States, Europe and Japan in the global technology elite and its sharp increases in research spending, driven by booming economic growth.

It also reflected China's continued reliance on Western know-how: Nebulae was built by China's Dawning Information Industry Ltd. but uses processors from Intel Corp. and Nvidia Corp., both American companies.

The Nebulae is capable of sustained computing of 1.271 petaflops - or 1,271 trillion calculations - per second, according to TOP500. It said the Jaguar was capable of sustained computing of 1.75 petaflops.
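To see where the "1,271 trillion" figure comes from, here is a quick unit-conversion sketch in Python (our illustration: peta- means 10^15, and a trillion is 10^12):

```python
PETA = 1e15      # one petaflop/s = 10^15 calculations per second
TRILLION = 1e12  # one trillion  = 10^12

def petaflops_as_trillions(pf: float) -> float:
    """Express a petaflop/s rating as trillions of calculations per second."""
    return pf * PETA / TRILLION

print(petaflops_as_trillions(1.271))  # 1271.0 -> "1,271 trillion" for Nebulae
print(petaflops_as_trillions(1.75))   # 1750.0 -> Jaguar's sustained rate
```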

The Chinese computer ranked first in theoretical computing speed at 2.98 petaflops, the group said. The list was compiled by Hans Meuer of the University of Mannheim, Germany; Erich Strohmaier and Horst Simon of NERSC/Lawrence Berkeley National Laboratory and Jack Dongarra of the University of Tennessee, Knoxville.

The communist Beijing government wants China to evolve from a low-cost factory into a prosperous "innovation society." A 15-year government plan issued in 2006 promises support for areas ranging from computers to lasers to genetics.

Boosted by Nebulae's performance, China rose to No. 2 overall on the TOP500 list with 24 of the 500 systems on the list and 9.2 percent of global supercomputing capacity, up from 21 systems six months ago.

The United States held onto its overall lead with 282 of the 500 systems and 55.4 percent of installed performance.

Europe had 144 systems on the list, including 38 in Britain, 29 in France and 24 in Germany.

Elsewhere in Asia, Japan had 18 supercomputers on the list, up from 16 six months ago, and India had five.

A second Chinese computer also made the Top 10. The Tianhe-1 at the National Super Computer Center in the eastern city of Tianjin, at No. 7, uses processors made by Intel and Advanced Micro Devices Inc., another American company.

The list also reflected breakneck advances in supercomputing speeds.

No. 1 on the June 2008 list was the Roadrunner system at the Los Alamos National Laboratory in New Mexico, capable of 1.04 petaflops, or about two-thirds of Jaguar's level. In the latest list, Roadrunner dropped to No. 3.

Friday, January 15, 2010

Which Is Worse for Your Brain: Texting or Pot? (Hint: Pass the Pipe)

It’s a challenge of modern life: email, Twitter feeds, instant messaging, text messages, and other snippets of information are coming at us so fast that it’s hard not to feel under digital attack. Sure, some of it’s important — and that’s precisely the problem. Turn it all off and you might as well quit the workforce. But read it all and your mind becomes so drained that it’s a challenge to get anything else done.

In some ways, technology has evolved in a way that puts mere humans in a bind. Consider the email conundrum. From the moment you wake up, it seems the inbox is calling your name. And if you’re like most of us, you answer its call pretty quickly.

“The brain hates uncertainty,” says David Rock, the CEO of Results Coaching Systems and author of “Your Brain at Work.” “It’s literally painful to not download your email the moment you arrive at your desk in the morning. But once you’ve processed 30 or 40 emails, you’ve ruined your brain chemistry for higher level tasks that are going to create value.”

In fact, a University of London study done for Hewlett-Packard found that “infomania” — a term connected with addiction to email and texting — can lower your IQ by twice as much as smoking marijuana. Moreover, email can raise the levels of noradrenaline and dopamine in your brain by constantly introducing new stimuli into your day. When those levels get too high, complex thinking becomes more difficult, making it harder to make decisions and solve problems — key roles for all managers.

In short, the brain’s capacity for decision-making evolved at a time when people had less to think about. Great, so now you have an excuse for not keeping up. But you still need a game plan.

1. Take control of email.
Don’t start your day with email. Set your email so it doesn’t download new mail automatically or, at the very least, turn off any alert system. Instead, set a time to check for messages manually — preferably later in the day, after you’ve used your brainpower for more important things.

Equally important is that others at your business know how you want email used. “Emails should be short, concise, and used only when a conversation is not an option,” says Adrian Moorhouse, managing director of executive coaching firm Lane4. “The easier communication is to digest, the more likely it is that the messages will be delivered effectively.”

Some colleagues seem unable to help themselves. We all know the type. They send too many emails; they gossip or forward jokes. Get them to divert their personal chatter online by allowing them to use social media at work (even if it’s just at set times of the day). Or talk to the worst offenders one-on-one. Peter Taylor, the director of the project management office for Siemens and author of “The Lazy Project Manager,” says when he’s cc’d on emails, he tells the senders to cut it out. “If people had to produce single sheets of paper and hand them out every time they wanted to communicate, they’d be a lot more conscientious. I educate everyone who I communicate with and as a result, the emails I do receive are pertinent to me. I restructure those emails, copy them into ongoing documents, and keep my inbox very small.”

If you’re reaching a breaking point, do the email equivalent of filing for bankruptcy. Simply wipe your inbox to start afresh. It seems drastic, but it can work. Send a message to all contacts letting them know what you’re planning, select all emails, and delete or archive them. If you’re planning a new regime of folders, rules, filters, and information-sharing disciplines, starting from scratch isn’t so crazy.

2. Prioritize your prioritizing.
To help you prioritize, start by setting clear goals. We all tend to do this subconsciously, according to Lane4's Moorhouse, but writing them down helps you actually achieve them. Here, too, time of day really matters. "Prioritizing is one of the brain's most energy-hungry processes," writes Rock in his book. That means it's best done when your mind is fresh and well rested. Allocate time to order your thoughts — dashing off a to-do list of tasks that are "front of mind" is easy, but it won't break the back of the work you need to cover.

Try organizing your thinking visually. One great way is with Mind Maps, diagrams of ideas linked together in a tree structure that lets you visualise all of them in context with each other. That way you won't forget any of your ideas when you have to decide which ones are the most important.
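To make the "tree" idea concrete, here is a hypothetical Python sketch of a mind map as a nested structure; the topics are invented for illustration, not taken from any particular mind-mapping tool:

```python
# A mind map is a tree: one central idea whose branches are sub-ideas.
mind_map = {
    "Q3 product launch": {               # the central idea
        "Marketing": {"Email campaign": {}, "Press kit": {}},
        "Product": {"Bug triage": {}, "Final QA pass": {}},
        "Hiring": {"Support engineer": {}},
    }
}

def show(node: dict, depth: int = 0) -> None:
    """Print each idea indented under its parent, keeping its context visible."""
    for idea, branches in node.items():
        print("  " * depth + "- " + idea)
        show(branches, depth + 1)

show(mind_map)
```

Seeing every branch laid out under its parent is what keeps any single idea from being forgotten when you rank them.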

3. Blindside the data (approach it from an unexpected direction).
Break down complex information into sub-groups. Once you’ve determined a goal, you can “chunk” your work into groups to achieve it. You can also do this with your to-do lists.

According to an experiment at Wilfrid Laurier University ("It's About Time: Optimistic Predictions in Work and Love," European Review of Social Psychology), people are generally very bad at estimating when they'll finish their own work, but good at guessing for others. So gauge your timing by using someone else's experience. You'll be less stressed if you're realistic about your workload.

4. Do less.
To do less, you should delegate more. Too many managers can't resist the temptation to get personally involved in everything that's happening. But effective delegation means limiting the amount of information you have to process, as well as empowering those around you. Then ask for regular briefings.

5. Unplug.
Many managers feel they can’t shut off the fire hydrant of information. But they can take a break from it. “It’s tempting to think that more information makes for better decisions,” says Penny de Valk, CEO of the UK-based Institute of Leadership and Management. “But in most cases, it just erodes your focus. You need time to synthesize information and generate real intelligence.”

That takes discipline, of course, but it's useful to stop thinking when you are stuck on a project so your brain can recover. "You do need to switch off and rebalance your brain chemistry if you're going to come up with new ideas," says Rock. Stefan Sagmeister of the New York-based design firm Sagmeister believes so strongly in the power of time off that he closes up shop for 12 months every seven years to pursue "little experiments" he doesn't have time for in his daily life.