
Information Overload

How Digital Devices Deprive Brain of Needed Downtime

In Greek mythology, Sisyphus, an evil king, was condemned in Hades to forever roll a big rock to the top of a mountain, only to watch it roll back down again each time. A similar version of Hell is endured every day by people managed by micromanagers and control freaks.
 

News Books Recommended Links Obsession with Computers and Internet Drinking from a firehose Mental overload Email Overload
IT Staff Health Issues Health insurance Sleep Deprivation Obsessive-compulsive disorder Slackerism Coping with the toxic stress in IT environment Learned helplessness
Burnout Toxic managers Micromanagers and Control Freaks Stoicism Humor Random Findings Etc

There are several possible causes of information overload:

Prolonged exposure to stress (working for a corporate psychopath, for example) increases the level of overload and adds to the stress you experience.

Prolonged exposure to information overload produces the so-called information fatigue syndrome. Symptoms include paralysis of analytical capacity, increased anxiety, greater self-doubt, and a tendency to blame others. Long exposure produces symptoms similar to post-traumatic stress disorder and in milder form is intrinsically connected with demoralization and burnout. Here the most helpful page is probably the Softpanorama Humor Archive, a unique collection of open source related humor. Humor is one of the most effective methods of fighting stress and overload: it helps a person remain positive in difficult situations better than most drugs.

When people are faced with more information than they can process, they become unable to make decisions or take action. There are two important aspects of this problem: 

Information overload is especially common in high-tech startups. "Technology has changed, but human nature hasn't. Whether it's the Gold Rush of 1849 or the Web Rush of 1999, people are people. More often than not, they're miserable, nasty, selfish creatures, driven by vanity and greed, doing whatever they can to get ahead, even if it means stepping on the person next to them, crushing the weak, and destroying themselves in the process." Actually, this is not true. The IT industry is a unique environment; we are truly given more choice as to where our priorities lie than in many other jobs. But there is no free lunch. You want a cool job? Don't expect to work for a huge company and get paid the big bucks. You want to make good money? Don't expect to be able to leave the office in the middle of the day just to sit in the park and drink coffee. You want to make great money? Don't expect to work 40 or even 50 hours a week...

Actually, startups aren't paradise, nor are they viable for those who crave security. They are about risk, not just financial but also emotional and intellectual. Some think that the rewards for success are worth it, some don't... It's true that some startups hire, then harass and inflict burnout on programmers and sysadmins. Life in the fast lane can be brutal: long hours, almost no employer-employee loyalty, greed and moral cowardice, back-stabbing, pressure, etc. If you don't want to do what your boss wants, a startup can probably find an immigrant who will do it for less money. That is the Silicon Valley Way (TM).

Many visitors to this page are probably system administrators. It's sad to say, but sysadmins are often the janitors of e-business. To clean up the messes left by ugly packages, superfast growth and unrealistic schedules, they often work long, late hours. It's a thankless job (although not the only one, and not the most miserable one...). Anyway, the reality is that sysadmins and programmers in startups, and in small companies struggling to survive, are sometimes put under substantial stress... I'm surprised most of them aren't more neurotic from sleep deprivation.

At the same time, many sysadmins in established companies working with "Gold" coverage from Sun or HP can surf the Web for 80% of the day... And if you rarely show up before 11 a.m., sometimes it is just a survival skill to stay past midnight once in a while... In large companies most sysadmin roles don't involve constant firefighting or that much stress, but they tend to wear on a capable person pretty quickly. Sometimes the work really does look like cleaning: you clean today, and in a month everything returns to the same state. Sometimes it makes sense to play the idiot in a large company, in the best traditions of the Peter Principle. Officially recognized low performers can often spend 90% of their time addressing only 10% of the problems that a high performer needs to address. The most valued employees in large companies are often on the verge of burnout because they are too overloaded and have way too many pressures, conflicts and demands combined with too few rewards, acknowledgments and successes.

For IS top guns it might make sense to stop digging infodirt for a moment and ask themselves a simple question: "Is working with fancy hardware and software (let's assume for a moment that Unix can be fancy for the first five years or so ;-) worth 60, or even 40, hours a week of cleaning infodirt?" Independent of your answer, thinking about it may help you adjust your priorities :-).

Pseudo-Attention Deficit Disorder: Some programmers are perversely wired. It is not uncommon for them to be sitting in a meeting and using a hand-held device to exchange instant messages surreptitiously — with someone in the same meeting. You have pseudo-attention deficit disorder if:

Dr. Nikolai Bezroukov



Old News ;-)


[Jun 15, 2013] Messages Galore, but No Time to Think, by Phyllis Korkki

"In the name of simplicity, I even try to avoid instant messaging. But I also can’t help worrying that I am missing out. "
June 15, 2013 | NYT

I’m old enough to remember a simpler time in the office, when talking — whether in person or on the phone — was the main way to communicate. I once had a job where I filled out those pink “While You Were Out” slips for employees who had stepped away from their desks.

Then, in the 1990s, came e-mail, and things were never the same. Besides delivering a serious blow to the sellers of those pieces of paper, e-mail made communicating with people incredibly — and, at first, delightfully — easy.

Now, a few decades later, people constantly complain that their e-mail in-boxes are unmanageable. And many more technologies have joined the workplace party. We can now use cellphones, text messaging, instant messaging, social media, corporate intranets and cloud applications to communicate at work.

Something may have been lost as we adopted these new communication tools: the ability to concentrate.

“Nobody can think anymore because they’re constantly interrupted,” said Leslie Perlow, a Harvard Business School professor and author of “Sleeping With Your Smartphone.” “Technology has enabled this expectation that we always be on.” Workers fear the repercussions that could result if they are unavailable, she said.

The intermingling of work and personal life adds to the onslaught, as people communicate about personal topics during the workday, and about work topics when they are at home.

According to a 2011 article in The Ergonomics Open Journal, electronic communication tools can demand constant switching, which contributes to a feeling of “discontinuity” in the workplace. On the other hand, people sometimes deliberately introduce interruptions into their day as a way to reduce boredom and to socialize, the article said.

We’re only beginning to understand the workplace impact of new communication tools. The use of such technology in the office is “less rational than we would like to think,” said Steve Whittaker, a professor of human-computer interaction at the University of California, Santa Cruz. Sometimes, “it’s one person who’s an evangelist,” he said. “They will start using a particular thing, and they will bring other people along with them.”

More tech-oriented types might favor the latest new communication “toy,” while others, like me, are less enthusiastic. In the name of simplicity, I even try to avoid instant messaging. But I also can’t help worrying that I am missing out.

Plenty of workplace advice focuses on how we, as individuals, can manage our technology, but in many cases, this is a collective, team-level issue, Professor Perlow said.

As Professor Whittaker put it, “We haven’t stabilized our regular practices,” and these may need to be negotiated among workers.

It’s important to distinguish between collaborative and one-on-one communication, he said. Cloud-based systems are meant for sharing and editing documents, and they can enable people in different cities to work together in real time. Internal social media pages can be useful for seeking and sharing knowledge.

But when one person wants to communicate with another privately, e-mail remains the go-to method, Professor Whittaker said. That’s why it is nearly universal, despite a general yearning for something better.

To lessen the disruptive nature of e-mail and other messages, teams need to discuss how to alter their work process to allow blocks of time where they can disconnect entirely, Professor Perlow said. “I don’t think you can do it without leadership support,” she added.

Maybe more managers, consulting with their teams, need to set up clear guidelines for communication. When is it best to use the cloud? When is it best to use e-mail, or instant messaging? And when is it acceptable, even preferable, to turn off all technology? Not that managers need to be dictators, but a little clarity can lead to much more productivity.

Making it a priority to learn how to use the latest tools more effectively is a good idea, too. For example, how do those filters that help prioritize messages really work?

And let’s never forget the value of face-to-face, or voice-to-voice, communication. An actual unrehearsed conversation — requiring sustained attention and spontaneous reactions — may be old-fashioned, but it just might turn up something new.

 

[May 22, 2013] Present Shock And The Loss Of History And Context

Zero Hedge

Submitted by Charles Hugh-Smith of OfTwoMinds blog,


One of the few observers able to articulate a coherent critical account of American culture is Douglas Rushkoff. His new must-read book is Present Shock: When Everything Happens Now.

I have long found inspiration and insight in Rushkoff's work, especially his keen understanding of the pathologies of consumerism. In my 2009 book Survival+, I wrote:

Rushkoff's reply to an interview question on the consequences of ubiquitous marketing reveals how media/marketing has created an unquestioned politics of experience in which one's identity and sense of self is constructed almost entirely by what one buys:

"Children are being adultified because our economy is depending on them to make purchasing decisions. So they're essentially the victims of a marketing and capitalist machine gone awry. You know, we need to expand, expand, expand. There is no such thing as enough in our current economic model and kids are bearing the brunt of that.... So they're isolated, they're alone, they're desperate. It's a sad and lonely feeling....The net effect of all of this marketing, all of this disorienting marketing, all of the shock media, all of this programming designed to untether us from a sense of self, is a loss of autonomy. You know, we no longer are the active source of our own experience or our own choices. Instead, we succumb to the notion that life is a series of product purchases that have been laid out and whose qualities and parameters have been pre-established."

In my view, this is a brilliant analysis of the rot at the heart of the American project.

In his new book, Rushkoff examines the telescoping of time and context wrought by ubiquitous digital technologies. We're always accessible, always connected and every channel is always on; this overload affects not just our ability to process information but our culture and the way media and marketing are designed and delivered.

The title consciously plays off the influential 1970 book by Alvin Toffler, Future Shock, which posited that our innate ability to process change was limited even as the rate of change in our post-industrial world increased. That rate of change would soon overwhelm our capacity to process new inputs and adapt to them.

In Rushkoff's view, we've reached that future: the speed of change and the demands of the present are disorienting us in profound ways.

We all know what stress feels like: it often causes our view to narrow to the present stressor, and we lose perspective and the ability to "make sense" of anything beyond managing the immediate situation.

Rushkoff identifies five symptoms of present shock:

  1. Narrative collapse - the loss of linear stories and their replacement with both crass reality programming and post-narrative shows like The Simpsons.
  2. Digiphrenia – digitally provoked mental chaos as technology lets us be in more than one place at any one moment. As Rushkoff notes in this chapter: Our boss isn't the guy in the corner office, but the PDA in our pocket. Our taskmaster is depersonalized and internalized.
  3. Overwinding – trying to squish huge timescales into much smaller ones, for example, packing a year’s worth of retail sales expectations into a single Black Friday event.
  4. Fractalnoia – making sense of our world entirely in the present tense, by drawing connections between things with weak causal relationships, for example Big Data, which excels at identifying correlations but is utterly incapable of identifying cause amidst the correlations.
  5. Apocalypto – the intolerance for presentism leads us to fantasize a grand finale, the cultural equivalent of a "market-clearing event."

As Janet Maslin of the New York Times wrote in her review: "How do we shield ourselves from distraction, or gravitate to what really matters?"

Studies have shown that our innate ability to remember people and identify their relationships with others is limited to around 100 people--the size of a village or combat company. We undoubtedly have similar innate limitations on how many channels of input we can absorb.

Clay Shirky (author of Here Comes Everybody: The Power of Organizing Without Organizations) calls this filter failure, his term for what used to be called information overload. Our filters become overloaded and we lose the ability to "make sense" of what's going on around us.

As the phenomenologists discovered in the 20th century, our basic coping mechanism is to separate the world (and inputs) into three basic categories: the focal point, the foreground and the deep background. Being unable to sort out which input belongs in the three spaces leads to disorientation and poor decisions.

The parallels between filter failure and stress are not coincidental, as we handle filter failure and present shock the same way we handle stress: we limit inputs and make a concerted effort to reorient our awareness and context, what some call "be still and know."

Another troubling parallel to present shock is addiction. People now respond to texts, emails, alerts and phone calls like rats in the proverbial cage with the lever that releases another tab of cocaine: they over-stimulate themselves to death but are incapable of restraining their impulse for more.

The "obvious" solution is to turn off inputs as a way of restoring our ability to live in a present without novelty and distraction. This is akin to withdrawal from a powerful opiate, and so we should not be surprised that there are now treatment facilities for kids who need to detox from digital inputs.

Rushkoff is especially attuned to the distortions in our experience of time created by digital media and communications, the heart of present shock: “Time in the digital era is no longer linear but disembodied and associative. The past is not something behind us on the timeline but dispersed through the sea of information.”

In effect, change no longer flows linearly like time; it flows in all directions at once.

History and meaningful context are both fatally disrupted by this non-linear flow of time and narrative. Is it any wonder that we now read about young well-educated people who do not understand the meaning of "policy"? To understand policy requires a grasp of the histories and narratives that led to the policy, and the linear, causally-linked way that policy is designed to solve or ameliorate a specific problem or challenge.

If the causal chains of history and narrative are disrupted, then how can anyone fashion a meaningful context for actions and narratives, and effectively frame problems and solutions? If everything is equally valid in a non-linear flood of data, then what roles can authenticity, experience and knowledge play in making sense of our world?

These are knotty, complex issues, and you will find much to constructively ponder in Present Shock.

[May 19, 2013] Faced With Overload, a Need to Find Focus by Tony Schwartz

NYTimes.com

For more than a decade, the most significant ritual in my work life has been to take on the most important task of the day as my first activity, for 90 minutes, without interruption, followed by a renewal break. I do so because mornings are when I have the highest energy and the fewest distractions.

... Far and away the biggest work challenges most of us now face are cognitive overload and difficulty focusing on one thing at a time.

Whenever I singularly devote the first 90 minutes of my day to the most challenging or important task – they’re often one and the same — I get a ton accomplished.

Following a deliberate break – even just a few minutes — I feel refreshed and ready to face the rest of the day. When I don’t start that way, my day is never quite as good, and I sometimes head home at night wondering what I actually did while I was so busy working.

Performing at a sustainably high level in a world of relentlessly rising complexity requires that we manage not just our time but also our energy – not just how many hours we work, but when we work, on what and how we feel along the way.

Fail to take control of your days — deliberately, consciously and purposefully — and you’ll be swept along on a river of urgent but mostly unimportant demands.

It’s all too easy to rationalize that we’re powerless victims in the face of expectations from others, but doing that is itself a poor use of energy. Far better to focus on what we can influence, even if there are times when it’s at the margins.

Small moves, it turns out, can make a significant difference.

When it comes to doing the most important thing first each morning, for example, it’s best to make that choice, along with your other top priorities, the night before.

Plainly, there are going to be times that something gets in your way and it’s beyond your control. If you can reschedule for later, even 30 minutes, or 45, do that. If you can’t, so be it. Tomorrow is another day.

If you’re a night owl and you have more energy later in the day, consider scheduling your most important work then. But weigh the risk carefully, because as your day wears on, the number of pulls on your attention will almost surely have increased.

Either way, it’s better to work highly focused for short periods of time, with breaks in between, than to be partially focused for long periods of time. Think of it as a sprint, rather than a marathon. You can push yourself to your limits for short periods of time, so long as you have a clear stopping point. And after a rest, you can sprint again.

How you’re feeling at any given time profoundly influences how effectively you’re capable of working, but most of us pay too little attention to these inner signals.

Fatigue is the most basic drag on productivity, but negative emotions like frustration, irritability and anxiety are equally pernicious. A simple but powerful way to check in with yourself is to intermittently rate the quantity and quality of your energy — say at midmorning, and midafternoon — on a scale from 1 to 10.

If you’re a 5 or below on either one, the best thing you can do is take a break.

Even just breathing deeply for as little as one minute – in to a count of three, out to a count of six – can quiet your mind, calm your emotions and clear your bloodstream of the stress hormone cortisol.

Learn to manage your energy more skillfully, and you’ll get more done, in less time, at a higher level of focus. You’ll feel better — and better about yourself — at the end of the day.

About the Author

Tony Schwartz is the chief executive of the Energy Project and the author, most recently, of “Be Excellent at Anything: The Four Keys to Transforming the Way We Work and Live.” Twitter: @tonyschwartz

Anne-Marie Hislop, Chicago

The key is figuring out when we are most productive and focused. Although a morning person, I need early time for my rituals - exercise, shower, coffee, and NYT online (along with pop-ins at other sites). Then, by the time I get to work around 8 a.m., I will have my most focused, productive hours.

I also find that standing while I work on my computer, which I do more and more when I can (don't have a way to type extensively while standing) energizes me and helps me focus.

John Lamont, Pennsylvania

Frankly, I think the core premise of the article, how to get more "done," needs to be questioned, especially in the context to which it speaks, the corporate environment. We would be far better served if Mr. Schwartz's audience spent the first 30 minutes of their day sitting back and thinking about what they do, who it's really for, what consequences their own and their company's actions have, and whether those actions are morally and ecologically ethical.

In the grand scheme of things I don't doubt mankind is now better off than it was 2,000 or even 500 years ago, but in our relentless drive to produce, consume and sell we have developed, and continue to develop, technologies and complex global social interactions that have a good chance of setting us back to the Stone Age.

Let's not be in such a hurry getting wherever we think we're going and spend a bit more time pondering about where it is we actually want to be and who we want to be when we get there.

When Android Ate My Best Friends by Carla Schroder

 Everyone has a cell phone these days. Out here in my little corner of the world, in a county that competes with the neighboring county for the poorest in the state, everyone can somehow afford smartphones with generous data plans. I have no idea what people's eye colors are anymore, or if they even have eyes, because all I see are the tops of their heads as they are bent over their tiny screens. This stuff is not cheap-- I don't know anyone whose monthly bite is under a hundred dollars. Which is why I have a cheapo TracFone, because I refuse to pay that much. Plus I like hoarding minutes, so I turn it off. I don't have to be in constant contact with my eleventeen bestest friends at all times.

Michelle, Ma Bell

American telecoms are special beasts. Back in the olden days we had a single giant regulated monopoly, AT&T. Technological progress was non-existent, and stuck at a level barely above Alexander Graham Bell's original prototype. We could not own our telephones, but had to lease them from the phone company, which made those old phones some of the most valuable hardware in existence because we kept paying for them year after year after year. We could not add extensions, or attach any other equipment. The one upside was rock-solid service, which set American telephone service apart from most other countries, where unreliability was the norm.

Then Ma Bell was broken up and we got competition, sort of. We never got a choice of carriers for local service, but long distance became competitive. Though again only sort of, because in-state calls cost more than cross-country calls, among other oddities. Now, with mobile phones everywhere, a lot of people don't even bother with land lines, and they're all happy at getting free long distance, even though it's not really free and they're paying a lot more. But even though mobile service costs more, it includes more, like worse call quality and no-service areas. I estimate that 40% of all cell traffic is "What? Are you there? Hello? What?"

When We Dialed Telephones

Where was I going with all this? Oh yeah, ubiquity. My grandmother had a single black dial telephone, and it sat on a special table next to a chair in her entry hall. A phone call was a bit of an event-- she couldn't Web surf while half-listening, or watch TV, or go shopping, or put people on hold and juggle multiple calls. She had conversations, one at a time. She couldn't just pick up and call someone when she felt like it because she was on a party line. That is a shared phone line, which meant everyone who shared the line could eavesdrop on your calls. When I was growing up, the other giant time- and attention-pit-of-suck, television, was not yet everywhere, and a lot of our friends did not have TVs. So when we got together we talked to each other. With eye contact and everything.

Now we're all proud that Android dominates mobile phones, rah rah Linux. Little kids have their own phones, and again I marvel at how much people are willing to pay for their mobile fix. Sure, for some it's a necessity, but in my somewhat humble opinion most of the time it's more akin to an addiction. It's like the rats that push the button that stimulates the pleasure centers of their brains, and then starve to death because they won't push the food button. Humans just plain love to push buttons, and are willing to pay top dollar for the privilege-- vending machines, video poker, serial channel-surfing on the TV, mobile phones; give them buttons to push and they're happy for hours.

Whoa, you might be thinking, get off the grumpy train, because mobile phones are useful tools! And you are right. But I'm still going to be grumpy at people who won't turn them off when we're visiting or doing an activity together. You know those people who have to answer the phone no matter what they're doing? Showering, sleeping, birthing babies? They're a thousand times worse with mobile phones. In the olden days it was considered rude to leave the TV on when people came to visit. Unless they came to watch a program, of course. Remember when call waiting was all new and special? And an insult, like whoever you were talking to was hoping someone better than you would interrupt your call. Now the phone is the TV, along with a million other interruptions, distractions and delights, all portable. We can't escape the things.

Thinking. No Really.

One of the things I love about computer nerds is most of them understand the need for long stretches of uninterrupted time to think, and to concentrate deeply on a task. It is impossible to master a new skill or solve a problem when you're skittering randomly from one activity to the next, never engaging more than the bare surface of your consciousness. It's unsatisfying, because you never accomplish anything. Multi-tasking is a myth. It is the very rare human who can perform two or more tasks at once. A "multi-tasker" is someone who juggles multiple chores and does a poor job at all of them. I prefer total immersion: full attention and no distractions.

Magic happens in your brain when you achieve that state of total focus. It's almost a meditative state. Obstacles fall away and your path becomes wide and clear. It's as though you're forging new neural pathways and amping up your brainpower. Single-tasking has superpowers.

When Television Ate My Best Friend

The more things change, the more they stay the same, so please enjoy Linda Ellerbee's classic When Television Ate My Best Friend:  

"At last I knew what had happened to Lucy. The television ate her.  It must have been a terrible thing to see. Now my parents were thinking of getting one. I was scared. They didn’t understand what television could do."

Beginning Android Programming

Pushing buttons is fun, and building the buttons is a million times more fun. Try Juliet Kemp's excellent introduction to Android programming:

Android Programming for Beginners: Part 1

Android Programming for Beginners: Part 2
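In the spirit of building the buttons rather than just pushing them, here is a minimal sketch of the kind of click handler those beginner tutorials walk through. The activity name, the layout file and the push_me id are hypothetical, invented for illustration rather than taken from Kemp's articles:

    import android.app.Activity;
    import android.os.Bundle;
    import android.view.View;
    import android.widget.Button;
    import android.widget.Toast;

    public class ButtonDemoActivity extends Activity {
        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            // Assumes res/layout/main.xml defines a Button with android:id="@+id/push_me".
            setContentView(R.layout.main);

            Button button = (Button) findViewById(R.id.push_me);
            button.setOnClickListener(new View.OnClickListener() {
                @Override
                public void onClick(View v) {
                    // Instant gratification: acknowledge every push.
                    Toast.makeText(ButtonDemoActivity.this, "Button pushed!",
                            Toast.LENGTH_SHORT).show();
                }
            });
        }
    }

Wire that up, install it on a phone, and you can study the button-pushing itch from the supply side for a change.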

[Apr 28, 2012] Lessons from Sheryl Sandberg: Stop Working More Than 40 Hours a Week

Inc.com

You may think you're getting more accomplished by working longer hours. You're probably wrong.

There's been a flurry of recent coverage praising Sheryl Sandberg, the chief operating officer of Facebook, for leaving the office every day at 5:30 p.m. to be with her kids. Apparently she's been doing this for years, but only recently "came out of the closet," as it were.

What's insane is that Sandberg felt the need to hide the fact, since there's a century of research establishing the undeniable fact that working more than 40 hours per week actually decreases productivity.

In the early 1900s, Ford Motor ran dozens of tests to discover the optimum work hours for worker productivity. They discovered that the "sweet spot" is 40 hours a week–and that, while adding another 20 hours provides a minor increase in productivity, that increase only lasts for three to four weeks, and then turns negative.

Anyone who's spent time in a corporate environment knows that what was true of factory workers a hundred years ago is true of office workers today. People who put in a solid 40 hours a week get more done than those who regularly work 60 or more hours.

The workaholics (and their profoundly misguided management) may think they're accomplishing more than the less fanatical worker, but in every case that I've personally observed, the long hours result in work that must be scrapped or redone.

Accounting for Burnout

What's more, people who consistently work long work weeks get burned out and inevitably start having personal problems that get in the way of getting things done.

I remember a guy in one company I worked for who used the number of divorces in his group as a measure of its productivity. Believe it or not, his top management reportedly considered this a valid metric. What's ironic (but not surprising) is that the group itself accomplished next to nothing.

In fact, now that I think about it, that's probably why he had to trot out such an absurd (and, let's face it, evil) metric.

Proponents of long work weeks often point to the even longer average work weeks in countries like Thailand, Korea, and Pakistan–with the implication that the longer work weeks are creating a competitive advantage.

Europe's Ban on 50-Hour Weeks

However, the facts don't bear this out. In six of the top 10 most competitive countries in the world (Sweden, Finland, Germany, Netherlands, Denmark, and the United Kingdom), it's illegal to demand more than a 48-hour work week. You simply don't see the 50-, 60-, and 70-hour work weeks that have become de rigueur in some parts of the U.S. business world.

If U.S. managers were smart, they'd end this "if you don't come in on Saturday, don't bother coming to work on Sunday" idiocy. If you want employees (salaried or hourly) to get the most done–in the shortest amount of time and on a consistent basis–40 hours a week is just about right.

In other words, nobody should be apologizing for leaving work at a reasonable hour like 5:30 p.m. In fact, people should be apologizing if they're working too long each week–because it's probably making the team less effective overall.

[Apr 07, 2011] Rational Inattention

These models are interesting, but the mathematics underneath them can be challenging (here's a taste):

'Rational Inattention' Guides Overloaded Brains, Helps Economists Understand Market Behavior, by Antonella Tutino, Economic Letter, FRB Dallas: Between Internet news sources, social media and email, people are awash in information, most of it accessible at near-zero cost. Yet, humans possess only a finite capacity to process all of it. The average email user, for example, receives dozens of messages per day. The messages can’t all receive equal attention. How carefully does someone read an email from a sibling or friend before crafting a reply? How closely does a person read an email from the boss?

Limitations on the ability to process information force people to make choices regarding the subjects to which they pay more or less attention. Economists have long acknowledged the existence of human cognitive capacities, but only in recent years have models embodying such limits known as “rational inattention” found their way into mainstream macroeconomics.

Rational inattention models have a broad range of applications. They may reconcile relatively unchanged prices and volatile ones and how the two play out in aggregate demand in the U.S. economy. Moreover, such models can capture salient features of the business cycle, providing a rationale for sharp contractions or slower expansions. Finally, rational inattention models have significant implications for monetary policy. Since the focus of these models revolves around formation of peoples’ expectations, understanding how individuals perceive the economy is instrumental to policymakers’ efforts to achieve output and price stabilization objectives.

Rational Inattention: A Primer

One macroeconomic school of thought—known as rational expectations—assumes that people fully and quickly process all freely available information. By comparison, under rational inattention theory, information is also fully and freely available, but people lack the capability to quickly absorb it all and translate it into decisions. Rational inattention is based on a simple observation: Attention is a scarce resource and, as such, it must be budgeted wisely.[1]
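The primer above stays in prose, so here is the promised "taste" of the math. A standard formulation, along the lines of Christopher Sims' work on rational inattention (my sketch, not taken from the Economic Letter itself): the decision maker chooses how precise a signal s about the unknown state x to process, and attention is priced in bits using Shannon's mutual information:

    \max_{f(s \mid x)} \; \mathbb{E}\big[\, U(a(s), x) \,\big]
    \quad \text{subject to} \quad
    I(X; S) = H(X) - H(X \mid S) \le \kappa

Here H(X) is the entropy of the state, H(X|S) is the entropy that remains after the signal is observed, and \kappa is the agent's limited channel capacity, i.e., the attention budget. With unlimited capacity the constraint never binds and the model collapses to rational expectations; with a small \kappa the agent rationally reads the boss's email carefully and skims the sibling's.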

[Dec 29, 2010] The Shallows: What the Internet Is Doing to Our Brains, by Nicholas Carr

Part of the problem is information overload, and the author should call it by that name. The other part is that a deluge of Internet information does not mean high-quality information. There is a specific "Google effect": pages are created just to extract advertising fees and are promoted using "link farms" to get a high Google ranking for the topic. And spending time on junk is spending time on junk, whether it is electronic or not; it's just easier with a computer. Nicholas Carr's earlier article in The Atlantic, "Is Google Making Us Stupid?", covers the same ground.
Amazon.com

William Timothy Lukeman Death by a thousand distracting cuts, June 8, 2010

In this short but informative, thought-provoking book, Nicholas Carr presents an argument I've long felt to be true on a humanist level, but supports it with considerable scientific research. In fact, he speaks as a longtime computer enthusiast, one who's come to question what he once wholeheartedly embraced ... and even now, he takes care to distinguish between the beneficial & detrimental aspects of the Internet.

The argument in question?

The studies that Carr presents are troubling, to say the least. From what has been gleaned to date, it's clear that the brain retains a certain amount of plasticity throughout life -- that is, it can be reshaped, and the way that we think can be reshaped, for good or for ill. Thus, if the brain is trained to respond to & take pleasure in the faster pace of the digital world, it is reshaped to favor that approach to experiencing the world as a whole. More, it comes to crave that experience, as the body increasingly craves more of anything it's trained to respond to pleasurably & positively. The more you use a drug, the more you need to sustain even the basic rush.

And where does that leave the mind shaped by deep reading? The mind that immerses itself in the universe of a book, rather than simply looking for a few key phrases & paragraphs? The mind that develops through slow, quiet contemplation, mulling over ideas in their entirety, and growing as a result? The mature mind that ponders possibilities & consequences, rather than simply going with the bright, dazzling, digital flow?

Nowhere, it seems.

Carr makes it clear that the digital world, like any other technology that undeniably makes parts of life so much easier, is here to stay. All the more reason, then, to approach it warily, suspiciously, and limit its use whenever possible, since it is so ubiquitous. "Yes, but," many will say, "everything is moving so fast that we've got to adapt to it, keep up with it!" Not unlike the Red Queen commenting that it takes all of one's energy & speed to simply remain in one place while running. But what sort of life is that? How much depth does it really have?

Because some aspects of life -- often the most meaningful & rewarding aspects -- require time & depth. Yet the digital world constantly makes us break it into discrete, interchangeable bits that hurtle us forward so rapidly & inexorably that we simply don't have time to stop & think. And before we know it, we're unwilling & even unable to think. Not in any way that allows true self-awareness in any real context.

Emerson once said (as aptly quoted by Carr), "Things are in the saddle / And ride mankind." The danger is that we'll not only willingly, even eagerly, wear those saddles, but that we'll come to desire them & buckle them on ever more tightly, until we feel naked without them. And we'll gladly pay anything to keep them there, even as we lose the capacity to wonder why we ever put them on in the first place.

Most highly recommended!

Devin (Vancouver)

This book is a more fully fleshed out attempt to answer the question that Carr first posed a couple of years ago in an article titled "Is Google Making Us Stupid?"

The Shallows: What the Internet is Doing to Our Brains explores the ideas of his Google article in much more detail.

To fully understand what the Internet is doing to our brains, we must first understand our brains. Carr highlights results from a variety of iconic and recent studies that illustrate the plasticity of our thinking organs. We see experiments ranging from the severed sensory nerves of monkeys' hands in the 1960s (and their brains' subsequent 'rewiring') to London taxi drivers whose posterior hippocampi (the "part of the brain that plays a key role in storing and manipulating spatial representations of a person's surroundings") were much larger than normal. In short, we see plenty of evidence that the brain can reorganize itself, and is certainly not fixed in one state for all of its adult life.

The Shallows then explores the history of the written word and its explosion due to Gutenberg's invention, and even further back to the argument between Socrates and Plato concerning the value of the written word. Socrates argued that if we committed all of our thoughts to paper, we would not have to remember anything. How do we know this? From the writings of Plato, of course. The soundwaves of Socrates' voice, as wise as he was, cannot travel through time like written words can.

With the first half detailing the brain's plasticity and our species' history with the accumulation of knowledge, Carr sets up the latter half of the book perfectly, and his ideas might be grossly simplified into something like this:

P1: Experiments of brain plasticity have proven that our brains change over time.
P2: We are using the Internet for an increasing amount of our activities, including work, entertainment and commerce.
P3: The Internet is a medium that encourages distractedness and makes our brains inept at remembering.

C: We are all becoming a lot more dependent upon our digital devices, and in doing so, are increasingly distracted in everything we do, both online and off.

Carr's book is a giant caution sign on the side of the road that we ride into the increasingly digital future. The caution sign might be too far behind us already, as we've blazed ahead and rewired our minds to think like computers - logical, task-switching, and distracted at every second of the day. If people in their 30s and 40s (who may have had the Internet for approximately 25-40% of their lifetimes) are experiencing these changes in their brains, imagine the effect the Internet is having on our youth. The Net Generation is defined as those who have grown up with the Net for more than half their lives. There are some who have had the Internet for 100% of their life spans. Imagine that: never knowing a world without the Internet. Yes, some children are younger than Google. Imagine explaining to your grandchildren that you grew up in a time that didn't have the Internet, let alone the information-organizing superpower known as Google.

Will we look back at this period of transition from print to digital and see it as being as momentous as the shift from an oral culture to a print culture? What would Socrates have thought? Have we become lesser human beings, inextricably tied to the addictive external memories of our computers and mobile phones?

Could it be that George W. Bush's infamous "the Internets" quote was just a sign of the stupidness to come? Perhaps Bush was ahead of his time. Perhaps the Flynn effect is about to peak, or already has. Could the greatest learning tool ever created be so useful that we forget how to think as we use it?

This is a great book for anyone who's interested in our society as a whole, how our brains work, the effects of technology, and the process of learning. Highly recommended.

J. Edgar Mihelic "Iconoclast, Juggernaut" (Chicago) My wisdom is getting flatter, December 8, 2010

Carr, in his epilogue to this work, warns that we as a species have to be 'attentive to what we stand to lose.' In his view, the brain's adaptation to the newer and newer technologies in effect flattens our minds and deprives the individual human of the depth that once could be called wisdom. I agree with him more because I am an avowed Luddite than for the wonderful argument he lays out.

For example, I am enrolled in a science class at this time. As the semester comes to a close, I have an average approaching 100. The problem is that I have actually learned little of the actual science; I have instead learned to utilize the electronic tools built into the shell, since the entire evaluative framework of the class is on-line. I have learned how to find the answers, but I do not know the answers. Compare this to a literature class, where you have to read and analyze and memorize things, and your own wisdom grows because you build long-term memories that give crucial context. This may not matter, per se, in terms of the professions of the future, but it has a real impact when it comes to human-to-human interaction or aesthetic enjoyment.

The ironic thing is that I did not read this book. I instead listened to it on my mp3 player as I worked, and from time to time flipped through to look at Amazon or Facebook or Gmail. My own mind has developed according to the standards of the internet -- I am hyperlinks, not a straight narrative. I can no longer read one book, but have to be 'reading' many simultaneously. This factor will only increase as time and technology advance, as we see ourselves more in terms of machines, and the machines start to see themselves in terms of us.

barvazos

The author's description of our decrease in attention and cognitive capabilities is at times a bit exaggerated.

However, the idea of our mind and thought process physically changing, and the descriptions of the evolution of thought, are really insightful and coherent.

Good book, lots of stuff to think about.
Enjoy.

[Aug 25, 2010] Your Brain on Computers: Overuse of Digital Devices May Lead to Brain Fatigue (NYTimes.com)

SAN FRANCISCO — It’s 1 p.m. on a Thursday and Dianne Bates, 40, juggles three screens. She listens to a few songs on her iPod, then taps out a quick e-mail on her iPhone and turns her attention to the high-definition television.

Just another day at the gym.

As Ms. Bates multitasks, she is also churning her legs in fast loops on an elliptical machine in a downtown fitness center. She is in good company. In gyms and elsewhere, people use phones and other electronic devices to get work done — and as a reliable antidote to boredom.

Cellphones, which in the last few years have become full-fledged computers with high-speed Internet connections, let people relieve the tedium of exercising, the grocery store line, stoplights or lulls in the dinner conversation.

The technology makes the tiniest windows of time entertaining, and potentially productive. But scientists point to an unanticipated side effect: when people keep their brains busy with digital input, they are forfeiting downtime that could allow them to better learn and remember information, or come up with new ideas.

Ms. Bates, for example, might be clearer-headed if she went for a run outside, away from her devices, research suggests.

At the University of California, San Francisco, scientists have found that when rats have a new experience, like exploring an unfamiliar area, their brains show new patterns of activity. But only when the rats take a break from their exploration do they process those patterns in a way that seems to create a persistent memory of the experience.

The researchers suspect that the findings also apply to how humans learn.

“Almost certainly, downtime lets the brain go over experiences it’s had, solidify them and turn them into permanent long-term memories,” said Loren Frank, assistant professor in the department of physiology at the university, where he specializes in learning and memory. He said he believed that when the brain was constantly stimulated, “you prevent this learning process.”

At the University of Michigan, a study found that people learned significantly better after a walk in nature than after a walk in a dense urban environment, suggesting that processing a barrage of information leaves people fatigued.

Even though people feel entertained, even relaxed, when they multitask while exercising, or pass a moment at the bus stop by catching a quick video clip, they might be taxing their brains, scientists say.

“People think they’re refreshing themselves, but they’re fatiguing themselves,” said Marc Berman, a University of Michigan neuroscientist.

Regardless, there is now a whole industry of mobile software developers competing to help people scratch the entertainment itch. Flurry, a company that tracks the use of apps, has found that mobile games are typically played for 6.3 minutes, but that many are played for much shorter intervals. One popular game that involves stacking blocks gets played for 2.2 minutes on average.

Today’s game makers are trying to fill small bits of free time, said Sebastien de Halleux, a co-founder of PlayFish, a game company owned by the industry giant Electronic Arts.

“Instead of having long relaxing breaks, like taking two hours for lunch, we have a lot of these micro-moments,” he said. Game makers like Electronic Arts, he added, “have reinvented the game experience to fit into micro-moments.”

Many business people, of course, have good reason to be constantly checking their phones. But this can take a mental toll. Henry Chen, 26, a self-employed auto mechanic in San Francisco, has mixed feelings about his BlackBerry habits.

“I check it a lot, whenever there is downtime,” Mr. Chen said. Moments earlier, he was texting with a friend while he stood in line at a bagel shop; he stopped only when the woman behind the counter interrupted him to ask for his order.

Mr. Chen, who recently started his business, doesn’t want to miss a potential customer. Yet he says that since he upgraded his phone a year ago to a feature-rich BlackBerry, he can feel stressed out by what he described as internal pressure to constantly stay in contact.


[Aug 24, 2010] Information Overload

August 25, 2010 | Sra. Weldon

Do you ever wonder if you spend too much time online or find that multitasking makes homework feel like it takes forever?  Do you end up getting less sleep because you’re texting, chatting or surfing or because you find that your brain can’t seem to shut itself down for the night?

The NY Times has an entire series devoted to such topics called “Your Brain on Computers.”  In An Ugly Toll of Technology: Impatience and Forgetfulness, a Dr. Kimberly Young is cited comparing net addiction to eating disorders. Even adults are at risk for various problems involving our ability to parent and nurture, as is highlighted by The Risks of Parenting While Plugged In; there the addiction was compared to alcoholism, the way a parent might say to an objecting son or daughter “just one more text” while driving.

In Attached to Computers and Paying a Price, our connection to constant information changes our brain chemistry with squirts of dopamine (which are possibly addictive) and alters our expectations of daily life in terms of boredom. More dangerous are the risks involved in using devices while driving, yet it’s easy to understand why we reach for them, since so many of us are bound to be addicted to them. We read of families missing big business deals, having family fights (and forgetting to pick up kids!) and getting lower grades.

According to the article, “At home, people consume 12 hours of media a day on average, when an hour spent with, say, the Internet and TV simultaneously counts as two hours. That compares with five hours in 1960, say researchers at the University of California, San Diego. Computer users visit an average of 40 Web sites a day, according to research by RescueTime, which offers time-management tools.

As computers have changed, so has the understanding of the human brain. Until 15 years ago, scientists thought the brain stopped developing after childhood. Now they understand that its neural networks continue to develop, influenced by things like learning skills.”

If you’d like to test your own ability to focus, the NY Times features this test.  If you think you might be addicted and would like to find out more, check out the Net Addiction site.

For what it’s worth, I think technology can be helpful for practicing Spanish and staying up to date with what is going on at school.  It’s even useful for contact between teachers, students and parents (at times).  What I take away from the NY Times series is how essential boundaries are in terms of when/where my family and I are plugged in–there have to be concrete limits for us not to get lost and fragmented.

[Nov 25, 2009] Fixed-Schedule Productivity: How I Accomplish a Large Amount of Work in a Small Number of Work Hours

February 15th, 2008 | Study Hacks

My Schedule Should Be Terrible…

I should have an overwhelming, Maalox-guzzling, stress-saturated schedule. Here’s why: I’m a graduate student in a demanding program. I’m working on several research papers while also attempting to nail down some key ideas for my dissertation. I’m TA’ing and taking courses. I maintain this blog. I’m a staff writer for Flak Magazine. And to keep things interesting, I’m working on background research for a potential new book project.

You would be reasonable to assume that I must get, on average, 7 - 8 minutes of sleep a night. But you would also be wrong. Let me explain…

For Some Reason It’s Not…

Here is my actual schedule. I work:

That’s it. Unless I’m bored, I have no need to even turn on a computer after 5 during the week or any time on Saturday. I fill these times, instead, doing, well, whatever I want.

How do I balance an ambitious work load with an ambitiously sparse schedule? It’s a simple idea I call fixed-schedule productivity.

Fixed-Schedule Productivity

The system works as follows:

  1. Choose a schedule of work hours that you think provides the ideal balance of effort and relaxation.
  2. Do whatever it takes to avoid violating this schedule.

This sounds simple. But think about it for a moment. Satisfying rule 2 is not easy. If you took your current projects, obligations, and work habits, you’d probably fall well short of satisfying your ideal work schedule. Here’s a simple truth: to stick to your ideal schedule will require some drastic actions. For example, you may have to:

In the abstract, these all seem like hard things to do. But when you have the focus of a specific goal — “I do not want to work past 5 on weekdays!” — you’d be surprised by how much easier it becomes to deploy these strategies in your daily life.

Let’s look at an example…

Case Study: My Schedule

My schedule provides a good case study. To reach my relatively small work-hour limit, I have to be careful with how I go about my day. I see enough bleary-eyed insomniacs around here to know how easy it is to slip into a noon-to-3-a.m. routine (the infamous “MIT cycle”). Here are some of the techniques I regularly use to remain within the confines of my fixed schedule:

Why This Works

You could fill any arbitrary number of hours with what feels to be productive work. Between e-mail, and “crucial” web surfing, and to-do lists that, in the age of David Allen, grow to lengths that rival the Bible, there is always something you could be doing. At some point, however, you have to put a stake in the ground and say: I know I have a never-ending stream of work, but this is when I’m going to face it. If you don’t do this, you let the never-ending stream of work push you around like a bully. It will force you into tiring, inefficient schedules. And you’ll end up more stressed and no more accomplished.

Fix the schedule you want. Then make everything else fit around your needs. Be flexible. Be efficient. If you can’t make it fit: change your work. But in the end, don’t compromise. No one really cares about your schedule except for yourself. So make it right.

On the BBC News site, Bill Thompson takes the discussion in an interesting new direction:

The Swiss developmental psychologist Jean Piaget described two processes that he believed lay behind the development of knowledge in children. The first is assimilation, where new knowledge fits into existing conceptual frameworks. More challenging is accommodation, where the framework itself is modified to include the new information.

The current generation of 'search engines' seems to encourage a model of exploration that is disposed towards assimilative learning: finding sources, references and documents which can be slotted into existing frameworks, rather than providing material for deeper contemplation of the sort that could provoke accommodation and the extension, revision or even abandonment of views, opinions or even whole belief systems.

Perhaps the real danger posed by screen-based technologies is not that they are rewiring our brains but that the collection of search engines, news feeds and social tools encourages us to link to, follow and read only that which we can easily assimilate.

In The Globe and Mail, columnist Margaret Wente becomes the latest writer to fess up to an evaporating ability to read long works of prose:

Google has done wondrous things for my stock of general knowledge. It also seems to have destroyed my attention span. Like a flea with ADD, I jump back and forth from the Drudge Report to gardening sites that list the growing time of Green Zebras …

Thanks to Google, we're all turning into mental fast-food junkies. Google has taught us to be skimmers, grabbing for news and insights on the fly. I skim books now too, even good ones. Once I think I've got the gist, I'll skip to the next chapter or the next book. Forget the background, the history, the logical progression of an argument. Just give me the takeaway.

[Jul 1, 2008] Rough Type: Nicholas Carr's Blog

Make information free, and we'll become gluttons of information, as Rob Horning notes in an interesting post today:

As behavioral economists (most vociferously, Dan Ariely) have pointed out, we find the promise of free things hard to resist (even when a little thinking reveals that the free-ness is illusory). So when with very little effort we can accumulate massive amounts of “free” stuff from various places on the internet, we can easily end up with 46 days (and counting) worth of unplayed music on a hard drive. We end up with a permanent 1,000+ unread posts in our RSS reader, and a lingering, unshakable feeling that we’ll never catch up, never be truly informed, never feel comfortable with what we’ve managed to take in, which is always in the process of being undermined by the free information feeds we’ve set up for ourselves. We end up haunted by the potential of the free stuff we accumulate, and our enjoyment of any of it becomes severely impinged. The leisure and unparalleled bounty of a virtually unlimited access to culture ends up being an endless source of further stress, as we feel compelled to take it all in. Nothing sinks in as we try to rush through it all, and our rushing does nothing to keep us from falling further behind—often when I attempt to tackle the unread posts in my RSS reader, I end up finding new feeds to add, and so on, and I end up further behind than when I started.

Information may be free, but, as Horning explains, it exacts a price in the time required to collect, organize, and consume it. As we binge on the Net, the time available for other intellectual activities - like, say, thinking - shrinks. Eventually, we get bloated, mentally, and a kind of intellectual nausea sets in. But we can't stop because - hey - it's free.

[Jun 23, 2008] Multitasking Considered Detrimental

Posted by kdawson on Monday June 23, @02:20AM
from the but-we-knew-this dept. djvaselaar sends along an article from The New Atlantis that summarizes recent research indicating that multitasking may be detrimental to work and learning. It begins, "In one of the many letters he wrote to his son in the 1740s, Lord Chesterfield offered the following advice: 'There is time enough for everything in the course of the day, if you do but one thing at once, but there is not time enough in the year, if you will do two things at a time.' To Chesterfield, singular focus was not merely a practical way to structure one's time; it was a mark of intelligence... E-mails pouring in, cell phones ringing, televisions blaring, podcasts streaming--all this may become background noise, like the 'din of a foundry or factory' that [William] James observed workers could scarcely avoid at first, but which eventually became just another part of their daily routine. For the younger generation of multitaskers, the great electronic din is an expected part of everyday life. And given what neuroscience and anecdotal evidence have shown us, this state of constant intentional self-distraction could well be of profound detriment to individual and cultural well-being."

The New Atlantis » The Myth of Multitasking

In one of the many letters he wrote to his son in the 1740s, Lord Chesterfield offered the following advice: “There is time enough for everything in the course of the day, if you do but one thing at once, but there is not time enough in the year, if you will do two things at a time.” To Chesterfield, singular focus was not merely a practical way to structure one’s time; it was a mark of intelligence. “This steady and undissipated attention to one object, is a sure mark of a superior genius; as hurry, bustle, and agitation, are the never-failing symptoms of a weak and frivolous mind.”

In modern times, hurry, bustle, and agitation have become a regular way of life for many people—so much so that we have embraced a word to describe our efforts to respond to the many pressing demands on our time: multitasking. Used for decades to describe the parallel processing abilities of computers, multitasking is now shorthand for the human attempt to do simultaneously as many things as possible, as quickly as possible, preferably marshalling the power of as many technologies as possible.

In the late 1990s and early 2000s, one sensed a kind of exuberance about the possibilities of multitasking. Advertisements for new electronic gadgets—particularly the first generation of handheld digital devices—celebrated the notion of using technology to accomplish several things at once. The word multitasking began appearing in the “skills” sections of résumés, as office workers restyled themselves as high-tech, high-performing team players. “We have always multitasked—inability to walk and chew gum is a time-honored cause for derision—but never so intensely or self-consciously as now,” James Gleick wrote in his 1999 book Faster. “We are multitasking connoisseurs—experts in crowding, pressing, packing, and overlapping distinct activities in our all-too-finite moments.” An article in the New York Times Magazine in 2001 asked, “Who can remember life before multitasking? These days we all do it.” The article offered advice on “How to Multitask” with suggestions about giving your brain’s “multitasking hot spot” an appropriate workout.

But more recently, challenges to the ethos of multitasking have begun to emerge. Numerous studies have shown the sometimes-fatal danger of using cell phones and other electronic devices while driving, for example, and several states have now made that particular form of multitasking illegal. In the business world, where concerns about time-management are perennial, warnings about workplace distractions spawned by a multitasking culture are on the rise. In 2005, the BBC reported on a research study, funded by Hewlett-Packard and conducted by the Institute of Psychiatry at the University of London, that found, “Workers distracted by e-mail and phone calls suffer a fall in IQ more than twice that found in marijuana smokers.” The psychologist who led the study called this new “infomania” a serious threat to workplace productivity. One of the Harvard Business Review’s “Breakthrough Ideas” for 2007 was Linda Stone’s notion of “continuous partial attention,” which might be understood as a subspecies of multitasking: using mobile computing power and the Internet, we are “constantly scanning for opportunities and staying on top of contacts, events, and activities in an effort to miss nothing.”

Dr. Edward Hallowell, a Massachusetts-based psychiatrist who specializes in the treatment of attention deficit/hyperactivity disorder and has written a book with the self-explanatory title CrazyBusy, has been offering therapies to combat extreme multitasking for years; in his book he calls multitasking a “mythical activity in which people believe they can perform two or more tasks simultaneously.” In a 2005 article, he described a new condition, “Attention Deficit Trait,” which he claims is rampant in the business world. ADT is “purely a response to the hyperkinetic environment in which we live,” writes Hallowell, and its hallmark symptoms mimic those of ADD. “Never in history has the human brain been asked to track so many data points,” Hallowell argues, and this challenge “can be controlled only by creatively engineering one’s environment and one’s emotional and physical health.” Limiting multitasking is essential. Best-selling business advice author Timothy Ferriss also extols the virtues of “single-tasking” in his book, The 4-Hour Workweek.

Multitasking might also be taking a toll on the economy. One study by researchers at the University of California at Irvine monitored interruptions among office workers; they found that workers took an average of twenty-five minutes to recover from interruptions such as phone calls or answering e-mail and return to their original task. Discussing multitasking with the New York Times in 2007, Jonathan B. Spira, an analyst at the business research firm Basex, estimated that extreme multitasking—information overload—costs the U.S. economy $650 billion a year in lost productivity.

Changing Our Brains

To better understand the multitasking phenomenon, neurologists and psychologists have studied the workings of the brain. In 1999, Jordan Grafman, chief of cognitive neuroscience at the National Institute of Neurological Disorders and Stroke (part of the National Institutes of Health), used functional magnetic resonance imaging (fMRI) scans to determine that when people engage in “task-switching”—that is, multitasking behavior—the flow of blood increases to a region of the frontal cortex called Brodmann area 10. (The flow of blood to particular regions of the brain is taken as a proxy indication of activity in those regions.) “This is presumably the last part of the brain to evolve, the most mysterious and exciting part,” Grafman told the New York Times in 2001—adding, with a touch of hyperbole, “It’s what makes us most human.”

It is also what makes multitasking a poor long-term strategy for learning. Other studies, such as those performed by psychologist René Marois of Vanderbilt University, have used fMRI to demonstrate the brain’s response to handling multiple tasks. Marois found evidence of a “response selection bottleneck” that occurs when the brain is forced to respond to several stimuli at once. As a result, task-switching leads to time lost as the brain determines which task to perform. Psychologist David Meyer at the University of Michigan believes that rather than a bottleneck in the brain, a process of “adaptive executive control” takes place, which “schedules task processes appropriately to obey instructions about their relative priorities and serial order,” as he described to the New Scientist. Unlike many other researchers who study multitasking, Meyer is optimistic that, with training, the brain can learn to task-switch more effectively, and there is some evidence that certain simple tasks are amenable to such practice. But his research has also found that multitasking contributes to the release of stress hormones and adrenaline, which can cause long-term health problems if not controlled, and contributes to the loss of short-term memory.

In one recent study, Russell Poldrack, a psychology professor at the University of California, Los Angeles, found that “multitasking adversely affects how you learn. Even if you learn while multitasking, that learning is less flexible and more specialized, so you cannot retrieve the information as easily.” His research demonstrates that people use different areas of the brain for learning and storing new information when they are distracted: brain scans of people who are distracted or multitasking show activity in the striatum, a region of the brain involved in learning new skills; brain scans of people who are not distracted show activity in the hippocampus, a region involved in storing and recalling information. Discussing his research on National Public Radio recently, Poldrack warned, “We have to be aware that there is a cost to the way that our society is changing, that humans are not built to work this way. We’re really built to focus. And when we sort of force ourselves to multitask, we’re driving ourselves to perhaps be less efficient in the long run even though it sometimes feels like we’re being more efficient.”

If, as Poldrack concluded, “multitasking changes the way people learn,” what might this mean for today’s children and teens, raised with an excess of new entertainment and educational technology, and avidly multitasking at a young age? Poldrack calls this the “million-dollar question.” Media multitasking—that is, the simultaneous use of several different media, such as television, the Internet, video games, text messages, telephones, and e-mail—is clearly on the rise, as a 2006 report from the Kaiser Family Foundation showed: in 1999, only 16 percent of the time people spent using any of those media was spent on multiple media at once; by 2005, 26 percent of media time was spent multitasking. “I multitask every single second I am online,” confessed one study participant. “At this very moment I am watching TV, checking my e-mail every two minutes, reading a newsgroup about who shot JFK, burning some music to a CD, and writing this message.”

The Kaiser report noted several factors that increase the likelihood of media multitasking, including “having a computer and being able to see a television from it.” Also, “sensation-seeking” personality types are more likely to multitask, as are those living in “a highly TV-oriented household.” The picture that emerges of these pubescent multitasking mavens is of a generation of great technical facility and intelligence but of extreme impatience, unsatisfied with slowness and uncomfortable with silence: “I get bored if it’s not all going at once, because everything has gaps—waiting for a website to come up, commercials on TV, etc.” one participant said. The report concludes on a very peculiar note, perhaps intended to be optimistic: “In this media-heavy world, it is likely that brains that are more adept at media multitasking will be passed along and these changes will be naturally selected,” the report states. “After all, information is power, and if one can process more information all at once, perhaps one can be more powerful.” This is techno-social Darwinism, nature red in pixel and claw.

Other experts aren’t so sure. As neurologist Jordan Grafman told Time magazine: “Kids that are instant messaging while doing homework, playing games online and watching TV, I predict, aren’t going to do well in the long run.” “I think this generation of kids is guinea pigs,” educational psychologist Jane Healy told the San Francisco Chronicle; she worries that they might become adults who engage in “very quick but very shallow thinking.” Or, as the novelist Walter Kirn suggests in a deft essay in The Atlantic, we might be headed for an “Attention-Deficit Recession.”

Paying Attention

When we talk about multitasking, we are really talking about attention: the art of paying attention, the ability to shift our attention, and, more broadly, to exercise judgment about what objects are worthy of our attention. People who have achieved great things often credit for their success a finely honed skill for paying attention. When asked about his particular genius, Isaac Newton responded that if he had made any discoveries, it was “owing more to patient attention than to any other talent.”

William James, the great psychologist, wrote at length about the varieties of human attention. In The Principles of Psychology (1890), he outlined the differences among “sensorial attention,” “intellectual attention,” “passive attention,” and the like, and noted the “gray chaotic indiscriminateness” of the minds of people who were incapable of paying attention. James compared our stream of thought to a river, and his observations presaged the cognitive “bottlenecks” described later by neurologists: “On the whole easy simple flowing predominates in it, the drift of things is with the pull of gravity, and effortless attention is the rule,” he wrote. “But at intervals an obstruction, a set-back, a log-jam occurs, stops the current, creates an eddy, and makes things temporarily move the other way.”

To James, steady attention was thus the default condition of a mature mind, an ordinary state undone only by perturbation. To readers a century later, that placid portrayal may seem alien—as though depicting a bygone world. Instead, today’s multitasking adult may find something more familiar in James’s description of the youthful mind: an “extreme mobility of the attention” that “makes the child seem to belong less to himself than to every object which happens to catch his notice.” For some people, James noted, this challenge is never overcome; such people only get their work done “in the interstices of their mind-wandering.” Like Chesterfield, James believed that the transition from youthful distraction to mature attention was in large part the result of personal mastery and discipline—and so was illustrative of character. “The faculty of voluntarily bringing back a wandering attention, over and over again,” he wrote, “is the very root of judgment, character, and will.”

Today, our collective will to pay attention seems fairly weak. We require advice books to teach us how to avoid distraction. In the not-too-distant future we may even employ new devices to help us overcome the unintended attention deficits created by today’s gadgets. As one New York Times article recently suggested, “Further research could help create clever technology, like sensors or smart software that workers could instruct with their preferences and priorities to serve as a high tech ‘time nanny’ to ease the modern multitasker’s plight.” Perhaps we will all accept as a matter of course a computer governor—like the devices placed on engines so that people can’t drive cars beyond a certain speed. Our technological governors might prompt us with reminders to set mental limits when we try to do too much, too quickly, all at once.

Then again, perhaps we will simply adjust and come to accept what James called “acquired inattention.” E-mails pouring in, cell phones ringing, televisions blaring, podcasts streaming—all this may become background noise, like the “din of a foundry or factory” that James observed workers could scarcely avoid at first, but which eventually became just another part of their daily routine. For the younger generation of multitaskers, the great electronic din is an expected part of everyday life. And given what neuroscience and anecdotal evidence have shown us, this state of constant intentional self-distraction could well be of profound detriment to individual and cultural well-being. When people do their work only in the “interstices of their mind-wandering,” with crumbs of attention rationed out among many competing tasks, their culture may gain in information, but it will surely weaken in wisdom.

[Feb 24, 2008] The Mythology of Information Overload by Tonyia J. Tidline

1/1/99 | Library Trends

This project combines ideas from mythology, folklore, and library and information science in an effort to make sense of an aspect of modern culture that is frequently perceived as troublesome. Discussions of information overload, "data glut," or "information anxiety" are abundant in popular culture but do little to shed light on the origin of this problem. Library and information science work sidesteps the need to verify the existence of information overload, seeking instead to mitigate its effects. The discipline has produced a vast literature that addresses user perceptions, information needs, and information-seeking behavior. Information management, information retrieval, and attendant notions such as relevance have also received much attention. Within both popular culture and library and information science research, information overload is usually described or defined by means of anecdote or by associated symptoms.

However constituted, popular and scholarly attention confirms information overload as a recognized and resonant cultural concept that persists even without solid corroboration. Mythology and folkloristics are used here as analytic tools to suggest that information overload can be viewed as a myth of modern culture. Here myth does not mean something that is not true but an overarching prescriptive belief.

[Mar 09, 2007] A very good memo on mental overload from Washington College

Trying to sip information from the fire hose is a difficult and challenging task :-). This memo might help.

In today’s world, mental overload is a fact of life. Fortunately, by applying some simple techniques from the computer world, you can avoid some of the costly consequences of a too-full brain!

SIGNS OF AN OVERLOAD

A too-full computer can:
· give you error messages
· run slower
· take longer to process information
· crash

A too-full brain can cause you to:
· make mistakes
· forget to do something
· let things slip through the cracks
· become sluggish
· lose creativity
· become unproductive
· procrastinate
· become indecisive
· get stressed out
· experience a total mental breakdown
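
In queueing terms the computer analogy above is quite literal: whenever work arrives faster than it can be processed, the backlog grows without bound and everything waits longer, until something gives. Here is a minimal sketch in Python; the arrival and processing rates are made-up illustrative assumptions, not figures from the memo.

# Toy queueing picture of "overload": when work arrives faster than it
# can be processed, the backlog (and the wait for any new item) grows
# without bound. The rates below are assumed numbers for illustration.

arrival_per_hour = 12   # new tasks/messages arriving each hour (assumed)
service_per_hour = 10   # tasks a focused brain can clear each hour (assumed)

backlog = 0
for hour in range(1, 9):               # one 8-hour work day
    backlog += arrival_per_hour - service_per_hour
    wait = backlog / service_per_hour  # hours before a new task is touched
    print(f"hour {hour}: backlog={backlog} tasks, new-task wait={wait:.1f}h")

# The backlog climbs by 2 tasks every hour; by hour 8 a new request waits
# 1.6 hours just to be looked at -- first "run slower," then "crash."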

[Mar 09, 2007] Mental overload, by Katherine Lewis

Does excessive multi-tasking of the kind common among college students make us stupid? The answer is a tentative yes:
Multi-tasking may be too much for the brain to handle
Friday, March 09, 2007

BY KATHERINE REYNOLDS LEWIS

NEWHOUSE NEWS SERVICE

We feel so efficient, listening to a teleconference while sorting e-mail and eating lunch at the same time. But experts warn that instead of completing three tasks in the space of one, we're really spending more time to achieve mediocre results.

"Research that's looked at multi-tasking shows that you can't do it well -- no one can," said Kristin Byron, assistant professor of management at Syracuse University. "You're fighting the way your brain works."

The brain acts on just one task at a time. What we perceive as simultaneous multi-tasking is really rapid switching back and forth to keep different tasks going -- even if one is as simple as deciding to lift the sandwich for another bite.

It's like the classic vaudeville act of spinning plates. Your brain can set a task in motion, then another, and then another, before returning to pick up the first task, explained David Strayer, a psychology professor at the University of Utah in Salt Lake City.

"If the demands of any given task aren't too taxing, you can get two, three, four plates going up, but at some point you're going to reach a threshold when they're going to crash."

You may avoid driving while talking on a cell phone because of the physical challenge of holding both phone and steering wheel. But Strayer's research shows hands-free cell phone use is just as dangerous while driving. The risk comes in toggling between the two mental demands.

Moreover, subjects in a recent study scored significantly lower on IQ tests they took while driving. "When your attention is taken away from a task, you are not going to perform it as smartly," Strayer said.

So does multi-tasking make us stupid?

It's not an outlandish conclusion. A 2005 study sponsored by Hewlett-Packard found the average worker lost 10 IQ points when interrupted by ringing telephones and incoming e-mails -- about equal to the cost of missing an entire night of sleep.

"Interruptions are time-consuming, and they are dangerous in the sense that they can lead to errors," said David E. Meyer, a psychology professor at the University of Michigan in Ann Arbor. "You are trying to feed information through various kinds of processing channels in the brain which have limited capacity and are really only available for one thing at a time."

Whenever we drop one task to perform another, we face "resumption costs" -- the time and energy it takes to orient ourselves when we return to the original task. It's true that interweaving two lengthy tasks can take less total time than performing the tasks separately. But there's a price.

"A lot of tasks we have to do, there are little moments of gaps which you can steal for another task," said Hal Pashler, psychology professor at the University of California in San Diego. "The interesting hidden cost ... is that (we) may be strikingly unable to recollect what happened."

That's because the free moments in each task -- such as waiting for a partner to respond in a conversation -- appear to be used to store or consolidate memories. If we talk on the phone while checking e-mail, it's at the expense of downtime our brains need.

"The conversation plus the e-mail may take less of your life, but the cost is that tomorrow you may not know exactly what you said," Pashler said.

Thus, if you try to take in new material or facts while multi-tasking, you'll have a tougher time learning, said Russell A. Poldrack, psychology professor at the University of California in Los Angeles.
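
To make the "resumption costs" arithmetic concrete, here is a toy model in Python. The task lengths, slice size, and per-switch penalty are made-up illustrative assumptions, not figures from these studies; the point is only the shape of the trade-off.

# Toy model of "resumption costs" when interleaving tasks.
# All numbers are assumed for illustration, not data from the studies above.

def serial_minutes(task_minutes, n_tasks):
    # Doing the tasks one after another: no switches, no resumption penalty.
    return task_minutes * n_tasks

def interleaved_minutes(task_minutes, n_tasks, slice_minutes, resume_cost):
    # Tasks are cut into slices and worked round-robin; every return to a
    # task after doing something else costs resume_cost minutes to re-orient.
    slices_per_task = task_minutes // slice_minutes
    resumptions = n_tasks * (slices_per_task - 1)   # the first slice is free
    return task_minutes * n_tasks + resumptions * resume_cost

print(serial_minutes(60, 2))              # 120 minutes of focused work
print(interleaved_minutes(60, 2, 10, 5))  # 170 minutes: 50 lost to switching

Even a modest five-minute penalty adds over 40 percent of overhead; with the twenty-five-minute recovery time the UC Irvine researchers measured (cited earlier), the switching cost can dwarf the work itself. And none of this counts the memory consolidation that Pashler describes losing in the stolen gaps.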

Does all this mean we should never check our BlackBerrys while waiting in line at the grocery store? Or even sip a cup of coffee while listening to a conference speaker? After all, multi-tasking is woven into the fabric of modern life. More than 85 percent of people multi-task, and 67 percent believe they do it well, according to a survey by Apex Performance, a Charlotte, N.C., training firm.

Fortunately, the experts give us some slack. "You can't say in every situation it would be better to always focus on one task," Poldrack said.

If you're a stock trader who has to respond quickly to a lot of information, it makes sense to monitor multiple televisions and computer screens at once, he said. It may not matter that the next day you're hazy about which news anchor said what.

Certain physical actions, like walking or eating, are so hard-wired that they don't tax our brains much. There's certainly no harm in combining simple, low-stakes tasks, like folding laundry and watching television. And if background music energizes you to finish your work, that may outweigh the cost of your mind shifting between listening and crafting a report, Poldrack said.

Similarly, talking to an adult passenger doesn't hurt your driving the way talking on a cell phone does, Strayer has found. That's because the person in your car is attuned to the driving environment, and will pause the conversation when a tricky maneuver approaches.

To the degree that tasks rely on similar processes, they are more likely to interfere with each other. For instance, talking on the phone and writing an e-mail is hard, because both involve language, Poldrack said.

The answer is to choose carefully when you take on more than one job at once. For high-priority or complex tasks, you might want to shut down your e-mail, turn off the phone and close your office door. Apex Performance founder Louis Czoka even recommends that clients shut their eyes to focus on a teleconference.

[Feb 18, 2007] Crazy hours becoming the new standard

Forbes.com - MSNBC.com

Just how bad have things gotten? That's the subject of Extreme Jobs: The Dangerous Allure of the 70-Hour Workweek, a recent study from the Center for Work-Life Policy. The study found that 1.7 million people consider their jobs and their work hours extreme, thanks to globalization, BlackBerries, corporate expectations and their own Type A personalities.

... .... ....

What Hewlett and Buck Luce found in their survey was that workers were themselves to blame. Many of the people interviewed for the study say they love their jobs and are reluctant to lessen their workload. In Agoglia's case, working for the small business consulting group was exactly what she wished for. Now she comes into the office only on an as-needed basis. "It offers an opportunity for someone like me who needs more breathing room," she says, "but it also fulfills my desire to be challenged in my job."

That kind of fulfillment has its hazards. Sixty-four percent of those surveyed said their work pressures are self-inflicted but say it is taking a real toll on them individually. Nationally, 70 percent, and globally, 81 percent, say their jobs undermine their health in terms of exercise, diet and the impact of stress. Nationally, 46 percent, and globally, 59 percent, say it gets in the way of their relationships and nationally, 50 percent, say it affects their sex life.

Not surprisingly, men and women have a different take on the extreme nature of their jobs. In the global survey, 58 percent of men and 80 percent of women say they didn't want to work these hours for more than one more year. Says Buck Luce: "For women there's a flight risk. But men get burned out and are able to stick with it. There's a tremendous stigma for men who say, 'I can't do this.' That means there aren't going to be women at the top ranks of companies."

[Jan 18, 2007] Kelly Forrister: Is there such a thing as obsessive-compulsive productivity?

I wasn't surprised to read that 40% of Americans work 50 hours or more per week and rarely disconnect from their work, even on vacation. I hear about it all the time in my seminars where people feel like an 8 hour day is slacking off and working at night after the kids go to bed and in the morning before the office really opens is the only way they can stay on top of things.

Is it that people have too much to do, or is it that they just don't have trusted systems (à la GTD) that let them feel they can disconnect?

I've heard David Allen mention that we've always had too much to do. I don't think BlackBerrys necessarily create more work; it's just that people now have higher expectations about how fast the work needs to get done.

Someone in one of my seminars recently told me she takes her laptop on vacation just to stay on top of her email (people actually hissed when she said this, perhaps from the fear that this will become expected). The "vacation tax" of coming back to hundreds, if not thousands, of emails is just not worth it to her.

[Jan 17, 2007] BlackBerrys Don't Fit in Bikinis, or obsessive-compulsive productivity, by Joe Robinson

August 13, 2006 | Los Angeles Times

It's vacation prime time. Millions of wage-earners are on the road, in the air or on the water in search of overdue recreation, relaxation and adventure. But for too many, it will be a futile quest, thanks to a big, fat killjoy stowed away on the trip: OCP, or obsessive-compulsive productivity, a frantic fixation to wring results from every minute of the day, even our play.

Americans have always had an insistent work ethic. But thanks to technology that allows us to get things done 24/7, growing job demands and the elevation of efficiency to an unofficial national religion, many vacationers simply can't turn off their productive machinery. Every minute of the day, even of play, must be productive.

It's a habit that's increasingly counterproductive, evident in soaring job-stress bills (a $300-billion-a-year tab for U.S. business, according to the American Institute of Stress, a nonprofit organization) and longer workweeks. Nearly 40% of Americans work more than 50 hours a week. The all-output, all-the-time mandate of OCP wires us to do holidays like jobs. We cram downtime with to-do lists and a performance-review mentality that dooms trips to disappointment because we couldn't see or do everything we wanted. The trip's experience is an afterthought in a crazed race to polish off sights to the finish line of the holiday.

But trying to make a vacation productive is like trying to get a cat to bark. It's the wrong animal for the outcome, because vacations aren't about output. Instead, they're about the realm of an increasingly rare species — input — that can't be measured by a performance yardstick. The most packed itinerary can't quantify play, fun, wonder, discovery, adventure. How do you tally the spray of an exploding waterfall? The pattern of ripples on a sand dune? How do you produce quiet?

The productivity of U.S. workers has doubled since 1969, according to Boston College economist Juliet Schor. But none of the dividends have come back in additional free time. The added time that greater productivity creates is simply fodder for more productivity increases — and OCP jitters that we must get more done. How much production is enough?

Even on the job, too much time on task can lead to burnout, heart disease, carpal tunnel syndrome, mistakes, costly do-overs and rote performance. A study last year by the University of Massachusetts Medical School found that chronic 12-hour workdays increase your risk of illness or injury by 37%.

Work without time to think, analyze or recharge feeds knee-jerk performance and the hurry-worry of stress. Everything appears urgent when there isn't time to judge what is truly urgent and what isn't.

More than anybody else's, Americans' identity comes through labor. But the reflex to define self-worth by what we get done makes it hard to relax without a heap of guilt because there's always something next on the horizon to handle. Our focus on future results shrinks our experience of living and, ironically, the very thing we need for optimum performance — input.

The consulting firm McKinsey & Co. asked managers where they got their best ideas. It wasn't at the office. Rather, inspiration came when people were at play — on the golf course, running. Research on fatigue in the workplace since the 1920s shows that performance rises after a break in the action, whether a break of a few seconds or 15 minutes.

Studies have also found that job performance improves after a vacation. Income doubled at the H Group, an investment services company in Salem, Ore., after owner Ron Kelemen increased employee time off to 3 1/2 weeks. When Jancoa, a Cincinnati cleaning company, switched to a three-week vacation policy, worker productivity soared enough to cut overtime. Profits jumped 15%.

The true source of productivity isn't nonstop output. It's a refreshed and energized mind, something vacations specialize in.

But for that to happen, we must leave the OCP drill sergeant at home. Vacations require a different skill set — leisure skills. Without them, we lapse into default mode — produce, produce, produce. My retired father was stunned when he visited his former company and found a couple of his fellow retirees back at their desks. They didn't know what else to do.

As kids, we knew how to entertain ourselves. But many of us lost the knack when we learned that play for its own sake didn't produce rewards — status, pats on the back, money, goodies. Once we're in OCP territory, we've forgotten how to do things simply because we enjoy doing them.

Researchers say we had it right as kids. "Quality of life does not depend on what others think of us or what we own," contends psychology professor Mihaly Csikszentmihalyi in "Flow: The Psychology of Optimal Experience." "The bottom line is, rather, how we feel about ourselves, and about what happens to us. To improve life one must improve the quality of experience."

Famed for his studies on when people are at their happiest, Csikszentmihalyi adds that "when experience is intrinsically rewarding, life is justified in the present."

Things we do for our amusement are particularly good at improving that experience, delivering what's supposed to come out of all that production — self-worth, a sense of competence and, best of all, life satisfaction. Upping levels of performance can't generate happiness, psychologists contend, because production is tied to external approval, which is gone by the next morning's to-do list. But research shows that the more active your leisure lifestyle is, the higher your life satisfaction. Leisure also increases initiative, confidence and a positive mood.

So, if you haven't taken your vacation yet, maybe it's time to dust off the leisure portfolio and resuscitate the childhood practice of play. The packing list should include participation, engagement, spontaneity, a nonjudgmental attitude, the ability to ferret out amusements, take detours, wander without aim, plunge into things you haven't done before, and get out of your head and into direct experience. Along the way you may discover something long forgotten. Recess rules.


Recommended Links


Anxiety and Obsession with Work

What Constitutes an Addiction

Computer addicts tend to lose all sense of time when they are on-line. They are drawn so deeply into the world of bytes and bits that they do not notice entire days passing by.

They forget to eat, sleep, go to school, and even care for their children. They shirk responsibilities, slack off at work, and miss appointments because they are unable to pull themselves away.

The virtual world and the real world are competing for their attention, and the virtual world often wins.

Anxiety Disorders Education Program

The Anxiety Disorders Education Program is a national education campaign developed by the National Institute of Mental Health (NIMH) to increase awareness among the public and health care professionals that anxiety disorders are real medical illnesses that can be effectively diagnosed and treated. More than 19 million Americans suffer from anxiety disorders, which include panic disorder, obsessive-compulsive disorder, post-traumatic stress disorder, phobias and generalized anxiety disorder. They suffer from symptoms that are chronic, unremitting and usually grow progressively worse if left untreated. Tormented by panic attacks, irrational thoughts and fears, compulsive behaviors or rituals, flashbacks, nightmares, or countless frightening physical symptoms, people with anxiety disorders are heavy utilizers of emergency rooms and other medical services. Their work, family and social lives are disrupted, and some even become housebound. Many of them have co-occurring disorders such as depression, alcohol or drug abuse, or other mental disorders. Because of widespread lack of understanding and the stigma associated with these disorders, many people with anxiety disorders are not diagnosed and are not receiving treatments that have been proven effective through research.

DG DISPATCH - ECNP Generalized Anxiety Disorder Has Worst Impact On Quality Of Life

Brain Lock: Free Yourself from Obsessive-Compulsive Behavior. A Four-Step Self-Treatment Method to Change Your Brain Chemistry

A reader from America, July 2, 1999, 5 out of 5 stars: Excellent! A DIY approach to OCD and related disorders. A friend gave me this book and it is excellent. If you have OCD or even a related disorder it gives you a practical approach to learning to deal with and outsmart your disorder.

Take me, frinstance: while I do not have any checking compulsions, I have suffered from anxiety disorder and occasionally intrusive, disturbing thoughts for a number of years. (Other than that I am your regular guy; you wouldn't know I had a disorder if you saw me.) This book gives you a four-step method of "reframing" OCD in a way that makes it manageable. Ultimately, the authors say, by using their method you can "retrain your brain" and actually alter your brain chemistry in a positive direction and thus reduce the original symptoms to something liveable.

Buy it (or have a friend give it to you...) :-)

Stop Obsessing! How to Overcome Your Obsessions and Compulsions

The Boy Who Couldn't Stop Washing: The Experience and Treatment of Obsessive-Compulsive Disorder

A reader from Santa Fe, NM, July 16, 1998, 5 out of 5 stars: A good description of the problem and some solutions. This book contains well-written descriptions of obsessive-compulsive disorder -- it's informative, clear, and a pleasure to read. And for those of us who either suffer from these disorders or are close to someone who does, it's an eye-opener: you are NOT the only person who's ever had to deal with this problem, and there IS hope for curing it! For all these reasons, I highly recommend the book. Two cautions, however: (1) The book gave a good description of the ways of treating OCD as of the date it was written. Since then, however, there have been many new developments, so, if you're specifically interested in treatments, you'll need to look up some more recent books and articles. (2) "Obsessive-compulsive personality disorder" (OCPD) is a related but different condition, and it's possible that someone who exhibits similar symptoms but doesn't have full-blown OCD suffers from this instead. (My mother has never gone in for compulsive hand-washing, but she's rigid, intolerant, controlling, and a pack rat on a truly monumental scale. That's OCPD.) The treatments for the two conditions differ -- drugs are more helpful for OCD than OCPD, for example. As with any mental condition, it's absolutely necessary to have a thorough professional diagnosis; don't just march into your doctor's office demanding Prozac, or stock up on St. John's Wort at your local herbalist's.

Social Anxiety Disorder

Humor

[Jan 15, 1999] Man Crashes Car As 50 Pagers Ring At Once -- pretty funny

KIEV (Reuters) - A Ukraine businessman who bought a pager for each member of his staff as a New Year gift was so alarmed when all 50 of them went off at the same time that he drove his car into a lamp post, a newspaper said Thursday.

The unnamed businessman was returning from the pager shop when the accident happened, the Fakty daily reported. ''With no more than 100 meters to go to the office, the 50 pagers on the back seat suddenly burst out screeching. The businessman's fright was such that he simply let go of the steering wheel and the car ploughed into a lamp post.''

After he had assessed the damage to the car, the businessman turned his attention to the message on the 50 pagers. It read: ''Congratulations on a successful purchase!''

Random Findings

Reducing information overload: A comparative study of hypertext systems

How to deal with Information Overload on the Internet: The intelligent agent concept

The Clever Project -- IBM project

The tremendous growth in the price-performance of networking and storage has fueled the explosive growth of the web. The amount of information easily accessible from the desktop has dramatically increased by several orders of magnitude in the last few years, and shows no signs of abating. Users of the web are being confronted with the consequent information overload problem. It can be exceedingly difficult to locate resources that are both high-quality and relevant to their information needs. Traditional automated methods for locating information are easily overwhelmed by low-quality and unrelated content. Thus, the second generation of search engines will have to have effective methods for focusing on the most authoritative among these documents. The rich structure implicit in the hyperlinks among Web documents offers a simple, and effective, means to deal with many of these problems. The CLEVER search engine incorporates several algorithms that make use of hyperlink structure for discovering high-quality information on the Web.
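
The project page above does not spell out its algorithms, but CLEVER grew out of the hubs-and-authorities (HITS) family of link-analysis methods. Below is a minimal sketch of that iteration in Python over a made-up four-page link graph; the graph, iteration count, and scoring details are illustrative assumptions, not IBM's actual implementation.

# Minimal hubs-and-authorities (HITS-style) iteration over a tiny made-up
# link graph. Illustrative only; not the actual CLEVER code.
from math import sqrt

links = {            # page -> pages it links to (toy data)
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}
pages = sorted(links)
hub = {p: 1.0 for p in pages}
auth = {p: 1.0 for p in pages}

for _ in range(50):  # power iteration; scores stabilize quickly
    # A page's authority is the summed hub weight of the pages linking to it.
    auth = {p: sum(hub[q] for q in pages if p in links[q]) for p in pages}
    # A page's hub score is the summed authority of the pages it links to.
    hub = {p: sum(auth[q] for q in links[p]) for p in pages}
    for d in (auth, hub):  # normalize so the scores don't blow up
        norm = sqrt(sum(v * v for v in d.values())) or 1.0
        for p in d:
            d[p] /= norm

print(sorted(auth.items(), key=lambda kv: -kv[1]))  # "c" ranks as top authority

Good hubs confer authority on the pages they point to, and pointing at good authorities makes a hub better; the two scores reinforce each other until they converge, which is how link structure alone can surface high-quality documents without inspecting their text.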




Copyright © 1996-2014 by Dr. Nikolai Bezroukov. www.softpanorama.org was created as a service to the UN Sustainable Development Networking Programme (SDNP) in the author's free time. This document is an industrial compilation designed and created exclusively for educational use and is distributed under the Softpanorama Content License. The site uses AdSense, so you need to be aware of Google's privacy policy. Copyright of original materials belongs to their respective owners. Quotes are made for educational purposes only, in compliance with the fair use doctrine. This is a Spartan WHYFF (We Help You For Free) site written by people for whom English is not a native language. Grammar and spelling errors should be expected. The site contains some broken links, as it develops like a living tree...

You can use PayPal to make a contribution to support hosting of this site with different providers, which distributes load and speeds up access. Currently there are two functional mirrors: softpanorama.info (the fastest) and softpanorama.net.

Disclaimer:

The statements, views and opinions presented on this web page are those of the author and are not endorsed by, nor do they necessarily reflect, the opinions of the author's present and former employers, SDNP or any other organization the author may be associated with. We do not warrant the correctness of the information provided or its fitness for any purpose.

Last modified: February 19, 2014