Storage Informer

Tag: Facebook

The Internet May Cause Distraction and Inability to Learn

by on Jul.11, 2010, under Storage


If Nicholas Carr is correct in his recent book, The Shallows: What the Internet Is Doing to Our Brains, you will not read this entire blog post. The main idea of the book is that the internet trains our brains to be easily distracted and makes us less able to learn deep ideas, since we skim more and read deeply less.  The book digs into the science of how the human brain learns.  The brain has “plasticity” – meaning that it adjusts to the activities it does often, much as our muscles do.  Carr lays out the history of communication from the first written word, through the mass production of books, to the internet age, and analyzes the impact these technologies have had on society and the human brain.  As he points out in the book, every new technology has had critics claiming it would doom the way we think or take away the very things that make us human.  But this time, according to Carr, there are real issues.

The Internet Distracts People with…SQUIRREL

Like the dog Dug in the movie Up (SQUIRREL!), internet users can find it hard to stay focused.  While I agree that almost any repetitive activity can potentially become addictive, I believe that most people can take control of the tools they use for communication rather than letting those tools control them.  Even in pre-internet days, the draw of interrupting technologies was there – do you finish the task you’re working on, or answer the ringing phone or watch TV (or even read a book)?  The brain can get so used to a stimulus that it makes you crave it when it’s not there – this even affected Dilbert back in 1996.
The difference with the internet is that it is everywhere, and people can become like a mouse pressing a lever for a pellet, constantly checking email, RSS, Facebook, or Twitter.  These activities can instead be worked into the flow of the workday rather than becoming distractions from getting things done.  Personally, I know that I have a tendency to want to stay connected and respond rapidly to messages. (Disclosure: Hi, I’m Stu and I’m an internet addict, see me on Twitter.)  The more we allow ourselves to be interrupt-driven, the more our brains will see that as “normal,” and the harder it will become to stay focused for longer periods of time.  Recent studies (including some cited in the book) show that the cost of context switching outweighs any gains from multitasking.  You have the power to take control of your environment: finish conversations without interruption, and check messages when you’re done with a task, not when an alert announces each one as it arrives.

Reading vs. Skimming

If you’ve read this far, congratulations!  In the age of the internet, most people skim rather than read.  The book describes how people read in an “F” shape: they read the first line or two, then partial lines, and eventually just scan down the page.  As a blogger, I try to keep my posts short (500 words for most posts, or 1000 for a deeper discussion) and break up the text visually with some bolding, italics, headers, and photos.  Carr also says that even the basic web format, with its hyperlinks, is very distracting.  Each hyperlink you reach makes you think about clicking it and, if you do, whether you’ll ever get back to where you started (for this article, I put some links at the bottom rather than throughout the text).  There is fascinating research in the book explaining how memories are created and the science behind short-term and long-term memory.

“How do users read on the web?”…”They don’t”

While in general I feel that Carr is a bit of a pessimist about technology, I do believe he is correctly raising an alarm on this topic.  The argument in the book is that as we skim more and rely on the internet, rather than our brains, to store information, our brains have less context for problem solving or deep thought, and we become shallower.  Mass production of books brought learning to everyone, and the internet increases information flow, but potentially we understand and internalize less.

There are a few ways that we can still absorb information in the internet age.  The first, of course, is to read deeply – I’d recommend picking up The Shallows if you’ve found this discussion interesting (I think it should be required reading at colleges). Another way is to write; the process of organizing your thoughts and translating them into words helps your memory and critical thinking.  A third way is to have deep discussions with friends and family – there’s nothing like a lively debate to get the brain going.  A final way is simply to give yourself some free time to think – time when new information isn’t flooding in, so you can sort and process what you’ve taken in.

I consider myself a pragmatic optimist on the new technologies.  Like some of the optimists in the articles listed below, I believe that the internet age brings proliferation of information and opportunity for a globally connected community.  It’s the core of the company that I work at now.

Where do you stand?  Are you an internet optimist? Do you believe that there is validity in Carr’s positions?  Will the internet turn people into shallow shells that can’t function without computers?

Here are some related articles that I’d recommend:

Are You An Internet Optimist or Pessimist? The Great Debate over Technology’s Impact on Society by Adam Thierer  (Adam also reviews The Shallows)

I Know I’m Not the Only Internet Optimist… by Andy McAfee

Carr’s article from The Atlantic: Is Google Making Us Stupid?

Does Multitasking Lead to a More Productive Brain? from NPR

A couple of posts of mine discussing similar topics after reading a book by Neil Postman

Nicholas Carr on The Colbert Report



An Offer You Can’t Refuse

by on Jul.09, 2010, under Storage

From aspirational to pragmatic:

EMC Unified Storage Is 20% More Efficient.  Guaranteed.

That’s the tag line for the storage efficiency campaign we’ve recently launched in this hotly contested part of the market.

And, from all indications, it appears that it’s working quite well …

The Background

If you haven’t been following this particular drama closely, maybe I should bring you up to date.

This specific part of the storage market — dubbed "unified storage" (one storage platform that supports both file and block protocols) — is one of the most brutally competitive parts of the storage and larger IT landscape.

Smaller organizations use these storage arrays to run just about everything they’ve got.  Larger organizations use them for non-mission-critical applications and general-purpose storage.  And some organizations occasionally deploy vast amounts of this storage to support specific online services.

In this category, it’s hard to differentiate on performance, since — well — for many of the use cases good enough is good enough.  Ditto for topics like availability and replication.  And, even though there’s a ton of great software integration between these arrays and environments like VMware and Microsoft, there’s only so much of that integration stuff you can use.

Which leaves us with the central topic of efficiency: who can use less raw storage capacity to get the job done?  At the end of the day, everyone pays pretty much the same for component-level inputs … it’s what you get out of it that matters.

Lots Of New Technology Here

Over the past few years, there have been a lot of new approaches to driving storage efficiency, and they tend to show up in this segment first.  Things like thin provisioning.  Compression and deduplication.  The use of enterprise flash drives to enable the use of more low-cost storage devices, like SATA.  Even spin-down and automigration to even lower-cost archives, whether internal to the organization or provided as an external service (e.g. cloud).

So much so, in fact, that it’s very hard to sort through all the noise and fanfare around who’s more efficient.  And, given the competitiveness of this segment, there’s an awful lot of noise indeed.

So we decided to make it easy for everyone.

The First Round Of Storage Guarantees

About a year ago, we all saw the first round of "efficiency guarantees" pop up in the market.  Frankly, I and many others saw them for what they were — basically, a cheap marketing gimmick.

Why?  Although they offered up the appearance of considerable savings (e.g. up to 50% !!!) they had some fundamental flaws.

First, they were usually up against easy compares — to qualify, you had to switch from RAID 1 (mirroring) to parity RAID.  That gets you 40%+ right there.  Second, to get these results, you frequently had to use more exotic configurations that required turning off certain useful features, like snap reserves.

Yuck.
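The "easy compare" arithmetic is worth spelling out.  A rough sketch of the usable-capacity math (the 8+1 RAID 5 group below is my illustrative assumption, not a figure from any guarantee):

```python
# Back-of-the-envelope usable-capacity math for the RAID comparison.
# Real arrays also reserve capacity for spares, vault drives,
# snapshots, etc., so actual numbers will differ.

def raw_per_usable_tb(usable_fraction):
    """Raw TB you must buy for each usable TB delivered."""
    return 1.0 / usable_fraction

raid1_fraction = 0.5    # mirroring: half the raw capacity is usable
raid5_fraction = 8 / 9  # 8 data + 1 parity drive: ~89% usable

raid1_raw = raw_per_usable_tb(raid1_fraction)  # 2.0 raw TB per usable TB
raid5_raw = raw_per_usable_tb(raid5_fraction)  # ~1.125 raw TB per usable TB

# Simply switching RAID levels cuts the raw capacity needed by ~44% --
# the "40%+ right there" that made those guarantees an easy compare.
savings = 1 - raid5_raw / raid1_raw
print(f"Raw capacity saved by RAID 1 -> 8+1 RAID 5: {savings:.0%}")
```

In other words, most of the headline savings came from the baseline, not from the array.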

Third, when you went looking for details, there were all sorts of useful workloads excluded, like databases, or data objects that were already compressed.

More yuck.

Finally, there were multiple pages of terms and conditions, boatloads of exclusions and caveats, and a registration and acceptance process involved.  All of the work to get any potential value had to be done by the customer. 

Maximum yuck.

Some of us thought we could do better, so we did.

A Better Guarantee?

EMC, in the normal course of our business, purchases and tests just about every decently competitive storage array in the marketplace.  We put them in the lab, and run them through their paces.

Sometimes, it’s for interoperability and compatibility purposes.  A lot of the EMC portfolio has to work well with storage arrays we don’t make.  Other times, it’s to find out what’s really behind all the noisy claims that people make — we really want to know for ourselves.

And, in the course of doing all this, we were continually struck by one observation — many of these competitive storage devices weren’t all that efficient at converting raw storage capacity to usable capacity in a predictable and usable manner.

So we decided to do something about it …

The EMC Unified Storage Guarantee

We tried to make this as simple as possible.

Configure an EMC unified storage platform using our tools and standardized best practices.

Configure the other guy’s unified storage platform using their tools and standardized best practices, or use ours if you don’t have access to theirs.

Compare the raw capacities — if EMC doesn’t do the job with at least 20% less raw capacity, we’ll make up the difference.

No disclaimers, caveats, exceptions, legalese, registration processes, etc. 

Simply put — no BS.
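Under those terms, the make-good amount is simple arithmetic.  A minimal sketch (the function name and sample capacities are made up for illustration; this isn't EMC's actual sizing tool):

```python
# Hypothetical sketch of the guarantee math described above.
# All numbers below are invented for illustration.

def guarantee_shortfall(emc_raw_tb, competitor_raw_tb, target=0.20):
    """Free capacity (TB) owed if the EMC config doesn't come in at
    least `target` (20%) below the competitor's raw capacity."""
    required_raw = competitor_raw_tb * (1 - target)  # e.g. 20% less
    return max(0.0, emc_raw_tb - required_raw)

# Competitor needs 1000 TB raw; EMC config comes in at 780 TB raw:
print(guarantee_shortfall(780, 1000))   # 0.0 -> beats the 20% mark

# EMC config comes in at 850 TB raw (only 15% less):
print(guarantee_shortfall(850, 1000))   # 50.0 TB made up for free
```

Either the configuration clears the 20% bar on its own, or the difference shows up as free capacity.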

In addition to the program web page, there are a couple of cool promotional videos we’ve done (here and here), as well as Christopher Kusek’s blog (@cxi) where he’s having way too much fun with all of this. The backstory here is also fun: Chris worked for one of our competitors in this space for many years before recently joining EMC.  There’s also a nice Facebook fan page if you’re so inclined.

You’ll see more of this program in the future for one simple reason: it’s working.

How This Plays Out

Customers and partners of all sizes and shapes are taking us up on this offer. 

It might be a modest 10TB filer through a partner, it might be a multi-petabyte transaction as a direct account — or anything in between.  Again, as I said above, no exceptions and no BS.

The prospect of saving, say 200TB on a petabyte-sized config definitely gets a bit of attention :-)

Customers are putting our configs up against the other guys, and they’re discovering what we’ve known all along — the other guys are pretty inefficient when it comes to converting raw capacity to usable stuff.

Most times, these people are seeing at least a 20% difference, maybe more.  To be fair, there are a few exceptions where we came in a bit under the 20% mark, and EMC has quickly made good with more free capacity, with no fuss whatsoever.

Are these customers using the 20% savings to spend less on storage?  No.

Generally speaking, they’re using the savings to get an additional 20% of capacity from EMC.

Think about it: 20% more for your money from EMC.

And that’s a deal that many people are finding just too tempting to pass up.

What Lies Ahead?

As far as I can see, there’s no reason why we wouldn’t make this program a permanent fixture of our competitive offerings going forward.

The underlying basis for our storage efficiencies is architectural, and hard for our competitors to replicate.  The program isn’t really costing us anything, since in most cases the 20% savings is already there, or more.

This could go on for a very long time indeed — there’s no reason to stop.

So, I have to ask — what are *you* going to do with your extra 20%?

:-)


Can Web 2.0 save the world?

by on Oct.14, 2009, under Storage

Remember the age of Web 1.0? Back when it took all night to download one song over a 56k dial-up connection? Today we have broadband and iTunes, and dial-up is a distant memory in the era of Web 2.0. According to Tim O’Reilly and John Battelle, organizers of the upcoming Web 2.0 Summit, “To understand where the Web is going, it helps to return to one of the fundamental ideas underlying Web 2.0, namely that successful network applications are systems for harnessing collective intelligence, meaning that a large group of people can create a collective work whose value far exceeds that provided by any of the individual participants.” Take, for example, the Web 2.0 intersection of volunteer computing and social media that is Progress Thru Processors (PTP). I wrote about the PTP Facebook application six days ago, when it had just over 127,000 fans. Today it has 129,596 fans. All the fans who have downloaded the PTP application are donating their spare CPU cycles to power humanitarian research. Their collective CPU power ranks in the top 250 of the world’s supercomputers. PTP is a demonstration of how Web 2.0 gives people like you and me, one by one, the collective power to do something amazing.

URL: http://blogs.intel.com/csr/2009/10/can_web_20_save_the_world.php
