You are currently browsing the category archive for the ‘Media’ category.

Is Google making us stupid?

That’s the question Nicholas Carr poses to his readers in the Atlantic Monthly. Carr begins by noticing a frightening tendency in himself and his colleagues– the inability to digest the written word in substantial volumes. The internet makes it easier for readers to move seamlessly through different sources, skimming the information they want and discarding the rest.

Carr writes:

“When the Net absorbs a medium, that medium is re-created in the Net’s image. It injects the medium’s content with hyperlinks, blinking ads, and other digital gewgaws, and it surrounds the content with the content of all the other media it has absorbed. A new e-mail message, for instance, may announce its arrival as we’re glancing over the latest headlines at a newspaper’s site. The result is to scatter our attention and diffuse our concentration.”

Not only has the internet altered old media, but the inverse is true as well. In support of this claim, Carr cites the New York Times‘ recent editorial decision to include “article abstracts” in its print edition, which allow readers to get a quick “taste” of the day’s news.

There is, however, no good reason to assume that technology, namely the internet, is the sole impetus behind the New York Times‘ decision. Indeed, the Wall Street Journal began giving its readers a quick “taste” of the news through its front-page staple, the “What’s News” column, during the tenure of Bernard Kilgore (1941 to 1967), back when the internet was still a twinkle in Tim Berners-Lee’s eye.

There are perhaps better ways of explaining the prevalence of “news snippets” such as these. First, human beings are curious by nature and have a desire to accumulate more knowledge, particularly knowledge of current events. Second, as the economy changes, people have more hectic schedules and are thus more inclined to prefer to get their news in the form of brief summaries.

I would be foolish to deny that Google and other internet companies have contributed to our collective short attention span. However, this is not where Carr’s critique ends. Rather, Carr goes on to paint Larry Page and Sergey Brin, Google’s founders, as being behind some sort of sinister plan to replace human minds with robots.

Yes, robots.

This is where Carr’s argument becomes stilted. Carr rattles off a variety of quotations, such as this one from Brin’s 2004 interview with Newsweek: “[c]ertainly if you had all the world’s information directly attached to your brain, or an artificial brain that was smarter than your brain, you’d be better off.”

Maybe it’s a matter of opinion, but I read Brin’s comment as indicating merely that it would be advantageous to have all of the world’s information readily accessible through one’s own mind (or at least to have a “smarter” artificial brain). To me, this is an uncontroversial assertion. (Of course it would be great to know everything there is to know!) Carr’s reading, however, is that Brin’s comment demonstrates some sort of ugly antihuman animus, waiting to play itself out when Google takes over the world.

At root, this difference in interpretation comes down to a difference in how one views intelligence. Those, myself included (at least in large part), in Google’s camp take a pragmatic view, favoring efficiency and greater access to information. Carr, for his part, rests on the notion that intelligence must contain something more than this.

Carr goes on to argue that, in a society increasingly connected to the internet, there is a risk that we will lose the ability to reflect and deliberate in the same way we have in an age dominated by print media. Carr writes: “If we lose those quiet spaces, or fill them up with ‘content,’ we will sacrifice something important not only in our selves but in our culture.” Perhaps this debate comes down to a question of balancing the trade-offs.

Finally, it’s interesting to note that Carr at least recognizes the theoretical blind spot in his argument– namely that any argument against technological growth and for the status quo can easily be proved foolish in hindsight. Carr references Plato’s Phaedrus, in which Socrates bemoans the development of the written word. In his article, Carr more or less advances a line of argument quite similar to the one advanced by Socrates. Carr’s argument hangs on a notion of “true” intelligence and the value of tradition. The question, then, is whether Carr can convince his readers that this is somehow a unique case.

I’m not convinced. Not when I can already see the enormous benefits of the internet’s rapid, voluminous nature. Intellectual pursuits are aided as researchers can collect information more efficiently than ever before, and people in all corners of the world are now able to access information and ideas that would have been unavailable 20 years ago.

Any thoughts? Anything I missed?

Daniel Corbett


After an interesting discussion in my Cyberspace and the Law class this morning, I would like to throw out another question with which cultural relativism must grapple: what should we do about Internet censorship?

For instance, would a cultural relativist support a group like the OpenNet Initiative, whose goal is “to identify and document Internet filtering and surveillance, and to promote and inform wider public dialogue about such practices”?

On the one hand, this goal aims to promote a more open global dialogue about the issue, something which seems to be at the core of cultural relativism. But, on the other hand, don’t we have to assume an objective ethical baseline (censorship bad) in order to achieve this global platform?  As a logical matter, some methods of filtration and blocking must be removed in order for this dialogue to even take place.  How can cultural relativism reconcile this tension?

Daniel Corbett

First, a disclaimer: I do not know nearly enough about (1) economics, particularly as applied to telecommunications markets, or (2) how the Internet works, or, put more festively, “the architecture of the Internet.” Please take my comment with a healthy grain of salt.

On the one hand, this seems very problematic. The Internet functions essentially by sending discrete packets of information anonymously over a decentralized “web” of local networks. If I’m not mistaken, Time Warner’s suggestion would seem to disrupt this model. I’ll steal an analogy I heard when NPR covered this story the other day– this seems no different than a cable company trying to refigure your bill to reflect how much TV you watch. And this seems somewhat absurd to most people.

On the other hand, isn’t this just a free market in action? And aren’t we, in fact, gaining greater efficiency by allowing people to more accurately absorb the costs of their actions, rather than displacing them across a larger population? One way of looking at this might be through a “Tragedy of the Commons” lens. Isn’t privatization the best route for us?

OK, so there’s both poles, at least as I see them. I’d love to have someone help me tidy this up a bit. Any techies and/or economists who can set me straight?

Daniel Corbett

Take a look at this article from the AP via Wired Mag.  Any strong opinions one way or another on the next step away from net neutrality?

On the heels of Universal’s threat to sue YouTube for copyright infringement, Warner Music has taken up a different strategy in dealing with the burgeoning media outlet (and, notably, my current number one method of procrastination). Warner recently struck a deal with YouTube, opening its library in exchange for a share of advertising revenues.

Interestingly, the relationship between Warner and YouTube began with the creation of a “branded channel” on YouTube designed to promote Paris Hilton’s first musical endeavor. It pains me to say it, but something good has finally come (very indirectly) from Paris Hilton and her bulldozer press agentry.

Daniel Corbett 

According to the Chronicle of Higher Education, the University of California has struck a deal with Google in which the university will provide at least 2.5 million volumes to Google for scanning. Said one of the deal’s brokers, Daniel Greenstein, of the California Digital Library,

“I understand [Google’s] ends are commercial,” he said. “But it’s one of these things where their business model, their interests, and our interests align around public access for the public domain forever and for free.”

On a Google-related note, last night my friend Matt asked me the inevitable question: “So, I’m taking bets– when do you think Google will become sentient?”

Daniel Corbett

    The new X-Men? Disappointing? How can it be? In any case, I hope my answer is at least somewhat fulfilling. First, I have to echo my support for your unflinching stand on free speech in cyberspace. The Electronic Frontier Foundation out of San Francisco has been the vanguard of the "bloggers' rights" movement. Why, within reason, should companies rein in what their employees say and do in their private lives?

    But it's this "within reason" caveat– painful but necessary– that comes around to haunt our theoretical defense of Internet free speech. Nondisclosure agreements are the lifeblood of many companies, especially in an idea-based economy. But where do we draw the line? Is blogging publishing, per se? What, exactly, is a "trade secret?"

    These are thorny questions, to be sure. And I can't provide clear answers to them because so much (of life and of law) is contextual. In response to your idea that a Cleveland desk jockey has little in common with the radical, the cafe owner, and the stablehand– I agree partially. You're right that these disparate people have little in common; and even though they can connect via the Web, it doesn't mean they will. This view has dominated some arguments about whether blogging is "publishing" in a meaningful sense. As with many debates before it, however, technology has drastically changed the terms of this one. Through aggregators and blog-specific search engines, more people are accessing more blogs than ever before. And they don't need to know where to look– just what they're looking for.

    There's a lot I can and will blog about in law school this fall. It's practically expected of me. But I would dare say there are some things about your job that you may not be allowed to share on this blog. Call it a cop-out, but I think the answer here is balance– a case-by-case approach that reconciles the privacy, property, and free speech rights of employer and employee.

Daniel Corbett

      For some it's cause for alarm: the public tarnishment of a corporate image. But for some it's simply the digital water cooler: the free speech of employees in a connected world. Like it or not, workers everywhere are taking to the Web and taking their work experiences with them. From the New York Times:

"Most experienced employees know: Thou Shalt Not Blab About the Company's Internal Business. But the line between what is public and what is private is increasingly fuzzy for young people comfortable with broadcasting nearly every aspect of their lives on the Web, posting pictures of their grandmother at graduation next to one of them eating whipped cream off a woman's belly. For them, shifting from a like-minded audience of peers to an intergenerational, hierarchical workplace can be jarring." 

      This raises an interesting dilemma as the Internet Generation descends on the working world. More and more, employees are getting the boot for blogging about their company's proprietary information. It may be malicious. It may be cathartic. It may (at least to the bloggers and their readerships) be quite funny. But the fact remains, for better or worse, we are increasingly interconnected. And when a boss does a Google search for his or her company, and an employee diatribe comes up– it's not a pretty picture.

      Or is it?

      For many, getting fired for blogging is the best thing that could happen to them. Kelly Kreth, a marketing director in New York who lost her job for blogging about employees, couldn't have made a better career move: "It led to me opening my own business and making triple what I was making before." A writer who was canned for writing about his job at Comedy Central is converting his experience into a book. And workplace tell-alls like "The Devil Wears Prada" and "The Nanny Diaries" are slated to hit the big screen this summer. It seems that behind this cloud, a market is emerging.

      But what will be the social effect of these events? Will companies, seeing green, find a way to make blogging work for them? (Remember, Wal-Mart had a mini-scandal involving information it provided Wal-Mart-friendly bloggers.) One thing is for sure, however: companies are going to take notice of blogging. Right now, only 8 percent of HR executives in a survey said their companies had policies about blogging. Given the controversy it is generating, I think we're bound to see some fences put around the digital water cooler.

Daniel Corbett   

    "If the Internet was once ungoverned by etiquette, those days are gone; MySpace and its siblings, by many accounts the future of the Net, are rife with discussions of good manners versus unforgivable faux pas," writes Steven Barrie-Anthony for the Los Angeles Times.

    The freedom is indeed vanishing. Now the Net is a place where serious questions are hashed out: relationship statuses, rules of syntax, and now– thanks to MySpace– the hierarchy of friendship. You heard it right, a new feature on MySpace allows users to rank their "Top 8," a list that can include friends, family members, significant others, as well as favorite films and bands. And, as is often the case when "real" meets virtual, things get messy. The article details spats between spouses, co-workers, and long-time friends that have emerged on social networks.

    I am nothing short of a technophile (if not personally, then at least theoretically). I honestly believe, contrary to naysayers' arguments, that new technologies have the power to bring people closer together, and make us more human, not less human. But what does it mean when we have 78 million on MySpace already (and, on average, 270,000 joining every day)? Does the "opt-out" argument still apply? Can technology facilitate the feared tyranny of the majority? (I know I have made no empirical claims, but I'm just throwing out some big questions for you.) So, technophiles and Luddites alike, where do we go from here?

    I leave you with the MySpace experience of Michael Block, a search engine marketer from L.A., who received the following indecipherable message from "an unknown 15-year-old in Florida": "y u want people 2 look at u 4. u thinken that u looken sweet 4 da females."

    Wow.

    I admit I am young, but from what I've read, Internet discussion used to look a little more like this.

Daniel Corbett

    Stephen Colbert is a funny guy. His show, the "Colbert Report" (that's a soft "t" on both, mind you), pokes fun at rampant patriotism, partisan news coverage, and America's political discourse at large. Colbert's network, Comedy Central, has even put together a satirical "fan site" called The Colbert Nation full of unfurled flags, fan testimonials, and bright green hit counters at the bottom of each page. Suffice it to say, I find Colbert's routine amusing, and I think he often provides good social commentary.

    I am, however, inclined to agree– at least partially– with Richard Cohen in his criticism of Colbert's appearance at the recent White House Correspondents' Association Dinner. Cohen expressed frustration at Colbert's hackneyed routine during the dinner. Colbert covered the usual bases: Bush's approval rating, Iraq, and of course, the president's intelligence and public speaking ability.

    You may read Cohen's column and come to the simple conclusion that he just doesn't like Colbert. But this is not Cohen's primary argument; for him, it's really about the responsibility of political humorists. Cohen opines:

"In Washington he was playing to a different crowd, and he failed dismally in the funny person's most solemn obligation: to use absurdity or contrast or hyperbole to elucidate — to make people see things a little bit differently. He had a chance to tell the president and much of important (and self-important) Washington things it would have been good for them to hear. But he was, like much of the blogosphere itself, telling like-minded people what they already know and alienating all the others. In this sense, he was a man for our times."

    Thank you, Mr. Cohen. I couldn't have said it any better.

Daniel Corbett

Creative Commons License
This work is licensed under a Creative Commons License.