Archive | January, 2012

Speedy Skepticism Sounds Suspect

31 Jan

In every class I’ve had at UF, at least for journalism, I’ve been told how everyone’s roles at work are getting more complicated. Everyone on a newspaper’s staff seems to need a very wide skill set as technology dictates the future of journalism. Of course, no matter what changes technology brings, some of the fundamentals will remain the same. Sources will need to be found and interviewed, some form of medium will be used to purvey the news to an audience and, perhaps most important, someone will have to verify the information.

Traditionally, the reporter is the first, and sometimes last, fact checker. Often, a newspaper will have a copy editor responsible for fact checking as well, which is a task that requires training and practice. Unfortunately, the speed of news is becoming very problematic. Internet advocates seem to be overjoyed by the fact that they are driving print media into the ground with little resistance; however, as things move more online, there is less time for thorough fact checking. The aphorism, “Get it first, but first get it right,” seems to be dying at the speed of Tweets.

One article suggested assigning a naysayer, which is a great suggestion for avoiding infamous fabrications like those of Janet Cooke or Jack Kelley, but it doesn’t seem like a feasible option in today’s blisteringly fast news cycle. Students are taught to be scrupulous in editing and reporting classes. A fact error is an instant “F.” In season 5 of HBO’s The Wire, Gus Haynes represents the type of editor professors want students to emulate. He is a veteran editor who fights the newspaper’s senior staff to expose a reporter who has been fabricating stories, and the reporter is eventually caught in his lies.

The biggest problem with that whole scenario is the lack of time journalists face now. News is being published, at least online, almost in real time. As the filters for news have been gleefully cast aside, there is a void previously filled by the “skeptical editor.” With “The Filter Bubble” in full effect, expect fabrications and errors to run rampant across the public domain faster than even the most scrupulous, dedicated editor can keep up. Even if an editor’s only job were to edit.


Errant Eagles

25 Jan

Elliot Evans- Case Study 1

Sitting in class, it was difficult to discover what the problem was with the story about the eagle flying away with a small dog. We talked about it for a while. I thought it was kind of a pointless fluff piece, while others in the group thought maybe the tone was too insensitive. It didn’t occur to any of us that the story was a fake.

I think in this modern era of instant news coverage, editors need to be more wary than ever. With the speed of a medium like Twitter, it’s going to be harder than ever to verify facts before they reach the public. Aside from bad reporting, Twitter can also spread unsubstantiated rumors. Even more damaging is the risk inherent in computer-based technology: a skilled hacker can do swift, immense damage through fabricated reports.

I don’t think a story like the one posted on NBC’s hacked Twitter feed would be nearly as damaging on a more traditional news site. With Twitter, the speed at which news comes in doesn’t allow editors to filter very much, which made the hacked feed seem credible, whereas a breaking news story like that on a news site would need more supporting evidence.

The hardest part about translating a case study from almost two decades ago is that journalists now operate in an almost completely different medium. If the eagle picked up a dog nowadays, there would probably be a video on YouTube before we could even get a reporter down to the gas station.

I, unlike many in my generation, believe in preserving print journalism far more than advancing online journalism. While it seems like an impossible task, I think credibility will be the difference. Everyone enjoys the headline-like posts on Twitter, and the speed of news online; however, print journalists would do well to develop those stories further than the online versions.

Credibility is where the editor will make his or her last stand before being totally outmoded. By being extremely discerning and keeping the BS radar at full alert, editors can maintain their role as sages of the newsroom instead of becoming online jacks-of-all-trades, masters of none.

And so the Internet has demanded that journalists adapt or die. I say we need to hold strong in some regards, especially in print, where we can take the time to check facts, get interviews and provide depth to readers, lest we have more dogs being carried off by errant eagles.


25 Jan

Elliot Evans-

And so I commence the ironic act of blogging about blogs.

The upshot of the readings this week is that blogs are the future of news writing, but they’re not talking about the blogs of yesteryear. The term blog seems to have become a catchall for sites covering localized topics with more flexibility in linking to other sites.

The problem I have with a lot of these blogs is that they tend to downplay the pursuit of accuracy. Even respected newspapers have blogs that focus on getting stories out quickly without worrying about getting them right. The model mimics Twitter-like breaking news posts followed by updates as they come in, making for a really confusing layout for a reader who hasn’t been following the story.

The benefit of blogs is the ability to focus on local issues, as well as to implement multimedia to complement the story. Perhaps the most useful aspect of blogging is the “live blog,” which is a good way to interact with a live event. For instance, The New York Times’ coverage of the president’s State of the Union address provides commentary and a little bit of analysis.

Another interesting question is how blogs will move forward into the future. With Twitter moving far more rapidly than blogs, it’s hard to see how blogs will keep themselves from becoming as outmoded as print media. In fact, others are noticing the move from blogs to Twitter, and it’s a trend I expect to continue as attention spans shrink. It’s also a valid trend, in my opinion, because blogs are slower, have less interaction with readers and don’t necessarily have more credibility. The example in class of the “live bloggers” who covered the forum at their school helped illustrate this, as the bloggers were basically following the Twitter format.

My last issue with blogging is how to weed through them all. The majority of blogs are free and easy to use, which is why over 900,000 posts are created daily. Search engine optimization seems to be the answer for a lot of people looking to distinguish their blogs from others, and the information on how to increase traffic for a blog is readily accessible. Search engines, especially Google, also sell space in search results, which influences what people see, essentially allowing people to buy traffic for their pages.

Curating and Aggregating

18 Jan

Elliot Evans-


As someone who grew up with access to the Internet for the majority of my life, I am painfully behind when it comes to utilizing it. Like most people my age, I’m capable of checking my emails, using Facebook and performing a basic Google search, but when it comes to the advanced use of social media, I’m a little inept. I don’t believe this is because I’m incapable, but rather resistant to lures of the Internet. I mention this because it informs my opinions on the articles assigned for the week.

The distinction between curating and aggregating is a fine one to make. The selection of what is considered newsworthy has been the role of the journalist for over 200 years and will continue to be so in the future. Although I’m uncomfortable with it, I am clearly in the minority: the Internet is increasingly the primary news source for Americans. Last year marked the first time people used the Internet for news more than newspapers. Sorting through the bog of information online is becoming a more Herculean task by the day, but it’s clearly the job reserved for future journalists, whether they go by that title, blogger or something else.

The most heated part of the debate seems to be setting a code of ethics for aggregating. When an aggregating site, such as the Huffington Post, publishes a story, it uses reporting from a news organization (usually a newspaper or TV station). By doing this, aggregators draw hits away from the news site that did the majority of the work to report the story, and that’s the claim set forth in harsh terms by the Miami Herald. The aggregators’ defense is usually one of two things: they cited the source, and/or they added information to the original article.

The worst part of the aggregators’ defense is that it rests on faulty logic. They claim Internet readers will click the link, but there isn’t any statistical data to prove they do. An aggregator can draw hits away from the website that has the actual article written by the reporter who did the work to find the story, making the aggregator nothing more than a “rewriter.”

Yet anyone reading this entry will notice that I have also been citing sources through hyperlinks. My defense would be the same one any ethical aggregator could use: if added analysis and multiple sources contribute more than the original articles did, then the aggregation is making the best use of the technology.