Channel: Post-Advertising » Luke Dringoli

Google Rewards Real Storytellers


This post originally appeared in our March issue of “Live Report from the Future of Marketing,” our monthly Post-Advertising newsletter. Subscribe for free here.

Google has finally rewarded the steadfast providers of valuable, high-quality content over the purveyors of dreck. The main objective of its recent search-algorithm revamp, codenamed Farmer, was to yank republishers, editorial mills, aggregators, and other wholesale sources of weak, unorganized content (Associated Content and Mahalo, to name two of the most egregious offenders) off their high-ranking perches and slash their search results under a new system that attempts to account for actual editorial quality. Holding results to a standard of perceived quality means that news outlets, brands and other publishers of notably engaging stories are finally gaining the recognition they deserve. But what of the grey area: the in-between web listings that don’t deserve to be demoted? Will those harmless do-gooders be misplaced, orphaned by Google’s iron hammer? One thing’s for sure: Great content wins, especially when a system exists to recognize it as such.

Ad Age’s Bernhard Warner thinks all the reshuffling of search priorities means good news specifically for brands and branded content. According to the Online Publishers Association, the change in Google’s search algorithm will redistribute some $1 billion in revenue away from content farms and toward sources of quality content. In this new climate, the “newest of the new breed publishers” (“brands and organizations-turned-publishers,” as he describes them) can place well above other sources on the same subjects if what they provide is valuable, timely and targeted. His example, a recent campaign for the Vatican’s Opera Romana Pellegrinaggi, is particularly telling of what’s possible for brand-sponsored journalism.

In a definitive interview by Wired’s Steven Levy, featuring Google search engineering brains Amit Singhal and Matt Cutts, Cutts illustrates the result by way of a story:

I got an e-mail from someone who wrote out of the blue and said, “Hey, a couple months ago, I was worried that my daughter had pediatric multiple sclerosis, and the content farms were ranking above government sites.” Now, she said, the government sites are ranking higher. So I just wanted to write and say thank you.

The letter illustrates how much can hinge on having accurate, helpful search results on top, every time. More than any culture before it, ours relies on quick access to vital information and the best results possible.

As the interview also explains, the company aimed to measure perceived quality among search results by asking users an exhaustive list of frank questions about issues such as privacy, age-appropriateness, reputability and the amount of advertising. Amen.

Google’s revisions, and their highly public nature, seem likely to be at least in part a response to Microsoft’s Bing search engine, which has been steadily eating up market share and taking swipes in the process, claiming to be “a smarter search.” And there’s the speculation that Microsoft’s “decision engine” returns more relevant, accurate results than the veteran itself. With Bing set to offer a local discount feature à la Groupon, Google found itself needing to prove its efficiency and accuracy once more.

Lauded by many as a brilliant move by Google, sure to encourage a greater emphasis on earnest, original content, the update has no doubt caught some legitimate destinations in its crosshairs, gutting their traffic in the process. Can Google accurately discern the originators of such storytelling in the first place? What does the future of search mean for the many resources constantly vying for our consideration?

The sites complaining about the search switch aren’t mere hobbies for their owners; entire livelihoods are at stake. As CDKitchen’s Valerie Whitmore explains, traffic to her site, which features cooking recipes and articles, fell a whopping 39%; in Levy’s interview, the owner of Suite 101 claims his rankings have unfairly plummeted some 94%. Google’s Singhal and Cutts maintain that their algorithm is making the right decisions.

But will these deafening decisions put into effect by Google shake the greater Internet community’s trust in the foundation and continued viability of web ventures? Might Google be endangering what it hopes to help prosper? Or, is it all acceptable collateral damage on the march to more pertinent, finely tuned search results? As Google’s Cutts has qualified elsewhere when defending the new system, “no algorithm can be 100 percent accurate.”

(Somewhat) innocent bystanders notwithstanding, the system has so far filtered out a large number of disliked, genuinely inauthentic websites; engineers compared the outcome of their changes with data from Google’s Chrome Site Blocker and found that 84% of the sites blocked directly by users were naturally relegated downward by the changes. According to The Atlantic’s Alexis Madrigal, the new algorithm really does do a better job of dropping poor-quality content farms, as shown in a test carried out on a typical how-to search. Uri Friedman goes one step further, testing it against The Huffington Post, a new breed of news site that serves as both aggregator and publisher. Its mojo wasn’t affected dramatically—in fact, as he explains, “the original version of the stories that the Huffington Post excerpted did seem to rise a bit in the search rankings under the new system.”

While Google probably could have done a better job of alerting innocent sites to their fate beforehand, in case only simple fixes were needed (just bad SEO tagging? Not enough consistency? A smattering of ads?), such a warning most likely wasn’t given in order to catch ‘farmers’ by surprise. Such is the cost of war.

But a fundamental question remains: Beyond wiping the most obvious offenders off the Internet map, is there really a foolproof way to statistically judge how poor or great a given site is? Won’t content mills simply become more sophisticated in their methods of gaming Google’s system? Remember the story of the evil eyeglasses website owner, his foul approach to customer service and the massive search rankings it initially earned him? Google is constantly revising its method for plucking out the bad apples. To foster a culture that better values compelling content and engaging, original experiences, it must continue its quest to weed out ad-stuffed repositories and SEO-gaming webmasters, their bellies bulging ever larger by the click-through.

Another small battle in the war to let great content win.

