Subscribe: Comments on: Fair Syndication Consortium: News orgs’ new way to confront Google?
http://www.niemanlab.org/2009/06/fair-syndication-consortium-news-orgs-new-way-to-confront-google/feed/

Comments on: Fair Syndication Consortium: News orgs’ new way to confront Google?





Last Build Date: Sat, 20 Jan 2018 23:07:00 -0500

 



By: Zachary M. Seward

Mon, 27 Jul 2009 22:50:55 +0000

I don't feel like I have any way to judge the claim, Scott, but I share your skepticism. (I'm more enthusiastic about the concept, the spirit of the consortium, and the potential for proactive syndication.) Rich Pearson first mentioned the 5x stat to me in an interview on April 23, when, at least according to my notes, he didn't bring up the $250 million or any other dollar figure. I think they prepared that for the Chicago meeting as a more dramatic way of illustrating the problem — even if it's neither dramatic nor a problem.

To Attributor's credit, they've been pretty clear about how rough their estimates are: In the slides above, the first two of three outcomes in their "scenario analysis" are failures. This post was written on June 5.

What I found most puzzling about today's Times story was its conflation of Attributor and the Fair Syndication Consortium. The former is running the latter, but even if the consortium is a flop, Attributor will still profit from the fees it already charges to track content usage across the web. So Attributor has a pretty sound business plan in all this, even if newspapers don't. —Zach



By: Scott Rosenberg

Mon, 27 Jul 2009 22:31:33 +0000

I've never understood this "five times more people see the content elsewhere than at its site of original publication" figure. It makes no sense, particularly if this conversation is about splogs. Splogs scrape a little traffic here and there from the bottom of the Google barrel, but there's no way that, you know, five times more people read the NYT's front-page story on splogs than at NYTimes.com. If the 5-times figure is really talking about "people who see a headline and summary on Huffington Post," then it still sounds inflated, but it might make a little more sense. But if that's what we're talking about, then this doesn't represent traffic (and dollars) that Attributor -- which Rich Pearson says is focusing solely on full-content reuse -- can win back for publishers.



By: Rich Pearson

Fri, 05 Jun 2009 22:10:52 +0000

Zach - no problem and fair points all around. The proof will obviously be in the revenue that is delivered. Until then, it is all speculation :-) And just to reiterate your point as I don't think Marcus understood - the consortium is focused on helping publishers capture their fair share of *full copy* reuse, removing the fair use question.



By: Marcus

Fri, 05 Jun 2009 20:34:54 +0000

Wow... this piece sure is loaded and slanted... where is the comment from the other side... the supposed aggregators? I certainly hope that none of these newspapers aggregate. Oh wait, they do, all the time! Just look at WSJ, DOW, or just about any other paper. Even the Associated Press aggregates. And let's not forget pesky things like fair use and illegal-combine law (antitrust, to US types).



By: Zachary M. Seward

Fri, 05 Jun 2009 19:54:25 +0000

D, you've hit on one of my two questions about the $250 million figure, which is that AdSense CPMs on those sites are typically pennies, not quarters. My other concern is that they actually just estimated the lost ad revenue for the 25 most popular U.S. publishers and extrapolated from there. Of course they weighted for traffic and everything, but it's my impression that splogs and other forms of full-content piracy are orders of magnitude more common among some of the big blogs like TechCrunch than any regional newspaper — that is, they are pirated at a rate disproportionate to their traffic. Even piracy of The New York Times seems far less common than piracy of, say, Gawker Media sites, likely because the Times doesn't offer any full-content RSS feeds.

That's just my anecdotal impression, but I did ask Pitkow for his lost-ad-revenue estimate of just the top 25 publishers. He said he wasn't cleared to release that information, which is fair enough, but it would help clarify this issue. (Am I making sense here? I'm writing quickly on a cramped bus that smells faintly of garbage, and I can only barely see the screen of my laptop, so who knows!)

Now, Pitkow was kind enough to go over the $250 million figure with me several times, and he did acknowledge that it's a very rough estimate. If it turned out that piracy is only really common among the top publishers, that wouldn't defeat Attributor's premise. It would just mean that the consortium would only be of real benefit to sites like TechCrunch, Gawker Media, and The Huffington Post. No problem there — unless, of course, you're one of the newspaper companies they were pitching in Chicago.

Thanks, Rich, for diving in first with those clarifications. (For everyone else: Rich is VP of marketing at Attributor.) And, Siva, I — and, more to the point, the consortium — are not talking about "legal online aggregation and linking." The whole point of this post is to make that distinction. —Zach



By: Siva Vaidhyanathan

Fri, 05 Jun 2009 19:27:16 +0000

Why did you use the word "piracy?" We are talking about legal online aggregation and linking. Why would you use such a judgmental word? For now, what aggregators do is legal and what newspaper publishers are proposing is illegal (collaborate on setting prices). Please be careful.



By: D

Fri, 05 Jun 2009 19:17:48 +0000

--Attributor CEO Jim Pitkow estimated that publishers are losing a total of $250 million annually to splogs and other sites that copy their content. (In a phone interview yesterday, he told me that estimate was based on a "conservative" CPM of roughly 25 cents...--

As I'm sure you're aware, this actually seems like a wildly aggressive CPM estimate for splog-like sites. I would be surprised if their CPMs were higher than a couple of pennies. Looking at what Attributor does, I'm not surprised by his math. The CEO has every incentive to exaggerate the cost of the problem.
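To make concrete how sensitive the $250 million figure is to the CPM assumption, here is a rough back-of-the-envelope sketch. The 25-cent CPM is from Pitkow's stated estimate; the 2-cent CPM is an illustrative assumption standing in for the "couple of pennies" rate suggested above, not a published number:

```python
# Back-of-the-envelope: how the lost-revenue estimate scales with CPM.
# CPM = ad revenue in dollars per 1,000 impressions.

def implied_impressions(revenue_usd, cpm_usd):
    """Annual impressions implied by a revenue estimate at a given CPM."""
    return revenue_usd / cpm_usd * 1000

def lost_revenue(impressions, cpm_usd):
    """Annual revenue those same impressions would earn at a different CPM."""
    return impressions / 1000 * cpm_usd

# Attributor's figure: $250M/year at a "conservative" $0.25 CPM
# implies about one trillion pirated-page ad impressions per year.
impressions = implied_impressions(250e6, 0.25)  # 1e12

# Re-priced at a penny-level splog CPM (assumed $0.02),
# the same impressions are worth $20M/year, not $250M.
print(lost_revenue(impressions, 0.02))  # 20000000.0
```

In other words, the dollar estimate moves in direct proportion to the assumed CPM, so a quarter-versus-pennies disagreement is roughly an order-of-magnitude disagreement about the size of the problem.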



By: Rich Pearson

Fri, 05 Jun 2009 16:21:41 +0000

Zach, great series and smart analysis throughout. Anticipating questions about the "math": the eCPM assumptions are pretty tough to nail, particularly in this environment, and it is admittedly not Attributor's expertise. We do feel confident in the finding that the audience viewing publisher content on unauthorized sites (mostly "legitimate" sites but also including spam blogs) is 5x the audience on publishers' own destination sites. Another reason we feel the opportunity estimate is conservative is that the analysis was limited to U.S. IPs only. Open to other questions, and if anyone wants to see how widely their content is being reused, they can go to fairshare.cc and sign up for free.