It’s hardly a social strategy, but posting your site’s headlines to your Twitter feed and Facebook page is an important first step for a news site.
I’m pretty sure that most WordPress and Drupal users can do this with a plugin. Things aren’t so simple for users of Expression Engine or some other content management system. You may also want to use a third-party service if you’re reaching the limits of your current system. I’ve wrestled with this problem for a year or two, so I thought I’d share my experience and save you some time.
My first impulse, and probably yours as well, was to try to get my Facebook page to pull in my feed. There’s a way to do it, but I defy you to discover it yourself, even though it’s accessible right there on the page. Once you know the non-obvious directions, it’s easy to do. But this is a very slow way to do an update. Facebook polls RSS feeds at long and seemingly random intervals. It can take hours for a new post to show up on your Facebook page. These days, this isn’t acceptable for J. Random Blogger, let alone a news site. And, of course, that doesn’t get your feed onto Twitter.
You need a third-party service to deliver your headlines as quickly and flexibly as your news cycle demands. Most of these services use the PubSubHubbub protocol to know the moment you’ve updated your feed. I’m not going to go into a long explanation of how it works; I’m a little fuzzy on that, to be honest. But, like HTTP, you don’t need to know much to use it. Superfeedr will manage your push notifications for you. It won’t deliver your feeds to your social pages, but it will make it possible for other services to do so more quickly.
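For the curious, the protocol is lighter-weight than its name suggests. Here’s a minimal sketch, in Python, of the publisher’s side: your feed declares a hub, and you ping that hub whenever you post. The URLs here are placeholders, not Superfeedr’s actual endpoints.

```python
# A minimal sketch of the publisher side of PubSubHubbub. Your feed
# declares a hub with <atom:link rel="hub" href="..."/>; when you
# publish, you ping the hub, and the hub pushes the new content to
# subscribers so nobody has to poll. URLs below are placeholders.
import urllib.parse
import urllib.request

HUB_URL = "https://hub.example.com/"        # hypothetical hub endpoint
FEED_URL = "https://example.com/news.rss"   # your feed's public URL

def ping_hub():
    """Tell the hub the feed has changed."""
    data = urllib.parse.urlencode({
        "hub.mode": "publish",
        "hub.url": FEED_URL,
    }).encode()
    with urllib.request.urlopen(HUB_URL, data=data) as resp:
        return resp.status   # hubs typically answer 204 No Content
```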
Twitterfeed is designed to update social sites from your RSS feed. I found it to be reliable, nicely designed, flexible, and easy to use, and it accepts PubSubHubbub notifications. But its dashboard is a little buggy in my experience, and I’m not using it as much as I have in the past.
Ping.fm aims to be a switching yard for social updates. The good news is that you can update it immediately via email, allowing you to be more selective about which stories appear on your social feeds. The bad news: (1) Ping.fm can’t take RSS feeds directly, which makes it harder to automate your workflow; it recommends Twitterfeed for RSS notifications. (2) I found the exact format of Ping.fm’s updates to my Facebook page to be unreliable; it changed over time, resulting in a Facebook page that was a visual mess and occasionally just stupid-looking when Ping.fm (or Facebook) picked up the wrong image to display with a story. And (3) Ping.fm seems to have gone into a developmental hiatus since Seesmic bought it last March.
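One nice thing about email posting is that it’s trivially scriptable: it’s just SMTP. A rough sketch, with the posting address, sender, and mail server all stand-ins you’d replace with your own:

```python
# A rough sketch of posting a story update by email. The posting
# address, sender, and SMTP server are placeholders; Ping.fm gives
# each user a private posting address, which is what you'd use here.
import smtplib
from email.message import EmailMessage

def post_story(headline, url):
    msg = EmailMessage()
    msg["From"] = "editor@example.com"
    msg["To"] = "your-private-address@ping.fm"   # placeholder address
    msg["Subject"] = headline                    # becomes the update text
    msg.set_content(url)
    with smtplib.SMTP("smtp.example.com") as smtp:
        smtp.send_message(msg)

post_story("Storm closes Highway 1", "https://example.com/story")
```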
Dlvr.it feels like a more polished solution for posting your feeds. It accepts PubSubHubbub notifications, is easy to use, has a flexible system for adding feeds and social destinations, provides excellent reports on items delivered and clicked, does a better job of formatting Facebook entries, and allows you to filter your posts based on their content. It can also deliver RSS output, which I used to buffer a friend’s RSS feed when my CMS started polling his site too frequently.
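To make “filter your posts based on their content” concrete, here’s the idea sketched with the feedparser library. This illustrates the concept only; it is not how Dlvr.it implements it, and the keywords are made up.

```python
# The idea behind content filtering, sketched with feedparser
# (pip install feedparser). Illustrative only; the keywords are
# made-up examples, not anything Dlvr.it prescribes.
import feedparser

KEYWORDS = ("coastside", "half moon bay", "highway 1")

def filtered_items(feed_url):
    feed = feedparser.parse(feed_url)
    for entry in feed.entries:
        text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
        if any(keyword in text for keyword in KEYWORDS):
            yield entry.title, entry.link
```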
All three services (Twitterfeed, Ping.fm, and Dlvr.it) look good and deal creatively with the problem of putting a simple, Web 2.0-style interface on creating complex many-to-many relationships. Ping.fm and Dlvr.it also offer APIs if you have the need and the skills to do something even more custom.
I recommend trying all three (using Superfeedr). However, I’ve found that Dlvr.it is the easiest, fastest, and most powerful way to update my social feeds from my news site.
Building an RSS aggregator with WordPress
I’ve been experimenting for years with how to create and structure community news feeds. Readers want to follow breaking news and relevant information as it becomes available — and not just on Facebook and Twitter. All this time, I’ve been looking for an easy and reliable way to merge a bunch of related RSS feeds into a single feed that would make this easier for my readers.
What seems like a simple problem turns out to be difficult to do well. It looks easy enough that everyone thinks they can solve it, so no one is willing to invest what it takes to create a quality implementation. I’ve tried a lot of methods short of writing my own aggregator, which I have no interest in doing.
Speed is an issue for all aggregators, because an aggregator must recognize that a feed has been updated, read the feed, add it to the merged feed, and deliver the new feed to its subscribers. Every step adds delay, because neither the aggregator nor the end user can poll too often. Also, many of these services do not scale very well and slow to a crawl as soon as they become popular.
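To show how small the core job is (and why everyone underestimates the rest), here’s a minimal merge step in Python using the feedparser library. The feed URLs are placeholders, and everything that makes a production aggregator hard is left out.

```python
# The core of a feed aggregator: fetch each feed, merge the entries,
# sort by date. A sketch with feedparser; caching, polite polling
# intervals, and serving the result back out as RSS are all omitted,
# and the feed URLs are placeholders.
import time
import feedparser

FEEDS = [
    "https://example.com/a.rss",
    "https://example.org/b.rss",
]

def merged_entries():
    entries = []
    for url in FEEDS:
        entries.extend(feedparser.parse(url).entries)
    # published_parsed is a time.struct_time when the feed supplies a
    # date; undated entries sort to the bottom via the epoch fallback
    entries.sort(
        key=lambda e: e.get("published_parsed") or time.gmtime(0),
        reverse=True,
    )
    return [(e.title, e.link) for e in entries]
```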
A lot of sites aim to create merged feeds for you. About half those you’ll find on most lists via Google no longer exist. So how can you count on any of the others to be there in the long run, or to be fast and reliable enough to use in a production system?
Yahoo Pipes is ineffably cool, and a pretty good solution, but it’s slow, kind of obtuse, and unreliable. It doesn’t appear to be getting a lot of attention from Yahoo, which is a good sign you shouldn’t make it part of your infrastructure.
The SimplePie RSS parser has a nice-looking website and lots of links to interesting examples, but the project seems dead in the water. I was able to find a good example of an RSS aggregator that did most of what I was looking for, modified it to make it my own, and added it to my site. However, maintenance became a problem. First, the script slowed down my host. Shortly after that, it started to get aggressive in its reading of my friends’ sites. It was becoming clear that, unless I wanted to go from being a publisher to being a programmer, rolling my own aggregator from SimplePie simply took too much work.
I finally realized I could solve this problem using WordPress, of all things. In a couple of hours, I was able to create a news aggregator using WordPress and a plugin called FeedWordPress.
My new aggregator can pull in headlines and summaries from multiple sites and deliver a combined feed. The aggregator itself isn’t intended to be a destination — right now I’m the only person who even knows where it is. It’s just a bit of infrastructure that works reliably and fast. I’m able to read the merged site’s feed with the Magpie reader built into Expression Engine and deliver the results as a local news digest.
FeedWordPress has a terrible name and an ugly website, but it works really well and has plenty of options for managing the way you gather and display feeds. And it’s shareware. I sent in my donation.
Wikileaks: This ain't your old man's Japanese tea ceremony
Steve Yelvington is right about the five sad reasons the American press is outraged by Wikileaks.
I’d add another:
American journalists are fundamentally conservative: they hate change. The conventions of American journalism are as ritualistic and metaphorical as a Japanese tea ceremony or diplomatic protocol. Wikileaks is outside their taxonomy, and it makes their brains itch.
Daniel Ellsberg’s endorsement notwithstanding, the diplomatic cables are not the Pentagon Papers. And the Pentagon Papers currently define the outer edge of acceptable journalistic behavior.
Traditional American journalism has almost always been about telling the least objectionable story. Usually it serves the powerful, sometimes it serves the public at the expense of the powerful. It is seldom willing to alienate both the powerful and the public when both are on the same page.
Outside.in explains how they serve hyperlocal headlines to CNN
This is an interesting description of how Outside.in is delivering local headlines to CNN. It’s from Mashery’s Business of API conference in New York last month.
I’ve never thought too highly of local aggregators like Outside.in, but they’re doing the right thing by linking those headlines on CNN not to themselves but to the originating sites.
As a community news publisher I appreciate it.
Sidebar: I can also appreciate the irony of embedding this video after proudly declaring myself to be (mostly) Flash-free. Flash is still an important part of the Internet’s plumbing, but I can tell you my MacBook Pro runs better when it’s not in my browser.
Becoming Flash-free
I removed Flash from my Mac. I couldn’t be happier.
As a user, Flash sucks up my resources, drains batteries, enables indelible cookies, and bypasses popup blocking.
With a lot of tabs open — which is how I work — Safari would become sluggish and intermittently unresponsive. Since I removed Flash, these problems have gone away because I no longer have a dozen Flash animations running in the background. The improved privacy and reduced interruptions are side benefits. If I absolutely need Flash, I can open Chrome, which has Flash built in. But I have FlashBlock installed there by default as well.
I’ve gone back and forth on Flash. For years, I said it was junking up the Web. It never really stopped junking up the Web, but in the last five years Flash made Web video practical, so I cut it some slack. Before Flash, Web video was a nightmare of proprietary players, constant updates, and tiresome visits to RealPlayer’s home page for yet another attempt to trick you into buying their worthless merch.
For a publisher, Flash is a ticket to misery. I wasted countless hours producing Flash videos for Coastsider, trying to get Flash players to work on my site, and attempting to embed videos from other sites.
I’m done with all that. I recommend you be done with it too.
Get ready for the post-Flash world. Millions of iPhone, iPad, Android and other mobile users are already surfing Flash-free. The thought leaders in Web and software development are beginning to see the light as well. Expect uninstalling Flash to be a trend in 2011.
Resistance is likely to come from three sources:
- Editorial personnel who confuse Flash development with Web development: Do them a favor and get them trained in something more future-proof.
- Advertisers (agencies, actually) who love Flash: It makes them feel like artisans. The best you can do with them is to make sure they provide ads in alternative formats for serving to non-Flash users of your site.
- Web producers who see Flash as an easy way to create fancy user experiences: This is already less common in media than it is in the production of commercial sites for restaurants and hotels.
Those who are complaining the loudest about this trend have big investments in Flash development and technology. If that describes your situation, now might be a good time to think about what you’re willing to sacrifice to keep Adobe’s Flash relevant.
You’ll need to continue to use Flash for video as long as a reasonable number of users need it, but you should be preparing now to produce video that works with HTML5. This will mean producing multiple formats in the short term. You can ease the transition by finding a video host that can deliver modern Web video to your users, and Flash when they demand it. I’ve moved all my videos to Vimeo and I’m not looking back.
Hyperlocal race to the bottom between AOL and Yahoo
A friend who runs a local mailing list on Yahoo Groups just got the following email. We live in interesting times.
From: “[email protected]”
Hello,
Yahoo! is creating an exciting new opportunity for San Francisco and San Jose area residents. We’re looking for neighbors who want to participate in their communities by getting out the word on what’s happening around them. Because you have led locally focused Yahoo! Groups, we thought our new initiative might be another way for you to communicate what you care about with others.
You can be among the first to contribute to this soon-to-launch neighborhood content destination by joining the conversation today. Depending on where you live, signup at either http://www.associatedcontent.com/join/sanfrancisco or http://www.associatedcontent.com/join/sanjose. After you sign up (which is free and carries no obligations) on Associated Content by Yahoo!, you’ll be given $10 writing assignments that will explain more about this project and how you can get involved!
To put this in perspective: at $10 per story, a Patch editor producing 2.5 stories/day, 6 days/wk (15 stories a week, 52 weeks a year) would earn an annual income of $7,800 — no benefits.
Smells like plagiarism by Patch, but that's not the worst of it
I came into this thinking Bob Cox needed to get a life.
Bob has now convinced me that his local Patch editor’s reuse of some mug shots from his site was plagiarism. I wouldn’t (and he didn’t) call it copyright infringement. But he added value to the photos, and the local Patch editor used them and didn’t credit him.
Looks like a goof by a newbie blogger, but it should have been acknowledged by that editor’s chain of command.
Bob has made some other claims that Patch regional editor Kathleen O’Connor has refuted, but he has also proven (at least to me) that the following statement by Ms. O’Connor is simply false:
any similarity to Mr. Cox’s presentation of those public images is purely coincidental. Linking mug shots together in Photoshop (in this case, apparently doing nothing more than placing three similar sized objects in a row) is standard operating procedure for news organizations everywhere.
Patch is a large, new, decentralized operation. Everyone in it is already working too hard. There will be mistakes. But how Patch deals with those mistakes will be critical to its reputation as a news organization.
Full disclosure: I reused a mug shot from the SF Examiner just last week. I had no problem with reusing the image, which is public property, but I made sure to credit the Examiner as the source and link to the story. After all, who wouldn’t do that?
This is not a strategy. It barely qualifies as a tactic.
Newsday has put all of its news behind a pay wall. Subscribers to the paper and to owner Cablevision’s Internet services (75% of Long Island residents) can still get free access.
This has resulted in no increase in revenue. The only way this can make money is by slowing the decline in readership of Newsday in print. That seems improbable.
They’ve cut their audience in half and opened the market to news bloggers in the dozens of communities that Newsday serves.
How can I get MediaNews to do this in the Bay Area?
If you're reading this, I blame Twitter
MediaSavvy is back after a pretty long hiatus: six months since my last Forrester post, and a few years since my last post as an independent blogger.
I don’t have any illusions about the size of the audience for the site’s RSS feed. This site is mostly read by spiders and spambots these days, but I plan to change that.
So, after posting again for the first time in months the other day, I started piping my RSS feed to Twitter this afternoon.
I’ve been using Twitterfeed for a few months to tweet Coastsider’s headlines to a special Twitter account. Now I’m using it to send MediaSavvy headlines to my personal Twitter feed. And those tweets wind up on my Facebook and FriendFeed pages in reasonably short order via plumbing I’d put in place last year.
Within an hour, I could see a noticeable increase in traffic on MediaSavvy.
I love RSS for a host of reasons: manifesting headlines from other sites on Coastsider, putting together aggregated pages from database searches, reviewing classified ads without visiting the original site, getting data into Yahoo Pipes to create even more RSS feeds, and other geeky nonsense.
But I seldom fire up NetNewswire or Google Reader any more just to see what’s going on. And most folks never, ever did.
Twitter is a great way to get your headlines in front of your fans where they’re actually going to read them. If you don’t already have a Twitter feed for your site, or aren’t already piping your headlines to your personal feed, now is a good time to start.
Twitter didn’t kill RSS as a consumer technology, but it may have buried the body.
Washington Post reporter has world's fakest job
Ian Shapira, reporter for the Washington Post, thinks Gawker ripped off his tiresome trend story about a tiresome trendwatcher.
He may be right, but that’s not what the Post is worried about right now.
Sure, Gawker copied key quotes from Shapira’s Speaking to Generation Nexus: Guru Explains Gens X, Y, Boomer To One Another, which is smeared across three slow-loading pages on the Post’s site. Gawker’s ‘Generational Consultant’ Holds America’s Fakest Job is shorter, funnier, has a better hed, and fits on a single page.
“Generational consultant” Anne Loehr’s “generational cheat sheet”, which should have been a single-page table, takes up another heavily-monetized five pages of the Post’s site. Shapira makes no apologies for that, presumably because he knows the publicity is good for her. After all, if someone gives a generational seminar and the Post doesn’t show up, has it made a sound?
Current law basically allows the Gawkers of the world to appropriate others’ work, repurpose it and sell ads against it with no payment to or legal recourse for the company that paid me while I sat through two hours of a generational seminar.
Maybe Shapira and his editors should consider whether that time might have been better spent on a different story. I hear things are a little tight in the Post newsroom these days.
They want to amend the copyright law so that it restores “unfair competition rights” — which once gave us the power to sue rivals if our stories were being pirated. That change would give news organizations rights that they could enforce in court if “parasitic” free-rider Web sites (the heavy excerpters) refused to bargain with them for a fee or a contract. Marburger said media outlets could seek an order requiring the free-rider to postpone its commercial use or even hand over some advertising revenue linked to the free-riding.
No one objects to copyright protection. OK, almost no one. But the Post already has that. They want something bigger.
What’s on the minds of the traditional media is not plagiarizers and “parasites”. They have Google in their sights. And they need an extension/reification of the “hot news” doctrine that will allow them to have a monopoly on the facts for a period of time greater than zero.
I’m not interested in rewriting copyright and antitrust law to save the occasional baby in all the bathwater the major metros print every day.
How can the Gawker article be considered “unfair competition” when it increased the audience for Shapira’s article on the Post’s site? Because Gawker’s very existence is unfair competition.
The Post just completed its fourth round of buyouts since 2003; and although the company reported on Friday that it had returned to profitability in the second quarter, the newspaper division, which is pretty much us, continues losing money. Standard & Poor’s expects that the company’s gross earnings will drop by 30 percent this year. Gawker Media, on the other hand, reported last week that its revenues in the first two quarters of 2009 were up 45 percent from the first two quarters of last year. …
After all the reporting, it took me about a day to write the 1,500-word piece. How long did it take Gawker to rewrite and republish it, cherry-pick the funniest quotes, sell ads against it and ultimately reap 9,500 (and counting) page views?