Linus Torvalds once said "Talk is cheap. Show me the code." I am cheap, so I'll just talk. If you're a restless developer looking for something to do, you may wish to take this project on yourself. If you do, there are two possible outcomes:
- You spend a few afternoons improving your web development skills.
- You save the internet.
The RSS Reader for the End Times
RSS is dead, and soon we will be too. With massive corporations eating us all alive, it's easy to see why the internet isn't quite as playful as it used to be.
There are plenty of people working hard to improve the situation, but more need to join the fight. RSS may be dead, but it can be revived.
Our goal with this project is to solve a fundamental problem: content on large networks wins because it's easy to access, not because it's better. If we can make it easy to access content that lives on personal sites and other niches of the internet, we can level the playing field and let the best content get eyeballs without creators having to go where the people are.
It's not about RSS
First, let's be clear: RSS itself is on the decline. Whatever the XML nitty-gritty, you can't expect every source of content to have an RSS feed available. What I'm saying is that if you want to Syndicate, it won't be Really Simple.
Fortunately, if you're clever, it's a solvable problem. Although the feed format is no longer standardized across websites, it is usually still internally consistent within a single source. If I want to get a feed of content from jakearchibald.com, I can just scrape it myself. Searching the HTML of the home page for elements with the selector .h-2 > a gives me a set of links to the ten most recent posts. If I'm really clever, I could probably even figure out how to get the next ten too.
If you're willing to get your hands dirty and start scraping, you can generate feeds for a lot of content in a lot of places, even if RSS isn't there to do the heavy lifting.
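To make that concrete, here's a minimal sketch of a selector-driven scraper. It assumes Node 18+ (for the global fetch) and the cheerio package; the FeedItem shape and the scrapeFeed name are mine, not part of any existing reader:

```ts
// A minimal sketch: turn a stored CSS selector into a feed of links.
// The selector for jakearchibald.com comes from the article above.
import * as cheerio from "cheerio";

interface FeedItem {
  title: string;
  url: string;
}

async function scrapeFeed(siteUrl: string, selector: string): Promise<FeedItem[]> {
  const res = await fetch(siteUrl);
  const $ = cheerio.load(await res.text());

  return $(selector)
    .toArray()
    .map((el) => {
      const link = $(el);
      return {
        title: link.text().trim(),
        // Resolve relative hrefs against the site's base URL.
        url: new URL(link.attr("href") ?? "", siteUrl).toString(),
      };
    });
}

// Usage: the ten most recent posts, per the selector above.
scrapeFeed("https://jakearchibald.com", ".h-2 > a").then(console.log);
```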
Selecting elements is hard
Figuring out how to scrape each website differently to find its content is no small task. Fortunately, ad blockers already have the solution: a browser extension with an "element picker" can help users generate the selector themselves. Plus, once one person figures it out, you can store that selector in a database somewhere to help out others who want to watch the same feed.
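To give a feel for it, here's a sketch of what one of those stored "recipes" might look like; every field name here is hypothetical:

```ts
// A hypothetical record for the shared selector database: one user's
// element picker produces it, and everyone else gets to reuse it.
interface ScrapeRecipe {
  site: string;           // e.g. "jakearchibald.com"
  itemSelector: string;   // e.g. ".h-2 > a", as picked by a user
  submittedBy: string;    // credit where credit is due
  confirmedWorking: Date; // re-verify recipes as sites change layouts
}

// Check for a shared recipe before asking the user to pick elements.
const recipes = new Map<string, ScrapeRecipe>();

function recipeFor(site: string): ScrapeRecipe | undefined {
  return recipes.get(site);
}
```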
For some services, it may also make sense to allow users to sign in with OAuth. Maybe a user wants a mirror of their YouTube subscriptions in their RSS reader. No problem! Just sign in with Google!
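As a concrete example, the sketch below pulls the signed-in user's subscriptions from the real YouTube Data API v3 subscriptions.list endpoint, then maps each channel to the Atom feed YouTube still publishes per channel. The function name is mine, and error handling and pagination are omitted:

```ts
// After the user signs in with Google (scope: youtube.readonly), list
// their subscriptions and map each channel to its still-existing feed.
async function youtubeFeeds(accessToken: string): Promise<string[]> {
  const res = await fetch(
    "https://www.googleapis.com/youtube/v3/subscriptions" +
      "?part=snippet&mine=true&maxResults=50",
    { headers: { Authorization: `Bearer ${accessToken}` } }
  );
  const data = await res.json();

  // YouTube serves an Atom feed for every channel at this URL.
  return data.items.map(
    (item: any) =>
      "https://www.youtube.com/feeds/videos.xml?channel_id=" +
      item.snippet.resourceId.channelId
  );
}
```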
It is about relationships (and 💰)
One of the benefits of a centralized system is that it allows for relationships between people who make online content (like articles, comics, and videos) and those who consume it. These relationships can include internet comments, or, surprisingly, money! Patreon is shockingly successful because users want to pay people they appreciate.
An RSS Reader for the End Times understands this and facilitates it. Each item in the feed can have a public comment section attached and, more importantly, a way to support the creator.
Imagine that you're subscribed to 100 feeds and want to support all the creators. Each month, you want to donate $20 and divvy it up among them. An RSS reader with donation functionality can facilitate this.
Every creator who likes money can sign up to receive payouts when users who are subscribed to their content choose to donate. Then, each donation is split up among the feeds (as dictated by the donating user), and once a creator has a few dollars of total donations in the system they can cash out at their leisure.
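The split itself is just careful arithmetic: work in integer cents so rounding never loses part of a donation. A sketch, with hypothetical names:

```ts
// Divide one donation among creators by the weights the donating user
// chose. Working in cents keeps the math exact.
function splitDonation(
  totalCents: number,
  weights: Map<string, number> // creatorId -> relative weight
): Map<string, number> {
  const totalWeight = [...weights.values()].reduce((a, b) => a + b, 0);
  const shares = new Map<string, number>();
  let allocated = 0;

  for (const [creator, weight] of weights) {
    const share = Math.floor((totalCents * weight) / totalWeight);
    shares.set(creator, share);
    allocated += share;
  }

  // Give the rounding remainder (at most a few cents) to the first
  // creator so the full donation is always paid out.
  const first = weights.keys().next().value;
  if (first !== undefined) {
    shares.set(first, (shares.get(first) ?? 0) + (totalCents - allocated));
  }
  return shares;
}

// $20 across 100 equally weighted feeds comes to 20 cents per creator.
```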
If you want your RSS reader to be financially viable without selling users' data or resorting to ads, you could probably even take a little cut yourself. 😉
Top comments (3)
IDK if RSS is really dead; I use it to get news because I love getting it all in one app. But I think one of the main differences between content-delivery sites like Medium and RSS readers is AI. Medium wouldn't be different from the others if it only showed articles from blogs I've already subscribed to; the main difference is that it shows content from blogs I'm not subscribed to yet. So adding AI to RSS readers would definitely make them better IMHO.
"If I want to get a feed of content from jakearchibald.com, I can just scrape it myself. Searching the HTML of the home page for elements with the selector .h-2 > a gives me a set of links to the ten most recent posts."
But jakearchibald.com/posts.rss though.
Also, comments are solved with WebMention. Basically, you write a post on your blog as a reply, and then WebMention is a protocol for telling their site about it. However, it seems like it would be very easy for spammers to attack, so the receiving site should still use moderation, Akismet, or other spam-prevention techniques.
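Sending one is just two HTTP requests: discover the endpoint the target advertises, then POST the source and target URLs as a form. A rough sketch (it only checks the HTTP Link header; the spec also allows discovery via link and a tags in the page):

```ts
// Minimal WebMention send, per the W3C spec: find the target's
// advertised endpoint, then POST form-encoded `source` and `target`.
async function sendWebMention(source: string, target: string): Promise<void> {
  const res = await fetch(target);
  const linkHeader = res.headers.get("link") ?? "";

  // Matches e.g. <https://example.com/webmention>; rel="webmention"
  const match = linkHeader.match(/<([^>]+)>\s*;\s*rel="?webmention"?/);
  if (!match) throw new Error("no webmention endpoint advertised");

  const endpoint = new URL(match[1], target).toString();
  await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({ source, target }).toString(),
  });
}
```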