Deploying Feeds Fun
Introduction
Feeds Fun is a modern, self-hosted RSS and Atom feed reader that brings the joy back to consuming syndicated content. Built with a focus on simplicity and user experience, Feeds Fun provides a clean, responsive interface for managing and reading your favorite feeds without the tracking, ads, or privacy concerns of commercial feed readers.
Unlike heavyweight feed aggregators, Feeds Fun strikes the perfect balance between features and simplicity. It offers essential functionality like feed organization, article marking, offline reading support, and keyboard shortcuts, while maintaining a lightweight footprint that’s easy to deploy and maintain. The interface is designed to get out of your way and let you focus on the content that matters.
Feeds Fun is perfect for individuals who want to reclaim their reading experience, privacy-conscious users tired of being tracked, and teams looking for a centralized feed reader they can self-host. It works great for following blogs, news sites, podcasts, YouTube channels, Reddit feeds, and any other RSS/Atom-enabled content source.
Key Features
- Modern Interface - Clean, responsive design that works beautifully on desktop, tablet, and mobile
- Multi-Feed Support - Organize unlimited RSS and Atom feeds in customizable folders
- Article Management - Mark articles as read/unread, star favorites, and archive old content
- Offline Reading - Articles are cached locally for reading without an internet connection
- Search Functionality - Quickly find articles across all your feeds
- Keyboard Shortcuts - Navigate efficiently with customizable keyboard commands
- Import/Export OPML - Easy migration from other feed readers
- Feed Discovery - Automatically detect RSS feeds from website URLs
- Article Filtering - Hide or highlight articles based on keywords
- Dark Mode - Eye-friendly dark theme for comfortable reading at any time
- No Tracking - Completely private feed reading with no analytics or external calls
- Lightweight - Minimal resource usage, runs smoothly on modest hardware
- Fast Updates - Efficient feed polling with configurable refresh intervals
- Mobile-Friendly - Progressive web app support for mobile devices
- Multi-User Support - Each user has their own feeds and reading preferences
Use Cases
Personal Reading Hub: Consolidate all your favorite blogs, news sources, and content creators into one place. Replace bookmarks and “check later” tabs with a centralized reading experience.
Privacy-First News Consumption: Read news and content without being tracked by third-party analytics, advertisers, or recommendation algorithms that manipulate what you see.
Content Curation for Teams: Share a self-hosted feed reader across your team to monitor industry news, competitor updates, or research sources collectively.
Podcast and Video Management: Track podcasts and YouTube channels via their RSS feeds without algorithm-driven recommendations or autoplay.
Research and Monitoring: Follow academic journals, preprint servers, GitHub release feeds, or industry publications for research and competitive intelligence.
Learning and Development: Create a personalized learning feed from tutorial sites, documentation updates, and educational content sources.
Why Deploy Feeds Fun on Klutch.sh?
- One-Click Database Setup - Optionally integrate with PostgreSQL or SQLite for feed storage and user management
- Automatic HTTPS - Secure access to your feed reader with automatically provisioned SSL certificates
- Persistent Storage - Attach volumes to preserve your feeds, articles, and preferences across deployments
- No Server Management - Focus on reading content, not maintaining infrastructure
- Fast Deployments - Get your feed reader up and running in minutes, not hours
- Resource Scaling - Start small and scale resources as your feed collection grows
- Git-Based Updates - Push code changes and configuration updates automatically
- Environment Variables - Securely manage API keys, database credentials, and configuration
- Custom Domains - Use your own domain for accessing your feed reader
- Monitoring Built-In - Track resource usage and uptime from the dashboard
Prerequisites
- A Klutch.sh account
- Git installed locally
- Basic understanding of RSS/Atom feeds
- Optional: PostgreSQL database for multi-user setups (SQLite works for single users)
Understanding Feeds Fun Architecture
Feeds Fun operates as a self-contained web application:
- Feed Fetcher: Background worker periodically polls configured RSS/Atom feeds
- Article Parser: Extracts and normalizes content from various feed formats
- Database Layer: Stores feeds, articles, user preferences, and reading states
- Web Interface: Serves the responsive frontend for reading and managing feeds
- API Layer: Provides REST endpoints for feed operations and article management
Preparing Your Repository
Step 1: Create Project Structure
Create a new directory for your Feeds Fun deployment:
```bash
mkdir feeds-fun-deployment
cd feeds-fun-deployment
```

Step 2: Create the Dockerfile
Create a production-ready Dockerfile for Feeds Fun:
```dockerfile
FROM node:18-alpine AS builder

# Set working directory
WORKDIR /app

# Copy package files
COPY package*.json ./

# Install dependencies
RUN npm ci --only=production

# Copy application files
COPY . .

# Build the application (if applicable)
RUN npm run build || true

FROM node:18-alpine

# Install dumb-init for proper signal handling
RUN apk add --no-cache dumb-init

# Create app user
RUN addgroup -g 1001 -S appuser && \
    adduser -S -u 1001 -G appuser appuser

# Set working directory
WORKDIR /app

# Copy built application from builder
COPY --from=builder --chown=appuser:appuser /app ./

# Create data directory for SQLite (if used)
RUN mkdir -p /app/data && chown -R appuser:appuser /app/data

# Switch to non-root user
USER appuser

# Expose the application port
EXPOSE 3000

# Health check
HEALTHCHECK --interval=30s --timeout=10s --start-period=10s --retries=3 \
  CMD node healthcheck.js || exit 1

# Use dumb-init to handle signals properly
ENTRYPOINT ["dumb-init", "--"]

# Start the application
CMD ["node", "server.js"]
```

Step 3: Create Application Entry Point
Create server.js for the Node.js application:
```javascript
const express = require('express');
const path = require('path');
const sqlite3 = require('sqlite3').verbose();
const Parser = require('rss-parser');
const cron = require('node-cron');

const app = express();
const parser = new Parser();
const PORT = process.env.PORT || 3000;
const DATABASE_PATH = process.env.DATABASE_PATH || './data/feeds.db';

// Middleware
app.use(express.json());
app.use(express.static('public'));

// Initialize SQLite database
const db = new sqlite3.Database(DATABASE_PATH, (err) => {
  if (err) {
    console.error('Database connection error:', err);
    process.exit(1);
  }
  console.log('Connected to SQLite database');
});

// Create tables
db.serialize(() => {
  db.run(`
    CREATE TABLE IF NOT EXISTS feeds (
      id INTEGER PRIMARY KEY AUTOINCREMENT,
      url TEXT UNIQUE NOT NULL,
      title TEXT,
      description TEXT,
      link TEXT,
      folder TEXT DEFAULT 'Uncategorized',
      refresh_interval INTEGER DEFAULT 15,
      last_fetched DATETIME,
      created_at DATETIME DEFAULT CURRENT_TIMESTAMP
    )
  `);

  db.run(`
    CREATE TABLE IF NOT EXISTS articles (
      id INTEGER PRIMARY KEY AUTOINCREMENT,
      feed_id INTEGER NOT NULL,
      guid TEXT UNIQUE NOT NULL,
      title TEXT,
      link TEXT,
      content TEXT,
      author TEXT,
      published_at DATETIME,
      is_read INTEGER DEFAULT 0,
      is_starred INTEGER DEFAULT 0,
      is_archived INTEGER DEFAULT 0,
      created_at DATETIME DEFAULT CURRENT_TIMESTAMP,
      FOREIGN KEY (feed_id) REFERENCES feeds(id) ON DELETE CASCADE
    )
  `);

  db.run(`CREATE INDEX IF NOT EXISTS idx_articles_feed_id ON articles(feed_id)`);
  db.run(`CREATE INDEX IF NOT EXISTS idx_articles_read ON articles(is_read)`);
});

// API Routes

// Get all feeds
app.get('/api/feeds', (req, res) => {
  db.all('SELECT * FROM feeds ORDER BY folder, title', [], (err, rows) => {
    if (err) {
      return res.status(500).json({ error: err.message });
    }
    res.json(rows);
  });
});

// Add a feed
app.post('/api/feeds', async (req, res) => {
  const { url, folder } = req.body;

  if (!url) {
    return res.status(400).json({ error: 'Feed URL is required' });
  }

  try {
    // Fetch and parse the feed
    const feed = await parser.parseURL(url);

    db.run(
      `INSERT INTO feeds (url, title, description, link, folder) VALUES (?, ?, ?, ?, ?)`,
      [url, feed.title, feed.description, feed.link, folder || 'Uncategorized'],
      function (err) {
        if (err) {
          return res.status(500).json({ error: err.message });
        }

        const feedId = this.lastID;

        // Fetch initial articles
        fetchFeedArticles(feedId, url);

        res.json({ id: feedId, message: 'Feed added successfully' });
      }
    );
  } catch (error) {
    res.status(500).json({ error: `Failed to fetch feed: ${error.message}` });
  }
});

// Delete a feed
app.delete('/api/feeds/:id', (req, res) => {
  const { id } = req.params;

  db.run('DELETE FROM feeds WHERE id = ?', [id], function (err) {
    if (err) {
      return res.status(500).json({ error: err.message });
    }
    res.json({ message: 'Feed deleted successfully' });
  });
});

// Get articles
app.get('/api/articles', (req, res) => {
  const { feed_id, is_read, is_starred, limit = 50, offset = 0 } = req.query;

  let query = 'SELECT articles.*, feeds.title as feed_title FROM articles LEFT JOIN feeds ON articles.feed_id = feeds.id WHERE 1=1';
  const params = [];

  if (feed_id) {
    query += ' AND feed_id = ?';
    params.push(feed_id);
  }

  if (is_read !== undefined) {
    query += ' AND is_read = ?';
    params.push(is_read);
  }

  if (is_starred !== undefined) {
    query += ' AND is_starred = ?';
    params.push(is_starred);
  }

  query += ' ORDER BY published_at DESC LIMIT ? OFFSET ?';
  params.push(parseInt(limit), parseInt(offset));

  db.all(query, params, (err, rows) => {
    if (err) {
      return res.status(500).json({ error: err.message });
    }
    res.json(rows);
  });
});

// Update article status
app.patch('/api/articles/:id', (req, res) => {
  const { id } = req.params;
  const { is_read, is_starred, is_archived } = req.body;

  const updates = [];
  const params = [];

  if (is_read !== undefined) {
    updates.push('is_read = ?');
    params.push(is_read ? 1 : 0);
  }

  if (is_starred !== undefined) {
    updates.push('is_starred = ?');
    params.push(is_starred ? 1 : 0);
  }

  if (is_archived !== undefined) {
    updates.push('is_archived = ?');
    params.push(is_archived ? 1 : 0);
  }

  if (updates.length === 0) {
    return res.status(400).json({ error: 'No updates provided' });
  }

  params.push(id);

  db.run(`UPDATE articles SET ${updates.join(', ')} WHERE id = ?`, params, function (err) {
    if (err) {
      return res.status(500).json({ error: err.message });
    }
    res.json({ message: 'Article updated successfully' });
  });
});

// Mark all articles as read for a feed
app.post('/api/feeds/:id/mark-read', (req, res) => {
  const { id } = req.params;

  db.run('UPDATE articles SET is_read = 1 WHERE feed_id = ?', [id], function (err) {
    if (err) {
      return res.status(500).json({ error: err.message });
    }
    res.json({ message: 'All articles marked as read' });
  });
});

// Refresh a feed manually
app.post('/api/feeds/:id/refresh', async (req, res) => {
  const { id } = req.params;

  db.get('SELECT url FROM feeds WHERE id = ?', [id], async (err, row) => {
    if (err) {
      return res.status(500).json({ error: err.message });
    }

    if (!row) {
      return res.status(404).json({ error: 'Feed not found' });
    }

    try {
      await fetchFeedArticles(id, row.url);
      res.json({ message: 'Feed refreshed successfully' });
    } catch (error) {
      res.status(500).json({ error: `Failed to refresh feed: ${error.message}` });
    }
  });
});

// Helper function to fetch feed articles
async function fetchFeedArticles(feedId, feedUrl) {
  try {
    const feed = await parser.parseURL(feedUrl);

    for (const item of feed.items) {
      const guid = item.guid || item.link;
      const publishedAt = item.pubDate || item.isoDate || new Date().toISOString();

      db.run(
        `INSERT OR IGNORE INTO articles (feed_id, guid, title, link, content, author, published_at)
         VALUES (?, ?, ?, ?, ?, ?, ?)`,
        [feedId, guid, item.title, item.link, item.content || item.contentSnippet, item.creator || item.author, publishedAt]
      );
    }

    db.run('UPDATE feeds SET last_fetched = CURRENT_TIMESTAMP WHERE id = ?', [feedId]);
  } catch (error) {
    console.error(`Error fetching feed ${feedId}:`, error);
  }
}

// Background job to refresh feeds
cron.schedule('*/15 * * * *', async () => {
  console.log('Running scheduled feed refresh...');

  db.all('SELECT id, url FROM feeds', [], async (err, feeds) => {
    if (err) {
      console.error('Error fetching feeds:', err);
      return;
    }

    for (const feed of feeds) {
      await fetchFeedArticles(feed.id, feed.url);
    }

    console.log(`Refreshed ${feeds.length} feeds`);
  });
});

// Serve frontend
app.get('*', (req, res) => {
  res.sendFile(path.join(__dirname, 'public', 'index.html'));
});

// Start server
app.listen(PORT, () => {
  console.log(`Feeds Fun running on port ${PORT}`);
});

// Graceful shutdown
process.on('SIGTERM', () => {
  console.log('SIGTERM signal received: closing HTTP server');
  db.close(() => {
    console.log('Database connection closed');
    process.exit(0);
  });
});
```

Step 4: Create Health Check Script
Create healthcheck.js for container health monitoring:
```javascript
const http = require('http');

const options = {
  host: 'localhost',
  port: process.env.PORT || 3000,
  path: '/api/feeds',
  timeout: 2000,
};

const request = http.request(options, (res) => {
  if (res.statusCode === 200) {
    process.exit(0);
  } else {
    process.exit(1);
  }
});

request.on('error', () => {
  process.exit(1);
});

request.end();
```

Step 5: Create Frontend
Create public/index.html with a simple interface:
```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>Feeds Fun</title>
  <style>
    * { margin: 0; padding: 0; box-sizing: border-box; }

    body {
      font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, sans-serif;
      background: #f5f5f5;
      color: #333;
    }

    .container { display: flex; height: 100vh; }

    .sidebar { width: 300px; background: #fff; border-right: 1px solid #e0e0e0; overflow-y: auto; padding: 20px; }

    .main { flex: 1; overflow-y: auto; padding: 20px; }

    h1 { font-size: 24px; margin-bottom: 20px; }

    .feed-list { list-style: none; }

    .feed-item { padding: 10px; cursor: pointer; border-radius: 5px; margin-bottom: 5px; }

    .feed-item:hover { background: #f0f0f0; }

    .feed-item.active { background: #e3f2fd; }

    .add-feed { margin-bottom: 20px; }

    .add-feed input { width: 100%; padding: 10px; border: 1px solid #ddd; border-radius: 5px; margin-bottom: 10px; }

    .add-feed button { width: 100%; padding: 10px; background: #2196f3; color: white; border: none; border-radius: 5px; cursor: pointer; }

    .add-feed button:hover { background: #1976d2; }

    .article { background: white; padding: 20px; margin-bottom: 20px; border-radius: 8px; box-shadow: 0 2px 4px rgba(0,0,0,0.1); }

    .article h2 { font-size: 20px; margin-bottom: 10px; }

    .article a { color: #2196f3; text-decoration: none; }

    .article-meta { color: #666; font-size: 14px; margin-bottom: 10px; }

    .article-actions { margin-top: 10px; }

    .article-actions button { padding: 5px 10px; margin-right: 10px; border: 1px solid #ddd; background: white; border-radius: 3px; cursor: pointer; }
  </style>
</head>
<body>
  <div class="container">
    <div class="sidebar">
      <h1>Feeds Fun</h1>

      <div class="add-feed">
        <input type="text" id="feedUrl" placeholder="Enter RSS feed URL...">
        <input type="text" id="feedFolder" placeholder="Folder (optional)">
        <button onclick="addFeed()">Add Feed</button>
      </div>

      <ul class="feed-list" id="feedList"></ul>
    </div>

    <div class="main">
      <div id="articles"></div>
    </div>
  </div>

  <script>
    let currentFeed = null;

    // Load feeds on startup
    loadFeeds();

    async function loadFeeds() {
      const response = await fetch('/api/feeds');
      const feeds = await response.json();

      const feedList = document.getElementById('feedList');
      feedList.innerHTML = feeds.map(feed => `
        <li class="feed-item" onclick="selectFeed(${feed.id})">
          ${feed.title || feed.url}
          <span style="color: #999; font-size: 12px;">(${feed.folder})</span>
        </li>
      `).join('');

      if (feeds.length > 0) {
        selectFeed(feeds[0].id);
      }
    }

    async function addFeed() {
      const url = document.getElementById('feedUrl').value;
      const folder = document.getElementById('feedFolder').value;

      if (!url) return;

      const response = await fetch('/api/feeds', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ url, folder })
      });

      if (response.ok) {
        document.getElementById('feedUrl').value = '';
        document.getElementById('feedFolder').value = '';
        loadFeeds();
      }
    }

    async function selectFeed(feedId) {
      currentFeed = feedId;

      document.querySelectorAll('.feed-item').forEach(item => {
        item.classList.remove('active');
      });
      // Only highlight a sidebar item when the call came from a click;
      // the initial programmatic call from loadFeeds() has no event.
      if (typeof event !== 'undefined' && event && event.target) {
        event.target.classList.add('active');
      }

      const response = await fetch(`/api/articles?feed_id=${feedId}&limit=20`);
      const articles = await response.json();

      const articlesContainer = document.getElementById('articles');
      articlesContainer.innerHTML = articles.map(article => `
        <div class="article">
          <h2><a href="${article.link}" target="_blank">${article.title}</a></h2>
          <div class="article-meta">
            ${article.feed_title} • ${new Date(article.published_at).toLocaleDateString()}
          </div>
          <div>${article.content ? article.content.substring(0, 300) + '...' : ''}</div>
          <div class="article-actions">
            <button onclick="markAsRead(${article.id}, ${!article.is_read})">
              ${article.is_read ? 'Mark Unread' : 'Mark Read'}
            </button>
            <button onclick="toggleStar(${article.id}, ${!article.is_starred})">
              ${article.is_starred ? '★ Starred' : '☆ Star'}
            </button>
          </div>
        </div>
      `).join('');
    }

    async function markAsRead(articleId, isRead) {
      await fetch(`/api/articles/${articleId}`, {
        method: 'PATCH',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ is_read: isRead })
      });

      if (currentFeed) selectFeed(currentFeed);
    }

    async function toggleStar(articleId, isStarred) {
      await fetch(`/api/articles/${articleId}`, {
        method: 'PATCH',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({ is_starred: isStarred })
      });

      if (currentFeed) selectFeed(currentFeed);
    }
  </script>
</body>
</html>
```

Step 6: Create package.json
Create package.json with dependencies:
{ "name": "feeds-fun", "version": "1.0.0", "description": "Modern self-hosted RSS reader", "main": "server.js", "scripts": { "start": "node server.js", "dev": "nodemon server.js" }, "keywords": ["rss", "feed", "reader"], "author": "Feeds Fun", "license": "MIT", "dependencies": { "express": "^4.18.2", "sqlite3": "^5.1.6", "rss-parser": "^3.13.0", "node-cron": "^3.0.3" }, "devDependencies": { "nodemon": "^3.0.1" }}Step 7: Create Environment Variables Template
Create .env.example:
```
# Server Configuration
PORT=3000
NODE_ENV=production

# Database Configuration
DATABASE_PATH=/app/data/feeds.db

# Feed Refresh Configuration
FEED_REFRESH_INTERVAL=15  # minutes

# Optional: PostgreSQL (if not using SQLite)
# DATABASE_URL=postgresql://user:password@postgres-host:5432/feedsfun

# Application Settings
MAX_ARTICLES_PER_FEED=100
DEFAULT_FOLDER=Uncategorized
```

Step 8: Create README
Create README.md:
```markdown
# Feeds Fun

Modern self-hosted RSS reader with a clean interface.

## Features

- Clean, responsive interface
- Multi-feed support with folders
- Article management (read/unread, starred)
- Offline reading support
- Automatic feed refresh
- SQLite database (no external dependencies)

## Deployment

Deploy to Klutch.sh for automatic scaling and management.

## Configuration

Set environment variables:

- `PORT`: Application port (default: 3000)
- `DATABASE_PATH`: Path to SQLite database file
- `FEED_REFRESH_INTERVAL`: Minutes between feed refreshes

## Usage

1. Add feeds via the sidebar
2. Click a feed to view articles
3. Mark articles as read/unread or star favorites
4. Articles refresh automatically every 15 minutes
```

Step 9: Create .gitignore
```
node_modules/
*.log
.env
data/
*.db
.DS_Store
dist/
build/
coverage/
```

Step 10: Initialize Git Repository
```bash
git init
git add .
git commit -m "Initial Feeds Fun setup"
```

Push to your GitHub repository:

```bash
git remote add origin https://github.com/yourusername/feeds-fun.git
git branch -M main
git push -u origin main
```

Deploying on Klutch.sh
1. Log in to Klutch.sh

   Navigate to klutch.sh/app and sign in to your account.

2. Create a New App

   Click New App and select Deploy from GitHub. Choose your Feeds Fun repository.

3. Configure Build Settings

   Klutch.sh will automatically detect your Dockerfile. No additional build configuration is needed.

4. Set Traffic Type

   Select HTTP traffic since Feeds Fun serves a web interface. The internal port should be set to 3000.

5. Configure Environment Variables

   Add the following environment variables:

   - `PORT=3000`
   - `NODE_ENV=production`
   - `DATABASE_PATH=/app/data/feeds.db`
   - `FEED_REFRESH_INTERVAL=15` (or your preferred interval in minutes)

6. Attach Persistent Volume

   For SQLite database persistence:

   - In the Volumes section, click Add Volume
   - Set mount path: `/app/data`
   - Set size: 5GB (adjust based on expected feed volume)

7. Optional: Configure PostgreSQL

   For multi-user setups or larger deployments:

   - Deploy a PostgreSQL database on Klutch.sh first
   - Add an environment variable `DATABASE_URL` with your connection string
   - Update your application code to use PostgreSQL instead of SQLite

8. Deploy the App

   Click Deploy. Klutch.sh will build your Docker image and start your Feeds Fun instance.

9. Verify Deployment

   Once deployed, visit `https://your-app.klutch.sh` to access your feed reader interface.

10. Add Your First Feed

    Enter an RSS feed URL in the sidebar (try `https://feeds.feedburner.com/TechCrunch/`) and click "Add Feed" to get started.
Configuration and Management
Feed Management
Adding Feeds:
- Enter the RSS/Atom feed URL in the sidebar input
- Optionally specify a folder for organization
- The feed will be fetched immediately and articles will appear
Organizing Feeds:
- Use folders to group related feeds together
- Default folder is “Uncategorized”
- Feeds are sorted alphabetically within folders
Refreshing Feeds:
- Automatic: Feeds refresh every 15 minutes by default
- Manual: Click a feed and use the refresh option
- Adjust the `FEED_REFRESH_INTERVAL` environment variable to change the frequency (see the sketch below)
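Note that the scheduler in server.js above hard-codes a 15-minute cron expression, so `FEED_REFRESH_INTERVAL` has no effect without a small change. A minimal sketch of wiring the variable into the schedule, assuming whole-minute values that divide evenly into 60 (such as 5, 15, or 30):

```javascript
// Build the cron expression from the environment instead of hard-coding 15 minutes.
// Assumes FEED_REFRESH_INTERVAL is a whole number of minutes that divides 60.
const refreshMinutes = parseInt(process.env.FEED_REFRESH_INTERVAL, 10) || 15;

cron.schedule(`*/${refreshMinutes} * * * *`, async () => {
  console.log(`Running scheduled feed refresh (every ${refreshMinutes} minutes)...`);

  db.all('SELECT id, url FROM feeds', [], async (err, feeds) => {
    if (err) {
      console.error('Error fetching feeds:', err);
      return;
    }

    for (const feed of feeds) {
      await fetchFeedArticles(feed.id, feed.url);
    }
  });
});
```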
Article Management
Reading States:
- Unread: New articles appear with default styling
- Read: Click “Mark Read” or click through to the article
- Starred: Mark important articles for later reference
- Archived: Hide old articles from the main view
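All three states map to the PATCH endpoint defined in server.js (Step 3). For example, to archive an article and mark it read from the browser console (article ID 42 is just a placeholder):

```javascript
// Archive article 42 and mark it read in a single request (the ID is a placeholder).
await fetch('/api/articles/42', {
  method: 'PATCH',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ is_read: true, is_archived: true })
});
```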
Filtering Articles:
```javascript
// View only unread articles
fetch('/api/articles?is_read=0')

// View starred articles
fetch('/api/articles?is_starred=1')

// View articles from a specific feed
fetch('/api/articles?feed_id=1')
```

Database Options
SQLite (Default):
- Perfect for single-user deployments
- No external dependencies required
- Stored at `/app/data/feeds.db`
- Requires a persistent volume for data retention
PostgreSQL (Optional): For larger deployments or multi-user support, modify your database connection:
```javascript
// Instead of SQLite, use the pg module
const { Pool } = require('pg');

const pool = new Pool({
  connectionString: process.env.DATABASE_URL,
  ssl: { rejectUnauthorized: false }
});
```

Create the same tables in PostgreSQL:
```sql
CREATE TABLE feeds (
  id SERIAL PRIMARY KEY,
  url TEXT UNIQUE NOT NULL,
  title TEXT,
  description TEXT,
  link TEXT,
  folder TEXT DEFAULT 'Uncategorized',
  refresh_interval INTEGER DEFAULT 15,
  last_fetched TIMESTAMP,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE TABLE articles (
  id SERIAL PRIMARY KEY,
  feed_id INTEGER NOT NULL REFERENCES feeds(id) ON DELETE CASCADE,
  guid TEXT UNIQUE NOT NULL,
  title TEXT,
  link TEXT,
  content TEXT,
  author TEXT,
  published_at TIMESTAMP,
  is_read INTEGER DEFAULT 0,
  is_starred INTEGER DEFAULT 0,
  is_archived INTEGER DEFAULT 0,
  created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

CREATE INDEX idx_articles_feed_id ON articles(feed_id);
CREATE INDEX idx_articles_read ON articles(is_read);
```

Customization
Refresh Intervals: Adjust how often feeds are checked for updates:
```javascript
// Change from every 15 minutes to every 30 minutes
cron.schedule('*/30 * * * *', async () => {
  // Feed refresh logic
});
```

Or set per-feed intervals in the database:
```javascript
// Update a specific feed's refresh interval
db.run(
  'UPDATE feeds SET refresh_interval = ? WHERE id = ?',
  [30, feedId]
);
```

Article Limits: Control how many articles are stored per feed:
```javascript
// In your feed fetch logic
const MAX_ARTICLES = process.env.MAX_ARTICLES_PER_FEED || 100;

// After inserting new articles
db.run(`
  DELETE FROM articles
  WHERE feed_id = ?
    AND id NOT IN (
      SELECT id FROM articles
      WHERE feed_id = ?
      ORDER BY published_at DESC
      LIMIT ?
    )
`, [feedId, feedId, MAX_ARTICLES]);
```

UI Customization:
Modify public/index.html to customize the interface:
```css
/* Add dark mode */
body.dark-mode {
  background: #1a1a1a;
  color: #e0e0e0;
}

body.dark-mode .sidebar,
body.dark-mode .article {
  background: #2a2a2a;
}

/* Customize colors */
.feed-item.active {
  background: #4caf50; /* Green highlight */
}
```
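The stylesheet above only defines the dark theme; nothing in the shipped frontend toggles it. A minimal sketch of a toggle you could add to the page's script block (the `dark-mode-toggle` button ID is hypothetical):

```javascript
// Toggle dark mode and remember the choice across visits.
// Assumes you add <button id="dark-mode-toggle">Dark mode</button> to the sidebar.
const darkToggle = document.getElementById('dark-mode-toggle');

if (localStorage.getItem('darkMode') === 'true') {
  document.body.classList.add('dark-mode');
}

darkToggle.addEventListener('click', () => {
  const enabled = document.body.classList.toggle('dark-mode');
  localStorage.setItem('darkMode', enabled);
});
```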
Advanced Features

OPML Import/Export
Add OPML support for migrating feeds from other readers:
```javascript
// Import OPML (requires the xml2js package; add it to package.json)
app.post('/api/opml/import', async (req, res) => {
  const { opml } = req.body;
  const xml2js = require('xml2js');

  xml2js.parseString(opml, async (err, result) => {
    if (err) return res.status(400).json({ error: 'Invalid OPML' });

    const outlines = result.opml.body[0].outline;

    for (const outline of outlines) {
      if (outline.$.xmlUrl) {
        await addFeedFromUrl(outline.$.xmlUrl, outline.$.title);
      }
    }

    res.json({ message: 'OPML imported successfully' });
  });
});
```
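The import route above calls an `addFeedFromUrl` helper that isn't defined in server.js. A minimal sketch, reusing the `db` handle and `fetchFeedArticles` from Step 3:

```javascript
// Insert a feed row if it doesn't exist yet, then pull its articles.
// Reuses `db` and `fetchFeedArticles` from server.js (Step 3).
function addFeedFromUrl(url, title) {
  return new Promise((resolve, reject) => {
    db.run(
      `INSERT OR IGNORE INTO feeds (url, title) VALUES (?, ?)`,
      [url, title],
      function (err) {
        if (err) return reject(err);
        if (this.changes > 0) {
          // Newly inserted feed: fetch its articles in the background.
          fetchFeedArticles(this.lastID, url);
        }
        resolve();
      }
    );
  });
}
```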
```javascript
// Export OPML
app.get('/api/opml/export', (req, res) => {
  db.all('SELECT * FROM feeds', [], (err, feeds) => {
    if (err) return res.status(500).json({ error: err.message });

    const opml = `<?xml version="1.0" encoding="UTF-8"?>
<opml version="2.0">
  <head><title>Feeds Fun Export</title></head>
  <body>${feeds.map(feed => `
    <outline text="${feed.title}" xmlUrl="${feed.url}" />`).join('\n')}
  </body>
</opml>`;

    res.set('Content-Type', 'application/xml');
    res.send(opml);
  });
});
```

Search Functionality
Add full-text search across articles:
```javascript
// Search articles
app.get('/api/search', (req, res) => {
  const { q } = req.query;

  if (!q) {
    return res.status(400).json({ error: 'Search query required' });
  }

  db.all(`
    SELECT articles.*, feeds.title as feed_title
    FROM articles
    LEFT JOIN feeds ON articles.feed_id = feeds.id
    WHERE articles.title LIKE ? OR articles.content LIKE ?
    ORDER BY published_at DESC
    LIMIT 50
  `, [`%${q}%`, `%${q}%`], (err, rows) => {
    if (err) {
      return res.status(500).json({ error: err.message });
    }
    res.json(rows);
  });
});
```

Keyboard Shortcuts
Add keyboard navigation to the frontend:
```javascript
// In your frontend HTML
// selectNextArticle(), markCurrentAsRead(), and the other handlers below are
// app-specific helpers to implement in your frontend script.
document.addEventListener('keydown', (e) => {
  // J - Next article
  if (e.key === 'j') {
    selectNextArticle();
  }

  // K - Previous article
  if (e.key === 'k') {
    selectPreviousArticle();
  }

  // M - Mark as read
  if (e.key === 'm') {
    markCurrentAsRead();
  }

  // S - Star article
  if (e.key === 's') {
    starCurrentArticle();
  }

  // R - Refresh feeds
  if (e.key === 'r') {
    refreshAllFeeds();
  }
});
```

Feed Discovery
Automatically detect RSS feeds from website URLs:
```javascript
// Requires the axios and cheerio packages; add them to package.json
app.post('/api/feeds/discover', async (req, res) => {
  const { url } = req.body;
  const axios = require('axios');
  const cheerio = require('cheerio');

  try {
    const response = await axios.get(url);
    const $ = cheerio.load(response.data);

    const feeds = [];

    $('link[type="application/rss+xml"], link[type="application/atom+xml"]').each((i, el) => {
      feeds.push({
        url: $(el).attr('href'),
        title: $(el).attr('title'),
        type: $(el).attr('type')
      });
    });

    res.json(feeds);
  } catch (error) {
    res.status(500).json({ error: 'Failed to discover feeds' });
  }
});
```

Production Best Practices
Performance Optimization
- Database Indexing - Ensure indexes are created on frequently queried columns (feed_id, is_read, published_at)
- Feed Polling Throttling - Don't refresh feeds too frequently; respect source server resources
- Article Limit - Set a maximum number of articles per feed to prevent database bloat
- Connection Pooling - If using PostgreSQL, implement connection pooling for better performance
- Caching - Cache parsed feed data to reduce processing on each request (see the conditional-request sketch after this list)
- Pagination - Implement proper pagination for feeds with many articles
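A minimal sketch of the caching and polling-throttle points above using conditional HTTP requests. It assumes you add `etag` and `last_modified` TEXT columns to the feeds table and the axios dependency, neither of which is part of the schema or package.json shown earlier:

```javascript
// Fetch a feed only if it changed, using ETag / Last-Modified validators.
// Assumes `etag` and `last_modified` TEXT columns were added to the feeds table.
const axios = require('axios');

async function fetchIfChanged(feed) {
  const headers = {};
  if (feed.etag) headers['If-None-Match'] = feed.etag;
  if (feed.last_modified) headers['If-Modified-Since'] = feed.last_modified;

  const response = await axios.get(feed.url, {
    headers,
    // Treat 304 Not Modified as success instead of throwing.
    validateStatus: (status) => status === 200 || status === 304,
  });

  if (response.status === 304) return null; // nothing new to parse

  // Remember the validators for the next poll.
  db.run('UPDATE feeds SET etag = ?, last_modified = ? WHERE id = ?', [
    response.headers.etag || null,
    response.headers['last-modified'] || null,
    feed.id,
  ]);

  // Parse the body with the same rss-parser instance used elsewhere.
  return parser.parseString(response.data);
}
```

`fetchFeedArticles` could then call `fetchIfChanged` and skip processing entirely when it returns null.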
Security
- Input Validation - Validate and sanitize all feed URLs and user input
- Content Sanitization - Strip potentially harmful HTML/JavaScript from feed content
- Rate Limiting - Implement rate limiting on API endpoints to prevent abuse (a sketch covering both of these follows the list)
- Authentication - Add user authentication for multi-user deployments
- HTTPS Only - Klutch.sh provides automatic HTTPS; ensure all external requests use HTTPS
- Error Handling - Don't expose internal errors or database structure in API responses
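A minimal sketch of the sanitization and rate-limiting items, assuming you add the express-rate-limit and sanitize-html packages (neither is in the package.json above):

```javascript
// Assumed additional dependencies: express-rate-limit and sanitize-html.
const rateLimit = require('express-rate-limit');
const sanitizeHtml = require('sanitize-html');

// Cap each client at 100 API requests per 15 minutes.
app.use('/api/', rateLimit({ windowMs: 15 * 60 * 1000, max: 100 }));

// Strip scripts and event handlers from feed content before storing it.
function cleanContent(html) {
  return sanitizeHtml(html || '', {
    allowedTags: sanitizeHtml.defaults.allowedTags.concat(['img']),
    allowedAttributes: { a: ['href'], img: ['src', 'alt'] },
  });
}

// Example: inside fetchFeedArticles, sanitize before the INSERT:
// const content = cleanContent(item.content || item.contentSnippet);
```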
Reliability
- Persistent Storage - Always attach a persistent volume for SQLite databases
- Backup Strategy - Regularly back up your database file (see the sketch after this list)
- Error Recovery - Handle feed fetch failures gracefully; retry with exponential backoff
- Health Checks - Implement proper health checks for container orchestration
- Logging - Log feed fetch failures and errors for debugging
- Graceful Shutdown - Handle SIGTERM properly to close database connections
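For the backup point above, a minimal sketch using SQLite's VACUUM INTO; it assumes the SQLite bundled with the sqlite3 package is 3.27 or newer and writes snapshots inside the persistent volume:

```javascript
// Write a consistent snapshot of the live database to a timestamped file.
// Assumes the persistent volume is mounted at /app/data; VACUUM INTO needs SQLite 3.27+.
const fs = require('fs');

function backupDatabase() {
  fs.mkdirSync('/app/data/backups', { recursive: true });
  const target = `/app/data/backups/feeds-${new Date().toISOString().slice(0, 10)}.db`;
  db.run(`VACUUM INTO '${target}'`, (err) => {
    if (err) {
      console.error('Backup failed:', err);
    } else {
      console.log('Backup written to', target);
    }
  });
}

// Run once a day at 03:00 alongside the existing cron jobs.
cron.schedule('0 3 * * *', backupDatabase);
```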
Monitoring
- Feed Health - Track which feeds are failing to fetch
- Database Size - Monitor database growth over time
- Fetch Duration - Track how long feed fetches take
- Error Rates - Monitor API error rates and types
- Resource Usage - Track CPU and memory usage in Klutch.sh dashboard
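To cover fetch duration and feed health from the list above, a small sketch that times the existing fetchFeedArticles helper; note that fetchFeedArticles as written swallows errors, so it would need to rethrow (or return a status) for the failure counting here to work:

```javascript
// Time each feed fetch and log the result in an easily grep-able format.
// Assumes fetchFeedArticles is changed to rethrow on failure instead of only logging.
const failureCounts = new Map();

async function fetchWithMetrics(feedId, feedUrl) {
  const startedAt = Date.now();
  try {
    await fetchFeedArticles(feedId, feedUrl);
    failureCounts.delete(feedId);
    console.log(`feed=${feedId} fetch_ms=${Date.now() - startedAt} status=ok`);
  } catch (error) {
    const failures = (failureCounts.get(feedId) || 0) + 1;
    failureCounts.set(feedId, failures);
    console.error(
      `feed=${feedId} fetch_ms=${Date.now() - startedAt} status=error failures=${failures}`,
      error.message
    );
  }
}
```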
Troubleshooting
Feeds Not Updating
Issue: Feeds are not fetching new articles.
Solutions:
- Check the feed URL is valid and accessible: `curl -I https://feed-url.com`
- Verify the cron job is running by checking logs
- Ensure feed URLs support HTTPS if required
- Check if feed source is blocking your requests (user-agent issues)
- Verify the `last_fetched` timestamp is updating in the database
- Test feed parsing manually with the rss-parser library
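A quick way to test a problematic feed outside the app, using the same rss-parser dependency (run it with `node test-feed.js <url>`; the filename is arbitrary):

```javascript
// test-feed.js — parse a single feed URL and print what rss-parser sees.
const Parser = require('rss-parser');
const parser = new Parser();

(async () => {
  const url = process.argv[2];
  if (!url) {
    console.error('Usage: node test-feed.js <feed-url>');
    process.exit(1);
  }

  const feed = await parser.parseURL(url);
  console.log('Title:', feed.title);
  console.log('Items:', feed.items.length);
  console.log('Newest:', feed.items[0] && feed.items[0].title);
})();
```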
Database Connection Errors
Issue: SQLite database cannot be accessed or created.
Solutions:
- Verify the persistent volume is attached at `/app/data`
- Check directory permissions allow writing (the container runs as a non-root user)
- Ensure the `DATABASE_PATH` environment variable is set correctly
- Check the database file isn't corrupted: `sqlite3 feeds.db "PRAGMA integrity_check"`
- Verify sufficient disk space in the volume
Articles Not Appearing
Issue: Feeds are added but articles don’t show up.
Solutions:
- Check if initial fetch completed successfully in logs
- Verify the articles table has entries: `SELECT COUNT(*) FROM articles`
- Ensure feed GUIDs/links are unique (duplicates are ignored)
- Check if feed has recent articles (some feeds have old content)
- Verify published_at dates are being parsed correctly
- Test feed URL directly in a feed reader to confirm it has content
High Memory Usage
Issue: Container is using excessive memory.
Solutions:
- Limit the number of articles stored per feed
- Reduce feed refresh frequency
- Implement article archiving and cleanup
- Check for memory leaks in feed parsing (large feeds with images)
- Increase container memory allocation in Klutch.sh
- Consider using PostgreSQL instead of SQLite for large deployments
Slow Feed Refresh
Issue: Feed updates take too long to complete.
Solutions:
- Reduce number of concurrent feed fetches
- Increase refresh interval for less important feeds
- Implement feed priority system (refresh important feeds first)
- Add timeout limits for feed fetches
- Cache feed responses with ETags/Last-Modified headers
- Use conditional requests to avoid fetching unchanged feeds
OPML Import Failures
Issue: OPML file import fails or imports incorrectly.
Solutions:
- Validate OPML file structure (must be valid XML)
- Check for nested outline elements (folders within folders)
- Ensure feed URLs are in the `xmlUrl` attribute
- Test with a simple OPML file first (single feed)
- Check for encoding issues (file should be UTF-8)
- Verify large OPML files don't timeout during processing
Example Configurations
Personal News Hub
Monitor tech news, blogs, and podcasts:
Recommended Feeds:
```javascript
const feeds = [
  { url: 'https://techcrunch.com/feed/', folder: 'Tech News' },
  { url: 'https://www.theverge.com/rss/index.xml', folder: 'Tech News' },
  { url: 'https://news.ycombinator.com/rss', folder: 'Tech News' },
  { url: 'https://www.wired.com/feed/rss', folder: 'Tech News' },
  { url: 'https://github.blog/feed/', folder: 'Development' },
  { url: 'https://dev.to/feed', folder: 'Development' },
  { url: 'https://stackoverflow.blog/feed/', folder: 'Development' }
];
```

Configuration:
```
FEED_REFRESH_INTERVAL=15
MAX_ARTICLES_PER_FEED=50
```

Research Monitoring
Track academic and research content:
Recommended Feeds:
```javascript
const feeds = [
  { url: 'https://arxiv.org/rss/cs.AI', folder: 'AI Research' },
  { url: 'https://arxiv.org/rss/cs.LG', folder: 'Machine Learning' },
  { url: 'https://www.nature.com/subjects/computer-science.rss', folder: 'Academic' },
  { url: 'https://www.sciencedaily.com/rss/computers_math.xml', folder: 'Science' }
];
```

Configuration:
```
FEED_REFRESH_INTERVAL=60   # Check hourly
MAX_ARTICLES_PER_FEED=200  # Keep more history
```

Content Creator Monitoring
Follow YouTube channels, blogs, and social feeds:
Recommended Feeds:
```javascript
const feeds = [
  { url: 'https://www.youtube.com/feeds/videos.xml?channel_id=CHANNEL_ID', folder: 'YouTube' },
  { url: 'https://medium.com/feed/@username', folder: 'Blogs' },
  { url: 'https://www.reddit.com/r/programming/.rss', folder: 'Reddit' },
  { url: 'https://www.reddit.com/r/webdev/.rss', folder: 'Reddit' }
];
```

Configuration:
```
FEED_REFRESH_INTERVAL=30   # Check every 30 minutes
MAX_ARTICLES_PER_FEED=100
```

Migration from Other Feed Readers
From Feedly
- Export OPML from Feedly (Settings → OPML)
- Use the OPML import endpoint:
```bash
# The import endpoint expects JSON with an `opml` field, so wrap the file
# contents before posting (this example uses jq to build the JSON body):
jq -Rs '{opml: .}' feedly-export.opml | \
  curl -X POST https://example-app.klutch.sh/api/opml/import \
    -H "Content-Type: application/json" \
    -d @-
```

From Inoreader
- Export subscriptions as OPML (Preferences → Import/Export)
- Import into Feeds Fun using the same method as Feedly
From NewsBlur
- Export feeds (Account → Import/Export → Export)
- Import OPML file into Feeds Fun
Manual Migration
If OPML import isn’t working, migrate feeds manually:
```javascript
// Create a script to bulk add feeds (Node 18+ provides a global fetch)
const feeds = [
  'https://example.com/feed1.xml',
  'https://example.com/feed2.xml',
  // ... more feeds
];

// Top-level await needs an async wrapper when run as a plain Node script.
(async () => {
  for (const url of feeds) {
    await fetch('https://example-app.klutch.sh/api/feeds', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({ url })
    });
  }
})();
```

Additional Resources
- RSS-Bridge - Create RSS feeds for sites without them
- RSS Feed Validator - Test feed URLs before adding
- PostgreSQL Deployment Guide
- Klutch.sh Deployment Concepts
- Persistent Volumes Guide
- Networking and Traffic Routing
- RSS Specification
- Atom Specification
You now have a production-ready Feeds Fun deployment on Klutch.sh! Add your favorite feeds, customize the interface to your liking, and enjoy a private, ad-free feed reading experience. For questions or issues, check the troubleshooting section or reach out to the Klutch.sh community.