A different way to read email newsletters.
news is a specialized email client designed for newsletter consumption. Following the UNIX philosophy, it consists of three orthogonal tools:
- Mail Downloader: Downloads emails to a Maildir directory (using existing tools)
- Story Extractor: Extracts stories from newsletter emails using AI
- UI Server: Serves a web interface to browse and read stories
Currently in early development stage.
See SKETCH.md for detailed vision and project plans.
The system uses standard email synchronization tools to download newsletters into a Maildir directory. You can use any of the following tools:
Install via your package manager:

```sh
# macOS
brew install isync

# Debian/Ubuntu
apt install isync
```

Create `~/.mbsyncrc`:
```
IMAPAccount newsletter
Host imap.example.com
User your-email@example.com
PassCmd "security find-generic-password -s mbsync-newsletter -w"
SSLType IMAPS

IMAPStore newsletter-remote
Account newsletter

MaildirStore newsletter-local
Path ~/Maildir/newsletters/
Inbox ~/Maildir/newsletters/INBOX

Channel newsletter
Far :newsletter-remote:
Near :newsletter-local:
Patterns *
Create Near
Sync All
```
Sync emails:

```sh
mbsync -a
```

Install offlineimap:
```sh
# macOS
brew install offlineimap

# Debian/Ubuntu
apt install offlineimap
```

Create `~/.offlineimaprc`:
```
[general]
accounts = newsletter

[Account newsletter]
localrepository = newsletter-local
remoterepository = newsletter-remote

[Repository newsletter-local]
type = Maildir
localfolders = ~/Maildir/newsletters

[Repository newsletter-remote]
type = IMAP
remotehost = imap.example.com
remoteuser = your-email@example.com
remotepass = your-password
ssl = yes
```

Sync emails:
```sh
offlineimap
```

Install fetchmail:
```sh
# macOS
brew install fetchmail

# Debian/Ubuntu
apt install fetchmail
```

Create `~/.fetchmailrc`:
```
poll imap.example.com
protocol IMAP
username "your-email@example.com"
password "your-password"
ssl
mda "/usr/bin/procmail -d %s"
```
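The `mda` line above hands each fetched message to procmail, which needs to know where the Maildir lives. A minimal `~/.procmailrc` might look like this (a sketch; the trailing slash on `DEFAULT` tells procmail to deliver in Maildir format rather than mbox, and the path is assumed to match the layout used elsewhere in this README):

```
# ~/.procmailrc
MAILDIR=$HOME/Maildir
# Trailing slash = Maildir-format delivery
DEFAULT=$HOME/Maildir/newsletters/
```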
The story extractor processes newsletter emails using AI to extract individual news stories.
The application uses a unified configuration system. Sources are applied in order of increasing precedence:

- Defaults
- Configuration file (`story-extractor.toml`)
- Environment variables
- Command-line flags
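The precedence order can be illustrated with a tiny helper (illustrative only, not the actual implementation; the `--model` flag in the example is hypothetical):

```go
package main

import "fmt"

// pick returns the last non-empty value, mirroring the precedence order:
// defaults < config file < environment variables < command-line flags.
func pick(values ...string) string {
	out := ""
	for _, v := range values {
		if v != "" {
			out = v
		}
	}
	return out
}

func main() {
	defaults := "gpt-4.1-mini" // built-in default
	file := ""                 // not set in story-extractor.toml
	env := "gpt-4o-mini"       // STORY_EXTRACTOR_LLM_MODEL
	flag := ""                 // hypothetical --model flag, not passed
	fmt.Println(pick(defaults, file, env, flag)) // prints gpt-4o-mini
}
```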
1. Config File

Create a `story-extractor.toml` file (default locations: `./story-extractor.toml` and `$HOME/story-extractor.toml`, or specify a path via `--config`):
```toml
# Global settings
maildir = "/path/to/maildir"
storydir = "/path/to/stories"
verbose = false

[llm]
provider = "openai"
model = "gpt-4.1-mini"  # reasoning models like gpt-5-mini are slower and more expensive
api_key = "your-api-key"  # Optional; prefer the environment variable
base_url = "https://api.openai.com/v1"
```

2. Environment Variables
Environment variables override config file values.
- Prefix: `STORY_EXTRACTOR_`
- Mapping: `.` and `-` are replaced with `_`
Examples:

- `STORY_EXTRACTOR_LLM_API_KEY` overrides `[llm] api_key`
- `STORY_EXTRACTOR_MAILDIR` overrides `maildir`
- `STORY_EXTRACTOR_VERBOSE=true` sets verbose mode
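The key-to-variable mapping can be sketched in a few lines of Go (illustrative; the repository's actual implementation may differ):

```go
package main

import (
	"fmt"
	"strings"
)

// envName converts a config key like "llm.api_key" into its
// STORY_EXTRACTOR_ environment variable name: "." and "-" become "_",
// and the result is upper-cased.
func envName(key string) string {
	r := strings.NewReplacer(".", "_", "-", "_")
	return "STORY_EXTRACTOR_" + strings.ToUpper(r.Replace(key))
}

func main() {
	fmt.Println(envName("llm.api_key")) // STORY_EXTRACTOR_LLM_API_KEY
	fmt.Println(envName("maildir"))     // STORY_EXTRACTOR_MAILDIR
}
```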
API Key Security (Recommended):

```sh
export STORY_EXTRACTOR_LLM_API_KEY="your-api-key-here"
make story-extractor
```

Basic usage:
```sh
./story-extractor \
  --maildir ~/Maildir/newsletters \
  --storydir ~/stories \
  --config config.toml
```

Required:
- `--maildir`: Path to the Maildir directory containing newsletters
- `--storydir`: Path to the directory where stories will be saved as JSON files
- `--config`: Path to the TOML configuration file with LLM settings
Optional:
- `--limit N`: Process at most N emails (useful for testing)
- `--verbose`: Enable verbose logging
- `--log-headers`: Log email headers (for debugging)
- `--log-bodies`: Log email bodies (for debugging)
- `--log-stories`: Log extracted stories
- Reads emails from the Maildir directory (recursively scans the `cur/` and `new/` subdirectories)
- Parses email headers and bodies (plain text, HTML, multipart MIME)
- Sends each email to the configured LLM with a prompt to extract news stories
- Saves each story as a JSON file: `<date>_<message-id>_<index>.json`
- Skips emails that have already been processed (incremental processing)
Example story file (`2006-01-02_test@example.com_1.json`):

```json
{
  "headline": "Example News Headline",
  "teaser": "Brief summary of the article in 1-2 sentences.",
  "url": "https://example.com/article",
  "from_email": "newsletter@example.com",
  "from_name": "Example Newsletter",
  "date": "2006-01-02T15:04:05Z"
}
```

The UI server provides a web interface to browse and read extracted stories.
```sh
make ui-server
./ui-server --storydir ~/stories --savedir ~/saved-stories
```

Or specify a custom port:

```sh
./ui-server --storydir ~/stories --savedir ~/saved-stories --port 3000
```

Environment Variables:
- Prefix: `UI_SERVER_`
- Examples: `UI_SERVER_PORT=3000`, `UI_SERVER_STORYDIR=~/stories`, `UI_SERVER_SAVEDIR=~/saved-stories`
- Config file: `ui-server.toml` (default locations: `.` and `$HOME`)
Required:

- `--storydir`: Path to the directory containing story JSON files
- `--savedir`: Path to the directory for saved stories

Optional:

- `--port`: Port to listen on (default: 8080)
Open your browser and navigate to:
http://localhost:8080
The UI displays:
- All extracted stories sorted by date (newest first)
- Story headline (clickable link to original article)
- Brief teaser text
- Source newsletter (sender name/email)
- Publication date (shown as relative time: "Today", "2 days ago", etc.)
- Bookmark icon to save stories for later
- Filter tabs to switch between All and Saved stories
Complete workflow from setup to reading stories:
```sh
# 1. Download newsletters using mbsync
mbsync -a

# 2. Build and extract stories from newsletters
make story-extractor
./story-extractor \
  --maildir ~/Maildir/newsletters \
  --storydir ~/stories \
  --config story-extractor.toml

# 3. Start the UI server
make ui-server
./ui-server --storydir ~/stories --savedir ~/saved-stories

# 4. Open in browser
open http://localhost:8080
```

Process a limited number of emails with verbose logging (useful for testing):

```sh
./story-extractor \
  --maildir ~/Maildir/newsletters \
  --storydir ~/stories \
  --config story-extractor.toml \
  --limit 5 \
  --verbose
```

Inspect a single email's headers and body (for debugging):

```sh
./story-extractor \
  --maildir ~/Maildir/newsletters \
  --storydir ~/stories \
  --config story-extractor.toml \
  --limit 1 \
  --log-headers \
  --log-bodies
```

Set up a cron job to run daily:
```sh
# crontab -e
0 8 * * * /usr/local/bin/mbsync -a && /path/to/story-extractor --maildir ~/Maildir/newsletters --storydir ~/stories --config ~/story-extractor.toml
```

This project is written in Go, and every line of code has been created with AI assistance.
Run `make help` for available targets. Running `make` on its own formats, vets, tests, and builds everything.
An experiment in newsletter consumption and AI-assisted development.