Autonomous web intelligence agent that monitors GitHub Trending, Hacker News, and Dev.to using WunderGraph and Redis, then publishes ranked signal reports to Ghost with paid membership monetization. Use when building agents that do real work on the open web.
npx @senso-ai/shipables install vala041786/signal-scout-agent

An agent that monitors the open web for trending developer signals, deduplicates and ranks them, writes a cited intelligence report, and publishes it autonomously to Ghost, monetized behind a paid membership paywall.
Use this skill when the user wants to:

- Monitor GitHub Trending, Hacker News, and Dev.to for developer signals
- Produce a cited.md file with sourced, ranked developer signals

Trigger phrases: "run signal scout", "fetch trending signals", "publish report", "monitor the web", "what's trending in dev today", "ship the agent".
This skill works best with the @modelcontextprotocol/server-fetch MCP server.
Data Sources (GitHub, HN, Dev.to, npm)
  ↓
WunderGraph — federated GraphQL API
  ↓
Redis — dedup queue + agent memory
  ↓
Claude agent core — cluster, rank, write
  ↓
cited.md (local) + Ghost post (published)
  ↓
Monetize — Ghost paid membership / x402
npx create-wundergraph-app signal-scout --example nextjs
cd signal-scout
Configure the data sources (GitHub and Hacker News are shown here; Dev.to follows the same pattern):
import { configureWunderGraphApplication, introspect } from '@wundergraph/sdk';

const github = introspect.openApi({
  apiNamespace: 'github',
  source: { kind: 'file', filePath: './github-openapi.yaml' },
  headers: builder => builder.addStaticHeader('Authorization', `token ${process.env.GITHUB_TOKEN}`),
});

// The Algolia HN search API (https://hn.algolia.com/api/v1) is REST, not
// GraphQL, so introspect it from an OpenAPI description of its endpoints
// (local file name is illustrative) rather than via introspect.graphql.
const hackerNews = introspect.openApi({
  apiNamespace: 'hn',
  source: { kind: 'file', filePath: './hn-openapi.yaml' },
});

configureWunderGraphApplication({
  apis: [github, hackerNews],
  server: { ... },
  operations: { ... },
});
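The Dev.to source mentioned in the overview follows the same pattern as GitHub. A sketch, assuming a local OpenAPI description of the Dev.to REST API (the file name and its contents are hypothetical):

```typescript
// Hypothetical third source: Dev.to's articles API (https://dev.to/api) is
// REST, so it is introspected from a local OpenAPI document, like GitHub.
const devto = introspect.openApi({
  apiNamespace: 'devto',
  source: { kind: 'file', filePath: './devto-openapi.yaml' },
});

// Then register it alongside the others:
// apis: [github, hackerNews, devto]
```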
Create .wundergraph/operations/trending.graphql:
query TrendingSignals {
  github_trending {
    repos { name url stars language description }
  }
  hn_top {
    hits { title url points author created_at }
  }
}
Always use the @live directive for live queries (WunderGraph polls the upstream server-side and streams updates to subscribers):
query LiveSignals @live(throttleSeconds: 300) { ... }
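To consume an operation from the agent worker, WunderGraph's node serves each operation over HTTP. A minimal sketch; the default local node address (http://localhost:9991), the /operations/&lt;Name&gt; layout, and the wg_live parameter are assumptions based on WunderGraph defaults, so verify them against your generated client:

```typescript
// Builds the URL for a WunderGraph operation. Host, port, and the wg_live
// query parameter (which requests a live-query stream) are assumed defaults.
function operationUrl(name: string, live = false): string {
  const base = `http://localhost:9991/operations/${name}`;
  return live ? `${base}?wg_live=true` : base;
}

// Usage (one-shot; a live query would be consumed as a stream instead):
// const res = await fetch(operationUrl('TrendingSignals'));
// const { data } = await res.json();
```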
import Redis from 'ioredis';
const redis = new Redis(process.env.REDIS_URL);
1. Seen URLs set — prevents duplicate processing:
async function isNew(url: string): Promise<boolean> {
  const added = await redis.sadd('seen_urls', url);
  return added === 1; // 0 means the URL was already in the set
}
2. Signal queue — FIFO work queue for the agent:
// Producer (WunderGraph webhook)
await redis.rpush('signal_queue', JSON.stringify(signal));
// Consumer (agent worker) — blocks until item available
const [, raw] = await redis.blpop('signal_queue', 0);
const signal = JSON.parse(raw);
3. Agent state hash — tracks runs and stats:
// hincrby both increments and persists the counter itself, so writing its
// return value back via hset would be redundant (and racy under concurrency)
await redis.hincrby('agent_state', 'reports_published', 1);
await redis.hset('agent_state', {
  last_run: new Date().toISOString(),
  signals_processed: count,
});
Always check before enqueuing:
async function enqueueIfNew(signal: Signal) {
  // sadd returns 1 if the URL was newly added, 0 if it was already seen
  const isNew = (await redis.sadd('seen_urls', signal.url)) === 1;
  if (isNew) {
    await redis.rpush('signal_queue', JSON.stringify(signal));
  }
}
Set a TTL on seen_urls to allow re-surfacing after 7 days. Note that EXPIRE applies to the whole key, so the entire set is cleared at once rather than per URL:
await redis.expire('seen_urls', 60 * 60 * 24 * 7);
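If each URL should instead expire independently, one key per URL with SET NX EX gives true per-URL TTLs. A sketch; the minimal client type below models only the single ioredis call it needs:

```typescript
// Minimal shape of the one ioredis call used here (redis.set with EX + NX).
type RedisSetNxEx = {
  set(key: string, value: string, ex: 'EX', ttl: number, nx: 'NX'): Promise<'OK' | null>;
};

const SEVEN_DAYS = 60 * 60 * 24 * 7;

// SET key value EX ttl NX is atomic: it only succeeds when the key is
// absent, so each URL gets its own 7-day window before it can re-surface.
async function isNewWithTtl(redis: RedisSetNxEx, url: string): Promise<boolean> {
  return (await redis.set(`seen:${url}`, '1', 'EX', SEVEN_DAYS, 'NX')) === 'OK';
}
```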
const signals: Signal[] = [];
while (signals.length < 20) {
  const item = await redis.lpop('signal_queue');
  if (!item) break;
  signals.push(JSON.parse(item));
}
Send the batch to the Anthropic API with this system prompt structure:

You are an intelligence analyst for developer trends. Given a list of signals (title, url, source, score), you must:
1. Cluster signals into 3-5 themes
2. Rank themes by novelty and developer impact
3. Write a concise intelligence briefing in Markdown
4. Cite every source as a numbered footnote [1], [2], etc.
5. Output ONLY the Markdown report, no preamble
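The signal list handed to the model can be serialized so the report's numbered footnotes map back to inputs. A hypothetical formatter (the Signal shape mirrors the fields named in the prompt):

```typescript
interface Signal {
  title: string;
  url: string;
  source: string;
  score: number;
}

// Renders signals highest-score-first as a numbered list, so footnote [n]
// in the model's report can be traced back to input line n.
function formatSignals(signals: Signal[]): string {
  return [...signals]
    .sort((a, b) => b.score - a.score)
    .map((s, i) => `[${i + 1}] ${s.title} (${s.source}, score ${s.score}) ${s.url}`)
    .join('\n');
}
```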
GHOST_ADMIN_API_KEY: Admin API key from your Ghost dashboard under Settings > Integrations
REDIS_URL: Redis connection URL for agent memory, dedup queue, and pub/sub orchestration
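A fail-fast startup check keeps a half-configured agent from running. A simple sketch; GITHUB_TOKEN comes from the WunderGraph config above, the other two from this section:

```typescript
// Returns the required variables missing from the given environment, so
// the agent can refuse to start instead of failing mid-run.
function missingEnv(env: Record<string, string | undefined>): string[] {
  const required = ['GITHUB_TOKEN', 'REDIS_URL', 'GHOST_ADMIN_API_KEY'];
  return required.filter((name) => !env[name]);
}

// Usage at startup:
// const missing = missingEnv(process.env);
// if (missing.length) throw new Error(`Missing env vars: ${missing.join(', ')}`);
```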