Discover, score, and summarize high-signal fans for a creator using web-agent discovery, Redis ranking, and voice access.
```
npx @senso-ai/shipables install MirandaCavalie/faniq
```

Use this skill when a creator wants to identify their highest-signal fans, understand why they rank high, talk to the fan intelligence layer by voice, and publish a consumable fan report.
FanIQ finds fans across X and LinkedIn, scores them live with Redis sorted sets, and exposes the results through a leaderboard API and a Vapi voice assistant.
FanIQ runs a creator handle through a multi-step pipeline: discover fans on each platform, score them, and rank them in a Redis sorted set (fans:{creator}).

Run in demo mode (demo_mode: true) to use pre-seeded fan data with no external API calls. Run with demo_mode: false and a TINYFISH_API_KEY for live discovery.
Always POST to /scan before subscribing to events. The endpoint calls clear_creator automatically, so every POST starts from fresh creator state.
```http
POST /scan
Content-Type: application/json

{
  "creator_handle": "@lexfridman",
  "platforms": ["x"],
  "demo_mode": true
}
```
Response:

```json
{
  "job_id": "scan_3f8a12b4c9",
  "creator_handle": "@lexfridman",
  "status": "queued"
}
```
Store job_id. Use it immediately to open the SSE stream.
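The request above can be sketched with the Python standard library; the base URL (localhost:8000) and the helper names are assumptions for illustration, not part of the FanIQ API.

```python
import json
import urllib.request

BASE = "http://localhost:8000"  # assumption: wherever your FanIQ server runs

def build_scan_payload(creator_handle, platforms=("x",), demo_mode=True):
    """Body for POST /scan, matching the documented request shape."""
    return {
        "creator_handle": creator_handle,
        "platforms": list(platforms),
        "demo_mode": demo_mode,
    }

def start_scan(creator_handle, demo_mode=True):
    """POST /scan and return the job_id used to open the SSE stream."""
    body = json.dumps(build_scan_payload(creator_handle, demo_mode=demo_mode)).encode()
    req = urllib.request.Request(
        f"{BASE}/scan",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["job_id"]
```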
Open a persistent connection to the scan events endpoint and process events as they arrive.
GET /scan/{job_id}
The response is text/event-stream. Each event is a line of the form `data: {json}` followed by a blank line.
Event types:
| Type | When it fires | Key fields |
|---|---|---|
| agent_step | TinyFish navigation step | sponsor: "tinyfish", message |
| redis_write | Fan saved to Redis | sponsor: "redis", command |
| fan_found | Fan scored and ready | sponsor: "redis", fan object |
| done | Scan complete | total_fans |
| error | Scan failed | message |
fan_found event shape:
```json
{
  "type": "fan_found",
  "sponsor": "redis",
  "message": "Ranked @airesearcher_sf with score 847",
  "fan": {
    "handle": "@airesearcher_sf",
    "display_name": "Alex Chen",
    "score": 847,
    "reason": "9 direct replies, 14 comments, cross-platform engagement, 12,400 follower reach"
  },
  "timestamp": "2026-04-24T12:30:00Z"
}
```
Always normalize handles before comparing: strip the leading @, lowercase, and re-add the @. Two handles refer to the same fan if handle.trim().toLowerCase().replace(/^@*/, '') matches.
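The same normalization rule as a minimal Python sketch (function names are illustrative):

```python
import re

def normalize_handle(raw):
    """Strip whitespace and any leading @s, lowercase, re-add a single @."""
    return "@" + re.sub(r"^@*", "", raw.strip().lower())

def same_fan(a, b):
    """Two handles identify the same fan iff they normalize identically."""
    return normalize_handle(a) == normalize_handle(b)
```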
Close the EventSource when type === "done" or type === "error". If the connection drops before those events arrive, call GET /fans/{creator_handle} to fetch final state.
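In a browser you would use EventSource; server-side, one way to sketch the loop is to parse the `data:` lines yourself. The function below consumes any iterable of SSE text lines (for example an HTTP client's line iterator) and stops on done or error; the names are illustrative, not part of the FanIQ API.

```python
import json

def process_scan_events(lines, on_fan=None):
    """Consume an SSE stream (iterable of text lines) and return the final
    done/error event. Each event line looks like `data: {json}`."""
    for line in lines:
        line = line.strip()
        if not line.startswith("data: "):
            continue  # skip blank separator lines
        event = json.loads(line[len("data: "):])
        if event["type"] == "fan_found" and on_fan:
            on_fan(event["fan"])
        if event["type"] in ("done", "error"):
            return event
    # Stream dropped before done/error: caller should fall back to
    # GET /fans/{creator_handle} for final state.
    return None
```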
GET /fans/{creator_handle}?limit=10
Response:

```json
{
  "creator_handle": "@lexfridman",
  "total_fans": 15,
  "top_fans": [
    {
      "handle": "@airesearcher_sf",
      "display_name": "Alex Chen",
      "score": 847,
      "platforms": ["x", "linkedin"],
      "reason": "9 direct replies, 14 comments, cross-platform engagement, 12,400 follower reach",
      "suggested_action": "Invite Alex to a private AI research Q&A and ask for topic feedback.",
      "source_urls": ["https://x.com/lexfridman/status/177001"],
      "source_tool": "seed"
    }
  ]
}
```
source_tool values: "seed" (demo), "tinyfish_demo" (TinyFish fallback), "tinyfish_live" (real crawl).
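Fetching the leaderboard can be sketched the same way; the base URL and helper names are assumptions.

```python
import json
import urllib.parse
import urllib.request

BASE = "http://localhost:8000"  # assumption: wherever your FanIQ server runs

def leaderboard_url(creator_handle, limit=10):
    """URL for GET /fans/{creator_handle}?limit=N."""
    return f"{BASE}/fans/{urllib.parse.quote(creator_handle)}?limit={limit}"

def top_fans(creator_handle, limit=10):
    """Return (handle, score) pairs from the leaderboard endpoint."""
    with urllib.request.urlopen(leaderboard_url(creator_handle, limit)) as resp:
        data = json.load(resp)
    return [(fan["handle"], fan["score"]) for fan in data["top_fans"]]
```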
GET /fan/{creator_handle}/{fan_handle}
Returns the full FanProfile including bio, raw_comments, follower_count, comment_count, reply_count, cross_platform, last_seen, and published_url.
FanIQ exposes two voice modes. Always fetch client config before starting a call — it returns the correct public key and assistant ID without exposing the private VAPI_API_KEY.
GET /vapi/client-config?creator_handle=@lexfridman
GET /vapi/client-config?creator_handle=@lexfridman&fan_handle=@airesearcher_sf
Response includes publicKey and assistantId. Use them with the Vapi Web SDK:
```javascript
const vapi = new Vapi(config.publicKey);
vapi.start(config.assistantId);
```
Mode A (creator intelligence): omit fan_handle. The assistant answers questions like "Who is my top fan?" by querying Redis fan data in real time.
Vapi calls POST /v1/chat/completions with:
```json
{
  "model": "faniq-intelligence",
  "stream": true,
  "metadata": { "creator_handle": "@lexfridman" },
  "messages": [{ "role": "user", "content": "Who are my top fans?" }]
}
```
Mode B (fan persona): include fan_handle. The assistant responds in the voice of a synthetic persona built from the fan's collected comments and bio. Answers stay grounded in captured source snippets.
Vapi calls with:
```json
{
  "model": "faniq-persona:@lexfridman:@airesearcher_sf",
  "stream": true,
  "messages": [{ "role": "user", "content": "What did you think of the last episode?" }]
}
```
Both modes use stream: true and return OpenAI-compatible chat.completion.chunk events ending with data: [DONE].
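Reassembling the streamed reply follows the usual OpenAI-compatible pattern. A sketch of the chunk parsing (the function name is illustrative):

```python
import json

def collect_stream(lines):
    """Reassemble assistant text from chat.completion.chunk SSE lines,
    stopping at `data: [DONE]`."""
    parts = []
    for line in lines:
        line = line.strip()
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        parts.append(delta.get("content", ""))
    return "".join(parts)
```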
POST /publish/{creator_handle}
Generates a Markdown fan intelligence report and attempts to publish it to cited.md. The request itself always succeeds: when the Senso API key is missing, the report is written to a local fallback file instead.
Response:

```json
{
  "creator_handle": "@lexfridman",
  "published": true,
  "url": "https://cited.md/faniq/lexfridman",
  "payment_enabled": false,
  "fallback_file": "output/published/lexfridman.md"
}
```
url is the cited.md artifact URL when live publishing succeeds. fallback_file is the local path when it doesn't.
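Given that fallback behavior, a client should accept either location as success. A minimal sketch, with the base URL and function names as assumptions:

```python
import json
import urllib.request

BASE = "http://localhost:8000"  # assumption: wherever your FanIQ server runs

def artifact_location(result):
    """Prefer the cited.md URL; otherwise fall back to the local file path."""
    return result["url"] if result.get("published") else result["fallback_file"]

def publish_report(creator_handle):
    """POST /publish/{creator_handle} and return where the report ended up."""
    req = urllib.request.Request(f"{BASE}/publish/{creator_handle}", method="POST")
    with urllib.request.urlopen(req) as resp:
        return artifact_location(json.load(resp))
```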
```
score = (comment_count × 30)
      + (reply_count × 20)
      + min(follower_count / 100, 200)
      + (100 if cross_platform else 0)
      + 50   # recency bonus
```
cross_platform is true when len(platforms) > 1. A fan on both X and LinkedIn gets +100 on top of their engagement score.
Typical score ranges: 200–400 (single-platform lurker), 500–700 (active commenter), 800–1000 (high-follower cross-platform engaged fan).
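The formula above transcribes directly into Python (the function name is illustrative):

```python
def fan_score(comment_count, reply_count, follower_count, cross_platform):
    """Recompute the documented leaderboard score."""
    return (
        comment_count * 30
        + reply_count * 20
        + min(follower_count / 100, 200)  # follower reach, capped at 200
        + (100 if cross_platform else 0)  # cross-platform bonus
        + 50                              # recency bonus
    )
```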
| Key pattern | Type | Content |
|---|---|---|
| fans:{creator} | Sorted set | {fan_handle} → score |
| fan_profile:{creator}:{fan} | String (JSON) | Full FanProfile |
| events:{creator} | List (lpush, capped 100) | Recent ScanEvent objects |
| scan:{job_id} | String (JSON, TTL 3600s) | ScanJob state |
| scan_events:{job_id} | List (rpush, TTL 3600s) | Ordered ScanEvent log |
| publish:{creator} | String (JSON) | Latest PublishResult |
| sponsor_trace:{creator} | List (lpush, capped 50) | Redis operation log |
| fan_memory:{creator}:{fan}:{index} | String | Raw comment snippet |
| fan_memory_scores:{creator} | Sorted set | {fan_handle}:{token} → score |
| fan_memory_index:{creator}:{token} | String (JSON) | Posting list for memory search |
All creator and fan keys are normalized handles (lowercase, @ prefix, stripped whitespace).
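A sketch of composing those keys from normalized handles; the helper names are hypothetical, not taken from the FanIQ codebase.

```python
def normalize(handle):
    """Lowercase, strip whitespace and leading @s, re-add a single @."""
    return "@" + handle.strip().lower().lstrip("@")

def fans_key(creator):
    return f"fans:{normalize(creator)}"

def profile_key(creator, fan):
    return f"fan_profile:{normalize(creator)}:{normalize(fan)}"

# With redis-py (assumption), the leaderboard read is a reverse range over
# the sorted set, e.g.:
#   r.zrevrange(fans_key("@lexfridman"), 0, 9, withscores=True)
```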
| Sponsor | Role in the pipeline |
|---|---|
| TinyFish | Web agent that navigates X and LinkedIn without needing an API key. Streams agent_step SSE events showing real navigation steps. Configured via TINYFISH_API_KEY. |
| Redis | Sorted set leaderboard, fan profile store, event log, memory search index. Every fan write emits a redis_write SSE event and a sponsor_trace log entry. |
| Vapi | Voice AI layer. Mode A queries Redis fan data. Mode B generates a fan persona from collected comments. Configured via VAPI_API_KEY and VAPI_PUBLIC_KEY. |
| cited.md / Senso | Publishes the fan intelligence report as an agent-consumable artifact at https://cited.md/faniq/{creator}. Optional SENSO_API_KEY. |
**Scan returns no fans.** In demo mode, run python scripts/seed_demo_data.py --creator @lexfridman --clear to pre-populate. In live mode, confirm TINYFISH_API_KEY is set; the live path falls back to demo_fans()[:5] if TinyFish fails.

**Leaderboard shows stale data after a new scan.** The /scan POST calls clear_creator before queuing the job. If you bypass the POST and run run_scan_job directly, call redis_service.clear_creator(handle) first.

**SSE connection closes before the done event.** onerror fires when the server stream ends. If scanDone is not yet true, call your onScanDone() handler and fall back to GET /fans/{creator} for final state.

**Handle mismatch in fan deduplication.** Normalize with handle.trim().toLowerCase().replace(/^@*/, ''), then add the @ prefix. Never compare raw handles from different sources directly.

**Vapi call returns no fan data.** Confirm PUBLIC_BASE_URL or NGROK_URL is set and reachable. Vapi calls /v1/chat/completions on your server; if the URL is wrong, Vapi falls back to its default model with no Redis context.

**Publish returns published: false.** SENSO_API_KEY is not set or the cited.md endpoint is unreachable. The fallback file at output/published/{creator}.md is always written, so treat the local file path as success as well when checking published.