otq-checkin-worker

Cloudflare Worker at ~/garmin-warehouse/cloudflare/otq-checkin/. Replaces the former laptop-dependent infrastructure: evening_checkin.py + telegram_listener.py + a GitHub Actions workflow + 2 launchd plists. One Worker, runs on Cloudflare's edge, no laptop dependency.

Live at https://otq-checkin.manoscasey.workers.dev.

Quick orientation

  • Account: a20a70bee90d635ffad79328f3edcd5f
  • Bot: @otq_moonbot on Telegram
  • TZ: America/Los_Angeles (env var, used by todayInTZ())
  • Source: ~/garmin-warehouse/cloudflare/otq-checkin/
  • Deploy: npx wrangler deploy (from project dir)
  • Manual triggers (gated by ?secret=<last 12 chars of TG_TOKEN>):
      • POST /run-checkin
      • POST /run-morning-summary
      • POST /test/inject-checkin?date=...&message_id=...&asked_kinds=...
      • POST /test/inject-summary?date=...&message_id=...
  • Stats: GET /stats → JSON
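
The TZ var matters because the crons fire in UTC but "today" must resolve in local time: the 8pm check-in has to read and write under the local date. A minimal sketch of what a helper like todayInTZ() could look like (the real implementation may differ; assuming it returns YYYY-MM-DD):

```typescript
// Resolve "today" in the configured timezone, not UTC, so a 03:00 UTC cron
// still writes under the previous local date. en-CA formats as YYYY-MM-DD.
function todayInTZ(tz: string, now: Date = new Date()): string {
  return new Intl.DateTimeFormat("en-CA", {
    timeZone: tz,
    year: "numeric",
    month: "2-digit",
    day: "2-digit",
  }).format(now);
}
```

At 03:30 UTC on May 4, `todayInTZ("America/Los_Angeles")` is still May 3 — which is exactly what the evening check-in needs.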

What it does

Two crons → two Telegram messages per day. Webhook handles all replies.

| Cron | UTC | Local (PDT) | Method | Sends |
|---|---|---|---|---|
| Evening check-in | 0 3 * * * | 8:00pm | runCheckin() | "did you do today's strength/strides/sauna?" if anything's missing from completion_log.jsonl |
| Morning summary | 30 15 * * * | 8:30am | runMorningSummary() | "yesterday: 6mi · CTL 22 · sleep 7h... today plan: ..." |

When PST returns in November, both fire 1 hour earlier local (7:00pm / 7:30am), since the cron times are fixed in UTC. Acceptable drift; revisit at DST flips.
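
The two crons in the table map to a triggers block in wrangler.jsonc. A sketch of that fragment (the real file also carries the account, bindings, and vars listed under File layout):

```jsonc
{
  // UTC cron expressions: evening check-in (8:00pm PDT), morning summary (8:30am PDT)
  "triggers": {
    "crons": ["0 3 * * *", "30 15 * * *"]
  }
}
```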

Reply routing

The webhook receives Telegram updates. parseUpdate() extracts (messageId, repliedToMessageId, text). Then routing is by which DO table contains the repliedToMessageId:

              webhook update
                    │
                    ▼
        parseUpdate (text + reply-to id)
                    │
          ┌─────────┴──────────┐
          ▼                    ▼
  is reply to checkin?   is reply to summary?
   (checkins table)     (daily_summaries table)
          │                    │
          ▼                    ▼
     parseReply          classifyAndExtract
     (Haiku 4.5,         (Haiku 4.5, 4 cases)
      extract did_*)           │
          │           ┌────────┼────────┐
          ▼           ▼        ▼        ▼
   write to R2:   correction question  chat
   completion_        │        │        │
   log_worker_*       ▼        ▼        ▼
                  write to  answer   ack 👍
                  R2:       from
                  correc-   query_cache
                  tions_*   via Haiku
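
The routing above hinges on parseUpdate() pulling three fields out of the raw Telegram update. A sketch of such an extractor, assuming Telegram Bot API field names (message_id, reply_to_message, text); the real src/telegram.ts may differ:

```typescript
interface ParsedUpdate {
  messageId: number;
  repliedToMessageId: number | null; // null = not a reply; webhook can ignore it
  text: string;
}

// Extract the fields routing needs from a raw Telegram webhook update.
// Non-text updates (stickers, edits, etc.) return null and are dropped.
function parseUpdate(update: any): ParsedUpdate | null {
  const msg = update?.message;
  if (!msg || typeof msg.text !== "string") return null;
  return {
    messageId: msg.message_id,
    repliedToMessageId: msg.reply_to_message?.message_id ?? null,
    text: msg.text,
  };
}
```

The repliedToMessageId is then looked up in the checkins and daily_summaries DO tables to pick a branch.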

Reply paths in detail

Path 1: check-in reply (existing)
  • Bot asks "did you do strength?", user replies "yep both" → strength + strides booleans extracted
  • Written to R2 at state/<date>/completion_log_worker_<msgId>.jsonl
  • daily_sync.sh pulls + merges into local completion_log.jsonl

Path 2: summary reply → correction (new, 2026-05-04)
  • "actually 11.2mi", "knee tweaked at mile 8", "that wasn't strength it was mobility"
  • Written to R2 at state/<date>/corrections_<msgId>.jsonl
  • daily_sync.sh pulls + merges into local corrections.jsonl
  • See ADR 003: Corrections as interpretive overlay for what these mean — corrections do NOT edit raw activity data.
  • v1: corrections sit in the file; an applier is the next piece.

Path 3: summary reply → question (new)
  • "what's my CTL trend", "current eFTP", "how does last week's MPW look"
  • Worker reads R2's state/cache/query_cache.json (built by scripts/cache_for_worker.py during 7am sync)
  • Passes cache + question to Haiku → concise answer
  • Out-of-scope questions ("what's the weather") get "I don't have that in the cache"

Path 4: summary reply → chat (new)
  • "thanks", "looks good", "👍" → reply with "👍"
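
The dispatch over the summary-reply paths can be sketched as a switch on the classifier's verdict. The doc lists three summary paths; classifyAndExtract is described as a 4-case classifier, so the fourth ("unclear" fallback here) is an assumption, as are the case names:

```typescript
// Assumed intent labels; the real classifyAndExtract (Haiku-backed) returns
// one of these plus any extracted fields.
type SummaryIntent = "correction" | "question" | "chat" | "unclear";

// Map each classified intent to the action the webhook takes.
function routeSummaryReply(kind: SummaryIntent): string {
  switch (kind) {
    case "correction":
      return "write corrections_<msgId>.jsonl to R2";
    case "question":
      return "answer from query_cache.json via Haiku";
    case "chat":
      return "ack 👍";
    default:
      return "ask for clarification";
  }
}
```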

DO storage

The Durable Object has 3 tables:

| Table | Rows | Purpose |
|---|---|---|
| checkins | one per day with scheduled events | tracks evening check-in sent + asked_kinds |
| daily_summaries | one per day | tracks morning summary sent |
| processed_replies | one per reply | dedup; webhook is idempotent |

Schema is created with CREATE TABLE IF NOT EXISTS in the constructor. No external migration system; adding a column means writing a constructor migration in code.
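
The constructor-migration pattern can be sketched as a plain function over an exec callback (so it's testable outside the Workers runtime); in the DO it would be called as ensureSchema(q => this.ctx.storage.sql.exec(q)). Table names come from the table above; the column lists are assumptions:

```typescript
// Idempotent DDL, run on every DO construction. There is no external
// migration system: adding a column later means appending another
// idempotent statement here.
function ensureSchema(exec: (ddl: string) => void): void {
  exec(`CREATE TABLE IF NOT EXISTS checkins (
          date TEXT PRIMARY KEY, message_id INTEGER, asked_kinds TEXT)`);
  exec(`CREATE TABLE IF NOT EXISTS daily_summaries (
          date TEXT PRIMARY KEY, message_id INTEGER)`);
  exec(`CREATE TABLE IF NOT EXISTS processed_replies (
          message_id INTEGER PRIMARY KEY)`);
}
```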

R2 layout it touches

| Key | Direction | Owner |
|---|---|---|
| state/cache/yesterday.json | read | Worker reads; cache_for_worker.py writes (7am) |
| state/cache/query_cache.json | read | same |
| state/latest/completion_log.jsonl | read | Worker reads to detect logged kinds (avoid asking about already-done items) |
| state/<date>/completion_log_worker_<msgId>.jsonl | write | Worker writes per check-in reply; daily_sync.sh reads + merges |
| state/<date>/corrections_<msgId>.jsonl | write | Worker writes per correction; daily_sync.sh reads + merges |

Per-source-file pattern (one R2 key per webhook reply) avoids GET-modify-PUT race conditions.
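
A sketch of that key scheme: each reply becomes one fresh object, keyed by date + Telegram message id, so concurrent replies never contend on a shared key. The key shape matches the table above; function names are assumptions, and the bucket parameter is typed structurally so the sketch compiles without workers-types:

```typescript
function completionLogKey(date: string, msgId: number): string {
  return `state/${date}/completion_log_worker_${msgId}.jsonl`;
}

// One PUT per webhook reply; no GET-modify-PUT cycle to race on.
async function writeReplyLog(
  bucket: { put(key: string, value: string): Promise<unknown> },
  date: string,
  msgId: number,
  entries: object[],
): Promise<string> {
  const key = completionLogKey(date, msgId);
  await bucket.put(key, entries.map((e) => JSON.stringify(e)).join("\n") + "\n");
  return key;
}
```

daily_sync.sh then lists and merges all per-message objects under state/<date>/, which is where the dedup burden actually lands.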

File layout

cloudflare/otq-checkin/
├── wrangler.jsonc          # account, DO binding, R2 binding, 2 crons, vars
├── package.json            # @anthropic-ai/sdk, zod, workers-types, wrangler
├── deploy.sh               # one-shot: wrangler secret put + deploy + register webhook
├── teardown-laptop.sh      # ~7 days after Worker live: unload launchd plists + rename the GHA workflow to .disabled
└── src/
    ├── index.ts            # entry: scheduled() routes by cron, fetch() routes endpoints
    ├── agent.ts            # OTQCheckinAgent DO class (3 tables + 2 cron methods + handleWebhook)
    ├── types.ts            # Env, CheckinRow, DailySummaryRow, YesterdayCache, QueryCache, CorrectionEntry
    ├── intervals_icu.ts    # fetchScheduledToday() (today's planned events)
    ├── telegram.ts         # sendTelegram, sendAck, parseUpdate
    ├── parser.ts           # parseReply (check-in path) — Haiku tool_use → strength/strides/sauna
    ├── summary_intent.ts   # classifyAndExtract (summary path) — 4-case classifier + answerQuestion
    ├── compose.ts          # composeMessage (check-in text)
    ├── morning.ts          # composeMorning (summary text)
    ├── log.ts              # readLatestLog, appendEntries, appendCorrection
    └── r2_cache.ts         # readYesterday, readQueryCache

Secrets (Worker, not env)

Set via wrangler secret put <NAME>:

  • INTERVALS_ICU_KEY — for fetchScheduledToday() API call
  • TELEGRAM_BOT_TOKEN — for sending + replying. Last 12 chars also serve as the manual-trigger gate (anyone with the token can already send messages, so reusing it doesn't expand attack surface)
  • ANTHROPIC_API_KEY — for Haiku 4.5 extraction + Q&A
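
The manual-trigger gate described above is a one-line comparison. A sketch (function name is an assumption; the real check lives in the fetch() router in src/index.ts):

```typescript
// Gate manual endpoints on ?secret= matching the last 12 chars of the
// Telegram bot token. Anyone holding the full token can already send
// messages, so reusing its tail adds no new attack surface.
function isAuthorized(url: URL, botToken: string): boolean {
  const secret = url.searchParams.get("secret");
  return secret !== null && secret.length === 12 && secret === botToken.slice(-12);
}
```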

Vars (in wrangler.jsonc, not secret): TZ, PLAN_PREFIX, TELEGRAM_CHAT_ID.

Deployment lifecycle

  1. Code change → npx wrangler deploy from project dir
  2. Crons unchanged if wrangler.jsonc untouched; otherwise CF re-registers them on deploy
  3. Migrations: SQLite class table additions are auto-applied via CREATE TABLE IF NOT EXISTS. Adding a NEW DO class needs a migration tag bump in wrangler.jsonc. Current tag: v1.
  4. Logs: npx wrangler tail (live) or CF dashboard observability (1.0 head sampling rate enabled)