
Analytics Framework

This document defines the instrumentation and analytics strategy for all products built in the AI Product OS.

All product, engineering, and analytics agents must follow this framework when planning and implementing telemetry.


Core Philosophy

Why Analytics Matter

  1. Validate Hypotheses: Measure if the product solves the stated problem
  2. Detect Issues Early: System health alerts (fallbacks, errors, latency)
  3. Prioritize Roadmap: Data-driven feature decisions
  4. Close the Loop: Feed learning back into the product OS knowledge base

North Star Metric

Every project must define one primary success metric:

  • Gmail WhatsApp Notifier: Daily Summary Read Rate
  • AI Finance Advisor: Weekly Active Spending Reporters
  • Clarity (PM To-Do): Tasks Categorized per Active User per Week

Analytics Stack

Primary Tool: PostHog

  • Why: Open-source, self-hostable, combined product analytics + session replay + feature flags
  • Client Library: posthog-js (web/frontend)
  • Server Library: posthog-node (API routes, cron jobs)
  • Setup: Free tier suitable for MVPs (<1M events/month)

Setup Pattern

```tsx
// Frontend: app/PostHogProvider.tsx
"use client";
import posthog from 'posthog-js';
import { PostHogProvider as PHProvider } from 'posthog-js/react';
import { useEffect, type ReactNode } from 'react';

export default function PostHogProvider({ children }: { children: ReactNode }) {
  useEffect(() => {
    posthog.init(process.env.NEXT_PUBLIC_POSTHOG_KEY!, {
      api_host: process.env.NEXT_PUBLIC_POSTHOG_HOST,
      person_profiles: 'identified_only',
      capture_pageview: true,   // note: posthog-js option names are singular
      capture_pageleave: true
    });
  }, []);

  return <PHProvider client={posthog}>{children}</PHProvider>;
}
```

```ts
// Backend: lib/posthog.ts
import { PostHog } from 'posthog-node';

export default function PostHogClient() {
  return new PostHog(process.env.NEXT_PUBLIC_POSTHOG_KEY!, {
    host: process.env.NEXT_PUBLIC_POSTHOG_HOST
  });
}
```

Event Taxonomy

Naming Convention

Format: [object]_[action] (lowercase, snake_case)

Examples:

  • task_created
  • digest_sent
  • ai_categorization_failed
  • user_signed_up
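A convention like this is easy to enforce with a small guard in whatever capture wrapper you build; a minimal sketch (`isValidEventName` is our hypothetical helper, not a PostHog API):

```typescript
// Hypothetical helper: validates that an event name follows the
// [object]_[action] convention — lowercase snake_case with at least
// two segments and no leading/trailing underscores.
export function isValidEventName(name: string): boolean {
  return /^[a-z][a-z0-9]*(_[a-z][a-z0-9]*)+$/.test(name);
}
```

Call it in dev builds (or a lint rule) to catch drift like `TaskCreated` or bare `task` before the event ships.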

Event Categories

1. User Lifecycle

  • landing_page_viewed - User arrives
  • signup_started - User begins registration
  • signup_completed - User successfully registered
  • first_action_completed - User completes core action (activation)
  • user_churned - User inactive for X days

2. Core Feature Usage

  • [feature]_initiated - User starts action
  • [feature]_completed - System confirms success
  • [feature]_failed - User-facing error occurred

3. System Health

  • ai_fallback_triggered - AI processing failed, used default
  • api_timeout - External service exceeded threshold
  • rate_limit_hit - Third-party rate limit encountered
  • retry_exhausted - Permanent failure after retries

4. Business Metrics

  • conversion_completed - User converted to paid (if applicable)
  • feature_unlocked - User gained access to premium feature
  • limit_reached - User hit free tier cap

Standard Properties

Every Event Should Include

```ts
{
  // Automatic (from PostHog)
  $session_id: string,
  $timestamp: ISO8601,
  $current_url: string,

  // Custom (you must add)
  user_id?: string,           // If authenticated
  feature_version?: string,   // For A/B testing
  environment: 'dev' | 'prod'
}
```
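One way to guarantee the custom properties appear on every event is a small merge helper used by all call sites; a sketch, assuming `NODE_ENV` distinguishes environments (`withStandardProps` is a hypothetical name):

```typescript
type StandardProps = {
  user_id?: string;
  feature_version?: string;
  environment: 'dev' | 'prod';
};

// Hypothetical wrapper: merges the required custom properties into the
// event payload so individual call sites cannot forget `environment`.
export function withStandardProps(
  props: Record<string, unknown>,
  userId?: string,
): Record<string, unknown> & StandardProps {
  return {
    ...props,
    ...(userId ? { user_id: userId } : {}),
    environment: process.env.NODE_ENV === 'production' ? 'prod' : 'dev',
  };
}
```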

Context-Specific Properties

Performance Metrics

```ts
{
  ai_latency_ms: number,      // Time for AI to respond
  db_query_time_ms: number,   // Database query duration
  total_request_time_ms: number
}
```

User Input Metrics

```ts
{
  input_length: number,       // Character count
  input_type: 'text' | 'image' | 'voice'
}
```

Error Context

```ts
{
  error_type: 'validation' | 'network' | 'ai' | 'database',
  error_message: string,      // Sanitized (no PII)
  error_code?: string         // HTTP status or custom code
}
```
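The "sanitized (no PII)" requirement is easiest to meet centrally, before any error string reaches a capture call; a minimal sketch (the helper name and regexes are our assumptions — tune them per product):

```typescript
// Hypothetical sanitizer: strips common PII patterns from an error
// message before it is attached to an analytics event.
export function sanitizeErrorMessage(message: string, maxLength = 200): string {
  return message
    // Redact email addresses.
    .replace(/[\w.+-]+@[\w-]+\.[\w.]+/g, '[email]')
    // Redact long digit runs (phone numbers, card numbers).
    .replace(/\d{7,}/g, '[number]')
    // Cap length so raw user input cannot leak through verbose errors.
    .slice(0, maxLength);
}
```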

Implementation Patterns

Frontend Event Tracking

```tsx
import { usePostHog } from 'posthog-js/react';

function TaskBoard() {
  const posthog = usePostHog();

  const handleSubmit = async () => {
    const startTime = Date.now();

    // Capture initiation
    posthog?.capture('task_submitted', {
      input_length: taskText.length
    });

    try {
      const res = await fetch('/api/tasks', { ... });
      if (!res.ok) throw new Error(`HTTP ${res.status}`);
      const data = await res.json();

      // Capture completion
      posthog?.capture('task_created', {
        task_id: data.id,
        time_to_create_ms: Date.now() - startTime
      });
    } catch (error) {
      // Capture failure
      posthog?.capture('task_creation_failed', {
        error_type: 'network',
        error_message: error instanceof Error ? error.message : String(error)
      });
    }
  };
}
```

Backend Event Tracking

```ts
import { NextResponse } from 'next/server';
import PostHogClient from '@/lib/posthog';

export async function POST(req: Request) {
  const ph = PostHogClient();
  const startTime = Date.now();
  const { prompt, userId } = await req.json();

  try {
    const aiResult = await callGemini(prompt);
    const latency = Date.now() - startTime;

    // Track successful AI categorization
    ph.capture({
      distinctId: userId || 'anonymous',
      event: 'ai_categorization_completed',
      properties: {
        category: aiResult.category,
        ai_latency_ms: latency,
        model: 'gemini-2.5-flash'
      }
    });

    await ph.shutdown(); // Important! Flushes queued events before the route exits
    return NextResponse.json(aiResult);

  } catch (error) {
    // Track AI failure + fallback
    ph.capture({
      distinctId: userId || 'anonymous',
      event: 'ai_fallback_triggered',
      properties: {
        input_length: prompt.length,
        error_type: 'ai_timeout'
      }
    });

    await ph.shutdown();
    // Apply fallback logic...
  }
}
```

Cron Job Tracking

```ts
export async function GET() {
  const ph = PostHogClient();
  const startTime = Date.now();

  const users = await fetchActiveUsers();

  ph.capture({
    distinctId: 'system',
    event: 'cron_digest_started',
    properties: {
      user_count: users.length,
      cron_type: 'daily_digest'
    }
  });

  const results = await Promise.allSettled(
    users.map(user => sendDigest(user))
  );

  const succeeded = results.filter(r => r.status === 'fulfilled').length;
  const failed = results.filter(r => r.status === 'rejected').length;

  ph.capture({
    distinctId: 'system',
    event: 'cron_digest_completed',
    properties: {
      total_users: users.length,
      success_count: succeeded,
      failure_count: failed,
      duration_ms: Date.now() - startTime
    }
  });

  await ph.shutdown();
  return new Response('OK');
}
```

Funnels to Track

Onboarding Funnel

```
landing_page_viewed
  ↓
signup_started
  ↓
signup_completed
  ↓
first_[feature]_completed
```

Goal: Maximize conversion from landing → first action

Core Feature Funnel (Example: Clarity)

```
task_submitted
  ↓
ai_categorization_completed
  ↓
task_displayed
  ↓
task_completed
```

Goal: Minimize drop-off at each stage
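Per-stage drop-off can be computed directly from per-step unique-user counts (e.g. exported from a PostHog funnel insight); a sketch, with an illustrative report shape:

```typescript
// Hypothetical funnel report: given unique-user counts per step,
// compute step-to-step and top-of-funnel conversion rates.
export function funnelReport(steps: { event: string; users: number }[]) {
  return steps.map((step, i) => ({
    event: step.event,
    users: step.users,
    // Conversion relative to the previous step (1 for the first step).
    stepConversion: i === 0 ? 1 : step.users / steps[i - 1].users,
    // Conversion relative to the top of the funnel.
    overallConversion: step.users / steps[0].users,
  }));
}
```

The step with the lowest `stepConversion` is where to focus the next iteration.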

Engagement Funnel

```
session_started
  ↓
[feature]_used (5+ times)
  ↓
weekly_active_user
  ↓
retained_next_month
```

Goal: Identify power users vs. one-time visitors


Metric Definitions

Activation Metrics

  • Time to First Action: Seconds from signup to first core feature use
  • Setup Completion Rate: % users who finish onboarding steps

Engagement Metrics

  • DAU / WAU / MAU: Daily/Weekly/Monthly Active Users
  • Feature Adoption Rate: % of users who used feature X
  • Session Duration: Average time spent per session

Retention Metrics

  • D1 / D7 / D30 Retention: % users who return after 1/7/30 days
  • Churn Rate: % users who stop using product
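D1/D7/D30 retention reduces to "was the user active exactly N days after signup"; a minimal sketch, assuming you can export each user's signup date and active-day dates:

```typescript
// Hypothetical D-N retention: fraction of users active exactly N days
// after their signup date (calendar math simplified to whole-day offsets).
export function dayNRetention(
  users: { signup: Date; activeDays: Date[] }[],
  n: number,
): number {
  const DAY_MS = 24 * 60 * 60 * 1000;
  const retained = users.filter(u =>
    u.activeDays.some(
      d => Math.floor((d.getTime() - u.signup.getTime()) / DAY_MS) === n,
    ),
  ).length;
  return users.length === 0 ? 0 : retained / users.length;
}
```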

System Health Metrics

  • AI Fallback Rate: % of AI calls that failed
  • Error Rate: % of requests that resulted in 4xx/5xx
  • P50 / P95 Latency: Median and 95th percentile response times
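P50/P95 can be recomputed locally from raw latency samples (e.g. the `ai_latency_ms` property) using the nearest-rank method; a sketch:

```typescript
// Nearest-rank percentile over raw latency samples.
// p is in percent, e.g. percentile(samples, 95) for P95.
export function percentile(samples: number[], p: number): number {
  if (samples.length === 0) return 0;
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length) - 1;
  return sorted[Math.max(0, rank)];
}
```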

Dashboard Setup

Critical Dashboards (PostHog)

  1. North Star Tracker: Single metric graph + trend
  2. Funnel Health: Conversion rates at each stage
  3. System Reliability: Error rates, fallback triggers, latencies
  4. User Cohorts: Segmented by signup date, feature usage

Alerts to Configure

  • AI Fallback Rate > 10% (investigate model issues)
  • Error Rate > 5% (system degradation)
  • P95 Latency > 5s (performance issue)
  • Daily Active Users drops > 20% (retention crisis)
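These thresholds are simple enough to evaluate in code if you pull the numbers out of PostHog (for example in a daily cron); a sketch using the thresholds above, with hypothetical names:

```typescript
// Hypothetical alert evaluation over metrics pulled from PostHog.
type Metrics = {
  aiFallbackRate: number;   // 0..1
  errorRate: number;        // 0..1
  p95LatencyMs: number;
  dauDropRate: number;      // 0..1, relative to previous period
};

// Returns the names of alerts whose thresholds were exceeded.
export function firedAlerts(m: Metrics): string[] {
  const alerts: string[] = [];
  if (m.aiFallbackRate > 0.10) alerts.push('ai_fallback_rate_high');
  if (m.errorRate > 0.05) alerts.push('error_rate_high');
  if (m.p95LatencyMs > 5000) alerts.push('p95_latency_high');
  if (m.dauDropRate > 0.20) alerts.push('dau_drop_large');
  return alerts;
}
```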

Integration with Product Workflow

When to Define Analytics

❌ Anti-Pattern: Post-Launch Instrumentation

```
create-plan → execute-plan → review → qa-test → deploy-check → metric-plan
                                                                  ↑ Too late!
```

✅ Correct: Upfront Definition

```
create-plan (include events in spec)
  ↓
execute-plan (implement events during build)
  ↓
review (verify events are firing)
  ↓
deploy-check (validate telemetry in prod)
```

Metric Plan Command Output

Every /metric-plan execution must produce:

  1. North Star Metric: The one number that defines success
  2. Event List: All events to track with properties
  3. Funnels: Key conversion flows to monitor
  4. Dashboard Mockup: What graphs/charts to create
  5. Alert Thresholds: When to get notified

Privacy & Compliance

PII Handling

  • Never log: Emails, phone numbers, credit cards, passwords
  • Hash when needed: User IDs, session identifiers
  • Sanitize errors: Strip user input from error messages
```ts
// ❌ BAD - Logs user email
ph.capture('signup_completed', { email: user.email });

// ✅ GOOD - Uses anonymous ID
ph.capture('signup_completed', { user_id: hashUserId(user.id) });
```
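`hashUserId` is not a PostHog API; one possible implementation, assuming Node and a project-level `ANALYTICS_SALT` env var (both names are our assumptions):

```typescript
import { createHash } from 'node:crypto';

// Sketch of hashUserId: salted SHA-256 so raw user IDs never reach
// PostHog, while the same user still maps to the same distinct ID.
export function hashUserId(
  id: string,
  salt = process.env.ANALYTICS_SALT ?? '',
): string {
  return createHash('sha256').update(salt + id).digest('hex');
}
```

Keep the salt secret and stable: changing it silently splits every user into a new identity.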

GDPR Compliance

  • Provide opt-out mechanism
  • Support data deletion requests
  • Document data retention policies (PostHog: 7 years default)

Testing Analytics

Pre-Deploy Checklist

  1. Verify Events Fire: Use PostHog live events view
  2. Check Properties: Ensure all custom props appear
  3. Test Error Paths: Trigger failures, confirm error events
  4. Validate Funnels: Complete user journey, check funnel steps

Local Development

```bash
# Use separate PostHog project for dev
NEXT_PUBLIC_POSTHOG_KEY=phc_dev_xxx
NEXT_PUBLIC_POSTHOG_HOST=https://app.posthog.com
```

Lessons Learned from Postmortems

  1. Telemetry is not optional: Projects without analytics cannot be improved systematically
  2. Implement during build: Adding events post-QA creates deployment blockers
  3. Track fallbacks aggressively: System health metrics prevent silent failures
  4. Don't over-instrument: Focus on North Star + critical funnels, not every click


This framework evolves as new analytics patterns emerge from product cycles.