Logging Infrastructure Testing Guide

This guide walks through end-to-end testing of the logging infrastructure implemented across Stamp (email validator) and Postchi (email sending platform).

Prerequisites

1. Redis Setup

# Start Redis (if not already running)
redis-server

# Verify connection
redis-cli -n 2 PING
# Expected output: PONG

2. Cloudflare R2 Bucket Setup

Create a dedicated R2 bucket for logs (separate from template assets):

  1. Log in to Cloudflare dashboard
  2. Navigate to R2 Object Storage
  3. Create new bucket: postchi-logs
  4. Generate API tokens with read/write access
  5. Note down:
    • Account ID
    • Access Key ID
    • Secret Access Key
    • Bucket Name

3. Environment Configuration

Stamp Validator (postchi-email-validator/.env)

Add to your existing .env file:

# Redis (for usage metering and hot logs)
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_PASSWORD=
REDIS_DB=2

# Cloudflare R2 (for log archiving)
R2_ACCOUNT_ID=your-account-id
R2_ACCESS_KEY_ID=your-access-key
R2_SECRET_ACCESS_KEY=your-secret-key
R2_BUCKET_NAME=postchi-logs

Postchi API (packages/api/.env)

Add to your existing .env file:

# Redis Logging Database
REDIS_DB_LOGGING=2

# R2 configuration (reuse from root .env or add here)
R2_ACCOUNT_ID=your-account-id
R2_ACCESS_KEY_ID=your-access-key
R2_SECRET_ACCESS_KEY=your-secret-key
R2_BUCKET_NAME=postchi-logs

Postchi Worker (packages/worker/.env)

Add to your existing .env file:

# Redis Logging Database
REDIS_DB_LOGGING=2

# R2 configuration
R2_ACCOUNT_ID=your-account-id
R2_ACCESS_KEY_ID=your-access-key
R2_SECRET_ACCESS_KEY=your-secret-key
R2_BUCKET_NAME=postchi-logs
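A quick preflight check can catch a missing variable before any service starts. This is a hypothetical helper, not part of either repo; the variable list mirrors the blocks above (REDIS_PASSWORD is omitted since it may legitimately be empty):

```shell
# preflight-env.sh (hypothetical): fail loudly if a logging variable is unset.
# Placeholder values are exported here only so the sketch runs standalone;
# in real use, drop these exports and source your actual .env first.
export REDIS_HOST=localhost REDIS_PORT=6379 REDIS_DB=2
export R2_ACCOUNT_ID=your-account-id R2_ACCESS_KEY_ID=your-access-key
export R2_SECRET_ACCESS_KEY=your-secret-key R2_BUCKET_NAME=postchi-logs

missing=0
for var in REDIS_HOST REDIS_PORT REDIS_DB \
           R2_ACCOUNT_ID R2_ACCESS_KEY_ID R2_SECRET_ACCESS_KEY R2_BUCKET_NAME; do
  if [ -z "$(printenv "$var")" ]; then
    echo "MISSING: $var"
    missing=1
  fi
done
[ "$missing" -eq 0 ] && echo "All logging env vars set"
```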

4. Install Dependencies

# In Stamp validator
cd /Users/amirandalibi/code/@studioaccolade/postchi-email-validator
npm install

# In Postchi shared package
cd /Users/amirandalibi/code/@studioaccolade/postchi/packages/shared
pnpm install
pnpm build

# In Postchi API
cd /Users/amirandalibi/code/@studioaccolade/postchi/packages/api
pnpm install

# In Postchi Worker
cd /Users/amirandalibi/code/@studioaccolade/postchi/packages/worker
pnpm install

# In Dashboard
cd /Users/amirandalibi/code/@studioaccolade/postchi/apps/dashboard
pnpm install

Test Scenarios

Test 1: Email Validation Logging (Stamp)

Step 1: Start Stamp Validator

cd /Users/amirandalibi/code/@studioaccolade/postchi-email-validator
npm run dev

Expected console output:

✅ Logging infrastructure initialized
Redis: localhost:6379 (db 2)
S3: postchi-logs
Server listening at http://0.0.0.0:4893

Step 2: Send Validation Request

curl -X POST http://localhost:4893/api/v1/validate \
  -H "Content-Type: application/json" \
  -d '{
    "email": "test@example.com",
    "options": {
      "checkFormat": true,
      "checkMx": true,
      "checkSmtp": false
    }
  }'

Expected response:

{
  "valid": false,
  "reason": "mx_not_found",
  "details": {
    "formatValid": true,
    "mxExists": false
  }
}

Step 3: Verify Redis Hot Log

# Check log was stored
redis-cli -n 2 ZRANGE logs:EMAIL_VALIDATION:demo-org 0 -1

# Check log count
redis-cli -n 2 ZCARD logs:EMAIL_VALIDATION:demo-org

# Expected output: 1

Step 4: Verify Archive Queue

# Check queue has pending log
redis-cli -n 2 LLEN archive:queue:EMAIL_VALIDATION

# Expected output: 1

# View queued log
redis-cli -n 2 LINDEX archive:queue:EMAIL_VALIDATION 0

Step 5: Verify Usage Counter

# Get today's date in YYYYMMDD format
TODAY=$(date +%Y%m%d)

# Check daily usage counter
redis-cli -n 2 GET "usage:EMAIL_VALIDATION:demo-org:daily:$TODAY"

# Expected output: 1

# Check monthly counter
MONTH=$(date +%Y%m)
redis-cli -n 2 GET "usage:EMAIL_VALIDATION:demo-org:monthly:$MONTH"

# Expected output: 1

Step 6: Check Stamp Logs

Look for log entry confirmation in Stamp console:

[INFO] Validation request completed {
  email: "test@example.com",
  valid: false,
  duration: 150
}

Test 2: Email Sending Logging (Postchi Worker)

Step 1: Start Postchi Services

# Terminal 1: Start API
cd /Users/amirandalibi/code/@studioaccolade/postchi/packages/api
pnpm dev

# Terminal 2: Start Worker
cd /Users/amirandalibi/code/@studioaccolade/postchi/packages/worker
pnpm dev

Expected worker console output:

✅ Logging infrastructure initialized
Redis: localhost:6379 (db 2)
S3: postchi-logs
📧 Email worker started
📦 Log archiver worker started

Step 2: Send Email via API

# Get auth token first (adjust based on your auth setup)
TOKEN="your-jwt-token"

# Send test email
curl -X POST http://localhost:3000/v1/emails/send \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $TOKEN" \
  -d '{
    "from": {
      "email": "test@postchi.io",
      "name": "Test Sender"
    },
    "to": ["recipient@example.com"],
    "subject": "Test Email for Logging",
    "html": "<p>This is a test email to verify logging works.</p>"
  }'

Step 3: Verify Redis Hot Log

# Check email sending log was stored
redis-cli -n 2 ZRANGE logs:EMAIL_SENDING:your-org-id 0 -1

# Check log count
redis-cli -n 2 ZCARD logs:EMAIL_SENDING:your-org-id

# Expected output: 1

Step 4: Verify Archive Queue

# Check queue has pending log
redis-cli -n 2 LLEN archive:queue:EMAIL_SENDING

# Expected output: 1

Step 5: Verify Usage Counter

TODAY=$(date +%Y%m%d)
MONTH=$(date +%Y%m)

# Check daily usage
redis-cli -n 2 GET "usage:EMAIL_SENDING:your-org-id:daily:$TODAY"

# Check monthly usage
redis-cli -n 2 GET "usage:EMAIL_SENDING:your-org-id:monthly:$MONTH"

Test 3: S3 Log Archiving (Worker)

The log archiver worker runs every minute. Wait 1-2 minutes after sending requests.

Step 1: Check Worker Logs

Look for archiver output in worker console:

[INFO] Log archiver job started
[INFO] Archived 2 EMAIL_VALIDATION logs to S3
[INFO] Archived 1 EMAIL_SENDING logs to S3
[INFO] Log archiver job completed

Step 2: Verify Redis Queue is Empty

# Queues should be empty after archiving
redis-cli -n 2 LLEN archive:queue:EMAIL_VALIDATION
redis-cli -n 2 LLEN archive:queue:EMAIL_SENDING

# Expected output: 0 for both

Step 3: Verify S3 Files

Using AWS CLI with R2 endpoint:

# Configure AWS CLI for R2
export AWS_ACCESS_KEY_ID="your-r2-access-key"
export AWS_SECRET_ACCESS_KEY="your-r2-secret-key"
export AWS_ENDPOINT_URL="https://your-account-id.r2.cloudflarestorage.com"

# List logs for today
TODAY_PATH=$(date +%Y/%m/%d)

# Validation logs
aws s3 ls s3://postchi-logs/logs/EMAIL_VALIDATION/demo-org/$TODAY_PATH/ \
--endpoint-url=$AWS_ENDPOINT_URL

# Sending logs
aws s3 ls s3://postchi-logs/logs/EMAIL_SENDING/your-org-id/$TODAY_PATH/ \
--endpoint-url=$AWS_ENDPOINT_URL

# Expected output: List of .json files with timestamps
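The listing commands above assume the archive key layout logs/&lt;product&gt;/&lt;orgId&gt;/&lt;YYYY&gt;/&lt;MM&gt;/&lt;DD&gt;/&lt;epochMs&gt;-&lt;id&gt;.json, matching the s3Key field shown in the log structure. A small sketch for assembling a day's prefix:

```shell
# Assemble the S3 prefix for one product/org/day (layout taken from s3Key)
PRODUCT=EMAIL_VALIDATION
ORG_ID=demo-org
DAY_PATH=$(date +%Y/%m/%d)
PREFIX="logs/$PRODUCT/$ORG_ID/$DAY_PATH/"
echo "$PREFIX"
# then e.g.: aws s3 ls "s3://postchi-logs/$PREFIX" --endpoint-url=$AWS_ENDPOINT_URL
```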

Step 4: Download and Verify Log File

# Get the most recent validation log file name from previous command
LATEST_FILE="1771669800000-a1b2c3d4.json"

# Download file
aws s3 cp \
s3://postchi-logs/logs/EMAIL_VALIDATION/demo-org/$TODAY_PATH/$LATEST_FILE \
/tmp/test-log.json \
--endpoint-url=$AWS_ENDPOINT_URL

# View contents (should be NDJSON format)
cat /tmp/test-log.json | jq .

Expected log structure:

{
  "id": "550e8400-e29b-41d4-a716-446655440000",
  "timestamp": "2026-02-21T10:30:00.000Z",
  "organizationId": "demo-org",
  "apiKeyId": "demo-key",
  "product": "EMAIL_VALIDATION",
  "email": "test@example.com",
  "options": {
    "checkFormat": true,
    "checkMx": true,
    "checkSmtp": false
  },
  "valid": false,
  "reason": "mx_not_found",
  "details": {
    "formatValid": true,
    "mxExists": false
  },
  "duration": 150,
  "s3Key": "logs/EMAIL_VALIDATION/demo-org/2026/02/21/1771669800000-a1b2c3d4.json"
}
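Archived files are NDJSON: one complete JSON object per line (jq pretty-prints each object in turn, which is why the single-object view above works). That also means plain line-oriented tools can count and filter records. A self-contained illustration with a synthetic two-record file:

```shell
# Build a synthetic two-record NDJSON file resembling an archive object
cat > /tmp/sample-ndjson.json << 'EOF'
{"id":"a1","product":"EMAIL_VALIDATION","valid":false}
{"id":"a2","product":"EMAIL_VALIDATION","valid":true}
EOF

# One JSON document per line, so wc -l counts records
RECORDS=$(wc -l < /tmp/sample-ndjson.json)
echo "records: $RECORDS"

# grep suffices for quick spot checks without a JSON parser
FAILED=$(grep -c '"valid":false' /tmp/sample-ndjson.json)
echo "failed validations: $FAILED"
```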

Test 4: API Endpoints

Step 1: Get Validation Logs

curl "http://localhost:3000/v1/logs/validation?limit=10&offset=0" \
  -H "Authorization: Bearer $TOKEN"

Expected response:

{
  "success": true,
  "data": [
    {
      "id": "550e8400-e29b-41d4-a716-446655440000",
      "timestamp": 1771669800000,
      "email": "test@example.com",
      "valid": false,
      "reason": "mx_not_found",
      "duration": 150
    }
  ],
  "pagination": {
    "total": 1,
    "limit": 10,
    "offset": 0,
    "hasMore": false
  },
  "meta": {
    "product": "EMAIL_VALIDATION"
  }
}

Step 2: Get Sending Logs

curl "http://localhost:3000/v1/logs/sending?limit=10&offset=0" \
  -H "Authorization: Bearer $TOKEN"

Step 3: Get Usage Statistics

# Single product usage
curl http://localhost:3000/v1/logs/usage?product=EMAIL_VALIDATION \
-H "Authorization: Bearer $TOKEN"

# Expected response:
{
  "success": true,
  "data": {
    "dailyUsage": 1,
    "monthlyUsage": 1,
    "product": "EMAIL_VALIDATION"
  }
}

# All products usage
curl http://localhost:3000/v1/logs/usage/all \
-H "Authorization: Bearer $TOKEN"

# Expected response:
{
  "success": true,
  "data": {
    "validation": {
      "dailyUsage": 1,
      "monthlyUsage": 1,
      "product": "EMAIL_VALIDATION"
    },
    "sending": {
      "dailyUsage": 1,
      "monthlyUsage": 1,
      "product": "EMAIL_SENDING"
    }
  }
}

Test 5: Dashboard Frontend

Step 1: Start Dashboard

cd /Users/amirandalibi/code/@studioaccolade/postchi/apps/dashboard
pnpm dev

Step 2: Navigate to Activity Logs

  1. Open browser: http://localhost:5173 (or your dev port)
  2. Log in with valid credentials
  3. Click "Activity" in sidebar navigation

Step 3: Verify Display

Expected UI elements:

  • ✅ Usage statistics cards showing daily/monthly counts for both products
  • ✅ Product filter dropdown (ALL, Email Validation, Email Sending)
  • ✅ Table with columns: Time, Type, Details, Status, Duration
  • ✅ Pagination controls (if more than 50 logs)
  • ✅ Refresh button

Step 4: Test Filtering

  1. Select "Email Validation" from dropdown
    • Should show only validation logs with Mail icon badge
  2. Select "Email Sending" from dropdown
    • Should show only sending logs with Send icon badge
  3. Select "All Activity"
    • Should show combined logs from both products, sorted by timestamp

Step 5: Test Pagination

  1. Send 60+ validation requests (to exceed the 50-log page size)
  2. Verify pagination controls appear
  3. Click "Next" - should load next page
  4. Click "Previous" - should return to first page

Step 6: Test Real-time Updates

  1. Keep Activity page open
  2. Send new validation request via curl
  3. Click "Refresh" button
  4. Verify new log appears at top of table
  5. Verify usage counters increment

Performance Testing

Load Test: 100 Concurrent Validation Requests

# Install Apache Bench (if not installed)
# macOS: brew install httpd
# Linux: apt-get install apache2-utils

# Create request.json payload first
cat > request.json << EOF
{"email":"test@example.com","options":{"checkFormat":true}}
EOF

# Run load test
ab -n 100 -c 10 -p request.json -T application/json \
  http://localhost:4893/api/v1/validate

After load test, verify:

# Check all 100 logs were stored
redis-cli -n 2 ZCARD logs:EMAIL_VALIDATION:demo-org

# Check all 100 were queued for archive
redis-cli -n 2 LLEN archive:queue:EMAIL_VALIDATION

# Check usage counter
TODAY=$(date +%Y%m%d)
redis-cli -n 2 GET "usage:EMAIL_VALIDATION:demo-org:daily:$TODAY"
# Expected: 100

Memory Test: Redis Storage

# Check Redis memory usage
redis-cli -n 2 INFO memory | grep used_memory_human

# Check largest keys
redis-cli -n 2 --bigkeys

# Verify TTL is set on log keys
redis-cli -n 2 TTL logs:EMAIL_VALIDATION:demo-org
# Expected: number around 604800 (7 days in seconds)

Troubleshooting

Issue: Logs not appearing in Redis

Symptoms:

redis-cli -n 2 ZCARD logs:EMAIL_VALIDATION:demo-org
# Output: (integer) 0

Debugging steps:

  1. Check Redis connection:
redis-cli -n 2 PING
  2. Check Stamp logs for errors:
# Look for "Failed to store validation log" errors
  3. Verify logging was initialized:
# Look for "✅ Logging infrastructure initialized" on startup
  4. Check config:
# Verify REDIS_DB=2 in .env
grep REDIS_DB /path/to/.env

Issue: Logs not archiving to S3

Symptoms:

  • Redis queue keeps growing
  • No files in S3 bucket

Debugging steps:

  1. Check worker is running:
ps aux | grep worker
  2. Check worker logs for errors:
# Look for "Failed to archive logs" or S3 connection errors
  3. Verify queue has items:
redis-cli -n 2 LLEN archive:queue:EMAIL_VALIDATION
  4. Test R2 credentials manually:
aws s3 ls s3://postchi-logs --endpoint-url=$AWS_ENDPOINT_URL
  5. Check if archiver job is scheduled:
# Worker logs should show:
# "Log archiver scheduled to run every 1 minute"

Issue: High Redis memory usage

Symptoms:

redis-cli -n 2 INFO memory | grep used_memory_human
# Output: 500MB+ for relatively few logs

Solutions:

  1. Check if TTL is working:
redis-cli -n 2 TTL logs:EMAIL_VALIDATION:demo-org
  2. Reduce HOT_LOGS_LIMIT in packages/shared/src/logging/log-writer.ts:
const HOT_LOGS_LIMIT = 500; // Reduced from 1000
  3. Reduce TTL (currently 7 days):
pipeline.expire(logsKey, 86400 * 3); // 3 days instead of 7
  4. Manually trim old logs:
# Remove logs older than timestamp
redis-cli -n 2 ZREMRANGEBYSCORE logs:EMAIL_VALIDATION:demo-org 0 1708444800000
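The ZREMRANGEBYSCORE cutoff is just an epoch in milliseconds. Since GNU and BSD date flags differ, a portable way to compute "now minus N days" is plain shell arithmetic:

```shell
# Compute a ms-epoch cutoff N days in the past for ZREMRANGEBYSCORE
RETENTION_DAYS=7
NOW_MS=$(( $(date +%s) * 1000 ))
CUTOFF_MS=$(( NOW_MS - RETENTION_DAYS * 86400 * 1000 ))
echo "cutoff: $CUTOFF_MS"
# then: redis-cli -n 2 ZREMRANGEBYSCORE logs:EMAIL_VALIDATION:demo-org 0 "$CUTOFF_MS"
```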

Issue: Usage counters showing 0

Symptoms:

redis-cli -n 2 GET "usage:EMAIL_VALIDATION:demo-org:daily:20260221"
# Output: (nil)

Possible causes:

  1. Date format mismatch - verify format:
# Should be YYYYMMDD
date +%Y%m%d
  2. Counter expired (daily: 2 days, monthly: 35 days)
  3. Organization ID mismatch - check logs for actual org ID used
  4. Check if increment is being called:
# Look in application logs for errors in incrementUsage()
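When chasing a (nil) counter, it can also help to print the exact keys this guide's scheme would produce for today and query those verbatim. A small sketch:

```shell
# Reconstruct today's counter keys (scheme: usage:<product>:<orgId>:<period>:<date>)
TODAY=$(date +%Y%m%d)   # daily keys expire after ~2 days
MONTH=$(date +%Y%m)     # monthly keys expire after ~35 days
DAILY_KEY="usage:EMAIL_VALIDATION:demo-org:daily:$TODAY"
MONTHLY_KEY="usage:EMAIL_VALIDATION:demo-org:monthly:$MONTH"
echo "$DAILY_KEY"
echo "$MONTHLY_KEY"
# then: redis-cli -n 2 GET "$DAILY_KEY"
```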

Issue: Dashboard not showing logs

Symptoms:

  • API returns logs successfully
  • Dashboard shows "No activity logs found"

Debugging steps:

  1. Check browser console for errors

  2. Verify API client is calling correct endpoint:

// Should be /v1/logs/validation or /v1/logs/sending
  1. Check authentication token is valid

  2. Verify organizationId from auth context matches logs in Redis

  3. Check network tab - verify API request/response

Success Criteria

After completing all tests, you should have:

  • Stamp validator logging validation requests to Redis and queue
  • Postchi worker logging email sends to Redis and queue
  • Background worker archiving logs to S3 every minute
  • API endpoints returning logs from Redis with correct pagination
  • Dashboard displaying logs with filtering and usage stats
  • Usage counters incrementing correctly per product
  • S3 storage containing NDJSON files organized by org/date
  • Redis memory staying under control with TTL and size limits
  • Performance: no noticeable impact on request latency

Next Steps

Once basic testing is complete:

  1. Test with real organizations: Replace demo-org with actual org IDs from auth
  2. Test quota enforcement: Implement usage limits per organization
  3. Test log retention: Verify old logs are cleaned up after TTL expires
  4. Test S3 lifecycle policies: Configure auto-deletion or move to Glacier
  5. Monitor production metrics:
    • Redis queue depth
    • S3 write failures
    • API response times
    • Memory usage trends

Automated Testing (Future)

Consider adding integration tests:

// packages/shared/src/logging/__tests__/integration.test.ts

describe('Logging Infrastructure', () => {
  it('should store validation log in Redis', async () => {
    const logEntry = createValidationLog();
    await storeHotLog(logEntry);
    const logs = await getHotLogs('demo-org', ProductType.EMAIL_VALIDATION, 10, 0);
    expect(logs).toHaveLength(1);
  });

  it('should queue log for archive', async () => {
    const logEntry = createValidationLog();
    await queueLogForArchive(logEntry);
    const queueSize = await redis.llen('archive:queue:EMAIL_VALIDATION');
    expect(queueSize).toBe(1);
  });

  it('should increment usage counter', async () => {
    await incrementUsage('demo-org', ProductType.EMAIL_VALIDATION);
    const usage = await getUsage('demo-org', ProductType.EMAIL_VALIDATION);
    expect(usage.dailyUsage).toBeGreaterThan(0);
  });
});

Monitoring Dashboard (Future Enhancement)

Create a dedicated monitoring page showing:

  • Current queue depths (validation, sending)
  • Redis memory usage
  • S3 write success/failure rates
  • Average log write latency
  • Top organizations by usage
  • Usage trending graphs

See logging-infrastructure.md for more details on Phase 2 enhancements.