Compare commits

7 Commits

SHA1 Message Date
8f1c0746a5 home page updated (ci / site and publish-image / publish checks cancelled) 2026-02-10 04:35:03 -05:00
c21614020a wcag and responsiveness (ci / site and publish-image / publish checks cancelled) 2026-02-10 03:22:22 -05:00
3b0b97f139 deploy without node (ci / site and publish-image / publish checks cancelled) 2026-02-10 02:52:14 -05:00
03df2b3a6c better cards 2026-02-10 02:34:25 -05:00
b63c62a732 better tracking 2026-02-10 01:52:41 -05:00
c1ab51a149 blog umami fix 2026-02-10 01:34:07 -05:00
f056e67eae better cache 2026-02-10 01:20:58 -05:00
110 changed files with 3328 additions and 268 deletions


@@ -7,3 +7,7 @@
**/dist
**/.DS_Store
# Local secrets
**/.env
**/.env.*
!**/.env.example

.github/workflows/publish-image.yml (vendored, new file, 68 lines)

@@ -0,0 +1,68 @@
name: publish-image
on:
  push:
    branches: ["main"]
  workflow_dispatch:
  schedule:
    # Rebuild periodically so content sources can be refreshed even without code changes.
    - cron: "0 9 * * *"
jobs:
  publish:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      packages: write
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: "24"
          cache: "npm"
          cache-dependency-path: site/package-lock.json
      - name: Install + Fetch Content + Build Site
        working-directory: site
        env:
          YOUTUBE_CHANNEL_ID: ${{ secrets.YOUTUBE_CHANNEL_ID }}
          YOUTUBE_API_KEY: ${{ secrets.YOUTUBE_API_KEY }}
          PODCAST_RSS_URL: ${{ secrets.PODCAST_RSS_URL }}
          WORDPRESS_BASE_URL: ${{ secrets.WORDPRESS_BASE_URL }}
          WORDPRESS_USERNAME: ${{ secrets.WORDPRESS_USERNAME }}
          WORDPRESS_APP_PASSWORD: ${{ secrets.WORDPRESS_APP_PASSWORD }}
          REDIS_URL: ${{ secrets.REDIS_URL }}
        run: |
          npm ci
          npm run fetch-content
          npm run build
      - uses: docker/login-action@v3
        with:
          registry: ghcr.io
          username: ${{ github.actor }}
          password: ${{ secrets.GITHUB_TOKEN }}
      - uses: docker/setup-buildx-action@v3
      - uses: docker/metadata-action@v5
        id: meta
        with:
          images: ghcr.io/${{ github.repository }}
          tags: |
            type=raw,value=latest
            type=sha,format=short,prefix=sha-
      - uses: docker/build-push-action@v6
        with:
          context: .
          file: Dockerfile
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          build-args: |
            BUILD_SHA=${{ github.sha }}
            BUILD_DATE=${{ github.run_started_at }}
            BUILD_REF=${{ github.server_url }}/${{ github.repository }}


@@ -3,15 +3,31 @@ FROM node:24-alpine AS builder
 WORKDIR /app/site
 COPY site/package.json site/package-lock.json ./
-RUN npm ci
+RUN npm ci --no-audit --no-fund
 COPY site/ ./
+# Content is fetched before build (typically in CI) and committed into the build context at
+# `site/content/cache/content.json`. If env vars aren't configured, the fetch step gracefully
+# skips sources and/or uses last-known-good cache.
 RUN npm run build
 FROM nginx:1.27-alpine
+ARG BUILD_SHA=unknown
+ARG BUILD_DATE=unknown
+ARG BUILD_REF=unknown
+LABEL org.opencontainers.image.title="fast-website"
+LABEL org.opencontainers.image.description="Lightweight, SEO-first static site packaged as an nginx image."
+LABEL org.opencontainers.image.revision=$BUILD_SHA
+LABEL org.opencontainers.image.created=$BUILD_DATE
+LABEL org.opencontainers.image.source=$BUILD_REF
 COPY deploy/nginx.conf /etc/nginx/conf.d/default.conf
 COPY --from=builder /app/site/dist/ /usr/share/nginx/html/
-EXPOSE 80
+# Operator-friendly version visibility.
+RUN printf '{\n  "sha": "%s",\n  "builtAt": "%s",\n  "ref": "%s"\n}\n' "$BUILD_SHA" "$BUILD_DATE" "$BUILD_REF" \
+  > /usr/share/nginx/html/build.json
 EXPOSE 80


@@ -86,7 +86,9 @@ Instrumentation checklist:
 ## Deployment (Linode + Docker)
-Build and run:
+The production host is intentionally minimal and only needs Docker (no Node.js on the server).
+### Local Docker
 ```bash
 docker compose build
@@ -95,24 +97,29 @@ docker compose up -d
 The container serves the static output on port `8080` (map or proxy as needed).
-### Refreshing Content (Manual)
+### Production (Docker-Only Host)
-Content is fetched at build time into `site/content/cache/content.json`.
+In production, CI builds and publishes a Docker image (nginx serving the static output). The server updates by pulling that image and restarting the service.
-On the Linode host:
+Runbook: `deploy/runbook.md`.
+### Refreshing Content (Manual, Docker-Only)
+Content is fetched at build time into `site/content/cache/content.json` (typically in CI), then packaged into the image.
+On the server host:
 ```bash
 ./scripts/refresh.sh
 ```
 This:
-1. Runs `npm run fetch-content` in `site/`
-2. Rebuilds the Docker image
-3. Restarts the container
+1. Pulls the latest published image
+2. Restarts the service (no build on the host)
 ### Refreshing Content (Scheduled)
 Install a daily cron using `deploy/cron.example` as a starting point.
 Rollback:
-- Re-run `docker compose up -d` with a previously built image/tag, or restore the last known-good repo state and rerun `scripts/refresh.sh`.
+- Re-deploy a known-good image tag/digest (see `deploy/runbook.md`).


@@ -0,0 +1,6 @@
services:
  web:
    image: ${WEB_IMAGE:?Set WEB_IMAGE to the published image tag or digest}
    ports:
      - "8080:80"

deploy/runbook.md (new file, 84 lines)

@@ -0,0 +1,84 @@
## Deploy Runbook (Docker-Only Host)
This runbook is for a minimal production host where **Docker is installed** and **Node.js is not**.
The deployment model is:
- CI builds and publishes a Docker image containing the built static site
- The server updates by pulling that image and restarting the service
### Prerequisites
- Docker + Docker Compose plugin available on the host
- Registry access (e.g., logged in to GHCR if the image is private)
- A `WEB_IMAGE` value pointing at the image to deploy (tag or digest)
Example:
```bash
export WEB_IMAGE=ghcr.io/<owner>/<repo>:latest
```
### First-Time Start
```bash
docker compose -f deploy/docker-compose.prod.yml up -d
```
### Refresh (Pull + Restart)
Pull first (safe; does not affect the running container):
```bash
docker compose -f deploy/docker-compose.prod.yml pull
```
Then restart the service on the newly pulled image:
```bash
docker compose -f deploy/docker-compose.prod.yml up -d --no-build
```
### Verify Deployed Version
1. Check the container's image reference (tag/digest):
```bash
docker compose -f deploy/docker-compose.prod.yml ps
docker inspect --format '{{.Image}} {{.Config.Image}}' <container-id>
```
2. Check build metadata served by the site:
```bash
curl -fsS http://localhost:8080/build.json
```
### Rollback
Re-deploy a known-good version by pinning a previous tag or digest:
```bash
export WEB_IMAGE=ghcr.io/<owner>/<repo>:<known-good-tag>
docker compose -f deploy/docker-compose.prod.yml up -d --no-build
```
Recommended: record the image digest for each release (`docker inspect <image> --format '{{index .RepoDigests 0}}'` prints the pullable repo digest; `{{.Id}}` is only the local image ID), and use a digest pin for true immutability.
### Failure Mode Validation (Pull Failure)
If `docker compose pull` fails, **do not run** the restart step. The running site will continue serving the existing container.
To simulate a pull failure safely:
```bash
export WEB_IMAGE=ghcr.io/<owner>/<repo>:this-tag-does-not-exist
docker compose -f deploy/docker-compose.prod.yml pull
```
The pull should fail, but the current service should still be running:
```bash
docker compose -f deploy/docker-compose.prod.yml ps
curl -fsS http://localhost:8080/ > /dev/null
```


@@ -1,8 +1,19 @@
services:
  web:
    image: ${WEB_IMAGE:-fast-website:local}
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      - "8080:80"
  redis:
    image: redis:7-alpine
    ports:
      # Use 6380 to avoid colliding with any locally installed Redis on 6379.
      - "6380:6379"
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 3s
      retries: 20


@@ -0,0 +1,48 @@
## Context
The site is an Astro static build served via nginx. Content is gathered by build-time ingestion (`site/scripts/fetch-content.ts`) that reads/writes a repo-local cache file (`site/content/cache/content.json`).
Today, repeated ingestion runs can re-hit external sources (YouTube API/RSS, podcast RSS, WordPress `wp-json`) and re-do normalization work. We want a shared caching layer to reduce IO and network load and to make repeated runs faster and more predictable.
## Goals / Non-Goals
**Goals:**
- Add a Redis-backed cache layer usable from Node scripts (ingestion) with TTL-based invalidation.
- Use the cache layer to reduce repeated network/API calls and parsing work for:
- social content ingestion (YouTube/podcast/Instagram list)
- WordPress `wp-json` ingestion
- Provide a default “industry standard” TTL with environment override.
- Add a manual cache clear command/script.
- Provide verification (tests and/or logs) that cache hits occur and TTL expiration behaves as expected.
**Non-Goals:**
- Adding a runtime server for the site (the site remains static HTML served by nginx).
- Caching browser requests to nginx (no CDN/edge cache configuration in this change).
- Perfect cache coherence across multiple machines/environments (dev+docker is the target).
## Decisions
- **Decision: Use Redis as the shared cache backend (docker-compose service).**
- Rationale: Redis is widely adopted, lightweight, supports TTLs natively, and is easy to run in dev via Docker.
- Alternative considered: Local file-based cache only. Rejected because it doesn't provide a shared service and is harder to invalidate consistently.
- **Decision: Cache at the “source fetch” and “normalized dataset” boundaries.**
- Rationale: The biggest cost is network + parsing/normalization. Caching raw API responses (or normalized outputs) by source+params gives the best win.
- Approach:
- Cache keys like `youtube:api:<channelId>:<limit>`, `podcast:rss:<url>`, `wp:posts`, `wp:pages`, `wp:categories`.
- Store JSON values, set TTL, and log hit/miss per key.
- **Decision: Default TTL = 1 hour (3600s), configurable via env.**
- Rationale: A 1h TTL is a common baseline for content freshness vs load. It also aligns with typical ingestion schedules (hourly/daily).
- Allow overrides for local testing and production tuning.
- **Decision: Cache clear script uses Redis `FLUSHDB` in the configured Redis database.**
- Rationale: Simple manual operation and easy to verify.
- Guardrail: Use a dedicated Redis DB index (e.g., `0` by default) so the script is scoped.
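The cache-boundary, key, and TTL decisions above can be sketched as a small wrapper. This is an illustrative, in-memory sketch of the intended interface (namespaced keys, JSON values with TTL, per-key hit/miss logging), not the repo's actual client: `cacheKey` and `CacheClient` are hypothetical names, and the real wrapper would back onto Redis and fall back to no-cache mode when Redis is unreachable.

```typescript
type Json = unknown;

// Build namespaced keys such as "youtube:api:<channelId>:<limit>" so that
// different sources/parameters can never collide.
function cacheKey(namespace: string, ...parts: Array<string | number>): string {
  return [namespace, ...parts.map(String)].join(":");
}

class CacheClient {
  private store = new Map<string, { value: string; expiresAt: number }>();

  // Default TTL of 3600s mirrors the 1h decision; in the real wrapper this
  // would come from an env override rather than a constructor argument.
  constructor(private defaultTtlSeconds = 3600) {}

  set(key: string, value: Json, ttlSeconds = this.defaultTtlSeconds): void {
    this.store.set(key, {
      value: JSON.stringify(value),
      expiresAt: Date.now() + ttlSeconds * 1000,
    });
  }

  get<T = Json>(key: string): T | null {
    const entry = this.store.get(key);
    if (!entry || entry.expiresAt <= Date.now()) {
      console.log(`cache miss: ${key}`); // logged to support verification
      return null;
    }
    console.log(`cache hit: ${key}`);
    return JSON.parse(entry.value) as T;
  }
}
```

Storing serialized JSON (rather than live objects) keeps the sketch faithful to Redis semantics: every read pays the parse cost, and values cannot be mutated in place.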
## Risks / Trade-offs
- [Risk] Redis introduces a new dependency and operational moving part. -> Mitigation: Keep Redis optional; ingestion should fall back to no-cache mode if Redis is not reachable.
- [Risk] Stale content if TTL too long. -> Mitigation: Default to 1h and allow env override; provide manual clear command.
- [Risk] Cache key mistakes lead to wrong content reuse. -> Mitigation: Centralize key generation and add tests for key uniqueness and TTL behavior.


@@ -0,0 +1,28 @@
## Why
Reduce IO and external fetch load by adding a shared caching layer so repeated requests for the same content do not re-hit disk/network unnecessarily.
## What Changes
- Add a caching layer (Redis or similar lightweight cache) used by the site's data/ingestion flows.
- Add a cache service to `docker-compose.yml`.
- Define an industry-standard cache invalidation interval (TTL) with a sensible default and allow it to be configured via environment variables.
- Add a script/command to manually clear the cache on demand.
- Add verification that the cache is working (cache hits/misses and TTL behavior).
## Capabilities
### New Capabilities
- `cache-layer`: Provide a shared caching service (Redis or equivalent) with TTL-based invalidation and a manual clear operation for the website's data flows.
### Modified Capabilities
- `social-content-aggregation`: Use the cache layer to avoid re-fetching or re-processing external content sources on repeated runs/requests.
- `wordpress-content-source`: Use the cache layer to reduce repeated `wp-json` fetches and parsing work.
## Impact
- Deployment/local dev: add Redis (or equivalent) to `docker-compose.yml` and wire environment/config for connection + TTL.
- Scripts/services: update ingestion/build-time fetch to read/write via cache and log hit/miss for verification.
- Tooling: add a cache-clear script/command (and document usage).
- Testing: add tests or a lightweight verification step proving cached reads are used and expire as expected.


@@ -0,0 +1,38 @@
## ADDED Requirements
### Requirement: Redis-backed cache service
The system MUST provide a Redis-backed cache service for use by ingestion and content processing flows.
The cache service MUST be runnable in local development via Docker Compose.
#### Scenario: Cache service available in Docker
- **WHEN** the Docker Compose stack is started
- **THEN** a Redis service is available to other services/scripts on the internal network
### Requirement: TTL-based invalidation
Cached entries MUST support TTL-based invalidation.
The system MUST define a default TTL and MUST allow overriding the TTL via environment/config.
#### Scenario: Default TTL applies
- **WHEN** a cached entry is written without an explicit TTL override
- **THEN** it expires after the configured default TTL
#### Scenario: TTL override applies
- **WHEN** a TTL override is configured via environment/config
- **THEN** new cached entries use that TTL for expiration
### Requirement: Cache key namespace
Cache keys MUST be namespaced by source and parameters so that different data requests do not collide.
#### Scenario: Two different sources do not collide
- **WHEN** the system caches a YouTube fetch and a WordPress fetch
- **THEN** they use different key namespaces and do not overwrite each other
### Requirement: Manual cache clear
The system MUST provide a script/command to manually clear the cache.
#### Scenario: Manual clear executed
- **WHEN** a developer runs the cache clear command
- **THEN** the cache is cleared and subsequent ingestion runs produce cache misses


@@ -0,0 +1,23 @@
## MODIFIED Requirements
### Requirement: Refresh and caching
The system MUST cache the latest successful ingestion output and MUST serve the cached data to the site renderer.
The system MUST support periodic refresh on a schedule (at minimum daily) and MUST support a manual refresh trigger.
On ingestion failure, the system MUST continue serving the most recent cached data.
The ingestion pipeline MUST use the cache layer (when configured and reachable) to reduce repeated network and parsing work for external sources (for example, YouTube API/RSS and podcast RSS).
#### Scenario: Scheduled refresh fails
- **WHEN** a scheduled refresh run fails to fetch one or more sources
- **THEN** the site continues to use the most recent successfully cached dataset
#### Scenario: Manual refresh requested
- **WHEN** a manual refresh is triggered
- **THEN** the system attempts ingestion immediately and updates the cache if ingestion succeeds
#### Scenario: Cache hit avoids refetch
- **WHEN** a refresh run is executed within the cache TTL for a given source+parameters
- **THEN** the ingestion pipeline uses cached data for that source instead of refetching over the network


@@ -0,0 +1,19 @@
## MODIFIED Requirements
### Requirement: Build-time caching
WordPress posts, pages, and categories MUST be written into the repo-local content cache used by the site build.
If the WordPress fetch fails, the system MUST NOT crash the entire build pipeline; it MUST either:
- keep the last-known-good cached WordPress content (if present), or
- store an empty WordPress dataset and allow the rest of the site to build.
When the cache layer is configured and reachable, the WordPress ingestion MUST cache `wp-json` responses (or normalized outputs) using a TTL so repeated ingestion runs avoid unnecessary network requests and parsing work.
#### Scenario: WordPress fetch fails
- **WHEN** a WordPress API request fails
- **THEN** the site build can still complete and the blog surface renders a graceful empty state
#### Scenario: Cache hit avoids wp-json refetch
- **WHEN** WordPress ingestion is executed within the configured cache TTL
- **THEN** it uses cached data instead of refetching from `wp-json`


@@ -0,0 +1,26 @@
## 1. Cache Service And Config
- [x] 1.1 Add Redis service to `docker-compose.yml` and wire basic health/ports for local dev
- [x] 1.2 Add cache env/config variables (Redis URL/host+port, DB index, default TTL seconds) and document in `site/.env.example`
## 2. Cache Client And Utilities
- [x] 2.1 Add a small Redis cache client wrapper (get/set JSON with TTL, namespaced keys) for Node scripts
- [x] 2.2 Add logging for cache hit/miss per key to support verification
- [x] 2.3 Ensure caching is optional: if Redis is unreachable, ingestion proceeds without caching
## 3. Integrate With Ingestion
- [x] 3.1 Cache YouTube fetches (API and/or RSS) by source+params and reuse within TTL
- [x] 3.2 Cache podcast RSS fetch by URL and reuse within TTL
- [x] 3.3 Cache WordPress `wp-json` fetches (posts/pages/categories) and reuse within TTL
## 4. Cache Invalidation
- [x] 4.1 Add a command/script to manually clear the cache (scoped to configured Redis DB)
- [x] 4.2 Document the cache clear command usage
## 5. Verification
- [x] 5.1 Add a test that exercises the cache wrapper (set/get JSON + TTL expiration behavior)
- [x] 5.2 Add a test or build verification that a second ingestion run within TTL produces cache hits


@@ -0,0 +1,56 @@
## Context
The site uses Umami custom events via data attributes on clickables (e.g., navigation, CTAs, outbound links). Today, most tracked links include stable identifiers like `target_id`, `placement`, and (for links) `target_url`.
This is sufficient to measure *where* users clicked, but it is limited for content discovery because it does not capture content metadata (e.g., which specific video/post title was clicked). Umami supports adding additional event data via `data-umami-event-*` attributes, which are recorded as strings.
## Goals / Non-Goals
**Goals:**
- Add content metadata fields to Umami click tracking for content-related links:
- `title` (human-readable title)
- `type` (content type)
- Apply consistently across content surfaces (videos, podcast, blog).
- Keep existing taxonomy constraints intact:
- stable deterministic `target_id`
- `placement`
- `target_url` for links
- Avoid tracking PII.
**Non-Goals:**
- Introducing JavaScript-based `window.umami.track` calls (continue using Umami data-attribute tracking).
- Tracking clicks inside arbitrary WordPress-rendered HTML bodies (future enhancement if needed).
- Changing Umami initialization or environment configuration.
## Decisions
- **Decision: Use Option 1 (separate `title` and `type` fields).**
- Rationale: Makes reporting and filtering easier (segment by `type`, then list top `title`). Avoids parsing concatenated strings in analytics.
- Alternative: Option 2 (single `title` field formatted as `[type]-[title]`). Rejected for reduced queryability.
- **Decision: Only apply `title`/`type` to content-related links (not all links).**
- Rationale: Many links do not map cleanly to a single content item (e.g., category nav, pagination, generic navigation).
- **Decision: Normalize type values.**
- Rationale: Stable `type` values enable dashboards to be reused over time.
- Proposed set (from specs): `video`, `podcast_episode`, `blog_post`, `blog_page`.
- **Decision: Prefer shared components to propagate tracking fields.**
- Rationale: Centralize logic and reduce missed clickables.
- Approach:
- Extend existing link/card components (where applicable) to accept optional `umamiTitle` and `umamiType` props.
- For pages that render raw `<a>` tags directly, add attributes inline.
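The shared-component decision above might look like the following helper. This is a sketch only: `umamiContentAttrs` is a hypothetical name, not the repo's actual component API; the attribute names and allowed `type` values come from this document's taxonomy.

```typescript
type ContentType = "video" | "podcast_episode" | "blog_post" | "blog_page";

// Assemble the full data-attribute set for a content-related link, keeping
// the existing deterministic identifiers alongside the new metadata.
function umamiContentAttrs(opts: {
  targetId: string;
  placement: string;
  targetUrl: string;
  title: string;
  type: ContentType;
}): Record<string, string> {
  return {
    "data-umami-event": "click",
    "data-umami-event-target_id": opts.targetId,
    "data-umami-event-placement": opts.placement,
    "data-umami-event-target_url": opts.targetUrl,
    // Truncate long titles at instrumentation time (a mitigation noted under Risks).
    "data-umami-event-title": opts.title.slice(0, 160),
    "data-umami-event-type": opts.type,
  };
}
```

A card component could spread these attributes onto its `<a>` element, so a single code path guarantees that `title`/`type` are always emitted together with `target_id` and `placement`.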
## Risks / Trade-offs
- [Risk] Title values can change over time (content edits) which may reduce longitudinal stability.
- Mitigation: Keep `target_id` deterministic and stable; use `title` for reporting convenience only.
- [Risk] Very long titles.
- Mitigation: Truncate `title` values to a reasonable length (e.g., 120-160 chars) at instrumentation time if needed.
- [Risk] Inconsistent application across surfaces.
- Mitigation: Add tests that assert content clickables include `data-umami-event-title` and `data-umami-event-type` where applicable.


@@ -0,0 +1,28 @@
## Why
Umami click tracking is currently limited to `target_id`/`placement`, which makes it harder to understand *which* specific content items (by title/type) users engage with most. Adding lightweight content metadata to click events enables clearer measurement and reporting.
## What Changes
- Extend Umami click event instrumentation so content-related links include additional event data:
- `data-umami-event-title`: the content title (e.g., post/video/episode/page title)
- `data-umami-event-type`: the content type (e.g., `blog_post`, `blog_page`, `video`, `podcast_episode`)
- Apply the above consistently across all instrumented content links (cards, lists, navigation items that represent a specific piece of content).
- Ensure the metadata is additive and does not replace the existing deterministic identifiers:
- keep `data-umami-event-target_id`
- keep `data-umami-event-placement`
- keep `data-umami-event-target_url` for links
## Capabilities
### New Capabilities
- (none)
### Modified Capabilities
- `interaction-tracking-taxonomy`: add/standardize optional content metadata fields (`title`, `type`) for tracked click events, and define allowed values for `type`.
- `analytics-umami`: require Umami Track Events data-attribute instrumentation to support the above additional `data-umami-event-*` properties on content-related clickables.
## Impact
- Affected code: shared link/card components and content listing/detail pages (videos, podcast, blog posts/pages, and any other instrumented content surfaces).
- Data: Umami event payloads will include two additional string fields for content links; dashboards/reports can segment by `type` and view top-clicked items by `title`.


@@ -0,0 +1,36 @@
## MODIFIED Requirements
### Requirement: Custom event tracking
When Umami is enabled, the site MUST support custom event emission for:
- `cta_click`
- `outbound_click`
- a general click interaction event for all instrumented clickable items (per the site tracking taxonomy)
Each emitted event MUST include enough properties to segment reports by platform and placement when applicable.
All tracked clickable items MUST emit events with a unique, consistent set of data elements as defined by the site tracking taxonomy, including at minimum `target_id` and `placement`.
The site MUST instrument tracked clickables using Umami’s supported Track Events data-attribute method:
- `data-umami-event="<event-name>"`
- optional event data using `data-umami-event-*`
For content-related links (clickables representing a specific piece of content), the site MUST also provide the following Umami event data attributes:
- `data-umami-event-title`
- `data-umami-event-type`
#### Scenario: Emit outbound click event
- **WHEN** a user clicks a non-CTA outbound link from the homepage
- **THEN** the system emits an `outbound_click` event with a property identifying the destination domain
#### Scenario: Emit general click event for any clickable
- **WHEN** a user clicks an instrumented navigation link
- **THEN** the system emits a click interaction event with `target_id` and `placement`
#### Scenario: Content click includes title and type
- **WHEN** a user clicks an instrumented content link (video, podcast episode, blog post/page)
- **THEN** the emitted Umami event includes `title` and `type` properties via `data-umami-event-*` attributes
#### Scenario: Uninstrumented clicks do not break the page
- **WHEN** a user clicks an element with no tracking metadata
- **THEN** the system does not throw and navigation/interaction proceeds normally


@@ -0,0 +1,28 @@
## MODIFIED Requirements
### Requirement: Minimum required properties
Every tracked click event MUST include, at minimum:
- `target_id`
- `placement`
For links, the event MUST also include:
- `target_url` (or a stable target identifier that can be mapped to a URL)
For content-related links (clickables representing a specific piece of content), the event MUST also include:
- `title` (human-readable content title)
- `type` (content type identifier)
The `type` value MUST be one of:
- `video`
- `podcast_episode`
- `blog_post`
- `blog_page`
#### Scenario: Tracking a content card click
- **WHEN** a user clicks a content card link
- **THEN** the emitted event includes `target_id`, `placement`, and `target_url`
#### Scenario: Tracking a content link includes title and type
- **WHEN** a user clicks a content-related link that represents a specific content item
- **THEN** the emitted event includes `target_id`, `placement`, `target_url`, `title`, and `type`


@@ -0,0 +1,15 @@
## 1. Update Tracking Taxonomy
- [x] 1.1 Update shared Umami instrumentation patterns to support optional `title` and `type` event data for content links (without breaking existing events)
- [x] 1.2 Ensure content `type` values are normalized (`video`, `podcast_episode`, `blog_post`, `blog_page`) and do not include PII
## 2. Instrument Content Surfaces
- [x] 2.1 Add `data-umami-event-title` and `data-umami-event-type` to video clickables (listing cards and detail navigation where applicable)
- [x] 2.2 Add `data-umami-event-title` and `data-umami-event-type` to podcast clickables (listing cards and episode links)
- [x] 2.3 Add `data-umami-event-title` and `data-umami-event-type` to blog clickables that represent specific content items (post cards, pages list links)
## 3. Verify
- [x] 3.1 Add/update tests to assert content clickables include `data-umami-event-title` and `data-umami-event-type` where required
- [x] 3.2 Build the site and confirm representative pages render the new data attributes (videos listing, podcast listing, blog listing)


@@ -0,0 +1,2 @@
schema: spec-driven
created: 2026-02-10


@@ -0,0 +1,44 @@
## Context
The site uses Umami for analytics. Most site clickables are instrumented using Umami's data-attribute method (`data-umami-event` and optional `data-umami-event-*` properties) so events are recorded automatically on click.
The Blog section was added recently and its clickables (post cards, category nav, page links) are not consistently emitting Umami events. This creates a measurement blind spot for the `/blog` surface.
## Goals / Non-Goals
**Goals:**
- Ensure all blog clickables emit Umami events using the documented data-attribute method.
- Ensure every tracked clickable has a deterministic, unique `target_id` and includes at minimum `placement` and `target_url` per taxonomy.
- Keep event names within Umami limits (<= 50 chars) and avoid sending event data without an event name.
- Add tests to prevent regressions (blog pages/components should contain required Umami attributes).
**Non-Goals:**
- Introducing custom JavaScript tracking (`window.umami.track`) for v1; we will use Umami's data-attribute method.
- Adding new analytics providers or changing Umami initialization.
- Tracking PII or user-generated content in event properties.
## Decisions
- **Decision: Use Umami-native data attributes on every blog clickable.**
- Rationale: Aligns with Umami's “Track events” docs and the rest of the site's tracking approach; avoids adding JS listeners that can interfere with other handlers.
- **Decision: Use consistent event names by clickable type.**
- Rationale: Keeps reporting clean while still allowing segmentation via event properties.
- Proposed:
- `click` for internal navigation links (including blog category navigation)
- `outbound_click` for external links (if any in blog chrome)
- **Decision: Add a deterministic `target_id` namespace for blog elements.**
- Rationale: Blog has many repeated elements; we need unique IDs that remain stable across builds.
- Proposed conventions:
- Blog header link: `nav.blog`
- Blog secondary nav: `blog.subnav.all`, `blog.subnav.pages`, `blog.subnav.category.<slug>`
- Blog post card: `blog.card.post.<slug>` (placement `blog.index` or `blog.category.<slug>`)
- Blog post detail back link: `blog.post.back`
- Blog page list links: `blog.pages.link.<slug>`
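The naming conventions above can be expressed as small deterministic builders; the helper names here are illustrative, not the repo's actual API.

```typescript
// Deterministic target_id builders for the proposed blog namespaces. Because
// IDs derive only from stable slugs, the same element gets the same target_id
// on every build, keeping analytics segments comparable over time.
const blogTargetIds = {
  headerLink: () => "nav.blog",
  subnavAll: () => "blog.subnav.all",
  subnavPages: () => "blog.subnav.pages",
  subnavCategory: (slug: string) => `blog.subnav.category.${slug}`,
  postCard: (slug: string) => `blog.card.post.${slug}`,
  postBack: () => "blog.post.back",
  pageLink: (slug: string) => `blog.pages.link.${slug}`,
};
```

Centralizing the builders in one module also makes it easy to test that no two blog clickables can produce the same `target_id`.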
## Risks / Trade-offs
- [Risk] Some blog content areas render raw HTML from WordPress; links inside content are not instrumented. -> Mitigation: Track the blog chrome (cards/nav/back links) first; consider JS-based delegated tracking for content-body links in a future change if needed.
- [Risk] Over-instrumentation adds noisy events. -> Mitigation: Keep event names simple, rely on `target_id` + `placement` for segmentation, and avoid tracking non-clickable elements.


@@ -0,0 +1,26 @@
## Why
The Blog section's click tracking is not firing reliably in Umami, which prevents measuring what users do in `/blog` and where they go next.
## What Changes
- Update the Blog section UI so every clickable element uses Umami's data-attribute event tracking format:
- `data-umami-event="<event-name>"`
- `data-umami-event-*` attributes for event data
- Ensure every tracked clickable item has a unique, deterministic set of event data elements (especially `target_id`, `placement`, `target_url`) so clicks can be measured independently.
- Add verification/tests to ensure Blog clickables are instrumented and follow the same taxonomy as the rest of the site.
## Capabilities
### New Capabilities
- (none)
### Modified Capabilities
- `blog-section-surface`: instrument blog clickables (post cards, post/page links, category secondary nav, blog header link) using Umami `data-umami-event` attributes.
- `interaction-tracking-taxonomy`: extend/clarify tracking rules to cover blog-specific UI elements and namespaces for `target_id`.
- `analytics-umami`: ensure the implementation adheres to Umami's Track Events specification for data attributes.
## Impact
- Affected UI/components: blog pages and components under `site/src/pages/blog/` and `site/src/components/` (cards and secondary nav), plus any shared navigation link to `/blog`.
- Testing: add/update tests to assert required Umami data attributes exist and are unique per clickable element in blog surfaces.


@@ -0,0 +1,28 @@
## MODIFIED Requirements
### Requirement: Custom event tracking
When Umami is enabled, the site MUST support custom event emission for:
- `cta_click`
- `outbound_click`
- a general click interaction event for all instrumented clickable items (per the site tracking taxonomy)
Each emitted event MUST include enough properties to segment reports by platform and placement when applicable.
All tracked clickable items MUST emit events with a unique, consistent set of data elements as defined by the site tracking taxonomy, including at minimum `target_id` and `placement`.
The site MUST instrument tracked clickables using Umami's supported Track Events data-attribute method:
- `data-umami-event="<event-name>"`
- optional event data using `data-umami-event-*`
#### Scenario: Emit outbound click event
- **WHEN** a user clicks a non-CTA outbound link from the homepage
- **THEN** the system emits an `outbound_click` event with a property identifying the destination domain
#### Scenario: Emit general click event for any clickable
- **WHEN** a user clicks an instrumented navigation link
- **THEN** the system emits a click interaction event with `target_id` and `placement`
#### Scenario: Uninstrumented clicks do not break the page
- **WHEN** a user clicks an element with no tracking metadata
- **THEN** the system does not throw and navigation/interaction proceeds normally


@@ -0,0 +1,47 @@
## MODIFIED Requirements
### Requirement: Blog index listing (posts)
The site MUST provide a blog index page at `/blog` that lists WordPress posts as cards containing:
- featured image (when available)
- title
- excerpt/summary
The listing MUST be ordered by publish date descending (newest first).
Each post card MUST be instrumented with Umami Track Events data attributes and MUST include at minimum:
- `data-umami-event`
- `data-umami-event-target_id`
- `data-umami-event-placement`
- `data-umami-event-target_url`
#### Scenario: Blog index lists posts
- **WHEN** the cached WordPress dataset contains posts
- **THEN** `/blog` renders a list of post cards ordered by publish date descending
#### Scenario: Blog post card click is tracked
- **WHEN** a user clicks a blog post card on `/blog`
- **THEN** the click emits an Umami event with `target_id`, `placement`, and `target_url`
### Requirement: Category-based secondary navigation
The blog section MUST render a secondary navigation under the header derived from the cached WordPress categories.
Selecting a category MUST navigate to a category listing page showing only posts in that category.
Each secondary navigation link MUST be instrumented with Umami Track Events data attributes and MUST include at minimum:
- `data-umami-event`
- `data-umami-event-target_id`
- `data-umami-event-placement`
- `data-umami-event-target_url`
#### Scenario: Category nav present
- **WHEN** the cached WordPress dataset contains categories
- **THEN** the blog section shows a secondary navigation with those categories
#### Scenario: Category listing filters posts
- **WHEN** a user navigates to a category listing page
- **THEN** only posts assigned to that category are listed
#### Scenario: Category nav click is tracked
- **WHEN** a user clicks a category link in the blog secondary navigation
- **THEN** the click emits an Umami event with `target_id`, `placement`, and `target_url`

View File

@@ -0,0 +1,17 @@
## MODIFIED Requirements
### Requirement: Unique identifier for every clickable item
Every clickable item that is tracked MUST have a stable identifier (`target_id`) that is unique across the site (or unique within a documented namespace).
The identifier MUST be deterministic across builds for the same element and placement.
The taxonomy MUST define namespaces for repeated UI surfaces. For the blog surface, the following namespaces MUST be used:
- `blog.subnav.*` for secondary navigation links
- `blog.card.post.<slug>` for blog post cards
- `blog.pages.link.<slug>` for blog page listing links
- `blog.post.*` / `blog.page.*` for detail page chrome links (e.g., back links)
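One way to keep these identifiers deterministic across builds is a tiny builder keyed by namespace; a sketch (the function name and the slug normalization rules are assumptions beyond what the taxonomy specifies):

```typescript
// Deterministic target_id builder for the blog namespaces above.
// The same (surface, slug) input always yields the same id across builds.
function blogTargetId(
  surface: "subnav" | "card.post" | "pages.link",
  slug: string,
): string {
  // Normalize the slug so ids stay stable and URL-safe.
  const safe = slug
    .toLowerCase()
    .replace(/[^a-z0-9-]+/g, "-")
    .replace(/^-+|-+$/g, "");
  return `blog.${surface}.${safe}`;
}
```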
#### Scenario: Two links in different placements
- **WHEN** two links point to the same destination but appear in different placements
- **THEN** their `target_id` values are different so their clicks can be measured independently

View File

@@ -0,0 +1,15 @@
## 1. Audit Blog Clickables
- [x] 1.1 Inventory blog clickables (`site/src/pages/blog/**`, `site/src/components/Blog*`) that should emit Umami events (post cards, category subnav, pages list links, detail chrome links)
- [x] 1.2 Confirm each clickable has the required Umami attributes and a deterministic unique `target_id` per taxonomy
## 2. Implement Umami Attributes
- [x] 2.1 Instrument blog secondary navigation links with `data-umami-event` and required event data (`target_id`, `placement`, `target_url`)
- [x] 2.2 Instrument blog post cards and any inline links in listing UIs with `data-umami-event` and required event data
- [x] 2.3 Instrument blog detail page chrome links (e.g., Back) and pages listing links with required Umami attributes
## 3. Verify
- [x] 3.1 Add/update tests to assert blog components/pages contain Umami `data-umami-event` attributes (and key properties like `target_id`, `placement`, `target_url`)
- [x] 3.2 Build the site and confirm `/blog` and a blog detail page render with instrumented clickables

View File

@@ -0,0 +1,2 @@
schema: spec-driven
created: 2026-02-10

View File

@@ -0,0 +1,2 @@
schema: spec-driven
created: 2026-02-10

View File

@@ -0,0 +1,62 @@
## Context
The site renders multiple card-like UI elements today:
- videos/podcast listings use `site/src/components/ContentCard.astro`
- blog listings use `site/src/components/BlogPostCard.astro`
These cards have different layouts and metadata placement. This change standardizes the card information architecture so all cards feel consistent.
The site is statically generated (Astro). Card layout consistency should be enforced primarily through shared components and shared CSS rather than copy/paste per page.
## Goals / Non-Goals
**Goals:**
- Define and implement a single, consistent card structure:
- media (image/placeholder) at top
- title
- trimmed summary/excerpt
- meta row: date (left) + views (right, if available)
- footer: source label (youtube/podcast/blog/etc.)
- Apply to all existing card surfaces:
- `/videos` listing cards
- `/podcast` listing cards
- `/blog` post cards (and category listings)
- Keep the layout resilient when fields are missing (no views, no image, no summary).
**Non-Goals:**
- Redesigning non-card list links (e.g., simple navigation links) into cards unless needed for consistency.
- Changing Umami tracking taxonomy (attributes stay intact).
- Large typographic or theme redesign beyond card structure/spacing.
## Decisions
- **Decision: Implement a shared Card component used by existing card components.**
- Rationale: Centralizes markup and ensures layout stays consistent across surfaces.
- Approach:
- Create a new component (e.g., `Card.astro`) with props for:
- `href`, `title`, `summary`, `imageUrl`, `dateLabel`, `viewsLabel`, `sourceLabel`
- optional tracking attributes pass-through (keep existing `data-umami-*` behavior)
- Update `ContentCard.astro` and `BlogPostCard.astro` to render the shared Card component.
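The prop surface described above might look like the following sketch; prop names mirror the bullets, while the `external`/`tracking` shapes are assumptions about how link targets and `data-umami-*` pass-through could be modeled:

```typescript
// Illustrative prop contract for the shared Card component.
interface CardProps {
  href: string;
  title: string;
  summary?: string;
  imageUrl?: string;
  dateLabel?: string;
  viewsLabel?: string;   // omitted entirely when the source has no views
  sourceLabel: string;   // e.g. "youtube", "podcast", "blog"
  external?: boolean;    // controls target/rel for outbound links
  tracking?: Record<string, string>; // pass-through data-umami-* attributes
}

// Link attributes derived from the props (internal vs outbound targets).
function linkAttrs(props: CardProps): Record<string, string> {
  return props.external
    ? { href: props.href, target: "_blank", rel: "noopener noreferrer" }
    : { href: props.href };
}
```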
- **Decision: Add an optional `summary` field to normalized items.**
- Rationale: Enables the standard card layout to show trimmed summaries for videos/podcast, similar to blog excerpts.
- Approach:
- Extend the normalized content schema/types with `summary?: string`.
- Populate it during ingestion where available (YouTube description snippet; podcast episode summary/description).
- **Decision: Views are optional and shown only when available.**
- Rationale: Not all sources provide views; the layout should be consistent without forcing synthetic values.
## Risks / Trade-offs
- [Risk] Ingestion sources may provide very long summaries.
- Mitigation: Standardize trimming logic in the card component (single truncation helper).
- [Risk] CSS regressions across multiple pages.
- Mitigation: Add tests that assert key card structure/classes exist; verify build outputs for `/videos`, `/podcast`, `/blog`.
- [Risk] Blog post cards and content cards have different link targets (internal vs outbound).
- Mitigation: Shared Card component should be able to render both internal links and external links (target/rel configurable).
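The single truncation helper mentioned under the first risk could be as simple as the sketch below; cutting at a word boundary and the default maximum length are design choices, not specified above:

```typescript
// Trim a summary to a maximum length, cutting at a word boundary
// and appending an ellipsis only when something was removed.
function trimSummary(text: string, max = 160): string {
  const clean = text.trim().replace(/\s+/g, " ");
  if (clean.length <= max) return clean;
  const cut = clean.slice(0, max + 1);
  const lastSpace = cut.lastIndexOf(" ");
  return (lastSpace > 0 ? cut.slice(0, lastSpace) : cut.slice(0, max)).trimEnd() + "…";
}
```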

View File

@@ -0,0 +1,29 @@
## Why
The site currently renders multiple card variants (videos/podcast cards, blog post cards, etc.) with inconsistent structure and metadata placement, which makes the UI feel uneven. A standardized card layout will create a consistent UX across the website.
## What Changes
- Standardize the UI structure for all content cards across the site:
- featured image displayed prominently on top (when available)
- title
- summary/excerpt, trimmed
- meta row with date (left) and views (right) when available (`space-between`)
- footer row showing the content source (YouTube/podcast/blog/etc.)
- Update existing card renderers/components to use the standardized structure and styling.
- Where a content source does not provide one of the fields (for example, views for blog posts), the layout MUST still render cleanly with the missing field omitted.
## Capabilities
### New Capabilities
- `card-layout-system`: Define the standard card information architecture (image/title/summary/meta/footer) and rules for optional fields so all surfaces render consistently.
### Modified Capabilities
- `social-content-aggregation`: Extend normalized content items to include an optional `summary`/`excerpt` field where available (e.g., YouTube description snippet, podcast episode summary) so non-blog cards can display a trimmed summary.
- `blog-section-surface`: Standardize blog listing cards to include the meta row (publish date and optional views) and footer source label, consistent with the global card layout system.
## Impact
- Affected code: shared card/link components (e.g., `site/src/components/ContentCard.astro`, `site/src/components/BlogPostCard.astro`) and pages that render listings (`/`, `/videos`, `/podcast`, `/blog`).
- Data model: normalized cached items may gain an optional summary field; ingestion code may need to populate it for YouTube/podcast.
- Styling: global CSS updates to ensure consistent spacing/typography and footer/meta layout.

View File

@@ -0,0 +1,33 @@
## MODIFIED Requirements
### Requirement: Blog index listing (posts)
The site MUST provide a blog index page at `/blog` that lists WordPress posts as cards containing:
- featured image (when available)
- title
- excerpt/summary
- publish date
The card MUST render a footer bar that includes:
- publish date on the left
- views on the right when available (if views are not provided by the dataset, the card MUST omit views without breaking layout)
- a content source label (e.g., `blog`)
The listing MUST be ordered by publish date descending (newest first).
Each post card MUST be instrumented with Umami Track Events data attributes and MUST include at minimum:
- `data-umami-event`
- `data-umami-event-target_id`
- `data-umami-event-placement`
- `data-umami-event-target_url`
#### Scenario: Blog index lists posts
- **WHEN** the cached WordPress dataset contains posts
- **THEN** `/blog` renders a list of post cards ordered by publish date descending
#### Scenario: Blog post card click is tracked
- **WHEN** a user clicks a blog post card on `/blog`
- **THEN** the click emits an Umami event with `target_id`, `placement`, and `target_url`
#### Scenario: Blog post card layout is standardized
- **WHEN** `/blog` renders a blog post card
- **THEN** the card shows featured image (when available), title, trimmed excerpt, and a footer bar containing date, optional views, and a source label

View File

@@ -0,0 +1,27 @@
## ADDED Requirements
### Requirement: Standard card information architecture
All content cards rendered by the site MUST use a standardized layout so cards across different surfaces look consistent.
The standard card layout MUST be:
- featured image displayed prominently at the top (when available)
- title
- summary/excerpt text, trimmed to a fixed maximum length
- footer bar showing:
- publish date on the left
- views when available (if omitted, the footer MUST still render cleanly)
- the content source label (e.g., `youtube`, `podcast`, `blog`)
If a field is not available (for example, views for some sources), the card MUST still render cleanly with that field omitted.
#### Scenario: Card renders with all fields
- **WHEN** a content item has an image, title, summary, publish date, views, and source
- **THEN** the card renders those fields in the standard card layout order
#### Scenario: Card renders without views
- **WHEN** a content item has no views data
- **THEN** the card renders the footer bar with date + source and omits views without breaking the layout
#### Scenario: Card renders without featured image
- **WHEN** a content item has no featured image
- **THEN** the card renders a placeholder media area and still renders the remaining fields

View File

@@ -0,0 +1,27 @@
## MODIFIED Requirements
### Requirement: Normalized content items
The system MUST normalize all ingested items (YouTube videos, Instagram posts, podcast episodes) into a single internal schema so the website can render them consistently.
The normalized item MUST include at minimum:
- `id` (stable within its source)
- `source` (`youtube`, `instagram`, or `podcast`)
- `url`
- `title`
- `publishedAt` (ISO-8601)
- `thumbnailUrl` (optional)
The system MUST support an optional summary field on normalized items when available from the source:
- `summary` (optional, short human-readable excerpt suitable for cards)
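A sketch of the normalized shape with the optional field (the type name and the ingestion helper are assumptions; the fields mirror the bullets above):

```typescript
type ContentSource = "youtube" | "instagram" | "podcast";

interface ContentItem {
  id: string;           // stable within its source
  source: ContentSource;
  url: string;
  title: string;
  publishedAt: string;  // ISO-8601
  thumbnailUrl?: string;
  summary?: string;     // optional short excerpt suitable for cards
}

// During ingestion, only attach a summary when the source provided one.
function withSummary(item: ContentItem, raw?: string): ContentItem {
  const summary = raw?.trim();
  return summary ? { ...item, summary } : item;
}
```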
#### Scenario: Normalizing a YouTube video
- **WHEN** the system ingests a YouTube video item
- **THEN** it produces a normalized item containing `id`, `source: youtube`, `url`, `title`, and `publishedAt`
#### Scenario: Normalizing a podcast episode
- **WHEN** the system ingests a podcast RSS episode
- **THEN** it produces a normalized item containing `id`, `source: podcast`, `url`, `title`, and `publishedAt`
#### Scenario: Summary available
- **WHEN** an ingested item provides summary/description content
- **THEN** the normalized item includes a `summary` suitable for rendering in cards

View File

@@ -0,0 +1,20 @@
## 1. Card Component + Styles
- [x] 1.1 Create a shared card component implementing the standard card layout (media, title, summary, meta row, footer)
- [x] 1.2 Add/adjust shared CSS so the card meta row uses `space-between` and the footer consistently shows the source label
## 2. Data Model Support
- [x] 2.1 Extend normalized `ContentItem` to support an optional `summary` field and ensure it is persisted in the content cache
- [x] 2.2 Populate `summary` for YouTube and podcast items during ingestion (safe trimming / fallback when missing)
## 3. Apply Across Site
- [x] 3.1 Update `ContentCard` surfaces (`/`, `/videos`, `/podcast`) to use the shared card layout and include date/views/source in the standard positions
- [x] 3.2 Update blog post cards (`/blog`, category listings) to use the shared card layout (including publish date and `blog` source footer)
- [x] 3.3 Ensure cards render cleanly when optional fields are missing (no image, no views, no summary)
## 4. Verify
- [x] 4.1 Add/update tests to assert standardized card structure/classes across `ContentCard` and blog post cards
- [x] 4.2 Build the site and verify `/videos`, `/podcast`, and `/blog` render cards matching the standard layout

View File

@@ -0,0 +1,2 @@
schema: spec-driven
created: 2026-02-10

View File

@@ -0,0 +1,60 @@
## Context
- Production server environment is intentionally minimal: Docker is available, but Node.js is not installed on the host.
- The site needs a repeatable way to get to the latest built content on that server.
## Goals / Non-Goals
**Goals:**
- Update the deployed site to the latest content using Docker-only operations on the server.
- Keep the server host clean (no Node.js installation required).
- Make the refresh procedure repeatable and verifiable.
**Non-Goals:**
- Building site artifacts directly on the server host outside containers.
- Introducing a new CMS/content authoring workflow.
- Solving content freshness triggers end-to-end (webhooks, scheduling) beyond what is needed to support a Docker-based refresh.
## Decisions
1. Build in CI, deploy as a Docker image
Why: keeps host clean and makes deploy deterministic.
Alternatives considered:
- Install Node.js on the host: rejected (violates clean server requirement).
- Build on the host inside a one-off container writing to a bind mount/volume: possible, but adds operational complexity and makes server resources part of the build pipeline.
2. Refresh by pulling a published image and restarting the service
Why: the server only needs Docker + registry access.
Alternatives considered:
- File-based sync (rsync/scp) of static assets: can work, but requires separate artifact distribution and is more prone to drift.
- Automated image updating (e.g., watchtower): may be useful later, but start with an explicit, documented operator command.
3. Version visibility via image metadata
Why: operators need to confirm what is running.
Approach:
- Publish images with an immutable identifier (digest) and a human-friendly tag.
- Expose build metadata through standard Docker inspection and/or a small endpoint/static file in the image.
## Risks / Trade-offs
- [Risk] Content can be stale if the CI build does not run when content changes
Mitigation: add a scheduled build and/or content-change trigger in CI (future enhancement if not already present).
- [Risk] Registry auth/secrets management on the server
Mitigation: use least-privilege registry credentials and Docker-native secret handling where available.
- [Risk] Short downtime during restart
Mitigation: use `docker compose up -d` to minimize downtime; consider health checks and rolling strategies if/when multiple replicas are used.
## Migration Plan
- Add or update the Docker image build to produce a deployable image containing the built site output.
- Update server deployment configuration (compose/service) to run the published image.
- Document the operator refresh command(s): pull latest image, restart service, verify deployed version.
- Rollback strategy: re-deploy the previously known-good image tag/digest.
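Assuming a single-host compose deployment (one of the open questions below), the runbook could center on a fragment like this; the service name, registry path, and port mapping are placeholders:

```yaml
# compose.yaml (illustrative)
services:
  site:
    image: ghcr.io/example/site:latest   # pin a tag/digest for rollback
    restart: unless-stopped
    ports:
      - "8080:80"

# Refresh:  docker compose pull site && docker compose up -d site
# Verify:   docker compose images site   (shows the running tag/digest)
# Rollback: set the image back to the last known-good digest, then re-run `up -d`
```

Because `docker compose pull` fails without touching the running container, a failed pull leaves the previously deployed image serving, which satisfies the failure-mode requirement.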
## Open Questions
- What is the authoritative "latest content" source (e.g., WordPress, filesystem, git) and what is the trigger to rebuild/publish a new image?
- Where should operator commands live (Makefile, `ops/` scripts, README section)?
- What is the current deployment target (single host compose, swarm, k8s) and should this change be scoped to one?

View File

@@ -0,0 +1,25 @@
## Why
The production server only provides Docker and does not have Node.js installed. We need a way to refresh the site to the latest content on that server without installing Node.js on the host.
## What Changes
- Add a Docker-first mechanism to update the deployed site to the latest content without requiring host-installed build tooling (no Node.js on the server).
- Standardize the deploy/update flow so the server updates are performed via Docker (e.g., pulling a new artifact/image and restarting, or running a containerized refresh job).
- Document and automate the update command(s) so content refresh is repeatable and low-risk.
## Capabilities
### New Capabilities
- `docker-content-refresh`: The deployed site can be updated to the latest content on a Docker-only server (no host Node.js), using a containerized workflow.
### Modified Capabilities
None.
## Impact
- Deployment/runtime: Docker compose/service definitions, update procedure, and operational docs.
- CI/CD: build/publish pipeline may need to produce and publish deployable artifacts suitable for Docker-only servers.
- Secrets/credentials: any content source credentials needed for refresh/build must be handled via Docker-friendly secret injection.
- Observability/ops: add or adjust logging/health checks around the refresh/update step to make failures visible.

View File

@@ -0,0 +1,26 @@
## ADDED Requirements
### Requirement: Host update does not require Node.js
The system MUST provide an operator workflow to update the deployed site to the latest content without installing Node.js on the server host. Any build or content-fetch steps MUST run in containers and/or CI, not via host-installed Node.js.
#### Scenario: Operator updates without host Node.js
- **WHEN** the server host has Docker available but does not have Node.js installed
- **THEN** the operator can complete the update procedure using Docker commands only
### Requirement: Image-based content refresh is supported
The system MUST support refreshing the deployed site to the latest content by pulling a newly built deployable artifact (for example, a Docker image) and restarting the running service.
#### Scenario: Successful refresh to latest image
- **WHEN** the operator runs the documented refresh command
- **THEN** the server pulls the latest published image and restarts the service using that image
#### Scenario: Refresh failure does not break running site
- **WHEN** the operator runs the documented refresh command and the pull fails
- **THEN** the site remains running on the previously deployed image
### Requirement: Refresh is repeatable and auditable
The system MUST document the refresh procedure and provide a way to verify which version is deployed (for example, image tag/digest or build metadata).
#### Scenario: Operator verifies deployed version
- **WHEN** the operator runs the documented verification command
- **THEN** the system reports the currently deployed version identifier

View File

@@ -0,0 +1,25 @@
## 1. Discovery And Current State
- [x] 1.1 Identify current deploy target and mechanism (compose/swarm/k8s, image vs files) and document constraints in `README` or `ops/` docs
- [x] 1.2 Identify the content source(s) that define "latest content" (e.g., WordPress/API) and how builds currently fetch content
- [x] 1.3 Confirm current build output (static assets) and runtime server (e.g., nginx) requirements
## 2. Build And Publish A Deployable Artifact
- [x] 2.1 Ensure the repo can produce a deterministic production build inside CI (no host dependencies)
- [x] 2.2 Create or update a Dockerfile to build the site and package the built output into a runtime image
- [x] 2.3 Add build metadata to the image (tagging convention and/or embedded version file)
- [x] 2.4 Configure CI to build and publish the image to a registry accessible by the server
## 3. Server-Side Docker-Only Refresh Workflow
- [x] 3.1 Add or update the server Docker Compose/service definition to run the published image
- [x] 3.2 Add documented operator commands to refresh to the latest image (pull + restart)
- [x] 3.3 Add a verification command/procedure to show the currently deployed version (tag/digest/build metadata)
- [x] 3.4 Define rollback procedure to re-deploy a previous known-good tag/digest
## 4. Validation
- [x] 4.1 Validate a refresh on a test/staging server: pull latest image, restart, confirm content changes are visible
- [x] 4.2 Validate failure mode: simulate pull failure and confirm the existing site remains serving
- [x] 4.3 Update docs with a minimal "runbook" for operators (refresh, verify, rollback)

View File

@@ -0,0 +1,2 @@
schema: spec-driven
created: 2026-02-10

View File

@@ -0,0 +1,2 @@
schema: spec-driven
created: 2026-02-10

View File

@@ -0,0 +1,88 @@
## Context
- The site is an Astro-based static site with shared global styling in `site/public/styles/global.css` and shared layout/navigation in `site/src/layouts/*`.
- Current UX gaps:
- Responsive behavior is inconsistent at smaller breakpoints (navigation does not collapse into a mobile-friendly menu).
- The background gradient shows abrupt cuts/banding on larger resolutions.
- Typography relies on system fonts; a smoother, display-friendly font is desired.
- Accessibility baseline is not formally enforced; target is WCAG 2.2 AA minimum standard (not necessarily 100% compliance).
## Goals / Non-Goals
**Goals:**
- Establish an explicit baseline of WCAG 2.2 AA-aligned behavior for the site shell and common interactive elements.
- Implement responsive layouts across common breakpoints; ensure primary navigation collapses into a hamburger menu with mild animation.
- Ensure the mobile menu is fully keyboard accessible and screen-reader friendly (correct semantics, labeling, focus management).
- Improve background rendering so gradients do not cut abruptly on large displays.
- Introduce a display-friendly font and apply it consistently across pages and components.
- Add lightweight verification (tests and/or build checks) that ensures the baseline remains intact.
**Non-Goals:**
- Full accessibility audit and remediation of all possible WCAG 2.2 AA items across all content (e.g., all third-party embeds, all user-provided HTML).
- Building a complete design system or replacing all visual styling.
- Implementing complex client-side routing or heavy JS frameworks.
## Decisions
1. Use a small client-side navigation controller for the hamburger menu
Why: Astro renders static HTML; a small, isolated script can provide toggling + focus management without adding framework complexity.
Alternatives considered:
- CSS-only checkbox hack: rejected (harder to manage focus/ARIA correctly, less robust).
- A full component framework (React/Vue): rejected (unnecessary weight).
2. Prefer semantic HTML + minimal ARIA
Why: Better interoperability across assistive technologies and less risk of incorrect ARIA.
Approach:
- Use a `<button>` to toggle the menu.
- Control a `<nav>` region (or a `<div>` wrapper) via `aria-controls` and `aria-expanded`.
- Ensure menu items remain standard links with predictable tab order.
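The toggle wiring can stay framework-free: the state-to-ARIA mapping is a pure function, and a few lines of DOM glue apply it to the button on each click (element ids, labels, and the function name here are illustrative):

```typescript
// Pure mapping from menu state to the ARIA attributes/labels the
// decisions above call for; the DOM glue applies this on each toggle.
function menuAria(open: boolean): { expanded: "true" | "false"; label: string } {
  return {
    expanded: open ? "true" : "false",
    label: open ? "Close menu" : "Open menu",
  };
}

// Sketch of the browser-side glue (not run at build time):
// const btn = document.querySelector<HTMLButtonElement>("#nav-toggle")!;
// let open = false;
// btn.addEventListener("click", () => {
//   open = !open;
//   const { expanded, label } = menuAria(open);
//   btn.setAttribute("aria-expanded", expanded);
//   btn.setAttribute("aria-label", label);
// });
```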
3. Add explicit focus styles and reduced-motion support globally
Why: Focus visibility and motion preferences are core accessibility/usability requirements; implementing globally reduces drift.
Approach:
- Provide a consistent `:focus-visible` outline that meets contrast requirements.
- Wrap animations/transitions in `@media (prefers-reduced-motion: reduce)` fallbacks.
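Concretely, the global rules could look like the following fragment (the outline color token is a placeholder; the near-zero durations are a common way to neutralize motion without breaking transition-end events):

```css
/* Consistent, visible keyboard focus across links and buttons. */
:focus-visible {
  outline: 2px solid var(--focus-ring, #9ecbff);
  outline-offset: 2px;
}

/* Reduce or disable motion when the user asks for it. */
@media (prefers-reduced-motion: reduce) {
  *,
  *::before,
  *::after {
    animation-duration: 0.01ms !important;
    transition-duration: 0.01ms !important;
  }
}
```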
4. Fix gradient banding/cuts via a single, oversized background layer
Why: Multiple fixed-size radial gradients can show cutoffs on ultrawide/large viewports.
Approach:
- Render the background using a `body::before` fixed-position layer with large gradients and `inset: -40vmax` (or similar) to eliminate edges.
- Keep the existing aesthetic but adjust sizes/stops so it scales smoothly.
Alternatives considered:
- Using an image background: rejected (asset management, potential compression artifacts).
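A sketch of the oversized background layer (gradient colors, sizes, and stops are placeholders; the point is the single fixed layer extended well past the viewport so gradient edges can never become visible):

```css
body::before {
  content: "";
  position: fixed;
  inset: -40vmax;        /* over-extend beyond every viewport edge */
  z-index: -1;
  background:
    radial-gradient(120vmax 120vmax at 20% 10%, #1b2a4a 0%, transparent 70%),
    radial-gradient(140vmax 140vmax at 80% 90%, #3a1d4f 0%, transparent 70%),
    #0b0f1a;
  pointer-events: none;
}
```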
5. Use a webfont with good UI readability and a limited weight range
Why: Improve perceived polish while keeping performance predictable.
Approach:
- Choose a modern UI font family (e.g., `Inter`, `DM Sans`, `Manrope`, or `Space Grotesk`) with 2-3 weights.
- Prefer self-hosted font assets or a single external source with `font-display: swap`.
Decision will be finalized during implementation based on desired look and licensing.
## Risks / Trade-offs
- [Risk] Mobile menu introduces accessibility regressions (trap focus, broken escape handling)
-> Mitigation: implement standard patterns (toggle button, ESC closes, return focus, body scroll lock optional) and add tests for key attributes.
- [Risk] Global CSS changes affect existing layouts
-> Mitigation: keep changes scoped to site shell classes and add visual spot-checks for key pages (`/`, `/videos`, `/podcast`, `/blog`, `/about`).
- [Risk] Webfont increases page weight
-> Mitigation: limit to necessary weights, use `woff2`, preload critical fonts, and keep fallbacks.
## Migration Plan
1. Implement navigation collapse + hamburger toggle script and styles.
2. Add global focus-visible styling and reduced-motion fallbacks.
3. Fix background gradient rendering on large displays.
4. Add/replace typography stack and adjust headings/line-height as needed.
5. Add verification (tests / lint checks) and confirm responsive behavior on key pages.
Rollback:
- Revert navigation script + CSS changes to restore previous behavior.
## Open Questions
- Which specific webfont should be used (and will it be self-hosted or loaded via a provider)?
- Should the mobile menu lock body scroll while open (common pattern, but optional)?
- Should the menu close on route navigation (likely yes), and should it close on outside click (likely yes)?

View File

@@ -0,0 +1,29 @@
## Why
The site needs a more robust, state-of-the-art UX baseline: a minimum standard of WCAG 2.2 AA accessibility and a consistently responsive UI across devices and large displays.
## What Changes
- Establish a minimum accessibility baseline targeting WCAG 2.2 AA (without aiming for perfect/100% compliance).
- Make the UI fully responsive across common breakpoints (mobile/tablet/desktop/large desktop).
- Update the primary navigation to collapse into a hamburger menu on smaller viewports with mild animation.
- Fix the background gradient so it does not show abrupt cuts/banding at larger resolutions.
- Introduce a smoother, display-friendly font stack (and apply it consistently).
## Capabilities
### New Capabilities
- `wcag-responsive-ui`: Accessibility + responsive UI shell standards for layout, navigation, typography, and global styling (WCAG 2.2 AA baseline).
### Modified Capabilities
None expected; this change introduces a new UI-shell capability that affects multiple pages/components.
## Impact
- Frontend UI: `site/src/layouts/*`, header/navigation components, shared UI components, and global CSS (`site/public/styles/global.css`).
- Interaction patterns: keyboard navigation and focus styles, menu toggle behavior, and motion controls (respecting reduced-motion preferences).
- Visual design: typography and background rendering across large screens.
- Verification: add/update checks/tests for responsive nav behavior and basic accessibility expectations (e.g., menu toggle labeling, focus visibility).

View File

@@ -0,0 +1,67 @@
## ADDED Requirements
### Requirement: Responsive layout baseline
The site MUST be responsive across common breakpoints (mobile, tablet, desktop, and large desktop) and MUST NOT exhibit broken layouts (overlapping content, horizontal scrolling, clipped navigation).
#### Scenario: Mobile viewport does not horizontally scroll
- **WHEN** the site is viewed on a small mobile viewport
- **THEN** content reflows to a single-column layout and the page does not require horizontal scrolling to read primary content
#### Scenario: Large viewport uses available space without visual artifacts
- **WHEN** the site is viewed on a large desktop viewport (ultrawide / high resolution)
- **THEN** the background and layout scale without visible abrupt gradient cutoffs or banding artifacts
### Requirement: Collapsible primary navigation (hamburger menu)
The primary navigation MUST collapse into a hamburger menu on smaller viewports.
The menu toggle MUST be a `<button>` with:
- `aria-controls` referencing the menu container
- `aria-expanded` reflecting open/closed state
- an accessible label (e.g., `aria-label="Open menu"`/`"Close menu"` or equivalent)
When the menu is open, the menu items MUST be visible and keyboard navigable.
#### Scenario: Menu collapses on small viewport
- **WHEN** the viewport is below the mobile navigation breakpoint
- **THEN** the primary navigation renders in a collapsed state and can be opened via a hamburger toggle
#### Scenario: Menu toggle exposes accessible state
- **WHEN** the user toggles the menu open and closed
- **THEN** `aria-expanded` updates correctly and the toggle remains reachable via keyboard
### Requirement: Keyboard and focus behavior baseline (WCAG 2.2 AA aligned)
The site MUST support keyboard navigation for all primary interactive elements.
The site MUST provide visible focus indication for keyboard users using `:focus-visible` styles.
For the mobile menu:
- pressing `Escape` MUST close the menu (when open)
- closing the menu MUST return focus to the menu toggle button
#### Scenario: Focus is visible on links and buttons
- **WHEN** a keyboard user tabs through the page
- **THEN** the focused element shows a visible focus indicator
#### Scenario: Escape closes the menu
- **WHEN** the menu is open and the user presses `Escape`
- **THEN** the menu closes and focus returns to the menu toggle
### Requirement: Reduced motion support
The site MUST respect user motion preferences:
- if `prefers-reduced-motion: reduce` is set, animations/transitions for the menu and other UI elements MUST be reduced or disabled.
#### Scenario: Reduced motion disables menu animation
- **WHEN** the user's system preference is `prefers-reduced-motion: reduce`
- **THEN** opening/closing the menu does not use noticeable animation
### Requirement: Typography baseline (display-friendly font)
The site MUST use a display-friendly font stack consistently across pages, including headings and navigation.
The site MUST ensure text remains readable:
- reasonable line height
- sufficient contrast against the background for primary text and focus indicators
#### Scenario: Font is applied consistently
- **WHEN** a user navigates between pages
- **THEN** typography (font family and basic scale) remains consistent

View File

@@ -0,0 +1,27 @@
## 1. Navigation + Responsive Shell
- [x] 1.1 Identify the current header/nav implementation and decide the mobile breakpoint for collapsing navigation
- [x] 1.2 Implement hamburger toggle UI (button + icon) with correct ARIA (`aria-controls`, `aria-expanded`, accessible label)
- [x] 1.3 Implement the mobile menu panel styles + mild open/close animation (and close-on-route navigation)
- [x] 1.4 Add keyboard behavior: `Escape` closes menu and focus returns to toggle; ensure tab order remains sane
- [x] 1.5 Add reduced-motion fallback: disable/reduce menu animations when `prefers-reduced-motion: reduce`
- [x] 1.6 Ensure desktop navigation links remain clickable/accessible (no `inert`/`aria-hidden` desktop regression)
## 2. WCAG 2.2 AA Baseline
- [x] 2.1 Add/standardize global `:focus-visible` styles for links/buttons (high-contrast, consistent, not clipped)
- [x] 2.2 Ensure interactive elements meet minimum hit target expectations where feasible (spacing/padding for nav + key buttons)
- [x] 2.3 Add skip-to-content link and verify it is visible on focus and works across pages/layouts
- [x] 2.4 Audit and fix obvious contrast issues for primary text and focus outlines against the background
## 3. Background + Typography Polish
- [x] 3.1 Fix large-resolution background gradient cutoffs (move to a scaled, oversized background layer/pseudo-element)
- [x] 3.2 Introduce a display-friendly font (webfont) and apply consistently across the site; ensure sensible type scale/line-height
- [x] 3.3 Verify responsive behavior on key pages (`/`, `/videos`, `/podcast`, `/blog`, `/about`) at common breakpoints
## 4. Verification
- [x] 4.1 Add/update tests to ensure hamburger toggle ARIA attributes exist and update correctly
- [x] 4.2 Add/update tests or checks for focus-visible styling presence and reduced-motion rules
- [x] 4.3 Build the site and perform a keyboard-only smoke test (nav, cards, blog category nav, menu open/close)


@@ -0,0 +1,2 @@
schema: spec-driven
created: 2026-02-10


@@ -0,0 +1,52 @@
## Context
This change introduces a Service Worker to improve perceived load time and reduce network usage on repeat visits by caching critical assets in the browser.
The site is a static Astro build. That means:
- The Service Worker should live at the site root (`/sw.js`) so it can control all routes.
- Navigations (HTML documents) should not be cached in a way that causes indefinite staleness after new deploys.
## Goals / Non-Goals
**Goals:**
- Improve repeat-visit performance by pre-caching the critical site shell assets.
- Add runtime caching for media assets (images) with bounded storage usage.
- Ensure safe update behavior: cache versioning and cleanup on activate.
- Keep local development predictable by not registering the Service Worker in dev by default.
**Non-Goals:**
- Full offline-first experience for all routes/content.
- Background sync, push notifications, or complex offline fallbacks.
- Server-side caching (handled by separate changes, if desired).
## Decisions
1. **Implement a lightweight, custom Service Worker (no Workbox)**
Rationale: The project already outputs static assets and the needed caching strategies are straightforward. A small custom `/sw.js` avoids adding a build-time dependency and keeps behavior explicit.
Alternatives considered:
- Workbox: powerful, but adds dependency surface area and build configuration overhead.
2. **Cache strategy by request type**
Rationale: Different resources have different freshness requirements.
- Navigations (HTML documents): **Network-first**, fallback to cache on failure. This minimizes stale HTML risks while still helping resiliency.
- Static shell assets (CSS/JS/fonts/icons): **Pre-cache** on install and serve from cache for speed.
- Images/media: **Cache-first** with a size bound and eviction to avoid unbounded storage.
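
The per-request-type routing above can be sketched as a small pure helper inside `/sw.js`. The function and strategy names here are illustrative assumptions; only the three strategies themselves come from this design.

```javascript
// Hypothetical helper for /sw.js: classify a request into one of the
// three caching strategies described above. Kept as a pure function of
// the request's mode/destination so the routing logic is unit-testable
// outside the browser.
function strategyFor({ mode, destination }) {
  // Document navigations must stay fresh: network-first.
  if (mode === "navigate") return "network-first";
  // Images and other media are safe to serve cache-first (with eviction).
  if (destination === "image" || destination === "video" || destination === "audio") {
    return "cache-first";
  }
  // Everything else (CSS/JS/fonts/icons) belongs to the pre-cached shell.
  return "pre-cache";
}
```

In the real fetch handler, `strategyFor(event.request)` would select which cache lookup/fetch sequence to run.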
3. **Versioned caches + activation cleanup**
Rationale: Static sites frequently redeploy; versioning ensures updates can be picked up and old assets are not served after deploy. On activate, the SW deletes prior version caches.
Implementation approach:
- Use cache names like `shell-v<version>` and `media-v<version>`.
- Update the version string on build (initially a constant; later can be automated).
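
A minimal sketch of the versioned-cache cleanup, assuming the `shell-v<version>`/`media-v<version>` naming above; the constant names are illustrative:

```javascript
// Assumed names following the design: bump CACHE_VERSION on deploy.
const CACHE_VERSION = "v1";
const CURRENT_CACHES = [`shell-${CACHE_VERSION}`, `media-${CACHE_VERSION}`];

// Pure helper: given all cache names present in the browser,
// return the stale ones belonging to earlier versions.
function staleCaches(allNames) {
  return allNames.filter((name) => !CURRENT_CACHES.includes(name));
}

// In the worker this would run on activate, roughly:
// self.addEventListener("activate", (event) =>
//   event.waitUntil(caches.keys().then((names) =>
//     Promise.all(staleCaches(names).map((n) => caches.delete(n))))));
```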
4. **Disable SW registration in development by default**
Rationale: Service worker caching can confuse local iteration and cause stale assets during development.
Implementation approach:
- Register SW only when `import.meta.env.PROD` is true (Astro build-time flag) or an explicit runtime guard is met.
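
The guard could be factored out so the decision is testable; the helper name is an assumption, while `import.meta.env.PROD` is Astro's documented build-time flag:

```javascript
// Hypothetical guard mirroring the decision above: register only in
// production builds on browsers that support service workers.
function shouldRegisterServiceWorker({ prod, supportsServiceWorker }) {
  return Boolean(prod && supportsServiceWorker);
}

// In the base layout, roughly:
// if (shouldRegisterServiceWorker({
//   prod: import.meta.env.PROD,
//   supportsServiceWorker: "serviceWorker" in navigator,
// })) {
//   navigator.serviceWorker.register("/sw.js", { scope: "/" });
// }
```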
## Risks / Trade-offs
- **[Stale or broken assets after deploy]** → Use versioned caches and delete old caches during activation. Prefer network-first for navigations.
- **[Over-caching HTML causes outdated content]** → Do not use cache-first for navigation; do not pre-cache HTML pages.
- **[Storage growth due to images]** → Enforce a max-entry limit with eviction for media cache.
- **[Browser compatibility gaps]** → Service worker is progressive enhancement; site must still function without it.
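
The bounded-eviction mitigation can be sketched as a pure helper; the limit and names are assumptions, not part of the design:

```javascript
// Illustrative bound for the media cache; the real value is a tuning choice.
const MAX_MEDIA_ENTRIES = 50;

// Pure helper: given cache keys in insertion (oldest-first) order,
// return the keys that must be evicted to respect the bound.
function keysToEvict(keys, max = MAX_MEDIA_ENTRIES) {
  const excess = keys.length - max;
  return excess > 0 ? keys.slice(0, excess) : [];
}
```

The worker would call this after each media cache write, deleting the returned keys from the media cache.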


@@ -0,0 +1,27 @@
## Why
Improve page load performance (especially repeat visits) by caching key assets closer to the user and reducing unnecessary network requests.
## What Changes
- Add a Service Worker to the site so the browser can cache and serve core assets efficiently.
- Pre-cache the critical shell (CSS, JS, fonts, icons) and use a runtime caching strategy for images and other large assets.
- Ensure safe update behavior on deploy (new service worker activates and old caches are cleaned up).
- Keep development experience predictable (service worker disabled or bypassed in dev by default).
## Capabilities
### New Capabilities
- `service-worker-performance`: Provide a service worker-based caching strategy that improves perceived load time and reduces network usage on repeat visits, while ensuring safe updates on new deploys.
### Modified Capabilities
- (none)
## Impact
- Adds new client-side assets for the service worker (e.g., `sw.js`) and registration logic in the site layout.
- Changes browser caching behavior; must avoid serving stale HTML indefinitely and ensure caches are versioned/invalidated on deploy.
- Service workers require a secure context (HTTPS) in production; local dev behavior should be explicitly controlled to avoid confusing caching during iteration.


@@ -0,0 +1,45 @@
## ADDED Requirements
### Requirement: Service Worker registration
The site SHALL register a Service Worker on supported browsers when running in production (HTTPS), scoped to the site root so it can control all site pages.
#### Scenario: Production registration
- **WHEN** a user loads any page in a production environment
- **THEN** the site registers a service worker at `/sw.js` with scope `/`
#### Scenario: Development does not register
- **WHEN** a user loads any page in a local development environment
- **THEN** the site does not register a service worker
### Requirement: Pre-cache critical site shell assets
The Service Worker SHALL pre-cache a set of critical static assets required to render the site shell quickly on repeat visits.
#### Scenario: Pre-cache on install
- **WHEN** the service worker is installed
- **THEN** it caches the configured site shell assets in a versioned cache
### Requirement: Runtime caching for media assets
The Service Worker SHALL use runtime caching for media assets (for example images) to reduce repeat network fetches, while ensuring content can refresh.
#### Scenario: Cache-first for images
- **WHEN** a user requests an image resource
- **THEN** the service worker serves the cached image when available, otherwise fetches from the network and stores the response in the media cache
#### Scenario: Enforce cache size bounds
- **WHEN** the number of cached media items exceeds the configured maximum
- **THEN** the service worker evicts older entries to stay within the bound
### Requirement: Navigation requests avoid indefinite staleness
The Service Worker MUST NOT serve stale HTML indefinitely for navigation requests.
#### Scenario: Network-first navigation
- **WHEN** a user navigates to a page route (a document navigation request)
- **THEN** the service worker attempts to fetch from the network first and falls back to a cached response if the network is unavailable
### Requirement: Safe updates and cache cleanup
The Service Worker SHALL use versioned caches and remove old caches during activation to ensure updated assets are used after a new deploy.
#### Scenario: Activate new version and clean old caches
- **WHEN** a new service worker version activates
- **THEN** it deletes caches from older versions and begins using the current versioned caches


@@ -0,0 +1,22 @@
## 1. Setup
- [x] 1.1 Add `sw.js` to site root output (place in `site/public/sw.js`)
- [x] 1.2 Add service worker registration to the base layout (register only in production)
## 2. Pre-cache Site Shell
- [x] 2.1 Implement versioned cache names and an explicit cache version constant
- [x] 2.2 Implement `install` handler to pre-cache critical shell assets
- [x] 2.3 Implement `activate` handler to delete old version caches
## 3. Runtime Caching
- [x] 3.1 Implement network-first strategy for navigation/document requests with cache fallback
- [x] 3.2 Implement cache-first strategy for images/media with network fallback
- [x] 3.3 Add a bounded eviction policy for media cache size
## 4. Verification
- [ ] 4.1 Verify service worker registers in production build and does not register in dev
- [ ] 4.2 Verify repeat navigation and asset loads hit cache (Chrome DevTools Application tab)
- [ ] 4.3 Verify a new deploy triggers cache version update and old caches are removed


@@ -23,6 +23,14 @@ Each emitted event MUST include enough properties to segment reports by platform
All tracked clickable items MUST emit events with a unique, consistent set of data elements as defined by the site tracking taxonomy, including at minimum `target_id` and `placement`.
The site MUST instrument tracked clickables using Umami's supported Track Events data-attribute method:
- `data-umami-event="<event-name>"`
- optional event data using `data-umami-event-*`
For content-related links (clickables representing a specific piece of content), the site MUST also provide the following Umami event data attributes:
- `data-umami-event-title`
- `data-umami-event-type`
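
One way to keep the attribute contract consistent across components is a small builder. The `data-umami-event` / `data-umami-event-*` convention is Umami's documented Track Events mechanism; the helper name and its input fields are this sketch's assumptions:

```javascript
// Hypothetical builder producing the Umami data attributes required above
// for a tracked content link. title/type are optional so non-content
// clickables can reuse the same helper.
function umamiContentAttrs({ event, targetId, placement, targetUrl, title, type }) {
  const attrs = {
    "data-umami-event": event,
    "data-umami-event-target_id": targetId,
    "data-umami-event-placement": placement,
    "data-umami-event-target_url": targetUrl,
  };
  if (title) attrs["data-umami-event-title"] = title;
  if (type) attrs["data-umami-event-type"] = type;
  return attrs;
}
```

A component would spread the returned object onto the link element's attributes.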
#### Scenario: Emit outbound click event
- **WHEN** a user clicks a non-CTA outbound link from the homepage
- **THEN** the system emits an `outbound_click` event with a property identifying the destination domain
@@ -31,6 +39,10 @@ All tracked clickable items MUST emit events with a unique, consistent set of da
- **WHEN** a user clicks an instrumented navigation link
- **THEN** the system emits a click interaction event with `target_id` and `placement`
#### Scenario: Content click includes title and type
- **WHEN** a user clicks an instrumented content link (video, podcast episode, blog post/page)
- **THEN** the emitted Umami event includes `title` and `type` properties via `data-umami-event-*` attributes
#### Scenario: Uninstrumented clicks do not break the page
- **WHEN** a user clicks an element with no tracking metadata
- **THEN** the system does not throw and navigation/interaction proceeds normally


@@ -0,0 +1,95 @@
## Purpose
Expose a blog section on the site backed by cached WordPress content, including listing, detail pages, and category browsing.
## ADDED Requirements
### Requirement: Primary navigation entry
The site MUST add a header navigation link to the blog index at `/blog` labeled "Blog".
#### Scenario: Blog link in header
- **WHEN** a user views any page
- **THEN** the header navigation includes a "Blog" link that navigates to `/blog`
### Requirement: Blog index listing (posts)
The site MUST provide a blog index page at `/blog` that lists WordPress posts as cards containing:
- featured image (when available)
- title
- excerpt/summary
- publish date
The listing MUST be ordered by publish date descending (newest first).
The card MUST render a footer row that includes:
- publish date on the left
- views on the right when available (if views are not provided by the dataset, the card MUST omit views without breaking layout)
- a content source label (e.g., `blog`)
Each post card MUST be instrumented with Umami Track Events data attributes and MUST include at minimum:
- `data-umami-event`
- `data-umami-event-target_id`
- `data-umami-event-placement`
- `data-umami-event-target_url`
#### Scenario: Blog index lists posts
- **WHEN** the cached WordPress dataset contains posts
- **THEN** `/blog` renders a list of post cards ordered by publish date descending
#### Scenario: Blog post card click is tracked
- **WHEN** a user clicks a blog post card on `/blog`
- **THEN** the click emits an Umami event with `target_id`, `placement`, and `target_url`
#### Scenario: Blog post card layout is standardized
- **WHEN** `/blog` renders a blog post card
- **THEN** the card shows featured image (when available), title, trimmed excerpt, and a footer bar containing date, optional views, and a source label
### Requirement: Blog post detail
The site MUST provide a blog post detail page for each WordPress post that renders:
- title
- publish date
- featured image (when available)
- full post content
#### Scenario: Post detail renders
- **WHEN** a user navigates to a blog post detail page
- **THEN** the page renders the full post content from the cached WordPress dataset
### Requirement: WordPress pages support
The blog section MUST support WordPress pages by rendering page detail routes that show:
- title
- featured image (when available)
- full page content
#### Scenario: Page detail renders
- **WHEN** a user navigates to a WordPress page detail route
- **THEN** the page renders the full page content from the cached WordPress dataset
### Requirement: Category-based secondary navigation
The blog section MUST render a secondary navigation under the header derived from the cached WordPress categories.
Selecting a category MUST navigate to a category listing page showing only posts in that category.
Each secondary navigation link MUST be instrumented with Umami Track Events data attributes and MUST include at minimum:
- `data-umami-event`
- `data-umami-event-target_id`
- `data-umami-event-placement`
- `data-umami-event-target_url`
#### Scenario: Category nav present
- **WHEN** the cached WordPress dataset contains categories
- **THEN** the blog section shows a secondary navigation with those categories
#### Scenario: Category listing filters posts
- **WHEN** a user navigates to a category listing page
- **THEN** only posts assigned to that category are listed
#### Scenario: Category nav click is tracked
- **WHEN** a user clicks a category link in the blog secondary navigation
- **THEN** the click emits an Umami event with `target_id`, `placement`, and `target_url`
### Requirement: Graceful empty states
If there are no WordPress posts available, the blog index MUST render a non-broken empty state and MUST still render header/navigation.
#### Scenario: No posts available
- **WHEN** the cached WordPress dataset contains no posts
- **THEN** `/blog` renders a helpful empty state


@@ -0,0 +1,42 @@
## Purpose
Provide a shared caching layer (Redis-backed) for ingestion and content processing flows, with TTL-based invalidation and manual cache clearing.
## ADDED Requirements
### Requirement: Redis-backed cache service
The system MUST provide a Redis-backed cache service for use by ingestion and content processing flows.
The cache service MUST be runnable in local development via Docker Compose.
#### Scenario: Cache service available in Docker
- **WHEN** the Docker Compose stack is started
- **THEN** a Redis service is available to other services/scripts on the internal network
### Requirement: TTL-based invalidation
Cached entries MUST support TTL-based invalidation.
The system MUST define a default TTL and MUST allow overriding the TTL via environment/config.
#### Scenario: Default TTL applies
- **WHEN** a cached entry is written without an explicit TTL override
- **THEN** it expires after the configured default TTL
#### Scenario: TTL override applies
- **WHEN** a TTL override is configured via environment/config
- **THEN** new cached entries use that TTL for expiration
### Requirement: Cache key namespace
Cache keys MUST be namespaced by source and parameters so that different data requests do not collide.
#### Scenario: Two different sources do not collide
- **WHEN** the system caches a YouTube fetch and a WordPress fetch
- **THEN** they use different key namespaces and do not overwrite each other
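
A deterministic key builder is one way to satisfy this; the `<source>:<name>:<sorted params>` shape below is an assumption, not a mandated format:

```javascript
// Sketch: deterministic, namespaced cache keys. Params are sorted by
// name so key order in the params object does not change the result.
function cacheKey(source, name, params = {}) {
  const suffix = Object.keys(params)
    .sort()
    .map((k) => `${k}=${params[k]}`)
    .join("&");
  return suffix ? `${source}:${name}:${suffix}` : `${source}:${name}`;
}
```

With this shape, a YouTube fetch and a WordPress fetch can never produce the same key, because the `source` segment differs.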
### Requirement: Manual cache clear
The system MUST provide a script/command to manually clear the cache.
#### Scenario: Manual clear executed
- **WHEN** a developer runs the cache clear command
- **THEN** the cache is cleared and subsequent ingestion runs produce cache misses


@@ -0,0 +1,32 @@
## Purpose
Define a standardized card layout so content cards across surfaces look consistent.
## Requirements
### Requirement: Standard card information architecture
All content cards rendered by the site MUST use a standardized layout so cards across different surfaces look consistent.
The standard card layout MUST be:
- featured image displayed prominently at the top (when available)
- title
- summary/excerpt text, trimmed to a fixed maximum length
- footer row showing:
- publish date on the left
- views when available (if omitted, the footer MUST still render cleanly)
- the content source label (e.g., `youtube`, `podcast`, `blog`)
If a field is not available (for example, views for some sources), the card MUST still render cleanly with that field omitted.
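
The "trimmed to a fixed maximum length" rule might be implemented as below; the 160-character limit and ellipsis style are assumptions, not part of this spec:

```javascript
// Illustrative excerpt trimming for the card summary. Collapses
// whitespace, then cuts at the last word boundary before the limit.
function trimExcerpt(text, max = 160) {
  if (!text) return "";
  const plain = text.replace(/\s+/g, " ").trim();
  if (plain.length <= max) return plain;
  const cut = plain.slice(0, max);
  const lastSpace = cut.lastIndexOf(" ");
  // Avoid cutting mid-word when a space exists inside the window.
  return (lastSpace > 0 ? cut.slice(0, lastSpace) : cut) + "…";
}
```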
#### Scenario: Card renders with all fields
- **WHEN** a content item has an image, title, summary, publish date, views, and source
- **THEN** the card renders those fields in the standard card layout order
#### Scenario: Card renders without views
- **WHEN** a content item has no views data
- **THEN** the card renders the footer bar with date + source and omits views without breaking the layout
#### Scenario: Card renders without featured image
- **WHEN** a content item has no featured image
- **THEN** the card renders a placeholder media area and still renders the remaining fields


@@ -0,0 +1,31 @@
## Purpose
Enable operators to refresh the deployed site to the latest content on a Docker-only host (no Node.js installed on the server).
## Requirements
### Requirement: Host update does not require Node.js
The system MUST provide an operator workflow to update the deployed site to the latest content without installing Node.js on the server host. Any build or content-fetch steps MUST run in containers and/or CI, not via host-installed Node.js.
#### Scenario: Operator updates without host Node.js
- **WHEN** the server host has Docker available but does not have Node.js installed
- **THEN** the operator can complete the update procedure using Docker commands only
### Requirement: Image-based content refresh is supported
The system MUST support refreshing the deployed site to the latest content by pulling a newly built deployable artifact (for example, a Docker image) and restarting the running service.
#### Scenario: Successful refresh to latest image
- **WHEN** the operator runs the documented refresh command
- **THEN** the server pulls the latest published image and restarts the service using that image
#### Scenario: Refresh failure does not break running site
- **WHEN** the operator runs the documented refresh command and the pull fails
- **THEN** the site remains running on the previously deployed image
### Requirement: Refresh is repeatable and auditable
The system MUST document the refresh procedure and provide a way to verify which version is deployed (for example, image tag/digest or build metadata).
#### Scenario: Operator verifies deployed version
- **WHEN** the operator runs the documented verification command
- **THEN** the system reports the currently deployed version identifier


@@ -38,7 +38,8 @@ When `metrics.views` is not available, the system MUST render the high-performin
### Requirement: Graceful empty and error states
If a module has no content to display, the homepage MUST render a non-broken empty state for that module and MUST still render the rest of the page.
The Instagram module is an exception: if there are no Instagram items to display, the homepage MUST omit the Instagram module entirely (no empty state block) and MUST still render the rest of the page.
#### Scenario: No Instagram items available
- **WHEN** the cached dataset contains no Instagram items
- **THEN** the Instagram-related module is not rendered and the homepage still renders other modules


@@ -18,6 +18,12 @@ Every clickable item that is tracked MUST have a stable identifier (`target_id`)
The identifier MUST be deterministic across builds for the same element and placement.
The taxonomy MUST define namespaces for repeated UI surfaces. For the blog surface, the following namespaces MUST be used:
- `blog.subnav.*` for secondary navigation links
- `blog.card.post.<slug>` for blog post cards
- `blog.pages.link.<slug>` for blog page listing links
- `blog.post.*` / `blog.page.*` for detail page chrome links (e.g., back links)
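
The namespaces above suggest a single deterministic builder; the helper name and slug normalization below are illustrative assumptions:

```javascript
// Hypothetical builder for the blog target_id namespaces listed above.
// Deterministic for a given kind + slug, as the taxonomy requires.
function blogTargetId(kind, slug) {
  const safe = String(slug).toLowerCase().trim();
  switch (kind) {
    case "post":   return `blog.card.post.${safe}`;
    case "page":   return `blog.pages.link.${safe}`;
    case "subnav": return `blog.subnav.${safe}`;
    default:
      throw new Error(`unknown blog target kind: ${kind}`);
  }
}
```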
#### Scenario: Two links in different placements
- **WHEN** two links point to the same destination but appear in different placements
- **THEN** their `target_id` values are different so their clicks can be measured independently
@@ -30,14 +36,27 @@ Every tracked click event MUST include, at minimum:
For links, the event MUST also include:
- `target_url` (or a stable target identifier that can be mapped to a URL)
For content-related links (clickables representing a specific piece of content), the event MUST also include:
- `title` (human-readable content title)
- `type` (content type identifier)
The `type` value MUST be one of:
- `video`
- `podcast_episode`
- `blog_post`
- `blog_page`
#### Scenario: Tracking a content card click
- **WHEN** a user clicks a content card link
- **THEN** the emitted event includes `target_id`, `placement`, and `target_url`
#### Scenario: Tracking a content link includes title and type
- **WHEN** a user clicks a content-related link that represents a specific content item
- **THEN** the emitted event includes `target_id`, `placement`, `target_url`, `title`, and `type`
### Requirement: No PII in event properties
The taxonomy MUST prohibit including personally identifiable information (PII) in event names or event properties.
#### Scenario: Tracking includes only categorical metadata
- **WHEN** tracking metadata is defined for a clickable item
- **THEN** it contains only categorical identifiers (ids, placements, domains) and does not include user-provided content


@@ -45,9 +45,19 @@ The site MUST provide:
- `sitemap.xml` enumerating indexable pages
- `robots.txt` that allows indexing of indexable pages
The sitemap MUST include the blog surface routes:
- `/blog`
- blog post detail routes
- blog page detail routes
- blog category listing routes
#### Scenario: Sitemap is available
- **WHEN** a crawler requests `/sitemap.xml`
- **THEN** the server returns an XML sitemap listing `/`, `/videos`, `/podcast`, `/about`, and `/blog`
#### Scenario: Blog URLs appear in sitemap
- **WHEN** WordPress content is available in the cache at build time
- **THEN** the generated sitemap includes the blog detail URLs for those items
### Requirement: Structured data
The site MUST support structured data (JSON-LD) for Video and Podcast content when detail pages exist, and MUST ensure the JSON-LD is valid JSON.


@@ -11,6 +11,9 @@ The normalized item MUST include at minimum:
- `publishedAt` (ISO-8601)
- `thumbnailUrl` (optional)
The system MUST support an optional summary field on normalized items when available from the source:
- `summary` (optional, short human-readable excerpt suitable for cards)
#### Scenario: Normalizing a YouTube video
- **WHEN** the system ingests a YouTube video item
- **THEN** it produces a normalized item containing `id`, `source: youtube`, `url`, `title`, and `publishedAt`
@@ -19,6 +22,10 @@ The normalized item MUST include at minimum:
- **WHEN** the system ingests a podcast RSS episode
- **THEN** it produces a normalized item containing `id`, `source: podcast`, `url`, `title`, and `publishedAt`
#### Scenario: Summary available
- **WHEN** an ingested item provides summary/description content
- **THEN** the normalized item includes a `summary` suitable for rendering in cards
### Requirement: YouTube ingestion with stats when available
The system MUST support ingesting YouTube videos for channel `youtube.com/santhoshj`.
@@ -57,6 +64,8 @@ The system MUST support periodic refresh on a schedule (at minimum daily) and MU
On ingestion failure, the system MUST continue serving the most recent cached data.
The ingestion pipeline MUST use the cache layer (when configured and reachable) to reduce repeated network and parsing work for external sources (for example, YouTube API/RSS and podcast RSS).
#### Scenario: Scheduled refresh fails
- **WHEN** a scheduled refresh run fails to fetch one or more sources
- **THEN** the site continues to use the most recent successfully cached dataset
@@ -65,3 +74,6 @@ On ingestion failure, the system MUST continue serving the most recent cached da
- **WHEN** a manual refresh is triggered
- **THEN** the system attempts ingestion immediately and updates the cache if ingestion succeeds
#### Scenario: Cache hit avoids refetch
- **WHEN** a refresh run is executed within the cache TTL for a given source+parameters
- **THEN** the ingestion pipeline uses cached data for that source instead of refetching over the network


@@ -0,0 +1,71 @@
## Purpose
Define a minimum UX baseline for accessibility (WCAG 2.2 AA aligned) and responsive behavior for the site shell (navigation, focus, motion, typography, and background behavior).
## Requirements
### Requirement: Responsive layout baseline
The site MUST be responsive across common breakpoints (mobile, tablet, desktop, and large desktop) and MUST not exhibit broken layouts (overlapping content, horizontal scrolling, clipped navigation).
#### Scenario: Mobile viewport does not horizontally scroll
- **WHEN** the site is viewed on a small mobile viewport
- **THEN** content reflows to a single-column layout and the page does not require horizontal scrolling to read primary content
#### Scenario: Large viewport uses available space without visual artifacts
- **WHEN** the site is viewed on a large desktop viewport (ultrawide / high resolution)
- **THEN** the background and layout scale without visible abrupt gradient cutoffs or banding artifacts
### Requirement: Collapsible primary navigation (hamburger menu)
The primary navigation MUST collapse into a hamburger menu on smaller viewports.
The menu toggle MUST be a `<button>` with:
- `aria-controls` referencing the menu container
- `aria-expanded` reflecting open/closed state
- an accessible label (e.g., `aria-label="Open menu"`/`"Close menu"` or equivalent)
When the menu is open, the menu items MUST be visible and keyboard navigable.
#### Scenario: Menu collapses on small viewport
- **WHEN** the viewport is below the mobile navigation breakpoint
- **THEN** the primary navigation renders in a collapsed state and can be opened via a hamburger toggle
#### Scenario: Menu toggle exposes accessible state
- **WHEN** the user toggles the menu open and closed
- **THEN** `aria-expanded` updates correctly and the toggle remains reachable via keyboard
### Requirement: Keyboard and focus behavior baseline (WCAG 2.2 AA aligned)
The site MUST support keyboard navigation for all primary interactive elements.
The site MUST provide visible focus indication for keyboard users using `:focus-visible` styles.
For the mobile menu:
- pressing `Escape` MUST close the menu (when open)
- closing the menu MUST return focus to the menu toggle button
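
The Escape/focus contract above can be sketched as a handler factory. The `menu`/`toggle` objects and their `isOpen()`/`close()`/`focus()` methods are assumed interfaces; a real implementation would wire this to DOM elements via a `keydown` listener:

```javascript
// Sketch of the mobile-menu Escape behavior: close the open menu and
// return focus to the hamburger toggle, as required above.
function makeMenuKeydownHandler(menu, toggle) {
  return function onKeydown(event) {
    if (event.key !== "Escape" || !menu.isOpen()) return;
    menu.close();   // close the open menu
    toggle.focus(); // return focus to the toggle button
  };
}
```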
#### Scenario: Focus is visible on links and buttons
- **WHEN** a keyboard user tabs through the page
- **THEN** the focused element shows a visible focus indicator
#### Scenario: Escape closes the menu
- **WHEN** the menu is open and the user presses `Escape`
- **THEN** the menu closes and focus returns to the menu toggle
### Requirement: Reduced motion support
The site MUST respect user motion preferences:
- if `prefers-reduced-motion: reduce` is set, animations/transitions for the menu and other UI elements MUST be reduced or disabled.
#### Scenario: Reduced motion disables menu animation
- **WHEN** the user's system preference is `prefers-reduced-motion: reduce`
- **THEN** opening/closing the menu does not use noticeable animation
### Requirement: Typography baseline (display-friendly font)
The site MUST use a display-friendly font stack consistently across pages, including headings and navigation.
The site MUST ensure text remains readable:
- reasonable line height
- sufficient contrast against the background for primary text and focus indicators
#### Scenario: Font is applied consistently
- **WHEN** a user navigates between pages
- **THEN** typography (font family and basic scale) remains consistent


@@ -0,0 +1,69 @@
## Purpose
Provide a build-time content source backed by a WordPress site via the `wp-json` REST APIs.
## ADDED Requirements
### Requirement: WordPress API configuration
The system MUST allow configuring a WordPress content source using environment/config values:
- WordPress base URL
- credentials (username + password or application password) when required by the WordPress instance
The WordPress base URL MUST be used to construct requests to the WordPress `wp-json` REST APIs.
#### Scenario: Config provided
- **WHEN** WordPress configuration values are provided
- **THEN** the system can attempt to fetch WordPress content via `wp-json`
### Requirement: Fetch posts
The system MUST fetch the latest WordPress posts via `wp-json` and map them into an internal representation with:
- stable ID
- slug
- title
- excerpt/summary
- content HTML
- featured image URL when available
- publish date/time and last modified date/time
- category assignments (IDs and slugs when available)
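
A hedged sketch of the mapping: the `title.rendered` / `excerpt.rendered` / `content.rendered` paths follow the public WordPress REST API, the featured-image path assumes the request used `_embed`, and the output field names are this project's choice, not mandated by the spec:

```javascript
// Map a raw wp-json post into the internal representation listed above.
function mapWpPost(raw) {
  const media = raw._embedded?.["wp:featuredmedia"]?.[0];
  return {
    id: raw.id,
    slug: raw.slug,
    title: raw.title?.rendered ?? "",
    summary: raw.excerpt?.rendered ?? "",
    contentHtml: raw.content?.rendered ?? "",
    featuredImageUrl: media?.source_url, // undefined when no featured image
    publishedAt: raw.date_gmt ?? raw.date,
    modifiedAt: raw.modified_gmt ?? raw.modified,
    categoryIds: raw.categories ?? [],
  };
}
```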
#### Scenario: Posts fetched successfully
- **WHEN** the WordPress posts endpoint returns a non-empty list
- **THEN** the system stores the mapped post items in the content cache for rendering
### Requirement: Fetch pages
The system MUST fetch WordPress pages via `wp-json` and map them into an internal representation with:
- stable ID
- slug
- title
- excerpt/summary when available
- content HTML
- featured image URL when available
- publish date/time and last modified date/time
#### Scenario: Pages fetched successfully
- **WHEN** the WordPress pages endpoint returns a non-empty list
- **THEN** the system stores the mapped page items in the content cache for rendering
### Requirement: Fetch categories
The system MUST fetch WordPress categories via `wp-json` and store them for rendering a category-based secondary navigation under the blog section.
#### Scenario: Categories fetched successfully
- **WHEN** the WordPress categories endpoint returns a list of categories
- **THEN** the system stores categories (ID, slug, name) in the content cache for blog navigation
### Requirement: Build-time caching
WordPress posts, pages, and categories MUST be written into the repo-local content cache used by the site build.
If the WordPress fetch fails, the system MUST NOT crash the entire build pipeline; it MUST either:
- keep the last-known-good cached WordPress content (if present), or
- store an empty WordPress dataset and allow the rest of the site to build.
When the cache layer is configured and reachable, the WordPress ingestion MUST cache `wp-json` responses (or normalized outputs) using a TTL so repeated ingestion runs avoid unnecessary network requests and parsing work.
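
The fallback rule can be captured in one small function; the dataset shape and names are illustrative:

```javascript
// Graceful-degradation rule from above: prefer a fresh fetch, then the
// last-known-good cached dataset, then an empty dataset so the rest of
// the site can still build.
const EMPTY_WP_DATASET = { posts: [], pages: [], categories: [] };

function resolveWpDataset(fetched, cached) {
  if (fetched) return fetched; // fresh fetch succeeded
  if (cached) return cached;   // keep last known good
  return EMPTY_WP_DATASET;     // empty state, build continues
}
```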
#### Scenario: WordPress fetch fails
- **WHEN** a WordPress API request fails
- **THEN** the site build can still complete and the blog surface renders a graceful empty state
#### Scenario: Cache hit avoids wp-json refetch
- **WHEN** WordPress ingestion is executed within the configured cache TTL
- **THEN** it uses cached data instead of refetching from `wp-json`
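The fallback and TTL behavior above can be sketched as follows. The `fetchLive` and cache interfaces here are assumptions for illustration, not the site's actual APIs:

```typescript
type WordpressData = { posts: unknown[]; pages: unknown[]; categories: unknown[] };

type TtlCache = {
  get(key: string): WordpressData | undefined;
  set(key: string, value: WordpressData, ttlSeconds: number): void;
};

// Ingest WordPress data without ever failing the build:
// 1. a cache hit within the TTL avoids a wp-json refetch;
// 2. on fetch failure, keep last-known-good content if present;
// 3. otherwise fall back to an empty dataset so the rest of the site still builds.
async function ingestWordpress(
  fetchLive: () => Promise<WordpressData>,
  ttlCache: TtlCache,
  lastKnownGood: WordpressData | undefined,
): Promise<WordpressData> {
  const key = "wp:content";
  const hit = ttlCache.get(key);
  if (hit) return hit;
  try {
    const live = await fetchLive();
    ttlCache.set(key, live, 3600); // 1 hour TTL
    return live;
  } catch {
    return lastKnownGood ?? { posts: [], pages: [], categories: [] };
  }
}
```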

View File

@@ -3,12 +3,10 @@ set -eu
cd "$(dirname "$0")/.."
echo "[refresh] fetching content"
(cd site && npm ci && npm run fetch-content)
echo "[refresh] pulling latest image"
docker compose -f deploy/docker-compose.prod.yml pull
echo "[refresh] building + restarting container"
docker compose build web
docker compose up -d --force-recreate web
echo "[refresh] restarting service (no build)"
docker compose -f deploy/docker-compose.prod.yml up -d --no-build
echo "[refresh] done"

View File

@@ -19,3 +19,17 @@ WORDPRESS_BASE_URL=
# Optional credentials (prefer an Application Password). Leave blank if your WP endpoints are public.
WORDPRESS_USERNAME=
WORDPRESS_APP_PASSWORD=
# Cache layer (optional; used by ingestion scripts)
# If unset, caching is disabled.
#
# Using docker-compose redis:
# CACHE_REDIS_URL=redis://localhost:6380/0
CACHE_REDIS_URL=
# Alternative config if you prefer host/port/db:
CACHE_REDIS_HOST=localhost
CACHE_REDIS_PORT=6380
CACHE_REDIS_DB=0
# Default cache TTL (seconds). 3600 = 1 hour.
CACHE_DEFAULT_TTL_SECONDS=3600

File diff suppressed because one or more lines are too long

site/package-lock.json generated
View File

@@ -10,6 +10,7 @@
"dependencies": {
"@astrojs/sitemap": "^3.7.0",
"astro": "^5.17.1",
"redis": "^4.7.1",
"rss-parser": "^3.13.0",
"zod": "^3.25.76"
},
@@ -1241,6 +1242,65 @@
"integrity": "sha512-70wQhgYmndg4GCPxPPxPGevRKqTIJ2Nh4OkiMWmDAVYsTQ+Ta7Sq+rPevXyXGdzr30/qZBnyOalCszoMxlyldQ==",
"license": "MIT"
},
"node_modules/@redis/bloom": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/@redis/bloom/-/bloom-1.2.0.tgz",
"integrity": "sha512-HG2DFjYKbpNmVXsa0keLHp/3leGJz1mjh09f2RLGGLQZzSHpkmZWuwJbAvo3QcRY8p80m5+ZdXZdYOSBLlp7Cg==",
"license": "MIT",
"peerDependencies": {
"@redis/client": "^1.0.0"
}
},
"node_modules/@redis/client": {
"version": "1.6.1",
"resolved": "https://registry.npmjs.org/@redis/client/-/client-1.6.1.tgz",
"integrity": "sha512-/KCsg3xSlR+nCK8/8ZYSknYxvXHwubJrU82F3Lm1Fp6789VQ0/3RJKfsmRXjqfaTA++23CvC3hqmqe/2GEt6Kw==",
"license": "MIT",
"dependencies": {
"cluster-key-slot": "1.1.2",
"generic-pool": "3.9.0",
"yallist": "4.0.0"
},
"engines": {
"node": ">=14"
}
},
"node_modules/@redis/graph": {
"version": "1.1.1",
"resolved": "https://registry.npmjs.org/@redis/graph/-/graph-1.1.1.tgz",
"integrity": "sha512-FEMTcTHZozZciLRl6GiiIB4zGm5z5F3F6a6FZCyrfxdKOhFlGkiAqlexWMBzCi4DcRoyiOsuLfW+cjlGWyExOw==",
"license": "MIT",
"peerDependencies": {
"@redis/client": "^1.0.0"
}
},
"node_modules/@redis/json": {
"version": "1.0.7",
"resolved": "https://registry.npmjs.org/@redis/json/-/json-1.0.7.tgz",
"integrity": "sha512-6UyXfjVaTBTJtKNG4/9Z8PSpKE6XgSyEb8iwaqDcy+uKrd/DGYHTWkUdnQDyzm727V7p21WUMhsqz5oy65kPcQ==",
"license": "MIT",
"peerDependencies": {
"@redis/client": "^1.0.0"
}
},
"node_modules/@redis/search": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/@redis/search/-/search-1.2.0.tgz",
"integrity": "sha512-tYoDBbtqOVigEDMAcTGsRlMycIIjwMCgD8eR2t0NANeQmgK/lvxNAvYyb6bZDD4frHRhIHkJu2TBRvB0ERkOmw==",
"license": "MIT",
"peerDependencies": {
"@redis/client": "^1.0.0"
}
},
"node_modules/@redis/time-series": {
"version": "1.1.0",
"resolved": "https://registry.npmjs.org/@redis/time-series/-/time-series-1.1.0.tgz",
"integrity": "sha512-c1Q99M5ljsIuc4YdaCwfUEXsofakb9c8+Zse2qxTadu8TalLXuAESzLvFAvNVbkmSlvlzIQOLpBCmWI9wTOt+g==",
"license": "MIT",
"peerDependencies": {
"@redis/client": "^1.0.0"
}
},
"node_modules/@rollup/pluginutils": {
"version": "5.3.0",
"resolved": "https://registry.npmjs.org/@rollup/pluginutils/-/pluginutils-5.3.0.tgz",
@@ -2515,6 +2575,15 @@
"node": ">=6"
}
},
"node_modules/cluster-key-slot": {
"version": "1.1.2",
"resolved": "https://registry.npmjs.org/cluster-key-slot/-/cluster-key-slot-1.1.2.tgz",
"integrity": "sha512-RMr0FhtfXemyinomL4hrWcYJxmX6deFdCxpJzhDttxgO1+bcCnkk+9drydLVDmAMG7NE6aN/fl4F7ucU/90gAA==",
"license": "Apache-2.0",
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/color-convert": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/color-convert/-/color-convert-2.0.1.tgz",
@@ -3090,6 +3159,15 @@
"node": "^8.16.0 || ^10.6.0 || >=11.0.0"
}
},
"node_modules/generic-pool": {
"version": "3.9.0",
"resolved": "https://registry.npmjs.org/generic-pool/-/generic-pool-3.9.0.tgz",
"integrity": "sha512-hymDOu5B53XvN4QT9dBmZxPX4CWhBPPLguTZ9MMFeFa/Kg0xWVfylOVNlJji/E7yTZWFd/q9GO5TxDLq156D7g==",
"license": "MIT",
"engines": {
"node": ">= 4"
}
},
"node_modules/get-caller-file": {
"version": "2.0.5",
"resolved": "https://registry.npmjs.org/get-caller-file/-/get-caller-file-2.0.5.tgz",
@@ -4687,6 +4765,23 @@
"url": "https://paulmillr.com/funding/"
}
},
"node_modules/redis": {
"version": "4.7.1",
"resolved": "https://registry.npmjs.org/redis/-/redis-4.7.1.tgz",
"integrity": "sha512-S1bJDnqLftzHXHP8JsT5II/CtHWQrASX5K96REjWjlmWKrviSOLWmM7QnRLstAWsu1VBBV1ffV6DzCvxNP0UJQ==",
"license": "MIT",
"workspaces": [
"./packages/*"
],
"dependencies": {
"@redis/bloom": "1.2.0",
"@redis/client": "1.6.1",
"@redis/graph": "1.1.1",
"@redis/json": "1.0.7",
"@redis/search": "1.2.0",
"@redis/time-series": "1.1.0"
}
},
"node_modules/regex": {
"version": "6.1.0",
"resolved": "https://registry.npmjs.org/regex/-/regex-6.1.0.tgz",
@@ -6745,6 +6840,12 @@
"node": ">=10"
}
},
"node_modules/yallist": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/yallist/-/yallist-4.0.0.tgz",
"integrity": "sha512-3wdGidZyq5PB084XLES5TpOSRA3wjXAlIWMhum2kRcv/41Sn2emQ0dycQW4uZXLejwKvg6EsvbdlVL+FYEct7A==",
"license": "ISC"
},
"node_modules/yaml": {
"version": "2.8.2",
"resolved": "https://registry.npmjs.org/yaml/-/yaml-2.8.2.tgz",

View File

@@ -7,6 +7,7 @@
"build": "astro build",
"preview": "astro preview",
"fetch-content": "tsx scripts/fetch-content.ts",
"cache:clear": "tsx scripts/cache-clear.ts",
"verify:blog": "npm run build && tsx scripts/verify-blog-build.ts",
"typecheck": "astro check",
"format": "prettier -w .",
@@ -17,6 +18,7 @@
"dependencies": {
"@astrojs/sitemap": "^3.7.0",
"astro": "^5.17.1",
"redis": "^4.7.1",
"rss-parser": "^3.13.0",
"zod": "^3.25.76"
},

View File

@@ -8,6 +8,7 @@
--stroke: rgba(255, 255, 255, 0.16);
--accent: #ffcd4a;
--accent2: #5ee4ff;
--focus: rgba(94, 228, 255, 0.95);
}
* {
@@ -22,12 +23,10 @@ body {
body {
margin: 0;
color: var(--fg);
background:
radial-gradient(1000px 600px at 10% 10%, rgba(94, 228, 255, 0.22), transparent 55%),
radial-gradient(900px 600px at 90% 20%, rgba(255, 205, 74, 0.18), transparent 50%),
radial-gradient(900px 700px at 30% 90%, rgba(140, 88, 255, 0.14), transparent 55%),
linear-gradient(180deg, var(--bg0), var(--bg1));
background: linear-gradient(180deg, var(--bg0), var(--bg1));
/* Prefer a display-friendly font if available; fall back to system fonts. */
font-family:
"Manrope",
ui-sans-serif,
system-ui,
-apple-system,
@@ -41,11 +40,53 @@ body {
"Segoe UI Emoji";
}
/* Oversized fixed background layer to avoid gradient cutoffs on large screens. */
body::before {
content: "";
position: fixed;
inset: -40vmax;
z-index: -1;
pointer-events: none;
background:
radial-gradient(1200px 800px at 10% 10%, rgba(94, 228, 255, 0.22), transparent 60%),
radial-gradient(1100px 800px at 90% 20%, rgba(255, 205, 74, 0.18), transparent 58%),
radial-gradient(1200px 900px at 30% 90%, rgba(140, 88, 255, 0.14), transparent 62%);
}
a {
color: inherit;
text-decoration: none;
}
/* WCAG-ish baseline: make keyboard focus obvious. */
a:focus-visible,
button:focus-visible,
input:focus-visible,
select:focus-visible,
textarea:focus-visible {
outline: 3px solid var(--focus);
outline-offset: 3px;
}
.skip-link {
position: absolute;
left: 14px;
top: 12px;
z-index: 999;
padding: 10px 12px;
border-radius: 999px;
border: 1px solid rgba(255, 255, 255, 0.18);
background: rgba(10, 14, 28, 0.92);
color: var(--fg);
font-weight: 800;
transform: translateY(-220%);
transition: transform 140ms ease;
}
.skip-link:focus {
transform: translateY(0);
}
.container {
width: min(1100px, calc(100% - 48px));
margin: 0 auto;
@@ -78,10 +119,122 @@ a {
color: var(--muted);
}
.nav a {
padding: 10px 12px;
border-radius: 999px;
}
.nav a:hover {
color: var(--fg);
}
.nav-toggle {
display: none;
align-items: center;
justify-content: center;
width: 44px;
height: 44px;
border-radius: 999px;
border: 1px solid rgba(255, 255, 255, 0.14);
background: rgba(255, 255, 255, 0.04);
color: var(--fg);
}
.nav-toggle-icon {
width: 18px;
height: 12px;
position: relative;
display: block;
}
.nav-toggle-icon::before,
.nav-toggle-icon::after {
content: "";
position: absolute;
left: 0;
right: 0;
height: 2px;
border-radius: 999px;
background: rgba(242, 244, 255, 0.92);
}
.nav-toggle-icon::before {
top: 0;
box-shadow: 0 5px 0 rgba(242, 244, 255, 0.92);
}
.nav-toggle-icon::after {
bottom: 0;
}
@media (max-width: 760px) {
.site-header {
position: sticky;
}
.nav-toggle {
display: inline-flex;
}
.nav {
position: absolute;
top: calc(100% + 10px);
right: 14px;
width: min(86vw, 320px);
padding: 10px;
display: flex;
flex-direction: column;
gap: 6px;
border-radius: 16px;
border: 1px solid rgba(255, 255, 255, 0.14);
background: rgba(10, 14, 28, 0.92);
box-shadow:
0 18px 60px rgba(0, 0, 0, 0.55),
0 0 0 1px rgba(255, 255, 255, 0.05) inset;
transform-origin: top right;
transition:
opacity 160ms ease,
transform 160ms ease,
visibility 0s linear 160ms;
}
.nav[data-open="false"] {
opacity: 0;
transform: translateY(-6px) scale(0.98);
pointer-events: none;
visibility: hidden;
}
.nav[data-open="true"] {
opacity: 1;
transform: translateY(0) scale(1);
pointer-events: auto;
visibility: visible;
transition:
opacity 160ms ease,
transform 160ms ease,
visibility 0s;
}
.nav a {
padding: 12px 12px;
border-radius: 12px;
background: rgba(255, 255, 255, 0.04);
border: 1px solid rgba(255, 255, 255, 0.08);
}
}
@media (prefers-reduced-motion: reduce) {
*,
*::before,
*::after {
scroll-behavior: auto !important;
transition-duration: 0.001ms !important;
animation-duration: 0.001ms !important;
animation-iteration-count: 1 !important;
}
}
.subnav {
display: flex;
gap: 10px;
@@ -114,46 +267,7 @@ a {
gap: 14px;
}
.blog-card {
border-radius: 16px;
border: 1px solid rgba(255, 255, 255, 0.1);
background: rgba(255, 255, 255, 0.04);
overflow: hidden;
transition:
transform 120ms ease,
background 120ms ease;
}
.blog-card:hover {
transform: translateY(-2px);
background: rgba(255, 255, 255, 0.06);
}
.blog-card img {
width: 100%;
height: 180px;
object-fit: cover;
display: block;
border-bottom: 1px solid rgba(255, 255, 255, 0.08);
}
.blog-card-body {
padding: 12px 12px 14px;
}
.blog-card-title {
margin: 0 0 8px;
font-size: 15px;
line-height: 1.25;
letter-spacing: -0.01em;
}
.blog-card-excerpt {
margin: 0;
color: var(--muted);
font-size: 13px;
line-height: 1.5;
}
/* blog cards are now rendered via the shared `.card` component styles */
.prose {
line-height: 1.75;
@@ -256,53 +370,88 @@ a {
display: grid;
grid-template-columns: repeat(3, minmax(0, 1fr));
gap: 14px;
align-items: stretch;
}
.card {
display: grid;
grid-template-columns: 110px 1fr;
gap: 12px;
padding: 12px;
display: flex;
flex-direction: column;
height: 100%;
border-radius: 16px;
border: 1px solid rgba(255, 255, 255, 0.1);
background: rgba(255, 255, 255, 0.04);
overflow: hidden;
transition:
transform 120ms ease,
background 120ms ease;
}
.card-media {
flex: 0 0 auto;
}
.card:hover {
transform: translateY(-2px);
background: rgba(255, 255, 255, 0.06);
}
.card-media img {
width: 110px;
height: 70px;
border-radius: 10px;
width: 100%;
height: 180px;
object-fit: cover;
border: 1px solid rgba(255, 255, 255, 0.1);
display: block;
border-bottom: 1px solid rgba(255, 255, 255, 0.08);
}
.card-placeholder {
width: 110px;
height: 70px;
border-radius: 10px;
width: 100%;
height: 180px;
background: rgba(255, 255, 255, 0.06);
border: 1px solid rgba(255, 255, 255, 0.1);
border-bottom: 1px solid rgba(255, 255, 255, 0.08);
}
.card-meta {
.card-body {
display: flex;
gap: 10px;
align-items: center;
font-size: 12px;
flex: 1;
flex-direction: column;
padding: 0;
}
.card-content {
flex: 1;
padding: 12px;
background: linear-gradient(180deg, rgba(15, 27, 56, 0.75), rgba(11, 16, 32, 0.32));
}
.card-title {
margin: 8px 0 0;
font-size: 14px;
line-height: 1.35;
margin: 0 0 8px;
font-size: 15px;
line-height: 1.25;
letter-spacing: -0.01em;
}
.card-summary {
margin: 0;
color: var(--muted);
font-size: 13px;
line-height: 1.5;
}
.card-footer {
margin-top: auto;
display: flex;
align-items: center;
justify-content: space-between;
gap: 12px;
padding: 10px 12px;
border-top: 1px solid rgba(255, 255, 255, 0.08);
background: rgba(11, 16, 32, 0.45);
font-size: 12px;
}
.card-footer .card-views {
flex: 1;
text-align: center;
}
.pill {
@@ -326,6 +475,10 @@ a {
border-color: rgba(255, 205, 74, 0.35);
}
.pill-blog {
border-color: rgba(140, 88, 255, 0.35);
}
.empty {
padding: 16px;
border-radius: 14px;
@@ -351,12 +504,8 @@ a {
.blog-grid {
grid-template-columns: 1fr;
}
.card {
grid-template-columns: 90px 1fr;
}
.card-media img,
.card-placeholder {
width: 90px;
height: 60px;
height: 200px;
}
}

site/public/sw.js Normal file
View File

@@ -0,0 +1,148 @@
/* Service Worker: lightweight caching for a static Astro site.
- Navigations: network-first (avoid stale HTML indefinitely)
- Site shell: pre-cache on install
- Images: cache-first with bounded eviction
*/
// Bump this value on deploy to invalidate caches.
const CACHE_VERSION = "v1";
const CACHE_SHELL = `shell-${CACHE_VERSION}`;
const CACHE_PAGES = `pages-${CACHE_VERSION}`;
const CACHE_MEDIA = `media-${CACHE_VERSION}`;
const SHELL_ASSETS = ["/", "/styles/global.css", "/favicon.svg", "/favicon.ico", "/robots.txt"];
// Keep media cache bounded so we don't grow indefinitely.
const MAX_MEDIA_ENTRIES = 80;
const isGet = (request) => request && request.method === "GET";
const isNavigationRequest = (request) =>
request.mode === "navigate" || request.destination === "document";
const isImageRequest = (request, url) => {
if (request.destination === "image") return true;
const p = url.pathname.toLowerCase();
return (
p.endsWith(".png") ||
p.endsWith(".jpg") ||
p.endsWith(".jpeg") ||
p.endsWith(".webp") ||
p.endsWith(".gif") ||
p.endsWith(".avif") ||
p.endsWith(".svg")
);
};
async function trimCache(cacheName, maxEntries) {
const cache = await caches.open(cacheName);
const keys = await cache.keys();
const extra = keys.length - maxEntries;
if (extra <= 0) return;
// Cache keys are returned in insertion order in practice; delete the oldest.
for (let i = 0; i < extra; i += 1) {
await cache.delete(keys[i]);
}
}
async function cachePutSafe(cacheName, request, response) {
// Only cache successful or opaque responses. Avoid caching 404/500 HTML.
if (!response) return;
if (response.type !== "opaque" && !response.ok) return;
const cache = await caches.open(cacheName);
await cache.put(request, response);
}
self.addEventListener("install", (event) => {
event.waitUntil(
(async () => {
const cache = await caches.open(CACHE_SHELL);
await cache.addAll(SHELL_ASSETS);
// Activate new worker ASAP to pick up new caching rules.
await self.skipWaiting();
})(),
);
});
self.addEventListener("activate", (event) => {
event.waitUntil(
(async () => {
const keep = new Set([CACHE_SHELL, CACHE_PAGES, CACHE_MEDIA]);
const keys = await caches.keys();
await Promise.all(keys.map((k) => (keep.has(k) ? Promise.resolve() : caches.delete(k))));
await self.clients.claim();
})(),
);
});
self.addEventListener("fetch", (event) => {
const { request } = event;
if (!isGet(request)) return;
const url = new URL(request.url);
// Only handle http(s).
if (url.protocol !== "http:" && url.protocol !== "https:") return;
// Network-first for navigations (HTML documents). Cache as fallback only.
if (isNavigationRequest(request)) {
event.respondWith(
(async () => {
try {
const fresh = await fetch(request);
// Cache a clone so we can serve it when offline.
await cachePutSafe(CACHE_PAGES, request, fresh.clone());
return fresh;
} catch {
const cached = await caches.match(request);
if (cached) return cached;
// Fallback: try cached homepage shell.
const home = await caches.match("/");
if (home) return home;
throw new Error("No cached navigation fallback.");
}
})(),
);
return;
}
// Cache-first for images/media with bounded cache size.
if (isImageRequest(request, url)) {
event.respondWith(
(async () => {
const cached = await caches.match(request);
if (cached) return cached;
const res = await fetch(request);
await cachePutSafe(CACHE_MEDIA, request, res.clone());
await trimCache(CACHE_MEDIA, MAX_MEDIA_ENTRIES);
return res;
})(),
);
return;
}
// Stale-while-revalidate for styles/scripts/fonts from same-origin.
if (
url.origin === self.location.origin &&
(request.destination === "style" ||
request.destination === "script" ||
request.destination === "font")
) {
event.respondWith(
(async () => {
const cached = await caches.match(request);
const networkPromise = fetch(request)
.then(async (res) => {
await cachePutSafe(CACHE_SHELL, request, res.clone());
return res;
})
.catch(() => null);
return cached || (await networkPromise) || fetch(request);
})(),
);
}
});

View File

@@ -0,0 +1,22 @@
import "dotenv/config";
import { createCacheFromEnv } from "../src/lib/cache";
function log(msg: string) {
// eslint-disable-next-line no-console
console.log(`[cache-clear] ${msg}`);
}
async function main() {
const cache = await createCacheFromEnv(process.env, { namespace: "fast-website", log });
await cache.flush();
await cache.close();
log("ok");
}
main().catch((e) => {
// eslint-disable-next-line no-console
console.error(`[cache-clear] failed: ${String(e)}`);
process.exitCode = 1;
});

View File

@@ -4,6 +4,8 @@ import { promises as fs } from "node:fs";
import path from "node:path";
import { getIngestConfigFromEnv } from "../src/lib/config";
import { createCacheFromEnv } from "../src/lib/cache";
import { cachedCompute } from "../src/lib/cache/memoize";
import type { ContentCache, ContentItem } from "../src/lib/content/types";
import { readInstagramEmbedPosts } from "../src/lib/ingest/instagram";
import { fetchPodcastRss } from "../src/lib/ingest/podcast";
@@ -42,6 +44,11 @@ async function main() {
const all: ContentItem[] = [];
const outPath = path.join(process.cwd(), "content", "cache", "content.json");
const kv = await createCacheFromEnv(process.env, {
namespace: "fast-website",
log,
});
// Read the existing cache so we can keep last-known-good sections if a source fails.
let existing: ContentCache | undefined;
try {
@@ -56,17 +63,29 @@ async function main() {
log("YouTube: skipped (missing YOUTUBE_CHANNEL_ID)");
} else if (cfg.youtubeApiKey) {
try {
const items = await fetchYoutubeViaApi(cfg.youtubeChannelId, cfg.youtubeApiKey, 25);
const cacheKey = `youtube:api:${cfg.youtubeChannelId}:25`;
const { value: items, cached } = await cachedCompute(kv, cacheKey, () =>
fetchYoutubeViaApi(cfg.youtubeChannelId!, cfg.youtubeApiKey!, 25),
);
log(`YouTube: API ${cached ? "cache" : "live"} (${items.length} items)`);
log(`YouTube: API ok (${items.length} items)`);
all.push(...items);
} catch (e) {
log(`YouTube: API failed (${String(e)}), falling back to RSS`);
const items = await fetchYoutubeViaRss(cfg.youtubeChannelId, 25);
const cacheKey = `youtube:rss:${cfg.youtubeChannelId}:25`;
const { value: items, cached } = await cachedCompute(kv, cacheKey, () =>
fetchYoutubeViaRss(cfg.youtubeChannelId!, 25),
);
log(`YouTube: RSS ${cached ? "cache" : "live"} (${items.length} items)`);
log(`YouTube: RSS ok (${items.length} items)`);
all.push(...items);
}
} else {
const items = await fetchYoutubeViaRss(cfg.youtubeChannelId, 25);
const cacheKey = `youtube:rss:${cfg.youtubeChannelId}:25`;
const { value: items, cached } = await cachedCompute(kv, cacheKey, () =>
fetchYoutubeViaRss(cfg.youtubeChannelId!, 25),
);
log(`YouTube: RSS ${cached ? "cache" : "live"} (${items.length} items)`);
log(`YouTube: RSS ok (${items.length} items)`);
all.push(...items);
}
@@ -76,7 +95,11 @@ async function main() {
log("Podcast: skipped (missing PODCAST_RSS_URL)");
} else {
try {
const items = await fetchPodcastRss(cfg.podcastRssUrl, 50);
const cacheKey = `podcast:rss:${cfg.podcastRssUrl}:50`;
const { value: items, cached } = await cachedCompute(kv, cacheKey, () =>
fetchPodcastRss(cfg.podcastRssUrl!, 50),
);
log(`Podcast: RSS ${cached ? "cache" : "live"} (${items.length} items)`);
log(`Podcast: RSS ok (${items.length} items)`);
all.push(...items);
} catch (e) {
@@ -103,11 +126,17 @@ async function main() {
wordpress = existing?.wordpress || wordpress;
} else {
try {
const wp = await fetchWordpressContent({
baseUrl: cfg.wordpressBaseUrl,
username: cfg.wordpressUsername,
appPassword: cfg.wordpressAppPassword,
});
const cacheKey = `wp:content:${cfg.wordpressBaseUrl}`;
const { value: wp, cached } = await cachedCompute(kv, cacheKey, () =>
fetchWordpressContent({
baseUrl: cfg.wordpressBaseUrl!,
username: cfg.wordpressUsername,
appPassword: cfg.wordpressAppPassword,
}),
);
log(
`WordPress: wp-json ${cached ? "cache" : "live"} (${wp.posts.length} posts, ${wp.pages.length} pages, ${wp.categories.length} categories)`,
);
wordpress = wp;
log(
`WordPress: wp-json ok (${wp.posts.length} posts, ${wp.pages.length} pages, ${wp.categories.length} categories)`,
@@ -119,14 +148,16 @@ async function main() {
}
}
const cache: ContentCache = {
const contentCache: ContentCache = {
generatedAt,
items: dedupe(all),
wordpress,
};
await writeAtomic(outPath, JSON.stringify(cache, null, 2));
log(`Wrote cache: ${outPath} (${cache.items.length} total items)`);
await writeAtomic(outPath, JSON.stringify(contentCache, null, 2));
log(`Wrote cache: ${outPath} (${contentCache.items.length} total items)`);
await kv.close();
}
main().catch((e) => {

View File

@@ -1,25 +1,44 @@
---
import type { WordpressPost } from "../lib/content/types";
import StandardCard from "./StandardCard.astro";
type Props = {
post: WordpressPost;
placement: string;
targetId: string;
};
const { post } = Astro.props;
const { post, placement, targetId } = Astro.props;
function truncate(s: string, n: number) {
if (!s) return "";
const t = s.trim();
const t = (s || "").trim();
if (!t) return "";
if (t.length <= n) return t;
return `${t.slice(0, Math.max(0, n - 1)).trimEnd()}…`;
}
const d = new Date(post.publishedAt);
const dateLabel = Number.isFinite(d.valueOf())
? d.toLocaleDateString(undefined, { year: "numeric", month: "short", day: "numeric" })
: "";
---
<a class="blog-card" href={`/blog/post/${post.slug}`}>
{post.featuredImageUrl ? <img src={post.featuredImageUrl} alt="" loading="lazy" /> : null}
<div class="blog-card-body">
<h3 class="blog-card-title">{post.title}</h3>
<p class="blog-card-excerpt">{truncate(post.excerpt || "", 180)}</p>
</div>
</a>
<StandardCard
href={`/blog/post/${post.slug}`}
title={post.title}
summary={post.excerpt}
imageUrl={post.featuredImageUrl}
dateLabel={dateLabel}
viewsLabel={undefined}
sourceLabel="blog"
isExternal={false}
linkAttrs={{
"data-umami-event": "click",
"data-umami-event-target_id": targetId,
"data-umami-event-placement": placement,
"data-umami-event-target_url": `/blog/post/${post.slug}`,
"data-umami-event-title": truncate(post.title, 160),
"data-umami-event-type": "blog_post",
}}
/>

View File

@@ -10,16 +10,36 @@ const { categories, activeCategorySlug } = Astro.props;
---
<nav class="subnav" aria-label="Blog categories">
<a class={!activeCategorySlug ? "active" : ""} href="/blog">
<a
class={!activeCategorySlug ? "active" : ""}
href="/blog"
data-umami-event="click"
data-umami-event-target_id="blog.subnav.all"
data-umami-event-placement="blog.subnav"
data-umami-event-target_url="/blog"
>
All
</a>
<a class={activeCategorySlug === "__pages" ? "active" : ""} href="/blog/pages">
<a
class={activeCategorySlug === "__pages" ? "active" : ""}
href="/blog/pages"
data-umami-event="click"
data-umami-event-target_id="blog.subnav.pages"
data-umami-event-placement="blog.subnav"
data-umami-event-target_url="/blog/pages"
>
Pages
</a>
{categories.map((c) => (
<a class={activeCategorySlug === c.slug ? "active" : ""} href={`/blog/category/${c.slug}`}>
<a
class={activeCategorySlug === c.slug ? "active" : ""}
href={`/blog/category/${c.slug}`}
data-umami-event="click"
data-umami-event-target_id={`blog.subnav.category.${c.slug}`}
data-umami-event-placement="blog.subnav"
data-umami-event-target_url={`/blog/category/${c.slug}`}
>
{c.name}
</a>
))}
</nav>

View File

@@ -1,5 +1,6 @@
---
import type { ContentItem } from "../lib/content/types";
import StandardCard from "./StandardCard.astro";
type Props = {
item: ContentItem;
@@ -7,6 +8,12 @@ type Props = {
};
const { item, placement } = Astro.props;
function truncate(s: string, n: number) {
const t = (s || "").trim();
if (t.length <= n) return t;
return `${t.slice(0, Math.max(0, n - 1)).trimEnd()}…`;
}
const d = new Date(item.publishedAt);
const dateLabel = Number.isFinite(d.valueOf())
? d.toLocaleDateString(undefined, { year: "numeric", month: "short", day: "numeric" })
@@ -19,40 +26,35 @@ try {
} catch {
domain = "";
}
const umamiType =
item.source === "youtube"
? "video"
: item.source === "podcast"
? "podcast_episode"
: undefined;
const umamiTitle = umamiType ? truncate(item.title, 160) : undefined;
---
<a
class="card"
<StandardCard
href={item.url}
target="_blank"
rel="noopener noreferrer"
data-umami-event="outbound_click"
data-umami-event-target_id={targetId}
data-umami-event-placement={placement}
data-umami-event-target_url={item.url}
data-umami-event-domain={domain || "unknown"}
data-umami-event-source={item.source}
data-umami-event-ui_placement="content_card"
>
<div class="card-media">
{
item.thumbnailUrl ? (
<img src={item.thumbnailUrl} alt="" loading="lazy" />
) : (
<div class="card-placeholder" />
)
}
</div>
<div class="card-body">
<div class="card-meta">
<span class={`pill pill-${item.source}`}>{item.source}</span>
{dateLabel ? <span class="muted">{dateLabel}</span> : null}
{
item.metrics?.views !== undefined ? (
<span class="muted">{item.metrics.views.toLocaleString()} views</span>
) : null
}
</div>
<h3 class="card-title">{item.title}</h3>
</div>
</a>
title={item.title}
summary={item.summary}
imageUrl={item.thumbnailUrl}
dateLabel={dateLabel}
viewsLabel={item.metrics?.views !== undefined ? `${item.metrics.views.toLocaleString()} views` : undefined}
sourceLabel={item.source}
isExternal={true}
linkAttrs={{
"data-umami-event": "outbound_click",
"data-umami-event-target_id": targetId,
"data-umami-event-placement": placement,
"data-umami-event-target_url": item.url,
"data-umami-event-title": umamiTitle,
"data-umami-event-type": umamiType,
"data-umami-event-domain": domain || "unknown",
"data-umami-event-source": item.source,
"data-umami-event-ui_placement": "content_card",
}}
/>

View File

@@ -0,0 +1,63 @@
---
type Props = {
href: string;
title: string;
summary?: string;
imageUrl?: string;
dateLabel?: string;
viewsLabel?: string;
sourceLabel: string;
isExternal?: boolean;
linkAttrs?: Record<string, any>;
};
const {
href,
title,
summary,
imageUrl,
dateLabel,
viewsLabel,
sourceLabel,
isExternal,
linkAttrs,
} = Astro.props;
function truncate(s: string, n: number) {
const t = (s || "").trim();
if (!t) return "";
if (t.length <= n) return t;
// ASCII ellipsis to avoid encoding issues in generated HTML.
return `${t.slice(0, Math.max(0, n - 3)).trimEnd()}...`;
}
const summaryText = truncate(summary || "", 180);
---
<a
class="card"
href={href}
target={isExternal ? "_blank" : undefined}
rel={isExternal ? "noopener noreferrer" : undefined}
{...(linkAttrs || {})}
>
<div class="card-media">
{imageUrl ? <img src={imageUrl} alt="" loading="lazy" /> : <div class="card-placeholder" />}
</div>
<div class="card-body">
<div class="card-content">
<h3 class="card-title">{title}</h3>
{summaryText ? <p class="card-summary">{summaryText}</p> : null}
</div>
<div class="card-footer">
<span class="muted card-date">{dateLabel || ""}</span>
<span class="muted card-views" aria-hidden={viewsLabel ? undefined : "true"}>
{viewsLabel || ""}
</span>
<span class={`pill pill-${sourceLabel}`}>{sourceLabel}</span>
</div>
</div>
</a>

View File

@@ -40,6 +40,14 @@ const canonicalUrl = `${siteUrl}${canonicalPath.startsWith("/") ? canonicalPath
<link rel="icon" type="image/svg+xml" href="/favicon.svg" />
<link rel="icon" href="/favicon.ico" />
<!-- Display-friendly font (swap to avoid blocking render). -->
<link rel="preconnect" href="https://fonts.googleapis.com" />
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin />
<link
href="https://fonts.googleapis.com/css2?family=Manrope:wght@400;600;800&display=swap"
rel="stylesheet"
/>
<link rel="stylesheet" href="/styles/global.css" />
{
@@ -47,8 +55,27 @@ const canonicalUrl = `${siteUrl}${canonicalPath.startsWith("/") ? canonicalPath
<script async defer data-website-id={cfg.umami.websiteId} src={cfg.umami.scriptUrl} />
) : null
}
{
// Register SW only in production builds (Astro sets import.meta.env.PROD at build time).
import.meta.env.PROD ? (
<script is:inline>
{`
if ("serviceWorker" in navigator) {
// SW requires HTTPS (or localhost). In prod we expect HTTPS.
window.addEventListener("load", () => {
navigator.serviceWorker.register("/sw.js", { scope: "/" }).catch(() => {
// noop: SW is progressive enhancement
});
});
}
`}
</script>
) : null
}
</head>
<body>
<a class="skip-link" href="#main-content">Skip to content</a>
<header class="site-header">
<a
class="brand"
@@ -60,7 +87,18 @@ const canonicalUrl = `${siteUrl}${canonicalPath.startsWith("/") ? canonicalPath
>
SanthoshJ
</a>
<nav class="nav">
<button
class="nav-toggle"
type="button"
aria-controls="primary-nav"
aria-expanded="false"
aria-label="Open menu"
data-nav-toggle
>
<span class="nav-toggle-icon" aria-hidden="true"></span>
</button>
<nav class="nav" id="primary-nav" data-open="false">
<a
href="/videos"
data-umami-event="click"
@@ -88,24 +126,83 @@ const canonicalUrl = `${siteUrl}${canonicalPath.startsWith("/") ? canonicalPath
>
Blog
</a>
<a
href="/about"
data-umami-event="click"
data-umami-event-target_id="nav.about"
data-umami-event-placement="nav"
data-umami-event-target_url="/about"
>
About
</a>
</nav>
</header>
<main class="container">
<main class="container" id="main-content">
<slot />
</main>
<footer class="site-footer">
<p class="muted">© {new Date().getFullYear()} SanthoshJ</p>
<p class="muted">&copy; {new Date().getFullYear()} SanthoshJ</p>
</footer>
<script is:inline>
(() => {
const toggle = document.querySelector("[data-nav-toggle]");
if (!toggle) return;
const controlsId = toggle.getAttribute("aria-controls");
if (!controlsId) return;
const panel = document.getElementById(controlsId);
if (!panel) return;
const mql = window.matchMedia("(max-width: 760px)");
const setOpen = (open) => {
toggle.setAttribute("aria-expanded", open ? "true" : "false");
toggle.setAttribute("aria-label", open ? "Close menu" : "Open menu");
panel.dataset.open = open ? "true" : "false";
// Only hide/disable the nav panel in mobile mode. On desktop, nav is always visible/clickable.
if (mql.matches) {
panel.setAttribute("aria-hidden", open ? "false" : "true");
if (open) panel.removeAttribute("inert");
else panel.setAttribute("inert", "");
} else {
panel.removeAttribute("aria-hidden");
panel.removeAttribute("inert");
}
if (!open) toggle.focus({ preventScroll: true });
};
// Default state: closed on mobile, open on desktop.
setOpen(!mql.matches);
toggle.addEventListener("click", () => {
const isOpen = toggle.getAttribute("aria-expanded") === "true";
setOpen(!isOpen);
if (!isOpen) {
const firstLink = panel.querySelector("a");
if (firstLink) firstLink.focus({ preventScroll: true });
}
});
document.addEventListener("keydown", (e) => {
if (e.key !== "Escape") return;
if (toggle.getAttribute("aria-expanded") !== "true") return;
setOpen(false);
});
panel.addEventListener("click", (e) => {
const t = e.target;
if (t && t.closest && t.closest("a")) setOpen(false);
});
document.addEventListener("click", (e) => {
if (toggle.getAttribute("aria-expanded") !== "true") return;
const t = e.target;
if (!t || !t.closest) return;
if (t.closest("[data-nav-toggle]")) return;
if (t.closest("#" + CSS.escape(controlsId))) return;
setOpen(false);
});
// If viewport changes, keep desktop usable and default mobile to closed.
mql.addEventListener("change", () => setOpen(!mql.matches));
})();
</script>
</body>
</html>

site/src/lib/cache/index.ts

@@ -0,0 +1,28 @@
import type { CacheLogFn, CacheStore } from "./redis-cache";
import {
createRedisCache,
resolveDefaultTtlSecondsFromEnv,
resolveRedisUrlFromEnv,
} from "./redis-cache";
import { createNoopCache } from "./noop-cache";
export async function createCacheFromEnv(
env: NodeJS.ProcessEnv,
opts?: { namespace?: string; log?: CacheLogFn },
): Promise<CacheStore> {
const url = resolveRedisUrlFromEnv(env);
if (!url) return createNoopCache(opts?.log);
try {
return await createRedisCache({
url,
defaultTtlSeconds: resolveDefaultTtlSecondsFromEnv(env),
namespace: opts?.namespace,
log: opts?.log,
});
} catch (e) {
opts?.log?.(`cache: disabled (redis connect failed: ${String(e)})`);
return createNoopCache(opts?.log);
}
}

site/src/lib/cache/memoize.ts

@@ -0,0 +1,16 @@
import type { CacheStore } from "./redis-cache";
export async function cachedCompute<T>(
cache: CacheStore,
key: string,
compute: () => Promise<T>,
ttlSeconds?: number,
): Promise<{ value: T; cached: boolean }> {
const hit = await cache.getJson<T>(key);
if (hit !== undefined) return { value: hit, cached: true };
const value = await compute();
await cache.setJson(key, value, ttlSeconds);
return { value, cached: false };
}
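
`cachedCompute` above is a standard read-through cache: return the cached value on a hit, otherwise compute, store, and return. A minimal sketch of the same flow, with a plain `Map` standing in for the Redis-backed `CacheStore` (names here are illustrative, not part of the site code):

```typescript
// Read-through sketch: Map instead of Redis, no TTL handling.
type SketchStore = Map<string, string>;

async function cachedComputeSketch<T>(
  store: SketchStore,
  key: string,
  compute: () => Promise<T>,
): Promise<{ value: T; cached: boolean }> {
  const raw = store.get(key);
  // Hit: deserialize and report cached=true.
  if (raw !== undefined) return { value: JSON.parse(raw) as T, cached: true };
  // Miss: compute once, store the JSON-serialized result.
  const value = await compute();
  store.set(key, JSON.stringify(value));
  return { value, cached: false };
}
```

Like the real store, this round-trips values through JSON, so only JSON-serializable data should be cached.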

site/src/lib/cache/memory-cache.ts

@@ -0,0 +1,49 @@
import type { CacheStore } from "./redis-cache";
type Entry = { value: string; expiresAt: number };
export function createMemoryCache(defaultTtlSeconds: number): CacheStore {
const store = new Map<string, Entry>();
function nowMs() {
return Date.now();
}
function isExpired(e: Entry) {
return e.expiresAt !== 0 && nowMs() > e.expiresAt;
}
return {
async getJson<T>(key: string) {
const e = store.get(key);
if (!e) return undefined;
if (isExpired(e)) {
store.delete(key);
return undefined;
}
try {
return JSON.parse(e.value) as T;
} catch {
store.delete(key);
return undefined;
}
},
async setJson(key: string, value: unknown, ttlSeconds?: number) {
const ttl = Math.max(1, Math.floor(ttlSeconds ?? defaultTtlSeconds));
store.set(key, {
value: JSON.stringify(value),
expiresAt: nowMs() + ttl * 1000,
});
},
async flush() {
store.clear();
},
async close() {
// no-op
},
};
}

site/src/lib/cache/noop-cache.ts

@@ -0,0 +1,19 @@
import type { CacheLogFn, CacheStore } from "./redis-cache";
export function createNoopCache(log?: CacheLogFn): CacheStore {
return {
async getJson() {
return undefined;
},
async setJson() {
// no-op
},
async flush() {
log?.("cache: noop flush");
},
async close() {
// no-op
},
};
}

site/src/lib/cache/redis-cache.ts

@@ -0,0 +1,92 @@
import { createClient } from "redis";
export type CacheLogFn = (msg: string) => void;
export type CacheStore = {
getJson<T>(key: string): Promise<T | undefined>;
setJson(key: string, value: unknown, ttlSeconds?: number): Promise<void>;
flush(): Promise<void>;
close(): Promise<void>;
};
type RedisCacheOptions = {
url: string;
defaultTtlSeconds: number;
namespace?: string;
log?: CacheLogFn;
};
function nsKey(namespace: string | undefined, key: string) {
return namespace ? `${namespace}:${key}` : key;
}
export async function createRedisCache(opts: RedisCacheOptions): Promise<CacheStore> {
const log = opts.log;
const client = createClient({ url: opts.url });
client.on("error", (err) => {
log?.(`cache: redis error (${String(err)})`);
});
await client.connect();
return {
async getJson<T>(key: string) {
const k = nsKey(opts.namespace, key);
const raw = await client.get(k);
if (raw == null) {
log?.(`cache: miss ${k}`);
return undefined;
}
log?.(`cache: hit ${k}`);
try {
return JSON.parse(raw) as T;
} catch {
// Bad cache entry: treat as miss.
return undefined;
}
},
async setJson(key: string, value: unknown, ttlSeconds?: number) {
const k = nsKey(opts.namespace, key);
const ttl = Math.max(1, Math.floor(ttlSeconds ?? opts.defaultTtlSeconds));
const raw = JSON.stringify(value);
await client.set(k, raw, { EX: ttl });
},
async flush() {
await client.flushDb();
},
async close() {
try {
await client.quit();
} catch {
// ignore
}
},
};
}
export function resolveRedisUrlFromEnv(env: NodeJS.ProcessEnv): string | undefined {
const url = env.CACHE_REDIS_URL;
if (url) return url;
const host = env.CACHE_REDIS_HOST;
const port = env.CACHE_REDIS_PORT;
const db = env.CACHE_REDIS_DB;
if (!host) return undefined;
const p = port ? Number(port) : 6379;
const d = db ? Number(db) : 0;
if (!Number.isFinite(p) || !Number.isFinite(d)) return undefined;
return `redis://${host}:${p}/${d}`;
}
export function resolveDefaultTtlSecondsFromEnv(env: NodeJS.ProcessEnv): number {
const raw = env.CACHE_DEFAULT_TTL_SECONDS;
const n = raw ? Number(raw) : NaN;
if (Number.isFinite(n) && n > 0) return Math.floor(n);
return 3600;
}
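
The two env helpers above are pure string/number handling. Restated inline for illustration (`resolveUrlSketch` is not part of the site code): an explicit `CACHE_REDIS_URL` wins outright; otherwise the URL is assembled from host/port/db, defaulting to port 6379 and db 0, and bailing out on non-numeric values:

```typescript
// Sketch of the CACHE_REDIS_* resolution order used above.
function resolveUrlSketch(env: Record<string, string | undefined>): string | undefined {
  if (env.CACHE_REDIS_URL) return env.CACHE_REDIS_URL; // explicit URL wins
  if (!env.CACHE_REDIS_HOST) return undefined;          // no host, no cache
  const port = env.CACHE_REDIS_PORT ? Number(env.CACHE_REDIS_PORT) : 6379;
  const db = env.CACHE_REDIS_DB ? Number(env.CACHE_REDIS_DB) : 0;
  if (!Number.isFinite(port) || !Number.isFinite(db)) return undefined;
  return `redis://${env.CACHE_REDIS_HOST}:${port}/${db}`;
}
```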


@@ -14,6 +14,11 @@ type IngestConfig = {
wordpressBaseUrl?: string;
wordpressUsername?: string;
wordpressAppPassword?: string;
cacheRedisUrl?: string;
cacheRedisHost?: string;
cacheRedisPort?: number;
cacheRedisDb?: number;
cacheDefaultTtlSeconds?: number;
};
export function getPublicConfig(): PublicConfig {
@@ -37,5 +42,12 @@ export function getIngestConfigFromEnv(env: NodeJS.ProcessEnv): IngestConfig {
wordpressBaseUrl: env.WORDPRESS_BASE_URL,
wordpressUsername: env.WORDPRESS_USERNAME,
wordpressAppPassword: env.WORDPRESS_APP_PASSWORD,
cacheRedisUrl: env.CACHE_REDIS_URL,
cacheRedisHost: env.CACHE_REDIS_HOST,
cacheRedisPort: env.CACHE_REDIS_PORT ? Number(env.CACHE_REDIS_PORT) : undefined,
cacheRedisDb: env.CACHE_REDIS_DB ? Number(env.CACHE_REDIS_DB) : undefined,
cacheDefaultTtlSeconds: env.CACHE_DEFAULT_TTL_SECONDS
? Number(env.CACHE_DEFAULT_TTL_SECONDS)
: undefined,
};
}


@@ -9,6 +9,7 @@ export type ContentItem = {
source: ContentSource;
url: string;
title: string;
summary?: string;
publishedAt: string; // ISO-8601
thumbnailUrl?: string;
metrics?: ContentMetrics;


@@ -8,16 +8,40 @@ export async function fetchPodcastRss(rssUrl: string, limit = 50): Promise<Conte
return normalizePodcastFeedItems(feed.items || [], limit);
}
function stripHtml(s: string) {
return (s || "")
.replace(/<[^>]+>/g, " ")
.replace(/\s+/g, " ")
.trim();
}
function truncate(s: string, n: number) {
const t = stripHtml(s);
if (!t) return "";
if (t.length <= n) return t;
return `${t.slice(0, Math.max(0, n - 1)).trimEnd()}…`;
}
export function normalizePodcastFeedItems(items: any[], limit: number): ContentItem[] {
const out = (items || []).slice(0, limit).map((it) => {
const url = it.link || "";
const id = (it.guid || it.id || url).toString();
const publishedAt = (it.isoDate || it.pubDate || new Date(0).toISOString()).toString();
const summary = truncate(
(it.contentSnippet ||
it.summary ||
it.content ||
it["content:encoded"] ||
it.itunes?.subtitle ||
"").toString(),
240,
);
return {
id,
source: "podcast" as const,
url,
title: (it.title || "").toString(),
summary: summary || undefined,
publishedAt: new Date(publishedAt).toISOString(),
thumbnailUrl: (it.itunes?.image || undefined) as string | undefined,
};

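The same `stripHtml`/`truncate` pair appears in both the podcast and YouTube normalizers: strip tags, collapse whitespace, then cap the summary at 240 characters. A self-contained sketch of that cleanup (the trailing `…` suffix is this sketch's assumption about why the cap reserves one character):

```typescript
// Sketch of the summary-cleaning helpers used by the ingest normalizers.
function stripHtmlSketch(s: string): string {
  return (s || "")
    .replace(/<[^>]+>/g, " ") // drop tags, leaving a space so words don't fuse
    .replace(/\s+/g, " ")     // collapse runs of whitespace
    .trim();
}

function truncateSketch(s: string, n: number): string {
  const t = stripHtmlSketch(s);
  if (!t) return "";
  if (t.length <= n) return t;
  // Reserve one character for the ellipsis so the result stays within n.
  return `${t.slice(0, Math.max(0, n - 1)).trimEnd()}…`;
}
```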

@@ -6,11 +6,26 @@ type YoutubeApiVideo = {
id: string;
url: string;
title: string;
summary?: string;
publishedAt: string;
thumbnailUrl?: string;
views?: number;
};
function stripHtml(s: string) {
return (s || "")
.replace(/<[^>]+>/g, " ")
.replace(/\s+/g, " ")
.trim();
}
function truncate(s: string, n: number) {
const t = stripHtml(s);
if (!t) return "";
if (t.length <= n) return t;
return `${t.slice(0, Math.max(0, n - 1)).trimEnd()}…`;
}
export async function fetchYoutubeViaRss(channelId: string, limit = 20): Promise<ContentItem[]> {
const feedUrl = `https://www.youtube.com/feeds/videos.xml?channel_id=${encodeURIComponent(channelId)}`;
const parser = new Parser();
@@ -32,11 +47,16 @@ export function normalizeYoutubeRssFeedItems(items: any[], limit: number): Conte
const url = it.link || "";
const id = (it.id || url).toString();
const publishedAt = (it.isoDate || it.pubDate || new Date(0).toISOString()).toString();
const summary = truncate(
(it.contentSnippet || it.summary || it.content || it["content:encoded"] || "").toString(),
240,
);
return {
id,
source: "youtube" as const,
url,
title: (it.title || "").toString(),
summary: summary || undefined,
publishedAt: new Date(publishedAt).toISOString(),
thumbnailUrl: (it.enclosure?.url || undefined) as string | undefined,
};
@@ -47,7 +67,12 @@ export function normalizeYoutubeRssFeedItems(items: any[], limit: number): Conte
export function normalizeYoutubeApiVideos(
items: Array<{
id: string;
snippet: { title: string; publishedAt: string; thumbnails?: Record<string, { url: string }> };
snippet: {
title: string;
description?: string;
publishedAt: string;
thumbnails?: Record<string, { url: string }>;
};
statistics?: { viewCount?: string };
}>,
): ContentItem[] {
@@ -55,6 +80,7 @@ export function normalizeYoutubeApiVideos(
id: v.id,
url: `https://www.youtube.com/watch?v=${encodeURIComponent(v.id)}`,
title: v.snippet.title,
summary: v.snippet.description ? truncate(v.snippet.description, 240) : undefined,
publishedAt: new Date(v.snippet.publishedAt).toISOString(),
thumbnailUrl: v.snippet.thumbnails?.high?.url || v.snippet.thumbnails?.default?.url,
views: v.statistics?.viewCount ? Number(v.statistics.viewCount) : undefined,
@@ -65,6 +91,7 @@ export function normalizeYoutubeApiVideos(
source: "youtube",
url: v.url,
title: v.title,
summary: v.summary,
publishedAt: v.publishedAt,
thumbnailUrl: v.thumbnailUrl,
metrics: v.views !== undefined ? { views: v.views } : undefined,


@@ -1,34 +0,0 @@
---
import BaseLayout from "../layouts/BaseLayout.astro";
import CtaLink from "../components/CtaLink.astro";
import { LINKS } from "../lib/links";
---
<BaseLayout
title="About | SanthoshJ"
description="About SanthoshJ and where to follow."
canonicalPath="/about"
>
<section class="section">
<div class="section-header">
<h2>About</h2>
<span class="muted">Tech, streaming, movies, travel</span>
</div>
<div class="empty">
<p style="margin-top: 0;">
This is a lightweight site that aggregates my content so it can be discovered via search and
shared cleanly.
</p>
<div class="cta-row">
<CtaLink platform="youtube" placement="about" url={LINKS.youtubeChannel} label="YouTube" />
<CtaLink
platform="instagram"
placement="about"
url={LINKS.instagramProfile}
label="Instagram"
/>
<CtaLink platform="podcast" placement="about" url={LINKS.podcast} label="Podcast" />
</div>
</div>
</section>
</BaseLayout>


@@ -44,7 +44,11 @@ if (!activeCategory) {
{posts.length > 0 ? (
<div class="blog-grid">
{posts.map((p) => (
<BlogPostCard post={p} />
<BlogPostCard
post={p}
placement={`blog.category.${activeCategory.slug}`}
targetId={`blog.category.${activeCategory.slug}.card.post.${p.slug}`}
/>
))}
</div>
) : (


@@ -25,7 +25,7 @@ const pages = wordpressPages(cache);
{posts.length > 0 ? (
<div class="blog-grid">
{posts.map((p) => (
<BlogPostCard post={p} />
<BlogPostCard post={p} placement="blog.index" targetId={`blog.index.card.post.${p.slug}`} />
))}
</div>
) : (
@@ -37,18 +37,34 @@ const pages = wordpressPages(cache);
<section class="section">
<div class="section-header">
<h2>Pages</h2>
<a class="muted" href="/blog/pages">
<a
class="muted"
href="/blog/pages"
data-umami-event="click"
data-umami-event-target_id="blog.index.pages.browse"
data-umami-event-placement="blog.index.pages_preview"
data-umami-event-target_url="/blog/pages"
>
Browse pages →
</a>
</div>
<div class="empty">
{pages.slice(0, 6).map((p) => (
<div>
<a href={`/blog/page/${p.slug}`}>{p.title}</a>
<a
href={`/blog/page/${p.slug}`}
data-umami-event="click"
data-umami-event-target_id={`blog.index.pages.link.${p.slug}`}
data-umami-event-placement="blog.index.pages_preview"
data-umami-event-target_url={`/blog/page/${p.slug}`}
data-umami-event-title={p.title}
data-umami-event-type="blog_page"
>
{p.title}
</a>
</div>
))}
</div>
</section>
) : null}
</BlogLayout>


@@ -33,7 +33,16 @@ const metaDescription = (page.excerpt || "").slice(0, 160);
<section class="section">
<div class="section-header">
<h2 style="margin: 0;">{page.title}</h2>
<a class="muted" href="/blog">Back →</a>
<a
class="muted"
href="/blog"
data-umami-event="click"
data-umami-event-target_id="blog.page.back"
data-umami-event-placement="blog.page"
data-umami-event-target_url="/blog"
>
Back →
</a>
</div>
{page.featuredImageUrl ? (
<img
@@ -46,4 +55,3 @@ const metaDescription = (page.excerpt || "").slice(0, 160);
<div class="prose" set:html={page.contentHtml} />
</section>
</BlogLayout>


@@ -25,7 +25,17 @@ const pages = wordpressPages(cache);
<div class="empty">
{pages.map((p) => (
<div style="padding: 6px 0;">
<a href={`/blog/page/${p.slug}`}>{p.title}</a>
<a
href={`/blog/page/${p.slug}`}
data-umami-event="click"
data-umami-event-target_id={`blog.pages.link.${p.slug}`}
data-umami-event-placement="blog.pages.list"
data-umami-event-target_url={`/blog/page/${p.slug}`}
data-umami-event-title={p.title}
data-umami-event-type="blog_page"
>
{p.title}
</a>
</div>
))}
</div>
@@ -34,4 +44,3 @@ const pages = wordpressPages(cache);
)}
</section>
</BlogLayout>

Some files were not shown because too many files have changed in this diff.