Building Millrace.ca Redux: A Flatstock Experiment
2026-02-11
We are redoing millrace.ca. The previous site was a successful experiment in deploying a new site using an AI agent (and containing an AI agent as its main interface), but I wanted something more substantial—a place for people to understand what millrace stands for and what guides us. Thoughts on technology and business, and a showcase for our projects.
I'm a big fan of the aesthetic of block.xyz—clean, simple, content-focused. I also have a strong philosophy about data ownership which I call "Flatstock": data should live in flat files, accessible and readable by humans and machines, not locked away in proprietary databases.
So, I decided to pair program with an AI agent to build this "Redux" version.
The Stack: Integrity in the File System
We chose Next.js for this build. The previous iteration was a Vite app, but for a content-heavy site that needs good SEO and server-side rendering, Next.js felt like the right move, especially for handling markdown content at build time.
The core of the site is "Flatstock-lite"—a simple data access layer that reads Markdown files with YAML frontmatter from the file system. No database, no API calls for content. Just files.
```js
// The "Joinery" of our data layer
export function getContentBySlug(type, slug) {
  // Read file, parse frontmatter, return content
}
```
It feels incredibly robust. The content is the record.
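A minimal sketch of what that reader might look like. Everything here is illustrative, not the actual site code: the `content/<type>/<slug>.md` layout is assumed, and the hand-rolled frontmatter parser stands in for a real library like gray-matter.

```typescript
import fs from "node:fs";
import path from "node:path";

// Minimal frontmatter parser for "---\nkey: value\n---\nbody" files.
// Only handles flat string values; a real site would use a YAML library.
function parseFrontmatter(raw: string): { data: Record<string, string>; content: string } {
  const match = raw.match(/^---\n([\s\S]*?)\n---\n?/);
  if (!match) return { data: {}, content: raw };
  const data: Record<string, string> = {};
  for (const line of match[1].split("\n")) {
    const idx = line.indexOf(":");
    if (idx > 0) data[line.slice(0, idx).trim()] = line.slice(idx + 1).trim();
  }
  return { data, content: raw.slice(match[0].length) };
}

export function getContentBySlug(type: string, slug: string) {
  // Assumed layout: content/<type>/<slug>.md
  const raw = fs.readFileSync(path.join("content", type, `${slug}.md`), "utf8");
  return parseFrontmatter(raw);
}
```

Because this runs at build time in Next.js, a missing file fails the build instead of a user's request, which is part of what makes the flat-file approach feel robust.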
The AI Pair Programmer
Working with the AI has been... enlightening. It handles the boilerplate effortlessly. We had a hiccup right at the start where `create-next-app` failed silently, so we had to manually scaffold the project. It was actually refreshing to build the `package.json` and directory structure by hand (well, by AI hand) to see exactly what went into it.
The RAG Saga: "Too Many Requests"
The most interesting part of this build was the decision to replace standard site search with a RAG-based chat interface. I wanted users to be able to ask questions about me and my projects and get answers grounded in the content of the site.
We decided to use Google's Gemini models since I already had an API key.
However, we hit one roadblock after another with model availability.
- Started with `gemini-pro`. 404 Not Found.
- Tried `gemini-pro-latest`. Hit a rate limit immediately ("Free Tier limit: 0"), which was confusing because I have a Gemini Advanced subscription. Turns out, the API billing is separate from the consumer subscription.
- Tried `gemini-1.5-pro`. 404.
- Tried `gemini-2.0-flash` (preview). Rate limited again.
- Finally, we landed on `gemini-flash-latest`. It works, it's fast, and it seems to be available on my current tier.
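Much of this trial and error could have been avoided by asking the API what it actually offers. The Gemini REST API has a ListModels endpoint (`GET https://generativelanguage.googleapis.com/v1beta/models?key=...`) whose entries report which generation methods they support. A small helper to filter that response (the `ModelInfo` shape below mirrors the endpoint's fields; the helper itself is my own sketch):

```typescript
// Shape of one entry in the ListModels response (abridged).
interface ModelInfo {
  name: string;
  supportedGenerationMethods: string[];
}

// Keep only models that can actually serve chat-style generateContent calls.
function usableModels(models: ModelInfo[]): string[] {
  return models
    .filter((m) => m.supportedGenerationMethods.includes("generateContent"))
    .map((m) => m.name);
}
```

Listing first, then picking from what your key's tier can see, turns the 404 guessing game into a lookup.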
It was a good reminder that "AI" isn't a monolith. You have to navigate models, versions, quotas, and regional availability just like any other infrastructure.
Design: The Canal Lock System
With the infrastructure in place, we turned to the visual identity. We moved away from the "personal blog" feel to a proper product studio aesthetic.
We implemented the Canal Lock System:
- Palette: Deep Water Blue (`#1B4965`), Mill Stone Grey (`#62676E`), and Canal Lock Black (`#2C3531`).
- Typography: Inter for clean, geometric readability.
- The Raceway Glyph: A custom geometric mark representing the flow of water (and ideas) through a constrained channel to generate power.
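In practice a palette like this lands in the stylesheet as CSS custom properties, so the names travel with the hex values. A sketch (the variable names are mine, not from the actual site):

```css
:root {
  --deep-water-blue: #1B4965;
  --mill-stone-grey: #62676E;
  --canal-lock-black: #2C3531;
}
```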
The Terminal: Bitchat Redux
For the chat interface, we ditched the standard "chatbot bubble" for something more visceral. Inspired by the "Bitchat" aesthetic, we built a Terminal Search interface.
- It's a modal that looks like a command line (`user@millrace:~`).
- It supports commands like `help` and `clear`.
- It acts as the RAG interface: you type natural-language questions, and it queries the vector store to give you answers based on our content.
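The retrieval behind those answers is simple when the store is a flat JSON file: embed the question, score every stored chunk by cosine similarity, take the top few. A sketch under assumptions of my own (a `{ text, embedding }` record shape; the real ingest and store layout may differ):

```typescript
// Assumed record shape in vector-store.json.
interface Chunk {
  text: string;
  embedding: number[];
}

// Cosine similarity between two equal-length vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k chunks most similar to the query embedding.
function topK(query: number[], chunks: Chunk[], k: number): Chunk[] {
  return [...chunks]
    .sort((x, y) => cosineSimilarity(query, y.embedding) - cosineSimilarity(query, x.embedding))
    .slice(0, k);
}
```

A brute-force scan like this is plenty for a personal site; you only need a real vector database once the store outgrows what fits comfortably in one JSON file.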
Automating Flatstock with GitHub Actions
The final piece of the puzzle was deployment. Since our "database" is just a local JSON file generated from markdown, we needed a way to keep it in sync without running scripts manually on every edit.
We set up a GitHub Action that watches the content/ directory.
- You edit a Markdown file on GitHub.
- The Action spins up and runs the `ingest.ts` script using Gemini.
- It commits the updated `vector-store.json` back to the repo.
- Vercel detects the commit and redeploys the site.
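The steps above can be sketched as a workflow file. This is an illustrative reconstruction, not the actual workflow: the script path, secret name, and commit step are assumptions.

```yaml
name: Rebuild vector store
on:
  push:
    paths:
      - "content/**"

jobs:
  ingest:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      - run: npm ci
      # Re-embed the content; script path and secret name are assumptions.
      - run: npx tsx scripts/ingest.ts
        env:
          GEMINI_API_KEY: ${{ secrets.GEMINI_API_KEY }}
      - name: Commit updated vector store
        run: |
          git config user.name "github-actions[bot]"
          git config user.email "github-actions[bot]@users.noreply.github.com"
          git add vector-store.json
          git diff --cached --quiet || git commit -m "chore: rebuild vector store"
          git push
```

The `git diff --cached --quiet ||` guard skips the commit when ingestion produced no changes, which avoids an empty-commit loop when only prose formatting changed.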
The result is a "serverless" CMS where the content is just text, the database is just a file, and the AI keeps everything connected automatically.
What's Next
The site is live (locally). The chat interface works—it's vector-searching my thoughts and projects and answering questions.
I'm writing this very post as a Markdown file in the new system. The cycle is complete.