Choosing the tech stack

Every tool was chosen deliberately — here's the reasoning

By Roger Rajaratnam · 30 March 2026
Part of the series How this blog was built — twenty posts on every decision that shaped this site.

Before writing a single line of code for this blog, I spent some time thinking about the stack. It is a personal site — nobody is paying me to make good architectural decisions here — but that is exactly when it is worth being honest about what you actually need rather than defaulting to whatever you already know.

The requirements were simple: write posts in Markdown, serve fast static HTML, add the odd interactive feature without shipping a full JavaScript runtime, and host it somewhere cheap with minimal ongoing maintenance. No CMS, no database, no server.

Here is what I landed on, and the thinking behind each choice.

Tech

The technical side covers the framework, the language, the build pipeline, the hosting, and the services that handle email.

TypeScript

Before going into each tool: one choice runs through all of them. Everything here is TypeScript. For a solo project where there are no code reviews to catch mistakes, having the compiler surface errors early is genuinely useful — types flow through the codebase, refactoring is safe, and mistakes that would otherwise show up at runtime get caught at build time instead.

All the tools below have first-class TypeScript support, which made it a non-decision.
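As a toy illustration of what the compiler buys you — the interface and helper below are hypothetical, not this site's actual code — a typed frontmatter shape means a function can rely on its fields existing:

```typescript
// Hypothetical frontmatter type; mirrors the kind of shape the blog's
// content schema produces, but is not the real definition.
interface PostFrontmatter {
  title: string;
  pubDate: Date;
  tags: string[];
}

// Derive a URL slug from the title. The compiler guarantees `title`
// exists and is a string — a typo like `post.titel` fails the build.
function slugify(post: PostFrontmatter): string {
  return post.title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/(^-|-$)/g, "");
}

const post: PostFrontmatter = {
  title: "Choosing the tech stack",
  pubDate: new Date("2026-03-30"),
  tags: ["astro", "typescript"],
};

console.log(slugify(post)); // "choosing-the-tech-stack"
```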

Astro

Astro describes itself as “the web framework for content-driven websites” — which is a fair summary. You write .astro components with a template syntax that sits close to HTML, with a TypeScript frontmatter block at the top for any logic. The output is static HTML by default. JavaScript only reaches the browser when you explicitly include a <script> block or use a client directive.

That trade-off suited this project well. Most of the pages here are articles. They do not need a JavaScript runtime — they need to load quickly and be readable.
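The anatomy of a component looks roughly like this — a minimal sketch, with an illustrative `title` prop rather than anything from this site:

```astro
---
// Frontmatter: TypeScript that runs at build time and never ships to the browser.
const { title } = Astro.props;
const year = new Date().getFullYear();
---
<article>
  <h1>{title}</h1>
  <footer>© {year}</footer>
</article>
```

Everything above the `---` fence is evaluated during the build; everything below is the template. The rendered output is plain HTML.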

Content collections

Content collections were the feature that made Astro the obvious choice. You define a collection in src/content.config.ts, give it a Zod schema, and point it at a directory of Markdown files. Every piece of frontmatter is then typed, validated at build time, and queryable via getCollection().

src/content.config.ts
import { defineCollection, z } from "astro:content";
import { glob } from "astro/loaders";

const posts = defineCollection({
  loader: glob({ pattern: ["*.md", "**/*.md"], base: "./collections/posts" }),
  schema: ({ image }) =>
    z.object({
      title: z.string(),
      pubDate: z.coerce.date(),
      tags: z.array(z.string()),
      draft: z.boolean().default(false),
    }),
});

export const collections = { posts };

A missing title or malformed pubDate fails the build rather than silently producing a broken page. That is the right behaviour for a content site.

Zero JS by default

The features I wanted — syntax-highlighted code blocks, a dark/light toggle, a scroll-tracked table of contents, Mermaid diagrams — do not require a framework runtime. In Astro, each is a <script> block in a component, bundled at build time and only shipped when the component is used. The Mermaid library is the heaviest dependency on the page, and even that only loads on posts that actually contain a diagram.
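A sketch of the pattern — a hypothetical theme toggle, not this site's actual implementation:

```astro
<button id="theme-toggle" aria-label="Toggle theme">🌙</button>

<script>
  // Bundled at build time; shipped only on pages that render this component.
  const btn = document.getElementById("theme-toggle");
  btn?.addEventListener("click", () => {
    const next =
      document.documentElement.dataset.theme === "dark" ? "light" : "dark";
    document.documentElement.dataset.theme = next;
    localStorage.setItem("theme", next);
  });
</script>
```

Astro processes and bundles the `<script>` contents; pages without the component carry none of it.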

The build pipeline

Plugging in Expressive Code for syntax highlighting, a custom remark plugin for Mermaid, and emoji support was a few lines in astro.config.mjs:

astro.config.mjs
import { defineConfig } from "astro/config";
import expressiveCode from "astro-expressive-code";
import emoji from "remark-emoji";
// Custom remark plugin for Mermaid (path assumed for illustration).
import remarkMermaid from "./plugins/remark-mermaid.mjs";

export default defineConfig({
  site: "https://sourcier.uk",
  integrations: [expressiveCode({ /* ... */ })],
  markdown: {
    remarkPlugins: [remarkMermaid, [emoji, { emoticon: true, accessible: true }]],
    syntaxHighlight: false,
  },
});

syntaxHighlight: false disables Astro’s built-in Shiki so it doesn’t conflict with Expressive Code, which runs its own highlighting pipeline.

Remark Emoji

Posts on this blog support emoji shortcodes — writing :rocket: in Markdown produces 🚀, and emoticons like :-) are converted too. This is handled by remark-emoji, a remark plugin that runs as part of the Astro markdown pipeline.

It is a small thing, but it means emoji work consistently across editors and operating systems without relying on platform-specific input methods. The shortcode syntax is also easier to read in raw Markdown than pasting a Unicode character directly.

The plugin takes two options worth knowing about:

[emoji, { emoticon: true, accessible: true }]

emoticon: true enables the ASCII emoticon conversion. accessible: true wraps each emoji in a <span> with a role="img" and aria-label, so screen readers announce them rather than reading out the raw Unicode character name.
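Based on that behaviour, the accessible output for a shortcode looks roughly like this (a sketch, not the plugin's verbatim markup):

```html
<!-- :rocket: with accessible: true becomes, approximately: -->
<span role="img" aria-label="rocket emoji">🚀</span>
```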

Netlify

The blog has two requirements that go beyond static files: a comment system and a mailing list. Both need server-side logic. Netlify handles this through serverless functions — Node.js handlers in netlify/functions/ that run without a provisioned server. The comment approval flow, subscriber handling, and transactional email via Resend all run this way.
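The shape of such a function can be sketched like this — the file name, field names, and validation are illustrative assumptions, not the blog's actual subscriber handler:

```typescript
// Sketch of a Netlify function (e.g. netlify/functions/subscribe.ts).
// Types are declared inline here to keep the example self-contained;
// real code would import them from @netlify/functions.
type NetlifyEvent = { httpMethod: string; body: string | null };
type NetlifyResponse = { statusCode: number; body: string };

export async function handler(event: NetlifyEvent): Promise<NetlifyResponse> {
  if (event.httpMethod !== "POST") {
    return { statusCode: 405, body: "Method Not Allowed" };
  }
  const { email } = JSON.parse(event.body ?? "{}");
  if (typeof email !== "string" || !email.includes("@")) {
    return { statusCode: 400, body: JSON.stringify({ error: "Invalid email" }) };
  }
  // In the real function, this is where the subscriber would be stored
  // and a confirmation email sent via Resend.
  return { statusCode: 200, body: JSON.stringify({ subscribed: email }) };
}
```

Netlify invokes the exported `handler` per request; no server to provision or patch.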

The rest of what Netlify provides is straightforward: git push triggers a build, the output lands on a CDN, and every pull request gets a preview URL. For a site like this, the free tier covers everything comfortably.

Resend

The mailing list and comment notifications both send transactional email. Resend handles that.

The main alternatives were SendGrid, Postmark, and AWS SES. All of them work, but they each carry some friction — verbose SDKs, legacy dashboard UIs, or IAM configuration in the case of SES. Resend is newer and has been built with developers in mind: the API is simple, the SDK is small, and the free tier (3,000 emails per month) is generous enough for a personal site.

The integration is a few lines in a Netlify function:

import { Resend } from "resend";

const resend = new Resend(process.env.RESEND_API_KEY);

await resend.emails.send({
  from: "Sourcier <hello@sourcier.uk>",
  to: subscriber.email,
  subject: "New post: " + post.title,
  html: emailBody,
});

One thing worth noting: Resend requires a verified sending domain. That means adding DNS records and waiting for propagation, which is a slightly annoying one-time setup. After that it is transparent.

Design

The visual side is handled by three tools: a CSS framework for layout and components, an icon library, and a photography source for cover images.

Bulma CSS

I wanted a CSS framework that would give me a reasonable baseline — grid, spacing, components — without requiring a JavaScript runtime, a PostCSS configuration, or a purge step. Bulma fits that description. It is a pure CSS framework with no JavaScript at all.

I import it once and override what I need in global.scss using CSS custom properties. The visual layer is entirely predictable at build time, which keeps things simple. Bulma’s modifier class convention (is-*, has-*) also composes well with Astro’s scoped component styles — global tokens in one file, component styles in the component.
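The override pattern looks roughly like this — variable names assume Bulma v1's CSS custom properties and the values are placeholders, so check against the version you install:

```scss
// global.scss — import Bulma once, then override its design tokens.
@use "bulma/css/bulma.css";

:root {
  // Bulma v1 exposes its palette and typography as CSS custom properties.
  --bulma-primary-h: 210deg;
  --bulma-family-primary: "Inter", sans-serif;
}
```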

It is not the most fashionable choice in 2026, but it does the job without getting in the way.

Font Awesome

Icons on the site — the theme toggle, social links, the navbar burger, share buttons, and the tech stack grid on the about page — all come from Font Awesome. The free tier covers everything used here.

The integration is through the SVG core package rather than a web font or a <script> tag:

import { faGithub } from "@fortawesome/free-brands-svg-icons";
import { faMoon } from "@fortawesome/free-solid-svg-icons";

Each icon is an array of metadata — dimensions, path data — that a small helper converts to an inline SVG string. That string is then passed to Astro’s set:html directive:

<span set:html={faIcon(faMoon, { size: 18 })} />

The reason for this approach over a web font or CDN-loaded script is that no extra network request is needed and no characters-as-glyphs trick is involved. The SVG paths are tree-shaken at build time — only the icons actually imported end up in the output. On a page that uses three icons, three icons ship.
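The helper itself is small. Below is a guess at one way to write it — `faIcon` is the name used above, the tuple layout matches what the `@fortawesome/*-svg-icons` packages export, but the implementation is a sketch, not the site's actual code:

```typescript
// Minimal shape of a Font Awesome icon definition:
// icon = [width, height, ligatures, unicode, svgPathData]
interface IconDefinition {
  iconName: string;
  icon: [number, number, string[], string, string | string[]];
}

// Convert an icon definition to an inline SVG string for use with set:html.
function faIcon(def: IconDefinition, opts: { size?: number } = {}): string {
  const [width, height, , , path] = def.icon;
  const paths = Array.isArray(path) ? path : [path];
  const size = opts.size ?? 16;
  return (
    `<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 ${width} ${height}" ` +
    `width="${size}" height="${size}" aria-hidden="true">` +
    paths.map((d) => `<path fill="currentColor" d="${d}"/>`).join("") +
    `</svg>`
  );
}
```

`fill="currentColor"` makes the icon inherit the surrounding text colour, which is what lets the same icon work in both themes.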

The one downside is that it is slightly more verbose than dropping in an <i> tag. For a component-based setup that is a reasonable trade.

Unsplash

Photography on this blog comes from Unsplash — a library of freely licensed photography. The licence allows use in commercial and non-commercial projects without attribution, though I include credits anyway as a matter of courtesy to the photographers.

The practical reason for Unsplash over commissioning or sourcing images elsewhere is straightforward: the selection is large, the quality is high, and there is no licensing friction. For a personal blog, that is the right trade.

Testing

Testing is a deliberate omission from this stack description — not because it isn’t important, but because it deserves its own treatment. A content-driven Astro site has different testing concerns to a standard web application: build-time validation through Zod schemas catches a category of errors before they reach production, but there is still meaningful ground to cover around unit testing utilities, integration testing serverless functions, and end-to-end testing the rendered output.

That will be the subject of a dedicated series. This series focuses on building the thing — the testing series will focus on verifying it.

AI

AI is also absent from this stack description — and for the same reason: it deserves its own series, not a footnote.

The tooling is moving fast enough that anything specific I write today would be out of date within months. What I can say is that AI-assisted development is not going away, and treating it as a gimmick at this point is a choice with real consequences. A dedicated series on how to work with AI effectively — in code review, in architecture, in the day-to-day mechanics of building software — is coming.

What I do want to say here is this: the rise of AI makes human judgement more important, not less. The engineers who will get the most out of these tools are the ones who already understand what good looks like — who can read generated code and spot the subtle wrong, who know when an abstraction is heading somewhere problematic, who understand the trade-offs well enough to push back when the tool confidently picks the wrong one. AI amplifies whatever understanding you bring to it. It does not replace the need to actually understand.

That’s part of why this blog exists. The skills worth preserving aren’t the mechanical ones — those are exactly what AI is good at. The skills worth preserving are the ones that require experience to develop: knowing what to build, knowing why, and knowing when not to.

When this stack falls short

For a project with complex client-side state, a real-time feed, or a heavily interactive UI, this stack would be the wrong choice. Astro is not set up for that, and Netlify Functions are not a substitute for a proper backend. Those projects are better served by a framework with client-side routing and a dedicated API layer.

But for a blog, it is a good fit. The build is fast, the output is simple, and there is very little to maintain. That is roughly what I was after.

The rest of this series goes into each part in more detail — starting with typed content collections and working through everything from dark mode to the comment system.

Need help choosing your stack?

If you’re at the early stages of a project and want a second opinion on the architecture — or you’ve already built something and want a review — I’m available for consulting.

Get in touch via the contact page and tell me what you’re working on.
