Johan wrote this
@ 2023-05-31


I’ve been the sole developer of Bidders Highway for something like a year and a half now. It’s been rewarding in many ways, if kind of lonely at times, and I suppose the distillation of it feels like “my skills and experience having coalesced into something that probably not everyone can do.” Good for introspection, not great for getting better at the craft. Velocity is off the charts and has been since day one, but my personal development (and, I suppose, career ‒ I’m at home in front of my screen all day rather than shaking hands and making plans) has been put on the back burner for a while. Maybe that sounds strange? If I’m doing all the development, wouldn’t I be getting a whole lot of hands-on practice?

Know yourself

Well, I get to set my own goals for basically every feature and I’m really big on “this is good enough until we know it isn’t” which means I’ll rarely wander into how-the-fuck-do-I-solve-this territory. I’ll mostly stick to what I like: get features out with a minimum of fuss and a decent entourage of tests, and fix them in post if something is up. There’s nobody there to shake their head and tell me to do better while I’m building, or argue details.

Still, I’m happy with my tradeoffs and abstractions thus far. I’m sure I’d find lots of things to hurriedly fix if somebody came to audit the code as part of a big ol’ Due Diligence, but I’ve always favored readability and simplicity over formal proofs and data-driven tooling, and that shows in the codebase. There’s a linter and a vulnerability scanner in there and that’s basically it.

I honestly never ever think about my test coverage, hit rate on database indexes, or potential nil-return values on my functions. Why would I lose sleep over that stuff? They’re all secondary properties of bigger things like Performance or Error Rate or Development Speed. If an exception pops up on AppSignal, I’ll track it down and fix it. If something is slow, I’ll find ways to make it faster. These events are rare. I focus on the big-picture shit instead: am I shipping stuff that will make a difference? Do I have confidence in my test suite? Does the site feel fast?

Know your enemy

…which brings us to third-party analytics. I cram approximately 55kB of JavaScript and 10kB of CSS into the end-user’s device today. Not bad, not great. If I could replace Turbo and Stimulus with something more lightweight, I bet I could get the JS for the entire site down below 10kB, same as the CSS. It’s totally feasible. I’m doing deferred execution, so it’s less of a problem than it might otherwise have been, but I care deeply about frontend performance.

So, as you can imagine, Google Tag Manager’s absolutely atrocious TWO HUNDRED AND FIFTY SIX KILOBYTES over-the-wire of assorted tracking scripts makes me want to tear my gorgeous sandy-blonde hair out. And why am I taking that performance hit again? Oh, right: I’m doing it in order to let Silicon Valley track and monetize what people are doing on a site I built from scratch. Not a great feeling. I’ve long toyed with the idea of just nuking it. 99% of the stats are useless to us, anyway. As I see it, we care about two things:

  1. How many users are visiting the site, and
  2. Where are they coming from?
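Both questions can, in principle, be answered straight from server logs. Here’s a minimal sketch in Ruby, assuming the standard nginx/Apache “combined” log format — the field layout and function name are my own illustration, not anything from the Bidders Highway codebase:

```ruby
require "set"
require "uri"

# Pull the client IP and the referrer out of a "combined"-format access
# log line: the IP is the first field, the referrer is the quoted string
# right after the status code and body size.
LOG_LINE = /\A(?<ip>\S+) .* "(?:GET|POST|HEAD) [^"]*" \d+ \d+ "(?<referrer>[^"]*)"/

def summarize_traffic(lines)
  visitors  = Set.new
  referrers = Hash.new(0)

  lines.each do |line|
    match = LOG_LINE.match(line) or next
    visitors << match[:ip]
    ref = match[:referrer]
    referrers[ref == "-" ? "(direct)" : URI(ref).host] += 1
  end

  { unique_visitors: visitors.size, referrers: referrers }
end
```

Feed it a day’s worth of log lines and you get a unique-visitor count and a referrer tally — which covers both items on the list above, no third-party script required. (Counting IPs undercounts people behind NAT and overcounts people on the move, but at this traffic volume the numbers are coarse anyway.)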

Know your customer

I imagine developing a site with millions of daily visitors rather than high single-thousands is different: you can actually test things like “how did moving this button affect our signup rate” by looking at the data. But for us, a weekly 10% increase in signups could be anything. Somebody mentioned us on a podcast, somebody posted a popular link on Facebook, the weather was bad so people stayed inside. The numbers are coarse, and so are the effects.

The ratio of site visitors to newsletter and user sign-ups is fairly stable, at least. That means my only metrics of interest are how many users came to the site (often as a result of us paying money to some ad-monopoly shitbag) and how many accounts or bookmarks were created. Plausible easily takes care of counting visitors using 1.52kB of JS, and I’ve connected those stats to Blazer to graph other things that may be of interest, like age distribution or time to first bid. So why am I letting megacorps strip-mine every shred of dignity from my users?
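A number like “time to first bid” boils down to a per-user delta between two timestamps. A rough Ruby sketch of the aggregation — the data shape here is invented for illustration; in the real app this would just be a SQL query wired into Blazer:

```ruby
# Median seconds from signup to first bid. Users who never bid are
# excluded rather than counted as infinity. The input shape is made up:
# { user_id => { signed_up_at: Time, first_bid_at: Time or nil } }
def median_time_to_first_bid(users)
  deltas = users.values
                .select { |u| u[:first_bid_at] }
                .map    { |u| u[:first_bid_at] - u[:signed_up_at] }
                .sort
  return nil if deltas.empty?

  mid = deltas.size / 2
  deltas.size.odd? ? deltas[mid] : (deltas[mid - 1] + deltas[mid]) / 2.0
end
```

The median rather than the mean is deliberate: a handful of users who come back to bid weeks after signing up would otherwise swamp the number.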

The shakedown

The two things that ultimately make me hold off on removing the script are:

  1. Big Tech will definitely punish us by withholding traffic or making our ads more expensive or some combination thereof, and
  2. The people we’re graciously paying to run our digital ads get their hands tied and can’t improve our CAC with retargeting and the like.

If it were all up to me, I would run a scrupulously moral business. I resent being extorted by these robber barons. There would be no tracking outside of server logs. Ads on DuckDuckGo, independent networks, and print rather than blood money disappearing wholesale into Meta and Google. No hosting on AWS. I want to make the web and the world better, not be a cog in the ever-grinding wheels of oligopoly! I would probably run us right into the fucking ground, but at least it would be a beautiful failure.

We’ll see about the Google Analytics script, though. I may still remove that, just to see what happens.