Tokens are payroll in disguise
Be respectful with the tokens you use; the money behind them could have been paid as salaries to people who were laid off, people who now can't find work - and someday that could be you.
This is what I think now and then when I use LLMs to boost my productivity* these days.
For context, I recently moved on from TrueLayer, the Open Banking pioneer I helped grow from its Series B round until it became a unicorn (= shipped hard to production, to brag properly 😉). The new company (which I can't disclose publicly yet for security/compliance reasons) is very BIG on AI.
In just one week of onboarding, I've spoken to more AI agents than to humans in the flesh. There isn't an AI agent we don't have access to, with near-unlimited usage. In that single week I've learned more - about standard procedures and existing codebases - than would normally take weeks, if not months.
However, for all the joy of being productive, I can’t help thinking the tokens are sacred and must not be wasted. I feel guilty when I submit poor instructions and realise it in hindsight - those tokens translate into real cost, money that could have paid the salaries of people in this industry who were laid off, are about to be laid off, or may never be hired because of this surge in productivity.
So here's another day of being mindful about wasted tokens.
*A note about my 10x productivity!
No, I'm not a 10x engineer because of LLMs. I can only move as fast as I can review the code, clean up the slop, handle the edge cases AI still struggles with, and make it look as though it were written by me. That last part is non-negotiable: AI isn't at the point where it can take responsibility for deployed code or the incidents it might cause. It will simply apologise and wait for the next instruction.
Until that day comes, I have to stay on top of every piece of logic, every corner case, every hack and clever trick running in production - dev can burn. The day AI becomes a responsible adult, I'll retire from the industry, if I haven't already by then.