Each of these cycles has been larger and lasted longer than the last, and I want to be clear: each cycle has produced genuinely useful technology. It’s just that each follows the progress of a sigmoid curve that everyone mistakes for an exponential one.
There is an initial burst of rapid improvement, followed by gradual improvement, followed by a plateau. Initial promises imply or even state outright: “if we pour more {compute, RAM, training data, money} into this, we’ll get improvements forever!” The reality is that these strategies inevitably hit a limit, usually one that does not take long to find.
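That confusion is easy to reproduce numerically: early on, a logistic (sigmoid) curve and a pure exponential are nearly indistinguishable, and only later does the ceiling show. A toy sketch, with `r` and `capacity` chosen purely for illustration:

```python
import math

def exponential(t, r=1.0):
    # Pure exponential growth: no ceiling, ever.
    return math.exp(r * t)

def logistic(t, r=1.0, capacity=1000.0):
    # Logistic curve: grows exponentially at first, then
    # plateaus at `capacity`. Starts at the same value (1.0)
    # as the exponential so the early segments overlap.
    return capacity / (1.0 + (capacity - 1.0) * math.exp(-r * t))

# Early on, the two curves are nearly identical...
for t in range(4):
    e, s = exponential(t), logistic(t)
    print(f"t={t}: exp={e:8.2f}  logistic={s:8.2f}")

# ...but extrapolating the early segment as an exponential
# badly overshoots once the plateau kicks in.
for t in (10, 20):
    e, s = exponential(t), logistic(t)
    print(f"t={t}: exp={e:12.1f}  logistic={s:8.1f}")
```

At t=2 the two values differ by under one percent; by t=20 the logistic has flattened just under its capacity while the exponential has run off by five orders of magnitude. If all the data you have comes from the early burst, there is no way to tell which curve you are on.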
Tomorrow Corporation tech demo 👏👏👏, thx James. A striking reminder of what’s possible for a dev team in control of its “tech stack”, and of how much control we sacrifice when we layer frameworks upon dependencies upon toolkits that look like a free lunch, but come at a hefty {weight/performance, flexibility, maintenance} cost.
Washed Out: The Hardest Part, “the first official commissioned music video collaboration between a music artist and filmmaker made with OpenAI’s Sora video model”.
500 mile email, “a curated list of absurd software bug stories”.
AI
LLMs are like a trained circus bear that can make you porridge in your kitchen. It’s a miracle that it’s able to do it at all, but watch out because no matter how well they can act like a human on some tasks, they’re still a wild animal. They might ransack your kitchen, and they could kill you, accidentally or intentionally! — Alex Komoroske, via Simon Willison.