🎵 The song for this post is Theme from Rawhide, from the Blues Brothers Soundtrack. 🎵
You may have seen this video circulating, a demo of the new Unreal engine:
I love how it's unofficially-but-obviously Tomb Raider. I played the first two games in the reboot series (which led to me seeing the movie lol) and I remember being extremely impressed with how beautiful the game was. A lot of hay was made over how selling 3.4 million copies of a $60 game in the first 4 weeks was "disappointing", and the cost of the engine was a big part of that. I wonder what weird contortions AAA will "require" as we enter this next phase in game engines.
I feel pretty blessed that I took the Graphics course in my CS degree. We didn't go super deep (we barely touched shader programming, for example) but writing a raytracer and learning what makes lighting challenging has been a delightful bone for my brain to chew on when watching something like this.
On that last note: youth is wasted on the young. I'm never "wishing I could go back," but it's funny how much information you can gather (that follows you forever!) in a single semester class, and how much harder it is when you're not a full-time student receiving dedicated instruction. So many times I've thought "this is when I'll dig down and write a Minix" or "I'll study machine learning techniques so I can make it practical in my projects" and I still fail to get the depth of any of these higher-level classes. I might still! But it is so much harder lol.
"Should be intuitive!"
Nick Heer agrees with Brent Simmons on the idea that "the ideal iPhone app first-run experience is none at all," that it should just be intuitive. There's even Steve Jobs quotes ("iPhone apps should be so easy to use that they don’t need Help.") so you Take It Seriously.
Reading in good faith and honoring pedantry: they don't say anything too challenging. They explicitly specify "iOS apps from the App Store," which is a pretty broad base of explicitly consumer products, and Nick says many "may not truly reach the point where users must not need help, but they ought to be designed with that goal in mind." So what I say next doesn't strictly apply to these two, but:
This thinking drives me absolutely bonkers. No tool, habit, or product that's truly useful takes zero training to learn. I dare you to tell me that I should have dropped this because it's not "intuitive" what to do at this very moment:
Think of version control. Think of airline cockpits. If you A/B test an interface until it's "intuitive," you'll get software that's only guaranteed to be good at passing that PM's metrics. It'll be the functionality equivalent of a late Evony ad.
Obviously one shouldn't ship user-hostile software; obviously documentation and user experience are critically important and should be given a lot of love. But abandoning something as soon as it requires literally any thought at all, then mentally demoting that thing because you can't be bothered to think, is exactly how you get smooth-brains running everything.
(also, remember the converse isn't true: if something is hard to learn, that doesn't make it "better" either. this is how you get shitty showboating rituals, and Git)
Would you believe that the page that's gotten the most hits on this blog is literally the third post I wrote on it, 11 years ago? I was so excited to learn Unification and Prolog that I wrote up a post about it, and I think it shows up somewhere in some search results, so it gets a small number of weekly visitors lol.
Logic programming always scratched an itch for me, but I've never been able to dedicate the time I'd like to it. I followed a comment thread and found these old posts from earlier in the decade, on "what killed Prolog," given that it was pretty hot in the 80s. This one ties it into the grander computing narrative of the time; it's really interesting to read a history where there was an AI "arms race" against Asia for who could do generalizable intelligence first, a theme with China today in that often-wrong book I read. The author suggests that once that "arms race" died for both sides because generalizable intelligence is hard, Prolog lost its biggest project, never to recover. A response post piggybacks on to talk more about the programming model itself. I don't fully agree with either, but got a lot from reading them.
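For the curious: unification is the pattern-matching engine at the heart of Prolog, where two terms are made equal by binding variables on either side. Here's a toy sketch in Python (the term representation and function names are my own invention, not any real library's, and it skips the occurs check a real implementation needs):

```python
# Toy unification: a term is a string (Prolog-style, variables start
# uppercase) or a tuple like ("point", "X", "2").
# unify() returns a substitution dict, or None if the terms clash.

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    # Follow variable bindings until we hit a non-variable or an unbound var.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst=None):
    if subst is None:
        subst = {}
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        return {**subst, a: b}
    if is_var(b):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

print(unify(("point", "X", "2"), ("point", "1", "Y")))
# -> {'X': '1', 'Y': '2'}
```

The fun part is that it's bidirectional: neither side is "the pattern" and neither is "the data," which is what lets Prolog run the same relation forwards or backwards.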
I'm not saying the other things most people actually talk about don't matter or don't get worked into the narrative, just that it's the most consistent thing I can find when considering "why are people using this?" Hit me with counterexamples; the main one I can think of is Rust, where people really did want the features (a performant, memory-safe language that could viably replace C). They wanted it so badly some of them called Go a "systems language" for a hot 6 months before using it for long enough to be like "oh lol, it's got garbage collection." Go is also a bit of a weird case; I'd chalk that one up to hundreds of millions of dollars of marketing/evangelism investment.
Anyway, "this is the year we move to OCaml" is my "this is the year of the Linux desktop!" 😄
I'm still cooking. The novelty has worn off a bit but I'm still finding it peaceful. I was hungover this morning, feeling antsy and lonely, but feeling more like a person after putting on some tunes and making eggs, beans, and coffee.
Last week I made a risotto. That… was a lot of stirring! I made one back when I lived in SF; this one seemed to take longer, and while it doesn't look Obviously Appetizing (most of my food is some flavorful slop), I'm really happy with how it came out. I'm condemned to make more too, since we have a ton of arborio rice and bouillon cubes.
Thanks for the read! Disagreed? Violent agreement!? Feel free to drop me a line at , or leave a comment below! I'd love to hear from you 😄