Brittany Goris, Professional Dirtbag, Climbing in Rocklands
My mind went back to a statement from a street performer in Cape Town the previous day: “Check this out, I’m about to be amazing!” At the time I had joked about using his catch phrase for my climbing, but I hadn’t really meant it because this whole time I’d been so focused on the opposite: staying humble, no egos allowed.
What if that was the kind of energy I needed right now though? What if there was something in between my meek humility on this trip and the toxic rage from my youth? The paradigm shifted a little more. Why not try a little reckless egotism? It’s just a form of belief, after all, and didn’t that stranger’s bravado mirror the exact kind of belief I’d once wielded like a weapon? Maybe he had stumbled across the alchemical gold without even realizing it.
“I’m about to be amazing,” a shy voice whispered in my head as I pulled off the ground and the world around me faded away, leaving just the void of a perfect crack stretching out before me into the abyss.
I love reading Goris's posts on climbing. She’s constantly traveling to amazing places and then writes about them in a way that makes me super envious. Why can’t I write like this?
Even if you're not a climber, you'll probably find her writeup engaging. And if you are a climber, you'll want to read it because you'll certainly learn a thing or two.
DeltaDB From Zed (the Code Editor)
From a post on Zed.dev: Introducing DeltaDB: Operation-Level Version Control:
Our vision is to turn your IDE into a collaborative workspace where humans and AI agents work together across a range of time scales, with every insight preserved and linked to the code forever. To make this possible, we're building DeltaDB: a new kind of version control that tracks every operation, not just commits.
DeltaDB uses CRDTs to incrementally record and synchronize changes as they happen. It's designed to interoperate with Git, but its operation-based design supports real-time interactions that aren't supported by Git's snapshots.
That sounds pretty neat, and I'm really interested to see how it goes.
Zed fascinates me for some reason, and I can't quite pin down why. Is it because of the hardware accelerated text rendering? Is it because it's crazy fast? It's written in Rust? Some combo of those and other things? Beats me, but I think Zed is cool.
It's also worth noting that Zed raised $32M in a Series B investment. I had no idea they were funded. In my brain it was a plain old open source product, but it looks like they have some serious funding behind it.
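To make "operation-based" a bit more concrete, here's a toy sketch in Python. This is nothing like DeltaDB's actual design, just the general flavor: every edit is an operation with an ID and a timestamp, and merging two histories is a union of operations replayed in a deterministic order.

# Toy operation log. Real CRDTs are much more involved; this just
# shows the shape of the idea: operations, not snapshots.
def merge(ops_a, ops_b):
    merged = {op[0]: op for op in ops_a + ops_b}  # dedupe by op ID
    # Order by (timestamp, ID) so both sides replay identically.
    return sorted(merged.values(), key=lambda op: (op[1], op[0]))

alice = [("a1", 1, "insert 'fn main()' at line 0"), ("a2", 3, "rename foo -> bar")]
bob = [("a1", 1, "insert 'fn main()' at line 0"), ("b1", 2, "delete line 7")]

# Both sides end up with the same combined history.
for op in merge(alice, bob):
    print(op)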
August 20, 2025
Retrobatch 2.3 has been released. Some of the highlights:
Image Diffing: You can now compare your modified image against the original image using handy new toolbar modes, or keyboard shortcuts.
PDF Rasterizer (Pro only): Convert all vector and text components of a PDF to full page images (handy for folks who want to "bake" their PDFs so text or elements can't be removed to find hidden information).
Send Notification node: Does what it sounds like — send out system notifications for when a workflow starts, ends, or for each image processed.
The Write node now has options to make indexed PNG files. This replaces the dedicated "Indexed PNG" node, so it's all wrapped together in a single node now.
There are little changes and bug fixes of course. The full release notes are available as usual. If you already have Retrobatch 2 installed, use the Retrobatch ▸ Check for Updates… menu item to update to the latest version.
One thing I did, which isn't in the release notes, is give the preview canvas a big upgrade in zooming, panning, and deep color support. I did a lot of refactoring when I was implementing the image diffing tools because I wanted to bring that feature to Acorn as well. So with some smart subclassing and such, I've got a shared Metal-accelerated canvas that works for both Acorn and Retrobatch. This new class is used in quite a few places in the next release of Acorn (Web Export, RAW Image preview, New View windows, etc.), and obviously in Retrobatch's main preview window. It's nice having a fully built-out class that I can just drop in for standard image editing behaviors.
Macrowave is pretty cool
Macrowave: Turn Your Mac Into a Private Radio Station
It's a Mac and iOS app for sending and receiving audio.
I don't have a need for this app, but I saw the UI and instantly wanted to play with it. It's a nice throwback to the MacAmp or Winamp days.
August 1, 2025
Sub Club Podcast: The Past, Present, and Future of Building on Apple.
Another interview with John Gruber about the history of developer relations with Apple.
I thought this was a great episode, and it brought back some memories. I can remember when Apple would call you up and let you know early bird pricing for WWDC was about to go away. How quaint that seems now. (I even went to WWDC on their dime back in the late 90s when I was working for Mizzou).
I still get on well with the folks in Apple Developer relations, and I guess I always have. Maybe because I've been developing on the Mac for so long, and I tend to keep a cool head about things? Of course there's always ups and downs, but I try to remember there are very different divisions in Apple, with different opinions on how things should be. "Apple" isn't really a single entity anymore, at least from my perspective.
August 1, 2025
The Talk Show: Ep. 427 ‘THE SHIFT-2 CROWD’, With Jason Snell:
Jason Snell returns to the show to talk about the early PC platform rivalries of the 1980s, iOS 26 leaks (and Apple suing YouTuber Jon Prosser), the various Apple OS 26 public betas and the state of Liquid Glass, and more.
I thought the discussion on Liquid Glass, especially the state of it on the Mac, was pretty good. Like Gruber and Snell, I worry that the Mac is more of an afterthought when it comes to the new UI.
One can hope that it gets better as the betas come along (as they always do).
Duct Tape and Baling Wire
Tim Wood on Mastodon:
OK, OK, ok, story time.
Way back when (early 90s), when Omni was consulting for McCaw Cellular (or AT&T Wireless, not sure which it was at the time), we were working on apps for NeXTSTEP for sales, customer care, and such for cell phones, nationwide. We'd occasionally get crash reports (I don't even remember how those got back to us in the days before automated collection and reporting), but eventually we were able to reproduce it.
I won't spoil the punchline as there's only two paragraphs remaining, so go treat yourself.
June 14, 2025
Acorn was in Apple's WWDC 2025 keynote! How f'n rad is that?
So a little before WWDC I got a chance to fly down to Cupertino in the dead of night, to secretly adapt Acorn's UI to the new look and feel of Apple's forthcoming Liquid Glass. NDAs were signed and I can't say much—other than it was super quick to adopt for Acorn's main canvas window. Acorn is a mostly Objective-C app¹ with a codebase that's effectively 20 years old now, so that speaks to how relatively smooth this transition should be.
I was super stoked to be able to do it, and though I'm sure there will be the usual bumps along the way (especially for iOS devs), it's great to have a new UI playground to run around in.
You can watch Acorn's 15 seconds on the big screen with this timestamped YouTube link: Apple's WWDC 2025.
¹ As John Gruber pointed out during The Talk Show Live From WWDC 2025, which you should also watch.
June 1, 2025
Mark Alldritt: Script Debugger Retired:
The day has finally come. After 30 years of continuous development, Script Debugger has been retired and will no longer be available for sale. Please see this post for more information.
I just looked up when I made my first purchase of Script Debugger — version 4.5 in 2009 for $199, and it was worth every penny.
I still use Script Debugger to this day.
30 years of development is a long time, and Script Debugger is such a great app. Congrats Mark - you made something awesome.
May 25, 2025
Brent Simmons: My Wildly Incorrect Bias About Corporate Engineers
Before I went to work for Audible (five years ago now — time flies!) I had a bias about engineers that worked for large corporations. I assumed that they weren’t as good as indies and engineers at small companies, or else they’d actually be indies or work at small shops like Omni.
…
And so I learned very quickly when I started at Audible that I was very wrong. I was impressed, and grew more impressed as time went on, by my fellow engineers’ rigor, talent, professionalism, care, and, especially, ability to work with other people toward common goals.
Not long after Brent joined Audible, I clearly remember him telling me that there were really smart people there who were amazing coders. And he was completely shocked by this. The look on Brent's face was just pure amazement, and he was so happy about it!
I laughed and laughed at him. Of course there's talented people there! Only crazy people are willing to put up with having to file business taxes, mess with social security, find healthcare, deal with all the stuff you have to handle to be indie. And you don't even have to be a particularly good programmer. You just have to be persistent.
Sometimes I really miss working with other folks, so I'm especially happy for Brent that his last stop before retirement was working with a great group of people, in a giant-ass company. Everyone should experience that at least once.
And now Brent is retiring! I've known this for a while as well, because he's insanely excited and absolutely won't shut up about it (in a good way). I bet he's even counting down the number of meetings he has left at this point.
Pure freedom to work on what you want, for the rest of your life. And of course he's going to be coding, because Brent is a developer, and I don't think you could stop him anyway.
Congrats, Brent.
Microsoft Edit Is Now Open Source
Christopher Nguyen:
Edit is a new command-line text editor in Windows. Edit is open source, so you can build the code or install the latest version from GitHub!
…
Edit is a small, lightweight text editor. It is less than 250kB, which allows it to keep a small footprint in the Windows 11 image.
I'll probably never use it, but I find Edit delightful. It's a new 64-bit CLI editor, with curses-like menus. It's 2025, and … just look at it!
April 24, 2025
Christian Ekrem: Coding as Craft: Going Back to the Old Gym:
But the core of programming – the thinking, the designing, the architectural decisions – these are the parts I want to preserve as my craft. These are the muscles I want to keep strong by training in the old gym.
…
Just as a chef might use a food processor for tedious prep work but would never dream of automating the creative aspects of recipe development and flavor balancing, we should use AI for what it’s good at while preserving the parts of coding that bring us joy and growth.
I love using AI tools to help me figure out things or to brainstorm, but it's the little highs I get when I solve a good problem that I don't ever want to go away.
April 20, 2025
From The Verge earlier this week:
During Meta’s antitrust trial today, lawyers representing Apple, Google, and Snap each expressed irritation with Meta over the slides it presented on Monday that The Verge found to contain easy-to-remove redactions. Attorneys for both Apple and Snap called the errors “egregious,” with Apple’s representative indicating that it may not be able to trust Meta with its internal information in the future. Google’s attorney also blamed Meta for jeopardizing the search giant’s data with the mistake.
Presumably, the folks from Meta used a PDF editor to draw a black vector box around the sections to be redacted. The problem with this technique is that you can open up that same PDF in an editor and delete the box to see what’s underneath.
The correct solution would have been to rasterize the PDF (which is the process of turning vectors to pixels), so any attempt to remove the redacting box would reveal an empty area. When you rasterize a page, you’re essentially baking the PDF.
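(If you'd rather script the baking yourself, here's a minimal sketch in Python, assuming the third-party PyMuPDF library; the file names are made up.)

import fitz  # PyMuPDF: pip install pymupdf

src = fitz.open("redacted.pdf")
baked = fitz.open()  # a new, empty PDF

for page in src:
    # Render the page to pixels, then write those pixels into a
    # fresh page. The original vectors (and anything hiding under
    # a redaction box) don't survive the trip.
    pix = page.get_pixmap(dpi=150)
    new_page = baked.new_page(width=page.rect.width, height=page.rect.height)
    new_page.insert_image(new_page.rect, pixmap=pix)

baked.save("redacted-baked.pdf")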
As John Gruber notes:
You can properly redact a PDF digitally, but botched digital redactions are so commonplace (and at times disastrous and/or humiliating) that when then Attorney General William Barr released the Mueller Report in 2019, the DOJ printed the unredacted original, did the redactions on paper, and then scanned it back in to create the redacted PDF.
But there’s an easier way: use Retrobatch of course!
The workflow would look something like this:
You read the PDF, split the pages up, add a matte background to each page (which also happens to rasterize it), paste the pages back together with the PDF Maker node, and then write the PDF back out.
You can also control the resolution of the rasterization by using the “Set DPI” node before the page splitter.
I think there are enough specialized PDF tasks that need doing that I should probably make a whole PDF category in Retrobatch, including a standard PDF rasterization node.
April 18, 2025
Gabriel Nicholas at Wired: The Subjective Charms of Objective-C
But the longer I spent writing Objective-C, the more I felt it hid rather than revealed. Long, sentence-like function names buried the most pertinent information under a fog of dependent clauses. Small features required long-winded pull requests, making it easy for engineers to get distracted during reviews and to miss bugs. Objective-C’s excess words, multiplied across thousands of files and millions of lines of code, made for an exhausting codebase.
My own experience with Objective-C has been very different. I wonder if that’s because I work as a solo developer, and the architecture of my apps has always been stable? I always found the early mantra “If it feels hard, you’re probably doing it wrong” when working with AppKit and Objective-C to be more true than not.
Anytime I hit a stumbling block, something like “The Way of the Code Samurai” from Wil Shipley would play through my head. Were people who disliked Objective-C fighting it rather than flowing with it?
To me, Objective-C has always felt expressive and capable, doubly so when I first started using it. After coding in Java for years I felt like I could fly.
Swift is the thing now, and both Acorn and Retrobatch use it for parts. But Swift is a heavy and unsettled language, not to mention extremely slow to compile.
I hope someday we’ll get a version of Swift that isn’t chasing whatever the hot new coding paradigm currently is, and isn’t weighed down by ever expanding complexity. I think that could be pretty nice.
Chris Lattner, the creator of Swift, in an interview:
“Swift, the original idea was factor complexity (…) massively failed, in my opinion (…) Swift has turned into a gigantic, super complicated bag of special cases, special syntax, special stuff”
I wonder, what comes after Swift?
April 17, 2025
Alex Harri: A Flowing WebGL Gradient, Deconstructed:
This effect is written in a WebGL shader using noise functions and some clever math.
In this post, I’ll break it down step by step. You need no prior knowledge of WebGL or shaders — we’ll start by building a mental model for writing shaders and then recreate the effect from scratch.
This was an absolutely wonderful read on constructing a nice looking animated WebGL shader, from the very basics up to the end product.
New to me in this post was the concept of stacking sine waves — what a clever idea.
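Here's the gist as a toy Python version (Harri builds the real thing in GLSL): a few sine waves with mismatched amplitudes, frequencies, and phases sum into a curve that looks far more organic than any one of them alone.

import math

# (amplitude, frequency, phase) triples, values picked arbitrarily.
WAVES = [(1.0, 1.0, 0.0), (0.5, 2.3, 1.7), (0.25, 5.1, 4.2)]

def stacked(x):
    return sum(a * math.sin(f * x + p) for a, f, p in WAVES)

# Crude ASCII plot; the maximum possible amplitude is 1.75.
for i in range(40):
    y = stacked(i * 0.25)
    print(" " * int((y + 1.75) * 10) + "*")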
You might remember Harri’s post “The Engineering behind Figma’s Vector Networks” from back in 2019.
April 14, 2025
Geoffrey Litt: Stevens: a hackable AI assistant using a single SQLite table and a handful of cron jobs
The assistant is called Stevens, named after the butler in the great Ishiguro novel Remains of the Day. Every morning it sends a brief to me and my wife via Telegram, including our calendar schedules for the day, a preview of the weather forecast, any postal mail or packages we’re expected to receive, and any reminders we’ve asked it to keep track of. All written up nice and formally, just like you’d expect from a proper butler.
SQLite, cron, and open APIs. This is the type of hacking that I really dig, and something I've considered putting together myself.
This passage from the end really resonated with me:
I’ve written before about how the endgame for AI-driven personal software isn’t more app silos, it’s small tools operating on a shared pool of context about our lives.
I keep on coming back to the idea that I need to gather more data about myself and store it somewhere easily accessible by custom AI tools. I write a little bit of stuff in Day One, but I keep meaning to build something on top of SQLite as well. Of course, I also keep on hoping Apple would do the same, but they'd probably move too slow to make it interesting.
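A single dumb table would probably be enough to start with. A sketch (my guess at a schema, not what Litt uses):

import sqlite3, time

# One table that anything can append to: shell scripts, cron jobs,
# or an AI tool with a SQLite connection.
conn = sqlite3.connect("memory.db")
conn.execute("""create table if not exists memories (
    ts      integer,  -- unix timestamp
    source  text,     -- who wrote this (journal, cron, agent, ...)
    content text
)""")
conn.execute("insert into memories values (?, ?, ?)",
             (int(time.time()), "journal", "Started sketching a memories db."))
conn.commit()

for row in conn.execute("select * from memories order by ts desc limit 5"):
    print(row)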
If only I could clone myself. I have a ton of ideas but not enough time to implement them all. Such is the beauty of life, I guess.
April 11, 2025
With the release of Acorn 8 last December, I published "ACTN002 Acorn's Native File Format" as part of the documentation updates, which is exactly what it sounds like.
Without going into details (that's what the technote is for), Acorn's file format is a SQLite database, with a simple three-table schema, containing TIFF or PNG bitmaps to represent bitmap layers, and a plist to represent shape layers. Acorn has kept this simple format since version 2.0 back in 2009.
And since the format is a SQLite database, it is incredibly easy for a programmer or anyone else who isn't afraid of Terminal.app to get a composite out of an Acorn file:
echo "select writefile('/tmp/pizza.png', value) from image_attributes where name = 'composite'" | sqlite3 pizza.acorn
That's it. You've now got a PNG copy of the Acorn file "pizza.acorn" written to /tmp/pizza.png.
SQLite is bundled with pretty much everything these days, which means you can write some code in Python, Swift, Objective-C, whatever, and easily support reading Acorn files. Here's an incredibly short Python script to do that:
import sqlite3
import sys

# Usage: python3 thisscript.py input.acorn output.png
conn = sqlite3.connect(sys.argv[1])
cursor = conn.cursor()
# The pre-rendered composite is stored in the image_attributes table.
cursor.execute("select value from image_attributes where name = 'composite'")
result = cursor.fetchone()
with open(sys.argv[2], "wb") as f:
    f.write(result[0])
Note: you should really perform some error checking in actual Python code.
What about in Swift? That's easy too.
This file format has worked well in Acorn for 16 years now, and I plan on keeping it the same moving forward.
March 26, 2025
Last week I bought a 13" MacBook Air in Midnight (24GB memory, 512GB SSD).
I hadn't been planning on buying it. Instead, I was expecting to upgrade my current desktop (M1 Ultra) to an M4 Ultra later this year. Assuming, of course, that we would see an M4 Ultra later this year. But as we know, that didn't happen. (Maybe we still will*?)
At any rate. This machine. A 13" Midnight MacBook Air.
It's beautiful. Have you seen it yet?
I ended up buying one because it's cheap, and I haven't had a travel laptop in a while. This new laptop could also double as a new build server to replace my M1 Mac mini, and I figured someday I'll hand it down to my daughter.
The size is perfect for what I'm after. I can use my 13" iPad Pro as a second display, with the bonus that I now get instant stylus support in Acorn.
And it's so amazingly fast.
I wasn't expecting that last part. So fast.
Its name is Jimi by the way.
What I'm personally interested in is how fast Jimi can build Acorn (~200k lines of code). Sure, the single-threaded performance of the M4 processor will certainly beat my M1 Ultra (named "SRV"), but the Ultra has so much more RAM and so many more CPU cores. How closely will the Air match the performance of my desktop?
With a full build of Acorn, including running hundreds of regression tests, Jimi outperforms my M1 Ultra at 3m21s vs 4m43s. And when purely compiling Acorn, where you'd think the Ultra would have an edge, I get 1m36s (Air) vs 2m05s (Ultra).
I'm sorry, what?
This $1400 machine is beating my $4000 desktop machine with a 20-core CPU, 48-core GPU, and 64GB of memory? What why how?
So I'm pretty happy with this dinky little travel / build machine. It's a joy to hold and fun to use.
* (As an aside, there's a lot of speculation as to what is going on with the M4 Ultra. Does it take a long time to design? Is it just not a priority? Is there something bigger and better coming for both the Studio and Mac Pro? My guess is the last option. The Ultra is awesome, but I feel like it might be time for Apple to make a workstation-specific processor.)
March 14, 2025
There's been a lot flying around the social web the past couple of days about Apple completely botching their AI push, and I haven't seen a whole lot of solutions (I fully admit I could completely be missing it). But off the top of my head, here's one idea that I think could really help and reap benefits for both Apple and developers.
Build a semantic index (SI), and allow apps to access it via permissions, granted similar to how we already do it for Address Book or Photos.
Maybe even make the permissions for the SI a bit more fine-grained than they normally are for other personal databases. Historical GPS locations? Scraping contents of the screen over time? Indexed contents of document folder(s)? Make these options for what goes into the SI.
And of course, the same would be true for building the SI. As a user, I'd love to be able to say "sure, capture what's on the screen and scrape the text out of that, but nope - you better not track where I've been over time".
And similar to the Spotlight indexing API, developers should be able to provide data to the SI along with rich metadata. Rev the Spotlight plugin API so that it can do more, or come up with a new API.
Is this information collected for the SI going to be the most sensitive bucket of bits on your device? Yes, of course it is.
But give developers the opportunity, and then customers will have something to choose from. Make the Mac and iOS the best platform to build personalized LLMs.
Let the apps live and die based on their own merit and reputation. Apple can build the platform, and maybe expand on it over time and use it themselves.
I want to see the apps that are made outside of Cupertino. I want to see what can happen when developers have a solid foundation to build on.
March 10, 2025
As my usage of LLMs has been increasing lately, I find myself more and more frustrated with Siri, specifically on the Mac.
As a Mac user, I have this incredible wealth of GPU and CPU power, which in turn allows me to run LLMs locally.
A few weeks ago, before a trip out of the country for my daughter's spring break, I set up a local instance of DeepSeek and made sure I could connect to it via Tailscale running on my Mac.
Why did I do this? Two reasons.
The first was because I could and there's something inherently cool and fun about running these models locally. It's a joy to play around with this stuff.
The second was a tinge of paranoia. What if I wasn't able to access the models I usually use from out of the country? LLMs are so useful for so many things, I really don't want to lose access now that I know about them. Yes, I could route all requests through my VPN, but … still, what if I couldn't?
So I can run models locally on my M1 Mac, and while it's not as fast as running on Anthropic's or OpenAI's servers, it's still usable. Which is mind-blowing to me. I honestly never expected to see this tech in my lifetime. (Yes, LLMs get a lot wrong, but they also get so many things right and help me out with tedious coding chores.)
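The plumbing for talking to a local model is minimal, too. Here's a sketch assuming a runner like Ollama listening on its default port, with a DeepSeek model already pulled (the model name here is illustrative):

import json, urllib.request

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps({
        "model": "deepseek-r1",  # whatever model you've pulled locally
        "prompt": "Explain CRDTs in two sentences.",
        "stream": False,
    }).encode(),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])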
A week or so ago I was grousing to some friends that Apple needs to open up things on the Mac so other LLMs can step in where Siri is failing. In theory we (developers) could do this today, but I would love to see a blessed system where Apple provided APIs to other LLM providers.
Are there security concerns? Yes, of course there are, there always will be. But I would like the choice.
The crux of the issue in my mind is this: Apple has a lot of good ideas, but they don't have a monopoly on them. I would like some other folks to come in and try their ideas out. I would like things to advance at the pace of the industry, and not Apple's. Maybe with a blessed system in place, Apple could watch and see how people use LLMs and other generative models (instead of giving us Genmoji that look like something Fisher-Price would make). And maybe open up the existing Apple-only models to developers. There are locally installed image processing models that I would love to take advantage of in my apps.
I'm glad I'm not the only one thinking about this. Ben Thompson writes at the end of Apple AI’s Platform Pivot Potential:
This doesn’t necessarily preclude finally getting new Siri to work; the opportunity Apple is pursuing continues to make sense. At the same time, the implication of the company’s differentiation shifting to hardware is that the most important job for Apple’s software is to get out of the way;
This passage isn't the crux of the article, but it really resonated with me, and I hope it does with some folks inside Apple as well.
…
(Update) Manton Reece is thinking along the same lines: Apple's response to AI:
I’m not sure Apple knows what a big risk they are taking by letting OpenAI and others lap them in the AI race. It’s a risk that will pay off if they can execute. Just as likely, though, we are seeing such a disruption in computing that Apple is vulnerable for the first time in a decade.