The Shape of Everything
A website mostly about Mac stuff, written by Gus Mueller
Converting Acorn Images on the Command-Line

Alex Chan shows how he used AppleScript to batch convert a bunch of Acorn images to PNG.

You could also use Retrobatch to do this pretty easily, but I thought it was a great example of using the tools already at your disposal to get something done.

Ben Lacy on IG

Charlie Hunter (an amazing guitarist you should check out) posted this video of Ben Lacy playing Everybody Wants to Rule the World. If you're a guitar player, you should watch and take a listen.

As a guitar player I can see and hear everything that he's doing, but combined it makes no sense to my brain. A single guitar player should not be able to get this much sound and rhythm out of their instrument. Yet, here it is.

There's a handful of other Ben Lacy videos like this scattered around the internet, which are well worth listening to. And Charlie Hunter has a bunch of amazing clips up on Instagram as well.

Pilkington on AppKit

Martin Pilkington: Appreciating AppKit, Part 1:

“AppKit is Apple's UI framework for building apps for the Mac. It has existed in one form or another for around 30 years and is the basis for many of the concepts and features of UIKit on iOS. Understandably, given its age, it has quite a few quirks and dated features. Some can simply be ignored, such as drawers. Others are still core to how parts of AppKit function, such as NSCell. These features can make AppKit seem daunting and difficult to work with, especially for those who have only known UIKit.

“However, the upside of AppKit's age is that it has an incredibly rich feature set. If you get over the initial hurdle you find a framework far more powerful than UIKit has ever been. Indeed it is this power that has played a significant role in allowing small teams to build the sorts of apps the Mac has been celebrated for.”

Hockenberry on Marzipan

Craig Hockenberry writing for the Iconfactory: What to Expect From Marzipan

“What I’m going to focus on today is how this new technology will affect product development, design, and marketing. I see many folks who think this transition will be easy: my experience tells me that it will be more difficult than it appears at first glance.”

This is a great post by Craig. I have thoughts on Marzipan as well, but they’ll have to wait until after WWDC. I really want to see firsthand what Apple has been doing the past year in this area.

Ever Seen a JPEG Up This Close Before?

Omar Shehata: Unraveling the JPEG

“JPEG images are everywhere in our digital lives, but behind the veil of familiarity lie algorithms that remove details that are imperceptible to the human eye. This produces the highest visual quality with the smallest file size—but what does that look like? Let's see what our eyes can't see!”

This is a pretty amazing post going over what it takes to encode (and thus decode) a JPEG image. The best part is you can view the data of the sample images as numbers in a big text box, which you can change. And then the JPEG updates in realtime to show your changes.
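The detail-removal step the post visualizes is the 8×8 discrete cosine transform: JPEG transforms each block into frequency coefficients, then quantizes away the high-frequency ones your eye won't miss. Here's a naive sketch of that transform in Python (the standard DCT-II math, not code from the article):

```python
import math

def dct2(block):
    """Naive 2D DCT-II on an 8x8 block, as used in JPEG encoding."""
    N = 8
    out = [[0.0] * N for _ in range(N)]
    for u in range(N):
        for v in range(N):
            s = 0.0
            for x in range(N):
                for y in range(N):
                    s += (block[x][y]
                          * math.cos((2 * x + 1) * u * math.pi / (2 * N))
                          * math.cos((2 * y + 1) * v * math.pi / (2 * N)))
            cu = math.sqrt(1 / N) if u == 0 else math.sqrt(2 / N)
            cv = math.sqrt(1 / N) if v == 0 else math.sqrt(2 / N)
            out[u][v] = cu * cv * s
    return out

# A flat gray block collapses to a single DC coefficient; everything
# else is (near) zero, which is why smooth areas compress so well.
flat = [[128] * 8 for _ in range(8)]
coeffs = dct2(flat)
```

That's the intuition behind the big editable number grids in the post: most of those numbers are zeros for smooth image regions.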

April 16, 2019

Wasmer is a Python library for executing WebAssembly binaries.

There is a toy program in examples/, written in Rust (or any other language that compiles to WebAssembly):

pub extern fn sum(x: i32, y: i32) -> i32 {
    x + y
}
After compilation to WebAssembly, the examples/simple.wasm binary file is generated.

Then, we can execute it in Python:

from wasmer import Instance

wasm_bytes = open('simple.wasm', 'rb').read()
instance = Instance(wasm_bytes)
result = instance.exports.sum(5, 37)

print(result) # 42!

I know a lot of devs are ignoring WebAssembly, but I think it's very cool and might be pretty awesome for making desktop-like applications run in a browser. I know if I were making a desktop-only vector illustration tool, I'd really be getting worried about it.

And so much interesting work is being done server-side with it; it's like a mini JVM designed with security in mind. LLVM 8 even has a target for WebAssembly now!

March 21, 2019

Retrobatch 1.2 is going to be out sometime soon, but I'm having a little public beta first. If you're interested in trying it out, you can grab it from the latest builds page.

There's a handful of new features and nodes, including:

  • Making animated GIF and PNG files
  • Turning workflows into droplets
  • New "Round Corner", "Image Grid", and "Limit" nodes
  • New Write node option for writing back to the original image
  • And a new JavaScript Plugin API, where you can write your own nodes using JavaScript and Cocoa APIs (Retrobatch Pro only).

I'm calling the new JavaScript Plugin API a beta for now, as I've been refining it while developing. It seems like every couple of days someone will write in with a request that's a little odd and very specific, and then I think, "can this be done with a plugin?" Those questions are perfect for developing this API. I don't want to create 100 built-in nodes that only one or two customers will ever use, but if I can make a one-off plugin while also making the API a little better… well that's awesome for everyone.

But it's incredibly useful as it is right now, so I want to put it in customers' hands sooner rather than later. (It's also using the new JavaScript/Cocoa bridge I've been working on, FMJS).

Limited documentation is available online for the new API, but if you're interested in how things are really put together you'll want to check out the samples.

And of course if you have questions or bugs, send us an email:

January 25, 2019

Zeplin Gazette: Automate notarizing MacOS apps

"With the release of macOS Mojave, Apple introduced a notary service to validate macOS apps that are not distributed through App Store. Although this process is currently optional, in a blog post published past October, Apple announced that Gatekeeper will require software to be notarized in an upcoming release."

I finally went through all the steps yesterday to get dev builds of Acorn notarized. I'm glad I held off on doing this till now, because previously Apple's notarization servers could take a long time, up to an hour, to process things. I think yesterday the average wait time was about 3-4 minutes, which made the code/validate/fix cycle much faster than it would have been.

Acorn has a number of executables and frameworks in its app package, which made things a bit more complicated. And I was intent on making the notarization process an automated part of my build scripts. If your app is less complicated, it'll probably take less time to figure out what needs to be done for your app.

And automating the notarization steps wasn't as bad as I thought it would be. I wrote a Python script which calls out to the notarization tools with the response format set to XML (which comes back as a plist, and was then easily fed into Foundation.NSDictionary). The script analyzes the current state of things and either waits a while before querying the notarization servers again to see if it's done, or staples on the notarization bits if it is. Then a new build of Acorn makes its way to the internet.
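The heart of that script is just parsing the plist that altool hands back and branching on the status. A rough sketch of the parsing step, assuming output like `xcrun altool --notarization-info <uuid> --output-format xml` produces (this is my own illustrative Python, using plistlib in place of NSDictionary):

```python
import plistlib

def notarization_status(plist_bytes):
    """Pull the status string out of altool's --output-format xml response."""
    info = plistlib.loads(plist_bytes)
    return info.get("notarization-info", {}).get("Status", "unknown")

# Example response, trimmed down to the field we care about:
sample = b"""<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
 "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <key>notarization-info</key>
  <dict>
    <key>Status</key>
    <string>in progress</string>
  </dict>
</dict>
</plist>"""

status = notarization_status(sample)
# "in progress" -> sleep and poll again; "success" -> run `xcrun stapler staple`
```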

Notarization adds a few minutes to the build time, but maybe it'll be worth it? I don't like having to depend on Apple's servers to put something up on mine. But if notarization prevents those " is an app downloaded from the Internet. Are you sure you want to open it?" boxes from scaring customers unnecessarily, it will be worth the hassle. I can hope at any rate.

Acorn OS Stats

I managed to ship a bug affecting folks making 16bpc images on 10.11, and I idly wondered what the percentage of folks using Acorn on 10.11 was. The answer is a little above 2%, which is lower than I thought it would be. Folks on 10.14 are a little above 75%, which also surprised me. Almost 20% for 10.13, and a little above 3% for 10.12.

January 15, 2019

This is a long post, for developers and folks who are code curious.

Acorn 6.3 was a reasonably fun release to work on. There was nothing huge in it, but it did include some nice updates to existing capabilities along with a few new features. Here's what I think is worth writing about.

A New Brushing Engine

I had originally planned to release a new brush engine in 6.2, but writing it ended up being a bit trickier than I expected. I got something together for 6.3 though, and it's certainly faster than the previous engine.

Wait, what's a brush engine? Well in Acorn, it's the machinery which takes input from your mouse/trackpad/stylus and converts it into scribbles for your image. There is a lot of code that exists to make this possible. The engine needs to think about the size of the brush and opacity (which can change depending on how much pressure you're putting on a stylus), as well as the "flow" of the ink/paint you're putting down. There are also little shortcuts such as holding the shift key between clicks to draw straight lines, and holding down the option+command keys to turn your brush into an eraser. And let's not forget about rotation of the brush while following a path, blend modes, smoothing, scatter, and getting the spacing right between all the dabs in a brush stroke.
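One of those details, spacing dabs evenly along a stroke, can be sketched like this (a simplified model I wrote for illustration, not Acorn's actual code):

```python
def dab_positions(points, spacing):
    """Walk a polyline of input samples and emit dab centers every `spacing` units.

    The carry tracks distance already covered toward the next dab, so
    spacing stays even across segment boundaries.
    """
    dabs = []
    carry = 0.0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        seg = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        if seg == 0:
            continue
        d = spacing - carry
        while d <= seg:
            t = d / seg
            dabs.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
            d += spacing
        carry = (carry + seg) % spacing
    return dabs

# A straight 10-unit stroke with spacing 2 gets five evenly spaced dabs.
dabs = dab_positions([(0, 0), (10, 0)], 2.0)
```

Real engines also interpolate pressure, rotation, and opacity between samples, but the arc-length walk is the core of it.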

Then of course, you get to make sure that all happens without any slowdowns in the UI. Is the brushing keeping up with what you're attempting to draw? Acorn has a goal of 120 FPS for most brushes (when running on a Late 2015 iMac), so it needs to be speedy.

When starting out with this rewrite, I didn't want to just modify the existing engine. I've already been doing this for a number of years and I felt that it was time to mostly start over. So instead of adding new classes into Acorn I created a test app to develop in. And when the engine was pretty close to being feature complete, I could drop it into Acorn and continue working on it from there.

The first thing I wanted to do was have all the drawing happen in a background thread. From profiling I knew that Acorn spent a lot of time drawing pixels to the right locations, and while that was happening, input about where to draw was frequently missed. So the new engine takes input at around 480Hz on the main thread (also known as the UI thread), and a background thread does the actual calculations and drawing to your layer. Then at a minimum of 1/90th of a second, the engine will grab the bitmap and throw it up to the screen (composited with any other layers and effects you might have). Depending on how much work is going on I've seen Quartz Debug reporting around 300fps on the previously mentioned iMac. Things were good, but YMMV depending on the brush's settings. Ironically, smaller brush sizes require a lot more work for Acorn and can tend to be slower.
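The main-thread/background-thread split boils down to a producer/consumer queue. Here's a toy Python stand-in for the real Cocoa machinery (the 480Hz and 1/90s numbers come from the post; everything else is illustrative):

```python
import queue
import threading

input_points = queue.Queue()  # the main (UI) thread pushes stylus samples here
drawn = []

def draw_worker():
    """Background thread: drain input and do the expensive pixel work."""
    while True:
        pt = input_points.get()
        if pt is None:  # sentinel: the stroke is finished
            break
        drawn.append(pt)  # stand-in for "render a dab into the layer bitmap"

t = threading.Thread(target=draw_worker)
t.start()
for sample in [(0, 0), (1, 2), (3, 5)]:  # pretend these arrive at ~480Hz
    input_points.put(sample)
input_points.put(None)
t.join()
# Separately, a timer would composite the layer bitmap to the screen
# at least every 1/90th of a second.
```

The point of the split: the UI thread only ever enqueues cheap samples, so input is never missed while pixels are being pushed around.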

The actual bitmap blitting is done with Core Image, and I had to invent a few tricks in order to make it keep up with the input. Acorn can't just tell Core Image to 'draw this dot', have it do that, and then say 'ok, now draw this dot'. Well, you can, and that's what Acorn used to do (in the Acorn 4.x timeframe, when it would perform reasonably well with everything done on the CPU!). But these days most everything is done on the GPU, so you need to batch your drawing together and send it all up to the GPU in reasonably sized chunks. And the amount of drawing stacked up in those batches also depends on how fast and furious the input from the user is. Pressure can build up on the input side as well as the output side, and there are little mechanisms in Acorn to balance it all out. This was pretty fun to write, including dealing with the requisite threading issues.
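The batching idea itself is simple to sketch (again a toy model of mine, not Acorn's code): accumulate dabs and hand them off in chunks, where a real implementation would issue one Core Image draw per batch.

```python
class DabBatcher:
    """Accumulate dabs and flush them in chunks, instead of drawing one at a time."""

    def __init__(self, batch_size, flush_target):
        self.batch_size = batch_size
        self.flush_target = flush_target  # stand-in for "one GPU draw call"
        self.pending = []

    def add(self, dab):
        self.pending.append(dab)
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self):
        if self.pending:
            self.flush_target(list(self.pending))
            self.pending.clear()

batches = []
b = DabBatcher(batch_size=3, flush_target=batches.append)
for i in range(7):
    b.add(i)
b.flush()  # end of stroke: push out whatever is left
```

Tuning `batch_size` (or flushing on a deadline) is exactly the input/output pressure balancing described above.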

The new brush engine also added a number of other little features which were present in the test app, but which I haven't exposed yet in Acorn. Doing some would require some UI changes that I wasn't ready to commit to for this update, and I wasn't 100% happy with them quite yet.

Oh, and then of course there was compatibility. Acorn 6.3 was developed on 10.14, but it supports MacOS back to 10.11. My tricks for batching things together in Core Image didn't work so well on 10.11 and 10.12, and actually exposed a number of bugs in Apple's framework. So for folks running on older MacOS releases, they would get the previous engine.

I also added some new widgets to the brush palette, but there's still more things that I want to change and add (which will happen eventually). But real artists ship, so out the door it went.

iOS 12 Portrait Matte Support and Layer Mask Features

Apple added a new feature to its latest iPhones in the iOS 12 update called "Portrait Matte". It's a special image embedded in HEIC images which is based off the depth data and some machine learning in your photo. You can then use this image as a mask to blur parts of your image (which is what the iOS "Portrait" camera setting does), or you can use this data to remove backgrounds.

But how should Acorn expose this matte? My first stab was to have Acorn add the matte as an additional layer. After playing with it a bit, it just felt off. So I ended up adding the matte as a mask to the main layer when opening the image. But folks are obviously going to want to do more than just mask out the background, so I added new features to Acorn where you can easily drag and drop the layer mask into its own layer. I also made it easy to move an existing layer to another layer's mask via drag and drop. I can't predict what people are going to want to do with the mask, but I might as well make it easy to move around.
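The matte itself is just a grayscale image, so using it to cut out a background reduces to multiplying it into the alpha channel. A toy version of that idea (plain Python lists standing in for real bitmaps; not Acorn's implementation):

```python
def apply_matte(pixels, matte):
    """Use a grayscale matte (0-255) as the alpha channel of an RGB image.

    pixels: rows of (r, g, b) tuples; matte: rows of 0-255 coverage values.
    Returns rows of (r, g, b, a) tuples.
    """
    return [
        [(r, g, b, a) for (r, g, b), a in zip(prow, mrow)]
        for prow, mrow in zip(pixels, matte)
    ]

img = [[(10, 20, 30), (40, 50, 60)]]
matte = [[255, 0]]  # subject fully opaque, background fully masked out
out = apply_matte(img, matte)
```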

It was also during this development that I found some bugs in Apple's My Photo Stream. The matte was showing up rotated incorrectly when opening images out of Photos. At first I figured I was just reading the data wrong, but nope: under certain conditions when images with the portrait matte were uploaded to MPS, the rotation data from the camera went missing. After some communication and a Radar filed at Apple, this bug was fixed in an OS update. Bug fixes like this don't happen very often, but when they do it makes filing all the other Radars worth it. Mostly.

Getting an image with the Portrait Matte onto your Mac is a bit tricky (iOS really wants to flatten effects before sending an image out). What's worked best for me is taking a selfie, and then using Image Capture to transfer over the HEIC. Why a selfie? Well, the depth data for the back-facing cameras is obviously there somewhere in the image (because you can modify it), but it's not showing up as a portrait matte for Acorn. Investigations are ongoing as to why this is.

8bit PNGs / Indexed PNGs

Recently I purchased a commercial license for PNGQuant for use in the next update to Retrobatch. It's a great utility which can take a PNG image, reduce the number of colors used, and add dithering to make it look pretty close to the original. The benefit of this is that you get much smaller file sizes. "8 bit PNGs" has been a feature request for Acorn for years, so I threw it in 6.3 as well.

Combine this with Acorn's (already built in) use of PNGCrush, and you get some pretty small PNG file sizes.

PNGQuant and PNGCrush are usually used as command line tools, but both come with source, which meant that I could compile the sources into Acorn's binary and not have to call out to a tool. This is what I had been doing with PNGCrush: compile the source into Acorn and call a pngmain() function with arguments pointing at a temp file. It worked, but it was kind of a hack, and I didn't have any way to cancel an in-progress action.

So for 6.3, I decided to pull the PNGCrush code out into a separate tool, and include PNGQuant as a tool as well. I could call them via NSTask, and since each was a separate process, I could terminate them if settings were changed in Acorn's web export.
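That spawn-and-kill pattern is the whole reason for moving to separate processes. Here's a minimal Python/subprocess stand-in for what NSTask is doing (a long-sleeping Python process plays the part of pngquant so the example is self-contained):

```python
import subprocess
import sys

def start_tool(argv):
    """Launch a helper tool (e.g. pngquant) as a separate, killable process."""
    return subprocess.Popen(argv)

# Stand-in for a long-running compression job:
proc = start_tool([sys.executable, "-c", "import time; time.sleep(60)"])
proc.terminate()  # user changed an export setting: abandon this run
proc.wait()
cancelled = proc.returncode != 0  # killed processes don't exit cleanly
```

With the in-process pngmain() hack there was nothing to terminate; a child process gives you that for free.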

This worked pretty well, until I tried to submit Acorn to the App Store. I had completely forgotten about signing the binaries and giving them the right entitlements for acceptance to the MAS.

There were a bunch of things I had to do in order to get it all working, plus some really dumb mistakes on my part, but Timo Perfitt wrote a post just last week which almost exactly mirrors what I was going through. Go read his post if you're curious about mach-o headers and adding Info.plist files to executables.


Better PDF Rendering

Acorn 6.3 introduced a new way of rendering PDFs, where vector data and bitmap data are layered together in a PDF. Previously when making a PDF, Acorn would flatten the image and insert the bitmap into a PDF (the one exception was if your image was 100% vector layers, in which case the resulting PDF would be all vector). Well of course I screwed this up a little bit, so that sometimes your bitmap layers would be drawn much larger than they appeared on screen. And this would happen for PDF export as well as when printing. Why when printing? Because printing on MacOS is pretty much "here's a PDF, send it off to the printer". Acorn of course uses the same PDF routines when exporting and printing.

But this bug was fixed for 6.3.1, along with some curves improvements I mentioned in a previous post on this site.

Pay via Stripe on the Flying Meat Store

This isn't really an Acorn thing, but I also reworked our online store so that it now accepts Apple Pay. I had moved to Stripe last year for collecting payments, but adding Apple Pay support required a little bit more work on my part. I put that off until this past week, and it's up and running now.

Previously I was using PayPal Web Payments Pro, which was pretty nice ten years ago. But compared to what Stripe offers these days, as well as the ease of development… well, it was really no contest. Stripe is awesome.

The End

Hey, that's a lot of writing. Now I know why I don't do postmortems more often.