The Shape of Everything
A website mostly about Mac stuff, written by Gus Mueller
January 25, 2019

Zeplin Gazette: Automate notarizing macOS apps

"With the release of macOS Mojave, Apple introduced a notary service to validate macOS apps that are not distributed through App Store. Although this process is currently optional, in a blog post published past October, Apple announced that Gatekeeper will require software to be notarized in an upcoming release."

I finally went through all the steps yesterday to get dev builds of Acorn notarized. I'm glad I held off on doing this till now, because previously Apple's notarization servers could take a long time- up to an hour to process things. I think yesterday the average wait time was about 3-4 minutes, which made the code/validate/fix cycle much faster than it would have been.

Acorn has a number of executables and frameworks in its app package, which made things a bit more complicated. And I was intent on making the notarization process an automated part of my build scripts. If your app is less complicated, it'll probably take less time to figure out what needs to be done.

And automating the notarization steps wasn't as bad as I thought it would be. I wrote a Python script which calls out to the notarization tools with the response format set to xml (which comes back as a plist, easily fed into Foundation.NSDictionary). The script looks at the current state of things and either waits a while before querying the notarization servers again to see if it's done, or staples on the notarization bits if it is. Then a new build of Acorn makes its way to the internet.
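
Here's roughly the shape of that loop (not my actual script: the bundle id, credentials, and file names are placeholders, and you should double check the plist key names against your own altool output):

    import plistlib
    import subprocess
    import time

    def altool(*args):
        # run altool with xml output and hand back the parsed plist
        out = subprocess.run(["xcrun", "altool", *args, "--output-format", "xml"],
                             capture_output=True, check=True).stdout
        return plistlib.loads(out)

    # kick off the upload (placeholder bundle id, credentials, and zip)
    upload = altool("--notarize-app", "--primary-bundle-id", "com.example.acorn",
                    "-u", "you@example.com", "-p", "@keychain:AC_PASSWORD",
                    "--file", "Acorn.zip")
    request_id = upload["notarization-upload"]["RequestUUID"]

    # poll until Apple is done with it
    while True:
        info = altool("--notarization-info", request_id,
                      "-u", "you@example.com", "-p", "@keychain:AC_PASSWORD")
        status = info["notarization-info"]["Status"]
        if status != "in progress":
            break
        time.sleep(60)

    # staple the notarization bits onto the app
    if status == "success":
        subprocess.run(["xcrun", "stapler", "staple", "Acorn.app"], check=True)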

Notarization adds a few minutes to the build time, but maybe it'll be worth it? I don't like having to depend on Apple's servers to put something up on mine. But if notarization prevents those "Foo.app is an app downloaded from the Internet. Are you sure you want to open it?" boxes from scaring customers unnecessarily, it will be worth the hassle. I can hope at any rate.

Acorn OS Stats

I managed to ship a bug affecting folks making 16bpc images on 10.11, and I idly wondered what the percentage of folks using Acorn on 10.11 was. The answer is a little above 2%. That surprised me - it's lower than I thought it would be. Folks on 10.14 are a little above 75%, which also surprised me. Almost 20% are on 10.13, and a little above 3% on 10.12.

January 15, 2019

This is a long post, for developers and folks who are code curious.

Acorn 6.3 was a reasonably fun release to work on. There was nothing huge in it, but it did include some nice updates to existing capabilities along with a few new features. Here's what I think is worth writing about.

A New Brushing Engine

I had originally planned to release a new brush engine in 6.2, but writing it ended up being a bit trickier than I expected. I got something together for 6.3 though- and it's certainly faster than the previous engine.

Wait, what's a brush engine? Well in Acorn, it's the mechanics which take input from your mouse/trackpad/stylus and then convert that into scribbles for your image. There is a lot of code that exists to make this possible. The engine needs to think about the size of the brush and opacity (which can change depending on how much pressure you're putting on a stylus), as well as the "flow" of the ink/paint you're putting down. There are also little shortcuts such as holding the shift key between clicks to draw straight lines, and holding down the option+command keys to turn your brush into an eraser. And let's not forget about rotation of the brush while following a path, blend modes, smoothing, scatter, and getting the spacing right between all the dabs in a brush stroke.
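
For a taste of what that last bit means, here's the dab spacing idea boiled down to a tiny Python sketch (not Acorn's code, and the spacing fraction is a made-up number):

    import math

    def dabs_along_segment(p0, p1, diameter, spacing=0.15):
        # Place dabs from p0 to p1, spaced at a fraction of the brush diameter.
        step = max(diameter * spacing, 1.0)
        dx, dy = p1[0] - p0[0], p1[1] - p0[1]
        length = math.hypot(dx, dy)
        if length == 0:
            return [p0]
        count = int(length / step)
        return [(p0[0] + dx * (i * step / length),
                 p0[1] + dy * (i * step / length))
                for i in range(count + 1)]

    # a 100px horizontal stroke with a 20px brush -> a dab every 3px
    print(dabs_along_segment((0, 0), (100, 0), diameter=20))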

Then of course, you get to make sure that all happens without any slowdowns in the UI. Is the brushing keeping up with what you're attempting to draw? Acorn has a goal of 120 FPS for most brushes (when running on a Late 2015 iMac), so it needs to be speedy.

When starting out with this rewrite, I didn't want to just modify the existing engine. I'd already been doing that for a number of years, and I felt it was time to mostly start over. So instead of adding new classes into Acorn, I created a test app to develop in. And when the engine was pretty close to being feature complete, I could drop it into Acorn and continue working on it from there.

The first thing I wanted to do was have all the drawing happen in a background thread. From profiling I knew that Acorn spent a lot of time drawing pixels to the right locations, and while that was happening, input about where to draw was frequently missed. So the new engine takes input at around 480Hz on the main thread (also known as the UI thread), and a background thread does the actual calculations and drawing to your layer. Then at a minimum of 1/90th of a second, the engine will grab the bitmap and throw it up to the screen (composited with any other layers and effects you might have). Depending on how much work is going on, I've seen Quartz Debug reporting around 300 FPS on the previously mentioned iMac. Things were good, but YMMV depending on how the brush is set up. Ironically, smaller brush sizes require a lot more work from Acorn and can tend to be slower.
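
If that split is hard to picture, here's a toy sketch of it in Python (Acorn's engine is native code, and the rates and sample values here are just stand-ins): the UI side only queues input samples, and a background worker drains whatever has piled up and does the drawing.

    import queue
    import threading
    import time

    samples = queue.Queue()

    def ui_thread():
        # pretend pointer input arriving at roughly 480Hz
        for i in range(100):
            samples.put((i, i, 0.5))    # (x, y, pressure)
            time.sleep(1 / 480)
        samples.put(None)               # end-of-stroke sentinel

    def draw_thread():
        done = False
        while not done:
            batch = [samples.get()]     # block until at least one sample arrives
            while not samples.empty():  # then grab everything that piled up
                batch.append(samples.get_nowait())
            if batch[-1] is None:
                batch.pop()
                done = True
            if batch:
                pass                    # rasterize this batch into the layer bitmap

    t = threading.Thread(target=ui_thread)
    t.start()
    draw_thread()
    t.join()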

The actual bitmap blitting is done with Core Image, and I had to invent a few tricks in order to make it keep up with the input. Acorn can't just tell Core Image to 'draw this dot', have it do that, and then say 'ok, now draw this dot'. Well, you can- and that's what Acorn used to do back in the Acorn 4.x timeframe, when it performed reasonably well because everything was done on the CPU. But these days most everything is done on the GPU, so you need to batch your drawing together and send it all up to the GPU in reasonably sized chunks. And the amount of drawing stacked up in those batches also depends on how fast and furious the input from the user is. Pressure can build up on the input side as well as the output side, and there are little mechanisms in Acorn to balance it all out. This was pretty fun to write, including dealing with the requisite threading issues.
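
The batching heuristic boils down to something like this sketch (hypothetical thresholds, not Acorn's actual numbers): collect dabs, and only hand them off when the batch is big enough or has been sitting around too long.

    import time

    MAX_BATCH = 64          # hypothetical: dabs per GPU submission
    MAX_LATENCY = 1 / 90    # hypothetical: don't let pending dabs get older than this

    class DabBatcher:
        def __init__(self, submit):
            self.submit = submit       # callback that does the actual GPU work
            self.pending = []
            self.last_flush = time.monotonic()

        def add(self, dab):
            self.pending.append(dab)
            now = time.monotonic()
            if len(self.pending) >= MAX_BATCH or now - self.last_flush >= MAX_LATENCY:
                self.flush(now)

        def flush(self, now=None):
            if self.pending:
                self.submit(self.pending)   # one trip to the GPU for the whole batch
                self.pending = []
            self.last_flush = now or time.monotonic()

    batcher = DabBatcher(submit=lambda dabs: print(f"submitting {len(dabs)} dabs"))
    for i in range(200):
        batcher.add((i, i))
    batcher.flush()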

The new brush engine also added a number of other little features which were present in the test app, but which I haven't exposed yet in Acorn. Exposing some of them would require UI changes that I wasn't ready to commit to for this update, and I wasn't 100% happy with them quite yet.

Oh, and then of course there was compatibility. Acorn 6.3 was developed on 10.14, but it supports macOS back to 10.11. My tricks for batching things together in Core Image didn't work so well on 10.11 and 10.12, and actually exposed a number of bugs in Apple's framework. So folks running on older macOS releases would get the previous engine.

I also added some new widgets to the brush palette, but there are still more things I want to change and add (which will happen eventually). But real artists ship, so out the door it went.

iOS 12 Portrait Matte Support and Layer Mask Features

Apple added a new feature to its latest iPhones in the iOS 12 update called "Portrait Matte". It's a special image embedded in HEIC files which is based on the depth data and some machine learning applied to your photo. You can then use this image as a mask to blur parts of your photo (which is what the iOS "Portrait" camera setting does), or you can use it to remove backgrounds.
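
If you're curious how that matte comes out of the file: it's stashed as auxiliary data in the HEIC, and ImageIO will hand it over. Here's a quick check sketched in Python via PyObjC (Acorn does this in native code; this needs macOS 10.14 and a reasonably recent PyObjC, and the file name is a placeholder):

    import Quartz
    from Foundation import NSURL

    url = NSURL.fileURLWithPath_("portrait.heic")   # placeholder path
    source = Quartz.CGImageSourceCreateWithURL(url, None)

    # Returns a dictionary describing the matte's pixel data, or None if the
    # file doesn't carry a Portrait Effects Matte.
    matte = Quartz.CGImageSourceCopyAuxiliaryDataInfoAtIndex(
        source, 0, Quartz.kCGImageAuxiliaryDataTypePortraitEffectsMatte)

    print("portrait matte present:", matte is not None)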

But how should Acorn expose this matte? My first stab was to have Acorn add the matte as an additional layer. After playing with it a bit, it just felt off. So I ended up adding the matte as a mask on the main layer when opening the image. But folks are obviously going to want to do more than just mask out the background, so I added new features to Acorn where you could easily drag and drop the layer mask into its own layer. I also made it easy to move an existing layer to another layer's mask via drag and drop. I can't predict what people are going to want to do with the mask, but I might as well make it easy to move around.

It was also during this development that I found some bugs in Apple's My Photo Stream. The matte was showing up rotated incorrectly when opening images out of Photos. At first I figured I was just reading the data wrong, but nope- under certain conditions when images with the portrait mask were uploaded to MPS, the rotation data from the camera went missing. After some communication and a Radar filed at Apple, this bug was fixed in an OS update. Bug fixes like this don't happen very often, but when they do it makes filing all the other Radars worth it. Mostly.

Getting an image with the Portrait Matte onto your Mac is a bit tricky (iOS really wants to flatten effects before sending an image out). What's worked best for me is taking a selfie, and then using Image Capture to transfer over the HEIC. Why a selfie? Well, the depth data for the back-facing cameras is obviously there somewhere in the image (because Photos.app can modify it), but it's not showing up as a portrait matte for Acorn. Investigations are ongoing as to why this is.

8bit PNGs / Indexed PNGs

Recently I purchased a commercial license for PNGQuant for use in the next update to Retrobatch. It's a great utility which can take a PNG image, reduce the number of colors used, and add dithering to make it look pretty close to the original. The benefit of this is that you get much smaller file sizes. "8 bit PNGs" has been a feature request for Acorn for years, so I threw it in 6.3 as well.

Combine this with Acorn's (already built in) use of PNGCrush, and you get some pretty small PNG file sizes.

PNGQuant and PNGCrush are usually used as command line tools- but both come with source, which meant I could compile the sources into Acorn's binary and not have to call out to a tool. That's what I had been doing with PNGCrush: compiling the source into Acorn and calling a pngmain() function with arguments pointing at a temp file. It worked, but it was kind of a hack, and I didn't have any way to cancel an in-progress action.

So for 6.3, I decided to pull the PNGCrush code out into a separate tool, and include PNGQuant as a tool as well. I could call them via NSTask, and since each was a separate process, I could terminate them if settings were changed in Acorn's web export.
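
In Python terms, the shape of it looks something like this (subprocess standing in for NSTask, file names are placeholders; the color count, --force, and --output bits are pngquant's standard CLI options):

    import subprocess

    proc = subprocess.Popen([
        "pngquant", "256",              # reduce to at most 256 colors
        "--force", "--output", "small.png",
        "input.png",
    ])

    # Because the tool runs as its own process, an in-progress export can be
    # cancelled when the user fiddles with the settings:
    user_changed_settings = False       # stand-in for the real check
    if user_changed_settings:
        proc.terminate()
    else:
        proc.wait()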

This worked pretty well, until I tried to submit Acorn to the App Store. I had completely forgotten about signing the binaries and giving them the right entitlements for acceptance into the MAS.

There were a bunch of things I had to do in order to get it all working, and some really dumb mistakes by me- but Timo Perfitt wrote a post just last week which almost exactly mirrors what I was going through. Go read his post if you're curious about mach-o headers and adding Info.plist files to executables.

Bugs

Acorn 6.3 introduced a new way of rendering PDFs, where vector data and bitmap data are layered together in a PDF. Previously when making a PDF, Acorn would flatten the image and insert the bitmap into a PDF (the one exception was if your image was 100% vector layers- then the resulting PDF would be all vector). Well of course I screwed this up a little bit, so that sometimes your bitmap layers would be drawn much larger than they appeared on screen. And this would happen for PDF export as well as when printing. Why when printing? Because printing on macOS is pretty much "here's a PDF, send it off to the printer". Acorn of course uses the same PDF routines when exporting and printing.

But this bug was fixed for 6.3.1, along with some curves improvements I mentioned in a previous post on this site.

Pay via Stripe on the Flying Meat Store

This isn't really an Acorn thing, but I also reworked our online store so that it now accepts Apple Pay. I had moved to Stripe last year for collecting payments, but adding Apple Pay support required a little bit more work on my part. I put that off until this past week, and it's up and running now.

Previously I was using PayPal Web Payments Pro, which was pretty nice ten years ago- but compared to what Stripe offers these days, as well as the ease of development… well, it was really no contest. Stripe is awesome.

The End

Hey, that's a lot of writing. Now I know why I don't do postmortems more often.

January 9, 2019

Acorn 6.3 is available, and the full release notes are up as well.

Here's what I think is awesome in this release:

Portrait Mask Support. If you have an iPhone running iOS 12 (and can take Portrait photos), Acorn will now detect the Portrait Matte from those images and turn it into a layer mask. The Portrait Matte is the image data which enables blurring in the background, or other fancy camera tricks. This means you can use this matte to erase and add fancy backgrounds or custom blurs for your image, all within Acorn.

Other Mask Features. You can now drag and drop a mask from the layers list onto another layer, or copy it out as a new layer. When exporting layers you now have an option to apply the mask on export, or just write it as an additional image along with everything else. There are a number of new shortcuts when dealing with layer masks as well.

Brush Stuff. If you're running macOS 10.13 or later, you get a performance boost when brushing (painting, smudging, cloning, etc…). This is especially noticeable when brushing on deep color images.

I've also added options to the brush palette for adjusting flow, softness and blending. In addition to all this, there's a bunch of new brushes under the "Basic Round" category which are designed for the new brush engine.

Other Stuff. There are other good things including improved PDF export, various macOS Mojave UI fixes, additional speed improvements with deep images, and more. And as always, it's a free upgrade for anyone who has already purchased Acorn 6.

January 5, 2019

I've got curves on the mind for some reason. And my mind was thinking- hey, the curves code you've got fills everything into 256 buckets for each color channel (which acts as the LUT, or Look Up Table), and that's not going to work great for deep color images, is it?

So I dug up a 64 bit image I use for testing and ran it through the curves filter and sure enough, it looked like it had been converted to a 32 bit image. Not good at all.

So, what can I do to fix this? Well, there are a couple of options:

1: Make the LUT big enough to match the number of colors that could possibly fit into each channel. This would mean I would have to create a different sized LUT for each 32/64/128 bit image. That means more code. It also means more work for the CPU to do to fill up all those buckets as well as calculate the values based on the curve paths.

2: Change the code so that it doesn't take a LUT, but rather the vector data for the curves, and then calculates the color changes on the fly from that data, on the GPU. This sounds like an awesome problem to solve, but I'm not sure it would be worth the time spent.

Honestly, I didn't want to do either of those solutions, but I was pretty resigned to implementing method #1. But I kept on thinking about it and went climbing for a bit and came back with another idea.

3: Keep the existing code, but use bilinear interpolation when accessing values from the LUT to figure out the color shifts. This is an extremely easy change and is probably about 98% as good as the previous two options. And it's certainly going to be way better than the current situation.

So I made that change, noticed nothing different, and then realized I was calling floor() in the filter kernel, completely defeating the interpolation I was trying to use. So I got to delete code, and it made things better.
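
Here's the one-dimensional version of the idea as a Python toy (the real thing is a Core Image kernel sampling a LUT, but the math is the same):

    import math

    def lut_floor(lut, v):
        # old behavior: flooring quantizes deep color values to 256 levels
        return lut[int(math.floor(v * 255))]

    def lut_lerp(lut, v):
        # new behavior: blend the two nearest buckets so 16 bits-per-channel
        # values don't collapse onto 8 bit steps
        x = v * 255
        lo = int(math.floor(x))
        hi = min(lo + 1, 255)
        t = x - lo
        return lut[lo] * (1 - t) + lut[hi] * t

    identity = [i / 255 for i in range(256)]
    print(lut_floor(identity, 0.5002))   # 0.498..., snapped to a bucket
    print(lut_lerp(identity, 0.5002))    # 0.5002, preserved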

Here are the results. Old on the left, new on the right. Click on the image to make it bigger, because you're not going to see any differences at the size it's currently at.

Deep color curves.

You'll see a lot more banding in the left image than you do in the right image. And this image is actually a 24 bit PNG, whereas on my Mac it's a 64 bit image running through a wide gamut monitor. The differences are much more drastic there, and there's no perceivable banding at all in the fixed one.

What's next for curves? I'm not sure, maybe I'm done for the moment? I'd like to throw the alpha channel in there as well, but outside of a few interesting cases, it probably wouldn't be worth it.

Both these changes, and the changes I mentioned in my previous post, are now available on Acorn's latest builds page.

January 4, 2019

I was working on a bug in Acorn yesterday with the Curves filter, and I noticed that when pushing the green channel to extremes, things became pretty blown out. When first looking at the filtered image I thought the results might be right… but maybe they weren't? So I threw a mental note in the back of my head to compare what Curves in Acorn is doing versus Photos and Photoshop.

And then when I was knee deep in actually fixing the original bug, I realized there was a better way of creating the LUT (lookup table) from the curves data, and that idea just kind of stuck in my head as well.

So after fixing some last minute sandboxing issues for Acorn 6.3 today (it's always the sandbox!) and submitting it to Apple, I decided to look into these two curves things.

Turns out that my idea for creating the LUT also fixed the blown out colors bug.

Blown out curves.
(click to embiggen)

The left window in this screenshot is the original image. The middle image is what's shipping in Acorn 6.3 and earlier. And finally we have the fixed version (aka Acorn 6.3.1, which hasn't left my Mac yet). You can see the whites and the yellows in the middle not mixing very well when compared to the fixed version. The filter window on the right shows what the green channel was set to, so you can see how much I was pushing it.

It's always delightful when things come together like this.

Detailing the iOS Menu

The folks behind Codea have a great post about adding a Mac like menu bar to their iPad app. I bet this kind of thing gives the designers at Apple fits, but I really like it.

Loopback 2's New UI

Neale Van Fleet: The Design of Loopback 2

I really dig the design of Rogue Amoeba's Loopback 2, and Neale has a nice writeup on the evolution of it.

November 28, 2018

Dominik Wagner: SubEthaEdit 5 – Now free and open source!

It's story time.

Way back in 2003, when SubEthaEdit was still called Hydra, it won a round of the Mac OS X Innovators Contest from O'Reilly. My app at the time, VoodooPad (now owned by Primate Labs), also won a place in the contest. Audio Hijack Pro also made an appearance as a runner up to VoodooPad.

Because of international and political reasons, VoodooPad got first place with an award of a premier level membership to Apple's developer program. This was a pretty big deal! The regular membership cost $500 a year, but it came with a 20% discount on hardware which usually made up for the cost.

The premier membership was $3500, but it came with a pass to WWDC along with 10 hardware discounts. Ten! And a pass to WWDC!

My company was just me, so I only needed one hardware discount. But back in the day, indies helped indies and the hardware discounts were transferable, and I did what was the completely obvious thing to do.

So I sent a couple of the hardware discounts to the folks at Rogue Amoeba (makers of Audio Hijack) and the folks at the Coding Monkeys (who made SubEthaEdit). We were all pretty happy.

Then a few days later I got a call from ADC. "What the heck are you doing?" they asked. I said that I didn't need that many and had given a couple of the discounts away. "Are they doing work for you or something? Because the Coding Monkeys have a student ADC account, and it's not possible for them to have hardware discounts, so we're going to transfer those back to you."

Well crap. OK.

"And what about Rogue Amoeba?". Well, uh- yes. They are doing work for me. Yep. They sure are. Sub-contracting, it's completely official. Long pause. "OK."

To be fair, Paul Kafasis from Rogue Amoeba was helping me out a lot. Even though he's about 10 years my junior, he had been doing indie software development for a bit longer at the time and as far as I was concerned, was also quite a bit wiser about the whole business end of it. I got lots of good advice, and his company got some hardware discounts.

Anyway, that's my SubEthaEdit story. They eventually won an ADA as well, which I'd rather have than 100 hardware discounts.

Multiple Return Values in JavaScript

Luciano Mammino: Emerging JavaScript pattern: multiple return values.

You probably know already that JavaScript does not support multiple return values natively, so this article will actually explore some ways to “simulate” this behavior.

There are some neat ideas in here, enabled by ES2015 features which made no sense to me at the time but which suddenly do now.