The Shape of Everything
A website mostly about Mac stuff, written by August "Gus" Mueller
October 8, 2015

While debugging a problem in Acorn yesterday, I came across a little oddity in Xcode that I thought was worth writing up for the greater good of my fellow Mac OS X developers. (Hey everyone- how's it going? iOS might get all the attention, but at least we can charge livable prices for our apps, amiright?).

Yesterday, Acorn had a bug where pasting a bitmap into an image would occasionally cause the bitmap to show up as an empty layer. I could never get it to reproduce on my machine, of course. I knew it was real though; I even had video evidence of it in action. It just would not happen for me.

Then all of a sudden it did happen. And of course, I didn't have a debugger attached from Xcode, so I couldn't break in and see what the heck was going on. So I launched Acorn from Xcode and tried to reproduce it. No luck.

Then it happened again, and I found a pretty reliable way to reproduce it. But again, I wasn't running from Xcode, and when I launched it from Xcode I couldn't reproduce it.

This shouldn't be happening. Why would the debugger "fix" this problem? Well, that's easy enough to test- I'll just turn off the debugger when running from Xcode. But even with the debugger off, the bug still didn't happen.

OK, so I was going down the wrong trail. It wasn't the debugger causing it. What else could be causing problems? Environment variables I turned on for debugging, maybe? I checked the Run scheme to see if I had anything turned on there- and I did (Core Image flags), but turning those off didn't make the bug occur.

Maybe Xcode was setting an environment variable for me that I wasn't aware of? This was easily tested- I just added a couple of lines to Acorn to print the env vars, and compared the output from when I ran from Xcode vs. the Finder. One entry jumped out at me immediately- MallocNanoZone was set to 1 when run from Xcode, and it wasn't around when run from the Finder.
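
For the curious, dumping the environment only takes a couple of lines of C. Here's a minimal sketch of the idea (my own, not Acorn's actual code):

#include <stdio.h>

extern char **environ; // POSIX: the process's environment, as KEY=VALUE strings

// Print every environment variable the process was launched with, so the
// output from an Xcode run can be diffed against a Finder run.
static void DumpEnvironment(void) {
    for (char **entry = environ; *entry != NULL; entry++) {
        printf("%s\n", *entry);
    }
}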

So I set MallocNanoZone to 0 in Acorn's Run scheme, launched Acorn from Xcode, and the bug reproduced right away. I had found what was masking the bug, and it was pretty easy to find the actual bug in Acorn after that.

So what does the MallocNanoZone env variable do? It's a flag that changes the memory allocator for your app, and for the frameworks your app uses. I don't know the specifics of this allocator vs. the normal one, but I do know how it hid this bug from me in Acorn. When MallocNanoZone was set, the allocator worked in such a way that when I called CFRelease on a CGImageRef and then called CGImageGetWidth with that same (bad) reference, it would still return the correct answer (CGImageCreateCopyWithColorSpace() may have been involved as well). When MallocNanoZone was off, the normal allocator was used and CGImageGetWidth returned a bad answer (as it should!). That bad answer caused Acorn to put the bitmap in the wrong spot, and the layer showed up empty.
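
In code, the pattern boiled down to something like this. This is a hypothetical reconstruction of the use-after-free, not Acorn's actual source:

#include <CoreGraphics/CoreGraphics.h>
#include <stdio.h>

// Hypothetical reconstruction: the image is released, then queried
// through the stale reference.
static void StaleImageRead(CGImageRef original, CGColorSpaceRef space) {
    CGImageRef copy = CGImageCreateCopyWithColorSpace(original, space);
    if (copy == NULL) {
        return;
    }
    CGImageRelease(copy); // the last reference goes away here
    // BUG: use-after-free. With MallocNanoZone=1 the freed memory happened
    // to survive, so this returned the old (correct) width. With the default
    // allocator it returned garbage, the bitmap landed in the wrong spot,
    // and the layer showed up empty.
    printf("width after release: %zu\n", CGImageGetWidth(copy));
}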

Why does Xcode set MallocNanoZone=1? My understanding is that the Finder used to do this, so Xcode set the variable to behave the same as the Finder. However, I checked 10.8-10.11 and couldn't find any evidence of this.

TL;DR: If you're working on an OS X app, open up the run scheme and make sure MallocNanoZone=0 is set in your environment variables. If you don't, then you're using a different allocator than your users are, and the behavior of your app might be different.
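
You can also compare the two allocators without Xcode by launching your app's binary directly from Terminal with the variable set (the app path here is just an illustration):

MallocNanoZone=0 /Applications/MyApp.app/Contents/MacOS/MyApp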

The fixes will be in Acorn 5.1.1 and 4.5.7, available now on the latest builds page.

Update October 12th:
So I was wrong about checking the environment variable to determine whether or not your process is using nanomalloc. The correct way is to do something like this in Terminal:

heap Acorn | grep MallocHelperZone

If you see output, nanomalloc is being used. If there's no output, it isn't. On my machine, when running Acorn from Xcode I can see that nanomalloc is being used by default. When run from the Finder, it isn't. A couple of little birdies have implied to me that this is an OS bug introduced in 10.11, and to expect a fix in the future, along with a free 2016 WWDC ticket for me*.
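
If you'd rather check from inside your own process, you can enumerate the malloc zones directly. Here's a sketch of mine using the zone APIs from malloc/malloc.h; if a zone named MallocHelperZone shows up, nanomalloc is in play:

#include <malloc/malloc.h>
#include <mach/mach.h>
#include <stdio.h>

// Print the name of every malloc zone registered in the current process.
// The nano allocator brings along a helper zone named "MallocHelperZone",
// which is what the heap command above is grepping for.
static void PrintMallocZones(void) {
    vm_address_t *zones = NULL;
    unsigned int count = 0;
    if (malloc_get_all_zones(mach_task_self(), NULL, &zones, &count) != KERN_SUCCESS) {
        return;
    }
    for (unsigned int i = 0; i < count; i++) {
        const char *name = malloc_get_zone_name((malloc_zone_t *)zones[i]);
        printf("zone %u: %s\n", i, name ? name : "(unnamed)");
    }
}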

* I wish. But hey— wouldn't that be cool? Apple gives out one WWDC ticket a week for the best bug found? I've long thought that Apple should add some sort of gamification to Radar.