Some things I wish I had known before starting to automate Mac Developer ID Notarization

It’s Day 5 of Notarization Week and it’s time to wrap up and write down my experiences.

Notarization itself is not incredibly difficult. You can learn the basics by watching the 40-minute talk from WWDC 2019. Unlike sandboxing, notarization should not have any detrimental effects for most Mac apps.

As always, the real trouble starts when you are trying to inject Notarization into the tangled web of modern Mac software development: entitlements, certificates, automated Xcode build chains, build settings, etc.

First you need to adopt the “Hardened Runtime” for your application. For the two apps that I tested with, this was simply a matter of switching it on in the “Capabilities” tab of your target. By default, all the hardened runtime features are switched on and I was able to leave them all on without any problem.

The first gotcha is that you can’t really test your application’s compatibility with the hardened runtime in Xcode, because it will run in debug mode. Since the hardened runtime would not allow inspection of your code, the default “CODE_SIGN_INJECT_BASE_ENTITLEMENTS=YES” build setting will inject the “com.apple.security.get-task-allow” entitlement into the debug version of your build product. This is a “normal” entitlement, just like those used for sandboxing.. and no, the sandbox does not need to be turned on for notarization to work (sigh of relief).

Another gotcha is that your app will not be notarized as long as this entitlement is present, so it needs to be turned off for the release build. This should not be a worry, but you will probably spend many frustrating hours chasing down this very problem nonetheless.

The next thing on the compliance list is that secure timestamps for codesign need to be turned on. Many developers have a “--timestamp=none” flag somewhere in their build settings.. because the Apple timestamp servers are slow and often down (at least here in Luxembourg), and with timestamps on you can no longer build a release without an internet connection. So if you have a build server without an internet connection.. that is about to change. To make doubly sure, you should probably add “OTHER_CODE_SIGN_FLAGS='$(inherited) --timestamp'” to your build settings.
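Both of these settings can be collected in a Release-only configuration file, so debug builds keep their debugger entitlement while shipping builds do not. This is only a sketch; the file name and the way it is wired into the project are assumptions:

```
// Release.xcconfig (hypothetical file, assigned to the Release configuration)

// Don't inject com.apple.security.get-task-allow into release builds:
CODE_SIGN_INJECT_BASE_ENTITLEMENTS = NO

// Sign with a secure timestamp, as notarization requires:
OTHER_CODE_SIGN_FLAGS = $(inherited) --timestamp
```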

In this context, it would have saved me a lot of time if I had known how to find out whether a product has in fact been signed with a secure timestamp. Executing “codesign --verify --deep --strict --verbose=4 --display -r- /path/to/my/product” will display loads of things. If there is a line with “Signed Time” among them, you did not sign with a secure timestamp. If there is a line with “Timestamp” in it, you do have a secure timestamp. It’s another brilliant example of how an Apple engineer’s language choice can cost tens of thousands of lost developer hours. “Signed Time (insecure)” would have been a great help.
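A build script can automate that check. Here is a small sketch; the function merely greps codesign’s verbose output, so the exact line prefixes (“Signed Time” vs “Timestamp”) are assumptions based on the output format described above:

```shell
# Classify a codesign --display dump as secure/insecure, depending on which
# of the two timestamp lines appears (an assumption about codesign's format).
has_secure_timestamp() {
  out=$(cat)    # capture the whole dump so we can grep it twice
  if printf '%s\n' "$out" | grep -q '^Signed Time'; then
    echo "insecure"    # "Signed Time" = no secure timestamp
  elif printf '%s\n' "$out" | grep -q '^Timestamp'; then
    echo "secure"      # "Timestamp" = secure timestamp present
  else
    echo "unknown"     # neither line found (unsigned product?)
  fi
}

# Usage (path is a placeholder; codesign prints to stderr, hence 2>&1):
# codesign --verify --deep --strict --verbose=4 --display -r- \
#   /path/to/my/product 2>&1 | has_secure_timestamp
```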

In a similar vein, “codesign -d --entitlements :- /path/to/my/product” displays all the entitlements for the product and will reveal the dreaded “com.apple.security.get-task-allow” entitlement if it is still present.
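In an automated build, you can fail fast if that entitlement survives into a release build. A sketch: the get-task-allow key is the real entitlement name, but the calling convention here (reading codesign’s plist output from a pipe) is my own:

```shell
# Fail the build if the debug entitlement is still present in the
# entitlements plist that codesign prints on stdout.
entitlements_clean() {
  if grep -q 'get-task-allow' -; then
    echo "ERROR: com.apple.security.get-task-allow is still present" >&2
    return 1
  fi
  echo "OK: no debug entitlement found"
}

# Usage (path is a placeholder):
# codesign -d --entitlements :- /path/to/my/product | entitlements_clean
```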

Once you have a build product, you can send it to Apple for notarization with “xcrun altool --notarize-app -f /my/archive --primary-bundle-id <bundle-id> -u username -p app-specific-password”.

This is where things get a little weird. You can send either a disk image or a zip archive, but not an application bundle. I distribute my software as disk images and my software updates as zips. If you send a zip file, make sure that you use the “ditto” tool as instructed by Apple, so that you don’t run into problems with extended attributes. You need to supply your username (email address) and a password. You can generate an application specific password and that worked fine for me straight away.

The command line will upload the archive and then return a “request-id” which is a UUID that you can use to look up the state of the notarization. This is not a real-time, synchronous affair. It was fairly quick when I used it, taking usually only a few minutes, but it is obviously a challenging problem for automation. You could write a script that extracts the request-id and then polls the Apple servers for its status before continuing, but realistically you probably want to have a two or three stage build process now.
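Extracting the request-id lends itself to a small helper. Below is a sketch; the “RequestUUID = …” line format and the “Status: success” string from “--notarization-info” are assumptions based on altool’s 2019-era output and may well change:

```shell
# Pull the request UUID out of altool's output ("RequestUUID = <uuid>").
extract_request_id() {
  sed -n 's/.*RequestUUID[ =:]*\([0-9a-fA-F-]\{36\}\).*/\1/p' | head -n 1
}

# Hypothetical polling loop (credentials, bundle id and paths are placeholders):
# REQID=$(xcrun altool --notarize-app -f my.dmg --primary-bundle-id com.example.myapp \
#           -u me@example.com -p app-specific-password 2>&1 | extract_request_id)
# until xcrun altool --notarization-info "$REQID" -u me@example.com \
#         -p app-specific-password 2>&1 | grep -q 'Status: success'; do
#   sleep 30    # be gentle; notarization usually takes a few minutes
# done
```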

I subdivided my own build process into three phases: build, notarization request, and a combined stapling and verification phase.

Which brings us to stapling, which is the fun and easy part. You just type “xcrun stapler staple my.dmg” or “xcrun stapler staple my.app” and that’s that.

One thing to note is that the entire notarization process is completely free of build and version numbers, which is so wonderful. If only app review worked this way! There is no mention of how it works; it could be that Apple uses the entire archive as a hash code or that they create a hash of your upload. In any event, there is zero problem with building a thousand different versions of your program and getting them all notarized.

The second thing to notice is that you can staple either app bundles or disk images, but not zip archives. Not sure which is weirder, but it kind of makes sense. In practical terms, this means that you can staple your notarization receipt to a dmg without having to open it, which is super easy. If I have understood this correctly, this means that both the dmg and the app are stapled and will open without any funny user warnings. Not being able to staple zip files, however, complicates things somewhat, because you now have to zip the app bundle to notarize it, staple the original unzipped app bundle and then re-zip it.
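That zip-then-staple-then-re-zip dance can be scripted. Here is a sketch with a DRY_RUN switch so the sequence can be inspected without the macOS tools installed; the ditto flags are the ones Apple recommends for app bundles, and everything else (names, paths) is hypothetical:

```shell
# Run a command, or just echo it when DRY_RUN is set (for inspection/testing).
run() { if [ -n "$DRY_RUN" ]; then echo "$*"; else "$@"; fi; }

# Zip for upload, (externally) notarize, staple the unzipped bundle, re-zip.
staple_and_rezip() {
  app="$1"; zip="$2"
  run ditto -c -k --keepParent "$app" "$zip"   # zip for the notarization upload
  # ... submit "$zip" to Apple here and wait for approval (separate phase) ...
  run xcrun stapler staple "$app"              # staple the *unzipped* bundle
  run ditto -c -k --keepParent "$app" "$zip"   # re-zip the now-stapled bundle
}

# Usage: staple_and_rezip "MyApp.app" "MyApp.zip"
```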

So far so good. Now enter the much dreaded Sparkle.framework, the foundation of all automated software updates across the Developer ID world, maintained by a clever, intrepid group of volunteers who deserve our eternal gratitude.. and the bane of my developer life.

For most of my products, Sparkle is the only framework that I bundle, so I blame it for the entire dreaded complexity and wasted time of framework signing.. which is a lot of blame. Signing frameworks is hell.. or used to be hell.. and now is hell again.

I don’t use Carthage or other “download stuff from all over the internet written by who knows who, run buggy build and install scripts and then ship the whole lot with my product” build systems. I actually just place a binary copy of the framework into the /Library/Frameworks/ folder and work with that. If you are using one of those build systems, you probably will have different problems.

The current (as of 26/July/2019) binary distribution of Sparkle is neither signed nor built with the hardened runtime, so it is unusable for notarized apps. Downloading the source as a zip archive leaves out crucial files. So I did a “git clone --recursive” to get what I assume must be the master branch version (I have some deeply strange git expertise that overlaps with nobody else’s).

Building it with “make release”, despite affirmations to the contrary, did not result in a hardened version. One of the worst things (I’m pretty sure it’s unavoidable and I’m not dissing its developers at all, but it is still absolutely dreadful) about Sparkle is that it includes two executables as well as the framework: Autoupdate and fileop always cause incredible signing headaches. The default option of just ticking the “Sign upon copy” option in Xcode won’t sign these properly and you inevitably end up with Gatekeeper problems.. even though it had just gone through a phase of actually working.. but no more.

I’m sure that at the heart of all my signing problems is a lack of understanding, aka ignorance. The thing is that I’m a Mac developer, not a cryptography geek. Knowing just enough to get by in the context of cryptography means knowing quite a lot about quite a few things, followed by endless trial and error that eventually ends for unknowable reasons.

After a very long time, I finally got a Sparkle build that I could use by opening the project in Xcode, adding “OTHER_CODE_SIGN_FLAGS='$(inherited) --timestamp'” and “CODE_SIGN_INJECT_BASE_ENTITLEMENTS=NO” to every relevant target, and manually adding my Developer ID signing identity to all targets. I have no idea why this was necessary; as far as I understand it, the framework does not need to be signed at all, and will in any event be re-signed when it is copied into my app, but it would not work without this. Perhaps the entitlements only get added during signing?

I then spent most of a day chasing down the origin of the “com.apple.security.get-task-allow” entitlement on the “fileop” executable that steadfastly refused to go away, despite there being no debug build and my having plastered the “CODE_SIGN_INJECT_BASE_ENTITLEMENTS=NO” build setting everywhere throughout the Sparkle project. Around 11PM, I decided to just delete Xcode’s “DerivedData” folder (what else was there left?).. and that promptly solved the problem.

With the Sparkle problems solved, the rest was fairly straightforward.

All told, I’ve spent an entire week learning about Notarization and integrating it into my build system. It’s not badly designed. In fact it works fairly well and I would even go as far as calling some of the design decisions enlightened. It is certainly a lot better thought through than either App Review or the Sandbox.

Unfortunately, it adds yet more complexity to an already incredibly complex development environment. Today’s apps are not much more complex than those from the 1990s. Phone apps are mostly much less complex. It should be easier to develop for the Mac today than it was back in the 1990s. Yet nearly 30 years of development tool, framework and API progress has yielded a development context that is no more productive and far more complex. Notarization adds yet another layer of complexity and yet another time sink for Mac developers.

There are some positives: Apple can now revoke specific builds of an app, rather than just turning off all apps from the same developer ID. The hardened runtime gives the developer the possibility of shielding his/her software from malicious modification, but allows him/her to decide which “holes” need to be blasted into the runtime for the program to continue working. Actually scanning apps for malware adds peace of mind when you release a program into the world.

In an ideal world, Apple would turn around and ditch its Mac App Store sandbox requirement. It could even offer notarization as a way to side-load software on the iPad. After all, notarization gives it the tools to prevent malware from being published and to switch it off on every single Mac in the world should it get through anyway.

As a long-time Mac developer (since 1994), however, I can’t help thinking that the security people at Apple would have done better ironing out the bugs and limitations of the sandbox, getting it to work properly and be less of a nuisance, rather than adding yet another security approach.

If early reports about Catalina are to be believed, it looks like there are so many people working on Mac security that they have to roll out new security features at each release, whether they are a net benefit to users or not. Perhaps these people could be tasked with making macOS great again instead?

Quick Reaction to Mac Pro Leaks

Daring Fireball has a story about Apple sharing their plans for the future of the Mac Pro. It is weird to communicate with Pro users via a blogger, but what the hell: it’s Apple.

Apple are working on new Mac Pros with a completely new design, but they won’t ship “this year” and there’s no firm commitment to shipping them in 2018 either. In the meantime, they have processor bumped the existing machines but without USB-C and thus without LG 5K monitor support.

My gut reaction is relief. At least they are working on it and they haven’t wholesale abandoned Pro users. There are also hints about new Pro displays and new iMacs.

It is also nice to see that Apple has realized that the existing Mac Pro design is a complete failure. Dual GPUs are no good; integrated components and a small form factor are incompatible with fast, low-cost updates. They are talking about a “modular system” now.

All things said, however, nobody outside of Apple ever thought that their new Mac Pros were anything other than the product of a deranged mind, and it took them nearly 3 years to acknowledge that. Furthermore, a “modular” system is exactly what the original Mac Pro and all but its latest incarnation already were. Building this fabled new modular Mac Pro is thus as easy as slapping an industry-standard motherboard with dual Xeons into the old Mac Pro enclosure and supporting it in software. There is literally nothing to it. If they don’t like the old cheese-grater enclosure, just spray it Jet Black already.

All this makes me worry about what that “new” direction is: worryingly, there has been no acknowledgement that a Mac Pro needs dual CPUs. There’s only talk about dual GPUs, which nobody has asked for. Are they going to mess it up by overthinking it again?

All I want is a significantly faster Mac. Something that beats my no longer supported 2009 Mac Pro.. and it’s not hard to deliver now.


The unbearable fragility of modern Mac OS X development

There, I’ve done it again: I shipped a broken A Better Finder Rename release despite doubling down on build system verification, code signing requirements validation, Gatekeeper acceptance checks, automation, quality assurance measures, etc.

Only in October, I had a similar issue. Luckily that time around it only took a few minutes to become aware of the problem and a few hours to ship a fix so very few users were affected. Right now I don’t know how many users were affected by the “botched” A Better Finder Rename 10.01 release.

This didn’t use to happen, despite the fact that I did not spend nearly as much time ensuring that everything worked properly with release management. Nor am I alone in this situation. Lots of big as well as small developers have recently shipped similarly compromised releases.

The situation on the Mac App Store is much, much worse. Nobody other than Apple knows how many Mac App Store customers were affected by the recent MAS certificate fiasco that had the distinction of making it all the way into the pages of Fortune magazine.

The truth is that Mac OS X development has become so very fragile.

The reasons for this are manifold and diverse but boil down to: too much change, too little communication, too much complexity and, finally, too little change management and quality control at Apple.

The recent Mac App Store (MAS) fiasco that left many (1% of Mac App Store users? 100%? Nobody knows) users unable to use their apps purchased from the Mac App Store was down to Apple’s root certificate expiring. This was a planned event: certificates are used for digitally signing applications and they are only valid for a particular period of time, after which they need to be replaced with new certificates.

When the Mac App Store certificate expired, it was replaced with a new certificate but there were two problems. First, the now expired certificate was still cached by some of Apple’s servers: when Mac OS X opens an application it checks its signature, which in the end is guaranteed by Apple’s root certificate. Since this was no longer valid, Mac OS X refused to launch them and reported them as “broken”, leaving users and developers equally baffled. After far too long, Apple investigated the problem and emptied their caches which made the problem go away.

The second problem which was not solved by updating the caches, was due to Apple also replacing the certificate with a new, higher security version; of course without telling anybody. The new certificate could not be verified with the old version of OpenSSL that was used in the receipt checking code of many shipping apps.

When Apple created the Mac App Store, it provided a “receipt” that each application should check to see whether it has been properly bought on the Mac App Store. This is just a signed file that contains details about what was bought and when. Instead of doing the obvious thing, which would have been to provide developers with an API for checking the validity of the receipt against Apple’s own rules, they just published snippets of sample code so that each developer could “roll their own” verification code. Supposedly this was for added security (by not providing a single point of failure), but it seems more likely that they couldn’t be bothered to ship an API just for the Mac App Store.

This decision came back to haunt them, because most developers are not crypto experts and so had to rely on developer contributed code to check their app’s receipts. Once this worked properly, most developers wouldn’t dream of touching the code again.. which is how it came to pass that many, quite possibly a majority, of Mac App Store apps shipped with the same old receipt checking code in 2015 that they originally shipped with in 2010(?). This was fixed by Apple revoking the new style certificate and downgrading it to the old standard.

For once, I had been ahead of the curve and had recently updated all the receipt code in my applications (no small feat) and I have yet to hear from any customers who had problems.

Just before the Mac App Store fiasco, however, many non-MAS applications had also shipped with broken auto-update functionality.

Apple does not offer any auto-update facility for applications that are not on the Mac App Store, which led to Andy Matuschak’s “Sparkle” framework becoming the de-facto standard for adding auto-update features to Mac applications.

Driven by abuse of some HTTP communications in iOS apps, Apple decided that in iOS 9 it would by default opt all developers into using only (more secure) HTTPS connections within their applications. What is good for iOS 9 can’t be bad for Mac OS X 10.11 El Capitan, so Mac applications also got opted into this scheme.

Unfortunately, that broke Sparkle for applications which do not point to HTTPS “app casts” such as mine. I have long resisted installing my own HTTPS certificates because I was worried about messing up the expiry periods, etc.. apparently just the way that Apple did with the Mac App Store certificates.

Most developers will have been unaware of the change since Apple never announced it, but I had happened to see the WWDC conference videos that mentioned this in passing. Unfortunately, nothing “clicked” in my head when I heard this. My applications do not communicate with a server anywhere and I thus thought that this was not something I had to worry about. I had forgotten that Sparkle might use this internally.

Fortunately, I caught this at 6AM when I released A Better Finder Rename 10 final. I was just doing a normally completely redundant check through all the features of the program when I noticed that the new version failed when trying to check for updates. By 8AM, I had identified and fixed the problem so that very few people indeed could have been caught out by it. That was luck though.

The nefarious element here was that applications were opted in automatically and silently. Before 10.11 El Capitan was installed on your Mac, my applications updated just fine. Afterwards, they no longer did. Just because they were on El Capitan. Gee thanks!

Of course, this would not have happened if I hadn’t built A Better Finder Rename 10 with the Mac OS X 10.11 SDK (Software Development Kit) at the last moment.

It is somewhat crazy for a developer to change the SDK that s/he builds a brand-new version of their software against in the middle of the beta phase. Changing the SDK always introduces errors because the entire environment in which the code executes is changed. This may bring out bugs that were already present; things that should never have worked, but worked just because the API happened not to trigger the bug. It also introduces bugs that are just part of the new SDK and that you now have to work around. Changing SDKs makes existing programs fragile.

I’m very conservative when it comes to changing SDKs because I’m well aware of the risks. That’s why I’ve been building my code against older SDKs for the past 15 years. A Better Finder Rename 10 was built against the Mac OS X 10.7 SDK which is forwards-compatible with newer versions of Mac OS X.

The main reason for doing so is that I wanted to be certain that I didn’t accidentally break A Better Finder Rename on older systems, which brings us to the next problem with Mac OS X development.

Xcode lets you specify a “deployment target”, for instance 10.7, while building with a newer SDK. This is the recommended way of developing on Mac OS X and keeping backwards compatibility. Xcode will, however, happily let you use APIs that are not compatible with your deployment target and thereby ensure that your application will crash on anything other than the latest Mac OS X.

In fact, Xcode encourages you to use the latest features that are not backwards compatible and will rewrite your code for you if you let it, so that it will crash. It will give you “deprecation warnings” for any API usage that is not in the latest SDK and resolving those warnings is likely to break backwards compatibly as well. Of course, you won’t know this until you run it on the old Mac OS X version.

Now which developer can afford to keep testing rigs with 10.7, 10.8, 10.9 and 10.10? Never mind spending the time to re-test every change on multiple platforms?

Thus I happily built with the 10.7 SDK. Apple did not make this easy by not shipping the old SDKs with Xcode, but you could manually install them and they would work just fine.

Imagine my surprise after installing Xcode 7 and finding out that this no longer worked. The only workable solution was to build against the 10.11 SDK, so jumping forwards not one but 4 SDK versions. A bunch of code wouldn’t compile any longer because the libraries were gone. Luckily the receipt checking code was amongst those, so it got modernised just in time to avoid the Mac App Store receipt fiasco.

Nonetheless, now my entire code base had become fragile and largely un-tested between the last beta release and the final shipping product. Nightmare!

On top of that, was it still even 10.7 compatible? Or indeed 10.10 compatible? Just quickly running it on older systems wouldn’t provide more than a little additional confidence, since it’s impossible to go through every code path of a complex product.

After installing virtual machines to test on, I still couldn’t be 100% certain. The solution came in the form of Deploymate, a now truly essential developer tool which does what Xcode can’t: check that API usage is compatible with the deployment target.

I have since spent many weeks trying to ensure that I won’t run into the same problems again by adding (additional) automated verification processes to my build system. My build system now runs the built product through SDK compatibility checking courtesy of Deploymate, code signing validation and Gatekeeper verification on each build. I’m still working through deprecation warnings and the like, and my code base will soon be bullet-proofed, at least until the next forced changes arrive.

You’d think that this was a long enough list of problems for one year, but this still does not account for Apple also changing the code signing rules (once again) earlier in the year (in a point update of 10.10 no less). This time it affected how resources and frameworks are signed. So applications that were signed correctly for years, now suddenly became incorrectly signed and Mac OS X would refuse to launch them because they were “broken”.

All this points to the underlying issue with the current spate of fragility of Mac applications: Apple keeps changing the status quo and neither it, nor developers, have any chance of keeping up.

Apple’s own applications are full of bugs now. None more so than Xcode, which is both the linchpin of all Mac OS X, iOS, watchOS and tvOS development and no doubt Apple’s most fragile app offering. Xcode is in beta at least 6 months a year and never really stabilises in between. Each new version has new “improvements” to code signing, app store uploading, verification code, etc.. and each new version breaks existing code and introduces its very own new bugs and crashes. From one day to the next, you don’t know as a developer whether your code works or not. Existing code that worked fine on Friday evening no longer works on Monday morning. Worse, chances are that you are not hunting for your own bugs, but for those in your development tools, the operating system or Apple-supplied SDKs.

All this is driven by the one-release-a-year schedule that Apple has imposed on itself. This leaves all of Apple’s software in various stages of brokenness. When Apple’s own staff cannot deal with this constantly shifting environment, how are third party developers supposed to?

Case in point: Apple’s own apps are not all iOS 9 compatible yet. Many don’t support the iPad Pro’s new native resolution yet. Some have gained Apple Watch extensions, but most haven’t.

Reliability is a property of a system that is changed slowly and deliberately and whose constituent parts are themselves reliable. The Mac and all other Apple platforms are currently undergoing the worst dip in reliability since Mac OS X was introduced.

Apple is pushing out half-baked annual releases of all its software products, as well as apparently completely unmanaged changes to policies, external rules and cloud services at an ever more frenetic pace.

These could be written off as temporary “growing pains”, but the big question is: Do all these annual updates equate to real progress?

When I switch on my Mac today, I use it for much the same things that I used it for 10 years ago. A lot has changed. Cumulatively Mac OS X 10.11 El Capitan is somewhat better than 10.6 Snow Leopard.. and yet if you discount cosmetic changes and new hardware, nothing much has changed. Certainly nothing much has actually improved.

I can’t help thinking that if we had had 2 or possibly 3 Mac OS X updates instead of 5 over those last 5 years, we’d be in a much better shape now. Apple and developers would have time to provide user benefits and rock solid reliability rather than just endlessly chasing their own tail.

The beauty of the Mac used to be that it just worked. I want to get back to that.

A Better Finder Rename 10 on the horizon


Since the beginning of the year, I have been working flat out on version 10 of A Better Finder Rename and we are nearing the first beta release.

There are a few things that I want to get out there before the first beta ships and those are mostly to do with the Mac App Store and upgrades.

As many of you will be aware, the Mac App Store is not much loved by Mac OS X software developers, because it is very different from the “traditional” Mac indie software distribution that many of us feel is superior in very many ways.

Nonetheless most of us “old timers” have made our software available on the Mac App Store due largely to popular demand. Clearly the Mac App Store is better for some customers.

A few years ago, the Mac App Store started to demand that all applications be sandboxed and that was the beginning of the end for many professional productivity applications on the Mac App Store.

Sandboxing itself is a good idea. In a nutshell, it just means that applications cannot access your entire computer, but are restricted to a “safe” container with their own memory and disk space. Access to anything outside that “container” needs to be specifically allowed either by the user or by Apple during their review process.

Many categories of software (i.e. games) work very well in their sandbox, but most professional applications require fairly unfettered file system access, inter-application communication and/or internet access.

Tools such as BBEdit (an awesome text editor), TextExpander (an awesome snippet expander), Panic’s Coda (an awesome web development tool) and many others (many of them awesome) are leaving the Mac App Store because of these limitations.

A Better Finder Rename 9 is in the app store as “Better Rename 9” and we have managed to keep it non-sandboxed by only shipping “fixes” and no major upgrades for years.

By its very nature, a file renaming tool needs unfettered access to the file system. There’s no chance of Apple granting us an “entitlement” to do that during the review process. The reason is that this pretty much defeats the objective of being sandboxed in the first place.

In the idealized sandbox world, it is the user who implicitly grants permission to manipulate files by selecting them in an Open File… dialog or by just drag & dropping them onto the application or its icon. This works fine for our other file utilities such as File Multi Tool and A Better Finder Attributes, but not for A Better Finder Rename.

The reason for this is simple: on a Unix system such as Mac OS X, the name of a file is not stored in the file itself but in the folder that contains it. Dragging & dropping files only gives access to the file and not to its “parent folder”, so you can change everything except its name.

We would thus either need to ask you to give Better Rename 9 permission for the parent folder every time you want to rename something, or store that permission somewhere after the first time. Alternatively, we could ask you to give us permission for the entire disk.

This is not an elegant solution and Apple may or may not accept it. Having played around a bit with other programs that have similar problems, it seems that Apple would most likely allow this kind of “hack” where the program brings up an Open File… dialog and says “Sorry I want to access this file but I can’t, please select it for me!”. Yuck.

We have most of the code necessary to do this and are ready to ship it, but it will undermine the usability of the tool, so we are not certain whether we will continue to support a Mac App Store version beyond version 9.

The next huge problem is how to implement paid upgrades. Nobody wants to pay for upgrades, but upgrade revenue is important for developers and customers alike. The economics of software development are currently bifurcated: you have the traditional developers such as Panic, OmniGroup, BareBones and Red Sweater, to name but a few, who diligently plug away at making their products ever more awesome.. and then you have the newer App Store-generation authors who create an app, launch it, get a good pay day (or, more likely, not), then see revenues collapse and move on to the next app.

Abandonware is okay for some categories of software. Who cares whether Flappy Birds gets updated for iOS 9? Professional software, however, is used for more than mere entertainment. Customers buy into professional software, learn how to use it and expect to be using it for many years to come. They expect the software to be supported, bugs to be fixed, and the product to work with the latest operating system version and to continuously evolve with their own growing needs.

A Better Finder Rename was first published in 1996 on System 7 running on PowerPC-compatible Macs and has constantly evolved since. Version 10 is the most awesome version yet and contains many improvements that would have been just as relevant in 1996 as they are today, as well as many that nobody could have predicted back then. At this point, it has probably broken through 100,000 hours of development and support time. A substantial amount of this time was paid for by upgrade fees.

Paid upgrades have another crucial advantage for long term customers: while fire-and-forget developers optimize for immediate appeal, paid upgrades are almost always targeted squarely at the needs of long term users. It’s a different mind set: The success of a new app depends on how many people buy it now, the success of a paid upgrade depends on how many people are willing to pay for the improvements and the new features.

Paid upgrades are great for professional level software because they allow software developers to spend time addressing the needs of existing customers. That’s why it’s particularly troubling that Apple does not allow for any upgrade pricing on the Mac App Store.. and that’s why developers like me are not very happy about it.

Apple makes the lion’s share of its revenue on hardware. Software for them is something that makes people buy their hardware, so they can afford to give their software away for free to make you buy more hardware.

Indie software developers are only selling software and don’t get a cut from the hardware sales. In fact if we sell through the Mac App Store, Apple gets a 30% cut of our revenue and that’s after sales tax in most countries (though not most parts of the US). For a 19.99 EUR sale in Germany for instance, a developer only gets 11.76 EUR paid out; the missing 41% goes to Apple and the German VAT office. After the tax office and social security payments here in Luxembourg, there is less than 6 EUR left for me of any one sale of Better Rename 9.
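As a sanity check, the arithmetic above can be reproduced in a few lines (assuming 19% German VAT deducted first, then Apple’s 30% commission on the net-of-VAT amount):

```python
# Net developer payout for a 19.99 EUR Mac App Store sale in Germany.
# Assumes 19% German VAT and a 30% Apple commission on the net-of-VAT price.
gross = 19.99
net_of_vat = gross / 1.19            # VAT comes off first: ~16.80 EUR
payout = net_of_vat * (1 - 0.30)     # Apple's 30% cut leaves ~11.76 EUR
missing = (gross - payout) / gross   # share going to Apple and the VAT office
print(round(payout, 2), round(missing * 100))  # 11.76 41
```

Which matches the figures quoted: 11.76 EUR paid out, with roughly 41% going to Apple and the German VAT office.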

Apple has changed software pricing quite dramatically, on mobile devices but also on the Mac. They started by offering iLife (iPhoto, iMovie, etc.) for $19.95, then added iWork (Pages, Numbers,…), again at bargain basement prices. At those price points, just charging you another $19.95 every year is perfectly fine. In the end, all those products are now completely free and Apple makes all its money off the hardware.

Most importantly, Apple has never had to finance the development of those software titles through their actual purchase price. They produced these titles to sell $2,000 MacBooks and iMacs, not for the sake of the $19.95 upgrade pricing. Not surprisingly, none of those applications has seen real effort put into maintaining backwards compatibility or expanding their feature sets. They are entry level applications because Apple has no real interest in driving their development forwards.

Unfortunately, neither charging the full price for each upgrade nor making upgrades free works for applications such as BBEdit, OmniFocus, Coda or indeed A Better Finder Rename. I don’t want to ask customers to pay another $19.95 for A Better Finder Rename 10, but I can’t afford to make it free either.

Many developers have tried to overcome this problem in a variety of creative but imperfect ways. The ball has been in Apple’s court for years, but it’s very clear that they don’t mean to ever pick it up.

I’m not excluding bringing A Better Finder Rename 10 to the Mac App Store eventually, but in a first phase, A Better Finder Rename 10 will only be available from the website.

Our upgrade terms have always been quite generous: paid upgrades cost 50% of the initial purchase price and are fairly infrequent (every 2-5 years); you can get a forever upgrade, which costs 100% of the initial purchase price; and if you have only recently bought the product, you get a free upgrade.

For A Better Finder Rename 10 the upgrade terms are as follows:

  • if you have purchased A Better Finder Rename or Better Rename 9 after the 1st of January 2015, you get a free upgrade
  • if you own a forever upgrade, you get a free upgrade
  • otherwise, you have to purchase a discounted paid upgrade
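In code, the eligibility rules above amount to a simple check. This is only a sketch: the 19.95 full price is a placeholder, and the real store logic is of course more involved.

```python
from datetime import date

# Purchases on or after this date qualify for a free upgrade.
CUTOFF = date(2015, 1, 1)

def upgrade_price(purchase_date, has_forever_upgrade, full_price=19.95):
    """Return the cost of upgrading to version 10 under the rules above.

    Free for recent purchases and forever-upgrade owners; otherwise a
    discounted upgrade at 50% of the (placeholder) full price.
    """
    if has_forever_upgrade or purchase_date >= CUTOFF:
        return 0.0
    return full_price * 0.5
```

For example, a copy bought in 2016 upgrades for free, while a 2012 purchase without a forever upgrade pays the 50% discounted price.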

As you may or may not be aware, anybody who has purchased Better Rename 9 from the Mac App Store can also run A Better Finder Rename 9 from our website for free. In 99% of all cases, A Better Finder Rename will detect that you have previously bought Better Rename 9 and unlock automatically.

If it does not unlock automatically, all you need to do is to download Better Rename 9 to your machine and run it once. After that you can delete it and A Better Finder Rename will still remember that it was there once.

Once released, A Better Finder Rename 10 should automatically unlock if you have purchased Better Rename 9 from the Mac App Store after the 1st of January 2015. So if you buy Better Rename 9 from the Mac App Store now, or even after A Better Finder Rename 10 is out, you are covered. If you run into any problems, contact us at and we will sort everything out with you.

Likewise, if you own A Better Finder Rename 9 or Better Rename 9 but have bought it before the 1st of January 2015, you can buy the discounted upgrade to version 10 from the upgrade page. You can do so even before A Better Finder Rename 10 comes out.

After version 10 has been out for a while, we will reconsider whether we’ll submit Better Rename 10 to the Mac App Store complete with the crippling “please let me rename this” dialog or leave things as they are.

For us, the important thing is that no matter whether you buy on the Mac App Store or from us directly you will have access to the same versions and will not be penalized in any way.

Unfortunately, Apple does not tell us the identities of anybody who purchases our products on the Mac App Store, so we cannot contact existing customers to let them know of these arrangements.. so if anybody wants to post a comment (developers can’t leave or reply to comments) saying “you can get a free upgrade from!”, you’re more than welcome.

Image Capture Workflow Updated for A Better Finder Rename 9

Back in 2009, we published a couple of blog posts describing how to use OS X’s Image Capture with A Better Finder Rename 8:

The second post linked to an Automator workflow to use with this process, but the release of A Better Finder Rename 9 “broke” this workflow; it works only with A Better Finder Rename 8.

We have updated this workflow for A Better Finder Rename 9, so you no longer have to settle for the default import location. Instead, you may choose your import destination at run-time. Grab a copy of the workflow here:

MacBook Pro Retina external display problems & resolution

I’ve just spent an entire afternoon messing around with my MacBook Pro Retina trying to get it to work with multiple high resolution monitors and I thought I’d share my misadventures here so that other people may benefit from what I’ve found out..

I run my 2013 MacBook Pro 15″ retina in two different offices both with two external Dell 2711/2713 displays.. I love all that extra space…

Today, I tried to connect up a third 24″ display, also from Dell, using a new cable and all hell broke loose.. suddenly ALL my monitors were stuck in HDMI modes making it impossible to select their native 2560×1440 resolution. It took me three hours to figure out how to change things back.. so this is what I’ve found.

The 2560×1440 resolutions were gone from the Display Preference Pane and I was stuck on 1080p and the screen mirroring kept coming on.

Things to try:

1. Zap PRAM

There’s actually no longer any PRAM in Macs, but the newer NVRAM works the same. The NVRAM stores some basic system settings including the screen resolutions of attached monitors.

So just shut down your Mac, press the Power button and immediately hold the Command, Option, “P” and “R” keys (before the gray screen comes up), keeping them pressed until you hear the startup chime a second time.

No luck, the screen resolutions weren’t back.

2. Zap the System Management Controller (SMC)

In theory, this shouldn’t be necessary, but it’s another standard step in rectifying “weird” behaviors, so I went ahead anyway.

Shut down your Mac, then press the Shift, Control and Option keys (all on the left side of the keyboard) together with the Power button, release them all and press the Power button again to start up.

Still no change. So I spent an hour playing around with SwitchResX, which is the step that I should have skipped.

3. In the Display Preferences Pane, Option-Click the “Scaled” radio button..

Suddenly all resolutions appear and you can choose whichever you want, not just the ones deemed “safe”. Hurray!

Except there still isn’t the 2560×1440 resolution I’m after..

4. Finally, switch off the monitors while connected to the MacBook and pull their power supply cables, wait a few seconds, then reconnect the cables, switch the monitors back on and do the option-click trick.. hurray!

The 2560×1440 resolution appears and everything is fine again.

Obviously as always with this type of problem, who knows which bits are optional and which are necessary? It’s possible that if I had known about the Option-click trick and having to switch the monitors off, it would have worked straight away without a single reboot.

I’d recommend pulling the power supply cables of the monitors and trying the Option-click trick first before restarting anything. It does appear like the problem was that the monitors defaulted into some kind of “safe” mode where they don’t show any non-HDMI modes.. but as always: who knows?

I hope that somebody will benefit from this mini-report.

Best regards,


All our software is Mac OS X Mavericks ready

As you may know, Apple have released Mac OS X 10.9 Mavericks during their keynote last night.

It is available for free on the Mac App Store as of now.

Apple has given us developers early access to the new release to give us time to test compatibility and take advantage of new features. We have quietly released a few minor updates over the past months to address compatibility issues and, at the time of writing, all our current software titles are Mavericks-compatible.

As always, however, problems that are not apparent “in the lab” can start popping up when a new version is released into the “wild”, so if you encounter any problem, please let us know at and do not rely on other people to do so.. most people wait for “somebody else” to report the problem and then, months later, frustrated users post an “I can’t believe they still haven’t fixed this!” review.. while all the while developers are blissfully unaware of anything being amiss.. please just drop us a line.

There is one known problem for Vitamin-R on Mavericks that affects only multiple display setups.

Mavericks is the first Mac OS release ever to feature “multiple menu bars”; this is to make it easier to select items in the menu bar without having to mouse back to the main screen all the time. Unfortunately, Mac OS X was never designed to work that way and while Apple’s implementation works it leaves much to be desired.

The main problem is that menu bar items, such as Vitamin-R’s (R) icon, appear in multiple different places at once, but Apple hides this fact from the program. As a result, a program only gets to know one location for its menu item, even though it exists in two or more places. On top of that, which location the program is made aware of seems pretty random and is not documented.

All this makes it hard to position popup menus such as Vitamin-R’s or indeed Fantastical’s or Dropbox’s accurately.
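One workaround (a simplified sketch, not Vitamin-R’s actual AppKit code) is to ignore the single, unreliable item location the system reports and instead anchor the popup to whichever display currently contains the mouse pointer:

```python
def screen_containing(point, screens):
    """Return the screen rect (x, y, w, h) that contains point.

    Falls back to the first (main) screen when the point lies on none of
    them, mirroring the usual habit of defaulting to the main display."""
    px, py = point
    for rect in screens:
        x, y, w, h = rect
        if x <= px < x + w and y <= py < y + h:
            return rect
    return screens[0]

# Two side-by-side 2560x1440 displays: a click at x=3000 lands on the
# second screen, so the popup should be positioned there.
screens = [(0, 0, 2560, 1440), (2560, 0, 2560, 1440)]
```

Anchoring to the click location sidesteps the question of which of the duplicated menu bar items the system happened to report.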

We’ve done our best, but with only a single beta tester, we are not entirely certain that it works for everyone. We thought we’d have a little more time to test this before releasing it, but with yesterday’s surprise announcement, we’ll be releasing an updated version of Vitamin-R today.

Let us know if you run into problems!

Why the Mac Pro is perfect for Mac and iOS developers.

I love my Mac Pro. There I said it.

I’ve been a user of Apple’s pro desktop line since the PowerMac G3 (blue & white stripes) came out in 1999. It was the first Mac that I bought with money that I had “made” from Mac shareware sales.

Back in the late 90s, I had grown accustomed to working on Sun and Silicon Graphics Unix “workstations” at the Computer Science department at Lancaster University and the PowerMac to me felt like my “personal workstation”. There was something great about having “the best”; it made me feel all grown up.

Now the PowerMac G3 by today’s standards looks plain awful, but so does the transparent plastic iMac from which it borrowed much of its design. You could legitimately feel good about “thinking differently”, especially since Apple back then was part of the IT counter-culture.

The PowerMac G3 was quickly replaced by the G4 and G5, of which I owned quite a few due to the more powerful PowerPCs’ tendency to auto-combust at the drop of a hat. Back then there was a lot of PowerPC versus Intel mud-slinging going on and it was kind of fun trading floating point performance figures with my brainwashed PC-adoring friends.

That all changed when Apple ditched the “superior” PowerPC processors and migrated to Intel CPUs. Funnily enough, the renamed Mac Pro was a powerful machine and the Xeon processors just refused to blow up. I’ve owned a couple over the intervening years and I have lost only a single one to a lightning strike (no joke).

Ever since the stellar success of the iPod, Apple has morphed from being the “Mac company” to being the “iPod-company-that-also-makes-Macs”. The success of the iPhone and the iPad have further removed the Mac from the limelight and for a while one could have been forgiven for forgetting that Apple made computers at all.

For me as a Mac developer since 1993 and for many people like me, Apple will always remain the Mac company. We do love our Macs because we spend all day working on them. Big screens, powerful processors and gigabit ethernet cables may sound antiquated to all you iPhone-, iPad- and MacBook Air-touting hipsters but for us, the Mac is the hub of our professional lives.

That’s why people like Marco Arment and, dare I say it, myself are so attached to our Mac Pros. They just make our lives so much better.

Ask anybody who knows me: I’m not a patient man. I hate waiting around while Xcode is double broom-ing my project and while the antiquated little spinning disks are compiling, linking and starting my executables. Instant is quick enough. Anything else just breaks my flow.

Developing software is largely about keeping an awful lot of state in your head at the same time, while acting locally and thinking strategically. It’s hard. Real hard. A big screen, or better yet, two big screens really help you keep all this stuff organized.

The same goes for keyboards and other input devices. Most people are well served with a laptop keyboard or even a touch-screen keyboard. It’s fine for writing email.

Many developers, however, have taken the time to learn to touch-type. Some like myself have even learned the DVORAK layout in a quest for performance and a vain attempt at preventing RSI. The great thing about touch-typing is not that you can type quicker though, but that you can type without taking your eyes off the screen. In that way, you can keep all the stuff that you need spread out in front of you and let your thoughts just flow out of your finger tips.

Working on a laptop is pure torture once you have become used to a full workstation-style setup. The tiny screen means that all the information that you are used to surround yourself with is hidden from you. It feels like you see the world through a pin hole. The keyboard makes it hard to work comfortably because you are hunched over it all the time and the whole experience feels like you are trying to play a miniature guitar. The frets are too close together, your giant fingers keep muting the strings.. your music sucks.

The Mac Pro then is the developer’s tool of choice. It may not be mobile, but neither is the developer. Working in a busy coffee shop is great some of the time, but once you get into the zone, you don’t want to be disturbed. You probably don your sound-cancelling headphones and the only thing that the rest of the world can do for you now is to deliver an espresso once an hour (or more) and leave you alone.

The most important things for a developer’s machine then are the ability to connect multiple large displays, plug in a “regular” keyboard and mouse and provide as much raw power, storage and connectivity as it is possible to engineer at this point in time.

This is in stark contrast with the other group of people who love their Mac Pros: video professionals, who need a lot of non-standard hardware: huge disks and the most powerful graphics cards available to mankind. While big disks are a plus for us developers too, we care more about how fast they are than about how many terabytes of storage they give us. I certainly wouldn’t trade my 512GB of SSD for 20 terabytes of spinning disk.

The new Mac Pros that Apple teased at this year’s WWDC are a great fit for developers.

Video professionals will bemoan the lack of built-in storage bays and the inability to add bigger graphics cards may well be a deal breaker for many of them. For developers, however, those two things don’t really matter.

The new Mac Pros deliver in spades for developers where it matters:

  • Master-of-the-Universe Wow factor
  • Multiple 4K displays
  • Really fast storage
  • Superior CPU performance
  • Whisper quiet operation

Yes, the new Mac Pro looks the part. I have no doubt that setting up one of these machines in your office will give you a buzz and make you feel good about your choice of career. The aesthetics of the new Mac Pro are clearly designed to appeal to our demographic (young and not so young men who fancy a BMW M3) and even steadfastly anti-Apple developers have to admit that they just want one..

Any developer who has had the opportunity to work on one of the new MacBook Pro Retina displays is impatiently waiting for Apple to finally bring the gorgeousness of those displays to larger screens. Right now, the price of 4K displays is so prohibitive that I can’t see this happening for a while yet, but I think Apple has paved the way for just that.

One of the major technical challenges of making a full-size retina display is moving that many pixels around. The new Mac Pro has dual graphics cards as standard: a curious choice, unless you’re planning on driving retina displays. Apple talked about “third party 4K displays” at WWDC, but one would imagine that they must have such a display in the works themselves, no matter how expensive it might be.

All PowerMacs and all Mac Pros so far have come in multi-processor configurations; Apple also made some single-processor models, but those were really more of an exception.

Yesterday, a benchmark appeared that showed a single-processor Mac Pro and this caused quite a stir, because, like me, most people had assumed that the new Mac Pro would still have two processors.

Careful examination of the videos on Apple’s teaser website, however, seems to show room only for a single CPU and two graphics cards. A curious choice, until you start thinking a bit about it.

Xeon processors are really expensive, easily costing $1,000 and more per unit. If you put two in a box, you are in old Mac Pro territory before you even add anything else. Yet CPU performance for many tasks is becoming less and less important.

In typical development tasks, it is the hard disk that is the limiting factor. Apple have eschewed spinning disks altogether for the new Mac Pro. Again a strike against those poor video editors. It looked like a foregone conclusion that Apple would leverage its fusion drive technology to offer fast disk access coupled with huge storage capacity.

Again for developers, this hardly matters. What matters is that SSDs speed up build times in a way that a faster CPU just can’t manage. Still, two CPUs are faster than one. So why just one? I think there are two reasons: cost and fan noise.

Hands down, the main reason why not every developer has a Mac Pro on the desk is the price. They are really expensive, largely because the Xeon CPUs that they sport are such high-margin processors. Fitting only a single processor drops the price of a Mac Pro by at least $1,000. It also makes heat management so much easier..

Fan noise is another one of those things that many people don’t particularly care about. Alienware gaming PCs (and their laptops for that matter) can sound like leaf blowers and many Mac Pros in the past have had annoying fan noise. While fan noise really doesn’t matter all that much on a gaming machine, where it gets drowned out by the sound of explosions, it does wear on you when you’re trying to do highly concentrated work. So developers such as myself do care.. as clearly do Jony Ive and Apple, and in many ways the wind tunnel design of the new Mac Pro is its stand-out feature.

Many people are asking the obvious question: why does it matter so much to Apple that its Mac Pro is so compact?

Mac Pros have always had handles. Yet they have rarely been moved, being far too heavy to ferry about easily, so we are quick to dismiss the presence of a handle as just a nice design touch. On top of that, Apple’s obsession with build height often takes on ridiculous proportions: a really thin iMac!? Who cares!?

Yet, this time around I think the handle might be part of the equation. The new Mac Pro is actually portable if not exactly mobile.

I’m not alone amongst Indie developers to share my time between my home office and a co-working space or a dedicated office. Like many developers I’m trying to keep parity between my home and my work setup so that I can get straight on with work, no matter where I’m currently at.

I’m mostly using my retina MacBook Pro right now, but for all the reasons outlined above rather than working on it as a laptop, I plug it right into a dual monitor setup either at home or at the office. The fact that the MacBook Pro has its own screen and keyboard only comes in handy from time to time. Mostly it’s just a portable Mac and prevents me from having to break the bank by buying two powerful machines. The new Mac Pro is small and light enough to be comfortably transported from one place to another in the back of the car and once there it can be plugged into a Thunderbolt display in just the same way that my MacBook Pro can be. On top of that you don’t have to pay for the screen and the keyboard every time you upgrade it.

In other words, the new Mac Pro is pretty much perfect for Mac and iOS developers who want a lightning fast desktop machine, but don’t want to re-buy a screen every time. It’s more portable than an iMac and its air-tunnel design allows it to house much more powerful hardware than would be feasible in an iMac or Mac mini enclosure. On top of that, it may not be quite as expensive as previous models because it houses a single CPU. Whether the dual graphics cards setup really makes sense will only become clear once Apple unveils its first 4K display.

I can’t help thinking that Jony Ive designed this machine more for the participants of WWDC and the extended developer community, than for Apple’s traditional market of creative professionals.

The Mac Pro then won’t make many video editors very happy, but is a great machine for Mac and iOS developers and I suspect that is just as it was planned.

Vitamin-R 2.0 Upgrades & Mac App Store

It’s been almost 3 years since Vitamin-R was first released into the public eye.

Those who experienced version 0.01 beta 1 (!) can testify to how much the product changed between then and the 1.0 release and of course Vitamin-R has never since stood still for more than a few weeks. In total, we have up to this point released no fewer than 109 updates and I think you would agree that it’s now nearly time for the big 2.0 release.

If you own any of our software, you will have noticed that we release features as soon as they’re ready. This helps us create a better product as we get feedback earlier, and it also gets new features into your hands quicker.

The price we pay for this practice is that we don’t get to do the “big reveal” when the time comes to ask you for an upgrade fee.. but rest assured that we are making an extra special effort to make 2.0 more than just another point update!

Many people hate upgrade fees and we accommodate these people by providing “forever upgrades” both with your initial purchase and at any point after.

While upgrade fees may be a little painful, they are instrumental in ensuring that a product meets the requirements of experienced users. Without them there is little economic incentive to develop a product beyond what is necessary for its immediate appeal. It is no coincidence that most iPhone apps get used only a couple of times before they are forgotten forever.

Without upgrade fees there’s no economic incentive for developers to look past the moment of the sale, leading to software that is optimized for immediate appeal but fails to live up to its promise shortly thereafter. We want none of that. With Vitamin-R we want to introduce you to a more productive and enjoyable way of working and support you at every stage of your journey. In order to do so we want to furnish you with the tools to evolve your own style. This means making Vitamin-R highly customizable and leveraging your usage information to provide you with insights into your own work patterns; neither of which does much to increase the immediate appeal of the product to prospective new clients.

Please note that none of this means that we intend to make Vitamin-R more complicated. On the contrary, streamlined operation is even more important for experienced users than for novices.

In the past, we have always given customers very generous “grace periods”, meaning that if you bought the product shortly before a major upgrade, we would grant you a free upgrade.

Unfortunately in the changing world of Mac software development this is no longer so easily done.

As you may know, the Mac App Store, through which many copies of Vitamin-R are bought, does not offer any support for paid upgrades. Instead, Apple charges full price for major new versions of its own software such as “Pages”, “Numbers”, “Keynote”, “Final Cut Pro”, “Aperture”, etc.

This makes a lot of sense for Apple who operate on the “big reveal” model more than perhaps any other company in history and who of course make money on the software sale, their 30% App Store processing fee and on hardware sales.

This puts software developers like us into an awkward position.

We are masters of our own web stores and can continue to offer discounted upgrade pricing, forever upgrades and “grace periods” and we will.

On the Mac App Store, however, it’s Apple’s rules all the way. There are no discounted upgrades, no grace periods and we do not even know the identities of the people who buy our software.

All third party software developers are facing the same problems. Some decide to go Mac App Store only. Some decide to stay off the Mac App Store altogether. Most, like us, are trying to mitigate the problem as much as we can.

We realize that not everybody will be happy with our solution, but what we have decided to do is the following.

To Get A Discounted Upgrade to Vitamin-R From Our Web Site / To Take Advantage of the “Grace Period”

1. Direct and Mac App Store customers alike will be able to buy a discounted upgrade from our web store via

2. Direct customers who have purchased Vitamin-R after the 1st of January 2013 will be able to obtain a free upgrade code to version 2 from

3. Mac App Store customers who have bought Vitamin-R after the 1st of January 2013 will be able to obtain a free upgrade by mailing their iTunes Store receipt to

This is much the same retrofitted solution that OmniGroup are going to apply to OmniFocus 2 upgrades.