Some things I wish I had known before starting to automate Mac Developer ID Notarization

It’s Day 5 of Notarization Week and it’s time to wrap up and write down my experiences.

Notarization itself is not incredibly difficult. You can learn the basics by watching the 40-minute talk from WWDC 2019. Unlike sandboxing, notarization should not have any detrimental effects for most Mac apps.

As always, the real trouble starts when you try to inject Notarization into the tangled web of modern Mac software development: entitlements, certificates, automated Xcode build chains, build settings, etc.

First you need to adopt the “Hardened Runtime” for your application. For the two apps that I tested with, this was simply a matter of switching it on in the “Capabilities” tab of your target. By default, all the hardened runtime features are switched on and I was able to leave them all on without any problem.

The first gotcha is that you can’t really test your application’s compatibility with the hardened runtime in Xcode, because it will run in debug mode. Since the hardened runtime would not allow inspection of your code, the default “CODE_SIGN_INJECT_BASE_ENTITLEMENTS=YES” build setting will inject the “com.apple.security.get-task-allow” entitlement into the debug version of your build product. This is a “normal” entitlement, just like those used for sandboxing.. and no, the sandbox does not need to be turned on for notarization to work (sigh of relief).

Another gotcha is that your app will not be notarized as long as this entitlement is switched on, so we need to turn it off for the release build. This should not be a worry, but you will probably spend many frustrating hours chasing down this very problem nonetheless..

The next thing on the compliance list is that secure timestamps for codesign need to be turned on. Many developers have a “--timestamp=none” flag somewhere in their build settings.. because the Apple timestamp servers are slow and often down (at least here in Luxembourg), and it means you can no longer build a release without an internet connection. So if you have a build server without an internet connection.. that is about to change. To make doubly sure, you should probably add “OTHER_CODE_SIGN_FLAGS='$(inherited) --timestamp'” to your build settings.
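For concreteness, here is roughly what those two settings might look like in an xcconfig file for the release configuration (the file name and scoping are illustrative; you can just as well set them in the target’s build settings UI):

    // Release.xcconfig (illustrative sketch)
    // Always request a secure timestamp when codesigning.
    OTHER_CODE_SIGN_FLAGS = $(inherited) --timestamp
    // Don't inject the get-task-allow debug entitlement into release builds.
    CODE_SIGN_INJECT_BASE_ENTITLEMENTS = NO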

In this context, it would have saved me a lot of time if I had known how to find out whether a product has in fact been signed with a secure timestamp. Executing “codesign --verify --deep --strict --verbose=4 --display -r- /path/to/my/product” will display loads of things. If there is a line with “Signed Time” among the output, that means that you did not sign with a secure timestamp. If there is a line with “Timestamp” in it, it means you do have a secure timestamp. It’s another brilliant example of how an Apple engineer’s language choice can cost tens of thousands of lost developer hours. “Signed Time (insecure)” would have been a great help.
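If you want to automate that check, something along these lines should work (assuming the output wording described above; note that codesign writes to stderr):

    # Rough sketch: show only the timestamp-related lines of the codesign output
    codesign --verify --deep --strict --verbose=4 --display -r- "/path/to/my/product" 2>&1 \
      | grep -E 'Timestamp|Signed Time'
    # "Timestamp=..."   means a secure timestamp is present
    # "Signed Time=..." means it was signed without one; re-sign with --timestamp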

In a similar vein, “codesign -d --entitlements :- /path/to/my/product” displays all the entitlements for the product and will reveal the dreaded “com.apple.security.get-task-allow” entitlement if it is still present.

Once you have a build product, you can send it to Apple for notarization with “xcrun altool --notarize-app -f /my/archive --primary-bundle-id my.primary.bundle.id -u username -p app-specific-password”.

This is where things get a little weird. You can send either a disk image or a zip archive, but not an application bundle. I distribute my software as disk images and my software updates as zips. If you send a zip file, make sure that you use the “ditto” tool as instructed by Apple, so that you don’t run into problems with extended attributes. You need to supply your username (email address) and a password. You can generate an application-specific password and that worked fine for me straight away.
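For reference, the ditto invocation that Apple recommends for producing such a zip looks roughly like this (the paths are placeholders):

    # Zip the app bundle while preserving extended attributes and the parent folder
    ditto -c -k --keepParent "build/Release/MyApp.app" "build/Release/MyApp.zip"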

The command line tool will upload the archive and then return a “request id”, a UUID that you can use to look up the state of the notarization. This is not a real-time, synchronous affair. It was fairly quick when I used it, usually taking only a few minutes, but it is obviously a challenging problem for automation. You could write a script that extracts the request id and then polls the Apple servers for its status before continuing, but realistically you probably want a two or three stage build process now.
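Here is a hedged sketch of what such a polling script might look like; the altool flags are the ones current at the time of writing and the awk patterns assume the usual “RequestUUID” and “Status” wording of altool’s output:

    #!/bin/sh
    # Sketch: submit an archive for notarization, then poll until Apple reports a result.
    USER="name@example.com"            # placeholder Apple ID
    PASS="@keychain:AC_PASSWORD"       # app-specific password stored in the keychain
    ARCHIVE="build/Release/MyApp.zip"  # placeholder path

    REQUEST_ID=$(xcrun altool --notarize-app -f "$ARCHIVE" \
        --primary-bundle-id "com.example.myapp" -u "$USER" -p "$PASS" 2>&1 \
        | awk '/RequestUUID/ { print $3 }')

    STATUS="in progress"
    while [ "$STATUS" = "in progress" ]; do
        sleep 60
        STATUS=$(xcrun altool --notarization-info "$REQUEST_ID" -u "$USER" -p "$PASS" 2>&1 \
            | awk -F': *' '/^ *Status:/ { print $2 }')
    done
    echo "Notarization finished with status: $STATUS"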

I subdivided my own build process into three phases: build, notarization request, and a combined stapling and verification phase.

Which brings us to stapling, which is the fun and easy part. You just type “xcrun stapler staple my.dmg” or “xcrun stapler staple my.app” and that’s that.
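If you want your build script to double-check that the ticket really got attached, stapler also has a validate sub-command:

    # Staple the notarization ticket, then confirm that it is attached
    xcrun stapler staple "build/Release/MyApp.dmg"
    xcrun stapler validate "build/Release/MyApp.dmg"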

One thing to note is that the entire notarization process is completely free of build and version numbers, which is so wonderful. If only app review worked this way! There is no documentation of how it works; it could be that Apple identifies your build by a hash of the entire uploaded archive. In any event, there is zero problem with building a thousand different versions of your program and getting them all notarized.

The second thing to notice is that you can either staple app bundles or disk images, but not zip archives. Not sure which is weirder, but it kind of makes sense. In practical terms, this means that you can staple your notarization receipt to a dmg without having to open it, which is super easy. If I have understood this correctly, this means that both the dmg and the app are stapled and will open without any funny user warnings. Not being able to staple zip files, however, complicates things somewhat, because you now have to zip the app bundle to notarize it, staple the original unzipped app bundle and then re-zip it.
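Put together, the zip-based flow might look something like this rough sketch (the paths are placeholders, and it assumes the notarization of the uploaded zip has already succeeded):

    # 1. Zip the app bundle for upload, preserving extended attributes
    ditto -c -k --keepParent "MyApp.app" "MyApp-upload.zip"
    # 2. ...submit MyApp-upload.zip for notarization and wait for approval...
    # 3. Staple the ticket to the original, un-zipped app bundle
    xcrun stapler staple "MyApp.app"
    # 4. Re-zip the now-stapled bundle; this is the zip you actually distribute
    ditto -c -k --keepParent "MyApp.app" "MyApp-release.zip"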

So far so good. Now enter the much dreaded Sparkle.framework, the foundation of all automated software updates across the Developer ID world, maintained by a clever, intrepid group of volunteers that deserve our eternal gratitude.. and the bane of my developer life.

For most of my products, Sparkle is the only framework that I bundle, so I blame it for the entire dreaded complexity and wasted time of framework signing.. which is a lot of blame. Signing frameworks is hell.. or used to be hell.. and now is hell again.

I don’t use Carthage or other “download stuff from all over the internet written by who knows who, run buggy build and install scripts and then ship the whole lot with my product” build systems. I actually just place a binary copy of the framework into the /Library/Frameworks/ folder and work with that. If you are using one of those build systems, you probably will have different problems.

The current (as of 26 July 2019) binary distribution of Sparkle is neither signed nor built with the hardened runtime, so it is unusable for notarized apps. Downloading the source as a zip archive leaves out crucial files. So I did a “git clone --recursive https://github.com/sparkle-project/Sparkle” to get what I assume must be the master branch version (I have some deeply strange git expertise that overlaps with nobody else’s).

Building it with “make release”, despite affirmations to the contrary, did not result in a hardened version. One of the worst things (I’m pretty sure it’s unavoidable and I’m not dissing its developers at all, but it is still absolutely dreadful) about Sparkle is that it includes two executables as well as the framework. Autoupdate.app and fileop always cause incredible signing headaches. The default option of just ticking the “Sign upon copy” option in Xcode won’t sign these properly and you inevitably end up with Gatekeeper problems.. even though it had just gone through a phase of actually working.. but no more.
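A common manual workaround, which is not what I ended up doing (more on that below), is to re-sign Sparkle’s helpers yourself, inside-out, with the hardened runtime and a secure timestamp; the identity and the Sparkle 1.x bundle paths below are placeholders:

    # Hedged sketch: re-sign Sparkle's nested executables with hardened runtime + timestamp
    IDENTITY="Developer ID Application: Your Name (TEAMID)"
    SPARKLE="MyApp.app/Contents/Frameworks/Sparkle.framework"
    codesign --force --options runtime --timestamp --sign "$IDENTITY" \
        "$SPARKLE/Versions/A/Resources/Autoupdate.app/Contents/MacOS/fileop"
    codesign --force --options runtime --timestamp --sign "$IDENTITY" \
        "$SPARKLE/Versions/A/Resources/Autoupdate.app"
    codesign --force --options runtime --timestamp --sign "$IDENTITY" "$SPARKLE/Versions/A"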

I’m sure that at the heart of all my signing problems is a lack of understanding, aka ignorance. The thing is that I’m a Mac developer, not a cryptography geek. Knowing just enough to get by in the context of cryptography means knowing quite a lot about quite a few things, followed by endless trial and error that eventually ends for unknowable reasons.

After a very long time, I finally got a Sparkle build that I could use by opening the project in Xcode, adding “OTHER_CODE_SIGN_FLAGS='$(inherited) --timestamp'” and “CODE_SIGN_INJECT_BASE_ENTITLEMENTS=NO” to every relevant target and manually adding my Developer ID signing identity to all targets. I have no idea why this was necessary; as far as I understand it, the framework does not need to be signed at all, and will in any event be re-signed when it is copied into my app, but it would not work without this. Perhaps the entitlements only get added during signing?

I then spent most of a day chasing down the origin of the “com.apple.security.get-task-allow” entitlement on the “fileop” executable, which steadfastly refused to go away despite there being no debug build and despite having plastered the “CODE_SIGN_INJECT_BASE_ENTITLEMENTS=NO” build setting everywhere throughout the Sparkle project. Around 11PM, I decided to just delete Xcode’s “DerivedData” folder (what else was there left?).. and that promptly solved the problem.

With the Sparkle problems solved, the rest was fairly straightforward.

All told, I’ve spent an entire week learning about Notarization and integrating it into my build system. It’s not badly designed. In fact it works fairly well and I would even go as far as calling some of the design decisions enlightened. It is certainly a lot better thought through than either App Review or the Sandbox.

Unfortunately, it adds yet more complexity to an already incredibly complex development environment. Today’s apps are not much more complex than those from the 1990s. Phone apps are mostly much less complex. It should be easier to develop for the Mac today than it was back in the 1990s. Yet nearly 30 years of development tool, framework and API progress has yielded a development context that is no more productive and far more complex. Notarization adds yet another layer of complexity and yet another time sink for Mac developers.

There are some positives: Apple can now revoke specific builds of an app, rather than just turning off all apps from the same developer ID. The hardened runtime gives developers the possibility of shielding their software from malicious modification, while letting them decide which “holes” need to be blasted into the runtime for the program to continue working. Actually scanning apps for malware adds peace of mind when you release a program into the world.

In an ideal world, Apple would turn around and ditch its Mac App Store sandbox requirement. It could even offer notarization as a way to side-load software on the iPad. After all, notarization gives it the tools to prevent malware from being published and to switch it off on every single Mac in the world should it get through anyway.

As a long time Mac developer (since 1994), however, I can’t help thinking that the security people at Apple would have done better ironing out the bugs and limitations of the sandbox to get it to work properly and be less of a nuisance, rather than adding yet another security approach.

If early reports about Catalina are to be believed, it looks like there are so many people working on Mac security that they have to roll out new security features at each release, whether they are a net benefit to users or not. Perhaps these people could be tasked with making macOS great again instead?

Re-Learning Touch Typing with the Workman Layout

I learned touch typing when I was in my mid-teens and WordPerfect was the new hotness on DOS. It got me into a fair amount of trouble more than a decade later when I was writing up my PhD thesis and I developed my first proper RSI symptoms. As I mentioned in the previous post, it was the combination of two main ingredients, switching to my beloved Kinesis Advantage keyboard and the Dvorak keyboard layout that saved my hands and career.

I have used that combo to write and code for two decades now.. and yet I’m writing these words on a laptop keyboard using the Workman layout.

First things first. I’m not an über-typist. I think at the peak of my Dvorak typing I got to 80-90 words per minute, which is fast but not exceptional. I measured myself at 75wpm before starting my new adventures in touch-typing, which is just fine, because I can’t think at more than perhaps 60 wpm anyway.

Something I have realised over time is that maximum performance is not nearly as important as comfort when typing. My main success criteria for a keyboard arrangement are:

  • must feel comfortable
  • must minimise strain on my body, thus preventing injury
  • must let me concentrate on what I’m writing, not how I’m writing
    • for me that means that I need to be able to keep my eyes on the screen at all times and my fingers need to be able to find the keys without distracting me
  • must be able to keep up with my thoughts
  • must be easy to navigate and edit text
  • must enable me to use keyboard shortcuts easily

My current quest for a new keyboard layout was triggered by the fact that I want to be less dependent on my desktop setup and be able to work effectively in coffee shops and similar settings.

Unfortunately laptops come with the standard crappy staggered key arrangements and there is precisely zero hope that Apple is ever going to come out with a matrix keyboard on a laptop. So you’ve got to make do with what they give you..

I have at various times tried to use Dvorak on a MacBook keyboard, but never with any real success. My fingers have memorised the key positions on the Kinesis’ straight rows of keys and I mishit the keys on the bottom row almost constantly. All this is made worse by the fact that I’m subconsciously peeking at the QWERTY labels on the keys because the screen is right above the keyboard. The actual labels on the keyboard become especially irresistible when I reach for a keyboard shortcut when my fingers have not been resting on their home row positions.

Eventually I settled on hunting and pecking on the laptop and just living with QWERTY. It wasn’t a huge deal because I was a very occasional laptop user.

A year ago, however, all that changed because I fell in love with the 12″ MacBook. It quickly pushed my iPad out of my day bag and I found myself writing code and answering email on the go.. and with that came my dissatisfaction with not being able to touch type on it. On my desktop my thoughts just magically flow through my fingers onto the virtual paper while on the laptop I’m plodding along at quarter speed..

So I decided to bite the bullet and re-learn QWERTY touch typing. I got to 45 wpm after about a week and I felt that this would be perfectly fine. While it still felt slow, I felt certain that by just sticking with it I was surely going to get faster with time. Six weeks later, however, I was still no faster and, more importantly, it still felt awful. All those finger contortions; the fact that the most frequently used keys are in the least accessible places and, most importantly, the God-awful rhythm.

One thing that most Dvorak users note is the nicely flowing rhythm of the layout. Most of the time when you type in Dvorak, successive keys are on alternating hands. One hand presses a key while the other is getting in position. The very fastest typists tend to be Dvorak users (sustained 150 wpm for 50 minutes, peaking at 212wpm) and I think that the fact that the hands alternate so often might be a key factor in that. Dvorak is to QWERTY in that respect as the Brandenburg Concertos are to slowly scratching chalk over a blackboard.

The Colemak partially optimized keyboard layout

This got me started on learning the Colemak layout. This is a “modern” optimized keyboard layout that claims to be faster and “more optimal” than Dvorak. It scores over Dvorak in a number of ways, but most significantly it is much easier for QWERTY users to learn. Known as a “partial optimization”, it relocates only some keys, concentrating on getting the most frequently used keys under your fingertips on the home row. Particularly on the bottom row, keys stay pretty much where they were. This also means that the most common shortcuts stay in the same positions, so the copy, cut, paste and undo problems that I experienced previously simply do not arise. As far as possible, keys are also typed with the same hand as in QWERTY. This is especially significant for using the Shift key properly, as this requires coordination with the opposite hand.

I found Colemak much more difficult to learn than QWERTY, because everyone already has QWERTY stored somewhere in their brain. Still, Colemak was an immediate and significant improvement, even though my typing speed initially went way down to 10-15 wpm. There are a lot fewer contortions and you can type many words with home row keys only.

I persevered for 3 weeks, but even then I was struggling to get above 20 wpm. While feeling better than QWERTY, Colemak did not actually feel all that great. One particular annoyance was typing the “TH” combo. The T is just under the left index finger, which is just fine, but the H is reached by sliding the right index finger to the next key on the left. This is a very awkward manoeuvre in and of itself, but combining it with hitting the T key at speed is hard and just feels wrong. So every word containing a “th” becomes a little hiccup. I also found that in general the rhythm of the layout was an improvement only when compared to the low bar set by QWERTY.

I decided that perhaps I was barking up the wrong tree. Colemak might be easier to learn for QWERTY folk, but that actually worked against me: my beloved Dvorak has zero commonality with either layout. Keeping keys under the same hand as in QWERTY actually slowed my progress for that very reason. What I really wanted was a “new improved” Dvorak, not a better QWERTY, but I couldn’t find anything and wasn’t about to develop my own layout.

What I did find was a coherent criticism of Colemak that was insightful enough to clarify what I actually disliked about it, but hadn’t been able to put my finger on. The author of that criticism had also developed his own layout based on this analysis and that’s how I found Workman.

The Workman fully optimized keyboard layout

Apparently a lot of keyboard layout optimizers (yes, such things exist) consider all four fingers to have the same natural range of motion, mobility and strength. This explains why Colemak considers the H key to be in a prime position despite the fact that it is clearly much harder to reach sideways than upwards with your fingers. Colemak also does not consider the length of the fingers.

Workman gives each key a score based on how easy it is for an actual finger to hit it. I’m not certain that I would necessarily have chosen the exact same scores, but it’s clearly an improvement over Colemak and the most common combos are easier to type. Workman keeps the Z and X keys in the same spot, but moves the C and V keys one position to the right. With a sticker over the keys I find that I can live with that. There are also Mac implementations freely available, including “Programmer’s” versions, which I’ll probably be using as they are similar to the “Programmer’s Dvorak” that I use on my desktop machines.

I have used the Xmas holidays to practise my Workman layout and about a month in, I’m getting towards the usable stage and this is the first lengthy document that I have written in it. I’m optimistic about this remaining my laptop layout for good.

There are, however, a few things I’m not so keen on. The first is minor and concerns the “ch” bigram, which is fairly awkward to type. It’s not nearly as bad or common as the TH issue on Colemak or the “ls -l” plaguing Dvorak Unix users.. but still..

The other is a potential deal breaker and concerns the design decision to favour “single hand utilisation”. Workman’s designer, OJ Bucao, claims that it is easier, faster and more comfortable to type multiple letters with the same hand rather than by alternating hands.

This is the reverse of my own experience with Dvorak. When typing in Workman I’m constantly performing two, three or even four letter runs with one hand. For Bucao this is a good thing. He claims that after a while those patterns become ingrained and you end up typing them as a single action. He is certainly correct that it minimizes hand movement, as the unused hand can find its way back to the home row and take a breather. The most common bigrams and trigrams are also easy to type with very little reaching.

Still, the jury is out on that particular feature. I have noticed that I’ve started typing with those semi-automatic finger rolls, but I find it fatiguing and I don’t (yet?) like the rhythm much.. but it’s early days and generally I’m much happier with Workman than with either QWERTY or Colemak.

If you are interested in learning a new layout, I would recommend giving Workman a try over Colemak. Colemak and Dvorak both come pre-installed on macOS, but installing Workman is very easy. If you are a developer, the programmer’s version makes typing code much simpler.

Long-Term Review: The Kinesis Advantage 2 Ergonomic Keyboard

In the mid-1990s, while working as a full time researcher, writing up my PhD thesis and starting publicspace.net, my arms suddenly started tingling after a good day’s (and night’s) work. Shortly afterwards, my fingers and forearms would be on fire at the end of every day. I started worrying.

Eventually I couldn’t work full days any longer and even just typing a few words or using a mouse would cause pain and discomfort. I started seriously worrying that I had managed to hamstring myself before even making it into a “proper” job.

That’s how my obsession with all things ergonomic started.

A good 20 years later, I’m much healthier and have suffered no RSI related symptoms for at least 15 of those years.

Probably the two most effective things I did back in the mid-90s were buying an outlandishly weird ergonomic keyboard called the “Kinesis Ergo” and learning to touch type with the Dvorak keyboard layout.

The Kinesis Advantage 2 Keyboard

The Kinesis Ergo keyboard is now in its brand new “Advantage 2” generation, which is an opportunity for a long term review. It looks like something from an alternative (much geekier) universe, but is probably the single best piece of ergonomics I’ve ever bought.

Like all ergonomic keyboards, the Kinesis will do you absolutely no good if you don’t touch type.

Ergonomic keyboards enable you to type without pain and with greatly diminished effort, but you have to learn how to use them. Two-finger pecking at a split keyboard with your wrists fully bent, hammering your fingers into the keys with your keyboard resting on a desk that is 5 inches too high, obviously won’t work.

The point is that it is simply impossible to type on a traditional keyboard without some degree of discomfort, because you just can’t get your limbs into a pain-free position. With the Kinesis Advantage, you can.

A great keyboard, which the Advantage 2 certainly is, goes one step further: not only can you type without injuring yourself, but it also helps you forget about the keyboard, concentrate on what you are writing and makes it feel natural and fun.

Just like lesser “ergonomic” keyboards, such as Microsoft’s much-loved but ultimately very half-hearted attempts, the Advantage is “split”, meaning that each hand gets its own separate area and the two are physically separated.

This allows your wrists and shoulders to stay in a neutral, un-bent position and is instrumental in preventing carpal tunnel syndrome. CTS is caused by the tendons of your fingers rubbing against the gap between your wrist bones while typing. When your wrists are bent sideways or strongly upwards or downwards that gap narrows and.. ouch!

Also just like other ergonomic keyboards, the Advantage has a “tented” design. This means that both halves of the keyboard have a gently upwards slope starting with your little fingers and progressively rising as you move towards the index fingers. Again this allows for a more natural position of wrists and shoulders.

The Kinesis also uses mechanical key switches: “Cherry Browns”, for mechanical keyboard enthusiasts. There is a debate about whether mechanical key switches are truly superior to their scissor counterparts, but it is probably telling that even die-hard scissor switch aficionados only claim that they are “just as good”, while nobody claims scissor switches are better. I personally much prefer the mechanical kind.

This, however, is where the similarities between the Advantage and something like Microsoft’s Surface Ergonomic Keyboard or even Matias’ Ergo Pro stop.

Matrix Key Layout

Kinesis Advantage Matrix Key Layout

The Kinesis Advantage is one of only a handful of keyboards that don’t use the staggered key rows that originate in the requirements of the mechanical typewriter, but instead use a columnar (also known as matrix) layout. All this means is that the keys are arranged in straight columns, just like on a number pad.

The sheer stupidity of doing anything else does not hit you until you have used a matrix keyboard for a day or two and go back to a “stupid” keyboard. Who would do this to themselves? Simply arranging the keys in columns eliminates the awkward finger contortions that are such a fun part of touch typing. Yes, our fingers can move sideways, but they really don’t want to, especially when you want to hit something.

There are other matrix keyboards out there, all with their own fan base.

The Truly Ergonomic Keyboard

The Truly Ergonomic is a mechanical keyboard, but it is completely flat, with neither tenting nor enough of a split for my tall frame.

The TypeMatrix Keyboard

The TypeMatrix is a very similar affair, but with scissor switches.

The Latest ErgoDox Keyboard Iteration

The ErgoDox is an open source DIY keyboard that is mechanical, tented and fully separated. This is the only keyboard I mention here that I don’t own myself. I don’t like the fact that it is “straight” tented rather than Kinesis’ more organic shape, but I can imagine that it is pretty close to the Kinesis and is a real “split keyboard”.

The Maltron 3D Two-Handed Keyboard

The Maltron Two-Handed 3D keyboard is very close to the Kinesis Advantage in almost all respects and I used it for a few years before going back to the Kinesis. My major gripe is the build quality, which is more “bespoke custom job” than what you’d expect from a consumer product.

Kinesis has gone a step beyond simply adopting a matrix layout in the search for the perfect ergonomic fit. Your hands in fact rest in a completely natural “well” that takes into account the length of your fingers and their natural curvature. Moving your fingers up and down in a straight line always puts your fingertips straight on the keys with no reaching. The new Advantage 2 even has textured and molded home row keys that make it immediately obvious when your fingertips are dead center on their respective home row keys.

Over the years, I have tried to move away from the Kinesis design; mostly in order to have a cheaper and more mobile setup. I spent several agonizing months in 2014 trying to migrate to the Microsoft Surface Ergonomic keyboard after my second Maltron developed yet another dead key, but I could never get comfortable with it.

It took me a while to realize why my attempts to go back to a more standard keyboard were doomed. The real reason is what makes the Advantage so hugely superior to the TypeMatrix and the Truly Ergonomic: the thumb clusters and in-line cursor keys.

Behold the thumb clusters.

The thumb clusters are such an obvious improvement once you get used to them that it seems impossible that there are keyboards without this feature. The thumbs are the strongest and most mobile fingers, and yet on a traditional keyboard both thumbs share one miserable key.

Not so on the Kinesis, where each thumb gets its own cluster of keys. You press Space, Backspace and Delete with your thumbs. In fact the Space and Backspace keys are right under your thumbs when your hands are completely relaxed. Your thumbs also cover the Control, Option and Command keys, as well as the less important Home, End, Page Up & Page Down keys. The cursor keys are placed in a fourth row that does not exist on other keyboards.

What these design choices amount to is what makes typing on the Kinesis Advantage such a great experience: you never have to move your hands away from the home row.

In all other keyboard designs, some frequently used keys such as the Backspace, Delete, Enter or the cursor keys require you to move your hand, usually the right hand, away from its home row, feel for the key, press it and then awkwardly feel your way back onto the home row.

After well over a decade of continuous Maltron and Kinesis keyboard use in which I never had to do this, it absolutely drove me nuts on the Surface keyboard and I went back to the Advantage.

On the Kinesis, if you’ve mistyped something, your fingers stay where they are and you tap your left thumb to hit backspace. If you need to go back a few characters, bend your fingers until they rest on the cursor keys. Bend them back and you are on the home row again. Your hands themselves do not move.

Personally, I do not use the thumb to hold the Control, Option and Command keys but move my hand to reach the top of the cluster with my index finger; I’m not even sure whether this is how it was intended to be used, but it works really well and I’m back on my home row in no time.

On a traditional keyboard, the keys that need to be reached by bending your index fingers laterally (e.g. the G and H keys) are very awkward to press. The Advantage does not eliminate this awkwardness altogether, but sliding the finger sideways places it at the optimal angle to press, turning the keystroke into more of a poking motion which feels much more natural.

The Kinesis keyboard has the full range of function keys, but they are not much easier to reach than on any other keyboard. For almost two decades, the small function keys were rubber-domed atrocities that served their purpose but felt really cheap, especially when compared to the bank-breaking mechanical key switches used in the rest of the keyboard. In the Advantage 2 iteration these keys are now also mechanical, which is appreciated, but it does not genuinely make a world of difference.

The latest model makes a bunch of detailed improvements, but the basic design has been identical since the early 1990s. The on-board programmability, which has always been a selling point, is also much improved.

The only programmability feature that I have really used is the ability to switch the keyboard itself between QWERTY and Dvorak layouts. This allows you to take your keyboard anywhere and type in Dvorak whether your employer feels like installing that keyboard layout on your machine or not.

The Advantage 2 also lets you easily remap keys, define macros and much else besides. I haven’t had enough time with the latest iteration to play much with the new features.

My only gripe with the Advantage 2 is that it is not yet a fully split keyboard. That would be awesome, but I guess at roughly $350, Kinesis reckons that a hard price limit has been reached. I disagree.

The Dactyl Fully Split Keyboard

There is a clearly Advantage-inspired, fully split keyboard design available for 3D printing, called the Dactyl keyboard, and I wish Kinesis would take that final step, so that I could replace my 3 Advantage keyboards one more time 🙂

Think of the MacBook Pro 2016 as the pro version of the MacBook

Having owned both a MacBook Pro 15″ Retina and a new MacBook, it is crystal clear to me that the new MacBook Pro descends straight from the MacBook and is not (just) an updated version of last year’s MacBook Pro.

The MacBook was the most extreme Macintosh laptop since the introduction of the original MacBook Air; not the reasonably priced and still vastly popular one, but the amazingly expensive and very, very slow 2008 MacBook Air.

The MacBook is supremely opinionated. Something that Apple, for better and often for worse, is great at. Everything was sacrificed for thinness and weight: A single USB-C port that is also used for charging; a keyboard with almost zero key travel; a touchpad that does not move.

Sure the MacBook takes some getting used to. At first, the keyboard is awkward and the touch pad is a little “weird”. Things don’t run as quickly as you’re used to.. then you get used to it and discover the Zen factor: Hush. It’s completely quiet.

After a while, even as a confirmed mechanical keyboard fanatic, I started appreciating the crispness of the keyboard. After less than a year, I started hating the mushy keys on my 2012 MacBook Pro 15″ so much that I started praying for a MacBook Pro with a new style keyboard. The old moving MacBook Pro touchpad feels equally antiquated.

As a fan of wired mice, at first I carried around a USB 3 dock to plug my mouse into, but soon the mouse and the dock stayed in the bag. It’s the convenience, dummy. It was annoying having to buy a USB-C to Thunderbolt cable, but hey.. it’s hardly the end of the world.

From the perspective of somebody who has grown to appreciate the MacBook over the past year, the 2016 MacBook Pro looks very different.

The new MacBook Pro is a much faster machine than the MacBook, but keeps many of the attributes that made me fall in love with the latter. The keyboard allegedly retains the crisp feel of the MacBook but is somewhat less extreme. The trackpad is huge but also does not move. The 15″ version features no fewer than 4 ports, which support 4 external displays (or 2 at 5K, a laptop first) and are faster than the built-in SSD. Said SSD might well be the fastest ever to be put into a stock laptop.

I have always found it hard to develop on a laptop, but the portability of the MacBook invisibly changed my habits. The MacBook is underpowered for serious development and the screen is too small for comfort, especially if you are used to multi-screen development setups.. and yet, convenience wins out and today I’m doing most of my exploratory development on the tiny MacBook.

Sure, the 2016 MacBook Pro 15″ is not going to be as portable as the MacBook, but it’s going to be much more so than the old model. On paper, the weight and the bulk savings may not amount to much, but as so often with Apple products, they tend to be more than the sum of their parts.

Many people are upset about the specs. There are faster laptops, with more RAM and with higher resolution screens out there. I don’t know whether it matters.

Intel is the limiting factor. Gone are the days when every two years CPU speeds doubled. Today’s gains are much more modest. We are also already at a point where most current computer models are simply fast enough, even for professional use. Not that I don’t want the fastest CPU out there. In reality, however, even the most power hungry professionals can’t really tell the difference between a Skylake and a Kaby Lake CPU.

Designing the ultimate laptop is no longer a matter of simply putting all the latest and most powerful components into a chassis. With the possible exception of die hard gamers, nobody wants a two inch thick 17″ laptop that sounds like a leaf blower. That does not mean that I’m opposed to Apple making such a machine for those who long for it; but it’s not the machine that I would buy.

I, personally, am looking forward to taking delivery of my 15″ MacBook Pro in the coming weeks and I fully expect it to be a great machine. Shame it couldn’t be thinner and lighter and fan-less (yet).

Tools of the Trade: AppCode, a breath of fresh air from the Xcode monoculture.

If you are a Mac or iOS developer, for better or for worse there is no way around Xcode.

Xcode is free and full-featured, so why would you ever want to use anything else? This is the main reason why there are practically no other Mac OS X or iOS developer tools on the market today. There just isn’t enough room for third parties to make developing expensive developer tools economically viable.

The only other serious IDE for Mac OS X and iOS development is JetBrains’ AppCode and I’d recommend that every serious Apple developer own a copy. While Xcode has evolved into a powerful and mostly stable tool, Apple has a lot of blind spots and Xcode is in many areas (at least) 15 years behind the top of the crop. AppCode isn’t.

JetBrains is the powerhouse of Java development tools and they represent everything that Apple does not. Where Apple is closed, secretive and has a very paternalistic approach to its developer community, JetBrains is open, transparent, friendly and as cross-platform as it is possible to be.

The advantage for an Apple developer such as myself is that you get a peek at the world beyond Apple’s strictly enforced white room monoculture. Using AppCode is as much about growing as a developer as it is about efficiently developing software.

JetBrains offers IDEs that support nearly every language that is available and the more outrageously new and niche a language is, the more likely that JetBrains has a tool for it. This means that once you get used to the basic IDE concepts, you can take that expertise and use it for developing in other languages, on other platforms (Android, Windows, Web) and with other technology stacks.

I use WebStorm for my own website development, RubyMine for web app stuff and IntelliJ IDEA for learning functional programming in Scala. If I ever wanted to learn CoffeeScript, Dart or Haskell I know I’d be covered there too. On top of this, JetBrains’ plug-in technology makes adding support for the latest and greatest open source technologies a breeze and JetBrains are very good at keeping an eye open for exciting new technologies. There’s a good chance that the first you hear about a new technology is by looking at JetBrains’ product release notes.

The AppCode IDE itself is very much in the mold of other Java development environments. The IDE can do everything and more, but it is also very busy and a long way from the pared-down minimalistic Apple aesthetic. It’s a nerdy power tool more than a philosophical statement.

JetBrains is rightly famous for their language parsing and refactoring acumen, so their IDEs are chock full of “intelligent” features. Not the kind of “intelligent” that makes everything harder, but the actual intelligent kind.

Navigating in AppCode is much more powerful than in Xcode. The gutter contains a myriad of options that will take you from method implementations to declarations and vice versa. You can also click and hold on class definitions to jump to super- and sub-classes, get in-line help and auto-fixes for common problems. The as-you-type code analyzer finds potential problems and standard fixes, and the code reformatting options are powerful and easily accessible. The intelligence extends seamlessly into finding all the places a piece of code is actually used, rather than having to rely on text searches.

Best of all, however, AppCode can make changes to associated files without leaving the current file. The annoying roundtrip between implementation and header files that keeps interrupting your train of thought in Xcode can be wholly avoided. You write the implementation for a method and AppCode offers to declare said method in the header with a single click, without ever taking your eyes off the code you are busy writing.

Working in AppCode you constantly find yourself wondering why Apple can’t just do this. If it seems obvious, it’s in AppCode. Unfortunately this is rarely true for Xcode.

Refactoring is part and parcel of the AppCode experience and baked so far into the IDE that it becomes a nearly invisible part of your development. If you are used to refactoring in Xcode, you are likely to be nonplussed by AppCode’s refactoring support. Where Xcode makes a huge deal out of every refactoring: taking a snapshot, making you validate a thousand changes and more likely than not failing bang in the middle of the operation, AppCode just makes the changes with no fuss whatsoever. The first time I used the renaming refactoring in AppCode, I wondered what I was doing wrong. I typed the new name into the red highlighted area and nothing happened! How do you terminate the editing? In fact, AppCode had already done the project-wide refactoring. Why make a fuss about it? Why could it fail? Why beach-ball for a few seconds? Why indeed?

AppCode enables you to work in a completely different manner to Xcode. Say you are into Test-Driven Development. Write the test cases first. When you instantiate your target class in the test class, AppCode will tell you that the class does not yet exist. A single click solves the problem by creating the class for you. As you write your tests, you can one-click to add method declarations and empty implementations. When you’ve finished with your test cases, there’ll be .m and .h files with complete stub implementations all without you ever leaving the test case implementation file.

Another big difference with Xcode is that where Apple knows everything best and either offers no customization or forces you to comply with their guidelines, JetBrains puts you in charge. Almost every aspect of the IDE is fully customizable: you can define your own coding style, which will cause AppCode to use your specific style to create stubs. You can even decide to reformat your code automatically before checking it into source control. You can (obviously) choose your own source code management system, add CocoaPods support, edit and preview HTML, CSS, Compass, TypeScript and JavaScript files, or add your own selection of plug-ins. In short, JetBrains is for grown-ups who like making their own decisions.

Similarly, if you’ve ever felt the frustration of never being able to talk to anybody at Apple about Xcode, you will find the JetBrains support team a breath of fresh air. Something not working? Something not supported? Something you’d like to see added? Just drop them a line and an actual person will reply to you; better yet that person will be an approachable, open-minded fellow developer intent on helping you out. With JetBrains you’re the customer and you know best.

Seriously, just give it a shot. If only for a breath of fresh air.

The unbearable fragility of modern Mac OS X development

There, I’ve done it again: I shipped a broken A Better Finder Rename release despite doubling down on build system verification, code signing requirements validation, Gatekeeper acceptance checks, automation, quality assurance measures, etc.

Only in October, I had a similar issue. Luckily that time around it only took a few minutes to become aware of the problem and a few hours to ship a fix so very few users were affected. Right now I don’t know how many users were affected by the “botched” A Better Finder Rename 10.01 release.

This didn’t use to happen, despite the fact that I did not spend nearly as much time on release management to ensure that everything worked properly. Nor am I alone in this situation. Lots of big as well as small developers have recently shipped similarly compromised releases.

The situation on the Mac App Store is much, much worse. Nobody other than Apple knows how many Mac App Store customers were affected by the recent MAS certificate fiasco that had the distinction of making it all the way into the pages of Fortune magazine.

The truth is that Mac OS X development has become so very fragile.

The reasons for this are manifold and diverse, but boil down to: too much change, too little communication, too much complexity and, finally, too little change management and quality control at Apple.

The recent Mac App Store (MAS) fiasco that left many (1% of Mac App Store users? 100%? Nobody knows) users unable to use their apps purchased from the Mac App Store was down to Apple’s root certificate expiring. This was a planned event: certificates are used for digitally signing applications and they are only valid for a particular period of time, after which they need to be replaced with new certificates.

When the Mac App Store certificate expired, it was replaced with a new certificate but there were two problems. First, the now expired certificate was still cached by some of Apple’s servers: when Mac OS X opens an application it checks its signature, which in the end is guaranteed by Apple’s root certificate. Since this was no longer valid, Mac OS X refused to launch them and reported them as “broken”, leaving users and developers equally baffled. After far too long, Apple investigated the problem and emptied their caches which made the problem go away.

The second problem which was not solved by updating the caches, was due to Apple also replacing the certificate with a new, higher security version; of course without telling anybody. The new certificate could not be verified with the old version of OpenSSL that was used in the receipt checking code of many shipping apps.

When Apple created the Mac App Store, it provided a “receipt” that each application should check to see whether it has been properly bought on the Mac App Store. This is just a signed file that contains details about what was bought and when. Instead of doing the obvious thing, which would have been to provide developers with an API for checking the validity of the receipt against Apple’s own rules, they just published snippets of sample code so that each developer could “roll their own” verification code. Supposedly this was for added security (by not providing a single point of failure), but it seems more likely that they couldn’t be bothered to ship an API just for the Mac App Store.

This decision came back to haunt them, because most developers are not crypto experts and so had to rely on developer contributed code to check their app’s receipts. Once this worked properly, most developers wouldn’t dream of touching the code again.. which is how it came to pass that many, quite possibly a majority, of Mac App Store apps shipped with the same old receipt checking code in 2015 that they originally shipped with in 2010(?). This was fixed by Apple revoking the new style certificate and downgrading it to the old standard.

For once, I had been ahead of the curve and had recently updated all the receipt code in my applications (no small feat) and I have yet to hear from any customers who had problems.

Just before the Mac App Store fiasco, however, many non-MAS applications had also shipped with broken auto-update functionality.

Apple does not offer any auto-update facility for applications that are not on the Mac App Store, which led to Andy Matuschak’s “Sparkle” framework becoming the de-facto standard for adding an auto-update feature to Mac applications.

Driven by abuse of some HTTP communications in iOS apps, Apple decided that in iOS 9 it would by default opt all developers into using only (more secure) HTTPS connections within their applications. What is good for iOS 9 can’t be bad for Mac OS X 10.11 El Capitan, so Mac applications also got opted into this scheme.

Unfortunately, that broke Sparkle for applications whose “appcasts” are not served over HTTPS, such as mine. I have long resisted installing my own HTTPS certificates because I was worried about messing up the expiry periods, etc.. apparently just the way that Apple did with the Mac App Store certificates.
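For completeness: the usual stop-gap, which I am not endorsing because it weakens exactly the protection ATS is meant to provide, is to declare an App Transport Security exception in the app’s Info.plist. A hedged sketch using PlistBuddy, with placeholder paths:

    # Blunt, app-wide exception allowing non-HTTPS loads; a per-domain exception is preferable.
    # Edit the source Info.plist before the app is built and signed.
    /usr/libexec/PlistBuddy \
        -c "Add :NSAppTransportSecurity dict" \
        -c "Add :NSAppTransportSecurity:NSAllowsArbitraryLoads bool true" \
        "path/to/MyApp/Info.plist"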

Most developers will have been unaware of the change since Apple never announced it, but I had happened to see the WWDC conference videos that mentioned this in passing. Unfortunately, nothing “clicked” in my head when I heard this. My applications do not communicate with a server anywhere and I thus thought that this was not something I had to worry about. I had forgotten that Sparkle might use this internally.

Fortunately, I caught this at 6AM when I released A Better Finder Rename 10 final. I was just doing a normally completely redundant check through all the features of the program when I noticed that the new version failed when trying to check for updates. By 8AM, I had identified and fixed the problem so that very few people indeed could have been caught out by it. That was luck though.

The nefarious element here was that applications were opted in automatically and silently. Before 10.11 El Capitan was installed on your Mac, my applications updated just fine. Afterwards, they no longer did. Just because they were on El Capitan. Gee thanks!

Of course, this would not have happened if I hadn’t built A Better Finder Rename 10 with the Mac OS X 10.11 SDK (Software Development Kit) at the last moment.

It is somewhat crazy for a developer to change the SDK that s/he builds a brand-new version of their software against in the middle of the beta phase. Changing the SDK always introduces errors because the entire environment in which the code executes is changed. This may bring out bugs that were already present; things that should never have worked, but worked just because the API happened not to trigger the bug. It also introduces bugs that are just part of the new SDK and that you now have to work around. Changing SDKs makes existing programs fragile.

I’m very conservative when it comes to changing SDKs because I’m well aware of the risks. That’s why I’ve been building my code against older SDKs for the past 15 years. A Better Finder Rename 10 was built against the Mac OS X 10.7 SDK which is forwards-compatible with newer versions of Mac OS X.

The main reason for doing so is that I wanted to be certain that I didn’t accidentally break A Better Finder Rename on older systems, which brings us to the next problem with Mac OS X development.

Xcode lets you specify a “deployment target”, for instance 10.7, while building with a newer SDK. This is the recommended way of developing on Mac OS X and keeping backwards compatibility. Xcode will, however, happily let you use APIs that are not compatible with your deployment target and thereby ensure that your application will crash on anything other than the latest Mac OS X.

In fact, Xcode encourages you to use the latest features that are not backwards compatible and will rewrite your code for you if you let it, so that it will crash. It will give you “deprecation warnings” for any API usage that is not in the latest SDK and resolving those warnings is likely to break backwards compatibility as well. Of course, you won’t know this until you run it on the old Mac OS X version.

Now which developer can afford to keep testing rigs with 10.7, 10.8, 10.9 and 10.10? Never mind spending the time to re-test every change on each of those platforms?

Thus I happily built with the 10.7 SDK. Apple did not make this easy by not shipping the old SDKs with Xcode, but you could manually install them and they would work just fine.

Imagine my surprise after installing Xcode 7 and finding out that this no longer worked. The only workable solution was to build against the 10.11 SDK, jumping forward not one but four SDK versions. A bunch of code wouldn’t compile any longer because the libraries were gone. Luckily the receipt checking code was amongst those, so it got modernised just in time to avoid the Mac App Store receipt fiasco.

Nonetheless, now my entire code base had become fragile and largely un-tested between the last beta release and the final shipping product. Nightmare!

On top of that, was it still even 10.7 compatible? Or indeed 10.10 compatible? Just quickly running it on older systems wouldn’t provide more than a little additional confidence, since it’s impossible to go through every code path of a complex product.

After installing virtual machines to test on, I still couldn’t be 100% certain. The solution came in the form of Deploymate, a now truly essential developer tool that does what Xcode can’t: check that API usage is compatible with the deployment target.

I have since spent many weeks trying to ensure that I won’t run into the same problems again by adding (additional) automated verification processes to my build system. My build system now runs the built product through SDK compatibility checking courtesy of Deploymate, code signing validation and Gatekeeper verification on each build. I’m still working through deprecation warnings and the like and my code base will soon be bulletproofed, at least until the next forced changes arrive.
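The code signing and Gatekeeper parts of those checks can be scripted with the standard tools; a minimal sketch with placeholder paths (the Deploymate scan is a separate step):

    # Verify the signature strictly, including nested code
    codesign --verify --deep --strict --verbose=2 "build/Release/MyApp.app"
    # Ask Gatekeeper whether it would accept the app
    spctl --assess --type execute --verbose "build/Release/MyApp.app"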

You’d think that this was a long enough list of problems for one year, but this still does not account for Apple also changing the code signing rules (once again) earlier in the year (in a point update of 10.10 no less). This time it affected how resources and frameworks are signed, so applications that had been signed correctly for years suddenly became incorrectly signed and Mac OS X would refuse to launch them because they were “broken”.

All this points to the underlying issue with the current spate of fragility in Mac applications: Apple keeps changing the status quo and neither it nor developers have any chance of keeping up.

Apple’s own applications are full of bugs now. None more so than Xcode, which is both the linchpin of all Mac OS X, iOS, watchOS and tvOS development and no doubt Apple’s most fragile app offering. Xcode is in beta at least 6 months a year and never really stabilises in between. Each new version has new “improvements” to code signing, app store uploading, verification code, etc.. and each new version breaks existing code and introduces its very own new bugs and crashes. From one day to the next, you don’t know as a developer whether your code works or not. Existing code that worked fine on Friday evening no longer works on Monday morning. Worse, chances are that you are not hunting for your own bugs, but for those in your development tools, the operating system or Apple-supplied SDKs.

All this is driven by the one-release-a-year schedule that Apple has imposed on itself. This leaves all of Apple’s software in various stages of brokenness. When Apple’s own staff cannot deal with this constantly shifting environment, how are third party developers supposed to?

Case in point: Apple’s own apps are not all iOS 9 compatible yet. Many don’t support the iPad Pro’s new native resolution yet. Some have gained Apple Watch extensions, but most haven’t.

Reliability is a property of a system that is changed slowly and deliberately and whose constituent parts are themselves reliable. The Mac and all other Apple platforms are currently undergoing the worst dip in reliability since Mac OS X was introduced.

Apple is pushing out half-baked annual releases of all its software products, as well as apparently completely unmanaged changes to policies, external rules and cloud services at an ever more frenetic pace.

These could be written off as temporary “growing pains”, but the big question is: Do all these annual updates equate to real progress?

When I switch on my Mac today, I use it for much the same things that I used it for 10 years ago. A lot has changed. Cumulatively Mac OS X 10.11 El Capitan is somewhat better than 10.6 Snow Leopard.. and yet if you discount cosmetic changes and new hardware, nothing much has changed. Certainly nothing much has actually improved.

I can’t help thinking that if we had had 2 or possibly 3 Mac OS X updates instead of 5 over those last 5 years, we’d be in much better shape now. Apple and developers would have had time to deliver user benefits and rock solid reliability rather than just endlessly chasing their own tails.

The beauty of the Mac used to be that it just worked. I want to get back to that.

A Better Finder Rename 10 beta auto-update broken on 10.11 El Capitan

We are sorry to report that the auto-update feature on beta releases of A Better Finder Rename 10 is broken on Mac OS X 10.11 El Capitan and you will need to download the update to version 10 (out yesterday) directly from our website.

We noticed this early, at 6AM yesterday, while checking the A Better Finder Rename 10.00 release and shipped a fixed version at 9AM (after struggling through work traffic), both GMT+1.

We have over time evolved a build process that guarantees that we only ship high-quality product builds, but we were caught out this time by the rapid pace of change imposed by Apple’s frequent Mac OS X and Xcode updates.

In the past, Apple was quite good about letting developers upgrade their development environments at their own pace, which is important because Mac users do not expect to have to upgrade their Macs as soon as a new Mac OS X release drops. More recently, Apple has transferred a lot of its iOS practices to Mac OS X and has started really pushing developers to adopt new features quickly and to get rid of backwards compatibility quickly.

At first this took the form of gentle prodding, but over time it has become much more aggressive. Essentially, they are deliberately making it hard for developers not to drop support for older OS X versions.

We were caught out by this and still are. We had to install 10.11 on our development machines in order to test on El Capitan properly (the remote debugger has been discontinued for a while now), which led to an automatic update to Xcode 7.

For the past decade, we had built our products using the latest Xcode but against the oldest compatible SDK (in this case 10.7), because this ensures that the builds do not break backwards compatibility for customers on older OS X releases.
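In Xcode build-setting terms, that approach amounts to something along the lines of “SDKROOT=macosx10.7” combined with “MACOSX_DEPLOYMENT_TARGET=10.7” (the exact values depend on which SDKs your copy of Xcode still ships), rather than building against the newest SDK and relying on the deployment target alone.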

We were caught by the fact that Xcode 7 quietly drops the ability to work with older SDKs. Unfortunately, building against the 10.11 SDK opts the program into App Transport Security, a new rule that, for security reasons, only allows it to load content over HTTPS. This was mentioned at WWDC for iOS applications, but OS X hardly gets a mention in those talks. In any event, we did not think that this would affect A Better Finder Rename, as we have no server backend; but as it happens, it breaks Sparkle auto-updates, which power our product’s (and 99% of non-Mac App Store applications’) auto-update feature. Note that our auto-updates are securely signed even though they do not use https.
As a result, the auto-update feature works fine as long as you are not on 10.11, but no longer works once you are. We only found this out when we shipped the first update after users had started installing 10.11, resulting in a minor but real mess where the A Better Finder Rename beta cannot auto-update.
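For anyone hitting the same wall: the standard workaround is to declare an App Transport Security exception in the app’s Info.plist, either globally via “NSAllowsArbitraryLoads” or, more narrowly, for just the update feed’s domain. A minimal sketch of the narrow variant (the domain below is a placeholder, not our actual feed):

<key>NSAppTransportSecurity</key>
<dict>
    <key>NSExceptionDomains</key>
    <dict>
        <!-- placeholder domain; substitute your own update feed host -->
        <key>updates.example.com</key>
        <dict>
            <key>NSExceptionAllowsInsecureHTTPLoads</key>
            <true/>
        </dict>
    </dict>
</dict>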
Sorry for the inconvenience.

A Better Finder Rename 10 on the horizon

Hi,

Since the beginning of the year, I have been working flat out on version 10 of A Better Finder Rename and we are nearing the first beta release.

There are a few things that I want to get out there before the first beta ships and those are mostly to do with the Mac App Store and upgrades.

As many of you will be aware, the Mac App Store is not much loved by Mac OS X software developers, because it is very different from the “traditional” Mac indie software distribution that many of us feel is superior in very many ways.

Nonetheless most of us “old timers” have made our software available on the Mac App Store due largely to popular demand. Clearly the Mac App Store is better for some customers.

A few years ago, the Mac App Store started to demand that all applications be sandboxed and that was the beginning of the end for many professional productivity applications on the Mac App Store.

Sandboxing itself is a good idea. In a nutshell, it just means that applications cannot access your entire computer, but are restricted to a “safe” container with their own memory and disk space. Access to anything outside that “container” needs to be specifically allowed either by the user or by Apple during their review process.

Many categories of software (games, for instance) work very well in their sandbox, but most professional applications require fairly unfettered file system access, inter-application communication and/or internet access.

Tools such as BBEdit (an awesome text editor), TextExpander (an awesome snippet expander), Panic’s Coda (an awesome web development tool) and many others (many of them awesome) are leaving the Mac App Store because of these limitations.

A Better Finder Rename 9 is in the app store as “Better Rename 9” and we have managed to keep it non-sandboxed by only shipping “fixes” and no major upgrades for years.

By its very nature, a file renaming tool needs unfettered access to the file system. There’s no chance of Apple granting us an “entitlement” to do that during the review process. The reason is that this pretty much defeats the objective of being sandboxed in the first place.

In the idealized sandbox world, it is the user who implicitly grants permission to manipulate files by selecting them in an Open File… dialog or by just drag & dropping them onto the application or its icon. This works fine for our other file utilities, such as File Multi Tool and A Better Finder Attributes, but not for A Better Finder Rename.

The reason for this is simple: on a Unix system such as Mac OS X, the name of a file is not stored in the file itself but in the folder that contains it. Dragging & dropping a file only gives access to the file itself and not to its “parent folder”, so you can change everything about it except its name.

We would thus either need to ask you to give Better Rename 9 permission for the parent folder every time you want to rename something, or store that permission somewhere after the first time. Alternatively, we could ask you to give us permission for the entire disk.

This is not an elegant solution and Apple may or may not accept it. Having played around a bit with other programs that have similar problems, it seems that Apple would most likely allow this kind of “hack”, where the program brings up an Open File… dialog and says “Sorry, I want to access this file but I can’t, please select it for me!”. Yuck.
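For the technically curious, here is a rough sketch of what that “hack” amounts to in a sandboxed app: ask the user to bless the parent folder once via an Open File… dialog, then stash a security-scoped bookmark so you don’t have to ask again next time. This is only an illustration of the mechanism (the function name and defaults key are made up, and it assumes the usual sandbox and security-scoped bookmark entitlements), not our actual code:

import AppKit

// Ask the user to bless the folder that contains the file to be renamed,
// then remember that permission via a security-scoped bookmark.
func requestAccess(toParentOf fileURL: URL) -> URL? {
    let folderURL = fileURL.deletingLastPathComponent()

    let panel = NSOpenPanel()
    panel.message = "Sorry, please grant access to this folder so the file can be renamed."
    panel.directoryURL = folderURL
    panel.canChooseDirectories = true
    panel.canChooseFiles = false
    panel.allowsMultipleSelection = false
    panel.prompt = "Grant Access"

    guard panel.runModal() == .OK, let grantedURL = panel.url else { return nil }

    // Store an app-scoped security bookmark for later sessions.
    if let bookmark = try? grantedURL.bookmarkData(options: .withSecurityScope,
                                                   includingResourceValuesForKeys: nil,
                                                   relativeTo: nil) {
        UserDefaults.standard.set(bookmark, forKey: "bookmark:" + grantedURL.path)
    }

    // The actual rename must then be wrapped in
    // startAccessingSecurityScopedResource() / stopAccessingSecurityScopedResource().
    return grantedURL
}

Even in the best case, this still costs the user at least one extra dialog per folder, which is exactly the usability hit described above.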

We have most of the code necessary to do this and are ready to ship it, but it will undermine the usability of the tool, so we are not certain whether we will continue to support a Mac App Store version beyond version 9.

The next huge problem is how to implement paid upgrades. Nobody wants to pay for upgrades, but upgrade revenue is important for developers and customers alike. The economics of software development are currently bifurcated: you have the traditional developers such as Panic, the Omni Group, Bare Bones and Red Sweater, to name but a few, who diligently plug away at making their products ever more awesome.. then you have the newer App Store generation of authors who create an app, launch it, get a good payday (or more likely not), then see revenues collapse and move on to the next app.

Abandonware is okay for some categories of software. Who cares whether Flappy Bird gets updated for iOS 9? Professional software, however, is used for more than mere entertainment. Customers buy into professional software, learn how to use it and expect to be using it for many years to come. They expect the software to be supported, bugs to be fixed, and the product to keep working with the latest operating system version and to continuously evolve with their own growing needs.

A Better Finder Rename was first published in 1996 on System 7 running on PowerPC-compatible Macs and has constantly evolved since. Version 10 is the most awesome version yet and contains many improvements that would have been just as relevant in 1996 as they are today, as well as many that nobody could have predicted back then. At this point, it has probably broken through 100,000 hours of development and support time. A substantial amount of this time was paid for by upgrade fees.

Paid upgrades have another crucial advantage for long-term customers: while fire-and-forget developers optimize for immediate appeal, paid upgrades are almost always targeted squarely at the needs of long-term users. It’s a different mindset: the success of a new app depends on how many people buy it now, while the success of a paid upgrade depends on how many existing users are willing to pay for the improvements and the new features.

Paid upgrades are great for professional level software because they allow software developers to spend time addressing the needs of existing customers. That’s why it’s particularly troubling that Apple does not allow for any upgrade pricing on the Mac App Store.. and that’s why developers like me are not very happy about it.

Apple makes the lion’s share of its revenue on hardware. Software, for them, is something that makes people buy their hardware, so they can afford to give their software away for free to make you buy more hardware.

Indie software developers are only selling software and don’t get a cut of the hardware sales. In fact, if we sell through the Mac App Store, Apple gets a 30% cut of our revenue, and that cut is taken after sales tax has been deducted in most countries (though not in most parts of the US). For a 19.99 EUR sale in Germany, for instance, a developer only gets 11.76 EUR paid out; the missing 41% goes to Apple and the German VAT office. After the tax office and social security payments here in Luxembourg, there is less than 6 EUR left for me from any one sale of Better Rename 9.
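To make the arithmetic explicit (assuming Germany’s 19% VAT rate): 19.99 EUR gross ÷ 1.19 ≈ 16.80 EUR net of VAT, of which Apple keeps 30%, leaving roughly 11.76 EUR for the developer.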

Apple has quite dramatically changed software pricing, on mobile devices but also on the Mac. They started by offering iLife (iPhoto, iMovie, etc.) for $19.95, then added iWork (Pages, Numbers,…), again at bargain-basement prices. At those price points, just charging you another $19.95 every year is perfectly fine. In the end, all those products are now completely free and Apple makes all its money off the hardware.

Most importantly, Apple has never had to finance the development of those software titles through their actual purchase price. They produced these titles to sell $2,000 MacBooks and iMacs, not for the sake of the $19.95 upgrade pricing. Not surprisingly, none of those applications has seen real effort put into either maintaining backwards compatibility or expanding its feature set. They are entry-level applications because Apple has no real interest in driving their development forwards.

Unfortunately, neither charging the full price for each upgrade nor making upgrades free works for applications such as BBEdit, OmniFocus, Coda or indeed A Better Finder Rename. I don’t want to ask customers to pay another $19.95 for A Better Finder Rename 10, but I can’t afford to make it free either.

Many developers have tried to overcome this problem in a variety of creative but imperfect ways. The ball has been in Apple’s court for years, but it’s very clear that they don’t mean to ever pick it up.

I’m not ruling out bringing A Better Finder Rename 10 to the Mac App Store eventually, but in a first phase, A Better Finder Rename 10 will only be available from the publicspace.net website.

Our upgrade terms have always been quite generous: paid upgrades cost 50% of the initial purchase price and are fairly infrequent (every 2-5 years); you can get a forever upgrade, which costs 100% of the initial purchase price; and if you have only recently bought the product, you get a free upgrade.

For A Better Finder Rename 10 the upgrade terms are as follows:

  • if you have purchased A Better Finder Rename or Better Rename 9 after the 1st of January 2015, you get a free upgrade
  • if you own a forever upgrade, you get a free upgrade
  • otherwise, you have to purchase a discounted paid upgrade

As you may or may not be aware, anybody who has purchased Better Rename 9 from the Mac App Store can also run A Better Finder Rename 9 from our website for free. In 99% of all cases, A Better Finder Rename will detect that you have previously bought Better Rename 9 and unlock automatically.

If it does not unlock automatically, all you need to do is to download Better Rename 9 to your machine and run it once. After that you can delete it and A Better Finder Rename will still remember that it was there once.

Once released, A Better Finder Rename 10 should automatically unlock if you have purchased Better Rename 9 from the Mac App Store after the 1st of January 2015. So if you buy Better Rename 9 from the Mac App Store now, or even after A Better Finder Rename 10 is out, you are covered.  If you run into any problems, contact us at support@publicspace.net and we will sort everything out with you.

Likewise, if you own A Better Finder Rename 9 or Better Rename 9 but have bought it before the 1st of January 2015, you can buy the discounted upgrade to version 10 from the upgrade page. You can do so even before A Better Finder Rename 10 comes out.

After version 10 has been out for a while, we will reconsider whether we’ll submit Better Rename 10 to the Mac App Store complete with the crippling “please let me rename this” dialog or leave things as they are.

For us, the important thing is that no matter whether you buy on the Mac App Store or from us directly you will have access to the same versions and will not be penalized in any way.

Unfortunately, Apple does not tell us the identities of anybody who purchases our products on the Mac App Store, so we cannot contact existing customers to let them know about these arrangements.. so if anybody wants to post a comment there (developers can’t leave or reply to comments) saying “you can get a free upgrade from publicspace.net!”, you’re more than welcome.

MacBreakZ 5 web site redux

MacBreakZ is one of my earliest software projects and started in 1996, when I developed tendonitis in my forearms.

I heeded this wake-up call and learned everything I could about RSI recovery and prevention, and I have been RSI-free for close to twenty years now.

As a software developer, of course, I needed to write a program that would embody all of that know-how.

MacBreakZ has since gone through 5 major revisions and development is still ongoing.

Last month, I published the first Yosemite-only version and today the new website is ready.

In doing this, I also had to review my book list on RSI, only to find that little has changed in the past decade. Either RSI is no longer a problem (definitely not true) or there’s not much money in writing about it 🙂

Enjoy!