AtelierClockwork

First Takeaway: Awesome, and in Need of a Wrapper

I was working with an app that blurs and darkens images so they can be used as the background of a view without hurting readability. The filter itself worked, but in some cases it darkened and blurred already dark images into almost total blackness. Functionally that was fine, but there was discussion about how to do it better.
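
For concreteness, here's a rough sketch of that kind of blur-and-darken pass in Core Image, as I understand it; the app's actual implementation may differ, and the radius and brightness values here are made up:

```swift
import CoreImage

// One plausible blur-and-darken pass: a Gaussian blur followed by a
// brightness drop via CIColorControls. The 12.0 radius and -0.3
// brightness are placeholder values, not the app's real settings.
func blurAndDarken(_ input: CIImage) -> CIImage? {
    guard let blur = CIFilter(name: "CIGaussianBlur"),
          let darken = CIFilter(name: "CIColorControls") else { return nil }
    blur.setValue(input, forKey: kCIInputImageKey)
    blur.setValue(12.0, forKey: kCIInputRadiusKey)
    darken.setValue(blur.outputImage, forKey: kCIInputImageKey)
    darken.setValue(-0.3, forKey: kCIInputBrightnessKey)
    return darken.outputImage
}
```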

This got me thinking about alternate blend modes and getting clever with gradients and masking. I first thought about implementing the modes myself, but while looking into the best way to do that, I realized that all of the modes I wanted, and more, were [available as built-in filters](https://developer.apple.com/library/mac/documentation/GraphicsImaging/Reference/CoreImageFilterReference/#//apple_ref/doc/uid/TP30000136-SW71) in Core Image.
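
A blend-mode filter is just another CIFilter that takes a second image under `kCIInputBackgroundImageKey`. As a sketch, with `CIMultiplyBlendMode` standing in for whichever mode actually fits:

```swift
import CoreImage

// A minimal sketch of one of the built-in blend modes: the foreground
// image goes in as the usual input, the background image under
// kCIInputBackgroundImageKey. "CIMultiplyBlendMode" is a placeholder
// for whatever mode the effect ends up needing.
func blend(_ foreground: CIImage, over background: CIImage) -> CIImage? {
    guard let filter = CIFilter(name: "CIMultiplyBlendMode") else { return nil }
    filter.setValue(foreground, forKey: kCIInputImageKey)
    filter.setValue(background, forKey: kCIInputBackgroundImageKey)
    return filter.outputImage
}
```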

After a bit of playing around, I put together a proof-of-concept playground accomplishing a basic variant of the effect I want to build. It went smoothly aside from a handful of silly mistakes on my part, mostly involving attempting to check my work at points in the stack where the image had an infinite canvas, or oddities in using Swift with some of the Objective-C bridges to C.
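
The infinite-canvas gotcha is worth spelling out: some filter outputs have an infinite extent, and trying to render or inspect one directly fails. Cropping back to a finite rect fixes it. A small guard along these lines (the helper name is mine, not part of Core Image):

```swift
import CoreImage

// Some filters produce a CIImage with an infinite extent; crop back to
// a finite rect (here, the original image's bounds) before rendering.
func renderable(_ image: CIImage, within bounds: CGRect) -> CIImage {
    return image.extent.isInfinite ? image.cropped(to: bounds) : image
}
```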

It’s not fast, and I plan to build a real app and test on a device with GPU acceleration to figure out just how much of the slowness is playground / simulator / debug-mode lag. Depending on the results, I may also do things like write a fast blur shader.
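
When I do move to a device, the rendering path will go through a GPU-backed CIContext. Roughly something like this sketch (not measured code):

```swift
import CoreImage
import Metal

// A sketch of the device-side rendering path: a Metal-backed CIContext,
// created once and reused (contexts are expensive to build), with a
// CGImage pulled out at the end.
let context: CIContext = {
    if let device = MTLCreateSystemDefaultDevice() {
        return CIContext(mtlDevice: device)
    }
    return CIContext() // falls back to the default renderer
}()

func render(_ image: CIImage) -> CGImage? {
    return context.createCGImage(image, from: image.extent)
}
```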

The bigger takeaway is that Core Image has a lot of potential for useful effects in an app, but the API is one of the strangest I’ve worked with, and a handful of wrapper classes could take a lot of the strangeness out of using it. Just creating convenience methods that return initialized CIFilter objects and take real arguments, rather than a dictionary of keys and values, would make working with it far less imposing.
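
Something in the direction of this sketch is what I have in mind; the function name is made up, not an existing API:

```swift
import CoreImage

// A sketch of the kind of wrapper described above: typed arguments in,
// an initialized CIFilter back. gaussianBlurFilter(input:radius:) is a
// made-up convenience, not part of Core Image.
func gaussianBlurFilter(input: CIImage, radius: Double) -> CIFilter? {
    guard let filter = CIFilter(name: "CIGaussianBlur") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(radius, forKey: kCIInputRadiusKey)
    return filter
}
```

The call site then reads `gaussianBlurFilter(input: image, radius: 10)?.outputImage` instead of a filter name string plus a dictionary of magic keys.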