Atelier Clockwork

Back to Writing

It’s Been a Busy Few Months.

Since my last post here, I’ve:

Now that my life is beginning to settle down from all of that, I'm hoping to get back into the routine of posting here on a regular basis once again.

Discipline

Good Code Comes from Good Habits.

Looking at the things I've learned over the last few years that have made me a better developer, I'm a bit surprised that the top of the list is simply developing the discipline to make sure that, alongside doing the fun thing, the building, I also put in the effort to do the less fun things that make the end product better.

That means (in no particular order):

None of these ideas are novel, but I find it useful to remember that being a good craftsman about my code helps me get things done. Learning new languages, features, and tools is important too, but without discipline none of them will make my output better.

Selling The Illusion

Put in More Work than Anyone Thinks Is Reasonable

My current project has two competing pull requests: one of them will be merged, and the other will be picked over for any useful code to salvage into the surviving branch, then deleted.

If I were under a time crunch, I wouldn't be doing things this way. It isn't the fastest path to a library that's good enough to solve the problem once. The thing is, I'm specifically writing a library that is hopefully going to see broad adoption, at the very least as a common dependency for in-house projects. This means the front-loaded suffering of exploring how two divergent patterns can work is worthwhile, as it will hopefully save rework and API churn down the line.

My hope is that this project leads to something worth releasing soon. I have high hopes for the second implementation: what I thought was going to be the uglier, harder-to-streamline API slimmed down nicely from my original mental model of it once I applied some of the lessons learned from my first approach.

Where the illusion comes in is that the API that ships is going to have a commit history that looks like it took about half the time and half the work of the real solution. Some of that may end up in documentation explaining why certain decisions were made, but a lot of it is going to be hidden in the structure of an API that looks like it's built from decisions that just worked out.

Harder, Better, Faster, Stronger

More than Ever Hour After Hour Work Is Never Over

As yet another calendar year marches to a close, I’m reflecting back on the year and how well I’m doing in various areas of my life.

In the area that I can take the least credit for, but am most proud of, at just a bit over two years old my son is healthy, happy, and continues to surprise me as he develops. While my wife does the vast majority of the work raising him, I'm very happy that I make it a priority to get home from work before bedtime almost every night, and that so far my son seems to genuinely enjoy spending time with me.

I’ve made a lot of progress professionally this year. I presented at a conference, kicked off a project from scratch, shipped several open source libraries, and wrote a lot of Swift. While I keep getting better, especially by having a lot of smart people to learn from and work with, I also know enough to know that there’s always going to be room to grow as a developer, even in a fairly well constrained space of iOS application development.

On a personal front, I haven't been quite as awesome this year. I've made some strides towards managing my stress, but even after adding some basic mindfulness practice into the mix, I feel like I've been holding steady rather than reducing my stress levels. The relationships front is also a mixed bag: the number of people I'm in touch with has dropped off dramatically, and I'm not quite sure how I feel about that.

Overall, I'm happy with what I accomplished this year. I probably could have done more, but not significantly more without starting to make sacrifices to the balance of the various parts of my life, and long term I don't think I'd be happy if I did that. The action items for next year are further stress-remediation efforts, as that's one of my largest blockers, and considering whether I want to make an effort on the social front.

Generic Dithering

Dithering Functions as Data

In order to implement all eight dithering functions that I was interested in, I split the dithering logic into DitherPattern objects and an extension on CGContext that applies the dithering function. I sacrificed a lot of potential speed optimizations by doing it this way, because I couldn't depend on processing tricks like only using modulo division when calculating the error, as not all of the techniques allow for that; but it let me get some code working very quickly. The result is code that is very slow without compiler optimizations, but acceptable with optimizations enabled.

The first piece of code to dig through is the CGContext extension:

public extension CGContext {

    public func dither(allowedColors: [RGBA], pattern: DitherPattern, using conversion: GrayscaleConversion) {
        guard let colors = data?.bindMemory(to: RGBA.self, capacity: width * height) else {
            fatalError()
        }
        let offsets = UnsafeMutablePointer<Int16>.allocate(capacity: width * height)
        offsets.initialize(repeating: 0, count: width * height)
        defer { offsets.deallocate() }
        let comparisons = allowedColors.map(AveragedColor.averager(using: conversion))
        let rowWidth = (bytesPerRow / (bitsPerPixel / 8))
        for y in 0..<height {
            let rowOffset = rowWidth * y
            let errorOffset = y * width
            for x in 0..<width {
                let position = x + rowOffset
                let average = conversion.averageFunction(colors[position])
                let averagedColor = AveragedColor(color: colors[position], average: average)
                let currentOffset = offsets[x + errorOffset]
                let results = comparisons.map(AveragedComparison.comparison(to: averagedColor, offsetBy: currentOffset))
                let sorted = results.sorted(by: AveragedComparison.sort)
                let error = sorted[0].difference
                let diffusion = pattern.diffused(error: Float(error), x: x, y: y, width: width, height: height)
                for point in diffusion {
                    let arrayPosition = point.xOff + (point.yOff * width)
                    let start = offsets[arrayPosition]
                    offsets[arrayPosition] = start &+ point.error
                }
                colors[position] = sorted[0].averagedColor.color
            }
        }
    }

}

This binds the image data from the CGContext to memory, then creates a block of memory to store all of the diffused offset data. Next, it converts the allowed-colors list into grayscale-averaged forms using the supplied grayscale conversion, to avoid recalculating those for every pixel in the image.
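The supporting types aren't shown in the post, so here's a minimal sketch of what they might look like. The names match the code above, but the fields and the conversion weights are my assumptions, not the actual implementation:

```swift
// A sketch of the supporting types referenced by dither(allowedColors:pattern:using:).
// The stored properties and the luminosity weights below are assumptions.
struct RGBA {
    let r, g, b, a: UInt8
}

struct GrayscaleConversion {
    // Maps a pixel to a single brightness value.
    let averageFunction: (RGBA) -> Float
}

struct AveragedColor {
    let color: RGBA
    let average: Float

    // Returns a closure that pairs a color with its precomputed grayscale
    // average, so the per-color conversion runs once instead of per pixel.
    static func averager(using conversion: GrayscaleConversion) -> (RGBA) -> AveragedColor {
        return { color in
            AveragedColor(color: color, average: conversion.averageFunction(color))
        }
    }
}

// Example: a standard luminosity-style conversion.
let luminosity = GrayscaleConversion(averageFunction: { pixel in
    0.299 * Float(pixel.r) + 0.587 * Float(pixel.g) + 0.114 * Float(pixel.b)
})
let white = AveragedColor.averager(using: luminosity)(RGBA(r: 255, g: 255, b: 255, a: 255))
// white.average is 255 to within floating-point rounding
```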

After this, there's an interesting and slightly maddening caveat when working with image data in memory: rows are padded out to the memory word size. If you don't calculate the width of a row properly and just iterate through the array, the dead zone at the end of each row in some images means the error diffuses to the wrong pixels.

After this, it's just brute-forcing our way through the image data: comparing the available colors, finding the closest, then calculating the error and passing it to the surrounding pixels. Figuring out how to diffuse the error is where the dithering-pattern enumeration comes into play. Each dithering function is just a different divisor and pattern, so I'm stripping the code down to just the Floyd–Steinberg pattern:

private struct Point<Value> {
    let xOff: Int, yOff: Int, error: Value
}

public enum DitherPattern {
    case floydStienberg, jarvisJudiceNink, stucki, atkinson,
         burkes, sierraThree, sierraTwo, sierraLite
}

private extension DitherPattern {
    var divisor: Float {
        let divisor: Float
        switch self {
        case .floydStienberg: divisor = 16
        // Other divisors go here
        }
        return divisor
    }

    var pattern: [Point<Float>] {
        let points: [Point<Float>]
        switch self {
        case .floydStienberg:
            points = [
                Point(xOff: 1, yOff: 0, error: 7),
                Point(xOff: -1, yOff: 1, error: 3),
                Point(xOff: 0, yOff: 1, error: 5),
                Point(xOff: 1, yOff: 1, error: 1),
            ]
        // Other patterns go here
        }

        return points
    }

    func diffused(error: Float, x: Int, y: Int, width: Int, height: Int) -> [Point<Int16>] {
        let errorPoint = error / divisor
        return pattern.compactMap { initialPoint -> Point<Int16>? in
            let finalX = x + initialPoint.xOff
            let finalY = y + initialPoint.yOff
            guard finalX >= 0, finalY >= 0, finalX < width, finalY < height else {
                return nil
            }
            return Point(xOff: finalX, yOff: finalY, error: Int16(errorPoint * initialPoint.error))
        }
    }
}

Each pattern includes the divisor that the initial error is divided by, then the pattern used to copy the error to surrounding pixels, which includes the offset to each target pixel and the multiplier applied to the divided error. The diffused function then generates an array of surrounding points to modify, with a bounds check to ensure that the pixels being diffused to are within the image.
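As a worked example of that arithmetic, with an illustrative quantization error of 64, Floyd–Steinberg splits the error across the four neighbors like this:

```swift
// Floyd–Steinberg: divisor 16, neighbor weights 7, 3, 5, 1.
// The error value of 64 is chosen purely for illustration.
let error: Float = 64
let divisor: Float = 16
let weights: [Float] = [7, 3, 5, 1]

// Each neighbor's share, truncated to Int16 as in the diffused function:
// 64 / 16 = 4 per weight unit, so the shares are 28, 12, 20, and 4.
let shares = weights.map { Int16((error / divisor) * $0) }
let redistributed = shares.reduce(0, +)    // the full 64 is passed along
```

Because the Floyd–Steinberg weights sum to the divisor, the whole error gets redistributed; as the next section notes, not every pattern works that way.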

Once I had one pattern working properly, implementing all of the others became a matter of data entry: copying each matrix into the Point format. The next goal with this code is either to see how much I can improve performance by working with the Accelerate framework, or to add some fancy bells and whistles before I perform optimizations that will probably make the code harder to wrangle.
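To give a sense of what that data entry looks like, here's a sketch of the Atkinson pattern in the same Point format. This is my transcription of the well-known Atkinson matrix, not the library's actual switch cases:

```swift
// A sketch of the Atkinson entry in the same Point shape used above;
// in the real code these values live in the DitherPattern switch cases.
struct Point<Value> {
    let xOff: Int, yOff: Int, error: Value
}

let atkinsonDivisor: Float = 8
let atkinsonPattern: [Point<Float>] = [
    Point(xOff: 1, yOff: 0, error: 1),
    Point(xOff: 2, yOff: 0, error: 1),
    Point(xOff: -1, yOff: 1, error: 1),
    Point(xOff: 0, yOff: 1, error: 1),
    Point(xOff: 1, yOff: 1, error: 1),
    Point(xOff: 0, yOff: 2, error: 1),
]

// Atkinson deliberately diffuses only 6/8 of the error, which lightens
// the output compared with patterns that redistribute all of it.
let diffusedFraction = atkinsonPattern.reduce(0) { $0 + $1.error } / atkinsonDivisor
```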