I love sharing lessons and insights from my work. Some of my talks have been recorded and are available below:
Why do traditional event programming approaches fail on multitouch devices? I give a quick history of the problems Apple encountered when implementing multitouch interfaces and an overview of how the gesture system resolved those issues.
This demo-driven talk takes the architectural practices introduced in other recent talks and applies them to a real-world project. I talk through iterated refactoring steps, outlining my principles along the way. The code is available on GitHub, where each refactoring step has its own commit.
In this talk I outline an architectural attack on complexity which leverages the semantics of value types to confine control flow, dependency graphs, and hidden handshakes. I use Swift to illustrate the design pattern, but the concept applies to other languages.
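The core idea can be sketched in a few lines of Swift. This is a minimal illustration of value semantics, not code from the talk; the type and its members are hypothetical:

```swift
// Modeling a piece of UI state as a value type. Because structs
// have value semantics, handing this state to a collaborator
// copies it -- the collaborator cannot mutate our copy through a
// hidden reference, so the dependency graph stays explicit.
struct ScrollState {
    var offset: Double = 0
    var isDecelerating: Bool = false

    // Mutation is local and explicit: callers see `mutating` in
    // the signature and must hold the value in a `var`.
    mutating func scroll(by delta: Double) {
        offset += delta
    }
}

var state = ScrollState()
state.scroll(by: 40)

// A copy is a fully independent value: no hidden handshake.
var snapshot = state
snapshot.scroll(by: 100)

print(state.offset)     // 40.0 -- unaffected by the copy's mutation
print(snapshot.offset)  // 140.0
```

Because the compiler enforces `mutating`, every site where the state can change is visible at the call site, which is what confines the control flow.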
This high-level talk explores several approaches to measuring and managing complexity. They're classic ideas, but I tried (with Colin Barrett) to make them approachable and concrete for iOS developers: managing information flow explicitly, defining clear responsibilities and boundaries, and using immutability and value types to reduce dependencies.
This talk covers how to make animations feel like a natural extension of user input. Josh Shaffer and I address how to transition from gesture control to an animation, from one animation to another (while the first is still in flight), and from an animation back to gesture control. We also introduce a design pattern for dealing with the "meta-state" that creeps in around animations and makes them difficult to interrupt.
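As a generic illustration of that flavor of problem (my own sketch, not the talk's actual pattern; all names are hypothetical), collapsing scattered boolean flags into a single enum makes the in-flight cases explicit and therefore interruptible:

```swift
// Instead of separate `isAnimating` / `isTracking` flags, which
// permit impossible combinations, one enum captures the meta-state.
enum TransitionState: Equatable {
    case idle
    case tracking(offset: Double)    // finger down, following it
    case animating(target: Double)   // an animation is in flight
}

struct Carousel {
    var state: TransitionState = .idle

    // A new touch interrupts whatever was happening; because the
    // state is a single value, there are no stale flags to reset.
    mutating func gestureBegan(at offset: Double) {
        state = .tracking(offset: offset)
    }

    mutating func gestureEnded(target: Double) {
        state = .animating(target: target)
    }
}

var carousel = Carousel()
carousel.gestureBegan(at: 10)
carousel.gestureEnded(target: 100)
carousel.gestureBegan(at: 95)   // cleanly interrupts the animation
```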
An engineering overview of iOS 7's new UI. The changes to the design are both aesthetic and deeply structural. Jason Beaver and I discuss how UIKit changed to accommodate the redesign, how app developers can adapt their software in the short and long term, and the philosophy of the new design.
On mobile devices, concurrency's primary benefit is not making a given computation faster by running it simultaneously on multiple cores. Instead, it can be leveraged so that apps always have their primary thread available to react to user input, even while they're processing data or rendering complex UI. This talk demonstrates how to keep iOS applications responsive even while they're busy.
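The basic shape of that technique, as a minimal Grand Central Dispatch sketch (my own illustration, not code from the talk; `processData` and its payload are made up):

```swift
import Dispatch

// Move expensive work to a background queue so the main thread
// stays free to handle input.
func processData(_ input: [Int], completion: @escaping (Int) -> Void) {
    DispatchQueue.global(qos: .userInitiated).async {
        // The expensive computation runs off the main thread.
        let result = input.reduce(0, +)
        // In a real app, hop back with DispatchQueue.main.async
        // before touching UI; here we call through directly.
        completion(result)
    }
}

// Synchronous harness so this sketch can run as a script; an app
// would simply return to the run loop instead of blocking.
let done = DispatchSemaphore(value: 0)
var total = 0
processData(Array(1...1000)) { result in
    total = result
    done.signal()
}
done.wait()
print(total)  // 500500
```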
A survey talk with Josh Shaffer. We cover how to control interactions among gesture recognizers and views, common pitfalls when designing custom recognizers, and some signal processing techniques for smoothing touch data over space and time.
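One of the simplest techniques in that smoothing family is a single-pole low-pass filter over successive samples. A one-dimensional sketch (my own illustration, not code from the talk):

```swift
// Each output blends the new sample with the previous output;
// smaller alpha means smoother but laggier tracking.
struct TouchSmoother {
    let alpha: Double          // smoothing factor in (0, 1]
    var last: Double? = nil    // previous filtered value

    mutating func smooth(_ sample: Double) -> Double {
        guard let previous = last else {
            last = sample      // first sample passes through
            return sample
        }
        let smoothed = alpha * sample + (1 - alpha) * previous
        last = smoothed
        return smoothed
    }
}

var smoother = TouchSmoother(alpha: 0.5)
let noisy: [Double] = [0, 10, 0, 10, 0]
let smoothed = noisy.map { smoother.smooth($0) }
print(smoothed)  // [0.0, 5.0, 2.5, 6.25, 3.125]
```

Tuning `alpha` trades jitter against lag; a real touch pipeline would filter x and y together, and could also smooth over space by weighting nearby samples.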
Users expect a smooth transition when they rotate an iOS device from landscape to portrait. Making that happen seems intimidating, but Josh Shaffer and I break it down into manageable tasks and demonstrate how to let UIKit do most of the heavy lifting. This talk includes a number of advanced graphics performance tips (applicable beyond just rotations) and some surprising animation sleights of hand.
iOS interfaces do not appear on the screen by magic: UIKit sits on top of an exposed rendering stack. An understanding of that stack will help enormously when building and optimizing user interfaces. This talk includes an overview of iOS's multiprocess rendering model, details of the interface between UIKit and Core Animation, and many high-performance graphics tips. With Josh Shaffer and Mathieu Martin.
Write little robots to test your OS X and iOS applications. I discuss both low-level testing (via unit tests) and high-level testing (via UI automation). I also introduce a general testing philosophy that I hope brings some sanity to the dogmas in this field.