A Best-in-Class iOS App: Free Preview

The Indie Dev Diaries
Jun 30th, 2021

Since my book prerelease launch, I’ve received a similar request several times - is there a sample? Can I preview it? What’s it like?

All fair questions, so today I thought I’d include a sample chapter right here in a blog post. So without further ado, here’s one chapter from the Accessibility section, “The Rotor Control”, which gives you an idea of what the book is like:

The Rotor Control

At its very core, the rotor control helps make navigation quicker. Consider the user interface below:

[Image: Rotor Control example]

Visually, we get the benefit of context fairly quickly. We can pick out a few main headings, reason about content sections and generally get a feel for how we want to navigate the view. For example, we may not be interested in the top “Next Five Days” section since we intend to revisit the “Last Five Days” section right away. We can do that because we can see it. And when we can see it, we reason about where we want to go. In this case, we’ve seen that there are two main categories here, and based off of that - we choose to “navigate” to the second one.

The Rotor Control can give VoiceOver users that same affordance. One of its many benefits is that it can tell VoiceOver to only navigate by, or to, certain elements (such as headers). In our example above, that would allow VoiceOver users to navigate with the same efficiency as users who aren’t visually impaired might.

[Image: Rotor Control example]

The Rotor Control has several ways to help with navigation. In fact, its capabilities shift with the context. There are options to change the speaking rate of VoiceOver, move to only misspelled words in text, change input methods and more.

So, how do developers fit into the Rotor Control? Primarily, two ways.

First, we can create our own rotors to hand off to the system’s Rotor Control to make custom categorical navigation possible for VoiceOver users. This is what we focus on in this chapter. If you find yourself in a situation where you’ve got an interface that would make sense to navigate to categorically, and the system’s default rotors don’t cover it - then you’ve found a great opportunity to supply your own custom rotor to fill that gap.

Secondly, we can make sure we’re using the correct accessibilityTraits in our apps to make sure the system provided rotor controls behave as users expect. If we’ve built a custom header-like element but we haven’t indicated to the system that it is a header-like element, then we’re essentially taking away functionality from VoiceOver users.
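For instance, marking a custom title view as a header is a one-line change. A minimal sketch (the label and its text are hypothetical):

```swift
// A hypothetical custom section title. Adding the .header trait
// lets the system's "Headings" rotor pick it up automatically.
let sectionTitleLabel = UILabel()
sectionTitleLabel.text = "Last Five Days"
sectionTitleLabel.accessibilityTraits.insert(.header)
```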

How it Works

Any NSObject has a custom rotors property we can assign to:

open var accessibilityCustomRotors: [UIAccessibilityCustomRotor]?

When we assign to it, those rotors become available to VoiceOver. Each rotor needs to know a few things:

  1. The name or type of the rotor
  2. The next element that it should navigate to

To vend that information, you’ll be dealing with three classes and one type aliased closure:

  1. UIAccessibilityCustomRotor
  2. UIAccessibilityCustomRotorItemResult
  3. UIAccessibilityCustomRotorSearchPredicate
  4. public typealias Search = (UIAccessibilityCustomRotorSearchPredicate) -> UIAccessibilityCustomRotorItemResult?

The rotor houses all of the information. The result is returned by us to let the active rotor know which element to go to next. The search predicate exposes which element is focused and which direction the user is navigating (in Rotor Control terms, either up (.previous) or down (.next)). Finally, the search closure gives you the last active predicate while returning the next rotor item.

Let’s look at an end-to-end example. Consider a row that has four square views in it, and a view controller showing three of these rows. Each row has a different color, each square within the row has a .button accessibility trait, and the user can swipe through each one.

[Image: Rotor Control example]

Without doing anything, if VoiceOver users want to see the colors in the last row, they’d have to swipe towards it several times to get there.

With a custom rotor, we could simplify things in two ways:

  1. Provide a custom rotor that toggles between the rows themselves (i.e. flick up and down to switch rows), or
  2. Provide three custom rotors, one for each color, where flicking up and down navigates to each square in the row.

Remember, when a rotor is active - the primary navigation gesture is swiping up or down to select items within that rotor’s category. Users can, and commonly do, still swipe left and right to navigate through the view’s hierarchy. A rotor is used in tandem with typical VoiceOver navigation.

Sticking with example two from above, here’s what an implementation might look like:

// Xcode -> RotorControlFig1ViewController.swift
private func colorRowRotor(forColor color: String, stack: UIStackView) -> UIAccessibilityCustomRotor {
    return UIAccessibilityCustomRotor(name: color) { searchPredicate in
        // Ensure we've got a square that's focused
        guard let currentFocusedSquare = searchPredicate.currentItem.targetElement as? UIView else {
            return nil
        }

        // Find the index of the square in the current stack view
        let indexOfCurrentSquare = stack.arrangedSubviews.firstIndex(of: currentFocusedSquare)
        let nextIndex: Int

        // Did the user swipe up, or down?
        switch searchPredicate.searchDirection {
        case .next:
            nextIndex = (indexOfCurrentSquare ?? 1) - 1
        case .previous:
            nextIndex = (indexOfCurrentSquare ?? -1) + 1
        @unknown default:
            nextIndex = 0
        }

        // Ensure selecting the next square won't crash. Returning
        // nil here signals to VoiceOver that we've reached either
        // the beginning or the end of the elements.
        guard 0..<stack.arrangedSubviews.count ~= nextIndex else {
            return nil
        }

        // VoiceOver will focus next based off of this result
        let result = UIAccessibilityCustomRotorItemResult(targetElement: stack.arrangedSubviews[nextIndex],
                                                          targetRange: nil)
        return result
    }
}

Using this custom rotor function to create three rotors, the user could toggle to either “Red”, “Blue” or “Green” and the rotor would focus to that particular row with the next swipe up or down. Sequential swipes would then navigate through the squares in the row itself.
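Wiring those three rotors up might look like this sketch (the stack view names here are assumptions):

```swift
// Hypothetical stack views, one per colored row in the interface
view.accessibilityCustomRotors = [
    colorRowRotor(forColor: "Red", stack: redRowStack),
    colorRowRotor(forColor: "Blue", stack: blueRowStack),
    colorRowRotor(forColor: "Green", stack: greenRowStack)
]
```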

If we had implemented a single rotor instead of three (one for each row), we could have one single rotor called something like “Color Rows”, where each swipe up or down would take you to the next row and swipes left and right would navigate within them. It’s up to you to figure out which ways to implement these custom rotors - but try to create them in such a way that they navigate how users who aren’t visually impaired would scan and use your user interface.

When creating the implementation for a custom rotor, you’re essentially responsible for:

  1. Tracking the user’s search direction
  2. Returning the next item that belongs to the rotor based off of that

Another common way VoiceOver users rely on rotors is for long form text. Consider release notes for an app. If you or I were to implement a custom view, regardless of whether we used SwiftUI or UIKit, we might have one or more text controls listing everything out.

Visually, we’d likely make each release more distinct from the rest of the text. That means when folks view it, they are likely scanning the interface version by version.

[Image: Rotor Control example]

Since VoiceOver users don’t get that by default, one way to solve this would be with a custom rotor. Thankfully, the rotor control has an initializer specifically for text ranges:

// Xcode -> RotorControlFig2ViewController.swift
private func versionReleaseRotor() -> UIAccessibilityCustomRotor {
    return UIAccessibilityCustomRotor(name: "Releases") { [unowned self] searchPredicate in
        guard let currentTextView = searchPredicate.currentItem.targetElement as? UITextView else {
            return nil
        }

        var nextTextView: UITextView?
        let swipedNext = searchPredicate.searchDirection == .next

        if currentTextView == firstReleaseTextView {
            nextTextView = swipedNext ? secondReleaseTextView : nil
        } else if currentTextView == secondReleaseTextView {
            nextTextView = swipedNext ? thirdReleaseTextView : firstReleaseTextView
        } else {
            nextTextView = swipedNext ? nil : secondReleaseTextView
        }

        guard let textView = nextTextView else { return nil }

        let versionTextRange = versionTextPosition(in: textView)
        return UIAccessibilityCustomRotorItemResult(targetElement: textView,
                                                    targetRange: versionTextRange)
    }
}

Notice that the logic and flow is extremely similar, but now we’re dealing with where in the text the rotor should go, along with the text control that contains it, instead of a simple object. This implementation requires a bit more tact than the one above, so if you can reconfigure your view setup to support the previous way of supplying a rotor control - by all means, do so.

However, I’d invite you not to be intimidated by this approach. Apple supplied it for a reason, and it’s built specifically for text-based navigation. It’s mostly a matter of translating a range of text into a UITextPosition object, so be sure to comb through the sample code to get a feel for it.
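As a reference point, translating an NSRange of matched text into the UITextRange a rotor item result expects could be sketched like this (the function name is my own; the position and range APIs come from UITextInput):

```swift
// A minimal sketch: convert an NSRange within a text view's text
// into the UITextRange a UIAccessibilityCustomRotorItemResult expects.
func textRange(for range: NSRange, in textView: UITextView) -> UITextRange? {
    guard let start = textView.position(from: textView.beginningOfDocument,
                                        offset: range.location),
          let end = textView.position(from: start, offset: range.length) else {
        return nil
    }
    return textView.textRange(from: start, to: end)
}
```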


Know Where to Assign Rotors

When you assign to accessibilityCustomRotors - make sure you do it at the right place. Any UIView can have these custom rotors, so when you assign some to any particular view, those are activated and used when that particular view is in focus.
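For example, attaching a rotor to the view controller’s root view makes it available whenever focus lands anywhere inside that view’s hierarchy. A sketch, reusing the releases rotor from earlier:

```swift
override func viewDidLoad() {
    super.viewDidLoad()
    // Available while any element within this view is focused
    view.accessibilityCustomRotors = [versionReleaseRotor()]
}
```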

If it makes sense to aggregate several rotors into one view’s rotor array, you certainly can:

self.view.accessibilityCustomRotors = [view1.accessibilityCustomRotors,
                                       view2.accessibilityCustomRotors,
                                       view3.accessibilityCustomRotors].flatMap { $0 ?? [] }

Using the Correct Traits and System Types

The rotor, in a sense, identifies where to go next categorically. For example, in most interfaces when the rotor is activated you’ll likely see “Heading” as an option. As I pointed out above in “How it Works”, that means you need heading level elements to have that accessibility trait so it’ll be exposed to the rotor control.

Conceptually, thinking in terms of header level elements is quite trivial. But consider all of the other types of rotors available:

public enum SystemRotorType : Int {
    case none = 0
    case link = 1
    case visitedLink = 2
    case heading = 3
    case headingLevel1 = 4
    case headingLevel2 = 5
    case headingLevel3 = 6
    case headingLevel4 = 7
    case headingLevel5 = 8
    case headingLevel6 = 9
    case boldText = 10
    case italicText = 11
    case underlineText = 12
    case misspelledWord = 13
    case image = 14
    case textField = 15
    case table = 16
    case list = 17
    case landmark = 18
}

On the other hand, look at that list and see if there are any elements you need to supply to the system that would make sense to show but aren’t exposed as a rotor by default. Consider the interface below:

[Image: Rotor Control example]

By default, there is no “Image” rotor that’s vended by the system, but using the custom rotor initializer that takes in a SystemRotorType, we can create one:

// Xcode -> RotorControlFig3ViewController.swift
private func imageRotor() -> UIAccessibilityCustomRotor {
    return UIAccessibilityCustomRotor(systemType: .image) { [unowned self] predicate in
        guard let currentImage = predicate.currentItem.targetElement as? UIImageView else { return nil }

        let nextIndex: Int
        let currentIndex = self.stackView.arrangedSubviews.firstIndex(of: currentImage)

        switch predicate.searchDirection {
        case .next:
            nextIndex = (currentIndex ?? 1) - 1
        case .previous:
            nextIndex = (currentIndex ?? -1) + 1
        @unknown default:
            nextIndex = 0
        }

        guard 0..<self.stackView.arrangedSubviews.count ~= nextIndex else {
            return nil
        }

        return UIAccessibilityCustomRotorItemResult(targetElement: self.stackView.arrangedSubviews[nextIndex],
                                                    targetRange: nil)
    }
}

And now, activating the rotor control will show an “Images” option that will cycle through just the images within the interface. Notice that the initializer for the custom rotor takes in the SystemRotorType instead of a string, and we passed in .image.

Returning Results

Since we leverage UIAccessibilityCustomRotorItemResult to return the next item for a rotor to select, it helps to know all of the ways you can package them up for the system. There are really only two simple responsibilities to remember:

  1. You’ll always return an item, or put differently - an object that the accessibility engine can select. If your logic dictates that you don’t have one, then you’d return nil from the rotor and not deliver a UIAccessibilityCustomRotorItemResult instance. Recall that the block you use to build a custom rotor asks you to return a nullable instance of that class - so indicating that you don’t have one is fine and in many cases the right call. It indicates to VoiceOver users that they’ve reached some sort of beginning or end.
  2. Once you’ve got an object, you can also return a text range if you’re dealing with text. That’s really all VoiceOver needs from your custom rotor to navigate. On the other hand, you’ll also receive one of these objects from the search predicate (more on that directly below) when constructing custom rotors. This proves useful as you’ll be able to inspect the last focused item or text range to help you vend the next accessible item that the rotor should navigate to.

Search Predicates

Leveraging the search predicate appears intimidating at first, but I’ve found it helps to rename it in your head to something like “Previous Rotor Item”, since that’s what it often represents. Hearing the word predicate may have you draw comparisons to NSPredicate, which isn’t accurate in this case. This is simply an object with some useful information to help you decide what to do next.

You’ll typically use two critical pieces of information from the predicate:

  1. The last item that was focused by looking at predicate.currentItem.targetElement
  2. The direction the user swiped to reason if they want the next or previous item:
    switch predicate.searchDirection {
    case .next:
        break // User swiped for next item
    case .previous:
        break // User swiped for previous item
    @unknown default:
        break
    }

Custom Attributed String Keys

Since rotor results can be used in tandem with one or more text views, you’ll be dealing with ranges of matched text quite often. Working with a text range is tricky enough, but you can make your life a bit easier in those situations by extending the attributed string API’s key type:

extension NSAttributedString.Key {
    static let versionHeader = NSAttributedString.Key("versionHeader")
}

Why do this? You can tack that key onto your attributed text to later find its range in a trivial fashion when you’re creating custom rotors dealing with text:

// Note the Version header attribute added last
let attributes: [NSAttributedString.Key: Any] = [.font: UIFont.systemFont(ofSize: 24, weight: .heavy),
                                                 .foregroundColor: UIColor.label,
                                                 .versionHeader: NSNumber(booleanLiteral: true)]

attributedText.addAttributes(attributes, range: rangeOfVersion(in: textView))

Then, when you’re looking for that text to translate into a `UITextRange` instance, you can look for the specific attribute instead of matching raw text:

// Search the text view by our custom attribute
textView.attributedText.enumerateAttribute(.versionHeader,
                                           in: NSMakeRange(0, textView.text.count),
                                           options: []) { valueAttribute, matchedRange, stop in
    guard valueAttribute != nil else { return }
    // Use matchedRange to get a UITextPosition from the text view
    // Then stop iterating
    stop.pointee = true
}

Avoiding Dead Rotors

If you find yourself making a custom rotor and assigning it to an object’s custom rotors property, but it doesn’t show up, it’s likely because the item you’re returning isn’t an accessible element by default.

If this happens, be sure to check that the object has isAccessibilityElement set to true and that its accessibility traits lend themselves to navigational purposes. For example, .staticText isn’t a navigational trait so it wouldn’t do anything for a rotor. Always remember - the rotor is there to make navigation snappy. As such, it stands to reason that the items we vend to it help accomplish that goal.
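As a quick sketch, an element destined for a rotor might be configured like this (the view and its label are hypothetical):

```swift
// A hypothetical custom header view a rotor should land on
let versionHeaderView = UIView()
versionHeaderView.isAccessibilityElement = true
versionHeaderView.accessibilityLabel = "Version 2.0"
versionHeaderView.accessibilityTraits = .header // Navigational, unlike .staticText
```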

Three Key Takeaways

  1. The Rotor Control helps VoiceOver users navigate their device efficiently.
  2. We can extend the system rotor controls and provide our own.
  3. Be sure to use the correct accessibility traits to ensure your existing interface works great with the system-provided rotors.

Final Thoughts

So there you go, I hope you dig it!

I don’t like to be all “sales-y” on here, but I figured this was the quickest way to get a snip of the book out there. If you’re not familiar with the book, it’ll cover five main topics which I believe make up a great app. Those are accessibility, design, user experience, iOS APIs and toolbelt expansion (the last being a bonus section for beta buyers). You also get private Discord access to ask me questions and hang with the community, an Xcode demo project, lifetime updates and a 20% discount.

If that sounds groovy, you can check things out at its website here. Thanks for coming to my infomercial!

Until next time ✌️

iOS 15: Notable UIKit Additions

Jun 7th, 2021

Well, if you thought UIKit was going to show any signs of slowing down, Apple has some news for you - it’s not. Though SwiftUI continues to be a focus, as it should be, our venerable UI framework is better than it ever has been, replete with several meaningful changes. As is my annual tradition, let’s peek at some of my favorites.

If you want to catch up on this series first, you can view the iOS 11, iOS 12, iOS 13, and iOS 14 versions of this article.

Sheet Presentation

There are eventualities in life we all accept - death and taxes, for example. In iOS, it was: when will Apple finally give us an API for a bottom sheet controller?

Lo, it’s here! And beautiful.

Meet UISheetPresentationController.

Setup is as easy as configuring the presentation controller with a few options:

extension ViewController: UIViewControllerTransitioningDelegate {
    func presentationController(forPresented presented: UIViewController, presenting: UIViewController?, source: UIViewController) -> UIPresentationController? {
        let pc = UISheetPresentationController(presentedViewController: presented, presenting: presenting)
        pc.detents = [.medium()]
        pc.prefersGrabberVisible = true
        return pc
    }
}

Or, you can leverage the new adaptive sheet presentation controller to use it as a popover when needed on iPadOS:

// The UISheetPresentationController instance this popover will adapt to in compact size classes. Access this instance to customize or adjust the adaptive sheet.
@property (nonatomic, readonly, strong) UISheetPresentationController *adaptiveSheetPresentationController API_AVAILABLE(ios(15.0)) API_UNAVAILABLE(tvos, watchos);

Which, in my opinion, gives you a bit more flexibility than leveraging the transitioning delegate route. It also sets you up for iPadOS presentations:

let test = DemoVC()
test.modalPresentationStyle = .popover

if let pop = test.popoverPresentationController {
    let sheet = pop.adaptiveSheetPresentationController
    sheet.detents = [.medium()]
    sheet.prefersGrabberVisible = true
}

present(test, animated: true)

The system uses a concept called “detents”, which only has two values right now, to determine its size:

// A system detent for a sheet that is approximately half the height of the screen, and is inactive in compact height.
+ (instancetype)mediumDetent;

// A system detent for a sheet at full height.
+ (instancetype)largeDetent;

They control whether you want the sheet to present à la Maps style, half showing, or go into the normal sheet presentation we’ve had since iOS 13. There are several configurables here, though, more than I expected. For example, you can have the sheet sized according to the preferred content size:

let test = DemoVC()
test.modalPresentationStyle = .popover
test.preferredContentSize = CGSize(width: 200, height: 200)

if let pop = test.popoverPresentationController {
    let sheet = pop.adaptiveSheetPresentationController
    pop.sourceView = self.button
    sheet.detents = [.medium()]
    sheet.prefersGrabberVisible = true
    sheet.widthFollowsPreferredContentSizeWhenEdgeAttached = true
}

present(test, animated: true)

Which gives us #TinySheet:

[Image: A popover]

You can also tweak the corner radius, whether or not to attach to the container view’s edges and more. Overall, this is just what the community needed, and wanted:

[Image: A demo of sheet presentations]


CLLocationButton

This nifty new control, found in the new CoreLocationUI framework, helps you solve one problem quickly: getting one-shot access to the user’s location. If you’ve got a feature of your app where it makes sense to get a location temporarily, you’ll want to use this1:

let btnFrame = CGRect(x: 100, y: 100, width: 215, height: 54)
let locationBtn = CLLocationButton(frame: btnFrame)
locationBtn.icon = CLLocationButtonIcon.arrowFilled // Or outline
locationBtn.label = CLLocationButtonLabel.currentLocation
locationBtn.cornerRadius = (btnFrame.size.height/2).rounded()

With that, we’ve got a pretty location button:

[Image: A demo of CLLocationButton]

When this is tapped, the system prompts the user about granting temporary access to their location. If they agree, your app gets CLAuthorizationStatus.authorizedWhenInUse permissions which expire when your app is no longer in use. From there, you can kick off all your location needs using the same code you always have.
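That “same code you always have” could be as small as a delegate-based sketch like this (the wrapper type is my own invention):

```swift
import CoreLocation

final class LocationProvider: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // Call this from the CLLocationButton's action once access is granted
    func requestOneShotLocation() {
        manager.requestLocation() // A single fix, no continuous updates
    }

    func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
        guard let fix = locations.last else { return }
        print("Got location: \(fix.coordinate)")
    }

    func locationManager(_ manager: CLLocationManager, didFailWithError error: Error) {
        print("Location request failed: \(error)")
    }
}
```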

You can tweak its icon and label properties to get predefined and fully localized iconography and text:

[Image: A demo of different styles of CLLocationButton]


Keyboard Layout Guide

Not much to say here, aside from another finally! We now have a keyboard version of UILayoutGuide:

testView.bottomAnchor.constraint(equalTo: view.keyboardLayoutGuide.topAnchor).isActive = true

But, what’s the tricky part about constraining to the keyboard? The fact that it can float all over the joint, and get resized on the fly. Thankfully, Cupertino and Friends thought of this:

view.keyboardLayoutGuide.followsUndockedKeyboard = true

If that is, in fact, what you want. If this is false (the default value), then the guide’s top anchor will track the bottom anchor of the view’s safe area layout guide whenever the keyboard is undocked and floating about.
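Put together, pinning something like a message bar above the keyboard might look like this sketch (the view names are assumptions):

```swift
// A hypothetical input bar that should ride along with the keyboard
let messageBar = UIView()
messageBar.translatesAutoresizingMaskIntoConstraints = false
view.addSubview(messageBar)

NSLayoutConstraint.activate([
    messageBar.leadingAnchor.constraint(equalTo: view.leadingAnchor),
    messageBar.trailingAnchor.constraint(equalTo: view.trailingAnchor),
    messageBar.heightAnchor.constraint(equalToConstant: 44),
    // Tracks the keyboard; when it's hidden, the guide sits at the bottom safe area
    messageBar.bottomAnchor.constraint(equalTo: view.keyboardLayoutGuide.topAnchor)
])
```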


UIToolTipInteraction

There’s new API for tooltips, and it follows the “interaction” pattern that’s been used more and more in UIKit. For example, it works almost identically to pointer interactions:

class ViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let square = UIView(frame: CGRect(x: 200, y: 200, width: 100, height: 100))
        square.backgroundColor = .red
        view.addSubview(square)

        let toolTip = UIToolTipInteraction()
        toolTip.delegate = self
        square.addInteraction(toolTip)
    }
}

extension ViewController: UIToolTipInteractionDelegate {
    // If both delegate methods are implemented, this one takes precedence
    func toolTipInteraction(_ interaction: UIToolTipInteraction, toolTipAt point: CGPoint) -> String? {
        return "Hi There - I'm showing a tooltip"
    }

    func toolTipInteraction(_ interaction: UIToolTipInteraction, toolTipAt point: CGPoint, boundingRect outRect: UnsafeMutablePointer<CGRect>) -> String? {
        return "Bounding rect delegate function"
    }
}

However, try as I might, I can’t seem to get these delegate methods to fire on beta one. Though the header docs don’t mention it, and it is certainly available on iOS, perhaps this only occurs for Catalyst apps?

Image Decoding

A few years ago I wrote about the hoops you had to jump through to take less of a hit on image decoding. Now, there are built in methods to do this in a trivial fashion straight from ImageIO!

let thumbnailImg = UIImage(named: "Baylor")!
let thumbnail = thumbnailImg.preparingThumbnail(of: CGSize(width: 200, height: 400))

// Or, do it async on a background thread
thumbnailImg.prepareThumbnail(of: CGSize(width: 200, height: 400)) { thumbnail in
    // Now dispatch back to main to use it
}

Also, you can tack a shape on to SF Symbols, which is very neat:

let trashCircle = UIImage(systemName: "trash", shape: .circle)
let imageView = UIImageView(image: trashCircle)
imageView.tintColor = .red
imageView.frame = CGRect(x: 100, y: 100, width: 200, height: 200)

let trashCircleSquare = UIImage(systemName: "trash", shape: .square)
let imageView2 = UIImageView(image: trashCircleSquare)
imageView2.tintColor = .red
imageView2.frame = CGRect(x: 100, y: 400, width: 200, height: 200)

Which results in:

[Image: Shape symbol variants]

Though, if the shape variant isn’t supported, it’ll return nil. I can see this resulting in more flexible user interfaces. It’s a nice touch that I didn’t really know I wanted, but I can think of several use cases for it.

Final Thoughts

While none of us outside Cupertino and Friends are iOS 15 cognoscenti yet, we’ll get there slowly, beta by beta. I’m impressed with this release, and it has several “nice” things. It reminds me of iOS 12, a rock solid release that was lighter on huge, new and my-summer-is-gone features. That’s fine with me.

There are still plenty of interesting things in UIKit I haven’t looked at yet, like the new button configurations which seem to allow for multiline text and other perks. It doesn’t seem like UIKit has lost a step, so go boot up your beta and dig in!

Of course, all of these will eventually find themselves in the best-in-class book too, and I can’t wait to uncover ways to make all of these new APIs shine.

Until next time ✌️

  1. Or its SwiftUI variant, LocationButton. 

WWDC 2021: The Pregame Quiz

Jun 1st, 2021

And we’re back! Another edition of the Swiftjective-C WWDC Pregame Quiz! While we wait to start round two of virtual dub dub, this year’s quiz is coming at you fresh and ready. With so much shipped already from Cupertino & Friends©, it begs the question - what are they clearing the keynote for?

We’ll know soon enough, but until then let’s gear up with the seventh annual Swiftjective-C WWDC Pregame Quiz!

If you’d like a quick primer on how this all works or how it got started, check out the first six quizzes from 2015, 2016, 2017, 2018, 2019 and 2020.

quiz(publisherFor: .dubDubTwentyTwentyOne)
    .sink { questions in
        print("Lets do it! Here are the questions: \(questions)")
    }
    .store(in: &subs)

Ground Rules

There are three rounds, and the point break down is as follows:

  • Round 1 – 1 point each answer
  • Round 2 - 2 points each answer
  • Round 3 - 3 points each answer

The last question of each round is an optional wildcard question. Get it right, and your team gets 4 points, but miss it and the team will be deducted 2 points.

Round 1 - A History Lesson: The First iPhone 📱

Question 1:
While the first iPhone was still a mystery to the rest of the world, even to those inside Apple, what was the codename it was known under internally at Apple?

Question 2:
The iPhone actually wasn’t the first mobile phone to have iTunes on it. In fact, Apple collaborated with another cell phone manufacturer to develop the ROKR E1, which had iTunes but was limited to 100 songs. Which manufacturer was it?

Question 3:
Before we had an official SDK for iPhoneOS, Jobs infamously promoted the use of web apps as a means of getting third party applications on the iPhone. What was the name of one of the very first of these apps (which was a shopping list)?

Question 4:
A fellow named Michael Kovatch held a very important domain name that Apple needed for launch, and they eventually paid him $1,000,000 for it.

What was the name of it?

Question 5 (Wildcard):
The first month the iPhone was out in the wild wasn’t without issue. Which famous YouTube personality received a 300 page bill, shipped in a box, detailing their month’s charges?

Round 2 — Erm…“Epic” Questions 🙊

Question 1:
During Fortnite’s first two years on the App Store, Epic and Apple were enjoying a symbiotic relationship wherein Apple spent $1,000,000 of its own money to market the game.

How much money did Apple make from its App Store cut on Fortnite during these first two years?

Question 2:
The age old discussion of whether or not to let App Stores within the App Store, like Microsoft’s Project xCloud service, has been renewed with Apple vs Epic.

Which highly prominent game changed their branding to reflect that its developers build “in game experiences” instead of games during the trial to avoid being labeled as an App Store within the App Store?

Question 3:
In internal discussions regarding bringing iMessage to Android, Phil Schiller analogized the situation by mentioning another one of Apple’s own marquee applications “lost on Windows because it wasn’t a great Windows (redacted)…didn’t innovate enough…spent $0 marketing…shipped one update a year”.

Which application, still around today, was he talking about?

Question 4:
Contrary to popular belief, macOS isn’t immune to malware. In fact, there was malware which infected around 2,500 apps and 128 million customers that led to SourceDNA getting acquired by Apple.

Which malware was it?

Question 5 (Wildcard):
While Apple makes handsome margins on the App Store, which gaming console was revealed during the trial to have never turned a profit on hardware?

Round 3 — Pour One Out for Forstall 🍻

Question 1:
While developing the initial iPhone, Steve was given the choice of either “shrinking down the mac or enlarging the iPod” and two teams competed to win the iPhone using their approach. Forstall went the “shrink down the mac” route - who did he compete against that took the iPod route?

Question 2:
Forstall argued to Phil Schiller that they should let Yahoo! include which API of theirs on iPhone in the early days, and that by doing so Apple would “take the high ground and compete”?

Question 3:
Apple released a public apology over the launch of a premiere application alongside iOS 6 that was considered buggy and inaccurate. It’s rumored that Forstall refused to sign the letter - which app was the letter apologizing for?

Question 4:
A man of many talents, Forstall also has design chops in addition to software engineering skills. Which Mac user interface, whose success led to him being promoted to SVP in 2003, was he primarily in charge of?

Question 5 (Wildcard):
Though he laid low after his departure from Apple, years later he took up another interest. What type of “product” did he become a co-producer of?

Answer Key

Round 1:

  1. Purple
  2. Motorola. Also, Steve Jobs loathed the end product, which in part led to the iPhone.
  3. Good ol’ OneTrip - which you can still use here!
  4. iphone.com
  5. Wildcard: iJustine

Round 2:

  1. One HUNDRED million dollars
  2. Roblox
  3. Safari
  4. XcodeGhost
  5. Wildcard: Xbox

Round 3:

  1. Tony Fadell
  2. Yahoo!’s Widget Engine
  3. Maps
  4. The Aqua User Interface, known for translucent icons and water-themed visual cues.
  5. Wildcard: He does Broadway plays! He co-produced Fun Home.