# Day 53: Project 13: Instafilter, Part Two

Follow along at https://www.hackingwithswift.com/100/53.

## 📒 Field Notes

This day covers the second part of Project 13: Instafilter in Hacking with Swift.

I previously created projects alongside the book's material in a separate repository, and you can find Project 13 here. Even better, though, I copied it over to Day 52's folder so I could extend it from where I left off.

With that in mind, Day 53 focuses on several specific topics:

- Applying filters: CIContext, CIFilter
- Saving to the iOS photo library

### Applying filters: CIContext, CIFilter

For image processing, it's useful to think about where Core Image sits relative to UIImage. Essentially, UIImages are the "high-level" bookends to the "low-level" processing that Core Image and Core Graphics do in between (a minimal sketch follows the list below):

- A CIImage is instantiated with a UIImage.
- That CIImage is set as the input image (kCIInputImageKey) on a CIFilter.
- The CIFilter has an outputImage that's lying in wait to be processed by the CIContext.
- The processed image is a CGImage (CG being "Core Graphics").
- This CGImage can then be converted into another UIImage — thus completing the filtering cycle ♻.
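Here's a minimal, self-contained sketch of that round trip. The CISepiaTone filter and the `sepiaTone(from:)` helper are only illustrative, not taken from the project:

```swift
import CoreImage
import UIKit

/// A hypothetical helper showing the UIImage → CIImage → CIFilter → CGImage → UIImage round trip.
func sepiaTone(from original: UIImage) -> UIImage? {
    // UIImage → CIImage
    guard let inputImage = CIImage(image: original) else { return nil }

    // Configure the filter with its input image and a parameter value.
    guard let filter = CIFilter(name: "CISepiaTone") else { return nil }
    filter.setValue(inputImage, forKey: kCIInputImageKey)
    filter.setValue(0.8, forKey: kCIInputIntensityKey)

    // CIFilter → CGImage, rendered by a CIContext.
    guard
        let outputImage = filter.outputImage,
        let cgImage = CIContext().createCGImage(outputImage, from: outputImage.extent)
    else { return nil }

    // CGImage → UIImage
    return UIImage(cgImage: cgImage)
}
```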

There are far more details than that, but alas, I'd be better off deferring to Apple's Core Image Programming Guide.

For our app, that means keeping references to all of these pieces (the filter, the context, and the current filter's key and value) so they're available when the processing is applied. My applyImageProcessing method, while abstracting some details elsewhere, looks like this:

```swift
func applyImageProcessing() {
    guard let (filterKey, filterValue) = currentFilterInfo else {
        return assertionFailure("Unable to compute processing properties for current filter")
    }

    // Set the filter's value first so that `outputImage` reflects it.
    currentImageFilter.setValue(filterValue, forKey: filterKey)

    guard let currentOutputImage = currentImageFilter.outputImage else {
        return assertionFailure("Unable to find output image in current filter.")
    }

    if let processedImage = imageFilterContext.createCGImage(currentOutputImage, from: currentOutputImage.extent) {
        imageView.image = UIImage(cgImage: processedImage)
    }
}
```
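For context, here's a hedged sketch of how the supporting properties that method relies on might be declared. The names match the snippet above, but the real project may define them differently (for example, the filter key and value likely come from a slider or a filter-settings type rather than the hard-coded pair shown here):

```swift
import CoreImage
import UIKit

// Hypothetical supporting declarations; the actual project's setup may differ.
class FilteringViewControllerSketch: UIViewController {
    @IBOutlet var imageView: UIImageView!

    let imageFilterContext = CIContext()
    let currentImageFilter = CIFilter(name: "CISepiaTone")!

    /// The key/value pair to apply for the currently selected filter,
    /// e.g. an intensity driven by a slider.
    var currentFilterInfo: (key: String, value: Any)? {
        (kCIInputIntensityKey, 0.5)
    }
}
```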

### Saving to the iOS photo library

For this, we start with an extremely straightforward function, UIImageWriteToSavedPhotosAlbum, but then we need to configure its completion callback in a way that's slightly less straightforward:

```swift
UIImageWriteToSavedPhotosAlbum(
    currentImage,
    self,
    #selector(image(_:didFinishSavingWithError:contextInfo:)),
    nil
)
```

...and then, the matching callback:

```swift
@objc func image(_ image: UIImage, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {
    let alertController = UIAlertController(title: nil, message: nil, preferredStyle: .alert)

    if let error = error {
        alertController.title = "Save Error"
        alertController.message = error.localizedDescription
    } else {
        alertController.title = "Saved!"
        alertController.message = "Your altered image has been saved to your photos."
    }

    alertController.addAction(UIAlertAction(title: "OK", style: .default))

    present(alertController, animated: true)
}
```

Admittedly, this callback style does make a lot of sense when you're familiar with Apple's naming conventions for delegate methods. But still... I can see where the Objective-C-ness can be a bit off-putting.
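If you'd rather keep the #selector out of your view controllers entirely, one option (a sketch of my own, not from the project) is to wrap the function and its required callback in a small closure-based helper:

```swift
import UIKit

/// A hypothetical wrapper that hides the C-style callback behind a closure.
/// Note: callers should keep a strong reference to the saver until the callback fires.
final class PhotoAlbumSaver: NSObject {
    private var completion: ((Error?) -> Void)?

    func save(_ image: UIImage, completion: @escaping (Error?) -> Void) {
        self.completion = completion

        UIImageWriteToSavedPhotosAlbum(
            image,
            self,
            #selector(image(_:didFinishSavingWithError:contextInfo:)),
            nil
        )
    }

    @objc private func image(_ image: UIImage, didFinishSavingWithError error: Error?, contextInfo: UnsafeRawPointer) {
        completion?(error)
        completion = nil
    }
}
```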

.... Choppy like an image run through the CIPixellate filter we just made 🥁.