100 Days of SwiftUI – Day 63

We’ve arrived at day 63 of the 100 Days of SwiftUI. Yesterday we were introduced to the latest project we’ll be working on, called Instafilter. Today we start implementing the app by diving into Core Image, Apple’s framework for processing and analysing photos and videos. Let’s get started!

Integrating Core Image with SwiftUI

Using Core Image isn’t quite the same as drawing, at least for the most part. Instead of drawing paths onto the screen, we change or manipulate existing images (and videos, but we won’t be looking at those here): we can sharpen, blur or pixelate photos, for example.

While Core Image and SwiftUI are both Apple’s creations, they don’t integrate particularly well. There are reasons for this, but for the sake of brevity we won’t dive into them here. What I will say is that, because the integration is a bit iffy, it would be wise to follow Paul’s article/video to ensure you fully grasp how Core Image is used.
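To give a rough idea of what that integration looks like, here’s a minimal sketch of loading a picture, running it through a Core Image filter and handing the result back to SwiftUI. The asset name “Example” and the choice of sepia tone filter are just placeholders for illustration, not the project’s final code.

```swift
import CoreImage
import CoreImage.CIFilterBuiltins
import SwiftUI

struct ContentView: View {
    @State private var image: Image?

    var body: some View {
        VStack {
            image?
                .resizable()
                .scaledToFit()
        }
        .onAppear(perform: loadImage)
    }

    func loadImage() {
        // "Example" is a placeholder asset name – use any image in your asset catalog.
        guard let inputImage = UIImage(named: "Example") else { return }
        let beginImage = CIImage(image: inputImage)

        // A CIContext does the actual rendering; creating one is expensive, so reuse it where possible.
        let context = CIContext()
        let currentFilter = CIFilter.sepiaTone()
        currentFilter.inputImage = beginImage
        currentFilter.intensity = 1

        // Core Image is lazy: nothing is rendered until we ask the context for a CGImage,
        // which we then wrap back up into something SwiftUI can display.
        guard let outputImage = currentFilter.outputImage,
              let cgImage = context.createCGImage(outputImage, from: outputImage.extent) else { return }

        image = Image(uiImage: UIImage(cgImage: cgImage))
    }
}
```

The awkward part is exactly that last dance – CIImage to CGImage to UIImage to SwiftUI’s Image – which is why Paul’s walkthrough is worth following closely.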

Wrapping a UIViewController in a SwiftUI view

SwiftUI is a really fantastic framework for building apps, but right now it’s far from complete – there are many things it just can’t do, so you need to learn to talk to UIKit if you want to add more advanced functionality. Sometimes this will be to integrate existing code you wrote for UIKit (for example, if you work for a company with an existing UIKit app), but other times it will be because UIKit and Apple’s other frameworks provide us with useful code we want to show inside a SwiftUI layout.

In this project we’re going to ask users to import a picture from their photo library. Apple’s APIs come with dedicated code for doing just this, but that hasn’t been ported to SwiftUI and so we need to write that bridge ourself. Instead, it’s built into a separate framework called PhotosUI, which was designed to work with UIKit and so requires us to look at the way UIKit works.

Hacking with Swift, Paul Hudson (@twostraws)

As the quote from Paul above alludes to, this is, once again, quite a complicated subject. UIKit is Apple’s older framework, which was and still is used to create apps for Apple devices. SwiftUI is much easier to work with and is what developers will continue using for the foreseeable future, but it isn’t quite complete yet. Apple hasn’t updated all of its older frameworks to work with it, which means that in some cases developers need to bridge the gap themselves.
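To make that a bit more concrete, here’s a rough sketch of what such a bridge can look like: a SwiftUI wrapper around PhotosUI’s PHPickerViewController using the UIViewControllerRepresentable protocol. The type name ImagePicker and the exact configuration are my own illustration rather than the project’s final code, so treat it as a sketch.

```swift
import PhotosUI
import SwiftUI

// A minimal sketch of bridging PHPickerViewController into SwiftUI.
struct ImagePicker: UIViewControllerRepresentable {
    @Binding var image: UIImage?

    func makeUIViewController(context: Context) -> PHPickerViewController {
        var config = PHPickerConfiguration()
        config.filter = .images
        let picker = PHPickerViewController(configuration: config)
        picker.delegate = context.coordinator
        return picker
    }

    func updateUIViewController(_ uiViewController: PHPickerViewController, context: Context) {
        // Nothing to update – the picker manages its own state.
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    // The coordinator acts as the picker's UIKit delegate and passes the result back to SwiftUI.
    class Coordinator: NSObject, PHPickerViewControllerDelegate {
        let parent: ImagePicker

        init(_ parent: ImagePicker) {
            self.parent = parent
        }

        func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
            picker.dismiss(animated: true)

            guard let provider = results.first?.itemProvider,
                  provider.canLoadObject(ofClass: UIImage.self) else { return }

            provider.loadObject(ofClass: UIImage.self) { image, _ in
                DispatchQueue.main.async {
                    self.parent.image = image as? UIImage
                }
            }
        }
    }
}
```

The Coordinator is the interesting bit: it’s a plain UIKit delegate object that SwiftUI creates and retains for us, and it’s how the picked photo finds its way back into SwiftUI state.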

As has become a recurring theme the deeper we get into this course, I recommend always going through Paul’s fantastic videos/articles. You can find this particular one here.

Wrap up

And that’s it for day 63! Tomorrow we’ll continue with the second part of this app’s implementation. If you’re following along yourself, I’m sure you’re already surprised by the results. I’ve used photo editing apps in the past, and it’s really cool to get an insight into how they work and to be able to build something similar ourselves, to an extent. Time to recharge for now!

Darryl

