As it turns out, there are quite a few ways to resize images in iOS, and if you're looking for a breakdown of your options, NSHipster has a great article comparing them. This article is more about how I did it in an app I'm working on and one of the pitfalls that I ran into (and ended up debugging for a couple of hours).

Here's the UIImage extension that I used originally:

import UIKit

extension UIImage {
    func resizeImageTo(newSize: CGSize) -> UIImage? {
        let hasAlpha = false
        let scale: CGFloat = 0.0
        // Render the image into a bitmap context at the new size.
        UIGraphicsBeginImageContextWithOptions(newSize, !hasAlpha, scale)
        self.draw(in: CGRect(origin: CGPoint.zero, size: newSize))
        let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return scaledImage
    }
}
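
For context, calling it is straightforward. Here's a quick usage sketch; the asset name and target size are made up:

// Hypothetical usage: shrink a full-resolution photo (loaded from the asset
// catalog here as a stand-in for a camera capture) down to 1000x1000 points.
if let photo = UIImage(named: "CameraPhoto"),
   let resized = photo.resizeImageTo(newSize: CGSize(width: 1000, height: 1000)) {
    print(resized.size) // (1000.0, 1000.0)
}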

But I kept running into a very weird issue.

My app uses Core Data to store images, so, as you do in Core Data, I turned my UIImage into an NSData object in my save method like so:

// Compress the resized image to JPEG before storing it on the managed object.
if let imageData = UIImageJPEGRepresentation(image, 0.6) {
    entry.image = imageData as NSData
}
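
Reading it back out is just the reverse, roughly something like this sketch (using the same entry object as above):

// Sketch of the read side: turn the stored NSData back into a UIImage.
if let imageData = entry.image as Data?,
   let storedImage = UIImage(data: imageData) {
    print(storedImage.size, storedImage.scale)
}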

But I noticed something very interesting: my images were still quite large. When I hopped into LLDB to do some poking around, I found that a UIImage created from the NSData object reported the same size values as the original image straight from the camera, pre-resize. It made no sense to me at first. A container dump of my application confirmed it: the saved image files were at full resolution.

After more poking around, I noticed that the scale value of the resized UIImage was 3.0, while the new UIImage loaded from the NSData object had a scale of 1.0. That's when it clicked.
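
To make the mismatch concrete, here's roughly what I was seeing, with hypothetical numbers for a 3x device and a 1000x1000 target size (originalImage here is a stand-in for the photo from the camera):

// Hypothetical numbers on a 3x device, using the original extension above.
let originalImage = UIImage(named: "CameraPhoto")! // stand-in for the camera photo
let resized = originalImage.resizeImageTo(newSize: CGSize(width: 1000, height: 1000))!

print(resized.size, resized.scale)   // (1000.0, 1000.0) 3.0 -> the bitmap is 3000x3000 pixels

// The JPEG is encoded from that 3000x3000-pixel bitmap, and the scale factor
// isn't stored with it, so the image loads back in at 1x:
let loaded = UIImage(data: UIImageJPEGRepresentation(resized, 0.6)!)!

print(loaded.size, loaded.scale)     // (3000.0, 3000.0) 1.0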

According to Apple's UIGraphicsBeginImageContextWithOptions documentation on the scale option:

The scale factor to apply to the bitmap. If you specify a value of 0.0, the scale factor is set to the scale factor of the device’s main screen.

I'm doing the majority of my development on an iPhone 7 Plus, whose default scale factor is set to 3.0. Checking UIScreen.main.scale in LLDB proved that to be true.
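
If you want to check the device or simulator you're running on, it's a one-liner:

// Prints 3.0 on a 3x device like the iPhone 7 Plus, 2.0 on a 2x device.
print(UIScreen.main.scale)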

My modified code looked like this. If you're running into the same issue, where your images come out at a different size after UIImageJPEGRepresentation or UIImagePNGRepresentation, make sure you're passing UIGraphicsBeginImageContextWithOptions the proper scale value!

import UIKit

extension UIImage {
    func resizeImageTo(newSize: CGSize) -> UIImage? {
        let hasAlpha = false
        // An explicit scale of 1.0 keeps the bitmap at exactly newSize in pixels,
        // instead of newSize multiplied by the device's screen scale.
        let scale: CGFloat = 1.0
        UIGraphicsBeginImageContextWithOptions(newSize, !hasAlpha, scale)
        self.draw(in: CGRect(origin: CGPoint.zero, size: newSize))
        let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return scaledImage
    }
}

After changing the scale to 1.0, the resized UIImage and the one I loaded from the NSData as a test had the same scale value, and therefore the same size values.
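
A quick round trip (again with the made-up numbers and the same stand-in originalImage from before) is an easy way to sanity-check the fix:

// With the scale pinned to 1.0, a 1000x1000 resize survives the round trip intact.
let resized = originalImage.resizeImageTo(newSize: CGSize(width: 1000, height: 1000))!
let roundTripped = UIImage(data: UIImageJPEGRepresentation(resized, 0.6)!)!

print(resized.size, roundTripped.size)    // (1000.0, 1000.0) (1000.0, 1000.0)
print(resized.scale, roundTripped.scale)  // 1.0 1.0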

It's a small gotcha, but knowing that those two functions encode your image at its full pixel dimensions, its point size multiplied by its scale, will save you a ton of time when you're trying to resize your images. Hopefully it'll also help you get a small jump on your app's performance.