iOS: Converting UIImage to RGBA8 Bitmaps and Back

Edited 8/24/11: Fixed a bug with alpha transparency not being preserved. Thanks for the tip Scott! Updated the gist and github project to test transparent images.

Edited 12/13/10: Updated the code on github/gist to fix static analyzer warnings. Changed a function name to conform to the Apple standard.

When I started working on the iPhone I was using Objective-C and C++. I created a library in C++ and needed access to a bitmap array so that I could perform image processing. To do so, I had to create some helper functions to convert between UIImage objects and RGBA8 bitmap arrays.

Here are the updated routines that should work on iPhone 4.1 and iPad 3.2. The iPhone 4 has a high-resolution screen that requires setting a scaling factor for high-resolution images, so I've added support to set the scaling factor based on the device's mainScreen scale factor.

UPDATE 9/23/10: My code to work with the Retina display was incorrect. It ran fine on the iPad with 3.2, but it didn't do anything "high-res" on iPhone 4. I was using the following check:

__IPHONE_OS_VERSION_MAX_ALLOWED >= 30200

but it isn't safe: the macro reflects the SDK you compile against, not the OS running on the device, so for a universal 4.1/3.2 app it always evaluates to 40100 and the expression is always true. (Side note: I took this check from Apple's website back when iPad 3.2 was actually ahead of iPhone 3.1.x, but that doesn't help with iPhone 4.1 being ahead of iPad 3.2.)

The issue with iPad is that the imageWithCGImage:scale:orientation: selector doesn't exist on iOS 3.2 (it most likely will in iOS 4.2), so the following code should be safe. Some methods in iOS 4.1 don't exist in iOS 3.2, so you need to check whether a newer method exists before trying to call it. There are two checks you can use, depending on the class/instance (+/-) modifier on the method definition:

+ (BOOL)respondsToSelector:(SEL)aSelector   // (+) Class method check
+ (BOOL)instancesRespondToSelector:(SEL)aSelector   // (-) Instance method check

imageWithCGImage:scale:orientation: is a class method, so we need to use respondsToSelector:. The correct code to scale the CGImage is below:

if([UIImage respondsToSelector:@selector(imageWithCGImage:scale:orientation:)]) {
	float scale = [[UIScreen mainScreen] scale];
	image = [UIImage imageWithCGImage:imageRef scale:scale orientation:UIImageOrientationUp];
} else {
	image = [UIImage imageWithCGImage:imageRef];
}
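
If you ever need to guard a newer instance method instead of a class method, instancesRespondToSelector: is used the same way. Here is a minimal sketch (the UIScreen scale property is just an illustration here; it is also only available on iOS 4.0 and later):

CGFloat scale = 1.0f;	// Default for older OS versions and non-Retina devices
if([UIScreen instancesRespondToSelector:@selector(scale)]) {
	// -scale is an instance method, so ask the class whether instances respond to it
	scale = [[UIScreen mainScreen] scale];
}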


Some images might help explain what happens if you don't use imageWithCGImage:scale:orientation: on the iPhone 4 with the correct scale factor. The scale should be 2.0 on Retina displays (iPhone 4 or the new iPod touch) and 1.0 on the 3G, 3GS, and iPad; float scale = [[UIScreen mainScreen] scale]; provides the correct scale factor for the device. The first image has jaggies in it, while the second does not. The third image, from an iPhone 3G/3GS, also does not have jaggies.

[caption id="attachment_697" align="aligncenter" width="451" caption="iPhone 4 with default scale of 1.0 causes the image to be enlarged and with jaggies."][/caption]

[caption id="attachment_698" align="aligncenter" width="451" caption="iPhone 4 with scaling of 2.0 makes the image half the size and removes the jaggies"][/caption]

[caption id="attachment_692" align="aligncenter" width="414" caption="iPhone 3G/3GS with scaling set to 1.0"][/caption]

I hope this helps other people with image processing on the iPhone/iPad. It's based on some previous OpenGL tutorials, which I fixed (memory leaks) and modified to work with unsigned char arrays (bitmaps).
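
If you are using the bitmap with OpenGL ES (the tutorials this code grew out of), the RGBA8 layout is exactly what glTexImage2D expects. This is only a rough sketch, assuming you already have an OpenGL ES 1.1 context set up and that width and height are powers of two (an ES 1.1 texture requirement):

// Requires #import <OpenGLES/ES1/gl.h> at the top of the file
GLuint texture;
glGenTextures(1, &texture);
glBindTexture(GL_TEXTURE_2D, texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, bitmap);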


Grab the two files here or the sample Universal iOS App project:

Example Usage:

NSString *path = [[NSBundle mainBundle] pathForResource:@"Icon4" ofType:@"png"];
UIImage *image = [UIImage imageWithContentsOfFile:path];
int width = image.size.width;
int height = image.size.height;

// Create a bitmap
unsigned char *bitmap = [ImageHelper convertUIImageToBitmapRGBA8:image];

// Create a UIImage using the bitmap
UIImage *imageCopy = [ImageHelper convertBitmapRGBA8ToUIImage:bitmap withWidth:width withHeight:height];

// Display the image copy on the GUI
UIImageView *imageView = [[UIImageView alloc] initWithImage:imageCopy];

// Cleanup
free(bitmap);
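
In the example above you could also modify the pixels in place before freeing the buffer (or before converting it back to a UIImage). Each pixel is 4 bytes in R, G, B, A order, and the helper uses a row length of width * 4 bytes, so the pixel at (x, y) starts at offset (y * width + x) * 4. Keep in mind that the bitmap context is created with kCGImageAlphaPremultipliedLast, so the color components are premultiplied by alpha. A rough sketch that strips the blue channel:

// Hypothetical in-place edit: zero out the blue channel
for(int y = 0; y < height; y++) {
	for(int x = 0; x < width; x++) {
		int offset = (y * width + x) * 4;	// 4 bytes per pixel: R, G, B, A
		bitmap[offset + 2] = 0;	// blue is the third byte
	}
}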

Below is the full source code for converting between bitmap and UIImage:

ImageHelper.h

/*
 * The MIT License
 *
 * Copyright (c) 2011 Paul Solt, PaulSolt@gmail.com
 *
 * https://github.com/PaulSolt/UIImage-Conversion/blob/master/MITLicense.txt
 *
 */

#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>

@interface ImageHelper : NSObject {
}

/** Converts a UIImage to an RGBA8 bitmap.
 @param image - a UIImage to be converted
 @return an RGBA8 bitmap, or NULL if there was a memory allocation problem. Clean up the memory with free() when done.
 */
+ (unsigned char *) convertUIImageToBitmapRGBA8:(UIImage *)image;

/** A helper routine used to create an RGBA8 bitmap context from a CGImage.
 @return a new context that is owned by the caller
 */
+ (CGContextRef) newBitmapRGBA8ContextFromImage:(CGImageRef)image;

/** Converts an RGBA8 bitmap to a UIImage.
 @param buffer - the RGBA8 unsigned char * bitmap
 @param width - the number of pixels wide
 @param height - the number of pixels tall
 @return a UIImage that is autoreleased, or nil if there was a memory allocation problem
 */
+ (UIImage *) convertBitmapRGBA8ToUIImage:(unsigned char *)buffer
                                withWidth:(int)width
                               withHeight:(int)height;

@end
ImageHelper.m
/*
 * The MIT License
 *
 * Copyright (c) 2011 Paul Solt, PaulSolt@gmail.com
 *
 * https://github.com/PaulSolt/UIImage-Conversion/blob/master/MITLicense.txt
 *
 */

#import "ImageHelper.h"

@implementation ImageHelper

+ (unsigned char *) convertUIImageToBitmapRGBA8:(UIImage *)image {

	CGImageRef imageRef = image.CGImage;

	// Create a bitmap context to draw the UIImage into
	CGContextRef context = [self newBitmapRGBA8ContextFromImage:imageRef];

	if(!context) {
		return NULL;
	}

	size_t width = CGImageGetWidth(imageRef);
	size_t height = CGImageGetHeight(imageRef);

	CGRect rect = CGRectMake(0, 0, width, height);

	// Draw image into the context to get the raw image data
	CGContextDrawImage(context, rect, imageRef);

	// Get a pointer to the data
	unsigned char *bitmapData = (unsigned char *)CGBitmapContextGetData(context);

	// Copy the data and release the context's memory (the returned buffer is allocated with malloc())
	size_t bytesPerRow = CGBitmapContextGetBytesPerRow(context);
	size_t bufferLength = bytesPerRow * height;

	unsigned char *newBitmap = NULL;

	if(bitmapData) {
		newBitmap = (unsigned char *)malloc(sizeof(unsigned char) * bytesPerRow * height);

		if(newBitmap) {	// Copy the data
			for(int i = 0; i < bufferLength; ++i) {
				newBitmap[i] = bitmapData[i];
			}
		}

		free(bitmapData);
	} else {
		NSLog(@"Error getting bitmap pixel data\n");
	}

	CGContextRelease(context);

	return newBitmap;
}
+ (CGContextRef) newBitmapRGBA8ContextFromImage:(CGImageRef)image {
	CGContextRef context = NULL;
	CGColorSpaceRef colorSpace;
	uint32_t *bitmapData;

	size_t bitsPerPixel = 32;
	size_t bitsPerComponent = 8;
	size_t bytesPerPixel = bitsPerPixel / bitsPerComponent;

	size_t width = CGImageGetWidth(image);
	size_t height = CGImageGetHeight(image);

	size_t bytesPerRow = width * bytesPerPixel;
	size_t bufferLength = bytesPerRow * height;

	colorSpace = CGColorSpaceCreateDeviceRGB();

	if(!colorSpace) {
		NSLog(@"Error allocating color space RGB\n");
		return NULL;
	}

	// Allocate memory for image data
	bitmapData = (uint32_t *)malloc(bufferLength);

	if(!bitmapData) {
		NSLog(@"Error allocating memory for bitmap\n");
		CGColorSpaceRelease(colorSpace);
		return NULL;
	}

	// Create bitmap context
	context = CGBitmapContextCreate(bitmapData,
									width,
									height,
									bitsPerComponent,
									bytesPerRow,
									colorSpace,
									kCGImageAlphaPremultipliedLast);	// RGBA

	if(!context) {
		free(bitmapData);
		NSLog(@"Bitmap context not created");
	}

	CGColorSpaceRelease(colorSpace);

	return context;
}
+ (UIImage *) convertBitmapRGBA8ToUIImage:(unsigned char *)buffer
								withWidth:(int)width
							   withHeight:(int)height {

	size_t bufferLength = width * height * 4;
	CGDataProviderRef provider = CGDataProviderCreateWithData(NULL, buffer, bufferLength, NULL);
	size_t bitsPerComponent = 8;
	size_t bitsPerPixel = 32;
	size_t bytesPerRow = 4 * width;

	CGColorSpaceRef colorSpaceRef = CGColorSpaceCreateDeviceRGB();
	if(colorSpaceRef == NULL) {
		NSLog(@"Error allocating color space");
		CGDataProviderRelease(provider);
		return nil;
	}

	CGBitmapInfo bitmapInfo = kCGBitmapByteOrderDefault | kCGImageAlphaPremultipliedLast;
	CGColorRenderingIntent renderingIntent = kCGRenderingIntentDefault;

	CGImageRef iref = CGImageCreate(width,
									height,
									bitsPerComponent,
									bitsPerPixel,
									bytesPerRow,
									colorSpaceRef,
									bitmapInfo,
									provider,	// data provider
									NULL,		// decode
									YES,		// should interpolate
									renderingIntent);

	uint32_t *pixels = (uint32_t *)malloc(bufferLength);

	if(pixels == NULL) {
		NSLog(@"Error: Memory not allocated for bitmap");
		CGDataProviderRelease(provider);
		CGColorSpaceRelease(colorSpaceRef);
		CGImageRelease(iref);
		return nil;
	}

	CGContextRef context = CGBitmapContextCreate(pixels,
												 width,
												 height,
												 bitsPerComponent,
												 bytesPerRow,
												 colorSpaceRef,
												 bitmapInfo);

	if(context == NULL) {
		NSLog(@"Error context not created");
		free(pixels);
		pixels = NULL;	// Prevent a double free in the cleanup below
	}

	UIImage *image = nil;
	if(context) {

		CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, width, height), iref);

		CGImageRef imageRef = CGBitmapContextCreateImage(context);

		// Support both iPad 3.2 and iPhone 4 Retina displays with the correct scale
		if([UIImage respondsToSelector:@selector(imageWithCGImage:scale:orientation:)]) {
			float scale = [[UIScreen mainScreen] scale];
			image = [UIImage imageWithCGImage:imageRef scale:scale orientation:UIImageOrientationUp];
		} else {
			image = [UIImage imageWithCGImage:imageRef];
		}

		CGImageRelease(imageRef);
		CGContextRelease(context);
	}

	CGColorSpaceRelease(colorSpaceRef);
	CGImageRelease(iref);
	CGDataProviderRelease(provider);

	if(pixels) {
		free(pixels);
	}

	return image;
}

@end
