Mocking a UIKit Delegate Protocol with Kiwi

I'm experimenting with mock objects in Kiwi and decided to try to mock the UIAlertViewDelegate protocol. I ran into two issues:

1. Order is important when creating the mock delegate object. It needs to be set up with all the delegate method receive calls before you use the object associated with the delegate protocol.

2. The delegate protocol methods need to be mocked in the order they are invoked, or you'll receive a test failure. This can be trial and error if you don't know how the delegate protocol works.

a. For each mock delegate test, I just kept running until I was able to get the mock delegate calls passing.

b. I also discovered that didDismissWithButtonIndex: and clickedButtonAtIndex: were not invoked after using the dismissWithClickedButtonIndex: method call.

[objc]
#import "Kiwi.h"
#import "PSMessages.h"
#import "PSMessageConstants.h"

SPEC_BEGIN(PSMessagesTest)

describe(@"Create an alertview", ^{
    __block PSMessages *messages = nil;
    beforeEach(^{
        messages = [[PSMessages alloc] init];
    });

    NSDictionary *urlMessage = @{ @"type" : @"url",
                                  @"url" : @"http://www.PhotoTableApp.com",
                                  @"message" : @"Create a collage for the next holiday.",
                                  @"title" : @"Design Collages",
                                  @"buttons" : @[@"Yes", @"No"] };

    context(@"with a url message", ^{
        __block UIAlertView *alert = nil;
        beforeEach(^{
            alert = [messages alertForMessage:urlMessage];
        });

        it(@"the cancel button is pressed", ^{
            id delegateMock = [KWMock mockForProtocol:@protocol(UIAlertViewDelegate)];
            alert.delegate = delegateMock;

            int buttonIndex = alert.cancelButtonIndex;

            [[[delegateMock shouldEventually] receive] alertViewShouldEnableFirstOtherButton:alert];
            [[[delegateMock shouldEventually] receive] didPresentAlertView:alert];
            [[[delegateMock shouldEventually] receive] willPresentAlertView:alert];
            [[[delegateMock shouldEventually] receive] alertView:alert willDismissWithButtonIndex:buttonIndex];
            [[[delegateMock shouldEventually] receive] alertView:alert willDismissWithButtonIndex:buttonIndex];

            // Note: delegate methods not called for programmatic UIAlertView dismiss
            [[[delegateMock shouldEventually] receive] alertView:alert didDismissWithButtonIndex:buttonIndex];
            [[[delegateMock shouldEventually] receive] alertView:alert clickedButtonAtIndex:buttonIndex];

            // Invoke methods after the delegateMock object is set up
            [alert show];
            [alert dismissWithClickedButtonIndex:buttonIndex animated:NO];
        });

        it(@"the ok button is pressed", ^{
            id delegateMock = [KWMock mockForProtocol:@protocol(UIAlertViewDelegate)];
            alert.delegate = delegateMock;

            int buttonIndex = alert.firstOtherButtonIndex;

            [[[delegateMock shouldEventually] receive] alertViewShouldEnableFirstOtherButton:alert];
            [[[delegateMock shouldEventually] receive] didPresentAlertView:alert];
            [[[delegateMock shouldEventually] receive] willPresentAlertView:alert];
            [[[delegateMock shouldEventually] receive] alertView:alert willDismissWithButtonIndex:buttonIndex];
            [[[delegateMock shouldEventually] receive] alertView:alert willDismissWithButtonIndex:buttonIndex];

            // Invoke methods after the delegateMock object is set up
            [alert show];
            [alert dismissWithClickedButtonIndex:buttonIndex animated:NO];
        });
    });
});

SPEC_END
[/objc]

Unit Testing Static Libraries with Kiwi for iOS Development

I've been playing with Kiwi and trying some BDD (Behavior Driven Development) for a new static library component I wanted to build. I began with a new Xcode project using the Static Library template, but ran into issues with the difference between "logic tests" and "application tests". In short, all my non-UIKit code worked great, until I started to test my UIKit-related functions.

The code crashed, which makes writing unit tests frustrating. If you've never experienced it before, it'll make your unit test experience unproductive. To solve the issue you'll need to create a new target (an empty iOS application) and include unit tests. Xcode will automagically set up the unit tests as "application tests" instead of "logic tests."

EmptyApp

#0 0x00a40881 in __HALT ()
#1 0x0097a971 in _CFRuntimeCreateInstance ()
#2 0x01337cc1 in GSFontCreateWithName ()
#3 0x05c32281 in UINewFont ()
#4 0x05c323ec in +[UIFont systemFontOfSize:traits:] ()
#5 0x05c32438 in +[UIFont systemFontOfSize:] ()
#6 0x05be24ee in +[UILabel defaultFont] ()
#7 0x05be32e5 in -[UILabel _commonInit] ()
#8 0x05be3424 in -[UILabel initWithFrame:] ()
#9 0x05e7cc67 in -[UIAlertView(Private) _createTitleLabelIfNeeded] ()
#10 0x05e8b4b9 in -[UIAlertView setTitle:] ()
#11 0x05e8bb37 in -[UIAlertView initWithTitle:message:delegate:cancelButtonTitle:otherButtonTitles:] ()
#12 0x02605831 in __block_global_23 at /Users/paulsolt/dev/Photo-Slide-Show/PSMessages/PSMessagesTests/PSMessagesTest.m:165
#13 0x0261b584 in __25-[KWExample visitItNode:]_block_invoke_0 at /Users/paulsolt/dev/Photo-Slide-Show/Frameworks/Kiwi/Kiwi/KWExample.m:220
#14 0x0261a11e in __42-[KWContextNode performExample:withBlock:]_block_invoke_0 at /Users/paulsolt/dev/Photo-Slide-Show/Frameworks/Kiwi/Kiwi/KWContextNode.m:116
#15 0x0261a11e in __42-[KWContextNode performExample:withBlock:]_block_invoke_0 at /Users/paulsolt/dev/Photo-Slide-Show/Frameworks/Kiwi/Kiwi/KWContextNode.m:116
#16 0x0261a11e in __42-[KWContextNode performExample:withBlock:]_block_invoke_0 at /Users/paulsolt/dev/Photo-Slide-Show/Frameworks/Kiwi/Kiwi/KWContextNode.m:116
#17 0x0261a03e in -[KWContextNode performExample:withBlock:] at /Users/paulsolt/dev/Photo-Slide-Show/Frameworks/Kiwi/Kiwi/KWContextNode.m:132
#18 0x0261a05d in -[KWContextNode performExample:withBlock:] at /Users/paulsolt/dev/Photo-Slide-Show/Frameworks/Kiwi/Kiwi/KWContextNode.m:135
#19 0x0261a05d in -[KWContextNode performExample:withBlock:] at /Users/paulsolt/dev/Photo-Slide-Show/Frameworks/Kiwi/Kiwi/KWContextNode.m:135
#20 0x0261b539 in -[KWExample visitItNode:] at /Users/paulsolt/dev/Photo-Slide-Show/Frameworks/Kiwi/Kiwi/KWExample.m:216
#21 0x0261a553 in -[KWItNode acceptExampleNodeVisitor:] at /Users/paulsolt/dev/Photo-Slide-Show/Frameworks/Kiwi/Kiwi/KWItNode.m:41
#22 0x0261ae22 in -[KWExample runWithDelegate:] at /Users/paulsolt/dev/Photo-Slide-Show/Frameworks/Kiwi/Kiwi/KWExample.m:113
#23 0x02618c95 in -[KWSpec invokeTest] at /Users/paulsolt/dev/Photo-Slide-Show/Frameworks/Kiwi/Kiwi/KWSpec.m:105
#24 0x2010405b in -[SenTestCase performTest:] ()
#25 0x201037bf in -[SenTest run] ()
#26 0x2010792b in -[SenTestSuite performTest:] ()
#27 0x201037bf in -[SenTest run] ()
#28 0x2010792b in -[SenTestSuite performTest:] ()
#29 0x201037bf in -[SenTest run] ()
#30 0x201063ec in +[SenTestProbe runTests:] ()
#31 0x0072f5c8 in +[NSObject performSelector:withObject:] ()
#32 0x00002342 in ___lldb_unnamed_function11$$otest ()
#33 0x000025ef in ___lldb_unnamed_function13$$otest ()
#34 0x0000268c in ___lldb_unnamed_function14$$otest ()
#35 0x00002001 in ___lldb_unnamed_function4$$otest ()
#36 0x00001f71 in ___lldb_unnamed_function1$$otest ()

Solution

1. Add a new target with its own unit tests. Creating unit tests with a "static library" template gives you "logic tests", while creating unit tests with an "iPhone application" gives you "application tests." The difference is that you can't use UIKit classes in logic tests, but you can in application tests.

EmptyApp

2. In an application test, the run loop of the iPhone app starts, which means all the UIKit goodies are set up. The bad news is that it loads all the default state from your previous app runs. You might need to write some methods to clean up or reset state (see the sketch after this list). (GHUnit is nice because it's more sandboxed.)

3. If you're using Kiwi you'll have to set up the Kiwi environment (library/header paths) again for the application tests.
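For example, if the app under test persists settings in NSUserDefaults, a small reset in tearDown keeps application tests independent of earlier runs. This is only a minimal sketch; the test class and the idea of wiping the app's defaults domain are assumptions about your setup, not something from the original post.

#import <SenTestingKit/SenTestingKit.h>

@interface PSApplicationStateTests : SenTestCase
@end

@implementation PSApplicationStateTests

- (void)tearDown {
    // Wipe any defaults the host app wrote during previous runs or this test
    NSUserDefaults *defaults = [NSUserDefaults standardUserDefaults];
    [defaults removePersistentDomainForName:[[NSBundle mainBundle] bundleIdentifier]];
    [defaults synchronize];
    [super tearDown];
}

@end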

ApplicationUnitTest

UICollectionView Custom Actions and UIMenuController

The UICollectionView can provide a special UIMenuController with cut, copy, and paste actions. To add custom UICollectionView actions you need to implement a few extra methods and register items with the shared UIMenuController object. The view controller's parent window needs to be the key window, and you'll need to return YES from the UIResponder method canBecomeFirstResponder. In your UICollectionViewController subclass do the following:

[objc]
// ViewController.h
@interface ViewController : UICollectionViewController
@end

// ViewController.m
@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.collectionView.delegate = self;

    UIMenuItem *menuItem = [[UIMenuItem alloc] initWithTitle:@"Custom Action"
                                                      action:@selector(customAction:)];
    [[UIMenuController sharedMenuController] setMenuItems:[NSArray arrayWithObject:menuItem]];
}

#pragma mark - UICollectionViewDelegate methods

- (BOOL)collectionView:(UICollectionView *)collectionView canPerformAction:(SEL)action forItemAtIndexPath:(NSIndexPath *)indexPath withSender:(id)sender {
    return YES; // YES for the cut, copy, and paste actions
}

- (BOOL)collectionView:(UICollectionView *)collectionView shouldShowMenuForItemAtIndexPath:(NSIndexPath *)indexPath {
    return YES;
}

- (void)collectionView:(UICollectionView *)collectionView performAction:(SEL)action forItemAtIndexPath:(NSIndexPath *)indexPath withSender:(id)sender {
    NSLog(@"performAction");
}

#pragma mark - UIMenuController required methods

- (BOOL)canBecomeFirstResponder {
    // NOTE: The menu item will not show if this does not return YES!
    return YES;
}

- (BOOL)canPerformAction:(SEL)action withSender:(id)sender {
    NSLog(@"canPerformAction");
    // The selector(s) should match your UIMenuItem selector
    if (action == @selector(customAction:)) {
        return YES;
    }
    return NO;
}

#pragma mark - Custom Action(s)

- (void)customAction:(id)sender {
    NSLog(@"custom action! %@", sender);
}

@end
[/objc]

Here's what it looks like:

UIMenuController Custom UICollectionView

Linking to a Facebook Page from an iOS App

The Facebook app broke the old way of creating an iOS Facebook page link. If the app isn't installed the old Facebook link works, but when it is installed it just opens the Facebook app to the default page. Fix your app's links to Facebook using this URL format:

Old iOS Facebook Page Link:

http://www.facebook.com/PhotoTableApp

New iOS Facebook Page Link:

https://m.facebook.com/PhotoTableApp?_rdr

The only downside is that the user has to log in via the mobile website in order to "like" your Facebook page. It's certainly better than just seeing the "default" screen in the app. I'm not sure if the new Facebook SDK 3.1 fixes any of these issues, but I haven't seen a solution on Stackoverflow.com.
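For reference, a minimal sketch of opening the page link from inside an app; the action method name is just for illustration.

- (IBAction)openFacebookPage:(id)sender {
    // Use the mobile site URL with the ?_rdr flag so the installed Facebook app
    // doesn't hijack the link and land on its default page
    NSURL *pageURL = [NSURL URLWithString:@"https://m.facebook.com/PhotoTableApp?_rdr"];
    [[UIApplication sharedApplication] openURL:pageURL];
}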

 

iPhone Link to Facebook Page fails with App

Linking to a Facebook page doesn't work from iOS if the new Facebook app is installed.

iPhone Link to Facebook Page Works

Linking to the mobile Facebook page with the ?_rdr flag fixes the issue.

Using the social.framework on iOS 6.0

Using the Social.framework is really simple on iOS 6.0. Apple only refers to its reference documentation, so I decided to show a code snippet example. To switch between Twitter and Sina Weibo, just use the types SLServiceTypeTwitter and SLServiceTypeSinaWeibo. It just takes about 10 lines of code. If you used the Twitter.framework, you can remove it and replace your Twitter code with the following code. The Social.framework will manage all social networks moving forward.

if ([SLComposeViewController isAvailableForServiceType:SLServiceTypeFacebook]) {
    SLComposeViewController *socialSheet = [SLComposeViewController composeViewControllerForServiceType:SLServiceTypeFacebook];
    [socialSheet setInitialText:@"posted from @PhotoTable"];
    [socialSheet addImage:image];
    [socialSheet setCompletionHandler:^(SLComposeViewControllerResult result) {
        NSLog(@"Result: %d", result);
    }];
    [self presentViewController:socialSheet animated:YES completion:^{
    }];
}

iPhone Unit Testing Explained - Part II

Xcode 4 has drastically improved iPhone and Mac unit testing since my previous post, iPhone Unit Testing Explained - Part I. Creating the unit testing target is easy, and you can start writing test code in under 5 minutes.

The biggest hassle in testing is setting up the project correctly, and Xcode 4 makes it simple. If you read Part I, I pushed for GHUnit because of the GUI interface, but now Xcode's built-in testing is enough to get you started. If you need a GUI, add GHUnit later, but start writing your tests today, since they're compatible with GHUnit when you decide to integrate with it.

It's important to start testing from the beginning, or you will never have the motivation to write the tests unless your boss demands it.

Getting Started

To start writing unit tests you have two options: create a new project with unit tests, or add unit tests to an existing project.

New Project with Unit Tests

Create a new project and make sure the checkbox is enabled for unit tests.

 

New Xcode 4 Project with Unit Tests

Add Unit Tests Target to Existing Projects

Add a unit test target to your project by clicking on your Project (top left) -> Add Target (bottom middle) -> iOS -> Other -> Unit Testing Bundle.

 

Add Unit Tests to Existing Xcode Project

(Optional) Share the Target and Testing Scheme

If you add a unit test target, you'll most likely want to share your testing scheme with your team over version control (git, svn, etc.). Otherwise your teammates will have to set it up themselves.

Go to Editor -> Manage Schemes -> Click Shared next to the Unit Test

 

Share Xcode Schemes with Teammates

Adding Resources

When you want to test code or import resources like images or data files, you'll need to tell the testing target about the resources. There are two ways: you can do it when you first add the resource to the project, or you can do it by editing the Build Phases for the unit test target.

 

Adding New Resources to the Unit Test Target

Click on File -> Add Files to "TestProject" -> Click the checkbox for the unit test target and "Copy items"

 

Copy Items and add to Unit Test Target in Xcode 4

Adding Existing Resources to the Unit Test Target

Click on your project "TestProject" -> Build Phases -> Expand one of the three sections (Compile Sources, Link Binary With Libraries, or Copy Bundle Resources)

 

Build Phases for unit test in Xcode 4

Resource Paths are Different!

Code that assumes your bundle is the main bundle will cause problems when testing (especially when adding tests to existing code). Look at the difference in bundles below; the main bundle isn't what you'd expect in a unit test.

NSString *mainBundlePath = [[NSBundle mainBundle] resourcePath];
NSString *directBundlePath = [[NSBundle bundleForClass:[self class]] resourcePath];
NSLog(@"Main Bundle Path: %@", mainBundlePath);
NSLog(@"Direct Path: %@", directBundlePath);

NSString *mainBundleResourcePath = [[NSBundle mainBundle] pathForResource:@"Frame.png" ofType:nil];
NSString *directBundleResourcePath = [[NSBundle bundleForClass:[self class]] pathForResource:@"Frame.png" ofType:nil];
NSLog(@"Main Bundle Path: %@", mainBundleResourcePath);
NSLog(@"Direct Path: %@", directBundleResourcePath);

Output:

Main Bundle Path: /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator5.1.sdk/Developer/usr/bin

Direct Path: /Users/paulsolt/Library/Developer/Xcode/DerivedData/PhotoTable-dqueeqsjkjdthcbkrdzcvwifesvl/Build/Products/Debug-iphonesimulator/Unit Tests.octest

Main Bundle Path: (null)

Direct Path: /Users/paulsolt/Library/Developer/Xcode/DerivedData/PhotoTable-dqueeqsjkjdthcbkrdzcvwifesvl/Build/Products/Debug-iphonesimulator/Unit Tests.octest/Frame.png

Problem: My Unit test has a nil image, data file, etc. Why?

The unit test doesn't use the same bundle for resources that you're accustomed to when running an app. Therefore, the resource we're trying to load cannot be found. You'll need to make changes to the code to support testing external resources (images, data files, etc.). For example, take the following code:

- (UIImage *)resizeFrameForImage:(NSString *)theImageName {
UIImage *image = [UIImage imageNamed:theImageName];
// ... do magical resize and return
return image;
}

Solution 1: Change the function parameters

Functions like these are semi-black boxes that aren't ideal for testing. You want access to all your inputs/outputs, especially if you're working with any kind of file resource. To fix it, just pass in the resource from the unit test, rather than having the function load it from an NSString object.

- (UIImage *)resizeFrameForImage:(UIImage *)theImage {
// ... do magical resize and return
return theImage;
}

Solution 2: Change the resource loading inside the function

If you need to load the resource in the function, you can alternatively change the way it is loaded. You need to stop using UIImage's imageNamed: method and switch to imageWithContentsOfFile:. This way you can pass in the resource with the correct path; however, it'll change logic elsewhere in your app.

- (UIImage *)resizeFrameForImage:(NSString *)theImagePath {
UIImage *image = [UIImage imageWithContentsOfFile:theImagePath];
// ... do magical resize and return
return image;
}

Solution 3: Load resources using the bundle for the current class

- (UIImage *)resizeFrameForImage:(NSString *)theImageName {
// Note: There are several ways you can write it, but make sure you include
//  the extension or you'll have trouble finding the resource
// 1. NSString *imagePath = [[NSBundle bundleForClass:[self class]] pathForResource:@"Image.png" ofType:nil];
// 2. NSString *imagePath = [[NSBundle bundleForClass:[self class]] pathForResource:@"Image" ofType:@"png"];
NSString *imagePath = [[NSBundle bundleForClass:[self class]] pathForResource:theImageName ofType:nil];
UIImage *image = [UIImage imageWithContentsOfFile:imagePath];
// ... do magical resize and return
return image;
}

My First Test

Code completion in Xcode will make writing tests easy. To test different things you'll use the following macros:

  • STAssertNotNil(Object, Description);
  • STAssertEquals(Value1, Value2, Description);
  • STAssertEqualObjects(Object1, Object2, Description);

Example: MyTest.m

#import <SenTestingKit/SenTestingKit.h>
#import "Person.h"

@interface TestImagePrintHelper : SenTestCase {
    Person *person; // the Person class from the examples below
}
@end

@implementation TestImagePrintHelper

- (void)setUp
{
    [super setUp];
    // Set-up code here, e.g. person = [[Person alloc] init];
}

- (void)tearDown
{
    // Tear-down code here.
    [super tearDown];
}

- (void)testName {
    NSString *testFirstName = @"Paul";
    STAssertEqualObjects([person firstName], testFirstName, @"The name does not match");
}

@end

The STAssertEqualObjects macro will invoke the object's isEqual: method, so make sure you write one (see the section below). If you use STAssertEquals, it will test for primitive/pointer equality, not object equality.
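As a quick illustration of the difference (a minimal sketch; the values and test name are made up):

- (void)testEqualityMacros {
    int expectedAge = 25;
    STAssertEquals(expectedAge, 25, @"Primitives and pointers are compared with ==");

    // Build the string at runtime so it isn't the same constant object as @"Paul"
    NSString *name = [NSString stringWithFormat:@"Pa%@", @"ul"];
    STAssertEqualObjects(name, @"Paul", @"Objects are compared with isEqual:");
    // STAssertEquals(name, @"Paul", ...) would compare the pointers and could fail here
}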

Testable Code

Writing testable code requires that you add some additional methods that might have felt optional before you decided to start testing.

1. Create an isEqual method for your class.

Most of the time you'll want to compare whether an object is the correct object. This requires that you write an isEqual method; otherwise you'll be using the NSObject isEqual implementation, which compares the objects' address pointers.

Example: Person.m

- (BOOL)isEqual:(id)other {
    if (other == self) { // self equality, compare address pointers
        return YES;
    }
    if (!other || ![other isKindOfClass:[self class]]) {
        // test not nil and is same type of class
        return NO;
    }
    return [self isEqualToPerson:other]; // call our isEqual method for Person objects
}

- (BOOL)isEqualToPerson:(Person *)other {
    BOOL value = NO;
    if (self == other) { // test for self equality
        value = YES;
    } else if ([[self firstName] isEqualToString:[other firstName]] &&
               [self age] == [other age]) {
        // Add any other tests for instance variables (ivars) that need to be compared
        value = YES;
    }
    return value;
}

2. Create a description method for your class.

This is what will be output on the command line, rather than the object's memory address. It is also called when you decide to print the value in a test.

Example: Person.m

- (NSString *)description {
    return [NSString stringWithFormat:@"Person Name: %@ Age: %d", [self firstName], [self age]];
}

Example: Test using the description method; we'll see the first name and age printed as formatted in our description method.

STAssertEqualObjects([person firstName], testFirstName, @"The name does not match %@", person);

Further reading is available in Apple's Unit Testing Guide. Now you have the basics for unit testing. The next part will provide an example project using resources and its unit tests.

(Part III: Coming Soon)

The Potential of Siri

I've been working on iPhone development for several years and I have been very excited to see how the iPhone SDK has progressed. See my original article on how I envisioned the iPad revolutionizing the computing experience. Siri takes the iPhone platform to an entirely new level.  Feature article by MD

If you're curious as to how the potential of Siri will impact iOS app development, then look no further than the video Apple used to demonstrate the technology, despite its somewhat heavy-on-the-awesome slant. The fact of the matter is, this is voice recognition technology that's approaching the level of sophistication you saw in Star Trek: Voyager as a kid, and it could do wonders for 3rd party app developers.

"Load Angry Birds. New game, please. Aim 45 degrees off the starboard. Lower 3 degrees. Fire. Raise 3 degrees. Fire. Down with the piggies! Fire."

It sounds far-fetched, but is it? Voice control can handle anything from text messaging to calling, and it's not difficult to imagine it controlling things that are not traditionally based around natural voice input. It'd also be a great idea to work in some camera/augmented reality games, as playing "I Spy" with Siri could only make the device even more intuitive, when you're not ringing your husband, friend, or O2 without even touching your handset.

An open API to Siri will help push developers in a new direction building innovative software for the iPhone 4S, given that Siri pushes the boundaries in a way that encourages people to try something new. Google Goggles was a great example of this - it took what QR codes were doing and pushed it farther than anyone had expected it to go - I certainly didn't expect to be able to use my voice to dictate a text message, make two calls and send an email to my boss, did you?

The best thing about it is the potential it awards developers working with voice and hardware interaction; unless you're paranoid about SkyNet and HAL-9000 becoming a reality, it certainly puts a friendlier face on an already friendly looking phone. All that's left is for us to enjoy the adventures many app programmers are embarking on with Siri, in addition to the talented individual who's just ported it to the iPhone 4. Good times ahead.

Idea to App Store - How to make, market, and sell your iPhone App

I've posted my presentation from the Computer Science Community at RIT on making iPhone/iPad apps from February 2011. The video is in two parts, and I discuss the initial sales, analytics, and marketing involved in releasing an iPhone app.

Update: 8/31/11 Idea to App Store Slides (PDF)

Part 1:

httpvh://www.youtube.com/watch?v=2uza2teEBEQ

Part 2:

httpvh://www.youtube.com/watch?v=iyBis6W28Rw

Transparent UITableView with Custom Background UIView and Tap Gestures

In order to create a custom background for a transparent UITableView you'll need to do a few things. I've got the basic code below after a lot of tinkering. I've also shown how to hide the UITableView when you tap in the transparent area below the rows, using a UITapGestureRecognizer. In the images below you can see the custom view in action.

UITableView with custom backgroundView
UITableView with Custom Background

[caption id="attachment_1101" align="aligncenter" width="159" caption="Hidden UITableView Showing Custom Background"]Hidden UITableView with Custom Background[/caption]

Key Points:

  • Don't subclass UITableView; instead use it as an instance variable in your own custom UIViewController subclass.
  • Create a custom UIView subclass to use as the background view. This will be visible when the UITableView is hidden or has a transparent background view.
  • On iPad make sure you clear the UITableView's backgroundView and set it to nil, in addition to setting the background color to [UIColor clearColor].
  • Register a UITapGestureRecognizer with the view controller's view and then set cancelsTouchesInView to NO so that the touches from the gesture propagate to both the UITableView and the custom background view.
  • In the -(void)handleTapGesture: method you'll want taps that don't land on a row to toggle the UI so that the UITableView hides or unhides.

Notes:

  • I show a UINavigationBar, so my UITableView frame needs to take into account the size of the navigation bar.
  • Set the backgroundColor of the UILabels or custom views in the table's cells to [UIColor clearColor] so that they animate and fade correctly.

See the sample code below:
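Since the original embedded sample isn't reproduced here, the following is a condensed sketch of the key points above; the class name, row count, and colors are placeholders, not the original sample code.

#import <UIKit/UIKit.h>

@interface PSTransparentTableViewController : UIViewController <UITableViewDataSource, UITableViewDelegate>
@property (nonatomic, strong) UITableView *tableView;
@end

@implementation PSTransparentTableViewController

- (void)viewDidLoad {
    [super viewDidLoad];

    // Custom background view sits behind the transparent table view
    UIView *backgroundView = [[UIView alloc] initWithFrame:self.view.bounds];
    backgroundView.backgroundColor = [UIColor darkGrayColor];
    [self.view addSubview:backgroundView];

    self.tableView = [[UITableView alloc] initWithFrame:self.view.bounds style:UITableViewStylePlain];
    self.tableView.dataSource = self;
    self.tableView.delegate = self;
    self.tableView.backgroundColor = [UIColor clearColor];
    self.tableView.backgroundView = nil; // required on iPad for true transparency
    [self.view addSubview:self.tableView];

    // Let taps reach both the table view and the background view
    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self
                                                                                 action:@selector(handleTapGesture:)];
    tapGesture.cancelsTouchesInView = NO;
    [self.view addGestureRecognizer:tapGesture];
}

- (void)handleTapGesture:(UITapGestureRecognizer *)gesture {
    if (self.tableView.hidden) {
        // Any tap brings the table back when it's hidden
        self.tableView.hidden = NO;
        return;
    }
    CGPoint point = [gesture locationInView:self.tableView];
    if (![self.tableView indexPathForRowAtPoint:point]) {
        // The tap landed in the transparent area below the rows: hide the table
        self.tableView.hidden = YES;
    }
}

#pragma mark - UITableViewDataSource

- (NSInteger)tableView:(UITableView *)tableView numberOfRowsInSection:(NSInteger)section {
    return 3;
}

- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@"Cell"];
    if (!cell) {
        cell = [[UITableViewCell alloc] initWithStyle:UITableViewCellStyleDefault reuseIdentifier:@"Cell"];
    }
    // Clear cell backgrounds so rows fade in and out correctly over the background
    cell.backgroundColor = [UIColor clearColor];
    cell.textLabel.backgroundColor = [UIColor clearColor];
    cell.textLabel.text = [NSString stringWithFormat:@"Row %ld", (long)indexPath.row];
    return cell;
}

@end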

Artwork Evolution on App Store

Artwork Evolution, my first iOS app, is now available on the App Store for iPhone, iPod touch, and iPad. It allows you to create complex abstract art with the touch of a finger. You can breed images together to create new images.

[caption id="attachment_1038" align="aligncenter" width="396" caption="Artwork Evolution on iPhone 4"][/caption]

Tutorial Video

httpvh://www.youtube.com/watch?v=_VZnFsnO4YY

iOS: Converting UIImage to RGBA8 Bitmaps and Back

Edited 8/24/11: Fixed a bug with alpha transparency not being preserved. Thanks for the tip Scott! Updated the gist and github project to test transparent images.

Edited 12/13/10: Updated the code on github/gist to fix static analyzer warnings. Changed a function name to conform to the Apple standard.

When I started working with iPhone I was working with Objective-C and C++. I created a library in C++ and needed access to a bitmap array so that I could perform image processing. In order to do so I had to create some helper functions to convert between UIImage objects and the RGBA8 bitmap arrays.

Here are the updated routines that should work on iPhone 4.1 and iPad 3.2. The iPhone 4 has a high resolution screen that requires setting a scaling factor for high resolution images. I've added support to set the scaling factor based on the device's mainScreen scale factor.

UPDATE: 9/23/10 My code to work with the Retina display was incorrect; it ran fine on iPad with 3.2, but it didn't do anything "high-res" on iPhone 4. I was using the following:

__IPHONE_OS_VERSION_MAX_ALLOWED >= 30200

but it isn't safe: when I run it in a universal app on 4.1/3.2 it will always return 40100, so the expression doesn't make sense. (Side note: I took this check from Apple's website when iPad 3.2 was actually ahead of iPhone 3.1.X, but that doesn't help with iPhone 4.1 being ahead of iPad 3.2.)

The issue with iPad is that the imageWithCGImage:scale:orientation: selector doesn't exist on iOS 3.2, most likely it will on iOS 4.2, so the following code should be safe. Some methods in iOS 4.1 don't exist in iOS 3.2, so you need to check to see if a newer method exists before trying to execute it. There are two methods you can use depending on the class/instance (+/-) modifier on the function definition.

- (BOOL)respondsToSelector:(SEL)aSelector           // send to the class object to check a class method (+)
+ (BOOL)instancesRespondToSelector:(SEL)aSelector   // send to the class to check an instance method (-)

imageWithCGImage:scale:orientation: is a class method, so we need to use respondsToSelector: on the UIImage class. The correct code to scale the CGImage is below:

if([UIImage respondsToSelector:@selector(imageWithCGImage:scale:orientation:)]) {
	float scale = [[UIScreen mainScreen] scale];
	image = [UIImage imageWithCGImage:imageRef scale:scale orientation:UIImageOrientationUp];
} else {
	image = [UIImage imageWithCGImage:imageRef];
}


It might help to show some images to explain what happens if you don't use imageWithCGImage:scale:orientation: on the iPhone 4 with the correct scale factor. The scale should be 2.0 on Retina displays (iPhone 4 or the new iPod touch) and 1.0 on the 3G, 3GS, and iPad. float scale = [[UIScreen mainScreen] scale]; will provide the correct scale factor for the device. The first image has jaggies in it, while the second does not. The third image, an iPhone 3G/3GS, also does not have jaggies.

[caption id="attachment_697" align="aligncenter" width="451" caption="iPhone 4 with default scale of 1.0 causes the image to be enlarged and with jaggies."][/caption]

[caption id="attachment_698" align="aligncenter" width="451" caption="iPhone 4 with scaling of 2.0 makes the image half the size and removes the jaggies"][/caption]

[caption id="attachment_692" align="aligncenter" width="414" caption="iPhone 3G/3GS with scaling set to 1.0"][/caption]

I hope it helps other people with image processing on the iPhone/iPad. It's based on some previous tutorials using OpenGL, which I fixed (memory leaks) and modified to work with unsigned char arrays (bitmap).


Grab the two files here or the sample Universal iOS App project:

Example Usage:

NSString *path = (NSString *)[[NSBundle mainBundle] pathForResource:@"Icon4" ofType:@"png"];
UIImage *image = [UIImage imageWithContentsOfFile:path];
int width = image.size.width;
int height = image.size.height;

// Create a bitmap
unsigned char *bitmap = [ImageHelper convertUIImageToBitmapRGBA8:image];

// Create a UIImage using the bitmap
UIImage *imageCopy = [ImageHelper convertBitmapRGBA8ToUIImage:bitmap withWidth:width withHeight:height];

// Display the image copy on the GUI
UIImageView *imageView = [[UIImageView alloc] initWithImage:imageCopy];

// Cleanup
free(bitmap);

Below is the full source code for converting between bitmap and UIImage:

ImageHelper.h

ImageHelper.m
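For a rough idea of what those two files contain, here is a condensed sketch of the conversion routines; the ImageHelper files above are the authoritative version, and this sketch trims error handling for brevity.

#import <UIKit/UIKit.h>
#import <CoreGraphics/CoreGraphics.h>
#include <stdlib.h>

@interface ImageHelper : NSObject
+ (unsigned char *)convertUIImageToBitmapRGBA8:(UIImage *)image;
+ (UIImage *)convertBitmapRGBA8ToUIImage:(unsigned char *)buffer withWidth:(int)width withHeight:(int)height;
@end

@implementation ImageHelper

+ (unsigned char *)convertUIImageToBitmapRGBA8:(UIImage *)image {
    CGImageRef imageRef = image.CGImage;
    size_t width = CGImageGetWidth(imageRef);
    size_t height = CGImageGetHeight(imageRef);
    unsigned char *bitmap = (unsigned char *)malloc(width * height * 4);

    // Draw the image into an RGBA8 bitmap context backed by our buffer
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(bitmap, width, height, 8, width * 4, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    return bitmap; // caller must free()
}

+ (UIImage *)convertBitmapRGBA8ToUIImage:(unsigned char *)buffer withWidth:(int)width withHeight:(int)height {
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(buffer, width, height, 8, width * 4, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGImageRef imageRef = CGBitmapContextCreateImage(context);

    // Use the Retina-aware factory method when it exists (iOS 4+), as discussed above
    UIImage *image = nil;
    if ([UIImage respondsToSelector:@selector(imageWithCGImage:scale:orientation:)]) {
        float scale = [[UIScreen mainScreen] scale];
        image = [UIImage imageWithCGImage:imageRef scale:scale orientation:UIImageOrientationUp];
    } else {
        image = [UIImage imageWithCGImage:imageRef];
    }

    CGImageRelease(imageRef);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);
    return image;
}

@end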


C++ Logging and building Boost for iPhone/iPad 3.2 and MacOSX

In my effort to write more robust and maintainable code I have been searching for a cross-platform C++ logging utility. I'm working on a C++ static library for iPhone/iPad 3.2/Mac/Windows and I needed a way to log what was happening in my library. Along the way I was forced to build Boost for iPhone, iPhone Simulator, and the Mac.

Why logging?

Mobile devices lack a console when detached from a development machine, so it's hard to track down issues. I needed a system that could log at multiple levels (Debug1, Debug2, Info, Error, Warning) and be thread safe. Multiple logger levels allow a developer to turn up or down the detail of information that is stored, which in turn affects performance with I/O writes. A developer with logging information can better track down crashes and other issues during an application's lifetime.

Why Boost Logger Library v2?

I struggled trying to get a logger working. After many failed attempts with Pantheios, log4cxx, log4cpp, and glog, I settled on the Boost Logger Library v2 because I was able to "compile" for iPhone/iPad 3.2 and Mac OSX. Most of the loggers required other dependencies that would need to be rebuilt for iPhone and didn't directly support iPhone.

The Boost Logger is all header files so it doesn't require "compiling," which made it much easier to get working. However, it does require a few Boost libraries that need to be compiled. The Boost Logging library needs the following libraries: filesystem, system, and thread, depending on what functionality is used.

Step 1: Building Boost for iPhone/iPad and iPhone Simulator 3.2

A few Boost libraries need compiling for the iPhone/iPad and the iPhone Simulator in order to link against the Boost Logger. Matt Galloway provided a demo on how to compile Boost 1.41/1.42 for iPhone/iPhone Simulator. Here are the steps I used for Boost 1.42 based on his tutorial.


  1. Get Boost 1.42
  2. Extract Boost:

    tar xzf boost_1_42_0.tar.gz

  3. Create a user-config.jam file in your user directory (~/user-config.jam), such as /Users/paulsolt/user-config.jam, with the following. (Note: this config file needs to be renamed or moved during the Mac OS X bjam build.)

    ~/user-config.jam

    using darwin : 4.2.1~iphone
       : /Developer/Platforms/iPhoneOS.platform/Developer/usr/bin/gcc-4.2 -arch armv7 -mthumb -fvisibility=hidden -fvisibility-inlines-hidden
       : 
       : arm iphone iphone-3.2
       ;
    
    using darwin : 4.2.1~iphonesim
       : /Developer/Platforms/iPhoneSimulator.platform/Developer/usr/bin/gcc-4.2 -arch i386 -fvisibility=hidden -fvisibility-inlines-hidden
       : 
       : x86 iphone iphonesim-3.2
       ;

  4. Make sure the file boost_1_42_0/tools/build/v2/tools/darwin.jam has the following information:

    tools/build/v2/tools/darwin.jam

    ## The MacOSX versions we can target.
    .macosx-versions =
        10.6 10.5 10.4 10.3 10.2 10.1
        iphone-3.2 iphonesim-3.2
        iphone-3.1.3 iphonesim-3.1.3
        iphone-3.1.2 iphonesim-3.1.2
        iphone-3.1 iphonesim-3.1
        iphone-3.0 iphonesim-3.0
        iphone-2.2.1 iphonesim-2.2.1
        iphone-2.2 iphonesim-2.2
        iphone-2.1 iphonesim-2.1
        iphone-2.0 iphonesim-2.0
        iphone-1.x
        ;

  5. Change directories to the Boost directory that you downloaded:

    cd /path/to/boost_1_42_0

  6. Run the following commands to compile the iPhone and iPhone Simulator Boost libraries. I only need filesystem, system, and thread to use Boost logging on the iPhone, so I don't build everything. Run ./bootstrap.sh --help or ./bjam --help for more options. I built the binaries to a location in my development folder to include in my project dependencies.

    ./bootstrap.sh --with-libraries=filesystem,system,thread

    ./bjam --prefix=${HOME}/dev/boost/iphone toolset=darwin architecture=arm target-os=iphone macosx-version=iphone-3.2 define=_LITTLE_ENDIAN link=static install
    ./bjam --prefix=${HOME}/dev/boost/iphoneSimulator toolset=darwin architecture=x86 target-os=iphone macosx-version=iphonesim-3.2 link=static install

  7. Update: Create a universal Boost library using the lipo tool. In this example I'm assuming the binaries that were created have the following names. The names from the bjam generation will be different, based on your own configuration. End Update

    lipo -create libboost_filesystem_iphone.a libboost_filesystem_iphonesimulator.a -output libboost_filesystem_iphone_universal.a
    
    lipo -create libboost_system_iphone.a libboost_system_iphonesimulator.a -output libboost_system_iphone_universal.a
    
    lipo -create libboost_thread_iphone.a libboost_thread_iphonesimulator.a -output libboost_thread_iphone_universal.a

  8. I'm working on a cross-platform project and my directory structure looks like the following. I copied the include and lib files for iPhone and iPhone Simulator into the appropriate directories. The dependency structure allows me to check out the project on another machine and have relative references to Boost and other dependencies.

       |-ArtworkEvolution
       |---Xcode
       |-----BoostLoggingTest
       |---dependencies
       |-----iphone
       |-------debug
       |-------release
       |---------include
       |-----------boost
       |---------lib
       |-----iphone-simulator
       |-------debug
       |-------release
       |---------include
       |-----------boost
       |---------lib
       |-----macosx
       |-------debug
       |-------release
       |---------include
       |-----------boost
       |-----------libs
       |-----win32
       |---docs
       |---source
       |---tests

  9. Download the Boost Logging Library v2 and unzip it.
  10. Copy and paste the logging folder into each include/boost folder for the iPhone and iPhone Simulator dependency folders, as in my directory structure. After you unzip, the header files are located in the folder logging/boost/logging.

Step 2:  Creating the Xcode Project

With the iPhone and iPhone Simulator Boost libraries in hand we're ready to make an Xcode project. Due to the difference between the iPhone and iPhone Simulator libraries we'll need to make two targets. One will link against the iPhone Boost libraries (arm) and the other against the iPhone Simulator Boost libraries (x86).

Update: You don't need to create two targets, as we can use the lipo tool to make a universal iPhone/iPhone Simulator library file. The universal library file can be shared between iPhone and iPhone Simulator build configurations. See the instructions for using lipo to create the universal library files in the previous section. However, I will keep the two target instructions up as an alternate approach for Xcode project development, if you choose not to use the lipo tool.

End Update


1. Create a new iPhone project (view based)

2. There will be two targets, "BoostLoggingTest Device" and "BoostLoggingTest Simulator"; each will reference different headers and libraries. Duplicate the starting target and rename each target respectively.

[caption id="attachment_566" align="aligncenter" width="492" caption="Duplicate target to make iPhone/iPhoneSimulator targets"][/caption]

3. Add the libraries that we compiled into two groups: device and simulator under Resources. Right-click on the group "Simulator" or "Device" and select "Add Existing Files". Search for the library .a files that you copied into the iphone and iphone-simulator directories. These resources should be added relative to the project folder.

4. Drag the appropriate libraries to each Target. We need two targets since the architecture is different on the iPhone device (arm) versus the iPhone Simulator (Intel x86).

[caption id="attachment_569" align="aligncenter" width="476" caption="Drag the device libraries to the device target."][/caption]

[caption id="attachment_570" align="aligncenter" width="476" caption="Drag simulator dependencies to the iPhone simulator target"][/caption]

5. Add the "Header Search Path" for each target. For me the relative path will be two directories up from the Xcode project folders:  ../../dependencies/iphone/release/include and ../../dependencies/iphone-simulator/release/include. Right-click on each Target in the left pane and click on "Get Info" -> Build -> Type "Header" in the search field -> Edit the list of paths.

[caption id="attachment_571" align="aligncenter" width="512" caption="Add the Device Target Header Search path for the boost libraries"][/caption]

[caption id="attachment_572" align="aligncenter" width="518" caption="Add the simulator targets Header Search Paths"][/caption]

6. Change the base SDK of each target. For the Device you need to use iPhone Device 3.2 and the Simulator Target needs iPhone Simulator 3.2 or later.

[caption id="attachment_573" align="aligncenter" width="431" caption="Set the Device Target to iPhone Device 3.2"][/caption]

[caption id="attachment_574" align="aligncenter" width="431" caption="Set the Simulator Target to iPhone Simulator 3.2"][/caption]

7. Now you have two different targets. One is for the iPhone Device and the other is for the iPhone Simulator. We did this because we built separate binaries for Boost on the iPhone (arm) and simulator (x86) platforms.

8. Set the project's Active SDK to use the Base SDK (top left of Xcode). Now it will automatically choose the iPhone Device or iPhone Simulator based on the Base SDK of each Target you select.

9. Logging on the iPhone requires that we use the full path to the file within the application sandbox. Use the following Objective-C code to get it:


NSString *docsDirectory = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *path = [docsDirectory stringByAppendingPathComponent:@"err.txt"];
const char *outputFilename = [path UTF8String];

10. I modified one of the Boost Logging samples to use the full file path on the iPhone. Rename main.m to main.mm to use Objective-C++ and copy and paste the following: main.mm code

11. If everything compiled and ran on the device, you can get the application data from the Xcode Organizer (Option+Command+O). Navigate to Devices and then look in Applications for the test application. Just drag the "Application Data" to your desktop to download it from the device. Your logs should appear in the Documents folder.

Part 3: Build Boost for Mac OS X 10.6 - 4 way fat (32/64 PPC and 32/64 Intel)

1. Build Boost for Mac OS X. Note: if you set up the user-config.jam file for the iPhone Boost build, rename or move the file to a different folder than your home directory; otherwise ignore this command.

mv ~/user-config.jam ~/user-config.jam.INACTIVE
cd /path/to/boost_1_42_0
./bootstrap.sh --with-libraries=filesystem,system,thread
./bjam --prefix=${HOME}/dev/boost/macosx toolset=darwin architecture=combined address-model=32_64 link=static install

2. Copy the output into your dependency structure and add the Boost Logging Library headers into the include/boost folder. (Same procedure as with iPhone)

3. Set up an Xcode project or target with the appropriate header search paths and the Boost Mac OS X libraries, the same way we set up the iPhone Xcode project.

Note: If you get warnings about hidden symbols and default settings, open the Xcode project and make sure that "Inline Methods Hidden" and "Symbols Hidden by Default" are unchecked. Toggling them on and off might fix any Xcode warnings.

References:


iPad Revolution

A lot of people have been talking about the iPad. Here are my opinions on the future of iPad, computing, and entertainment.

The iPad is set to revolutionize how we interact with multimedia content and computers. There are a number of reasons that make the hardware and software stand out. First and foremost, it is affordable cutting-edge technology. The $499 price point means that it is not out of reach for average consumers who are interested in an updated “all-in-one” computing device. All 9.7 inches of the screen are multi-touch, which will allow software developers to create very interactive applications. Star Trek, Avatar, and other science fiction movie computer interfaces can finally be realized on a large multi-touch screen. The device is connected, which allows the consumer to use it anywhere. Lastly, the device will provide the ultimate responsive user experience.

Home Entertainment Revolution

Apple now has the ability to revolutionize the home entertainment market. They have provided a multimedia portal, which will change the way we use our TVs, computers, and music players. Imagine controlling a TV from the couch without attaching any wires. A user might want to watch “Batman Begins” on their 42” HDTV. A few touches will open iTunes and start the movie. I mentioned the TV; how does that fit into the picture? The movie streams wirelessly in HD from the couch to the 42” TV. Gone are all the cables, remotes, and hassles. Don’t bother with the power cable, since the device will play content for 10 hours straight. The entertainment cabinet can be cleaned out. Throw out the VCR, CD player, DVD player, Blu-ray player, cable TV, satellite TV, and digital antennas because they are not needed. Apple will be the one-stop remote control into all media content and it will be seamless to use and control.

Affordable Technology

A few years back, in 2007, Amazon set out to take over the digital books arena. They did a pretty good job at providing access to books, but that is about all they did. The Kindle DX costs $489 and is just a digital book reader. It has limited processing power and storage space. The main attraction is the e-ink technology that is supposed to be easier to read. Overall, the device is nice, but it is very limited in its target audience and lacks the multimedia capabilities of an iPhone. Apple worked hard to set the price point of the iPad as close as possible to the Kindle’s, because they are directly competing with the Kindle’s e-book market. For $10 more one can get a full-color display that can play videos, music, and games, display e-books, and run applications. Apple did a wonderful job in selecting a set of features that could be combined for a relatively low price point. The device is slightly more expensive than other e-book readers and netbooks, but not overly expensive.


Large Multi-Touch Screen

For the longest time computers were something that required skill to use. However, this steep learning curve is almost no longer the case with the iPhone, iPod Touch, and iPad. The iPhone revolution brought capacitive multi-touch screens to the public. In plain English this means that a user just touches, not “presses,” the screen to perform actions. The iPad is riding in that revolution’s wake and taking it a step further by increasing the size of the screen. This technology is not foreign; it is mainstream and it is here to stay because it works. If a user knows how to use an iPhone or a laptop track pad then the transition is smooth. The touch screen is key, because it allows people to interact with a device just like they might interact with a microwave or a washing machine. A user physically touches, taps, and slides controls around in ways that directly mirror the physical world. The iPad is a natural user interface and it is what most people want, but do not know how to ask for.

Software is Key

The main attraction with any working piece of hardware is software. People want to use a piece of hardware that is customizable. At any given point the device can assume different roles, because it was built to be extensible. In one instant it is an email program, movie player, music player, and then an entire college library. Apple has created a platform that provides many inputs and outputs that software developers can hook into to provide new and novel user experiences. The software development kit (SDK) has given developers direct access to technology that was locked down or too expensive to use. Developers can use a digital compass, accelerometer, multi-touch screen, microphone, and motion sensors to interact with a user in astonishing ways.

Connected

The iPhone provided the all-in-one experience because it can double as a music player, movie player, email program, Internet browser, and e-book reader. It was small, but it had the ability to execute each of those tasks. It has those abilities because of the Wi-Fi and 3G data connections. These connections make it possible to see content beyond the walls of a single hard drive. They provide a much richer experience to the user. The iPad takes these same tasks and makes them better by providing a bigger experience. Users can use these connections in a larger form factor and can be more productive. For most users a simple Wi-Fi connection will be all they need from the couch in the living room. Some users might be active and on the go, so they will need a 3G wireless connection. Apple has recognized these different connection needs and separated the two technologies to reach different consumers. Users can get Wi-Fi by itself, or combine Wi-Fi and 3G if they need to always have a connection to the internet.


User Experience

Users want fast responsive devices, not sluggish devices. A lot of users complain that they cannot run multiple applications (multi-tasking) on the iPhone, but what they do not realize is what they have to give up for multiple applications. Running anything in parallel on a mobile device means that it is dividing computing resources and power among applications that are invisible in the background. These resource hogs will slow a device down and drain a battery.

Traditional multi-tasking is not what users want. Apple supports multi-tasking, but only to first party applications. In restricting access, Apple has complete control of the user experience. Third-party multi-tasking is not supported for a few reasons.

  • The Windows Task Manager is a power-user feature that is unnecessarily complicated. On a Windows Mobile 6.x device, the task manager is a terrible experience. For example, pressing the ‘X’ on an application is not guaranteed to close the application. The button may only minimize the application, in which case it is still using computing resources and draining the battery. The ability to manage open applications is a power-user feature on a mobile device and should be hidden from a typical user.
  • What is the difference between running an application in the background and running applications one at a time if the transition from one application to the next is fast and seamless? Does the experience have to differ solely because of a technicality? iPhone applications can save state from the last thing they were doing when they are closed. For example, suppose a user is composing an email about a trip on an iPad. They need weather information and decide to check the weather with the following steps.
    1. Press the home button.
    2. Touch the weather application.
    3. Press the home button.
    4. Touch the email application.
    5. Resume composing the email with the updated weather knowledge.
  • Running applications in the background would allow companies to directly compete with Apple’s multimedia business. iTunes is one of the few applications, like Mail and Messages, that run in the background. A user can play music through iTunes while using different applications. If a user could use Pandora for music in the background, then they would have less reason to stay on the iTunes platform. I do not see Apple changing this policy, since it is not in their best interest.

Conclusion

The iPad will simplify the experience of downloading new movies, games, music, books, and utility applications. There is no doubt in my mind that Apple will continue to innovate on this new iPad platform to further simplify and connect the multimedia experience in every person’s home. The iPad is magic and just works.
