Testing a Camera App in iOS Simulator

As of today, the iOS Simulator doesn’t use your dev machine’s camera to simulate the device’s camera – all you get is a black screen. For a long time I walked the slow route of testing my first iOS app, Progress, on actual devices instead of in the simulator before I realized there’s a simple workaround.

I also wanted to show how to start the camera asynchronously in case you weren’t already doing that, and how to use AVFoundation for the camera (I was scared of it myself at first, but trust me when I say I regret having put it off).

Let’s say we have a CameraViewController – how you get to it is up to you. I wrote my own container view controller that pushes/pops the camera VC and acts as its delegate. You may use a navigation controller, or display it modally, etc.

CameraViewController.h:

#import <UIKit/UIKit.h>

@protocol CameraDelegate <NSObject>

- (void)cameraStartedRunning;
- (void)didTakePhoto:(UIImage *)photo;

@end

@interface CameraViewController : UIViewController

@property (nonatomic, weak) id<CameraDelegate> delegate;

- (void)startCamera;
- (void)stopCamera;
- (void)takePhoto;
- (BOOL)isCameraRunning;

@end

Your container or calling VC will implement the CameraDelegate protocol, spin up an instance of CameraViewController, register itself as the delegate on it, and call startCamera. Once the camera is ready to be shown, cameraStartedRunning will be called on the delegate controller on the main thread. 
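For illustration, a minimal calling side might look like this (MyHostViewController and showCamera are hypothetical names – substitute your own container or navigation flow):

#import "CameraViewController.h"

@interface MyHostViewController : UIViewController <CameraDelegate>
@property (nonatomic) CameraViewController *cameraVC;
@end

@implementation MyHostViewController

- (void)showCamera
{
	self.cameraVC = [[CameraViewController alloc] init];
	self.cameraVC.delegate = self;
	[self.cameraVC startCamera];
	// Present/push/add-as-child the camera VC however fits your app.
}

#pragma mark - CameraDelegate

- (void)cameraStartedRunning
{
	// Called on the main thread once the preview is live – reveal the camera UI here.
}

- (void)didTakePhoto:(UIImage *)photo
{
	// Hand the photo off to the next screen (e.g. an edit VC).
}

@end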

Starting the Camera

CameraViewController.m, part 1:

#import "CameraViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface CameraViewController ()

@property (nonatomic) AVCaptureSession *captureSession;
@property (nonatomic) UIView *cameraPreviewFeedView;
@property (nonatomic) AVCaptureStillImageOutput *stillImageOutput;
@property (nonatomic) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer;

@property (nonatomic) UILabel *noCameraInSimulatorMessage;

@end

@implementation CameraViewController {
	BOOL _simulatorIsCameraRunning;
}

- (void)viewDidLoad
{
    [super viewDidLoad];

	self.noCameraInSimulatorMessage.hidden = !TARGET_IPHONE_SIMULATOR;
}

- (UILabel *)noCameraInSimulatorMessage
{
	if (!_noCameraInSimulatorMessage) {
		CGFloat labelWidth = self.view.bounds.size.width * 0.75f;
		CGFloat labelHeight = 60;
		// Center horizontally within our own bounds; self.view.center is in the
		// superview's coordinate space, so it's the wrong value to use here.
		_noCameraInSimulatorMessage = [[UILabel alloc] initWithFrame:CGRectMake(CGRectGetMidX(self.view.bounds) - labelWidth / 2.0f, self.view.bounds.size.height - 75 - labelHeight, labelWidth, labelHeight)];
		_noCameraInSimulatorMessage.numberOfLines = 0; // wrap
		_noCameraInSimulatorMessage.text = @"Sorry, no camera in the simulator... Crying allowed.";
		_noCameraInSimulatorMessage.backgroundColor = [UIColor clearColor];
		_noCameraInSimulatorMessage.hidden = YES;
		_noCameraInSimulatorMessage.textColor = [UIColor whiteColor];
		_noCameraInSimulatorMessage.shadowOffset = CGSizeMake(1, 1);
		_noCameraInSimulatorMessage.textAlignment = NSTextAlignmentCenter;
		[self.view addSubview:_noCameraInSimulatorMessage];
	}

	return _noCameraInSimulatorMessage;
}

- (void)startCamera
{
	if (TARGET_IPHONE_SIMULATOR) {
		_simulatorIsCameraRunning = YES;
		[NSThread executeOnMainThread: ^{
			[self.delegate cameraStartedRunning];
		}];
		return;
	}

	if (!self.cameraPreviewFeedView) {
		// Container view for the preview layer; spans our bounds and is created once.
		self.cameraPreviewFeedView = [[UIView alloc] initWithFrame:self.view.bounds];
		self.cameraPreviewFeedView.backgroundColor = [UIColor clearColor];
		[self.view addSubview:self.cameraPreviewFeedView];
	}

	if (![self isCameraRunning]) {
		dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
			AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

			if (!self.captureSession) {

				self.captureSession = [AVCaptureSession new];
				self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto;

				NSError *error = nil;
				AVCaptureDeviceInput *newVideoInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
				if (!newVideoInput) {
					// Handle the error appropriately.
					NSLog(@"ERROR: trying to open camera: %@", error);
				}

				AVCaptureStillImageOutput *newStillImageOutput = [AVCaptureStillImageOutput new];
				NSDictionary *outputSettings = @{ AVVideoCodecKey : AVVideoCodecJPEG };
				[newStillImageOutput setOutputSettings:outputSettings];

				if ([self.captureSession canAddInput:newVideoInput]) {
					[self.captureSession addInput:newVideoInput];
				}

				if ([self.captureSession canAddOutput:newStillImageOutput]) {
					[self.captureSession addOutput:newStillImageOutput];
					self.stillImageOutput = newStillImageOutput;
				}

				[[NSNotificationCenter defaultCenter] addObserver:self
														  selector:@selector(onVideoError:)
															  name:AVCaptureSessionRuntimeErrorNotification
															object:self.captureSession];

				if (!self.captureVideoPreviewLayer) {
					[NSThread executeOnMainThread: ^{
						self.captureVideoPreviewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
						self.captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
						self.captureVideoPreviewLayer.frame = self.cameraPreviewFeedView.bounds;
						[self.cameraPreviewFeedView.layer insertSublayer:self.captureVideoPreviewLayer atIndex:0];
					}];
				}
			}

			// -startRunning blocks until the camera is up, which is exactly why
			// we're doing all of this on a background queue
			[self.captureSession startRunning];

			[NSThread executeOnMainThread: ^{
				[self.delegate cameraStartedRunning];
			}];
		});
	} else {
		[NSThread executeOnMainThread: ^{
			[self.delegate cameraStartedRunning];
		}];
	}
}

...

At the top, we have some properties:

AVCaptureSession is the main object that coordinates AVFoundation-based video/photo capture. Whether you’re capturing video or still photos, you use it.

AVCaptureStillImageOutput is an object you use to take a still photo of AVCaptureSession’s video feed.

AVCaptureVideoPreviewLayer is a subclass of CALayer that displays what the camera sees. Because it’s a CALayer, you can insert it into any UIView’s layer, including the controller’s own view. However, to separate concerns and for easier management, I don’t recommend using the VC’s view as the parent for that layer. Here, you can see that I use a separate cameraPreviewFeedView as the container for the AVCaptureVideoPreviewLayer.

There’s also a private _simulatorIsCameraRunning boolean. It’s here because, when we’re running the app in the simulator, we can’t use AVCaptureSession, and hence we can’t ask it whether the camera is running. So we need to track whether the camera is on or off manually.

The noCameraInSimulatorMessage property is self-explanatory and entirely optional – it just lazy-instantiates a label and adds it to the main view. viewDidLoad simply shows or hides this label as appropriate. We check TARGET_IPHONE_SIMULATOR to know when we’re in the simulator; Apple provides that macro in TargetConditionals.h, a system header that the SDK’s framework headers pull in for you.
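Since the macro is a plain compile-time constant (1 for simulator builds, 0 for device builds), you can branch on it at runtime like the code above does, or strip simulator-only code out of device builds entirely with the preprocessor. A tiny sketch (runningInSimulator is a hypothetical helper):

#import <TargetConditionals.h>

static BOOL runningInSimulator(void)
{
#if TARGET_IPHONE_SIMULATOR
	// This branch is only compiled into simulator builds.
	return YES;
#else
	return NO;
#endif
}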

With the variables in place, the logic is simple. When startCamera is called, we check if we’re running in a simulator (TARGET_IPHONE_SIMULATOR will be YES) and, if so, set _simulatorIsCameraRunning to YES. We then tell the delegate that the camera started and exit out of the method. If we’re not in the simulator, however, then we start up an AVFoundation-based camera asynchronously. Note that if a call to [self isCameraRunning] returns YES (code for that method is in part 2 below), we simply tell the delegate that we’re already running without doing anything else.

Sidenote: if you’re way smarter than me, you may be saying, “Wait, NSThread doesn’t define an executeOnMainThread selector!” I award you geek points and explain that it’s just a category method I added to ensure a block executes on the main thread – you’ll see the code for it at the end of the post (tip of the hat to Marco Arment for that one).

Why So Asynchronous?

Since the camera can take a second or two to start up and perception is reality, this is your opportunity to trick the user by making the app feel faster when it’s really not any faster at all. For example, I mentioned having a custom container controller that implements the CameraDelegate protocol. When the user wants to start the camera, that container VC adds the CameraViewController as a child controller but makes its view transparent (alpha = 0). To the user, this is unnoticeable because they keep seeing the previous VC’s view. The container VC then calls startCamera on the newly-added camera VC and immediately starts performing the “I’m switching to camera” animation on the currently shown VC. Our animation is a bit elaborate, but you can do whatever fits your app. Even if you just fade out the current VC’s view over 0.7 seconds or so while the camera is loading, the app will feel faster because something is happening.

Then, whenever the container VC receives the cameraStartedRunning message from the camera VC, it quickly fades in the camera VC’s view that it kept transparent until now, and it also fades in the appropriate overlay controls (such as the shutter button) if they are separate from the camera VC’s view (in our app, they are). By “quickly fades in” I mean 0.25 or 0.3 seconds – but tune it as you see fit.

It’s pretty crazy how much faster even a simple fade-out/fade-in feels compared to just going to a black screen and waiting for the camera to spin up, even though the time it takes for the camera to open is about the same in both cases – and even though our fade-in technically delays full camera appearance by a couple hundred milliseconds. Perception is reality!
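To make that concrete, here’s a rough sketch of the container-side flow described above (currentVC, cameraVC, and the durations are placeholders for your own setup):

// In the container VC: add the camera VC invisibly, start the camera,
// and begin the transition animation right away.
- (void)switchToCamera
{
	[self addChildViewController:self.cameraVC];
	self.cameraVC.view.frame = self.view.bounds;
	self.cameraVC.view.alpha = 0; // user still sees the current VC
	[self.view addSubview:self.cameraVC.view];
	[self.cameraVC didMoveToParentViewController:self];

	[self.cameraVC startCamera];

	[UIView animateWithDuration:0.7 animations:^{
		self.currentVC.view.alpha = 0; // fade out whatever is on screen
	}];
}

#pragma mark - CameraDelegate

- (void)cameraStartedRunning
{
	// The preview is live; quickly fade in the camera view we kept transparent.
	[UIView animateWithDuration:0.3 animations:^{
		self.cameraVC.view.alpha = 1;
	}];
}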

Alright, now to part 2 of CameraViewController.m:

...

- (void)stopCamera
{
	if (TARGET_IPHONE_SIMULATOR) {
		_simulatorIsCameraRunning = NO;
		return;
	}

	if (self.captureSession && [self.captureSession isRunning]) {
		dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^ {
			[self.captureSession stopRunning];
		});
	}
}

- (BOOL)isCameraRunning
{
	if (TARGET_IPHONE_SIMULATOR) return _simulatorIsCameraRunning;

	if (!self.captureSession) return NO;

	return self.captureSession.isRunning;
}

- (void)onVideoError:(NSNotification *)notification
{
	NSLog(@"Video error: %@", notification.userInfo[AVCaptureSessionErrorKey]);
}

- (void)takePhoto
{
	dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
		if (TARGET_IPHONE_SIMULATOR) {
			[NSThread executeOnMainThread: ^{
				[self.delegate didTakePhoto:[UIImage imageNamed:@"Simulator_OriginalPhoto@2x.jpg"]];
			}];
			return;
		}

		// Find the video connection feeding the still image output.
		AVCaptureConnection *videoConnection = nil;
		for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
			for (AVCaptureInputPort *port in [connection inputPorts]) {
				if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
					videoConnection = connection;
					break;
				}
			}
			if (videoConnection) break;
		}

		[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
														   completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
		 {
			 if (!imageSampleBuffer) {
				 NSLog(@"ERROR: capturing still image: %@", error);
				 return;
			 }

			 NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];

			 [NSThread executeOnMainThread: ^{
				 [self.delegate didTakePhoto:[UIImage imageWithData:imageData]];
			 }];
		 }];
	});
}
@end

stopCamera does nothing but set our ‘fake’ boolean if we’re in the simulator, and actually stops the camera when we’re on the device. Notice that you don’t see it called anywhere inside this controller. Why? If your case isn’t like mine, you definitely SHOULD call this from within viewWillDisappear in CameraViewController (and perhaps from within dealloc as well). Otherwise, the camera will keep running in the background until iOS kills it (and I don’t know what the rules are for that), draining the battery and turning your users into an angry mob.
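If your case is the simple one, a minimal sketch of that inside CameraViewController might look like this (the dealloc version avoids capturing self in a block, and also unregisters the runtime-error observer we added in startCamera):

- (void)viewWillDisappear:(BOOL)animated
{
	[super viewWillDisappear:animated];
	[self stopCamera];
}

- (void)dealloc
{
	// Don't capture self in a block during dealloc; grab the session directly.
	AVCaptureSession *session = self.captureSession;
	if ([session isRunning]) {
		dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
			[session stopRunning];
		});
	}
	// Clean up the AVCaptureSessionRuntimeErrorNotification subscription too.
	[[NSNotificationCenter defaultCenter] removeObserver:self];
}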

If your case is like mine, however, your users will see some sort of edit screen after taking a picture with an option to retake it, and they may need to go back and forth a lot. To make sure they don’t have to wait for the camera to start up every time, I keep the camera running for a while after each shot (15 seconds in the code below). This means that, every time the user takes a picture, I have to set up (and reset) a timer to kill the camera afterwards. Where you put this timer code will depend on your app architecture, but do test well to make sure it always executes, so that AVCaptureSession isn’t left running in the background for long.

Back to code… The isCameraRunning method is self-explanatory, and here we finally see our fake camera-status boolean come in handy.

If AVCaptureSession blows up for some reason (device out of memory, an internal Apple code error, etc.), AVCaptureSessionRuntimeErrorNotification is posted and onVideoError: gets called. Notice we subscribe to that notification when setting up self.captureSession back inside startCamera.

Last but not least is takePhoto. I wrapped it in a dispatch call, but frankly I’m not sure it helps any, since Apple seems to block all threads while it extracts a photo. But if that changes in the future, I’ll be ready. This is where you’ll find the main workaround code that makes the camera ‘work’ in the simulator: we simply return a pre-bundled picture to the delegate. Well, I return the same picture every time in this code, but you can imagine having a set of pictures from which you select a random one each time.
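The randomized variant is tiny. A sketch, assuming you’ve bundled, say, three images named Simulator_OriginalPhoto1@2x.jpg through Simulator_OriginalPhoto3@2x.jpg (hypothetical names):

- (UIImage *)randomSimulatorPhoto
{
	// Pick one of the three bundled stand-in photos at random.
	uint32_t index = arc4random_uniform(3) + 1; // 1..3
	NSString *name = [NSString stringWithFormat:@"Simulator_OriginalPhoto%u@2x.jpg", index];
	return [UIImage imageNamed:name];
}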

Wrap-Up

I’m sure you have more questions about AVFoundation, such as “what is AVCaptureConnection or AVMediaTypeVideo, and why do I see the word ‘video’ so often when I only care about still images?” – but that’s just how it’s done, and explaining why is outside the scope of this post. Just copy, paste, and enjoy =) And if it breaks, don’t blame me.

Now, what calls takePhoto? Your delegate view controller, most likely. The AVFoundation camera is barebones – it has no shutter button or other UI. So you’ll have to create your own “overlay” controls view with a shutter button, a retake button, maybe a flash on/off button, and so on. In my app, the custom container controller I mentioned before is also the delegate for such an overlay view. When the user taps the shutter button to take a photo, the overlay view sends a “plzTakePhotoMeow” message (wording is approximate) to its delegate (the custom container VC I keep talking about), which then calls takePhoto on the CameraViewController instance. In turn, when the camera VC is done getting an image from AVCaptureSession, it returns the image to its delegate (the same custom container VC). The custom container VC can then remove the camera VC from the view hierarchy and instantiate the next VC, such as an EditPhotoVC, passing it the photo from the camera.
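To make that message chain concrete, the overlay’s delegate protocol can be tiny – a sketch with hypothetical names:

@protocol CameraOverlayDelegate <NSObject>
- (void)overlayDidTapShutter;
@end

// In the container VC, which acts as the overlay's delegate:
- (void)overlayDidTapShutter
{
	[self.cameraVC takePhoto];
}

// ...and when the camera VC calls back with the image:
- (void)didTakePhoto:(UIImage *)photo
{
	// Remove the camera VC, then show the edit screen with `photo`.
}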

Ok, I promised code for that NSThread category, so here it is:

@implementation NSThread (Helpers)

+ (void)executeOnMainThread:(void (^)(void))block
{
	if (!block) return;

	if ([[NSThread currentThread] isMainThread]) {
		block();
	} else {
		dispatch_sync(dispatch_get_main_queue(), ^ {
			block();
		});
	}
}

@end

Finally, the code for the timer to stop the camera (if you go this route) is below. First we define a few properties on the controller that is the delegate to CameraViewController (again, in my case it’s the container VC):

@property (atomic) BOOL openingCameraView;
@property (nonatomic) NSTimer *stopCameraTimer;

And then, in didTakePhoto:, canceledTakingPhoto: (if your overlay controls UI has that button), etc. spin up the timer:

[NSThread executeOnMainThread:^{
	// display the next VC here
	...

	// start a timer to shut off the camera
	if (!self.stopCameraTimer || !self.stopCameraTimer.isValid) {
		self.stopCameraTimer = [NSTimer scheduledTimerWithTimeInterval:15 target:self selector:@selector(stopCameraAfterTimer:) userInfo:nil repeats:NO];
	}
}];

The actual method the timer calls is below, and it just ensures we don’t stop the camera while the camera VC is shown (that would be an oopsie) or while we’re in the process of showing it (self.openingCameraView is set to YES manually at the beginning of the method responsible for adding the camera VC to the view hierarchy):

- (void)stopCameraAfterTimer:(NSTimer *)timer
{
	if (!self.openingCameraView && self.currentlyShownVC != self.cameraVC) {
		[self.cameraVC stopCamera];
	}
}

I hope this helps! If I made errors or you have questions, please contact me on Twitter.

Happy coding in 2014!