iOS4: Take photos with live video preview using AVFoundation

I’m writing this because – as of April 2011 – Apple’s official documentation is badly wrong. Some of their source code won’t even compile (typos that would have been obvious if they’d checked it), and some of their instructions are hugely over-complicated – and yet simply don’t work.

This is a step-by-step guide to taking photos with live image preview. It’s also a good starting point for doing much more advanced video and image capture on iOS 4.

What are we trying to do?

It’s easy to write an app that takes photos: the necessary API has been built into iOS / iPhone OS for a few years now, and it still works – although it’s quite a lot of code.

But … with iOS 4, the new “AV Foundation” library offers a much more powerful way of taking photos, which lets you put the camera view inside your own app. So, for instance, you can make an app that looks like this:

Gotchas

0. Requires a 3GS, iPod Touch 3, or better…

The entire AV Foundation library is not available on the oldest iPhone and iPod Touch devices. I believe this is because Apple is doing a lot of the work in hardware, making use of features that don’t exist in the original iPhone and iPhone 3G chips.

Interestingly, the AV Foundation library *is* available on the Simulator – which suggests that Apple certainly *could* have implemented AV F for older phones, but decided not to. It’s very useful that you can test most of your AV F app on the Simulator (so long as you copy/paste some videos into the Simulator to work with).

1. Apple doesn’t tell you the necessary Frameworks

You need *all* of the following frameworks (they all come with Xcode, but you have to add them to your project manually; a minimal sketch of the matching #import lines follows the list):

  1. CoreVideo
  2. CoreMedia
  3. AVFoundation (of course…)
  4. ImageIO
  5. QuartzCore (maybe)
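
Adding the frameworks to the project isn’t quite enough on its own – you also need the matching #import lines in whichever class uses them. Here’s a minimal sketch of the imports for the code in this post (QuartzCore only if you reference CALayer types directly):

// In your view controller's .h (or .m) file:
#import <AVFoundation/AVFoundation.h>   // AVCaptureSession, devices, inputs, outputs
#import <CoreMedia/CoreMedia.h>         // CMSampleBufferRef, CMGetAttachment
#import <CoreVideo/CoreVideo.h>         // CVImageBufferRef and the pixel-buffer functions
#import <ImageIO/ImageIO.h>             // kCGImagePropertyExifDictionary
#import <QuartzCore/QuartzCore.h>       // CALayer (the "maybe" above)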

How do we: get live video from camera straight onto the screen?

Create a new UIViewController, add its view to the screen (either in IB or through code – if you don’t know how to add a ViewController’s view, you need to do some much more basic iPhone tutorials first).

Add a UIView object to the NIB (or as a subview), and create a @property in your controller:

@property(nonatomic, retain) IBOutlet UIView *vImagePreview;

Connect the UIView to the outlet above in IB, or assign it directly if you’re using code instead of a NIB.
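
If you’re doing it in code rather than a NIB, a minimal sketch of the “assign it directly” route might look like this (the frame values are arbitrary placeholders; this post pre-dates ARC, hence the autorelease):

// e.g. in viewDidLoad – only needed if you are NOT using a NIB
self.vImagePreview = [[[UIView alloc] initWithFrame:CGRectMake(0, 0, 160, 240)] autorelease]; // arbitrary frame
self.vImagePreview.backgroundColor = [UIColor blackColor];
[self.view addSubview:self.vImagePreview];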

Then edit your UIViewController, and give it the following viewDidAppear method:

-(void) viewDidAppear:(BOOL)animated
{
	// Create a capture session, and ask for medium-quality output
	AVCaptureSession *session = [[AVCaptureSession alloc] init];
	session.sessionPreset = AVCaptureSessionPresetMedium;
	
	CALayer *viewLayer = self.vImagePreview.layer;
	NSLog(@"viewLayer = %@", viewLayer);
	
	// Create a preview layer for this session, and embed it inside our placeholder view
	AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
	
	captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
	[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];
	
	// Grab the default video-capture device (normally the back camera) and wire it into the session
	AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
	
	NSError *error = nil;
	AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
	if (!input) {
		// Handle the error appropriately.
		NSLog(@"ERROR: trying to open camera: %@", error);
	}
	[session addInput:input];
		
	[session startRunning];
}
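
The session gets started in viewDidAppear, but nothing ever stops it. The post doesn’t cover teardown, so here’s a minimal sketch of one way to do it – assuming you promote the local “session” variable above into a retained property (hypothetical, e.g. @property(nonatomic, retain) AVCaptureSession *captureSession; – it’s not in the original code):

-(void) viewWillDisappear:(BOOL)animated
{
	[super viewWillDisappear:animated];
	
	// Stop the camera when the view goes offscreen (uses the hypothetical captureSession property)
	[self.captureSession stopRunning];
	self.captureSession = nil; // the retain-property setter releases it (pre-ARC)
}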

Run your app on a device (NB: this will NOT run on the Simulator – Apple doesn’t support cameras in the Simulator (yet)), and … you should see the live camera view appear in your subview.

Gotchas

2. Apple’s example code for live-video doesn’t work

In the AVFoundation docs, Apple has a whole section on trying to do what we did above. Here’s a link: AV Foundation Programming Guide – Video Preview. But it doesn’t work.

UPDATE: cf. Robert’s comment below. This method does work; you just have to use it in a different way.

“The method “imageFromSampleBuffer” does work when you send a sample buffer from “AVCaptureVideoDataOutput” which is “32BGRA”. You tried to send a sample buffer from “AVCaptureStillImageOutput” which is “AVVideoCodecJPEG”.”

(more details + source code in Robert’s comment at the end of this post)
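
To make that concrete: Apple’s imageFromSampleBuffer (quoted in gotcha 3 below) expects sample buffers from an AVCaptureVideoDataOutput configured for 32BGRA – not from the JPEG still-image output used later in this post. Here’s a rough sketch of that alternative setup, based on Robert’s description (the queue name is a placeholder of mine, and your class would need to adopt AVCaptureVideoDataOutputSampleBufferDelegate):

// Sketch only: a video-data output that delivers 32BGRA frames,
// which is the format imageFromSampleBuffer can actually handle
AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
videoDataOutput.videoSettings = [NSDictionary dictionaryWithObject:
	[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
	forKey:(id)kCVPixelBufferPixelFormatTypeKey];

// Frames arrive via a delegate callback on this queue
dispatch_queue_t frameQueue = dispatch_queue_create("frameQueue", NULL); // placeholder name
[videoDataOutput setSampleBufferDelegate:self queue:frameQueue];
dispatch_release(frameQueue);

[session addOutput:videoDataOutput];

// ...and in the delegate method, the buffer really is 32BGRA:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
	UIImage *image = imageFromSampleBuffer(sampleBuffer);
	// ... use the image ...
}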

If you look in the docs for AVCaptureVideoPreviewLayer, you’ll find a *different* source code example, which works without having to change codecs:

captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
[self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

3. Apple’s image-capture docs are also wrong

In the AV Foundation docs, there’s also a section on how to get images from the camera. It’s mostly correct – and then, at the last minute, it goes horribly wrong.

Apple provides a link to another part of the docs, with the following source code:

{
    ...
    UIImage* image = imageFromSampleBuffer(imageSampleBuffer);
    ...
}

UIImage *imageFromSampleBuffer(CMSampleBufferRef sampleBuffer)
{	
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // Lock the base address of the pixel buffer.
    CVPixelBufferLockBaseAddress(imageBuffer,0);
	
    // Get the number of bytes per row for the pixel buffer.
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    // Get the pixel buffer width and height.
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
	
    // Create a device-dependent RGB color space.
    static CGColorSpaceRef colorSpace = NULL;
    if (colorSpace == NULL) {
        colorSpace = CGColorSpaceCreateDeviceRGB();
		if (colorSpace == NULL) {
            // Handle the error appropriately.
            return nil;
        }
    }
	
    // Get the base address of the pixel buffer.
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    // Get the data size for contiguous planes of the pixel buffer.
    size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer);
	
    // Create a Quartz direct-access data provider that uses data we supply.
    CGDataProviderRef dataProvider = CGDataProviderCreateWithData(NULL, baseAddress, bufferSize, NULL);
    // Create a bitmap image from data supplied by the data provider.
    CGImageRef cgImage = CGImageCreate(width, height, 8, 32, bytesPerRow, colorSpace,
                                       kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
                                       dataProvider, NULL, true, kCGRenderingIntentDefault);
    CGDataProviderRelease(dataProvider);
	
    // Create and return an image object to represent the Quartz image.
    UIImage *image = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
	
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
	
    return image;
}

This code has never worked for me – it always returns an empty 0x0 image, which is useless. That’s 45 lines of useless code that everyone is required to re-implement in every app they write.

Or maybe not.

Instead, if you look at the WWDC videos, you’ll find an alternate approach that takes just two lines of source code:

NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage *image = [[UIImage alloc] initWithData:imageData];

Even better … this actually works!

How do we: take a photo of what’s in the live video feed?

There are two halves to this. Obviously, we’ll need a button to capture a photo, and a UIImageView to display it. Less obviously, we’ll have to alter our existing camera-setup routine.

To make this work, we have to create an “output source” for the camera when we start it; then, later on, when we want to take a photo, we ask that “output” object to give us a single image.

Part 1: Add buttons and views and image-capture routine

So, create a new @property to hold a reference to our output object:

@property(nonatomic, retain) AVCaptureStillImageOutput *stillImageOutput;

Then make a UIImageView where we’ll display the captured photo. Add this to your NIB, or programmatically.

Hook it up to another @property, or assign it manually, e.g.:

@property(nonatomic, retain) IBOutlet UIImageView *vImage;

Finally, create a UIButton, so that you can take the photo.

Again, add it to your NIB (or programmatically to your screen), and hook it up to the following method:

-(IBAction) captureNow
{
	// Find the video connection that feeds the still-image output
	AVCaptureConnection *videoConnection = nil;
	for (AVCaptureConnection *connection in stillImageOutput.connections)
	{
		for (AVCaptureInputPort *port in [connection inputPorts])
		{
			if ([[port mediaType] isEqual:AVMediaTypeVideo] )
			{
				videoConnection = connection;
				break;
			}
		}
		if (videoConnection) { break; }
	}
	
	NSLog(@"about to request a capture from: %@", stillImageOutput);
	[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
	{
		 CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);
		 if (exifAttachments)
		 {
			// Do something with the attachments.
			NSLog(@"attachements: %@", exifAttachments);
		 }
		else
			NSLog(@"no attachments");
		
		NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
		UIImage *image = [[UIImage alloc] initWithData:imageData];

		self.vImage.image = image;
		[image release]; // the retain property owns it now (this post pre-dates ARC)
	 }];
}

Part 2: modify the camera-setup routine

Go back to the viewDidAppear method you created at the start of this post. The very last line ([session startRunning]) must REMAIN the last line, so we’ll insert the new code immediately above it. Here’s the new code to insert:

stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
[outputSettings release]; // the output copies its settings (this post pre-dates ARC)

[session addOutput:stillImageOutput];
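
For clarity, here’s roughly how the tail end of viewDidAppear should look once the still-image output is wired in – this is just the pieces from above re-assembled, nothing new:

	[session addInput:input];
	
	// NEW: a JPEG still-image output attached to the same session,
	// so that captureNow (above) has something to ask for a photo
	stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
	NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
	[stillImageOutput setOutputSettings:outputSettings];
	[outputSettings release]; // the output copies its settings (pre-ARC)
	
	[session addOutput:stillImageOutput];
	
	// ...and this stays the very last line:
	[session startRunning];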

Run the app, and you should get something like the image I showed at the start, where the part on the left is a live preview from the camera, and the part on the right updates each time you tap the “take photo” button.

83 thoughts on “iOS4: Take photos with live video preview using AVFoundation”

  1. Fantastic post!

    Is the block under your heading “Part 2: modify…” starting with the line

    AVCaptureConnection *videoConnection = nil;

    really needed?

    That block declares a local variable that is never used when finalizing the AVCaptureStillImageOutput part of the capture session.

    The AvCaptureConnection again appears to be recomputed each time the capture is initiated (in your captureNow IBAction callback further above).

    Can you use the first declaration, as an attribute, so that it doesn’t have to be computed each time the photo is taken (reduce shutter lag)?

  2. @jpap

    You’re right – that “videoConnection”, and the whole block with it, isn’t needed – it’s a copy/paste error from the “captureNow” method (elsewhere in the post), which *does* need it.

    Also, yes – you might be able to keep it in an attribute and re-use; although IIRC there are some situations where the array of available outputs changes dynamically – e.g. if you plug/unplug external monitors, projectors, etc.

    In practice, I’ve seen no noticeable shutter lag, except maybe on the 3GS, so I haven’t tried optimizing that method. The iPhone 4 and the iPod Touch 4 seem to be instantaneous with this code in the projects I’ve used it in.

  3. Hello,

    I am trying to port your code to MonoTouch – would you please provide the whole sample project, so I can see if I am missing anything?

    Thanks

  4. Thanks for this – consider it bookmarked !

    I assume you’ve submitted a bug radar or documentation feedback to get Apple to correct the docs?

  5. 10,000 people already have.

    …so far as anyone can tell.

    Until Apple makes their bug-tracking public, I’m not going to waste my time reporting the 100’s of bugs I come across – all of which have ALMOST CERTAINLY been reported already. Apple’s super-secrecy here hurts everyone – including Apple.

  6. @Johny, @Amer

    All the code is there in the post, I believe. This was simply written directly into a blank Apple Xcode default template app.

    (in the screenshots, you can see the two tabs named “First” and “Second” – these are the defaults if you create a new Tabbed Application, I didn’t even rename them)

    If you’re having difficulty starting or running a new Xcode project, I’m afraid I can’t help you. I suggest googling “tabbed iphone tutorial” or similar.

  7. Thanks buddy, very nice post. Can you please tell me how to display live video on the iPhone with various effects like grayscale, sepia, etc.?

  8. Hi!

    Thanks for this very good tutorial. It is very helpful. Keep on!

    Is there some way to make the images smaller? In my project I have to process the image, and it takes a lot of time :-(. Maybe there is some setting to do that?
    Also, I have my CaptureManager in its own class, and I start running the preview from another class. The image will not be saved until I press the “takeImage” button again. I have debugged this, and the part where the completionHandler is waits for the calling methods to do what they have to do, and then continues with saving the stillImage. I am new to programming Objective-C, and maybe I don’t understand how the completionHandler works. Can you please give me a tip on what to do? Is it possible to use something other than the completionHandler to achieve a better result?

    Thanks!

  9. Managing the camera (and using AV Foundation) is a pretty advanced iPhone topic – if you’re new to Objective-C, I strongly advise you do *not* use this code, and instead focus on simpler projects until you’re comfortable with the language and platform. AV Foundation is missing a lot of documentation, it deals with low-level hardware, and brings in subtleties from Core Animation. That’s way too much distraction if you’re new.

  10. Hi!

    Millions of thanks for this tutorial. I was trying to capture the video input of the back camera for an AR app. I used the AVCam sample code from Apple, but it runs terribly slowly and uses a lot of CPU time. I also tried UIImagePicker, but it has too many limitations and a lot of bugs.

    With your code it works smoothly now.

    Again, MILLIONS of thanks!!!

  11. Thank you very much!

    This example has clarified a lot for me. I was using the broken code for a day and wondering why nothing was showing up.

    Now that I am able to take a snapshot, I want to receive a constant video stream from the camera, or constant snaps at an interval into a buffer.

    Right now I am taking a snapshot every second and then sending it out on a UDP socket. Though it is working, I can tell it is not efficient; plus, the iPhone makes a picture-taking sound every second, which is undesirable.

    Any feedback will be appreciated.

  12. Bravo. Not often does one find a tutorial of this quality. I have four apps in the works that will take advantage of the camera. AVFoundation definitely wins over UIImagePicker and this tutorial gets the basics down quite effectively and efficiently!

  13. Thanks for your code. But after using it, I found that “session.sessionPreset = AVCaptureSessionPresetMedium” doesn’t seem to work: when I change it to High or Low, the captured image is always 640*480! Can you tell me what the problem is? Thank you very much!

  14. thanks for this post 😀

    I would like to know how I can change the size of the camera view?

    Thanks

  15. How would you combine the video input with the UI elements from the window in an app? I have an app which has several non-opaque views which rotate and translate with device movement using the gyro. I’ve managed to create a video preview view behind this UI and I can capture the video from the camera to disk, but I haven’t found examples of saving both the video and the UI together. Are you aware of a way to do this?

  16. @Randy

    The easy way would be to add a Composition Layer / Composition Instructions into your AV Foundation output stack that inserts the UIView contents manually. IIRC there’s an Apple-provided composition source that takes a UIView as input – that’s how Apple does it in their own apps.

  17. Thanks for the tutorial. I get an error in Xcode saying “Use of undeclared identifier ‘kCGImagePropertyExifDictionary’” on the line
    CFDictionaryRef exifAttachments = CMGetAttachment( imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);

    Do you know why?

  18. Thx! very useful.
    But you have one small error too:

    viewLayer – variable declared, but never used

    Not critical, but let your code be perfect! 😉

  19. This method does work:

    UIImage *imageFromSampleBuffer(CMSampleBufferRef sampleBuffer)

    Apple’s code is okay. You’re using it wrong: you want a UIImage from a sample buffer with the wrong video codec.

    AVCaptureStillImageOutput uses: AVVideoCodecKey -> AVVideoCodecJPEG

    AVCaptureVideoDataOutput uses: kCVPixelBufferPixelFormatTypeKey -> kCVPixelFormatType_32BGRA

    The method “imageFromSampleBuffer” does work when you send a sample buffer from “AVCaptureVideoDataOutput” which is “32BGRA”. You tried to send a sample buffer from “AVCaptureStillImageOutput” which is “AVVideoCodecJPEG”.

    You get that empty picture because you’re sending wrong input format (32BGRA != AVVideoCodecJPEG).

    AVCaptureVideoDataOutput is really important because you can capture still images without that “shutter sound” direct from the live video feed. Or you can perform realtime face detection.

    Cheers,
    Robert

  20. @Robert – that’s awesome, thanks!

    I admit I gave up quickly – back when AVF came out, there were no docs, and no tutorials from Apple (or anyone else), so I went with whatever worked first time.

    I’ll update the post to point to your explanation.

  21. The code as you state above is not all there.
    Implementing this causes a number of errors because of items not declared.
    It would be extremely helpful to have the whole project.
    Any chance you could put it up on git?

  22. @Michael

    I just re-read the post. As far as I can tell: no, there is nothing “not all there”. You have the complete source, and can apply it to any project. This has worked for everyone else – if you’re having problems, I suggest you post them.

  23. I guess I could go that route, but having the entire project saves a lot of questions. So again, how about the entire project source?

    It would be very helpful.

  24. In lieu of getting the project source as a download, here is one question I have.

    Did you start with a tabbed app project and then add to the FirstViewController the two views I see in the finished app near the top of this post?

  25. After having said all that, I got it all working.

    Sorry for being a pest!

    And many thanks as this will really help!

  26. Yes, as described above. If you follow the steps in this post, it should work. If not … tell us what’s wrong, and we’ll try to find a solution.

    It is a LOT easier to just follow the steps than to debug old project files. I’m not going to give you support for debugging a project file from an old version of Xcode, so … not giving out the project file either (I probably don’t even have it any more – I expect I deleted it as soon as I had made the example).

  27. No worries :). AV Foundation has a very steep learning curve – it’s painful and frustrating – so I know how you feel.

  28. Hey Adam, I’ve tried to do the first step (1: Live image in view) in Xcode 4.3.2 but it won’t compile.

    1. I created a new project with the Tabbed Application template.
    2. I imported the Frameworks described.
    3. Dropped a UIView object onto my first view
    4. I Ctrl-dragged the view to FirstViewController.h and it created a property:
    @property (strong, nonatomic) IBOutlet UIView *vImagePreview;
    5. I opened the FirstViewController.m and copied in the code
    -(void) viewDidAppear:(BOOL)animated {}
    6. I get 19 “undeclared identifier” errors for AVCaptureSession, session, device, and so on.

    Sorry if it’s a stupid question but I can’t figure it out…any ideas?

  29. If AVCaptureSession is missing – are you absolutely sure you added + imported the AVFoundation framework?

  30. @adam Rookie error, I had forgotten to import the AVFoundation framework as you said!

    I added the line #import <AVFoundation/AVFoundation.h> to my FirstViewController.h.

    When we capture an image, is it coming from one of the frames in the live video feed? Is it possible to call the normal image-capture method on the Take Photo Now button (without bringing up the UIImagePicker view)? Do you know how apps such as Camera+ or Instagram do this?

    Thanks!

  31. I believe it comes from the camera feed – AVF tends to work with streams (although this is effectively a post-processed stream).

    Not sure I understand your other question?

  32. Sorry, I’ll explain it further… If we are grabbing an image from the live camera feed, it won’t be the same quality as if we were to run the image-capture method in UIImagePicker. Therefore, is it possible to display the live video feed, but, when it comes to taking the photo, actually *take a photo* instead of grabbing a frame from the feed?

    Is there a better way to do this?

  33. Sorry, been so long since I did this that I can’t remember. Yes, it can be done (I’ve done it before) – although I thought this method above provided it (the purpose of “capture asynchronously” being to grab a high quality single image).

    …but TBH it’s been so long since I last did this in an app that I can’t 100% remember, sorry

  34. Hi, thanks for info, really interesting. Do you think these same functions could be used to take a screen capture? So for example if I’ve got an animation running rather than a video feed, could I still use this same technique to capture an image of it? Any idea how?

    Thanks,

  35. Capturing screenshots is a completely different thing – no shared code at all. If you google, there are a couple of solutions, some of which are practically one-liners. In my experience, these techniques capture animations fine.

    I believe the ones that use OpenGL will also capture any video that happens to be onscreen, because of the way Apple renders – but I haven’t tried it myself.

  36. Can anybody tell me how to load the video camera using the AVFoundation framework?

    How do I capture an image from the live video?

  37. @Tina, Vipendra, et al

    If the tutorial above isn’t clear enough for you, I recommend you do some tutorials on basic iOS coding first.

    AV Foundation is NOT for beginners – you need to be confident in iOS programming before you attempt it. Sorry

  38. How can we use the HDR mode of the camera to capture photos in our app?
    Pro HDR merges three images of different exposures to achieve this. How can we achieve this in our app? Please provide some sample code.

  39. I followed the instructions and seem to be getting an error where videoConnection is consistently not being set (it stays null). Any idea why?

  40. Is it possible to use the camera button found in the camera application rather than just a UIButton?
