Capturing Camera Video with AVFoundation

Motivation (and a Bit of Promotion)

While building an iOS app called 「色比較」 (Color Comparison), I needed to capture camera video with AVFoundation, so this post is a memo for future reference.

The Main Part

A fairly short piece of code for capturing camera video with AVFoundation looks like this.

ViewController.h

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>

@property (strong, nonatomic) AVCaptureDeviceInput *deviceInput;
@property (strong, nonatomic) AVCaptureVideoDataOutput *videoDataOutput;
@property (strong, nonatomic) AVCaptureSession *session;
@property (strong, nonatomic) UIImageView *imageView;

@end

ViewController.m


#import "ViewController.h"

@interface ViewController ()

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    self.imageView = [[UIImageView alloc] initWithFrame:self.view.frame];
    self.imageView.contentMode = UIViewContentModeScaleAspectFill;
    [self.view addSubview:self.imageView];
    [self setupCamera];
}


- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

- (void)setupCamera
{
    NSError *error = nil;
    
    // Create the capture session
    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPresetMedium;
    
    // Use the default video device (the camera)
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    
    // Attach the camera as the session's input
    self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (self.deviceInput == nil) {
        NSLog(@"Failed to create device input: %@", error);
        return;
    }
    [self.session addInput:self.deviceInput];
    
    self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [self.session addOutput:self.videoDataOutput];
    
    dispatch_queue_t queue = dispatch_queue_create("CameraQueue", DISPATCH_QUEUE_SERIAL);
    [self.videoDataOutput setAlwaysDiscardsLateVideoFrames:YES];
    [self.videoDataOutput setSampleBufferDelegate:self queue:queue];
    
    self.videoDataOutput.videoSettings = @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};
    
    [self.session startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.imageView.image = image;
    });
}

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    CIImage *ciimage = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgimage = [context createCGImage:ciimage fromRect:CGRectMake(0, 0, CVPixelBufferGetWidth(pixelBuffer), CVPixelBufferGetHeight(pixelBuffer))];
    
    UIImage *image = [UIImage imageWithCGImage:cgimage scale:1.0 orientation:UIImageOrientationRight];
    
    CGImageRelease(cgimage);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    
    return image;
}

@end

Info.plist

Add the "Privacy - Camera Usage Description" entry and fill in a description.
(No error occurs even without the description text, but you will be asked to resubmit during App Store review.)
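If you edit Info.plist as raw XML instead of through the Xcode editor, this entry corresponds to the NSCameraUsageDescription key. A minimal sketch (the description string below is just an example, not from the original post):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to capture live video.</string>
```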

A Quick Note

Changing the value of self.session.sessionPreset inside the setupCamera method changes the size of the captured video.
Besides AVCaptureSessionPresetMedium, the following presets are available:
  • AVCaptureSessionPreset352x288
  • AVCaptureSessionPreset1280x720
  • AVCaptureSessionPreset1920x1080
  • AVCaptureSessionPreset3840x2160
  • AVCaptureSessionPresetHigh
  • AVCaptureSessionPresetiFrame1280x720
  • AVCaptureSessionPresetiFrame960x540
  • AVCaptureSessionPresetInputPriority
  • AVCaptureSessionPresetLow
  • AVCaptureSessionPresetPhoto
  • AVCaptureSessionPreset960x540
  • AVCaptureSessionPreset320x240
  • AVCaptureSessionPreset640x480
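Not every device supports every preset, so it is safer to check with AVCaptureSession's canSetSessionPreset: before assigning one. A minimal sketch (applyPreset: is a hypothetical helper method name, not part of the original code):

```objc
// Hypothetical helper: apply the requested preset if the current
// device supports it, otherwise fall back to a safe default.
- (void)applyPreset:(NSString *)preset
{
    if ([self.session canSetSessionPreset:preset]) {
        self.session.sessionPreset = preset;
    } else {
        // Fall back to the preset used in setupCamera above
        self.session.sessionPreset = AVCaptureSessionPresetMedium;
    }
}
```

For example, calling [self applyPreset:AVCaptureSessionPreset1920x1080] on an older device that cannot deliver 1080p would quietly fall back to Medium instead of raising an exception.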
Category: Objective-C
Published: 2017-10-13