iOS Development: A Simple Motion Detector with GPUImageMotionDetector

A previous project needed motion detection based on the camera feed, so I put together a small demo to share.

First, build a simple video pipeline (to save time I just used GPUImage):

- (void)creatGPUImageVideoCamera{
    // Initialize the camera
    _videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset1280x720 cameraPosition:AVCaptureDevicePositionBack];
    // Enable audio capture
    [_videoCamera addAudioInputsAndOutputs];
    // Output image orientation; this can also be used for landscape streaming
    _videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
    // Mirroring policy; these settings feel the most natural and match the system camera's defaults
    _videoCamera.horizontallyMirrorRearFacingCamera = NO;
    _videoCamera.horizontallyMirrorFrontFacingCamera = YES;
    // Preview view
    _gpuImageView = [[GPUImageView alloc] initWithFrame:CGRectMake(0, 0, kScreenWidth, kScreenHeight)];
    [self.view addSubview:_gpuImageView];
    // A filter node (a GPUImageOpacityFilter stands in for a beauty filter here)
    _beautifyFilter = [[GPUImageOpacityFilter alloc] init];
    // Send the camera's video output to the filter
    [_videoCamera addTarget:_beautifyFilter];
    // Send the filtered output to the preview
    [_beautifyFilter addTarget:_gpuImageView];
    
    // Raw BGRA pixel output, in case we want the frame data ourselves
    _dataHandler = [[GPUImageRawDataOutput alloc] initWithImageSize:CGSizeMake(kScreenWidth, kScreenHeight) resultsInBGRAFormat:YES];
    [_beautifyFilter addTarget:_dataHandler];
    
//    _videoCamera.delegate = _dataHandler;
    
    // Start capturing video
    [self.videoCamera startCameraCapture];
    
    // Adjust the frame rate if needed
//    [self updateFps:AVCaptureSessionPreset640x480.fps];
    
}
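
The `_dataHandler` above is added as a target but never actually read from in this snippet. If you want the raw BGRA bytes yourself (for a custom detector, an encoder, and so on), GPUImageRawDataOutput hands them out through its frame-available block. A rough sketch of how that could look (the logging and variable names are just for illustration, not part of the original demo):

    __weak GPUImageRawDataOutput *weakOutput = _dataHandler;
    [_dataHandler setNewFrameAvailableBlock:^{
        GPUImageRawDataOutput *output = weakOutput;
        if (!output) return;
        
        [output lockFramebufferForReading];
        GLubyte *bytes = [output rawBytesForImage];          // BGRA order, because resultsInBGRAFormat:YES
        NSUInteger bytesPerRow = [output bytesPerRowInOutput];
        GLubyte blue  = bytes[0];                            // first pixel, just as a sanity check
        GLubyte green = bytes[1];
        GLubyte red   = bytes[2];
        [output unlockFramebufferAfterReading];
        
        NSLog(@"first pixel R=%u G=%u B=%u, bytesPerRow=%lu",
              red, green, blue, (unsigned long)bytesPerRow);
    }];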

At this point you can already see the camera preview.


Next we use GPUImageMotionDetector to do the motion detection:

    // The motion detection filter
    _motionDetector = [[GPUImageMotionDetector alloc] init];
    [_beautifyFilter addTarget:_motionDetector];
    
    _motionDetector.motionDetectionBlock = ^(CGPoint motionCentroid, CGFloat motionIntensity, CMTime frameTime) {
        // motionCentroid is the normalized center of the detected movement;
        // motionIntensity measures how strong it is, so that is what we compare against the threshold
        if (motionIntensity > kMotionDetectionSensitiveValue) {
            NSLog(@"Someone passed by");
        }
    };
    // kMotionDetectionSensitiveValue is a sensitivity threshold you define yourself (e.g. 0.01)

At this point the console starts printing "Someone passed by" non-stop.
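
If that flood of logs is a problem, you can rate-limit the callback using the frame timestamp GPUImage passes in. A minimal sketch, assuming one notification per second is enough (the lastMotionSeconds variable and the 1-second window are my own choices, not part of the original demo):

    __block Float64 lastMotionSeconds = -1;
    _motionDetector.motionDetectionBlock = ^(CGPoint motionCentroid, CGFloat motionIntensity, CMTime frameTime) {
        if (motionIntensity < kMotionDetectionSensitiveValue) {
            return; // not enough movement in this frame
        }
        Float64 seconds = CMTimeGetSeconds(frameTime);       // timestamp of the current frame
        if (lastMotionSeconds < 0 || seconds - lastMotionSeconds > 1.0) {
            lastMotionSeconds = seconds;                     // at most one notification per second
            NSLog(@"Someone passed by");
        }
    };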


Demo link