AVFoundation - Retiming CMSampleBufferRef video output

First time asking a question here. I hope this post is clear and the sample code is formatted correctly.

I'm experimenting with AVFoundation and time-lapse photography.

My intent is to grab every Nth frame from the video camera of an iOS device (my iPod touch, version 4) and write each of those frames out to a file to create a time lapse. I'm using AVCaptureVideoDataOutput, AVAssetWriter, and AVAssetWriterInput.
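
(For the arithmetic: at the camera's native 30 fps, keeping every 30th frame writes one frame per second of recording; played back at 30 fps, one minute of capture becomes a two-second clip, a 30x speed-up.)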

The problem is that when I pass along the CMSampleBufferRef received in

captureOutput:didOutputSampleBuffer:fromConnection:

the playback of each frame lasts the length of time between the original input frames. A frame rate of, say, 1 fps. I'm looking to get 30 fps.
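
In other words, I think I need to remap each kept frame onto a fresh 30 fps timeline instead of keeping its capture timestamp. A sketch of the arithmetic I have in mind (writtenFrames is the counter from my code below):

    //The k-th kept frame should land at k/30 s on the output timeline,
    //regardless of when it was actually captured.
    CMTime newPTS = CMTimeMake(writtenFrames * 20, 600); //20/600 s == 1/30 s per frame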

I have tried using

CMSampleBufferCreateCopyWithNewTiming()

but then, after 13 frames have been written to the file,

captureOutput:didOutputSampleBuffer:fromConnection:

stops being called. The interface is still active, and I can tap a button to stop the recording and save it to the photo library for playback. It does appear to play back the way I want, at 30 fps, but it has only those 13 frames.

How can I accomplish my goal of 30 fps playback?
How can I tell where the app gets lost, and why?
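
One after-the-fact check I can run is to inspect what actually landed in the file. A minimal sketch (movieURLPath comes from my ProjectPaths helper used in the code below; the synchronous property access is fine for a debug log):

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL: [projectPaths movieURLPath] options: nil];
    AVAssetTrack *track = [[asset tracksWithMediaType: AVMediaTypeVideo] objectAtIndex: 0];
    NSLog(@"duration: %f s, nominalFrameRate: %f", CMTimeGetSeconds(asset.duration), track.nominalFrameRate);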

I've put in a flag called useNativeTime so I can test both cases. When it is set to YES, I get all the frames I'm interested in, since the callback doesn't 'get lost'. When I set the flag to NO, I only ever get 13 frames processed and the method is never called again. As mentioned above, in both cases I can play back the video.

Thanks for any help.

Here is where I try to do the retiming.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    BOOL useNativeTime = NO;
    BOOL appendSuccessFlag = NO;

    //NSLog(@"in captureOutpput sample buffer method");
    if( !CMSampleBufferDataIsReady(sampleBuffer) )
    {
        NSLog( @"sample buffer is not ready. Skipping sample" );
        //CMSampleBufferInvalidate(sampleBuffer);
        return;
    }

    if (! [inputWriterBuffer isReadyForMoreMediaData])
    {
        NSLog(@"Not ready for data.");
    }
    else {
        //Write every first frame of n frames (30 native from camera). 
        intervalFrames++;
        if (intervalFrames > 30) {
            intervalFrames = 1;
        }
        else if (intervalFrames != 1) {
            //CMSampleBufferInvalidate(sampleBuffer);
            return;
        }

        //Need to initialize start session time.
        if (writtenFrames < 1) {
            if (useNativeTime) imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            else imageSourceTime = CMTimeMake( 0 * 20 ,600); //CMTimeMake(1,30);
            [outputWriter startSessionAtSourceTime: imageSourceTime];
            NSLog(@"Starting CMtime");
            CMTimeShow(imageSourceTime);
        }

        if (useNativeTime) {
            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CMTimeShow(imageSourceTime);
            //CMTime myTiming = CMTimeMake(writtenFrames * 20,600);
            //CMSampleBufferSetOutputPresentationTimeStamp(sampleBuffer, myTiming); //Tried but has no effect.
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:sampleBuffer];
        }
        else {
            CMSampleBufferRef newSampleBuffer;
            CMSampleTimingInfo sampleTimingInfo;
            sampleTimingInfo.duration = CMTimeMake(20,600);
            sampleTimingInfo.presentationTimeStamp = CMTimeMake( (writtenFrames + 0) * 20,600);
            sampleTimingInfo.decodeTimeStamp = kCMTimeInvalid;
            OSStatus myStatus;

            //NSLog(@"numSamples of sampleBuffer: %i", CMSampleBufferGetNumSamples(sampleBuffer) );
            myStatus = CMSampleBufferCreateCopyWithNewTiming(kCFAllocatorDefault,
                                                             sampleBuffer,
                                                             1,
                                                             &sampleTimingInfo, //maybe a little confused on this param.
                                                             &newSampleBuffer);
            //These confirm the good health of our newSampleBuffer.
            if (myStatus != 0) NSLog(@"CMSampleBufferCreateCopyWithNewTiming() myStatus: %i",myStatus);
            if (! CMSampleBufferIsValid(newSampleBuffer)) NSLog(@"CMSampleBufferIsValid NOT!");

            //No effect.
            //myStatus = CMSampleBufferMakeDataReady(newSampleBuffer);  //How is this different; CMSampleBufferSetDataReady ?
            //if (myStatus != 0) NSLog(@"CMSampleBufferMakeDataReady() myStatus: %i",myStatus);

            imageSourceTime = CMSampleBufferGetPresentationTimeStamp(newSampleBuffer);
            CMTimeShow(imageSourceTime);
            appendSuccessFlag = [inputWriterBuffer appendSampleBuffer:newSampleBuffer];
            //CMSampleBufferInvalidate(sampleBuffer); //Docs don't describe action. WTF does it do? Doesn't seem to affect my problem. Used with CMSampleBufferSetInvalidateCallback maybe?
            //CFRelease(sampleBuffer); //- Not surprisingly - “EXC_BAD_ACCESS”
        }

        if (!appendSuccessFlag)
        {
            NSLog(@"Failed to append pixel buffer");
        }
        else {
            writtenFrames++;
            NSLog(@"writtenFrames: %i", writtenFrames);
        }
    }

    //[self displayOuptutWritterStatus];    //Expect and see AVAssetWriterStatusWriting.
}

My setup routine.

- (IBAction) recordingStartStop: (id) sender
{
    NSError * error;

    if (self.isRecording) {
        NSLog(@"~~~~~~~~~ STOPPING RECORDING ~~~~~~~~~");
        self.isRecording = NO;
        [recordingStarStop setTitle: @"Record" forState: UIControlStateNormal];

        //[self.captureSession stopRunning];
        [inputWriterBuffer markAsFinished];
        [outputWriter endSessionAtSourceTime:imageSourceTime];
        [outputWriter finishWriting]; //Blocks until file is completely written, or an error occurs.
        NSLog(@"finished CMtime");
        CMTimeShow(imageSourceTime);

        //Really, I should loop through the outputs and close all of them or target specific ones.
        //Since I'm only recording video right now, I feel safe doing this.
        [self.captureSession removeOutput: [[self.captureSession outputs] objectAtIndex: 0]];

        [videoOutput release];
        [inputWriterBuffer release];
        [outputWriter release];
        videoOutput = nil;
        inputWriterBuffer = nil;
        outputWriter = nil;
        NSLog(@"~~~~~~~~~ STOPPED RECORDING ~~~~~~~~~");
        NSLog(@"Calling UIVideoAtPathIsCompatibleWithSavedPhotosAlbum.");
        NSLog(@"filePath: %@", [projectPaths movieFilePath]);
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum([projectPaths movieFilePath])) {
            NSLog(@"Calling UISaveVideoAtPathToSavedPhotosAlbum.");
            UISaveVideoAtPathToSavedPhotosAlbum ([projectPaths movieFilePath], self, @selector(video:didFinishSavingWithError: contextInfo:), nil);
        }
        NSLog(@"~~~~~~~~~ WROTE RECORDING to PhotosAlbum ~~~~~~~~~");
    }
    else {
        NSLog(@"~~~~~~~~~ STARTING RECORDING ~~~~~~~~~");
        projectPaths = [[ProjectPaths alloc] initWithProjectFolder: @"TestProject"];
        intervalFrames = 30;

        videoOutput = [[AVCaptureVideoDataOutput alloc] init];
        NSMutableDictionary * cameraVideoSettings = [[[NSMutableDictionary alloc] init] autorelease];
        NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
        NSNumber* value = [NSNumber numberWithUnsignedInt: kCVPixelFormatType_32BGRA]; //kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange];
        [cameraVideoSettings setValue: value forKey: key];
        [videoOutput setVideoSettings: cameraVideoSettings];
        [videoOutput setMinFrameDuration: CMTimeMake(20, 600)]; //CMTimeMake(1, 30)]; //30fps
        [videoOutput setAlwaysDiscardsLateVideoFrames: YES];

        queue = dispatch_queue_create("cameraQueue", NULL);
        [videoOutput setSampleBufferDelegate: self queue: queue];
        dispatch_release(queue);

        NSMutableDictionary *outputSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [outputSettings setValue: AVVideoCodecH264 forKey: AVVideoCodecKey];
        [outputSettings setValue: [NSNumber numberWithInt: 1280] forKey: AVVideoWidthKey]; //currently assuming
        [outputSettings setValue: [NSNumber numberWithInt: 720] forKey: AVVideoHeightKey];

        NSMutableDictionary *compressionSettings = [[[NSMutableDictionary alloc] init] autorelease];
        [compressionSettings setValue: AVVideoProfileLevelH264Main30 forKey: AVVideoProfileLevelKey];
        //[compressionSettings setValue: [NSNumber numberWithDouble:1024.0*1024.0] forKey: AVVideoAverageBitRateKey];
        [outputSettings setValue: compressionSettings forKey: AVVideoCompressionPropertiesKey];

        inputWriterBuffer = [AVAssetWriterInput assetWriterInputWithMediaType: AVMediaTypeVideo outputSettings: outputSettings];
        [inputWriterBuffer retain];
        inputWriterBuffer.expectsMediaDataInRealTime = YES;

        outputWriter = [AVAssetWriter assetWriterWithURL: [projectPaths movieURLPath] fileType: AVFileTypeQuickTimeMovie error: &error];
        [outputWriter retain];

        if (error) NSLog(@"error for outputWriter = [AVAssetWriter assetWriterWithURL:fileType:error:");
        if ([outputWriter canAddInput: inputWriterBuffer]) [outputWriter addInput: inputWriterBuffer];
        else NSLog(@"can not add input");

        if (![outputWriter canApplyOutputSettings: outputSettings forMediaType:AVMediaTypeVideo]) NSLog(@"outputSettings are NOT supported");

        if ([captureSession canAddOutput: videoOutput]) [self.captureSession addOutput: videoOutput];
        else NSLog(@"could not addOutput: videoOutput to captureSession");

        //[self.captureSession startRunning];
        self.isRecording = YES;
        [recordingStarStop setTitle: @"Stop" forState: UIControlStateNormal];

        writtenFrames = 0;
        imageSourceTime = kCMTimeZero;
        [outputWriter startWriting];
        //[outputWriter startSessionAtSourceTime: imageSourceTime];
        NSLog(@"~~~~~~~~~ STARTED RECORDING ~~~~~~~~~");
        NSLog (@"recording to fileURL: %@", [projectPaths movieURLPath]);
    }

    NSLog(@"isRecording: %@", self.isRecording ? @"YES" : @"NO");

    [self displayOuptutWritterStatus];  
}
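
For completeness, displayOuptutWritterStatus (called in both methods above) is just a logging helper. A sketch of what it does, using only the status and error properties of AVAssetWriter:

    - (void)displayOuptutWritterStatus
    {
        //AVAssetWriterStatus: 0 = unknown, 1 = writing, 2 = completed, 3 = failed, 4 = cancelled.
        NSLog(@"outputWriter status: %d", (int)outputWriter.status);
        if (outputWriter.error != nil) NSLog(@"outputWriter error: %@", outputWriter.error);
    }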

Source: Darren Reely | 2011-02-27
