2013-01-02

I'm writing an MP4 video file with AVAssetWriter using an AVAssetWriterInputPixelBufferAdaptor. AVAssetWriter sometimes fails with status AVAssetWriterStatusFailed. It seems random.

The source is camera video from UIImagePickerController, either freshly recorded with the camera or taken from the asset library. The quality right now is UIImagePickerControllerQualityTypeMedium.

Sometimes the writer fails. Its status is AVAssetWriterStatusFailed, and the AVAssetWriter object's error property is:

Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" 
UserInfo=0xf5d8990 {NSLocalizedFailureReason=An unknown error occurred (-536870210), 
NSUnderlyingError=0x4dd8e0 "The operation couldn’t be completed. (OSStatus error -536870210.)", 
NSLocalizedDescription=The operation could not be completed 
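
For reference, the raw OSStatus can be read from the error's userInfo via NSUnderlyingErrorKey. A minimal sketch, assuming it runs somewhere writer.error is non-nil (for example in the AVAssetWriterStatusFailed branch in the code below):

    NSError *underlying = [writer.error.userInfo objectForKey:NSUnderlyingErrorKey]; 
    DLog(@"Underlying OSStatus: %ld", (long)underlying.code); // -536870210 in the failing runs 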

The error occurs in roughly 20% of runs. It seems to fail more often on iPhone 4/4S than on iPhone 5.

It also occurs more frequently when the source video quality is higher. With UIImagePickerControllerQualityTypeLow the error doesn't happen as often; with UIImagePickerControllerQualityTypeHigh it happens somewhat more often.

I've also noticed something else: the failures seem to come in waves. When it fails, the following runs often fail too, even if I delete the app and reinstall it. That makes me wonder whether my program leaks memory, and whether that memory could stay alive even after the app is killed (is that even possible?).

Here is the code I use to render my movie:

- (void)writeVideo 
{ 
    offlineRenderingInProgress = YES; 

/* --- Writer Setup --- */ 

    [locationQueue cancelAllOperations]; 

    [self stopWithoutRewinding]; 

    NSError *writerError = nil; 

    BOOL success; 

    success = [[NSFileManager defaultManager] removeItemAtURL:self.outputURL error:nil]; 

    // DLog(@"URL: %@, success: %i", self.outputURL, success); 

    writer = [AVAssetWriter assetWriterWithURL:self.outputURL fileType:(NSString *)kUTTypeQuickTimeMovie error:&writerError]; 
    //writer.shouldOptimizeForNetworkUse = NO; 

    if (writerError) { 
     DLog(@"Writer error: %@", writerError); 
     return; 
    } 

    float bitsPerPixel; 
    CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions((__bridge CMVideoFormatDescriptionRef)([readerVideoOutput.videoTracks[0] formatDescriptions][0])); 
    int numPixels = dimensions.width * dimensions.height; 
    int bitsPerSecond; 

    // Assume that lower-than-SD resolutions are intended for streaming, and use a lower bitrate 
    if (numPixels < (640 * 480)) 
     bitsPerPixel = 4.05; // This bitrate matches the quality produced by AVCaptureSessionPresetMedium or Low. 
    else 
     bitsPerPixel = 11.4; // This bitrate matches the quality produced by AVCaptureSessionPresetHigh. 

    bitsPerSecond = numPixels * bitsPerPixel; 

    // Apply the computed average bit rate; without this key the bitsPerSecond calculation above goes unused. 
    NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys: 
              AVVideoCodecH264, AVVideoCodecKey, 
              [NSNumber numberWithInteger:videoSize.width], AVVideoWidthKey, 
              [NSNumber numberWithInteger:videoSize.height], AVVideoHeightKey, 
              [NSDictionary dictionaryWithObjectsAndKeys: 
              [NSNumber numberWithInteger:bitsPerSecond], AVVideoAverageBitRateKey, 
              [NSNumber numberWithInteger:30], AVVideoMaxKeyFrameIntervalKey, 
              nil], AVVideoCompressionPropertiesKey, 
              nil]; 

    writerVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoCompressionSettings]; 
    writerVideoInput.transform = movie.preferredTransform; 
    writerVideoInput.expectsMediaDataInRealTime = YES; 
    [writer addInput:writerVideoInput]; 

    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: 
                 [NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil]; 

    writerPixelAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerVideoInput 
                         sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary]; 
    BOOL couldStart = [writer startWriting]; 

    if (!couldStart) { 
     DLog(@"Could not start AVAssetWriter!"); 
     abort = YES; 
     [locationQueue cancelAllOperations]; 
     return; 
    } 

    [self configureFilters]; 

    CIContext *offlineRenderContext = [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @NO}]; 


    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB(); 

    if (!self.canEdit) { 
     [self createVideoReaderWithAsset:movie timeRange:CMTimeRangeFromTimeToTime(kCMTimeZero, kCMTimePositiveInfinity) forOfflineRender:YES]; 
    } else { 
     [self createVideoReaderWithAsset:movie timeRange:CMTimeRangeWithNOVideoRangeInDuration(self.thumbnailEditView.range, movie.duration) forOfflineRender:YES]; 
    } 

    CMTime startOffset = reader.timeRange.start; 

    DLog(@"startOffset: %llu", startOffset.value); 

    [self.thumbnailEditView removeFromSuperview]; 
    // self.thumbnailEditView = nil; 

    [glLayer removeFromSuperlayer]; 
    glLayer = nil; 

    [playerView removeFromSuperview]; 
    playerView = nil; 

    glContext = nil; 



    [writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{ 

     @try { 


     BOOL didWriteSomething = NO; 

     DLog(@"Preparing to write..."); 

     while ([writerVideoInput isReadyForMoreMediaData]) { 

      if (abort) { 
       NSLog(@"Abort == YES"); 
       [locationQueue cancelAllOperations]; 
       [writerVideoInput markAsFinished]; 
       videoConvertCompletionBlock(NO, writer.error.localizedDescription); 
       return; // Stop pulling samples once an abort has been requested. 
      } 

      if (writer.status == AVAssetWriterStatusFailed) { 
       DLog(@"Writer.status: AVAssetWriterStatusFailed, error: %@", writer.error); 
       DLog(@"Source file exists: %i", [[NSFileManager defaultManager] fileExistsAtPath:movie.URL.relativePath]); 

       [[NSUserDefaults standardUserDefaults] setObject:[NSNumber numberWithInt:1] forKey:@"QualityOverride"]; 
       [[NSUserDefaults standardUserDefaults] synchronize]; 

       abort = YES; 
       [locationQueue cancelAllOperations]; 
       videoConvertCompletionBlock(NO, writer.error.localizedDescription); 
       return; 
      } 

      DLog(@"Writing started..."); 

      CMSampleBufferRef buffer = nil; 

      if (reader.status != AVAssetReaderStatusUnknown) { 

       if (reader.status == AVAssetReaderStatusReading) { 
        buffer = [readerVideoOutput copyNextSampleBuffer]; 
        if (didWriteSomething == NO) { 
         DLog(@"Copying sample buffers..."); 
        } 
       } 

       if (!buffer) { 

        [writerVideoInput markAsFinished]; 

        DLog(@"Finished..."); 

        CGColorSpaceRelease(colorSpace); 

        [self offlineRenderingDidFinish]; 


        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{ 

         [writer finishWriting]; 
         if (writer.error != nil) { 
          DLog(@"Error: %@", writer.error); 
         } else { 
          DLog(@"Succes!"); 
         } 

         if (writer.status == AVAssetWriterStatusCompleted) { 

          videoConvertCompletionBlock(YES, nil); 
         } 

         else { 
          abort = YES; 
          videoConvertCompletionBlock(NO, writer.error.localizedDescription); 
         } 

        }); 


        return; 
       } 

       didWriteSomething = YES; 
      } 
      else { 

       DLog(@"Still waiting..."); 
       //Reader just needs a moment to get ready... 
       continue; 
      } 

      CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer); 

      if (pixelBuffer == NULL) { 
       DLog(@"Pixelbuffer == NULL"); 
       CFRelease(buffer); // Release the sample buffer before skipping this frame. 
       continue; 
      } 

      //DLog(@"Sample call back! Pixelbuffer: %lu", CVPixelBufferGetHeight(pixelBuffer)); 

      //NSDictionary *options = [NSDictionary dictionaryWithObject:(__bridge id)CGColorSpaceCreateDeviceRGB() forKey:kCIImageColorSpace]; 

      CIImage *ciimage = [CIImage imageWithCVPixelBuffer:pixelBuffer options:nil]; 

      CIImage *outputImage = [self filteredImageWithImage:ciimage]; 


      CVPixelBufferRef outPixelBuffer = NULL; 
      CVReturn status; 

      CFDictionaryRef empty; // empty value for attr value. 
      CFMutableDictionaryRef attrs; 
      empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary 
             NULL, 
             NULL, 
             0, 
             &kCFTypeDictionaryKeyCallBacks, 
             &kCFTypeDictionaryValueCallBacks); 

      attrs = CFDictionaryCreateMutable(kCFAllocatorDefault, 
               1, 
               &kCFTypeDictionaryKeyCallBacks, 
               &kCFTypeDictionaryValueCallBacks); 

      CFDictionarySetValue(attrs, 
           kCVPixelBufferIOSurfacePropertiesKey, 
           empty); 

      CFDictionarySetValue(attrs, 
           kCVPixelBufferCGImageCompatibilityKey, 
           (__bridge const void *)([NSNumber numberWithBool:YES])); 

      CFDictionarySetValue(attrs, 
           kCVPixelBufferCGBitmapContextCompatibilityKey, 
           (__bridge const void *)([NSNumber numberWithBool:YES])); 


      status = CVPixelBufferCreate(kCFAllocatorDefault, (size_t)ciimage.extent.size.width, (size_t)ciimage.extent.size.height, kCVPixelFormatType_32BGRA, attrs, &outPixelBuffer); 

      // The attribute dictionaries are recreated on every iteration, so release them here to avoid leaking them. 
      CFRelease(attrs); 
      CFRelease(empty); 

      //DLog(@"Output image size: %f, %f, pixelbuffer height: %lu", outputImage.extent.size.width, outputImage.extent.size.height, CVPixelBufferGetHeight(outPixelBuffer)); 

      if (status != kCVReturnSuccess) { 
       DLog(@"Couldn't allocate output pixelBufferRef!"); 
       CFRelease(buffer); // Also release the sample buffer on this early exit. 
       continue; 
      } 

      [offlineRenderContext render:outputImage toCVPixelBuffer:outPixelBuffer bounds:outputImage.extent colorSpace:colorSpace]; 

      CMTime currentSourceTime = CMSampleBufferGetPresentationTimeStamp(buffer); 
      CMTime currentTime = CMTimeSubtract(currentSourceTime, startOffset); 
      CMTime duration = reader.timeRange.duration; 
      if (CMTIME_IS_POSITIVE_INFINITY(duration)) { 
       duration = movie.duration; 
      } 
      CMTime durationConverted = CMTimeConvertScale(duration, currentTime.timescale, kCMTimeRoundingMethod_Default); 

      float durationFloat = (float)durationConverted.value; 
      float progress = ((float) currentTime.value)/durationFloat; 

      //DLog(@"duration : %f, progress: %f", durationFloat, progress); 

      [self updateOfflineRenderProgress:progress]; 

      // pixelBuffer was already checked for NULL above. 
      if (writerVideoInput.readyForMoreMediaData) { 
       [writerPixelAdaptor appendPixelBuffer:outPixelBuffer withPresentationTime:currentTime]; 
      } else { 
       // Release both buffers before skipping, otherwise every skipped frame leaks. 
       CFRelease(buffer); 
       CVPixelBufferRelease(outPixelBuffer); 
       continue; 
      } 

      if (writer.status == AVAssetWriterStatusWriting) { 
       DLog(@"Writer.status: AVAssetWriterStatusWriting"); 
      } 

      CFRelease(buffer); 
      CVPixelBufferRelease(outPixelBuffer); 
     } 

     } 

     @catch (NSException *exception) { 
      DLog(@"Catching exception: %@", exception); 
     } 

    }]; 

} 

Your CIContext options are backwards. I'm guessing you meant to write CIContext *offlineRenderContext = [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @NO}];


Yes, of course. I've corrected it in the post.

Answer


OK, I think I solved this myself. The culprit was this line:

[writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{ .... 

The global queue being passed in is a concurrent queue. That allows a new callback to be invoked before the previous one has finished, and the asset writer is not designed to be written to from more than one thread at a time.

Creating and using a new serial queue seems to fix the problem:

assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL); 

[writerVideoInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{... 
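
For completeness, a minimal sketch of the whole pattern on the serial queue. It assumes writer, writerVideoInput, and readerVideoOutput are set up as in the question, and it elides the CIImage filtering and pixel buffer adaptor work by appending the source samples directly:

    assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL); 

    [writerVideoInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{ 
        // On a serial queue the block runs one invocation at a time, 
        // so the writer is never fed from two threads at once. 
        while ([writerVideoInput isReadyForMoreMediaData]) { 
            CMSampleBufferRef buffer = [readerVideoOutput copyNextSampleBuffer]; 
            if (!buffer) { 
                // Source exhausted: close the input and finalize the file. 
                [writerVideoInput markAsFinished]; 
                [writer finishWriting]; 
                return; 
            } 
            [writerVideoInput appendSampleBuffer:buffer]; 
            CFRelease(buffer); 
        } 
    }]; 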