Technical Note TN2404

New AV Foundation APIs in OS X Yosemite for Professional Video Workflows

AV Foundation in OS X Yosemite introduces new features for professional video workflows. These include classes for iterating over and examining the samples in a track's media, support for URL reference movies, writing of fragmented movie files, DV stream file format support, and uncompressed movie support.

This document provides a high-level overview of these new features.

New classes for media sample introspection and loading: AVSampleCursor and AVSampleBufferGenerator
URL Reference movie support
Fragmented movie support
DV Stream File Format Support
Uncompressed Movie Support
Document Revision History

New classes for media sample introspection and loading: AVSampleCursor and AVSampleBufferGenerator

AVSampleCursor

AVAssetTrack now supports methods to create AVSampleCursor objects. AVSampleCursor objects can be used to iterate over and examine the samples in a media (as could be done with the now-deprecated QuickTime APIs such as GetMediaNextInterestingTime, GetMediaSampleReference, and CopyMediaMutableSampleTable).

The simplest case is creating an instance of AVSampleCursor from an AVAssetTrack and positioning it at the first (or last) media sample in decode order. You should always first check the AVAssetTrack canProvideSampleCursors property to determine whether the asset can provide instances of AVSampleCursor for traversing its media samples and discovering information about them. See Listing 1.

Listing 1  Creating an AVSampleCursor object from an AVAssetTrack instance.

AVAssetTrack *assetTrack = <#An AVAssetTrack#>;
if (assetTrack.canProvideSampleCursors)
{
    // Create an instance of AVSampleCursor and position it at the receiver's
    // first media sample in decode order.
    AVSampleCursor *cursor = [assetTrack makeSampleCursorAtFirstSampleInDecodeOrder];
    // Do something interesting with the AVSampleCursor
}

An AVSampleCursor can also be created and positioned at or near a specified presentation timestamp:

Listing 2  Creating and initializing an AVSampleCursor object from an AVAssetTrack and a time stamp.

AVAssetTrack *assetTrack = <#An AVAssetTrack#>;
if (assetTrack.canProvideSampleCursors)
{
    CMTime presentationTimeStamp = <#A time stamp#>;
    // Create an instance of AVSampleCursor and position it at the specified time stamp
    AVSampleCursor *cursor =
        [assetTrack makeSampleCursorWithPresentationTimeStamp:presentationTimeStamp];
    // Do something interesting with the AVSampleCursor
}

An AVSampleCursor can traverse a given number of samples in decode order using the stepInDecodeOrderByCount: method, or in presentation order using the stepInPresentationOrderByCount: method. Similarly, you can reposition a cursor by a specific delta time on the decode or presentation timeline using the stepByDecodeTime:wasPinned: and stepByPresentationTime:wasPinned: methods, respectively.
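As a sketch (following the placeholder style of the listings above), a cursor might be stepped like this; the step methods report how far the cursor actually moved, which may be less than requested when the cursor reaches the beginning or end of the track:

```objc
AVSampleCursor *cursor = <#An AVSampleCursor#>;

// Step forward ten samples in decode order; the return value is the
// number of samples actually traversed.
int64_t samplesTraversed = [cursor stepInDecodeOrderByCount:10];

// Step backward three samples in presentation order.
samplesTraversed = [cursor stepInPresentationOrderByCount:-3];

// Move the cursor forward two seconds on the decode timeline.
// wasPinned is set to YES if the requested time fell outside the
// track and the cursor was pinned to the first or last sample.
BOOL wasPinned = NO;
[cursor stepByDecodeTime:CMTimeMake(2, 1) wasPinned:&wasPinned];
```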

You can compare the relative positions of two AVSampleCursor objects with the comparePositionInDecodeOrderWithPositionOfCursor: method. You can also test for a boundary in the reordering from decode order to presentation order with the samplesWithEarlierDecodeTimeStampsMayHaveLaterPresentationTimeStampsThanCursor: and samplesWithLaterDecodeTimeStampsMayHaveEarlierPresentationTimeStampsThanCursor: methods.
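For example, two cursors from the same track could be compared like this:

```objc
AVSampleCursor *cursorA = <#An AVSampleCursor#>;
AVSampleCursor *cursorB = <#Another AVSampleCursor from the same track#>;

NSComparisonResult order =
    [cursorA comparePositionInDecodeOrderWithPositionOfCursor:cursorB];
if (order == NSOrderedAscending)
{
    // cursorA's current sample precedes cursorB's in decode order.
}
```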

An AVSampleCursor object can provide various information about the media sample at its current position, such as its duration, presentation and decode timestamps, its ability to be decoded independently of other media samples, offset and length in its storage container, and whether or not the sample is intended to be loaded with other contiguous media samples in a "chunk".
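Assuming a positioned cursor, this information can be read directly from the cursor's properties:

```objc
AVSampleCursor *cursor = <#An AVSampleCursor#>;

CMTime duration = cursor.currentSampleDuration;
CMTime presentationTime = cursor.presentationTimeStamp;
CMTime decodeTime = cursor.decodeTimeStamp;

// The offset and length of the sample's data in its storage container.
AVSampleCursorStorageRange storageRange = cursor.currentSampleStorageRange;

// Whether the sample can be decoded independently of other samples.
AVSampleCursorSyncInfo syncInfo = cursor.currentSampleSyncInfo;
if (syncInfo.sampleIsFullSync)
{
    // This sample is a sync (key) sample.
}
```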

See AVSampleCursor.h and AVAssetTrack.h for more information.

AVSampleBufferGenerator

The AVSampleBufferGenerator class provides flexible services for loading samples referenced by an AVSampleCursor into CMSampleBuffer objects. Each request for CMSampleBuffer creation is described by an AVSampleBufferRequest object.

To use an AVSampleBufferGenerator, first create an AVSampleBufferRequest object and set its properties to configure your request:

Listing 3  Creating and configuring an AVSampleBufferRequest object.

// Create an AVSampleBufferRequest object from an AVSampleCursor.
AVSampleCursor *cursor = <#An AVSampleCursor#>;
AVSampleBufferRequest *sampleBufferRequest =
    [[AVSampleBufferRequest alloc] initWithStartCursor:cursor];
if (sampleBufferRequest)
{
    // Configure the AVSampleBufferRequest
    // See AVSampleBufferGenerator.h for more information
    sampleBufferRequest.direction = AVSampleBufferRequestDirectionForward;
    // Do something interesting with the AVSampleBufferRequest
}

You create an AVSampleBufferGenerator object from an AVAsset instance using the initWithAsset:timebase: method. Requests are serviced immediately if you pass NULL for the CMTimebase argument and set AVSampleBufferRequestModeImmediate as the request mode, or scheduled if you pass a valid CMTimebase and set AVSampleBufferRequestModeScheduled as the request mode.
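For example, a scheduled generator might be configured like this (a sketch; the timebase placeholder stands in for a CMTimebase you create or obtain elsewhere):

```objc
AVAsset *asset = <#The AVAsset#>;
CMTimebaseRef timebase = <#A CMTimebase driving scheduled delivery#>;

// Passing a valid timebase allows requests to be scheduled against it.
AVSampleBufferGenerator *generator =
    [[AVSampleBufferGenerator alloc] initWithAsset:asset timebase:timebase];

AVSampleBufferRequest *sampleBufferRequest = <#An AVSampleBufferRequest#>;
// Mark the request as scheduled rather than immediate.
sampleBufferRequest.mode = AVSampleBufferRequestModeScheduled;
```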

To generate a CMSampleBuffer from an AVSampleBufferGenerator object, call the AVSampleBufferGenerator createSampleBufferForRequest: method, passing your AVSampleBufferRequest instance as a parameter. See Listing 4.

Listing 4  Generating a CMSampleBuffer from an AVSampleBufferGenerator object.

AVSampleBufferRequest *sampleBufferRequest = <#An AVSampleBufferRequest for the AVAsset#>;
AVAsset *asset = <#The AVAsset#>;
AVSampleBufferGenerator *generator =
    [[AVSampleBufferGenerator alloc] initWithAsset:asset timebase:NULL];
if (generator)
{
    CMSampleBufferRef sampleBuffer =
        [generator createSampleBufferForRequest:sampleBufferRequest];
    if (sampleBuffer)
    {
        // Do something interesting with the sample buffer.
        // For example:
        // CMItemCount itemCount = CMSampleBufferGetNumSamples(sampleBuffer);
        // etc.
        // See CMSampleBuffer.h
        CFRelease(sampleBuffer);
    }
}

See AVSampleBufferGenerator.h for more information.

URL Reference movie support

In OS X Yosemite, AV Foundation supports reference movies, where the movie file may not actually contain the movie's sample data. Such movie files contain relative or absolute URLs to other files where sample data is stored. URL reference movies may be played via AVPlayer.

A new AVAssetReaderOutput subclass, AVAssetReaderSampleReferenceOutput, lets you read CMSampleBuffer objects that contain the absolute URL and offset of the sample data in place of the data itself. To create an AVAssetReaderSampleReferenceOutput object, you supply the asset track for which the resulting AVAssetReaderSampleReferenceOutput should provide sample references. The track must be one of the tracks contained by the target AVAssetReader's asset.

Clients can extract information about the location (file URL and offset) of samples in a track by adding an instance of AVAssetReaderSampleReferenceOutput to an AVAssetReader using the AVAssetReader addOutput: method as shown in Listing 5.

Listing 5  Creating an AVAssetReaderSampleReferenceOutput object and adding it as an AVAssetReader output.

AVAssetReader *assetReader = <#an asset reader#>;
AVAssetTrack *aTrack = <#an asset track contained by the asset reader's asset#>;
AVAssetReaderSampleReferenceOutput *sampleRefOutput =
    [AVAssetReaderSampleReferenceOutput assetReaderSampleReferenceOutputWithTrack:aTrack];
// Add AVAssetReaderSampleReferenceOutput to the AVAssetReader
[assetReader addOutput:sampleRefOutput];

No actual sample data can be extracted using the AVAssetReaderSampleReferenceOutput class. The location of the sample data is described by the kCMSampleBufferAttachmentKey_SampleReferenceURL and kCMSampleBufferAttachmentKey_SampleReferenceByteOffset attachments on the extracted sample buffer. Use the CMGetAttachment API to get these attachments from the sample buffer. Listing 6 shows an example.

Listing 6  Getting the kCMSampleBufferAttachmentKey_SampleReferenceURL and kCMSampleBufferAttachmentKey_SampleReferenceByteOffset attachments from a sample buffer.

CMSampleBufferRef sampleBuffer = <#a sample buffer#>;
CMAttachmentMode attachmentMode;
CFURLRef url = (CFURLRef)CMGetAttachment(sampleBuffer,
    kCMSampleBufferAttachmentKey_SampleReferenceURL, &attachmentMode);
if (url)
{
    // Do something interesting with the URL.
}
CFNumberRef byteOffset = (CFNumberRef)CMGetAttachment(sampleBuffer,
    kCMSampleBufferAttachmentKey_SampleReferenceByteOffset, &attachmentMode);
if (byteOffset)
{
    // Do something interesting with the byte offset.
}

More information about sample buffers describing sample references can be found in the CMSampleBuffer documentation. CMSampleBuffer objects constructed in this manner may be provided to AVAssetWriterInput to write sample references to new movie files.

A new AVAssetWriterInput property, sampleReferenceBaseURL, lets you specify a base URL against which sample reference URLs are rewritten as relative URLs. If the value of this property can be resolved as an absolute URL, the sample locations written to the file when appending sample references will be relative to this URL. The URL must point to a directory that is a parent of the sample reference location. For example, setting the sampleReferenceBaseURL property to "file:///User/johnappleseed/Movies/" and appending sample buffers whose kCMSampleBufferAttachmentKey_SampleReferenceURL attachment is set to "file:///User/johnappleseed/Movies/data/movie1.mov" causes the sample reference "data/movie1.mov" to be written to the movie.
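Continuing that example (the path is the same hypothetical one used above), the base URL might be set like this:

```objc
AVAssetWriterInput *writerInput = <#An AVAssetWriterInput#>;

// Sample references under this directory will be written as relative
// paths such as "data/movie1.mov".
writerInput.sampleReferenceBaseURL =
    [NSURL fileURLWithPath:@"/User/johnappleseed/Movies/"];
```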

See AVAssetReaderOutput.h, AVAssetWriterInput.h, and CMSampleBuffer.h for more information.

Fragmented movie support

AVFoundation's AVCaptureMovieFileOutput and AVAssetWriter APIs support writing fragmented movie files. Such files have a regular movie header near the beginning that describes some initial period of media data, followed by "fragments" consisting of a period of media data and then a "movie fragment" describing that additional data. This approach provides crash resilience, since even if file writing stops abruptly, only the most recent fragment can be lost.

A new AVAsset subclass, AVFragmentedMovie, is aware of the presence of movie fragments, and a companion class, AVFragmentedMovieMinder, can be used to request that AVFoundation periodically examine the movie file to see if more movie fragments have been appended. If they have, they are incorporated into the AVAsset and the newly added media data can immediately be played without needing to close and reopen the AVAsset. See AVMovie.h for additional information.
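As a sketch, a movie file that is still being written could be opened and minded like this (the ten-second minding interval is an arbitrary choice for illustration):

```objc
NSURL *movieURL = <#URL of a movie file that may still be growing#>;
AVFragmentedMovie *movie = [AVFragmentedMovie movieWithURL:movieURL options:nil];

// Ask AVFoundation to re-examine the file periodically so that newly
// appended movie fragments are incorporated into the asset.
AVFragmentedMovieMinder *minder =
    [AVFragmentedMovieMinder fragmentedMovieMinderWithMovie:movie
                                            mindingInterval:10.0];

// ... later, when updates are no longer needed:
[minder removeFragmentedMovie:movie];
```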

To enable writing of fragmented movie files using AVCaptureMovieFileOutput or AVAssetWriter, set the movieFragmentInterval property on these classes to the time that should elapse between writing successive movie fragments (the default value is kCMTimeInvalid, which means that movie fragments should not be used).
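For example, with an AVAssetWriter (the ten-second interval is an arbitrary choice for illustration):

```objc
AVAssetWriter *assetWriter = <#An AVAssetWriter configured for QuickTime movie output#>;

// Write a movie fragment every ten seconds. The default value of
// kCMTimeInvalid means no movie fragments are written.
assetWriter.movieFragmentInterval = CMTimeMake(10, 1);
```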

In OS X Yosemite, QuickTime Player X uses these new services, so if you open a fragmented movie file while it is being captured by another process or computer, you will see its duration periodically increase, and you can scrub into and play portions of the movie that had not yet been captured when you first opened it.

DV Stream File Format Support

Support for playback of DV stream files is now directly integrated into the AVFoundation/CoreMedia workflow on OS X Yosemite.

Uncompressed Movie Support

Uncompressed movie formats now supported on OS X Yosemite include:



Document Revision History


Date: 2014-12-01
Notes: New document that describes AV Foundation APIs in OS X Yosemite for professional video workflows.