New AV Foundation APIs in OS X Yosemite for Professional Video Workflows
AV Foundation in OS X Yosemite introduces new features for professional video workflows. These include classes for iterating over and examining the samples in a track's media, support for URL reference movies, writing of fragmented movie files, and enhancements to DV file format and uncompressed movie support.
This document provides a high-level overview of these new features.
New classes for media sample introspection and loading: AVSampleCursor and AVSampleBufferGenerator
AVSampleCursor
AVAssetTrack now supports methods to create AVSampleCursor objects. AVSampleCursor objects can be used to iterate over and examine the samples in a track's media (as could be done with the now-deprecated QuickTime APIs such as GetMediaNextInterestingTime, GetMediaSampleReference, and CopyMediaMutableSampleTable).
The simplest case is creating an instance of AVSampleCursor from an AVAssetTrack and positioning it at the first (or last) media sample in decode order. You should always first check the AVAssetTrack canProvideSampleCursors property to determine whether the asset can provide instances of AVSampleCursor for traversing its media samples and discovering information about them. See Listing 1.
Listing 1  Creating an AVSampleCursor object from an AVAssetTrack instance.
AVAssetTrack *assetTrack = <#An AVAssetTrack#>;
if (assetTrack.canProvideSampleCursors)
{
    // Create an instance of AVSampleCursor and position it at the receiver's
    // first media sample in decode order.
    AVSampleCursor *cursor = [assetTrack makeSampleCursorAtFirstSampleInDecodeOrder];
    // Do something interesting with the AVSampleCursor
}
An AVSampleCursor can also be created and positioned at or near a specified presentation timestamp, as shown in Listing 2.
Listing 2  Creating and initializing an AVSampleCursor object from an AVAssetTrack instance and a time stamp.
AVAssetTrack *assetTrack = <#An AVAssetTrack#>;
if (assetTrack.canProvideSampleCursors)
{
    CMTime presentationTimeStamp = <#A time stamp#>;
    // Create an instance of AVSampleCursor and position it at the specified time stamp
    AVSampleCursor *cursor =
        [assetTrack makeSampleCursorWithPresentationTimeStamp:presentationTimeStamp];
    // Do something interesting with the AVSampleCursor
}
An AVSampleCursor can traverse a given number of samples in decode order using the stepInDecodeOrderByCount: method, or in presentation order using the stepInPresentationOrderByCount: method. Similarly, you can reposition a cursor by a specific delta time on the decode or presentation timeline using the stepByDecodeTime:wasPinned: and stepByPresentationTime:wasPinned: methods, respectively.
You can compare the relative positions of two AVSampleCursor objects with the comparePositionInDecodeOrderWithPositionOfCursor: method. You can also test a boundary in the reordering from decode order to presentation order with the samplesWithEarlierDecodeTimeStampsMayHaveLaterPresentationTimeStampsThanCursor: and samplesWithLaterDecodeTimeStampsMayHaveEarlierPresentationTimeStampsThanCursor: methods.
An AVSampleCursor object can provide various information about the media sample at its current position, such as its duration, its presentation and decode timestamps, its ability to be decoded independently of other media samples, its offset and length in its storage container, and whether or not the sample is intended to be loaded with other contiguous media samples in a "chunk". See AVSampleCursor.h and AVAssetTrack.h for more information.
AVSampleBufferGenerator
The AVSampleBufferGenerator class provides flexible services for loading the samples referenced by AVSampleCursor objects into CMSampleBuffer objects. Each request for CMSampleBuffer creation is described by an AVSampleBufferRequest object.
To use an AVSampleBufferGenerator, first create an AVSampleBufferRequest object and set its properties to configure your request:
Listing 3  Creating and configuring an AVSampleBufferRequest object.
// Create an AVSampleBufferRequest object from an AVSampleCursor.
AVSampleCursor *cursor = <#An AVSampleCursor#>;
AVSampleBufferRequest *sampleBufferRequest =
    [[AVSampleBufferRequest alloc] initWithStartCursor:cursor];
if (sampleBufferRequest)
{
    // Configure the AVSampleBufferRequest
    // See AVSampleBufferGenerator.h for more information
    sampleBufferRequest.direction = AVSampleBufferRequestDirectionForward;
    // Do something interesting with the AVSampleBufferRequest
}
You create an AVSampleBufferGenerator object from an AVAsset instance using the initWithAsset:timebase: method. The request is handled immediately if you pass NULL for the CMTimebase argument or set AVSampleBufferRequestModeImmediate as the request's mode; it is scheduled if you pass a valid CMTimebase and set AVSampleBufferRequestModeScheduled as the mode.
To generate a CMSampleBuffer from an AVSampleBufferGenerator object, call the AVSampleBufferGenerator createSampleBufferForRequest: method, passing your AVSampleBufferRequest instance as a parameter. See Listing 4.
Listing 4  Generating a CMSampleBuffer from an AVSampleBufferGenerator object.
AVSampleBufferRequest *sampleBufferRequest = <#An AVSampleBufferRequest for the AVAsset#>;
AVAsset *asset = <#The AVAsset#>;
AVSampleBufferGenerator *generator =
    [[AVSampleBufferGenerator alloc] initWithAsset:asset timebase:NULL];
if (generator)
{
    CMSampleBufferRef sampleBuffer =
        [generator createSampleBufferForRequest:sampleBufferRequest];
    if (sampleBuffer)
    {
        // Do something interesting with the sample buffer.
        // For example:
        // CMItemCount itemCount = CMSampleBufferGetNumSamples(sampleBuffer);
        // etc.
        // See CMSampleBuffer.h
        CFRelease(sampleBuffer);
    }
}
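For scheduled loading, the sketch below passes a valid CMTimebase to initWithAsset:timebase: and sets the request's mode accordingly. The timebase setup shown here (a host-clock timebase running at rate 1.0) is one plausible configuration, not the only one:

```objc
// Sketch: a generator that schedules sample data loading against a timebase.
CMTimebaseRef timebase = NULL;
CMTimebaseCreateWithMasterClock(kCFAllocatorDefault, CMClockGetHostTimeClock(), &timebase);
CMTimebaseSetRate(timebase, 1.0);

AVSampleBufferGenerator *scheduledGenerator =
    [[AVSampleBufferGenerator alloc] initWithAsset:asset timebase:timebase];
sampleBufferRequest.mode = AVSampleBufferRequestModeScheduled;

// The returned sample buffer's data may not be loaded yet; loading is
// scheduled for just-in-time delivery relative to the timebase.
CMSampleBufferRef sampleBuffer =
    [scheduledGenerator createSampleBufferForRequest:sampleBufferRequest];
if (sampleBuffer)
    CFRelease(sampleBuffer);
CFRelease(timebase);
```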
See AVSampleBufferGenerator.h for more information.
URL Reference movie support
In OS X Yosemite, AV Foundation supports reference movies, where the movie file may not actually contain the movie's sample data. Such movie files contain relative or absolute URLs to other files where the sample data is stored. URL reference movies may be played via AVPlayer.
A new AVAssetReaderOutput subclass, AVAssetReaderSampleReferenceOutput, lets you read CMSampleBuffer objects that contain the absolute URL and offset in place of the sample data. To create an AVAssetReaderSampleReferenceOutput object, you supply the asset track for which the resulting AVAssetReaderSampleReferenceOutput should provide sample references. The track must be one of the tracks contained by the target AVAssetReader's asset.
Clients can extract information about the location (file URL and offset) of samples in a track by adding an instance of AVAssetReaderSampleReferenceOutput to an AVAssetReader using the AVAssetReader addOutput: method, as shown in Listing 5.
Listing 5  Creating an AVAssetReaderSampleReferenceOutput object and adding it as an AVAssetReader output.
AVAssetReader *assetReader = <#an asset reader#>;
AVAssetTrack *aTrack = <#an asset track contained by the asset reader's asset#>;
AVAssetReaderSampleReferenceOutput *sampleRefOutput =
    [AVAssetReaderSampleReferenceOutput assetReaderSampleReferenceOutputWithTrack:aTrack];
// Add the AVAssetReaderSampleReferenceOutput to the AVAssetReader
[assetReader addOutput:sampleRefOutput];
No actual sample data can be extracted using the AVAssetReaderSampleReferenceOutput class. Instead, the location of the sample data is described by the kCMSampleBufferAttachmentKey_SampleReferenceURL and kCMSampleBufferAttachmentKey_SampleReferenceByteOffset attachments on each extracted sample buffer. Use the CMGetAttachment API to get these attachments from the sample buffer. Listing 6 shows an example.
Listing 6  Getting the kCMSampleBufferAttachmentKey_SampleReferenceURL and kCMSampleBufferAttachmentKey_SampleReferenceByteOffset attachments from a sample buffer.
CMSampleBufferRef sampleBuffer = <#a sample buffer#>;
CMAttachmentMode attachmentMode;
CFURLRef url = (CFURLRef)CMGetAttachment(sampleBuffer,
    kCMSampleBufferAttachmentKey_SampleReferenceURL, &attachmentMode);
if (url)
{
    // Do something interesting with the URL.
}
CFNumberRef byteOffset = (CFNumberRef)CMGetAttachment(sampleBuffer,
    kCMSampleBufferAttachmentKey_SampleReferenceByteOffset, &attachmentMode);
if (byteOffset)
{
    // Do something interesting with the byte offset.
}
More information about sample buffers describing sample references can be found in the CMSampleBuffer documentation. CMSampleBuffer objects constructed in this manner may be provided to AVAssetWriterInput to write sample references to new movie files.
A new AVAssetWriterInput property, sampleReferenceBaseURL, lets you specify a base URL against which the sample reference URLs should be rewritten as relative URLs. If the value of this property can be resolved as an absolute URL, the sample locations written to the file when appending sample references will be relative to this URL. The URL must point to a directory that is a parent of the sample reference location. For example, setting the sampleReferenceBaseURL property to "file:///User/johnappleseed/Movies/" and appending sample buffers with the kCMSampleBufferAttachmentKey_SampleReferenceURL attachment set to "file:///User/johnappleseed/Movies/data/movie1.mov" will cause the sample reference "data/movie1.mov" to be written to the movie.
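A sketch of setting this property on an AVAssetWriterInput before appending sample-reference buffers; the path mirrors the example above:

```objc
// Sketch: writing sample references relative to a base URL.
AVAssetWriterInput *writerInput = <#an AVAssetWriterInput attached to an AVAssetWriter#>;
writerInput.sampleReferenceBaseURL =
    [NSURL fileURLWithPath:@"/User/johnappleseed/Movies/" isDirectory:YES];

// Sample buffers carrying kCMSampleBufferAttachmentKey_SampleReferenceURL
// attachments are appended as usual; their URLs are written relative to the base URL.
CMSampleBufferRef sampleReferenceBuffer = <#a sample buffer describing a sample reference#>;
if (writerInput.readyForMoreMediaData)
{
    [writerInput appendSampleBuffer:sampleReferenceBuffer];
}
```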
See AVAssetReaderOutput.h, AVAssetWriterInput.h, and CMSampleBuffer.h for more information.
Fragmented movie support
AVFoundation's AVCaptureMovieFileOutput and AVAssetWriter APIs support writing fragmented movie files. Such files have a regular movie header near the beginning that describes some initial period of media data, followed by "fragments" consisting of a period of media data and then a "movie fragment" describing that additional data. This approach provides crash resilience: even if file writing stops abruptly, only the most recent fragment can be lost.
A new AVAsset subclass, AVFragmentedMovie, is aware of the presence of movie fragments, and a companion class, AVFragmentedMovieMinder, can be used to request that AVFoundation periodically examine the movie file to see whether more movie fragments have been appended. If they have, they are incorporated into the AVAsset, and the newly added media data can be played immediately without needing to close and reopen the AVAsset. See AVMovie.h for additional information.
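A sketch of opening a growing movie file and asking AVFoundation to mind it; the class and factory-method names are as declared in AVMovie.h, and the 10-second minding interval is an arbitrary choice:

```objc
// Sketch: minding a movie file that is still being written.
NSURL *movieURL = <#URL of a fragmented movie file#>;
AVFragmentedMovie *movie = [AVFragmentedMovie movieWithURL:movieURL options:nil];

// Ask AVFoundation to re-examine the file every 10 seconds for new fragments.
AVFragmentedMovieMinder *minder =
    [AVFragmentedMovieMinder fragmentedMovieMinderWithMovie:movie mindingInterval:10.0];
// As fragments are found, the movie's tracks and duration update in place.
```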
To enable writing of fragmented movie files using AVCaptureMovieFileOutput or AVAssetWriter, set the movieFragmentInterval property on these classes to the time you want to elapse between the writing of successive movie fragments (the default value is kCMTimeInvalid, which means that movie fragments are not used).
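For example (the 10-second interval is an arbitrary choice):

```objc
// Sketch: enabling movie fragments on an AVAssetWriter.
AVAssetWriter *assetWriter = <#an AVAssetWriter#>;
assetWriter.movieFragmentInterval = CMTimeMakeWithSeconds(10.0, 600);

// The same property exists on AVCaptureMovieFileOutput.
AVCaptureMovieFileOutput *movieFileOutput = <#an AVCaptureMovieFileOutput#>;
movieFileOutput.movieFragmentInterval = CMTimeMakeWithSeconds(10.0, 600);
```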
In OS X Yosemite, QuickTime Player X uses these new services, so if you open a fragmented movie file while it is being captured by another process or computer, you will see its duration periodically increase, and you can scrub into and play portions of the movie that had not yet been captured when you first opened it.
DV Stream File Format Support
Support for playback of DV stream files is now directly integrated into the AVFoundation/CoreMedia workflow on OS X Yosemite.
Uncompressed Movie Support
Uncompressed movie formats now supported on OS X Yosemite include:
R10k
v210
2vuy
Document Revision History
Date | Notes
---|---
2014-12-01 | New document that describes AV Foundation APIs in OS X Yosemite for professional video workflows.