AVVideoComposition sends wrong/inconsistent compositionTime

Originator: mself.com
Number: rdar://26831264
Date Originated: 15-Jun-2016 10:42 PM
Status: Open
Resolved:
Product: OS X SDK
Product Version: OS X 10.11.5
Classification:
Reproducible: Yes
 
Summary:
When using [AVVideoComposition videoCompositionWithAsset:applyingCIFiltersWithHandler:] to play a video, the time stamps passed in request.compositionTime do not correspond to presentation timestamps of the source video asset when playing the video with AVPlayerView.

However, when you single step with AVPlayerView, the time stamps *do* correspond to presentation timestamps of the source video asset.  The timebase of the time stamps is different in these two cases.

The inconsistency of the compositionTime time stamps makes it difficult to build video filters that require high timing accuracy.

Steps to Reproduce:
I have attached a sample program that reproduces the problem.  Here is a high-level summary of the sample program:

1) Load an MPEG-4 video (from a GoPro) into an AVAsset.
    - Note that the nominalFrameRate = 59.84, which is wrong (ffprobe reports 59.94).
    - Note that (1.0 / minFrameDuration) = 59.94, which is correct.
2) Read the first few frames of the video using AVAssetReader.
    - The PTS values are multiples of 0.016683 (1001/60000), which corresponds to 59.94 fps.
3) Create a video composition that just returns the source frames unmodified.
4) Play the composition with an AVPlayer.
    - The composition times are multiples of 0.016711 (1504/90000), which corresponds to 59.84 fps.
    - If you pause the video and use the stepping controls, now the composition times are multiples of 0.016683 (1001/60000), which corresponds to 59.94 fps.
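The two timebases observed in steps 2 and 4 can be checked with plain rational arithmetic. The sketch below (pure Swift, no AVFoundation) confirms that 1001/60000 corresponds to 59.94 fps while 1504/90000 corresponds to 59.84 fps, and shows how quickly the two clocks drift apart:

```swift
import Foundation

// Frame duration implied by the AVAssetReader PTS values (59.94 fps source).
let readerDuration = 1001.0 / 60000.0      // ≈ 0.016683 s
// Frame duration observed in request.compositionTime during playback.
let playbackDuration = 1504.0 / 90000.0    // ≈ 0.016711 s

let readerFPS = 1.0 / readerDuration       // ≈ 59.94
let playbackFPS = 1.0 / playbackDuration   // ≈ 59.84
print(String(format: "reader: %.2f fps, playback: %.2f fps", readerFPS, playbackFPS))

// The mismatch accumulates with frame index: after 1000 frames the two
// clocks disagree by roughly 28 ms, i.e. more than a whole frame.
let drift = 1000.0 * (playbackDuration - readerDuration)
print(String(format: "drift after 1000 frames: %.4f s", drift))
```

So any per-frame lookup keyed on the reader's PTS values falls out of sync with the playback composition times within a few hundred frames.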

Expected Results:
There are three expected results:
1) The frame rate reported by nominalFrameRate should be 59.94 for this video rather than 59.84.
2) The values returned by request.compositionTime should correspond to the presentation times of the source video (the same ones returned by CMSampleBufferGetPresentationTimeStamp when using AVAssetReader).
3) The composition times should behave consistently whether AVPlayer is in playing mode or stepping mode.

Actual Results:
The actual results are included in the "Steps to Reproduce" section.

This issue is a problem for me because I am building a CIFilter that requires high timing accuracy.  I process the video in a first pass (while recording info about each frame), and then I play the video back with a CIFilter that needs to look up the values from the first pass.  This is not possible if the time stamps from request.compositionTime are incorrect or inconsistent.
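A hedged sketch of the kind of lookup this breaks: if the first-pass data is keyed by frame index, the index must be recovered from compositionTime by rounding against the source frame duration. The names here (frameDuration, frameIndex(for:)) are illustrative, not from the attached sample. The rounding only works when the composition times share the source's timebase, which is exactly what playing mode violates:

```swift
// Source frame duration for this 59.94 fps GoPro clip, as reported by the
// track's minFrameDuration: 1001/60000 s.
let frameDuration = 1001.0 / 60000.0

// Illustrative helper: map a composition time (in seconds) to a frame index
// by rounding to the nearest multiple of the source frame duration.
func frameIndex(for compositionSeconds: Double) -> Int {
    Int((compositionSeconds / frameDuration).rounded())
}

// In stepping mode the composition times are true source PTS values, so the
// mapping is exact:
let steppingTime = 400.0 * frameDuration          // frame 400's PTS
print(frameIndex(for: steppingTime))              // 400

// In playing mode the times advance at 1504/90000 s per frame, so the
// recovered index has drifted a full frame off by frame 400:
let playingTime = 400.0 * (1504.0 / 90000.0)
print(frameIndex(for: playingTime))               // 401
```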

Version:
Xcode Version 7.3 (7D175)
OS X 10.11.5


Notes:
Possibly related to rdar://15376688 ("AVFoundation lacking access to accurate timing of input frames when using custom video compositor")

Configuration:
iMac (27-inch, Late 2013)

Attachments:
'ViewController.h', 'AVAssetDemo.app.zip', 'ViewController.m' and 'log.txt' were successfully uploaded.

Comments

I found a solution

See http://stackoverflow.com/questions/37754673/how-do-you-access-the-frame-number-inside-avasynchronousciimagefilteringrequest/37957066#37957066 for details.

