[222225] trunk

Revision: 222225
Author: jer.no...@apple.com
Date: 2017-09-19 14:15:46 -0700 (Tue, 19 Sep 2017)

Log Message

[Cocoa] Add an ImageDecoder subclass backed by AVFoundation
https://bugs.webkit.org/show_bug.cgi?id=176825

Reviewed by Eric Carlson.

Source/WebCore:

Add a new concrete subclass of ImageDecoder which uses AVFoundation to parse and decode
image data.

AVFoundation APIs require prior knowledge of the media data's MIME type to determine whether
the media data is decodable, so the MIME type information must be passed through from the
CachedResource -> CachedImage -> ImageFrameCache -> ImageSource so as to be available when
creating the ImageDecoder:

(Drive-by fix: the createFrameImageAtIndex() method will mutate internal state, so make it
non-const.)
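
The plumbing above ends in a factory decision keyed on the MIME type. A minimal sketch of that kind of dispatch (illustrative names only; the real logic lives in ImageDecoder::create and ImageDecoderAVFObjC::canDecodeType, which consults AVFoundation rather than a string prefix):

```cpp
#include <cassert>
#include <string>

// Hypothetical model of choosing a decoder backend from the response MIME
// type. WebKit's actual factory asks the AVFoundation-backed decoder whether
// it can handle the type and otherwise falls back to the CoreGraphics one.
enum class DecoderBackend { CoreGraphics, AVFoundation };

static bool avFoundationCanDecode(const std::string& mimeType)
{
    // Stand-in check: AVFoundation needs the MIME type up front to decide
    // decodability; here we approximate that with a "video/" prefix test.
    return mimeType.rfind("video/", 0) == 0;
}

DecoderBackend chooseDecoder(const std::string& mimeType)
{
    if (avFoundationCanDecode(mimeType))
        return DecoderBackend::AVFoundation;
    return DecoderBackend::CoreGraphics;
}
```

This is why the MIME type has to flow all the way from CachedResource to ImageSource: without it, the factory cannot tell a movie-as-image apart from a conventional image format before any bytes are parsed.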

* loader/cache/CachedImage.h:
* loader/cache/CachedResource.h:
(WebCore::CachedResource::mimeType const):
* platform/cf/CoreMediaSoftLink.cpp:
* platform/cf/CoreMediaSoftLink.h:
* platform/cocoa/VideoToolboxSoftLink.cpp:
* platform/cocoa/VideoToolboxSoftLink.h:
* platform/graphics/Image.cpp:
(WebCore::Image::mimeType const):
(WebCore::Image::expectedContentSize const):
* platform/graphics/Image.h:
* platform/graphics/ImageDecoder.cpp:
(WebCore::ImageDecoder::create):
* platform/graphics/ImageDecoder.h:
(WebCore::ImageDecoder::setExpectedContentSize):
* platform/graphics/ImageFrameCache.cpp:
(WebCore::ImageFrameCache::mimeType const):
* platform/graphics/ImageFrameCache.h:
* platform/graphics/ImageObserver.h:
* platform/graphics/ImageSource.cpp:
(WebCore::ImageSource::ensureDecoderAvailable):
* platform/graphics/cg/ImageDecoderCG.cpp:
(WebCore::ImageDecoderCG::createFrameImageAtIndex):
* platform/graphics/cg/ImageDecoderCG.h:

Add the new class, ImageDecoderAVFObjC:

AVFoundation expects to load all the media data for an AVURLAsset itself. To map between the
provided SharedBuffer and AVURLAsset's requirements, create a delegate object,
WebCoreSharedBufferResourceLoaderDelegate, which responds to requests from the AVURLAsset by
extracting data from the SharedBuffer object. Ensure the AVURLAsset doesn't load any data outside
this delegate by passing the AVURLAssetReferenceRestrictionsKey /
AVAssetReferenceRestrictionForbidAll key and value in the AVURLAsset creation options.
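
The delegate's bookkeeping amounts to answering byte-range requests from whatever portion of the buffer has arrived so far. A small model of that logic (hypothetical types; the real delegate works with AVAssetResourceLoadingRequest objects and a WebCore SharedBuffer):

```cpp
#include <algorithm>
#include <cassert>
#include <cstddef>
#include <cstdint>
#include <vector>

// A loading request asks for bytes [offset, offset + length). It can be
// fulfilled once the buffer has grown past the end of that range, or once
// the load is complete (in which case the response is clamped to the data).
struct LoadingRequest {
    size_t offset;
    size_t length;
};

struct SharedBufferModel {
    std::vector<uint8_t> data;  // bytes received so far
    bool complete { false };

    bool canFulfill(const LoadingRequest& request) const
    {
        return complete || data.size() >= request.offset + request.length;
    }

    std::vector<uint8_t> fulfill(const LoadingRequest& request) const
    {
        // Clamp to the available data, as a complete-but-short load would.
        size_t begin = std::min(data.size(), request.offset);
        size_t end = std::min(data.size(), request.offset + request.length);
        return std::vector<uint8_t>(data.begin() + begin, data.begin() + end);
    }
};
```

Requests that cannot yet be fulfilled are queued (enqueueRequest:) and retried as updateData:complete: appends bytes, which is what fulfillPendingRequests does in the delegate.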

* platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.h: Added.
(WebCore::ImageDecoderAVFObjC::create):
(WebCore::ImageDecoderAVFObjC::mimeType const):
(WebCore::ImageDecoderAVFObjC::RotationProperties::isIdentity const):
* platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.mm: Added.
(SOFT_LINK_CONSTANT):
(-[WebCoreSharedBufferResourceLoaderDelegate initWithParent:]):
(-[WebCoreSharedBufferResourceLoaderDelegate setExpectedContentSize:]):
(-[WebCoreSharedBufferResourceLoaderDelegate updateData:complete:]):
(-[WebCoreSharedBufferResourceLoaderDelegate canFulfillRequest:]):
(-[WebCoreSharedBufferResourceLoaderDelegate enqueueRequest:]):
(-[WebCoreSharedBufferResourceLoaderDelegate fulfillPendingRequests]):
(-[WebCoreSharedBufferResourceLoaderDelegate fulfillRequest:]):
(-[WebCoreSharedBufferResourceLoaderDelegate resourceLoader:shouldWaitForLoadingOfRequestedResource:]):
(-[WebCoreSharedBufferResourceLoaderDelegate resourceLoader:didCancelLoadingRequest:]):
(WebCore::customSchemeURL):
(WebCore::imageDecoderAssetOptions):
(WebCore::transformToRotationProperties):
(WebCore::ImageDecoderAVFObjC::ImageDecoderAVFObjC):
(WebCore::ImageDecoderAVFObjC::canDecodeType):
(WebCore::ImageDecoderAVFObjC::firstEnabledTrack):
(WebCore::ImageDecoderAVFObjC::readSampleMetadata): Parses the media data using AVSampleCursor to walk
    the media sample table, extracting frame presentation time, decode time, and duration.
(WebCore::ImageDecoderAVFObjC::readTrackMetadata): Reads the affine transform and size information from
    the AVAssetTrack, and transforms the transform into a rotation value.
(WebCore::ImageDecoderAVFObjC::storeSampleBuffer): Decompress the incoming sample data, optionally rotate
    the output, and store the results in the sample data vector.
(WebCore::ImageDecoderAVFObjC::advanceCursor): Wrap around the end of the sample table.
(WebCore::ImageDecoderAVFObjC::setTrack): Reset all sample and track metadata.
(WebCore::ImageDecoderAVFObjC::encodedDataStatus const): Retrieve from sample data.
(WebCore::ImageDecoderAVFObjC::frameCount const): Ditto.
(WebCore::ImageDecoderAVFObjC::repetitionCount const): Ditto.
(WebCore::ImageDecoderAVFObjC::uti const): Ditto.
(WebCore::ImageDecoderAVFObjC::filenameExtension const): Ditto.
(WebCore::ImageDecoderAVFObjC::frameSizeAtIndex const): Ditto.
(WebCore::ImageDecoderAVFObjC::frameIsCompleteAtIndex const): Ditto.
(WebCore::ImageDecoderAVFObjC::frameOrientationAtIndex const): Ditto.
(WebCore::ImageDecoderAVFObjC::frameDurationAtIndex const): Ditto.
(WebCore::ImageDecoderAVFObjC::frameHasAlphaAtIndex const): Ditto.
(WebCore::ImageDecoderAVFObjC::frameAllowSubsamplingAtIndex const): Ditto.
(WebCore::ImageDecoderAVFObjC::frameBytesAtIndex const): Ditto.
(WebCore::ImageDecoderAVFObjC::createFrameImageAtIndex): If the sample data has already been
    decompressed, return it. Otherwise, walk through the sample table decompressing frames
    until the desired frame is decoded.
(WebCore::ImageDecoderAVFObjC::setData):
(WebCore::ImageDecoderAVFObjC::clearFrameBufferCache):
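
The interplay between advanceCursor and createFrameImageAtIndex described above, a wrap-around cursor decoding forward until the requested frame is cached, can be modeled in miniature (decode step stubbed; the real decoder feeds samples through WebCoreDecompressionSession):

```cpp
#include <cassert>
#include <cstddef>
#include <optional>
#include <vector>

// Sketch of decode-on-demand over a sample table: a cursor walks the table,
// wrapping at the end (as advanceCursor does), decoding frames until the
// requested index is populated. The int payload stands in for decoded pixels.
struct FrameCache {
    size_t cursor { 0 };
    std::vector<std::optional<int>> frames;

    explicit FrameCache(size_t frameCount) : frames(frameCount) { }

    void advanceCursor() { cursor = (cursor + 1) % frames.size(); }

    int frameAtIndex(size_t index)
    {
        // Already decompressed? Return the cached result immediately.
        if (frames[index])
            return *frames[index];
        // Otherwise decode forward (wrapping) until we reach the target.
        while (true) {
            if (!frames[cursor])
                frames[cursor] = static_cast<int>(cursor) * 10;  // stub decode
            if (cursor == index)
                return *frames[cursor];
            advanceCursor();
        }
    }
};
```

Decoding forward rather than seeking directly matters for video, since inter-coded frames depend on the frames decoded before them.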

Modify WebCoreDecompressionSession so that it can emit frames which have been converted from
YUV -> RGB as part of the decode operation. Also, add a synchronous decoding operation
method, for use in ImageDecoderAVFObjC.
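
A synchronous decode can be layered on an asynchronous one by blocking the caller until the completion handler fires. A sketch of that pattern (not the WebCoreDecompressionSession API; here the handler is invoked inline, standing in for the decoder's background dispatch queue):

```cpp
#include <cassert>
#include <condition_variable>
#include <functional>
#include <mutex>

class Decoder {
public:
    // Asynchronous decode: invokes the handler with the decoded result.
    // Doubling the sample is a stub for the real decompression work.
    void decodeSample(int sample, std::function<void(int)> handler)
    {
        handler(sample * 2);
    }

    // Synchronous wrapper, in the spirit of decodeSampleSync: wait on a
    // condition variable until the asynchronous completion has run.
    int decodeSampleSync(int sample)
    {
        std::mutex mutex;
        std::condition_variable condition;
        bool done = false;
        int result = 0;
        decodeSample(sample, [&](int decoded) {
            std::lock_guard<std::mutex> lock(mutex);
            result = decoded;
            done = true;
            condition.notify_one();
        });
        std::unique_lock<std::mutex> lock(mutex);
        condition.wait(lock, [&] { return done; });
        return result;
    }
};
```

The done flag checked in the wait predicate is what makes this safe even when the completion fires before the caller starts waiting.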

* platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm:
(WebCore::MediaPlayerPrivateMediaSourceAVFObjC::ensureDecompressionSession):
* platform/graphics/cocoa/WebCoreDecompressionSession.h:
(WebCore::WebCoreDecompressionSession::createOpenGL):
(WebCore::WebCoreDecompressionSession::createRGB):
* platform/graphics/cocoa/WebCoreDecompressionSession.mm:
(WebCore::WebCoreDecompressionSession::WebCoreDecompressionSession):
(WebCore::WebCoreDecompressionSession::ensureDecompressionSessionForSample):
(WebCore::WebCoreDecompressionSession::decodeSample):
(WebCore::WebCoreDecompressionSession::decodeSampleSync):

Other changes:

* WebCore.xcodeproj/project.pbxproj: Add new files to project.
* platform/cocoa/VideoToolboxSoftLink.cpp: Add newly referenced methods.
* platform/cocoa/VideoToolboxSoftLink.h: Ditto.

Source/WTF:

* wtf/Platform.h:

LayoutTests:

* fast/images/animated-image-mp4-expected.txt: Added.
* fast/images/animated-image-mp4.html: Added.
* fast/images/resources/animated-red-green-blue.mp4: Added.
* platform/ios/TestExpectations:

Diff

Modified: trunk/LayoutTests/ChangeLog (222224 => 222225)


--- trunk/LayoutTests/ChangeLog	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/LayoutTests/ChangeLog	2017-09-19 21:15:46 UTC (rev 222225)
@@ -1,3 +1,15 @@
+2017-09-19  Jer Noble  <jer.no...@apple.com>
+
+        [Cocoa] Add an ImageDecoder subclass backed by AVFoundation
+        https://bugs.webkit.org/show_bug.cgi?id=176825
+
+        Reviewed by Eric Carlson.
+
+        * fast/images/animated-image-mp4-expected.txt: Added.
+        * fast/images/animated-image-mp4.html: Added.
+        * fast/images/resources/animated-red-green-blue.mp4: Added.
+        * platform/ios/TestExpectations:
+
 2017-09-19  Matt Lewis  <jlew...@apple.com>
 
         Marked imported/w3c/web-platform-tests/background-fetch/interfaces-worker.https.html as flaky on El Capitan Debug.

Added: trunk/LayoutTests/fast/images/animated-image-mp4-expected.txt (0 => 222225)


--- trunk/LayoutTests/fast/images/animated-image-mp4-expected.txt	                        (rev 0)
+++ trunk/LayoutTests/fast/images/animated-image-mp4-expected.txt	2017-09-19 21:15:46 UTC (rev 222225)
@@ -0,0 +1,16 @@
+Test that an mp4 media file loaded as an image can be painted in a canvas.
+
+On success, you will see a series of "PASS" messages, followed by "TEST COMPLETE".
+
+
+PASS Image eventually became red
+PASS Image eventually became green
+PASS Image eventually became blue
+PASS Image eventually became red
+PASS Image eventually became green
+PASS Image eventually became blue
+PASS Image eventually became red
+PASS successfullyParsed is true
+
+TEST COMPLETE
+

Added: trunk/LayoutTests/fast/images/animated-image-mp4.html (0 => 222225)


--- trunk/LayoutTests/fast/images/animated-image-mp4.html	                        (rev 0)
+++ trunk/LayoutTests/fast/images/animated-image-mp4.html	2017-09-19 21:15:46 UTC (rev 222225)
@@ -0,0 +1,96 @@
+<!DOCTYPE html>
+<html>
+<body>
+    <canvas id="canvas" width=100 height=100></canvas>
+    <script src=""></script>
+    <script>
+        window.jsTestIsAsync = true;
+        var imageData;
+
+        function loadImage(src) {
+            return new Promise(resolve => {
+                let image = new Image;
+                image.src = src;
+                return image.decode().then(() => { resolve(image); });
+            });
+        }
+
+        async function testImage(image, colors, frameRate)
+        {
+            let canvas = document.getElementById('canvas');
+            var previousValue = null;
+
+            while (colors.length) {
+                let color = colors.shift();
+                previousValue = await shouldBecome(image, canvas, color, previousValue, frameRate);
+            }
+        }
+
+        function shouldBecome(image, canvas, color, previousValue, frameRate)
+        {
+            return new Promise(resolve => {
+                let referenceData = colorToImageData(color);
+
+                var test = () => {
+                    let context = canvas.getContext('2d');
+                    context.drawImage(image, 0, 0, canvas.width, canvas.height);
+                    let imageData = context.getImageData(0, 0, 1, 1).data;
+
+                    if (arraysAreApproximatelyEqual(imageData, referenceData, 2)) {
+                        testPassed(`Image eventually became ${ color }`);
+                        resolve(imageData);
+                        return;
+                    }
+
+                    if (previousValue && !arraysAreApproximatelyEqual(imageData, previousValue, 2)) {
+                        testFailed(`Image changed to an unexpected value (was ${ imageData.toString() }, expected ${ color })`);
+                        resolve(imageData);
+                        return;
+                    }
+
+                    setTimeout(test, 1000 / frameRate);
+                };
+
+                test();
+            });
+        }
+
+        function colorToImageData(color)
+        {
+            let canvas = document.createElement('canvas');
+            canvas.width = 1;
+            canvas.height = 1;
+            let context = canvas.getContext('2d');
+            context.fillStyle = color;
+            context.fillRect(0, 0, 1, 1);
+            return context.getImageData(0, 0, 1, 1).data;
+        }
+
+        function arraysAreApproximatelyEqual(test, target, tolerance)
+        {
+            if (test.length != target.length)
+                return false;
+
+            for (let i = 0; i < test.length; ++i) {
+                if (Math.abs(test[i] - target[i]) > tolerance)
+                    return false;
+            }
+
+            return true;
+        }
+
+        function endTest() {
+            finishJSTest();
+            if (window.testRunner)
+                testRunner.notifyDone();
+        }
+
+        description('Test that an mp4 media file loaded as an image can be painted in a canvas.')
+
+        loadImage("resources/animated-red-green-blue.mp4").then(image => {
+            testImage(image, ['red', 'green', 'blue', 'red', 'green', 'blue', 'red'], 100).then(endTest, endTest);
+        });
+    </script>
+    <script src=""></script>
+</body>
+</html>

Added: trunk/LayoutTests/fast/images/resources/animated-red-green-blue.mp4 (0 => 222225)


--- trunk/LayoutTests/fast/images/resources/animated-red-green-blue.mp4	                        (rev 0)
+++ trunk/LayoutTests/fast/images/resources/animated-red-green-blue.mp4	2017-09-19 21:15:46 UTC (rev 222225)
(Binary file: mp4 container data not shown)

Modified: trunk/LayoutTests/platform/ios/TestExpectations (222224 => 222225)


--- trunk/LayoutTests/platform/ios/TestExpectations	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/LayoutTests/platform/ios/TestExpectations	2017-09-19 21:15:46 UTC (rev 222225)
@@ -2993,3 +2993,5 @@
 
 webkit.org/b/176878 [ Debug ] fast/multicol/spanner-crash-when-adding-summary.html [ Crash ]
 
+# This test relies on APIs not available on iOS
+fast/images/animated-image-mp4.html [ Skip ]

Modified: trunk/Source/WTF/ChangeLog (222224 => 222225)


--- trunk/Source/WTF/ChangeLog	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WTF/ChangeLog	2017-09-19 21:15:46 UTC (rev 222225)
@@ -1,3 +1,12 @@
+2017-09-19  Jer Noble  <jer.no...@apple.com>
+
+        [Cocoa] Add an ImageDecoder subclass backed by AVFoundation
+        https://bugs.webkit.org/show_bug.cgi?id=176825
+
+        Reviewed by Eric Carlson.
+
+        * wtf/Platform.h:
+
 2017-09-18  Andy Estes  <aes...@apple.com>
 
         [Cocoa] Upstream sandbox-related WebKitSystemInterface functions

Modified: trunk/Source/WTF/wtf/Platform.h (222224 => 222225)


--- trunk/Source/WTF/wtf/Platform.h	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WTF/wtf/Platform.h	2017-09-19 21:15:46 UTC (rev 222225)
@@ -1156,6 +1156,10 @@
 #define USE_INSERTION_UNDO_GROUPING 1
 #endif
 
+#if PLATFORM(MAC)
+#define HAVE_AVSAMPLEBUFFERGENERATOR 1
+#endif
+
 #if PLATFORM(COCOA)
 #define HAVE_TIMINGDATAOPTIONS 1
 #endif

Modified: trunk/Source/WebCore/ChangeLog (222224 => 222225)


--- trunk/Source/WebCore/ChangeLog	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/ChangeLog	2017-09-19 21:15:46 UTC (rev 222225)
@@ -1,3 +1,124 @@
+2017-09-19  Jer Noble  <jer.no...@apple.com>
+
+        [Cocoa] Add an ImageDecoder subclass backed by AVFoundation
+        https://bugs.webkit.org/show_bug.cgi?id=176825
+
+        Reviewed by Eric Carlson.
+
+        Add a new concrete subclass of ImageDecoder which uses AVFoundation to parse and decode
+        image data.
+
+        AVFoundation APIs require prior knowledge of the media data's MIME type to determine whether
+        the media data is decodable, so the MIME type information must be passed through from the
+        CachedResource -> CachedImage -> ImageFrameCache -> ImageSource so as to be available when
+        creating the ImageDecoder:
+
+        (Drive-by fix: the createFrameImageAtIndex() method will mutate internal state, so make it
+        non-const.)
+
+        * loader/cache/CachedImage.h:
+        * loader/cache/CachedResource.h:
+        (WebCore::CachedResource::mimeType const):
+        * platform/cf/CoreMediaSoftLink.cpp:
+        * platform/cf/CoreMediaSoftLink.h:
+        * platform/cocoa/VideoToolboxSoftLink.cpp:
+        * platform/cocoa/VideoToolboxSoftLink.h:
+        * platform/graphics/Image.cpp:
+        (WebCore::Image::mimeType const):
+        (WebCore::Image::expectedContentSize const):
+        * platform/graphics/Image.h:
+        * platform/graphics/ImageDecoder.cpp:
+        (WebCore::ImageDecoder::create):
+        * platform/graphics/ImageDecoder.h:
+        (WebCore::ImageDecoder::setExpectedContentSize):
+        * platform/graphics/ImageFrameCache.cpp:
+        (WebCore::ImageFrameCache::mimeType const):
+        * platform/graphics/ImageFrameCache.h:
+        * platform/graphics/ImageObserver.h:
+        * platform/graphics/ImageSource.cpp:
+        (WebCore::ImageSource::ensureDecoderAvailable):
+        * platform/graphics/cg/ImageDecoderCG.cpp:
+        (WebCore::ImageDecoderCG::createFrameImageAtIndex):
+        * platform/graphics/cg/ImageDecoderCG.h:
+
+        Add the new class, ImageDecoderAVFObjC:
+
+        AVFoundation expects to load all the media data for an AVURLAsset itself. To map between the
+        provided SharedBuffer and AVURLAsset's requirements, create a delegate object,
+        WebCoreSharedBufferResourceLoaderDelegate, which responds to requests from the AVURLAsset by
+        extracting data from the SharedBuffer object. Ensure the AVURLAsset doesn't load any data outside
+        this delegate by passing the AVURLAssetReferenceRestrictionsKey /
+        AVAssetReferenceRestrictionForbidAll key and value in the AVURLAsset creation options.
+
+        * platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.h: Added.
+        (WebCore::ImageDecoderAVFObjC::create):
+        (WebCore::ImageDecoderAVFObjC::mimeType const):
+        (WebCore::ImageDecoderAVFObjC::RotationProperties::isIdentity const):
+        * platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.mm: Added.
+        (SOFT_LINK_CONSTANT):
+        (-[WebCoreSharedBufferResourceLoaderDelegate initWithParent:]):
+        (-[WebCoreSharedBufferResourceLoaderDelegate setExpectedContentSize:]):
+        (-[WebCoreSharedBufferResourceLoaderDelegate updateData:complete:]):
+        (-[WebCoreSharedBufferResourceLoaderDelegate canFulfillRequest:]):
+        (-[WebCoreSharedBufferResourceLoaderDelegate enqueueRequest:]):
+        (-[WebCoreSharedBufferResourceLoaderDelegate fulfillPendingRequests]):
+        (-[WebCoreSharedBufferResourceLoaderDelegate fulfillRequest:]):
+        (-[WebCoreSharedBufferResourceLoaderDelegate resourceLoader:shouldWaitForLoadingOfRequestedResource:]):
+        (-[WebCoreSharedBufferResourceLoaderDelegate resourceLoader:didCancelLoadingRequest:]):
+        (WebCore::customSchemeURL):
+        (WebCore::imageDecoderAssetOptions):
+        (WebCore::transformToRotationProperties):
+        (WebCore::ImageDecoderAVFObjC::ImageDecoderAVFObjC):
+        (WebCore::ImageDecoderAVFObjC::canDecodeType):
+        (WebCore::ImageDecoderAVFObjC::firstEnabledTrack):
+        (WebCore::ImageDecoderAVFObjC::readSampleMetadata): Parses the media data using AVSampleCursor to walk
+            the media sample table, extracting frame presentation time, decode time, and duration.
+        (WebCore::ImageDecoderAVFObjC::readTrackMetadata): Reads the affine transform and size information from
+            the AVAssetTrack, and transforms the transform into a rotation value.
+        (WebCore::ImageDecoderAVFObjC::storeSampleBuffer): Decompress the incoming sample data, optionally rotate
+            the output, and store the results in the sample data vector.
+        (WebCore::ImageDecoderAVFObjC::advanceCursor): Wrap around the end of the sample table.
+        (WebCore::ImageDecoderAVFObjC::setTrack): Reset all sample and track metadata.
+        (WebCore::ImageDecoderAVFObjC::encodedDataStatus const): Retrieve from sample data.
+        (WebCore::ImageDecoderAVFObjC::frameCount const): Ditto.
+        (WebCore::ImageDecoderAVFObjC::repetitionCount const): Ditto.
+        (WebCore::ImageDecoderAVFObjC::uti const): Ditto.
+        (WebCore::ImageDecoderAVFObjC::filenameExtension const): Ditto.
+        (WebCore::ImageDecoderAVFObjC::frameSizeAtIndex const): Ditto.
+        (WebCore::ImageDecoderAVFObjC::frameIsCompleteAtIndex const): Ditto.
+        (WebCore::ImageDecoderAVFObjC::frameOrientationAtIndex const): Ditto.
+        (WebCore::ImageDecoderAVFObjC::frameDurationAtIndex const): Ditto.
+        (WebCore::ImageDecoderAVFObjC::frameHasAlphaAtIndex const): Ditto.
+        (WebCore::ImageDecoderAVFObjC::frameAllowSubsamplingAtIndex const): Ditto.
+        (WebCore::ImageDecoderAVFObjC::frameBytesAtIndex const): Ditto.
+        (WebCore::ImageDecoderAVFObjC::createFrameImageAtIndex): If the sample data has already been
+            decompressed, return it. Otherwise, walk through the sample table decompressing frames
+            until the desired frame is decoded.
+        (WebCore::ImageDecoderAVFObjC::setData):
+        (WebCore::ImageDecoderAVFObjC::clearFrameBufferCache):
+
+        Modify WebCoreDecompressionSession so that it can emit frames which have been converted from
+        YUV -> RGB as part of the decode operation. Also, add a synchronous decoding operation
+        method, for use in ImageDecoderAVFObjC.
+
+        * platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm:
+        (WebCore::MediaPlayerPrivateMediaSourceAVFObjC::ensureDecompressionSession):
+        * platform/graphics/cocoa/WebCoreDecompressionSession.h:
+        (WebCore::WebCoreDecompressionSession::createOpenGL):
+        (WebCore::WebCoreDecompressionSession::createRGB):
+        * platform/graphics/cocoa/WebCoreDecompressionSession.mm:
+        (WebCore::WebCoreDecompressionSession::WebCoreDecompressionSession):
+        (WebCore::WebCoreDecompressionSession::ensureDecompressionSessionForSample):
+        (WebCore::WebCoreDecompressionSession::decodeSample):
+        (WebCore::WebCoreDecompressionSession::decodeSampleSync):
+
+        Other changes:
+
+        * WebCore.xcodeproj/project.pbxproj: Add new files to project.
+        * platform/cocoa/VideoToolboxSoftLink.cpp: Add newly referenced methods.
+        * platform/cocoa/VideoToolboxSoftLink.h: Ditto.
+
+
 2017-09-19  Basuke Suzuki  <basuke.suz...@sony.com>
 
         [Curl] Move Authentication related tasks into AuthenticationChallenge class

Modified: trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj (222224 => 222225)


--- trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/WebCore.xcodeproj/project.pbxproj	2017-09-19 21:15:46 UTC (rev 222225)
@@ -6282,6 +6282,8 @@
 		CD19A2681A13E700008D650E /* DiagnosticLoggingClient.h in Headers */ = {isa = PBXBuildFile; fileRef = CD19A2671A13E700008D650E /* DiagnosticLoggingClient.h */; settings = {ATTRIBUTES = (Private, ); }; };
 		CD19FEA81F573972000C42FB /* ImageDecoder.h in Headers */ = {isa = PBXBuildFile; fileRef = CD19FEA61F573972000C42FB /* ImageDecoder.h */; };
 		CD19FEA91F573972000C42FB /* ImageDecoder.cpp in Sources */ = {isa = PBXBuildFile; fileRef = CD19FEA71F573972000C42FB /* ImageDecoder.cpp */; };
+		CD19FEAE1F574B6D000C42FB /* ImageDecoderAVFObjC.h in Headers */ = {isa = PBXBuildFile; fileRef = CD19FEAC1F574B6D000C42FB /* ImageDecoderAVFObjC.h */; };
+		CD19FEAF1F574B6D000C42FB /* ImageDecoderAVFObjC.mm in Sources */ = {isa = PBXBuildFile; fileRef = CD19FEAD1F574B6D000C42FB /* ImageDecoderAVFObjC.mm */; };
 		CD1E7347167BC78E009A885D /* TextTrackRepresentation.cpp in Sources */ = {isa = PBXBuildFile; fileRef = CD1E7346167BC78E009A885D /* TextTrackRepresentation.cpp */; };
 		CD225C0B1C46FBF400140761 /* WebCoreNSURLSession.mm in Sources */ = {isa = PBXBuildFile; fileRef = CD225C091C46FBF400140761 /* WebCoreNSURLSession.mm */; };
 		CD225C0C1C46FBF400140761 /* WebCoreNSURLSession.h in Headers */ = {isa = PBXBuildFile; fileRef = CD225C0A1C46FBF400140761 /* WebCoreNSURLSession.h */; settings = {ATTRIBUTES = (Private, ); }; };
@@ -15028,6 +15030,8 @@
 		CD19A2671A13E700008D650E /* DiagnosticLoggingClient.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = DiagnosticLoggingClient.h; sourceTree = "<group>"; };
 		CD19FEA61F573972000C42FB /* ImageDecoder.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = ImageDecoder.h; sourceTree = "<group>"; };
 		CD19FEA71F573972000C42FB /* ImageDecoder.cpp */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.cpp; path = ImageDecoder.cpp; sourceTree = "<group>"; };
+		CD19FEAC1F574B6D000C42FB /* ImageDecoderAVFObjC.h */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.c.h; path = ImageDecoderAVFObjC.h; sourceTree = "<group>"; };
+		CD19FEAD1F574B6D000C42FB /* ImageDecoderAVFObjC.mm */ = {isa = PBXFileReference; lastKnownFileType = sourcecode.cpp.objcpp; path = ImageDecoderAVFObjC.mm; sourceTree = "<group>"; };
 		CD1E7346167BC78E009A885D /* TextTrackRepresentation.cpp */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.cpp; path = TextTrackRepresentation.cpp; sourceTree = "<group>"; };
 		CD225C091C46FBF400140761 /* WebCoreNSURLSession.mm */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.cpp.objcpp; path = WebCoreNSURLSession.mm; sourceTree = "<group>"; };
 		CD225C0A1C46FBF400140761 /* WebCoreNSURLSession.h */ = {isa = PBXFileReference; fileEncoding = 4; lastKnownFileType = sourcecode.c.h; path = WebCoreNSURLSession.h; sourceTree = "<group>"; };
@@ -23847,8 +23851,8 @@
 				43D2597613C816F400608559 /* ImageBuffer.cpp */,
 				B2A10B910B3818BD00099AA4 /* ImageBuffer.h */,
 				22BD9F7D1353625C009BD102 /* ImageBufferData.h */,
+				CD19FEA71F573972000C42FB /* ImageDecoder.cpp */,
 				CD19FEA61F573972000C42FB /* ImageDecoder.h */,
-				CD19FEA71F573972000C42FB /* ImageDecoder.cpp */,
 				5576A5621D88A70800CCC04C /* ImageFrame.cpp */,
 				5576A5631D88A70800CCC04C /* ImageFrame.h */,
 				5597F8241D91C3130066BC21 /* ImageFrameCache.cpp */,
@@ -25060,6 +25064,8 @@
 				CDDE02EF18B5651200CF7FF1 /* CDMSessionAVStreamSession.mm */,
 				CDE595961BF26E2100A1CBE8 /* CDMSessionMediaSourceAVFObjC.h */,
 				CDE5959C1BF2757100A1CBE8 /* CDMSessionMediaSourceAVFObjC.mm */,
+				CD19FEAC1F574B6D000C42FB /* ImageDecoderAVFObjC.h */,
+				CD19FEAD1F574B6D000C42FB /* ImageDecoderAVFObjC.mm */,
 				07AA6B69166D019500D45671 /* InbandTextTrackPrivateAVFObjC.h */,
 				07AA6B6A166D019500D45671 /* InbandTextTrackPrivateAVFObjC.mm */,
 				07367DDD172CA67F00D861B9 /* InbandTextTrackPrivateLegacyAVFObjC.h */,
@@ -28274,6 +28280,8 @@
 				510192D618B6B9B7007FC7A1 /* ImageControlsRootElement.h in Headers */,
 				510192D218B6B9AB007FC7A1 /* ImageControlsRootElementMac.h in Headers */,
 				A779791A0D6B9D0C003851B9 /* ImageData.h in Headers */,
+				CD19FEA81F573972000C42FB /* ImageDecoder.h in Headers */,
+				CD19FEAE1F574B6D000C42FB /* ImageDecoderAVFObjC.h in Headers */,
 				555B87ED1CAAF0AB00349425 /* ImageDecoderCG.h in Headers */,
 				97205AB61239291000B17380 /* ImageDocument.h in Headers */,
 				5576A5651D88A70800CCC04C /* ImageFrame.h in Headers */,
@@ -28485,7 +28493,6 @@
 				947949381E0459FA00018D85 /* JSDeprecatedCSSOMValue.h in Headers */,
 				9479493A1E0459FA00018D85 /* JSDeprecatedCSSOMValueList.h in Headers */,
 				31FB1A66120A5D3F00DC02A0 /* JSDeviceMotionEvent.h in Headers */,
-				CD19FEA81F573972000C42FB /* ImageDecoder.h in Headers */,
 				59A86008119DAFA100DEF1EF /* JSDeviceOrientationEvent.h in Headers */,
 				659DDC8309E198BA001BF3C6 /* JSDocument.h in Headers */,
 				1221E05E1C02B444006A1A00 /* JSDocumentAnimation.h in Headers */,
@@ -31462,7 +31469,6 @@
 				A104F24314C71F7A009E2C23 /* CachedSVGDocument.cpp in Sources */,
 				E1B533471717D0A100F205F9 /* CachedSVGDocumentReference.cpp in Sources */,
 				1C0939EA1A13E12900B788E5 /* CachedSVGFont.cpp in Sources */,
-				CD19FEA91F573972000C42FB /* ImageDecoder.cpp in Sources */,
 				0753860214489E9800B78452 /* CachedTextTrack.cpp in Sources */,
 				BCB16C270979C3BD00467741 /* CachedXSLStyleSheet.cpp in Sources */,
 				41380C281F3436AC00155FDA /* CacheStorage.cpp in Sources */,
@@ -32311,6 +32317,8 @@
 				510192D518B6B9B7007FC7A1 /* ImageControlsRootElement.cpp in Sources */,
 				510192D118B6B9AB007FC7A1 /* ImageControlsRootElementMac.cpp in Sources */,
 				A77979190D6B9D0C003851B9 /* ImageData.cpp in Sources */,
+				CD19FEA91F573972000C42FB /* ImageDecoder.cpp in Sources */,
+				CD19FEAF1F574B6D000C42FB /* ImageDecoderAVFObjC.mm in Sources */,
 				555B87EC1CAAF0AB00349425 /* ImageDecoderCG.cpp in Sources */,
 				97205AB51239291000B17380 /* ImageDocument.cpp in Sources */,
 				5576A5641D88A70800CCC04C /* ImageFrame.cpp in Sources */,

Modified: trunk/Source/WebCore/loader/cache/CachedImage.h (222224 => 222225)


--- trunk/Source/WebCore/loader/cache/CachedImage.h	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/loader/cache/CachedImage.h	2017-09-19 21:15:46 UTC (rev 222225)
@@ -130,6 +130,9 @@
 
         // ImageObserver API
         URL sourceUrl() const override { return !m_cachedImages.isEmpty() ? (*m_cachedImages.begin())->url() : URL(); }
+        String mimeType() const override { return !m_cachedImages.isEmpty() ? (*m_cachedImages.begin())->mimeType() : emptyString(); }
+        long long expectedContentLength() const override { return !m_cachedImages.isEmpty() ? (*m_cachedImages.begin())->expectedContentLength() : 0; }
+
         void decodedSizeChanged(const Image&, long long delta) final;
         void didDraw(const Image&) final;
 

Modified: trunk/Source/WebCore/loader/cache/CachedResource.h (222224 => 222225)


--- trunk/Source/WebCore/loader/cache/CachedResource.h	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/loader/cache/CachedResource.h	2017-09-19 21:15:46 UTC (rev 222225)
@@ -117,6 +117,8 @@
     const String& cachePartition() const { return m_resourceRequest.cachePartition(); }
     PAL::SessionID sessionID() const { return m_sessionID; }
     Type type() const { return m_type; }
+    String mimeType() const { return m_response.mimeType(); }
+    long long expectedContentLength() const { return m_response.expectedContentLength(); }
 
     static bool shouldUsePingLoad(Type type) { return type == Type::Beacon; }
 

Modified: trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp (222224 => 222225)


--- trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.cpp	2017-09-19 21:15:46 UTC (rev 222225)
@@ -43,6 +43,7 @@
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetDataBuffer, CMBlockBufferRef, (CMSampleBufferRef sbuf), (sbuf))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetFormatDescription, CMFormatDescriptionRef, (CMSampleBufferRef sbuf), (sbuf))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferGetSampleTimingInfo, OSStatus, (CMSampleBufferRef sbuf, CMItemIndex sampleIndex, CMSampleTimingInfo* timingInfoOut), (sbuf, sampleIndex, timingInfoOut))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMSampleBufferDataIsReady, Boolean, (CMSampleBufferRef sbuf), (sbuf))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimeCompare, int32_t, (CMTime time1, CMTime time2), (time1, time2))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimeAdd, CMTime, (CMTime time1, CMTime time2), (time1, time2))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, CoreMedia, CMTimeGetSeconds, Float64, (CMTime time), (time))

Modified: trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.h (222224 => 222225)


--- trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.h	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/platform/cf/CoreMediaSoftLink.h	2017-09-19 21:15:46 UTC (rev 222225)
@@ -47,6 +47,8 @@
 #define CMSampleBufferGetFormatDescription softLink_CoreMedia_CMSampleBufferGetFormatDescription
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferGetSampleTimingInfo, OSStatus, (CMSampleBufferRef sbuf, CMItemIndex sampleIndex, CMSampleTimingInfo* timingInfoOut), (sbuf, sampleIndex, timingInfoOut))
 #define CMSampleBufferGetSampleTimingInfo softLink_CoreMedia_CMSampleBufferGetSampleTimingInfo
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMSampleBufferDataIsReady, Boolean, (CMSampleBufferRef sbuf), (sbuf))
+#define CMSampleBufferDataIsReady softLink_CoreMedia_CMSampleBufferDataIsReady
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeConvertScale, CMTime, (CMTime time, int32_t newTimescale, CMTimeRoundingMethod method), (time, newTimescale, method))
 #define CMTimeConvertScale softLink_CoreMedia_CMTimeConvertScale
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, CoreMedia, CMTimeAdd, CMTime, (CMTime time1, CMTime time2), (time1, time2))

Modified: trunk/Source/WebCore/platform/cocoa/VideoToolboxSoftLink.cpp (222224 => 222225)


--- trunk/Source/WebCore/platform/cocoa/VideoToolboxSoftLink.cpp	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/platform/cocoa/VideoToolboxSoftLink.cpp	2017-09-19 21:15:46 UTC (rev 222225)
@@ -28,17 +28,23 @@
 #include <VideoToolbox/VideoToolbox.h>
 #include <wtf/SoftLinking.h>
 
+typedef struct OpaqueVTImageRotationSession* VTImageRotationSessionRef;
+
 SOFT_LINK_FRAMEWORK_FOR_SOURCE(WebCore, VideoToolbox)
 
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, VideoToolbox, VTSessionCopyProperty, OSStatus, (VTSessionRef session, CFStringRef propertyKey, CFAllocatorRef allocator, void* propertyValueOut), (session, propertyKey, allocator, propertyValueOut))
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, VideoToolbox, VTDecompressionSessionCreate, OSStatus, (CFAllocatorRef allocator, CMVideoFormatDescriptionRef videoFormatDescription, CFDictionaryRef videoDecoderSpecification, CFDictionaryRef destinationImageBufferAttributes, const VTDecompressionOutputCallbackRecord* outputCallback, VTDecompressionSessionRef* decompressionSessionOut), (allocator, videoFormatDescription, videoDecoderSpecification, destinationImageBufferAttributes, outputCallback, decompressionSessionOut))
-#define VTDecompressionSessionCreate softLink_VideoToolbox_VTDecompressionSessionCreate
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, VideoToolbox, VTDecompressionSessionCanAcceptFormatDescription, Boolean, (VTDecompressionSessionRef session, CMFormatDescriptionRef newFormatDesc), (session, newFormatDesc))
-#define VTDecompressionSessionCanAcceptFormatDescription softLink_VideoToolbox_VTDecompressionSessionCanAcceptFormatDescription
 SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, VideoToolbox, VTDecompressionSessionWaitForAsynchronousFrames, OSStatus, (VTDecompressionSessionRef session), (session))
-#define VTDecompressionSessionWaitForAsynchronousFrames softLink_VideoToolbox_VTDecompressionSessionWaitForAsynchronousFrames
-SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, VideoToolbox, VTDecompressionSessionDecodeFrame, OSStatus, (VTDecompressionSessionRef session, CMSampleBufferRef sampleBuffer, VTDecodeFrameFlags decodeFlags, void* sourceFrameRefCon, VTDecodeInfoFlags* infoFlagsOut), (session, sampleBuffer, decodeFlags, sourceFrameRefCon, infoFlagsOut))
-#define VTDecompressionSessionDecodeFrame softLink_VideoToolbox_VTDecompressionSessionDecodeFrame
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, VideoToolbox, VTDecompressionSessionDecodeFrameWithOutputHandler, OSStatus, (VTDecompressionSessionRef session, CMSampleBufferRef sampleBuffer, VTDecodeFrameFlags decodeFlags, VTDecodeInfoFlags *infoFlagsOut, VTDecompressionOutputHandler outputHandler), (session, sampleBuffer, decodeFlags, infoFlagsOut, outputHandler))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, VideoToolbox, VTImageRotationSessionCreate, OSStatus, (CFAllocatorRef allocator, uint32_t rotationDegrees, VTImageRotationSessionRef* imageRotationSessionOut), (allocator, rotationDegrees, imageRotationSessionOut))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, VideoToolbox, VTImageRotationSessionSetProperty, OSStatus, (VTImageRotationSessionRef session, CFStringRef propertyKey, CFTypeRef propertyValue), (session, propertyKey, propertyValue))
+SOFT_LINK_FUNCTION_FOR_SOURCE(WebCore, VideoToolbox, VTImageRotationSessionTransferImage, OSStatus, (VTImageRotationSessionRef session, CVPixelBufferRef sourceBuffer, CVPixelBufferRef destinationBuffer), (session, sourceBuffer, destinationBuffer))
 SOFT_LINK_FUNCTION_MAY_FAIL_FOR_SOURCE(WebCore, VideoToolbox, VTIsHardwareDecodeSupported, Boolean, (CMVideoCodecType codecType), (codecType))
 SOFT_LINK_FUNCTION_MAY_FAIL_FOR_SOURCE(WebCore, VideoToolbox, VTGetGVADecoderAvailability, OSStatus, (uint32_t* totalInstanceCountOut, uint32_t* freeInstanceCountOut), (totalInstanceCountOut, freeInstanceCountOut))
+SOFT_LINK_FUNCTION_MAY_FAIL_FOR_SOURCE(WebCore, VideoToolbox, VTCreateCGImageFromCVPixelBuffer, OSStatus, (CVPixelBufferRef pixelBuffer, CFDictionaryRef options, CGImageRef* imageOut), (pixelBuffer, options, imageOut))
 SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, VideoToolbox, kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder, CFStringRef)
-#define kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder get_VideoToolbox_kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder()
+SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, VideoToolbox, kVTDecompressionPropertyKey_PixelBufferPool, CFStringRef)
+SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, VideoToolbox, kVTImageRotationPropertyKey_EnableHighSpeedTransfer, CFStringRef)
+SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, VideoToolbox, kVTImageRotationPropertyKey_FlipHorizontalOrientation, CFStringRef)
+SOFT_LINK_CONSTANT_FOR_SOURCE(WebCore, VideoToolbox, kVTImageRotationPropertyKey_FlipVerticalOrientation, CFStringRef)

Modified: trunk/Source/WebCore/platform/cocoa/VideoToolboxSoftLink.h (222224 => 222225)


--- trunk/Source/WebCore/platform/cocoa/VideoToolboxSoftLink.h	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/platform/cocoa/VideoToolboxSoftLink.h	2017-09-19 21:15:46 UTC (rev 222225)
@@ -28,8 +28,12 @@
 #include <VideoToolbox/VideoToolbox.h>
 #include <wtf/SoftLinking.h>
 
+typedef struct OpaqueVTImageRotationSession* VTImageRotationSessionRef;
+
 SOFT_LINK_FRAMEWORK_FOR_HEADER(WebCore, VideoToolbox)
 
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, VideoToolbox, VTSessionCopyProperty, OSStatus, (VTSessionRef session, CFStringRef propertyKey, CFAllocatorRef allocator, void* propertyValueOut), (session, propertyKey, allocator, propertyValueOut))
+#define VTSessionCopyProperty softLink_VideoToolbox_VTSessionCopyProperty
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, VideoToolbox, VTDecompressionSessionCreate, OSStatus, (CFAllocatorRef allocator, CMVideoFormatDescriptionRef videoFormatDescription, CFDictionaryRef videoDecoderSpecification, CFDictionaryRef destinationImageBufferAttributes, const VTDecompressionOutputCallbackRecord* outputCallback, VTDecompressionSessionRef* decompressionSessionOut), (allocator, videoFormatDescription, videoDecoderSpecification, destinationImageBufferAttributes, outputCallback, decompressionSessionOut))
 #define VTDecompressionSessionCreate softLink_VideoToolbox_VTDecompressionSessionCreate
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, VideoToolbox, VTDecompressionSessionCanAcceptFormatDescription, Boolean, (VTDecompressionSessionRef session, CMFormatDescriptionRef newFormatDesc), (session, newFormatDesc))
@@ -36,11 +40,27 @@
 #define VTDecompressionSessionCanAcceptFormatDescription softLink_VideoToolbox_VTDecompressionSessionCanAcceptFormatDescription
 SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, VideoToolbox, VTDecompressionSessionWaitForAsynchronousFrames, OSStatus, (VTDecompressionSessionRef session), (session))
 #define VTDecompressionSessionWaitForAsynchronousFrames softLink_VideoToolbox_VTDecompressionSessionWaitForAsynchronousFrames
-SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, VideoToolbox, VTDecompressionSessionDecodeFrame, OSStatus, (VTDecompressionSessionRef session, CMSampleBufferRef sampleBuffer, VTDecodeFrameFlags decodeFlags, void* sourceFrameRefCon, VTDecodeInfoFlags* infoFlagsOut), (session, sampleBuffer, decodeFlags, sourceFrameRefCon, infoFlagsOut))
-#define VTDecompressionSessionDecodeFrame softLink_VideoToolbox_VTDecompressionSessionDecodeFrame
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, VideoToolbox, VTDecompressionSessionDecodeFrameWithOutputHandler, OSStatus, (VTDecompressionSessionRef session, CMSampleBufferRef sampleBuffer, VTDecodeFrameFlags decodeFlags, VTDecodeInfoFlags *infoFlagsOut, VTDecompressionOutputHandler outputHandler), (session, sampleBuffer, decodeFlags, infoFlagsOut, outputHandler))
+#define VTDecompressionSessionDecodeFrameWithOutputHandler softLink_VideoToolbox_VTDecompressionSessionDecodeFrameWithOutputHandler
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, VideoToolbox, VTImageRotationSessionCreate, OSStatus, (CFAllocatorRef allocator, uint32_t rotationDegrees, VTImageRotationSessionRef* imageRotationSessionOut), (allocator, rotationDegrees, imageRotationSessionOut))
+#define VTImageRotationSessionCreate softLink_VideoToolbox_VTImageRotationSessionCreate
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, VideoToolbox, VTImageRotationSessionSetProperty, OSStatus, (VTImageRotationSessionRef session, CFStringRef propertyKey, CFTypeRef propertyValue), (session, propertyKey, propertyValue))
+#define VTImageRotationSessionSetProperty softLink_VideoToolbox_VTImageRotationSessionSetProperty
+SOFT_LINK_FUNCTION_FOR_HEADER(WebCore, VideoToolbox, VTImageRotationSessionTransferImage, OSStatus, (VTImageRotationSessionRef session, CVPixelBufferRef sourceBuffer, CVPixelBufferRef destinationBuffer), (session, sourceBuffer, destinationBuffer))
+#define VTImageRotationSessionTransferImage softLink_VideoToolbox_VTImageRotationSessionTransferImage
 SOFT_LINK_FUNCTION_MAY_FAIL_FOR_HEADER(WebCore, VideoToolbox, VTIsHardwareDecodeSupported, Boolean, (CMVideoCodecType codecType), (codecType))
 #define VTIsHardwareDecodeSupported softLink_VideoToolbox_VTIsHardwareDecodeSupported
 SOFT_LINK_FUNCTION_MAY_FAIL_FOR_HEADER(WebCore, VideoToolbox, VTGetGVADecoderAvailability, OSStatus, (uint32_t* totalInstanceCountOut, uint32_t* freeInstanceCountOut), (totalInstanceCountOut, freeInstanceCountOut))
 #define VTGetGVADecoderAvailability softLink_VideoToolbox_VTGetGVADecoderAvailability
+SOFT_LINK_FUNCTION_MAY_FAIL_FOR_HEADER(WebCore, VideoToolbox, VTCreateCGImageFromCVPixelBuffer, OSStatus, (CVPixelBufferRef pixelBuffer, CFDictionaryRef options, CGImageRef* imageOut), (pixelBuffer, options, imageOut))
+#define VTCreateCGImageFromCVPixelBuffer softLink_VideoToolbox_VTCreateCGImageFromCVPixelBuffer
 SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, VideoToolbox, kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder, CFStringRef)
 #define kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder get_VideoToolbox_kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder()
+SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, VideoToolbox, kVTDecompressionPropertyKey_PixelBufferPool, CFStringRef)
+#define kVTDecompressionPropertyKey_PixelBufferPool get_VideoToolbox_kVTDecompressionPropertyKey_PixelBufferPool()
+SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, VideoToolbox, kVTImageRotationPropertyKey_EnableHighSpeedTransfer, CFStringRef)
+#define kVTImageRotationPropertyKey_EnableHighSpeedTransfer get_VideoToolbox_kVTImageRotationPropertyKey_EnableHighSpeedTransfer()
+SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, VideoToolbox, kVTImageRotationPropertyKey_FlipHorizontalOrientation, CFStringRef)
+#define kVTImageRotationPropertyKey_FlipHorizontalOrientation get_VideoToolbox_kVTImageRotationPropertyKey_FlipHorizontalOrientation()
+SOFT_LINK_CONSTANT_FOR_HEADER(WebCore, VideoToolbox, kVTImageRotationPropertyKey_FlipVerticalOrientation, CFStringRef)
+#define kVTImageRotationPropertyKey_FlipVerticalOrientation get_VideoToolbox_kVTImageRotationPropertyKey_FlipVerticalOrientation()

Modified: trunk/Source/WebCore/platform/graphics/Image.cpp (222224 => 222225)


--- trunk/Source/WebCore/platform/graphics/Image.cpp	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/platform/graphics/Image.cpp	2017-09-19 21:15:46 UTC (rev 222225)
@@ -84,6 +84,16 @@
     return imageObserver() ? imageObserver()->sourceUrl() : URL();
 }
 
+String Image::mimeType() const
+{
+    return imageObserver() ? imageObserver()->mimeType() : emptyString();
+}
+
+long long Image::expectedContentLength() const
+{
+    return imageObserver() ? imageObserver()->expectedContentLength() : 0;
+}
+
 void Image::fillWithSolidColor(GraphicsContext& ctxt, const FloatRect& dstRect, const Color& color, CompositeOperator op)
 {
     if (!color.isVisible())

Modified: trunk/Source/WebCore/platform/graphics/Image.h (222224 => 222225)


--- trunk/Source/WebCore/platform/graphics/Image.h	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/platform/graphics/Image.h	2017-09-19 21:15:46 UTC (rev 222225)
@@ -143,6 +143,8 @@
     ImageObserver* imageObserver() const { return m_imageObserver; }
     void setImageObserver(ImageObserver* observer) { m_imageObserver = observer; }
     URL sourceURL() const;
+    String mimeType() const;
+    long long expectedContentLength() const;
 
     enum TileRule { StretchTile, RoundTile, SpaceTile, RepeatTile };
 

Modified: trunk/Source/WebCore/platform/graphics/ImageDecoder.cpp (222224 => 222225)


--- trunk/Source/WebCore/platform/graphics/ImageDecoder.cpp	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/platform/graphics/ImageDecoder.cpp	2017-09-19 21:15:46 UTC (rev 222225)
@@ -34,10 +34,21 @@
 #include "ScalableImageDecoder.h"
 #endif
 
+#if HAVE(AVSAMPLEBUFFERGENERATOR)
+#include "ImageDecoderAVFObjC.h"
+#endif
+
 namespace WebCore {
 
-RefPtr<ImageDecoder> ImageDecoder::create(SharedBuffer& data, AlphaOption alphaOption, GammaAndColorProfileOption gammaAndColorProfileOption)
+RefPtr<ImageDecoder> ImageDecoder::create(SharedBuffer& data, const String& mimeType, AlphaOption alphaOption, GammaAndColorProfileOption gammaAndColorProfileOption)
 {
+#if HAVE(AVSAMPLEBUFFERGENERATOR)
+    if (ImageDecoderAVFObjC::canDecodeType(mimeType))
+        return ImageDecoderAVFObjC::create(data, mimeType, alphaOption, gammaAndColorProfileOption);
+#else
+    UNUSED_PARAM(mimeType);
+#endif
+
 #if USE(CG)
     return ImageDecoderCG::create(data, alphaOption, gammaAndColorProfileOption);
 #elif USE(DIRECT2D)

Modified: trunk/Source/WebCore/platform/graphics/ImageDecoder.h (222224 => 222225)


--- trunk/Source/WebCore/platform/graphics/ImageDecoder.h	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/platform/graphics/ImageDecoder.h	2017-09-19 21:15:46 UTC (rev 222225)
@@ -43,7 +43,7 @@
 class ImageDecoder : public ThreadSafeRefCounted<ImageDecoder> {
     WTF_MAKE_FAST_ALLOCATED;
 public:
-    static RefPtr<ImageDecoder> create(SharedBuffer&, AlphaOption, GammaAndColorProfileOption);
+    static RefPtr<ImageDecoder> create(SharedBuffer&, const String& mimeType, AlphaOption, GammaAndColorProfileOption);
     virtual ~ImageDecoder() = default;
 
     virtual size_t bytesDecodedToDetermineProperties() const = 0;
@@ -68,6 +68,7 @@
 
     virtual NativeImagePtr createFrameImageAtIndex(size_t, SubsamplingLevel = SubsamplingLevel::Default, const DecodingOptions& = DecodingMode::Synchronous) = 0;
 
+    virtual void setExpectedContentSize(long long) { }
     virtual void setData(SharedBuffer&, bool allDataReceived) = 0;
     virtual bool isAllDataReceived() const = 0;
     virtual void clearFrameBufferCache(size_t) = 0;

Modified: trunk/Source/WebCore/platform/graphics/ImageFrameCache.cpp (222224 => 222225)


--- trunk/Source/WebCore/platform/graphics/ImageFrameCache.cpp	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/platform/graphics/ImageFrameCache.cpp	2017-09-19 21:15:46 UTC (rev 222225)
@@ -390,6 +390,16 @@
     return m_image ? m_image->sourceURL() : URL();
 }
 
+String ImageFrameCache::mimeType() const
+{
+    return m_image ? m_image->mimeType() : emptyString();
+}
+
+long long ImageFrameCache::expectedContentLength() const
+{
+    return m_image ? m_image->expectedContentLength() : 0;
+}
+
 template<typename T, T (ImageDecoder::*functor)() const>
 T ImageFrameCache::metadata(const T& defaultValue, std::optional<T>* cachedValue)
 {

Modified: trunk/Source/WebCore/platform/graphics/ImageFrameCache.h (222224 => 222225)


--- trunk/Source/WebCore/platform/graphics/ImageFrameCache.h	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/platform/graphics/ImageFrameCache.h	2017-09-19 21:15:46 UTC (rev 222225)
@@ -67,6 +67,8 @@
     void clearMetadata();
     void clearImage() { m_image = nullptr; }
     URL sourceURL() const;
+    String mimeType() const;
+    long long expectedContentLength() const;
 
     // Asynchronous image decoding
     void startAsyncDecodingQueue();

Modified: trunk/Source/WebCore/platform/graphics/ImageObserver.h (222224 => 222225)


--- trunk/Source/WebCore/platform/graphics/ImageObserver.h	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/platform/graphics/ImageObserver.h	2017-09-19 21:15:46 UTC (rev 222225)
@@ -41,6 +41,9 @@
     virtual ~ImageObserver() {}
 public:
     virtual URL sourceUrl() const = 0;
+    virtual String mimeType() const = 0;
+    virtual long long expectedContentLength() const = 0;
+
     virtual void decodedSizeChanged(const Image&, long long delta) = 0;
 
     virtual void didDraw(const Image&) = 0;

Modified: trunk/Source/WebCore/platform/graphics/ImageSource.cpp (222224 => 222225)


--- trunk/Source/WebCore/platform/graphics/ImageSource.cpp	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/platform/graphics/ImageSource.cpp	2017-09-19 21:15:46 UTC (rev 222225)
@@ -75,7 +75,7 @@
     if (!data || isDecoderAvailable())
         return true;
 
-    m_decoder = ImageDecoder::create(*data, m_alphaOption, m_gammaAndColorProfileOption);
+    m_decoder = ImageDecoder::create(*data, m_frameCache->mimeType(), m_alphaOption, m_gammaAndColorProfileOption);
     if (!isDecoderAvailable())
         return false;
 

Added: trunk/Source/WebCore/platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.h (0 => 222225)


--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.h	                        (rev 0)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.h	2017-09-19 21:15:46 UTC (rev 222225)
@@ -0,0 +1,128 @@
+/*
+ * Copyright (C) 2017 Apple Inc. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ * 1. Redistributions of source code must retain the above copyright
+ *    notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ *    notice, this list of conditions and the following disclaimer in the
+ *    documentation and/or other materials provided with the distribution.
+ *
+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS''
+ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
+ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS
+ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
+ * THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#pragma once
+
+#if HAVE(AVSAMPLEBUFFERGENERATOR)
+
+#include "ImageDecoder.h"
+#include <map>
+#include <wtf/Lock.h>
+#include <wtf/Vector.h>
+#include <wtf/text/WTFString.h>
+
+OBJC_CLASS AVAssetTrack;
+OBJC_CLASS AVSampleBufferGenerator;
+OBJC_CLASS AVSampleCursor;
+OBJC_CLASS AVURLAsset;
+OBJC_CLASS WebCoreSharedBufferResourceLoaderDelegate;
+typedef struct opaqueCMSampleBuffer* CMSampleBufferRef;
+typedef struct OpaqueVTImageRotationSession* VTImageRotationSessionRef;
+typedef struct __CVPixelBufferPool* CVPixelBufferPoolRef;
+
+namespace WTF {
+class MediaTime;
+}
+
+namespace WebCore {
+
+class PixelBufferConformerCV;
+class WebCoreDecompressionSession;
+
+class ImageDecoderAVFObjC : public ImageDecoder {
+public:
+    static RefPtr<ImageDecoderAVFObjC> create(SharedBuffer&, const String& mimeType, AlphaOption, GammaAndColorProfileOption);
+    virtual ~ImageDecoderAVFObjC();
+
+    size_t bytesDecodedToDetermineProperties() const override { return 0; }
+    static bool canDecodeType(const String& mimeType);
+
+    const String& mimeType() const { return m_mimeType; }
+
+    EncodedDataStatus encodedDataStatus() const final;
+    IntSize size() const final;
+    size_t frameCount() const final;
+    RepetitionCount repetitionCount() const final;
+    String uti() const final;
+    String filenameExtension() const final;
+    std::optional<IntPoint> hotSpot() const final { return std::nullopt; }
+
+    IntSize frameSizeAtIndex(size_t, SubsamplingLevel = SubsamplingLevel::Default) const final;
+    bool frameIsCompleteAtIndex(size_t) const final;
+    ImageOrientation frameOrientationAtIndex(size_t) const final;
+
+    Seconds frameDurationAtIndex(size_t) const final;
+    bool frameHasAlphaAtIndex(size_t) const final;
+    bool frameAllowSubsamplingAtIndex(size_t) const final;
+    unsigned frameBytesAtIndex(size_t, SubsamplingLevel = SubsamplingLevel::Default) const final;
+
+    NativeImagePtr createFrameImageAtIndex(size_t, SubsamplingLevel = SubsamplingLevel::Default, const DecodingOptions& = DecodingMode::Synchronous) final;
+
+    void setExpectedContentSize(long long) final;
+    void setData(SharedBuffer&, bool allDataReceived) final;
+    bool isAllDataReceived() const final { return m_isAllDataReceived; }
+    void clearFrameBufferCache(size_t) final;
+
+    struct RotationProperties {
+        bool flipX { false };
+        bool flipY { false };
+        unsigned angle { 0 };
+
+        bool isIdentity() const { return !flipX && !flipY && !angle; }
+    };
+
+private:
+    ImageDecoderAVFObjC(SharedBuffer&, const String& mimeType, AlphaOption, GammaAndColorProfileOption);
+
+    AVAssetTrack *firstEnabledTrack();
+    void readSampleMetadata();
+    void readTrackMetadata();
+    bool storeSampleBuffer(CMSampleBufferRef);
+    void advanceCursor();
+    void setTrack(AVAssetTrack *);
+
+    String m_mimeType;
+    String m_uti;
+    RetainPtr<AVURLAsset> m_asset;
+    RetainPtr<AVAssetTrack> m_track;
+    RetainPtr<AVSampleCursor> m_cursor;
+    RetainPtr<AVSampleBufferGenerator> m_generator;
+    RetainPtr<WebCoreSharedBufferResourceLoaderDelegate> m_loader;
+    RetainPtr<VTImageRotationSessionRef> m_rotationSession;
+    RetainPtr<CVPixelBufferPoolRef> m_rotationPool;
+    Ref<WebCoreDecompressionSession> m_decompressionSession;
+
+    struct SampleData;
+    std::map<WTF::MediaTime, size_t> m_presentationTimeToIndex;
+    Vector<SampleData> m_sampleData;
+    Lock m_sampleGeneratorLock;
+    bool m_isAllDataReceived { false };
+    long long m_expectedContentSize { 0 };
+    std::optional<IntSize> m_size;
+    std::optional<RotationProperties> m_rotation;
+};
+
+}
+#endif

Added: trunk/Source/WebCore/platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.mm (0 => 222225)


--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.mm	                        (rev 0)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/ImageDecoderAVFObjC.mm	2017-09-19 21:15:46 UTC (rev 222225)
@@ -0,0 +1,623 @@
+/*
+ * Copyright (C) 2017 Apple Inc. All rights reserved.
+ *
+ * Redistribution and use in source and binary forms, with or without
+ * modification, are permitted provided that the following conditions
+ * are met:
+ * 1. Redistributions of source code must retain the above copyright
+ *    notice, this list of conditions and the following disclaimer.
+ * 2. Redistributions in binary form must reproduce the above copyright
+ *    notice, this list of conditions and the following disclaimer in the
+ *    documentation and/or other materials provided with the distribution.
+ *
+ * THIS SOFTWARE IS PROVIDED BY APPLE INC. AND ITS CONTRIBUTORS ``AS IS''
+ * AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
+ * THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
+ * PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL APPLE INC. OR ITS CONTRIBUTORS
+ * BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+ * CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF
+ * SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS
+ * INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN
+ * CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE)
+ * ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF
+ * THE POSSIBILITY OF SUCH DAMAGE.
+ */
+
+#import "config.h"
+#import "ImageDecoderAVFObjC.h"
+
+#if HAVE(AVSAMPLEBUFFERGENERATOR)
+
+#import "AffineTransform.h"
+#import "FloatQuad.h"
+#import "FloatRect.h"
+#import "FloatSize.h"
+#import "MIMETypeRegistry.h"
+#import "MediaTimeAVFoundation.h"
+#import "SharedBuffer.h"
+#import "UTIUtilities.h"
+#import "WebCoreDecompressionSession.h"
+#import <AVFoundation/AVAsset.h>
+#import <AVFoundation/AVAssetResourceLoader.h>
+#import <AVFoundation/AVAssetTrack.h>
+#import <AVFoundation/AVSampleBufferGenerator.h>
+#import <AVFoundation/AVSampleCursor.h>
+#import <AVFoundation/AVTime.h>
+#import <VideoToolbox/VTUtilities.h>
+#import <map>
+#import <wtf/MainThread.h>
+#import <wtf/MediaTime.h>
+#import <wtf/NeverDestroyed.h>
+#import <wtf/OSObjectPtr.h>
+#import <wtf/SoftLinking.h>
+#import <wtf/Vector.h>
+
+#import "CoreMediaSoftLink.h"
+#import "VideoToolboxSoftLink.h"
+
+#pragma mark - Soft Linking
+
+SOFT_LINK_FRAMEWORK_OPTIONAL(AVFoundation)
+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVURLAsset)
+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferGenerator)
+SOFT_LINK_CLASS_OPTIONAL(AVFoundation, AVSampleBufferRequest)
+SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVMediaCharacteristicVisual, NSString *)
+SOFT_LINK_POINTER_OPTIONAL(AVFoundation, AVURLAssetReferenceRestrictionsKey, NSString *)
+#define AVMediaCharacteristicVisual getAVMediaCharacteristicVisual()
+#define AVURLAssetReferenceRestrictionsKey getAVURLAssetReferenceRestrictionsKey()
+
+#pragma mark -
+
+@interface WebCoreSharedBufferResourceLoaderDelegate : NSObject<AVAssetResourceLoaderDelegate> {
+    WebCore::ImageDecoderAVFObjC* _parent;
+    long long _expectedContentSize;
+    RetainPtr<NSData> _data;
+    bool _complete;
+    Vector<RetainPtr<AVAssetResourceLoadingRequest>> _requests;
+    Lock _dataLock;
+}
+- (id)initWithParent:(WebCore::ImageDecoderAVFObjC*)parent;
+- (void)setExpectedContentSize:(long long)expectedContentSize;
+- (void)updateData:(NSData *)data complete:(BOOL)complete;
+- (BOOL)canFulfillRequest:(AVAssetResourceLoadingRequest *)loadingRequest;
+- (void)enqueueRequest:(AVAssetResourceLoadingRequest *)loadingRequest;
+- (void)fulfillPendingRequests;
+- (void)fulfillRequest:(AVAssetResourceLoadingRequest *)loadingRequest;
+@end
+
+@implementation WebCoreSharedBufferResourceLoaderDelegate
+- (id)initWithParent:(WebCore::ImageDecoderAVFObjC*)parent
+{
+    if (!(self = [super init]))
+        return nil;
+    _parent = parent;
+
+    return self;
+}
+
+- (void)setExpectedContentSize:(long long)expectedContentSize
+{
+    LockHolder holder { _dataLock };
+    _expectedContentSize = expectedContentSize;
+
+    [self fulfillPendingRequests];
+}
+
+- (void)updateData:(NSData *)data complete:(BOOL)complete
+{
+    LockHolder holder { _dataLock };
+    _data = data;
+    _complete = complete;
+
+    [self fulfillPendingRequests];
+}
+
+- (BOOL)canFulfillRequest:(AVAssetResourceLoadingRequest *)request
+{
+    if (!request)
+        return NO;
+
+    if (request.finished || request.cancelled)
+        return NO;
+
+    // AVURLAsset's resource loader requires knowing the expected content size
+    // to load successfully. That requires either having the complete data for
+    // the resource, or knowing the expected content size.
+    if (!_complete && !_expectedContentSize)
+        return NO;
+
+    if (auto dataRequest = request.dataRequest) {
+        if (dataRequest.requestedOffset + dataRequest.requestedLength > static_cast<long long>(_data.get().length))
+            return NO;
+    }
+
+    return YES;
+}
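The fulfillability check in `-canFulfillRequest:` can be sketched in plain C++; the `LoadingRequest` and `DataRequest` structs below are hypothetical stand-ins for `AVAssetResourceLoadingRequest` and its data request:

```cpp
#include <optional>

// Hypothetical stand-ins for AVAssetResourceLoadingRequest.
struct DataRequest {
    long long requestedOffset { 0 };
    long long requestedLength { 0 };
};

struct LoadingRequest {
    bool finished { false };
    bool cancelled { false };
    std::optional<DataRequest> dataRequest;
};

// A request is servable only when the total length is knowable (data is
// complete or an expected size was provided) and the requested byte range
// lies entirely within the bytes received so far.
bool canFulfillRequest(const LoadingRequest& request, long long dataLength, bool complete, long long expectedContentSize)
{
    if (request.finished || request.cancelled)
        return false;

    if (!complete && !expectedContentSize)
        return false;

    if (request.dataRequest) {
        auto& data = *request.dataRequest;
        if (data.requestedOffset + data.requestedLength > dataLength)
            return false;
    }

    return true;
}
```

Requests that fail this check are queued and retried when `updateData:complete:` or `setExpectedContentSize:` delivers more information.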
+
+- (void)enqueueRequest:(AVAssetResourceLoadingRequest *)loadingRequest
+{
+    ASSERT(!_requests.contains(loadingRequest));
+    _requests.append(loadingRequest);
+}
+
+- (void)fulfillPendingRequests
+{
+    for (auto& request : _requests) {
+        if ([self canFulfillRequest:request.get()])
+            [self fulfillRequest:request.get()];
+    }
+
+    _requests.removeAllMatching([] (auto& request) {
+        return request.get().finished;
+    });
+}
+
+- (void)fulfillRequest:(AVAssetResourceLoadingRequest *)request
+{
+    if (auto infoRequest = request.contentInformationRequest) {
+        infoRequest.contentType = _parent->uti();
+        infoRequest.byteRangeAccessSupported = YES;
+        infoRequest.contentLength = _complete ? _data.get().length : _expectedContentSize;
+    }
+
+    if (auto dataRequest = request.dataRequest) {
+        long long availableLength = _data.get().length - dataRequest.requestedOffset;
+        if (availableLength <= 0)
+            return;
+
+        long long requestedLength;
+        if (dataRequest.requestsAllDataToEndOfResource)
+            requestedLength = availableLength;
+        else
+            requestedLength = std::min<long long>(availableLength, dataRequest.requestedLength);
+
+        auto range = NSMakeRange(static_cast<NSUInteger>(dataRequest.requestedOffset), static_cast<NSUInteger>(requestedLength));
+        NSData *requestedData = [_data subdataWithRange:range];
+        if (!requestedData)
+            return;
+
+        [dataRequest respondWithData:requestedData];
+    }
+
+    [request finishLoading];
+}
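The range arithmetic in `-fulfillRequest:` clamps the response to the bytes actually received. A sketch of just that computation, with hypothetical names standing in for the `AVAssetResourceLoadingDataRequest` properties:

```cpp
#include <algorithm>
#include <optional>

struct ByteRange {
    long long offset;
    long long length;
};

// Given how many bytes have arrived and what the data request asks for,
// compute the byte range to respond with; nothing is returned when the
// requested offset is at or past the end of the available data.
std::optional<ByteRange> rangeToFulfill(long long dataLength, long long requestedOffset, long long requestedLength, bool requestsAllDataToEndOfResource)
{
    long long availableLength = dataLength - requestedOffset;
    if (availableLength <= 0)
        return std::nullopt;

    long long length = requestsAllDataToEndOfResource
        ? availableLength
        : std::min(availableLength, requestedLength);
    return ByteRange { requestedOffset, length };
}
```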
+
+- (BOOL)resourceLoader:(AVAssetResourceLoader *)resourceLoader shouldWaitForLoadingOfRequestedResource:(AVAssetResourceLoadingRequest *)loadingRequest
+{
+    LockHolder holder { _dataLock };
+
+    UNUSED_PARAM(resourceLoader);
+
+    if ([self canFulfillRequest:loadingRequest]) {
+        [self fulfillRequest:loadingRequest];
+        return NO;
+    }
+
+    [self enqueueRequest:loadingRequest];
+    return YES;
+}
+
+- (void)resourceLoader:(AVAssetResourceLoader *)resourceLoader didCancelLoadingRequest:(AVAssetResourceLoadingRequest *)loadingRequest
+{
+    LockHolder holder { _dataLock };
+
+    UNUSED_PARAM(resourceLoader);
+    _requests.removeAll(loadingRequest);
+}
+@end
+
+namespace WebCore {
+
+#pragma mark - Static Methods
+
+static NSURL *customSchemeURL()
+{
+    static NeverDestroyed<RetainPtr<NSURL>> url;
+    if (!url.get())
+        url.get() = adoptNS([[NSURL alloc] initWithString:@"custom-imagedecoderavfobjc://resource"]);
+
+    return url.get().get();
+}
+
+static NSDictionary *imageDecoderAssetOptions()
+{
+    static NeverDestroyed<RetainPtr<NSDictionary>> options;
+    if (!options.get())
+        options.get() = @{ AVURLAssetReferenceRestrictionsKey: @(AVAssetReferenceRestrictionForbidAll) };
+
+    return options.get().get();
+}
+
+static ImageDecoderAVFObjC::RotationProperties transformToRotationProperties(AffineTransform inTransform)
+{
+    ImageDecoderAVFObjC::RotationProperties rotation;
+    if (inTransform.isIdentity())
+        return rotation;
+
+    AffineTransform::DecomposedType decomposed { };
+    if (!inTransform.decompose(decomposed))
+        return rotation;
+
+    rotation.flipY = WTF::areEssentiallyEqual(decomposed.scaleX, -1.);
+    rotation.flipX = WTF::areEssentiallyEqual(decomposed.scaleY, -1.);
+    auto degrees = rad2deg(decomposed.angle);
+    while (degrees < 0)
+        degrees += 360;
+
+    // Only support rotation in multiples of 90º:
+    if (WTF::areEssentiallyEqual(fmod(degrees, 90.), 0.))
+        rotation.angle = clampToUnsigned(degrees);
+
+    return rotation;
+}
+
+struct ImageDecoderAVFObjC::SampleData {
+    Seconds duration { 0 };
+    bool hasAlpha { false };
+    IntSize frameSize;
+    RetainPtr<CMSampleBufferRef> sample;
+    RetainPtr<CGImageRef> image;
+    MediaTime decodeTime;
+    MediaTime presentationTime;
+};
+
+#pragma mark - ImageDecoderAVFObjC
+
+RefPtr<ImageDecoderAVFObjC> ImageDecoderAVFObjC::create(SharedBuffer& data, const String& mimeType, AlphaOption alphaOption, GammaAndColorProfileOption gammaAndColorProfileOption)
+{
+    // AVFoundation may not be available at runtime.
+    if (!getAVURLAssetClass())
+        return nullptr;
+
+    if (!canLoad_VideoToolbox_VTCreateCGImageFromCVPixelBuffer())
+        return nullptr;
+
+    return adoptRef(*new ImageDecoderAVFObjC(data, mimeType, alphaOption, gammaAndColorProfileOption));
+}
+
+ImageDecoderAVFObjC::ImageDecoderAVFObjC(SharedBuffer& data, const String& mimeType, AlphaOption, GammaAndColorProfileOption)
+    : ImageDecoder()
+    , m_mimeType(mimeType)
+    , m_uti(WebCore::UTIFromMIMEType(mimeType))
+    , m_asset(adoptNS([allocAVURLAssetInstance() initWithURL:customSchemeURL() options:imageDecoderAssetOptions()]))
+    , m_loader(adoptNS([[WebCoreSharedBufferResourceLoaderDelegate alloc] initWithParent:this]))
+    , m_decompressionSession(WebCoreDecompressionSession::createRGB())
+{
+    [m_loader updateData:data.createNSData().get() complete:NO];
+
+    [m_asset.get().resourceLoader setDelegate:m_loader.get() queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0)];
+    [m_asset loadValuesAsynchronouslyForKeys:@[@"tracks"] completionHandler:[protectedThis = makeRefPtr(this)] () mutable {
+        callOnMainThread([protectedThis = WTFMove(protectedThis)] {
+            protectedThis->setTrack(protectedThis->firstEnabledTrack());
+        });
+    }];
+}
+
+ImageDecoderAVFObjC::~ImageDecoderAVFObjC() = default;
+
+bool ImageDecoderAVFObjC::canDecodeType(const String& mimeType)
+{
+    return [getAVURLAssetClass() isPlayableExtendedMIMEType:mimeType];
+}
+
+AVAssetTrack *ImageDecoderAVFObjC::firstEnabledTrack()
+{
+    NSArray<AVAssetTrack *> *videoTracks = [m_asset tracksWithMediaCharacteristic:AVMediaCharacteristicVisual];
+    NSUInteger firstEnabledIndex = [videoTracks indexOfObjectPassingTest:^(AVAssetTrack *track, NSUInteger, BOOL*) {
+        return track.enabled;
+    }];
+
+    if (firstEnabledIndex == NSNotFound)
+        return nil;
+
+    return [videoTracks objectAtIndex:firstEnabledIndex];
+}
+
+void ImageDecoderAVFObjC::readSampleMetadata()
+{
+    if (!m_sampleData.isEmpty())
+        return;
+
+    // NOTE: there is no API to return the number of samples in the sample table. Instead,
+    // step the cursor forward in decode order by an arbitrarily large count; the step
+    // method returns the number of samples actually traversed.
+    RetainPtr<AVSampleCursor> cursor = [m_track makeSampleCursorAtFirstSampleInDecodeOrder];
+    int64_t sampleCount = 0;
+    if (cursor)
+        sampleCount = 1 + [cursor stepInDecodeOrderByCount:std::numeric_limits<int32_t>::max()];
+
+    // NOTE: there is no API to create a sample cursor at the first sample in presentation
+    // order. Instead, step the cursor backward in presentation order by an arbitrarily
+    // large count, which pins it to the first sample.
+    [cursor stepInPresentationOrderByCount:std::numeric_limits<int32_t>::min()];
+
+    ASSERT(sampleCount >= 0);
+    m_sampleData.resize(static_cast<size_t>(sampleCount));
+
+    if (!m_generator)
+        m_generator = adoptNS([allocAVSampleBufferGeneratorInstance() initWithAsset:m_asset.get() timebase:nil]);
+
+    for (size_t index = 0; index < static_cast<size_t>(sampleCount); ++index) {
+        auto& sampleData = m_sampleData[index];
+        sampleData.duration = Seconds(CMTimeGetSeconds([cursor currentSampleDuration]));
+        sampleData.decodeTime = toMediaTime([cursor decodeTimeStamp]);
+        sampleData.presentationTime = toMediaTime([cursor presentationTimeStamp]);
+        auto request = adoptNS([allocAVSampleBufferRequestInstance() initWithStartCursor:cursor.get()]);
+        sampleData.sample = adoptCF([m_generator createSampleBufferForRequest:request.get()]);
+        m_presentationTimeToIndex.insert(std::make_pair(sampleData.presentationTime, index));
+        [cursor stepInPresentationOrderByCount:1];
+    }
+}
+
+void ImageDecoderAVFObjC::readTrackMetadata()
+{
+    if (!m_rotation)
+        m_rotation = transformToRotationProperties(CGAffineTransformConcat(m_asset.get().preferredTransform, m_track.get().preferredTransform));
+
+    if (!m_size) {
+        auto size = FloatSize(m_track.get().naturalSize);
+        auto angle = m_rotation.value().angle;
+        if (angle == 90 || angle == 270)
+            size = size.transposedSize();
+
+        m_size = expandedIntSize(size);
+    }
+}
+
+bool ImageDecoderAVFObjC::storeSampleBuffer(CMSampleBufferRef sampleBuffer)
+{
+    auto pixelBuffer = m_decompressionSession->decodeSampleSync(sampleBuffer);
+    if (!pixelBuffer)
+        return false;
+
+    auto presentationTime = toMediaTime(CMSampleBufferGetPresentationTimeStamp(sampleBuffer));
+    auto indexIter = m_presentationTimeToIndex.find(presentationTime);
+    if (indexIter == m_presentationTimeToIndex.end())
+        return false;
+
+    if (m_rotation && !m_rotation.value().isIdentity()) {
+        auto& rotation = m_rotation.value();
+        if (!m_rotationSession) {
+            VTImageRotationSessionRef rawRotationSession = nullptr;
+            VTImageRotationSessionCreate(kCFAllocatorDefault, rotation.angle, &rawRotationSession);
+            m_rotationSession = adoptCF(rawRotationSession);
+            VTImageRotationSessionSetProperty(m_rotationSession.get(), kVTImageRotationPropertyKey_EnableHighSpeedTransfer, kCFBooleanTrue);
+
+            if (rotation.flipY)
+                VTImageRotationSessionSetProperty(m_rotationSession.get(), kVTImageRotationPropertyKey_FlipVerticalOrientation, kCFBooleanTrue);
+            if (rotation.flipX)
+                VTImageRotationSessionSetProperty(m_rotationSession.get(), kVTImageRotationPropertyKey_FlipHorizontalOrientation, kCFBooleanTrue);
+        }
+
+        if (!m_rotationPool) {
+            auto pixelAttributes = (CFDictionaryRef)@{
+                (NSString *)kCVPixelBufferWidthKey: @(m_size.value().width()),
+                (NSString *)kCVPixelBufferHeightKey: @(m_size.value().height()),
+                (NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA),
+                (NSString *)kCVPixelBufferCGImageCompatibilityKey: @YES,
+            };
+            CVPixelBufferPoolRef rawPool = nullptr;
+            CVPixelBufferPoolCreate(kCFAllocatorDefault, nullptr, pixelAttributes, &rawPool);
+            m_rotationPool = adoptCF(rawPool);
+        }
+
+        CVPixelBufferRef rawRotatedBuffer = nullptr;
+        CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, m_rotationPool.get(), &rawRotatedBuffer);
+        auto rotatedBuffer = adoptCF(rawRotatedBuffer);
+        auto status = VTImageRotationSessionTransferImage(m_rotationSession.get(), pixelBuffer.get(), rotatedBuffer.get());
+        if (status == noErr)
+            pixelBuffer = WTFMove(rotatedBuffer);
+    }
+
+    CGImageRef rawImage = nullptr;
+    if (noErr != VTCreateCGImageFromCVPixelBuffer(pixelBuffer.get(), nullptr, &rawImage))
+        return false;
+
+    ASSERT(indexIter->second < m_sampleData.size());
+    auto& sampleData = m_sampleData[indexIter->second];
+    sampleData.image = adoptCF(rawImage);
+
+    auto alphaInfo = CGImageGetAlphaInfo(rawImage);
+    sampleData.hasAlpha = (alphaInfo != kCGImageAlphaNone && alphaInfo != kCGImageAlphaNoneSkipLast && alphaInfo != kCGImageAlphaNoneSkipFirst);
+
+    return true;
+}
+
+void ImageDecoderAVFObjC::advanceCursor()
+{
+    if (![m_cursor stepInDecodeOrderByCount:1])
+        m_cursor = [m_track makeSampleCursorAtFirstSampleInDecodeOrder];
+}
+
+void ImageDecoderAVFObjC::setTrack(AVAssetTrack *track)
+{
+    if (m_track == track)
+        return;
+    m_track = track;
+
+    LockHolder holder { m_sampleGeneratorLock };
+    m_sampleData.clear();
+    m_size.reset();
+    m_rotation.reset();
+    m_cursor = nullptr;
+    m_generator = nullptr;
+    m_rotationSession = nullptr;
+
+    [track loadValuesAsynchronouslyForKeys:@[@"naturalSize", @"preferredTransform"] completionHandler:[protectedThis = makeRefPtr(this)] () mutable {
+        callOnMainThread([protectedThis = WTFMove(protectedThis)] {
+            protectedThis->readTrackMetadata();
+            protectedThis->readSampleMetadata();
+        });
+    }];
+}
+
+EncodedDataStatus ImageDecoderAVFObjC::encodedDataStatus() const
+{
+    if (m_sampleData.isEmpty())
+        return EncodedDataStatus::Unknown;
+    return EncodedDataStatus::Complete;
+}
+
+IntSize ImageDecoderAVFObjC::size() const
+{
+    if (m_size)
+        return m_size.value();
+    return IntSize();
+}
+
+size_t ImageDecoderAVFObjC::frameCount() const
+{
+    return m_sampleData.size();
+}
+
+RepetitionCount ImageDecoderAVFObjC::repetitionCount() const
+{
+    // In the absence of instructions to the contrary, assume all media formats repeat infinitely.
+    // FIXME: Future media formats may embed repeat count information, and when that is available
+    // through AVAsset, account for it here.
+    return RepetitionCountInfinite;
+}
+
+String ImageDecoderAVFObjC::uti() const
+{
+    return m_uti;
+}
+
+String ImageDecoderAVFObjC::filenameExtension() const
+{
+    return MIMETypeRegistry::getPreferredExtensionForMIMEType(m_mimeType);
+}
+
+IntSize ImageDecoderAVFObjC::frameSizeAtIndex(size_t, SubsamplingLevel) const
+{
+    return size();
+}
+
+bool ImageDecoderAVFObjC::frameIsCompleteAtIndex(size_t index) const
+{
+    if (index >= m_sampleData.size())
+        return false;
+
+    const auto& sampleData = m_sampleData[index];
+    if (!sampleData.sample)
+        return false;
+
+    return CMSampleBufferDataIsReady(sampleData.sample.get());
+}
+
+ImageOrientation ImageDecoderAVFObjC::frameOrientationAtIndex(size_t) const
+{
+    return ImageOrientation();
+}
+
+Seconds ImageDecoderAVFObjC::frameDurationAtIndex(size_t index) const
+{
+    if (index < m_sampleData.size())
+        return m_sampleData[index].duration;
+    return { };
+}
+
+bool ImageDecoderAVFObjC::frameHasAlphaAtIndex(size_t index) const
+{
+    if (index < m_sampleData.size())
+        return m_sampleData[index].hasAlpha;
+    return false;
+}
+
+bool ImageDecoderAVFObjC::frameAllowSubsamplingAtIndex(size_t index) const
+{
+    return index < m_sampleData.size();
+}
+
+unsigned ImageDecoderAVFObjC::frameBytesAtIndex(size_t index, SubsamplingLevel subsamplingLevel) const
+{
+    if (!frameIsCompleteAtIndex(index))
+        return 0;
+
+    IntSize frameSize = frameSizeAtIndex(index, subsamplingLevel);
+    return (frameSize.area() * 4).unsafeGet();
+}
+
+NativeImagePtr ImageDecoderAVFObjC::createFrameImageAtIndex(size_t index, SubsamplingLevel, const DecodingOptions&)
+{
+    LockHolder holder { m_sampleGeneratorLock };
+
+    if (index >= m_sampleData.size())
+        return nullptr;
+
+    auto& sampleData = m_sampleData[index];
+    if (sampleData.image)
+        return sampleData.image;
+
+    if (!m_cursor)
+        m_cursor = [m_track makeSampleCursorAtFirstSampleInDecodeOrder];
+
+    auto frameCursor = [m_track makeSampleCursorWithPresentationTimeStamp:toCMTime(sampleData.presentationTime)];
+    if ([frameCursor comparePositionInDecodeOrderWithPositionOfCursor:m_cursor.get()] == NSOrderedAscending) {
+        // Rewind the cursor to the most recent sync sample to begin decoding.
+        m_cursor = adoptNS([frameCursor copy]);
+        do {
+            if ([m_cursor currentSampleSyncInfo].sampleIsFullSync)
+                break;
+        } while ([m_cursor stepInDecodeOrderByCount:-1] == -1);
+    }
+
+    if (!m_generator)
+        m_generator = [allocAVSampleBufferGeneratorInstance() initWithAsset:m_asset.get() timebase:nil];
+
+    while (true) {
+        if ([frameCursor comparePositionInDecodeOrderWithPositionOfCursor:m_cursor.get()] == NSOrderedAscending)
+            return nullptr;
+
+        auto presentationTime = toMediaTime(m_cursor.get().presentationTimeStamp);
+        auto indexIter = m_presentationTimeToIndex.find(presentationTime);
+        advanceCursor();
+
+        if (indexIter == m_presentationTimeToIndex.end())
+            return nullptr;
+
+        auto& cursorSampleData = m_sampleData[indexIter->second];
+
+        if (!cursorSampleData.sample)
+            return nullptr;
+
+        if (!storeSampleBuffer(cursorSampleData.sample.get()))
+            return nullptr;
+
+        if (sampleData.image)
+            return sampleData.image;
+    }
+
+    ASSERT_NOT_REACHED();
+    return nullptr;
+}
+
+void ImageDecoderAVFObjC::setExpectedContentSize(long long expectedContentSize)
+{
+    if (m_expectedContentSize == expectedContentSize)
+        return;
+
+    m_expectedContentSize = expectedContentSize;
+    m_loader.get().expectedContentSize = expectedContentSize;
+}
+
+void ImageDecoderAVFObjC::setData(SharedBuffer& data, bool allDataReceived)
+{
+    [m_loader updateData:data.createNSData().get() complete:allDataReceived];
+
+    if (allDataReceived) {
+        m_isAllDataReceived = true;
+
+        if (!m_track)
+            setTrack(firstEnabledTrack());
+
+        readTrackMetadata();
+        readSampleMetadata();
+    }
+}
+
+void ImageDecoderAVFObjC::clearFrameBufferCache(size_t index)
+{
+    for (size_t i = 0; i < std::min(index, m_sampleData.size()); ++i)
+        m_sampleData[i].image = nullptr;
+}
+
+}
+
+#endif
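
transformToRotationProperties() above normalizes the decomposed rotation angle into [0, 360) and only honors multiples of 90 degrees. That normalization can be sketched in standalone C++ (plain doubles standing in for WebCore's AffineTransform decomposition; the helper name and epsilon are illustrative, not WebCore API):

```cpp
#include <cmath>

// Illustrative constant; WebCore uses rad2deg() from its math utilities.
constexpr double kPi = 3.14159265358979323846;

// Hypothetical stand-in for the angle handling in transformToRotationProperties():
// convert radians to degrees, normalize into [0, 360), and accept the result only
// when it is (essentially) a multiple of 90; otherwise report no rotation.
unsigned normalizedRotationAngle(double radians)
{
    double degrees = radians * 180.0 / kPi;
    while (degrees < 0)
        degrees += 360;
    while (degrees >= 360)
        degrees -= 360;

    double remainder = std::fmod(degrees, 90.0);
    if (std::abs(remainder) > 1e-6 && std::abs(remainder - 90.0) > 1e-6)
        return 0;
    return static_cast<unsigned>(std::lround(degrees)) % 360;
}
```

The epsilon comparisons play the role of WTF::areEssentiallyEqual() in the real function, tolerating floating-point error from the radians-to-degrees conversion.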

Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm (222224 => 222225)


--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/MediaPlayerPrivateMediaSourceAVFObjC.mm	2017-09-19 21:15:46 UTC (rev 222225)
@@ -754,7 +754,7 @@
     if (m_decompressionSession)
         return;
 
-    m_decompressionSession = WebCoreDecompressionSession::create();
+    m_decompressionSession = WebCoreDecompressionSession::createOpenGL();
     m_decompressionSession->setTimebase([m_synchronizer timebase]);
 
     if (m_mediaSourcePrivate)

Modified: trunk/Source/WebCore/platform/graphics/cocoa/WebCoreDecompressionSession.h (222224 => 222225)


--- trunk/Source/WebCore/platform/graphics/cocoa/WebCoreDecompressionSession.h	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/platform/graphics/cocoa/WebCoreDecompressionSession.h	2017-09-19 21:15:46 UTC (rev 222225)
@@ -51,7 +51,8 @@
 
 class WebCoreDecompressionSession : public ThreadSafeRefCounted<WebCoreDecompressionSession> {
 public:
-    static Ref<WebCoreDecompressionSession> create() { return adoptRef(*new WebCoreDecompressionSession()); }
+    static Ref<WebCoreDecompressionSession> createOpenGL() { return adoptRef(*new WebCoreDecompressionSession(OpenGL)); }
+    static Ref<WebCoreDecompressionSession> createRGB() { return adoptRef(*new WebCoreDecompressionSession(RGB)); }
 
     void invalidate();
     bool isInvalidated() const { return m_invalidated; }
@@ -62,6 +63,8 @@
     void stopRequestingMediaData();
     void notifyWhenHasAvailableVideoFrame(std::function<void()>);
 
+    RetainPtr<CVPixelBufferRef> decodeSampleSync(CMSampleBufferRef);
+
     void setTimebase(CMTimebaseRef);
     CMTimebaseRef timebase() const { return m_timebase.get(); }
 
@@ -75,8 +78,14 @@
     MediaTime totalFrameDelay() { return m_totalFrameDelay; }
 
 private:
-    WebCoreDecompressionSession();
+    enum Mode {
+        OpenGL,
+        RGB,
+    };
+    WebCoreDecompressionSession(Mode);
 
+    void ensureDecompressionSessionForSample(CMSampleBufferRef);
+
     void decodeSample(CMSampleBufferRef, bool displaying);
     void enqueueDecodedSample(CMSampleBufferRef, bool displaying);
     void handleDecompressionOutput(bool displaying, OSStatus, VTDecodeInfoFlags, CVImageBufferRef, CMTime presentationTimeStamp, CMTime presentationDuration);
@@ -85,7 +94,6 @@
     void automaticDequeue();
     bool shouldDecodeSample(CMSampleBufferRef, bool displaying);
 
-    static void decompressionOutputCallback(void* decompressionOutputRefCon, void* sourceFrameRefCon, OSStatus, VTDecodeInfoFlags, CVImageBufferRef, CMTime presentationTimeStamp, CMTime presentationDuration);
     static CMTime getDecodeTime(CMBufferRef, void* refcon);
     static CMTime getPresentationTime(CMBufferRef, void* refcon);
     static CMTime getDuration(CMBufferRef, void* refcon);
@@ -96,6 +104,7 @@
     static const CMItemCount kHighWaterMark = 60;
     static const CMItemCount kLowWaterMark = 15;
 
+    Mode m_mode;
     RetainPtr<VTDecompressionSessionRef> m_decompressionSession;
     RetainPtr<CMBufferQueueRef> m_producerQueue;
     RetainPtr<CMBufferQueueRef> m_consumerQueue;

Modified: trunk/Source/WebCore/platform/graphics/cocoa/WebCoreDecompressionSession.mm (222224 => 222225)


--- trunk/Source/WebCore/platform/graphics/cocoa/WebCoreDecompressionSession.mm	2017-09-19 21:11:41 UTC (rev 222224)
+++ trunk/Source/WebCore/platform/graphics/cocoa/WebCoreDecompressionSession.mm	2017-09-19 21:15:46 UTC (rev 222225)
@@ -44,8 +44,9 @@
 
 namespace WebCore {
 
-WebCoreDecompressionSession::WebCoreDecompressionSession()
-    : m_decompressionQueue(adoptOSObject(dispatch_queue_create("WebCoreDecompressionSession Decompression Queue", DISPATCH_QUEUE_SERIAL)))
+WebCoreDecompressionSession::WebCoreDecompressionSession(Mode mode)
+    : m_mode(mode)
+    , m_decompressionQueue(adoptOSObject(dispatch_queue_create("WebCoreDecompressionSession Decompression Queue", DISPATCH_QUEUE_SERIAL)))
     , m_enqueingQueue(adoptOSObject(dispatch_queue_create("WebCoreDecompressionSession Enqueueing Queue", DISPATCH_QUEUE_SERIAL)))
     , m_hasAvailableImageSemaphore(adoptOSObject(dispatch_semaphore_create(0)))
 {
@@ -200,7 +201,7 @@
     return true;
 }
 
-void WebCoreDecompressionSession::decodeSample(CMSampleBufferRef sample, bool displaying)
+void WebCoreDecompressionSession::ensureDecompressionSessionForSample(CMSampleBufferRef sample)
 {
     if (isInvalidated())
         return;
@@ -214,20 +215,31 @@
     if (!m_decompressionSession) {
         CMVideoFormatDescriptionRef videoFormatDescription = CMSampleBufferGetFormatDescription(sample);
         NSDictionary* videoDecoderSpecification = @{ (NSString *)kVTVideoDecoderSpecification_EnableHardwareAcceleratedVideoDecoder: @YES };
+
+        NSDictionary *attributes;
+        if (m_mode == OpenGL) {
 #if PLATFORM(IOS)
-        NSDictionary* attributes = @{(NSString *)kCVPixelBufferIOSurfaceOpenGLESFBOCompatibilityKey: @YES};
+            attributes = @{(NSString *)kCVPixelBufferIOSurfaceOpenGLESFBOCompatibilityKey: @YES};
 #else
-        NSDictionary* attributes = @{(NSString *)kCVPixelBufferIOSurfaceOpenGLFBOCompatibilityKey: @YES};
+            attributes = @{(NSString *)kCVPixelBufferIOSurfaceOpenGLFBOCompatibilityKey: @YES};
 #endif
+        } else {
+            ASSERT(m_mode == RGB);
+            attributes = @{(NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
+        }
         VTDecompressionSessionRef decompressionSessionOut = nullptr;
-        VTDecompressionOutputCallbackRecord callback {
-            &decompressionOutputCallback,
-            this,
-        };
-        if (noErr == VTDecompressionSessionCreate(kCFAllocatorDefault, videoFormatDescription, (CFDictionaryRef)videoDecoderSpecification, (CFDictionaryRef)attributes, &callback, &decompressionSessionOut))
+        if (noErr == VTDecompressionSessionCreate(kCFAllocatorDefault, videoFormatDescription, (CFDictionaryRef)videoDecoderSpecification, (CFDictionaryRef)attributes, nullptr, &decompressionSessionOut))
             m_decompressionSession = adoptCF(decompressionSessionOut);
     }
+}
 
+void WebCoreDecompressionSession::decodeSample(CMSampleBufferRef sample, bool displaying)
+{
+    if (isInvalidated())
+        return;
+
+    ensureDecompressionSessionForSample(sample);
+
     VTDecodeInfoFlags flags { kVTDecodeFrame_EnableTemporalProcessing };
     if (!displaying)
         flags |= kVTDecodeFrame_DoNotOutputFrame;
@@ -240,14 +252,25 @@
         return;
     }
 
-    VTDecompressionSessionDecodeFrame(m_decompressionSession.get(), sample, flags, reinterpret_cast<void*>(displaying), nullptr);
+    VTDecompressionSessionDecodeFrameWithOutputHandler(m_decompressionSession.get(), sample, flags, nullptr, [this, displaying] (OSStatus status, VTDecodeInfoFlags infoFlags, CVImageBufferRef imageBuffer, CMTime presentationTimeStamp, CMTime presentationDuration) {
+        handleDecompressionOutput(displaying, status, infoFlags, imageBuffer, presentationTimeStamp, presentationDuration);
+    });
 }
 
-void WebCoreDecompressionSession::decompressionOutputCallback(void* decompressionOutputRefCon, void* sourceFrameRefCon, OSStatus status, VTDecodeInfoFlags infoFlags, CVImageBufferRef imageBuffer, CMTime presentationTimeStamp, CMTime presentationDuration)
+RetainPtr<CVPixelBufferRef> WebCoreDecompressionSession::decodeSampleSync(CMSampleBufferRef sample)
 {
-    WebCoreDecompressionSession* session = static_cast<WebCoreDecompressionSession*>(decompressionOutputRefCon);
-    bool displaying = sourceFrameRefCon;
-    session->handleDecompressionOutput(displaying, status, infoFlags, imageBuffer, presentationTimeStamp, presentationDuration);
+    if (isInvalidated())
+        return nullptr;
+
+    ensureDecompressionSessionForSample(sample);
+
+    RetainPtr<CVPixelBufferRef> pixelBuffer;
+    VTDecodeInfoFlags flags { 0 };
+    VTDecompressionSessionDecodeFrameWithOutputHandler(m_decompressionSession.get(), sample, flags, nullptr, [&] (OSStatus, VTDecodeInfoFlags, CVImageBufferRef imageBuffer, CMTime, CMTime) mutable {
+        if (imageBuffer && CFGetTypeID(imageBuffer) == CVPixelBufferGetTypeID())
+            pixelBuffer = (CVPixelBufferRef)imageBuffer;
+    });
+    return pixelBuffer;
 }
 
 void WebCoreDecompressionSession::handleDecompressionOutput(bool displaying, OSStatus status, VTDecodeInfoFlags infoFlags, CVImageBufferRef rawImageBuffer, CMTime presentationTimeStamp, CMTime presentationDuration)
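
decodeSampleSync() above relies on VTDecompressionSessionDecodeFrameWithOutputHandler invoking its handler before returning when asynchronous decompression is not requested, so a stack variable captured by reference can carry the result out of the call. The pattern, sketched with a hypothetical callback-style API in plain C++ (FakeSyncDecoder is a stand-in, not a VideoToolbox type):

```cpp
#include <functional>

// Hypothetical synchronous decoder standing in for
// VTDecompressionSessionDecodeFrameWithOutputHandler when asynchronous
// decompression is not requested: the handler runs before decode() returns.
struct FakeSyncDecoder {
    void decode(int input, const std::function<void(int)>& handler) const { handler(input * 2); }
};

// Mirrors decodeSampleSync(): capture the handler's output by reference so the
// result is available immediately after the call. This is safe only because
// the handler is guaranteed to run synchronously.
int decodeSync(const FakeSyncDecoder& decoder, int input)
{
    int result = 0;
    decoder.decode(input, [&result](int output) { result = output; });
    return result;
}
```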