Title: [282865] trunk/Source
Revision: 282865
Author: j...@apple.com
Date: 2021-09-21 23:25:33 -0700 (Tue, 21 Sep 2021)

Log Message

Use SharedMemory for transferring appended buffers from SourceBuffer to the GPU process
https://bugs.webkit.org/show_bug.cgi?id=230329
rdar://problem/83291495

Source/WebCore:

Use SharedBuffer instead of Vector to pass data to the SourceBuffer-related
classes (SourceBuffer, SourceBufferPrivate and SourceBufferParser).
Modify SourceBufferParserWebM so that it never allocates memory or copies
the original data content. Instead, we use CMBlockBuffer objects that retain the
backing SharedBuffer and reference the data through offsets into that SharedBuffer.
SourceBufferParserAVFObjC requires little modification, as an NSData can wrap a SharedBuffer.
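
In essence, the zero-copy path references a SharedBuffer::DataSegment from a CMBlockBuffer
through a custom block source whose FreeBlock callback drops the reference. The following is
a condensed sketch mirroring SegmentReader::ReadInto in the diff below; the helper name
wrapSegmentInBlockBuffer is illustrative, not the literal new code:

    // Sketch: wrap a SharedBuffer::DataSegment in a CMBlockBuffer without copying.
    // The block buffer keeps the segment alive via an extra ref that is released
    // from the custom block source's FreeBlock callback.
    static void freeDataSegment(void* refcon, void*, size_t)
    {
        static_cast<WebCore::SharedBuffer::DataSegment*>(refcon)->deref();
    }

    static RetainPtr<CMBlockBufferRef> wrapSegmentInBlockBuffer(WebCore::SharedBuffer::DataSegment& segment, size_t offset, size_t length)
    {
        CMBlockBufferCustomBlockSource blockSource;
        blockSource.version = 0;
        blockSource.AllocateBlock = nullptr;
        blockSource.FreeBlock = freeDataSegment;
        blockSource.refCon = &segment;
        segment.ref(); // Balanced by freeDataSegment when the block buffer is destroyed.

        CMBlockBufferRef rawBuffer = nullptr;
        if (PAL::CMBlockBufferCreateWithMemoryBlock(nullptr, const_cast<uint8_t*>(segment.data()), segment.size(), nullptr, &blockSource, offset, length, 0, &rawBuffer) != kCMBlockBufferNoErr) {
            segment.deref();
            return nullptr;
        }
        return adoptCF(rawBuffer);
    }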

Reviewed by Jer Noble.

There should be no observable change in behavior other than GPU process memory usage
being drastically reduced: from 700MB to just over 200MB when watching a 4K/60fps YouTube
video on an iMac Pro (which only has software VP9 decoding), and from 360MB down to 25MB on an iPad.
Existing tests fully exercise this new code.

* Modules/mediasource/SourceBuffer.cpp: Simplify logic around the m_pendingAppendData member.
Only one appendBuffer operation can be pending at any given time; otherwise appendBuffer
throws an exception. As such, there's no need to append the data to a vector: "there can
be only one".
(WebCore::SourceBuffer::abortIfUpdating):
(WebCore::SourceBuffer::appendBufferInternal):
(WebCore::SourceBuffer::appendBufferTimerFired):
(WebCore::SourceBuffer::reportExtraMemoryAllocated):
* Modules/mediasource/SourceBuffer.h:
* platform/SharedBuffer.cpp:
(WebCore::SharedBuffer::SharedBuffer):
(WebCore::SharedBuffer::create):
(WebCore::SharedBuffer::copyTo const):
(WebCore::SharedBuffer::DataSegment::data const):
(WebCore::SharedBuffer::DataSegment::size const):
* platform/SharedBuffer.h: Add a new DataSegment type that takes a Provider in its constructor.
A Provider supplies two Function members, data and size (see the sketch after this file list).
* platform/audio/cocoa/AudioFileReaderCocoa.cpp: AudioFileReaderCocoa required
the CMBlockBuffer containing the compressed content to be contiguous. This is no
longer guaranteed, so ensure that the CMBlockBuffer is made contiguous before use.
(WebCore::AudioFileReader::demuxWebMData const):
(WebCore::AudioFileReader::decodeWebMData const):
* platform/graphics/SourceBufferPrivate.h:
(WebCore::SourceBufferPrivate::append):
* platform/graphics/avfoundation/objc/SourceBufferParserAVFObjC.mm:
(WebCore::SourceBufferParserAVFObjC::appendData):
* platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.h:
* platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm:
(WebCore::SourceBufferPrivateAVFObjC::append):
* platform/graphics/cocoa/SourceBufferParser.cpp:
(WebCore::SourceBufferParser::Segment::Segment):
(WebCore::SourceBufferParser::Segment::size const):
(WebCore::SourceBufferParser::Segment::read const):
(WebCore::SourceBufferParser::Segment::takeSharedBuffer):
(WebCore::SourceBufferParser::Segment::getSharedBuffer const):
* platform/graphics/cocoa/SourceBufferParser.h:
* platform/graphics/cocoa/SourceBufferParserWebM.cpp:
(WebCore::SourceBufferParserWebM::SourceBufferParserWebM):
(WebCore::SourceBufferParserWebM::TrackData::contiguousCompleteBlockBuffer const):
(WebCore::SourceBufferParserWebM::TrackData::readFrameData):
(WebCore::SourceBufferParserWebM::VideoTrackData::consumeFrameData):
(WebCore::SourceBufferParserWebM::VideoTrackData::createSampleBuffer):
(WebCore::SourceBufferParserWebM::AudioTrackData::resetCompleted):
(WebCore::SourceBufferParserWebM::AudioTrackData::consumeFrameData):
(WebCore::SourceBufferParserWebM::AudioTrackData::createSampleBuffer):
(WebCore::SourceBufferParserWebM::flushPendingAudioBuffers):
* platform/graphics/cocoa/SourceBufferParserWebM.h:
(WebCore::SourceBufferParserWebM::TrackData::resetCompleted):
(WebCore::SourceBufferParserWebM::TrackData::reset):
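
The Provider mentioned in the platform/SharedBuffer.h entry above is what lets arbitrary,
externally owned memory back a SharedBuffer. A minimal usage sketch, where ExternalMemory is
a hypothetical ref-counted owner exposing data() and size():

    // Sketch: expose externally owned, immutable memory as a SharedBuffer without
    // copying. The lambdas capture the owner, so the memory stays alive for as
    // long as the SharedBuffer (or any DataSegment referencing it) does.
    static Ref<WebCore::SharedBuffer> wrapWithoutCopying(RefPtr<ExternalMemory> owner)
    {
        return WebCore::SharedBuffer::create(WebCore::SharedBuffer::DataSegment::Provider {
            [owner]() -> const uint8_t* { return static_cast<const uint8_t*>(owner->data()); },
            [owner]() -> size_t { return owner->size(); }
        });
    }

copyTo() and forEachSegment() then read through the provider's callbacks, which is what lets
the WebM parser above reference the appended bytes in place.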

Source/WebCore/PAL:

Reviewed by Jer Noble.

* pal/cf/CoreMediaSoftLink.cpp:
* pal/cf/CoreMediaSoftLink.h: Add required CoreMedia methods.

Source/WebKit:

Use SharedMemory to pass SourceBuffer content to the RemoteSourceBufferProxy in the GPU process.
This is done by wrapping a SharedMemory into a SharedBuffer.
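
The RemoteSourceBufferProxy/SourceBufferPrivateRemote hunks are not reproduced below, but the new
SharedMemory::createSharedBuffer presumably builds on the DataSegment::Provider added in WebCore,
roughly along these lines (a sketch under that assumption, not the literal implementation; the
free-function name is illustrative):

    // Sketch: wrap a mapped SharedMemory region in a SharedBuffer; the captured
    // RefPtr keeps the mapping alive until the SharedBuffer is destroyed.
    static Ref<WebCore::SharedBuffer> createSharedBufferFrom(RefPtr<WebKit::SharedMemory> memory, size_t dataSize)
    {
        return WebCore::SharedBuffer::create(WebCore::SharedBuffer::DataSegment::Provider {
            [memory]() -> const uint8_t* { return static_cast<const uint8_t*>(memory->data()); },
            [memory, dataSize]() -> size_t { return dataSize; }
        });
    }

Presumably the web process copies the appended bytes into a SharedMemory region whose handle is
sent over IPC, and the GPU process maps it and wraps it as above before handing it to the parser.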

Reviewed by Jer Noble.

* GPUProcess/media/RemoteSourceBufferProxy.cpp:
(WebKit::RemoteSourceBufferProxy::append):
* GPUProcess/media/RemoteSourceBufferProxy.h:
* GPUProcess/media/RemoteSourceBufferProxy.messages.in:
* Platform/SharedMemory.cpp:
(WebKit::SharedMemory::createSharedBuffer const):
* Platform/SharedMemory.h:
* WebProcess/GPU/media/SourceBufferPrivateRemote.cpp:
(WebKit::SourceBufferPrivateRemote::append):
* WebProcess/GPU/media/SourceBufferPrivateRemote.h:

Diff

Modified: trunk/Source/WebCore/ChangeLog (282864 => 282865)


--- trunk/Source/WebCore/ChangeLog	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/ChangeLog	2021-09-22 06:25:33 UTC (rev 282865)
@@ -1,3 +1,73 @@
+2021-09-21  Jean-Yves Avenard  <j...@apple.com>
+
+        Use SharedMemory for transferring appended buffers from SourceBuffer to the GPU process
+        https://bugs.webkit.org/show_bug.cgi?id=230329
+        rdar://problem/83291495
+
+        Use SharedBuffer instead of Vector to pass data to the SourceBuffer's related
+        classes (SourceBuffer, SourceBufferPrivate and SourceBufferParser).
+        Modify SourceBufferParserWebM to never perform memory allocation and copy
+        of the original data content. Instead, we use CMBlockBuffer objects that retain the
+        backing SharedBuffer and use offsets inside this SharedBuffer to reference the data.
+        SourceBufferParserAVFObjC requires little modification as a NSData can wrap a SharedBuffer.
+
+        Reviewed by Jer Noble.
+
+        There should be no change from an observable standpoint other than the GPU memory usage
+        being drastically reduced (from 700MB when watching a 4K/60fps YouTube video to just over 200MB
+        on an iMac Pro (which only has software VP9 decoding), 25MB vs 360MB on an iPad)
+        Existing tests are fully exercising this new code.
+
+        * Modules/mediasource/SourceBuffer.cpp: Simplify logic around m_pendingAppendData member.
+        Only one appendBuffer operation can be pending at any given time otherwise appendBuffer
+        will throw an exception; as such, there's no need to append the data to a vector: "there can
+        be only one".
+        (WebCore::SourceBuffer::abortIfUpdating):
+        (WebCore::SourceBuffer::appendBufferInternal):
+        (WebCore::SourceBuffer::appendBufferTimerFired):
+        (WebCore::SourceBuffer::reportExtraMemoryAllocated):
+        * Modules/mediasource/SourceBuffer.h:
+        * platform/SharedBuffer.cpp:
+        (WebCore::SharedBuffer::SharedBuffer):
+        (WebCore::SharedBuffer::create):
+        (WebCore::SharedBuffer::copyTo const):
+        (WebCore::SharedBuffer::DataSegment::data const):
+        (WebCore::SharedBuffer::DataSegment::size const):
+        * platform/SharedBuffer.h: Add new DataSegment type that takes a Provider in constructor.
+        A Provider provides two Function members data and size.
+        * platform/audio/cocoa/AudioFileReaderCocoa.cpp: The AudioFileReaderCocoa required
+        the CMBlockBuffer containing the compressed content to be contiguous. This is no
+        longer guaranteed so ensure that the CMBlockBuffer is contiguous.
+        (WebCore::AudioFileReader::demuxWebMData const):
+        (WebCore::AudioFileReader::decodeWebMData const):
+        * platform/graphics/SourceBufferPrivate.h:
+        (WebCore::SourceBufferPrivate::append):
+        * platform/graphics/avfoundation/objc/SourceBufferParserAVFObjC.mm:
+        (WebCore::SourceBufferParserAVFObjC::appendData):
+        * platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.h:
+        * platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm:
+        (WebCore::SourceBufferPrivateAVFObjC::append):
+        * platform/graphics/cocoa/SourceBufferParser.cpp:
+        (WebCore::SourceBufferParser::Segment::Segment):
+        (WebCore::SourceBufferParser::Segment::size const):
+        (WebCore::SourceBufferParser::Segment::read const):
+        (WebCore::SourceBufferParser::Segment::takeSharedBuffer):
+        (WebCore::SourceBufferParser::Segment::getSharedBuffer const):
+        * platform/graphics/cocoa/SourceBufferParser.h:
+        * platform/graphics/cocoa/SourceBufferParserWebM.cpp:
+        (WebCore::SourceBufferParserWebM::SourceBufferParserWebM):
+        (WebCore::SourceBufferParserWebM::TrackData::contiguousCompleteBlockBuffer const):
+        (WebCore::SourceBufferParserWebM::TrackData::readFrameData):
+        (WebCore::SourceBufferParserWebM::VideoTrackData::consumeFrameData):
+        (WebCore::SourceBufferParserWebM::VideoTrackData::createSampleBuffer):
+        (WebCore::SourceBufferParserWebM::AudioTrackData::resetCompleted):
+        (WebCore::SourceBufferParserWebM::AudioTrackData::consumeFrameData):
+        (WebCore::SourceBufferParserWebM::AudioTrackData::createSampleBuffer):
+        (WebCore::SourceBufferParserWebM::flushPendingAudioBuffers):
+        * platform/graphics/cocoa/SourceBufferParserWebM.h:
+        (WebCore::SourceBufferParserWebM::TrackData::resetCompleted):
+        (WebCore::SourceBufferParserWebM::TrackData::reset):
+
 2021-09-21  Alexey Shvayka  <shvaikal...@gmail.com>
 
         [WebIDL] DOM constructors should extend InternalFunction

Modified: trunk/Source/WebCore/Modules/mediasource/SourceBuffer.cpp (282864 => 282865)


--- trunk/Source/WebCore/Modules/mediasource/SourceBuffer.cpp	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/Modules/mediasource/SourceBuffer.cpp	2021-09-22 06:25:33 UTC (rev 282865)
@@ -44,6 +44,7 @@
 #include "Logging.h"
 #include "MediaDescription.h"
 #include "MediaSource.h"
+#include "SharedBuffer.h"
 #include "SourceBufferList.h"
 #include "SourceBufferPrivate.h"
 #include "TextTrackList.h"
@@ -394,7 +395,7 @@
 
     // 4.1. Abort the buffer append algorithm if it is running.
     m_appendBufferTimer.stop();
-    m_pendingAppendData.clear();
+    m_pendingAppendData = nullptr;
     m_private->abort();
 
     // 4.2. Set the updating attribute to false.
@@ -494,7 +495,8 @@
 
     // NOTE: Return to 3.2 appendBuffer()
     // 3. Add data to the end of the input buffer.
-    m_pendingAppendData.append(data, size);
+    ASSERT(!m_pendingAppendData);
+    m_pendingAppendData = SharedBuffer::create(data, size);
 
     // 4. Set the updating attribute to true.
     m_updating = true;
@@ -524,18 +526,14 @@
     // https://dvcs.w3.org/hg/html-media/raw-file/tip/media-source/media-source.html#sourcebuffer-segment-parser-loop
     // When the segment parser loop algorithm is invoked, run the following steps:
 
+    RefPtr<SharedBuffer> appendData = WTFMove(m_pendingAppendData);
     // 1. Loop Top: If the input buffer is empty, then jump to the need more data step below.
-    if (!m_pendingAppendData.size()) {
+    if (!appendData || !appendData->size()) {
         sourceBufferPrivateAppendComplete(AppendResult::AppendSucceeded);
         return;
     }
 
-    // Manually clear out the m_pendingAppendData Vector, in case the platform implementation
-    // rejects appending the buffer for whatever reason.
-    // FIXME: The implementation should guarantee the move from this Vector, and we should
-    // assert here to confirm that. See https://bugs.webkit.org/show_bug.cgi?id=178003.
-    m_private->append(WTFMove(m_pendingAppendData));
-    m_pendingAppendData.clear();
+    m_private->append(appendData.releaseNonNull());
 }
 
 void SourceBuffer::sourceBufferPrivateAppendComplete(AppendResult result)
@@ -1211,7 +1209,10 @@
 
 void SourceBuffer::reportExtraMemoryAllocated(uint64_t extraMemory)
 {
-    uint64_t extraMemoryCost = m_pendingAppendData.capacity() + extraMemory;
+    uint64_t extraMemoryCost = extraMemory;
+    if (m_pendingAppendData)
+        extraMemoryCost += m_pendingAppendData->size();
+
     if (extraMemoryCost <= m_reportedExtraMemoryCost)
         return;
 

Modified: trunk/Source/WebCore/Modules/mediasource/SourceBuffer.h (282864 => 282865)


--- trunk/Source/WebCore/Modules/mediasource/SourceBuffer.h	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/Modules/mediasource/SourceBuffer.h	2021-09-22 06:25:33 UTC (rev 282865)
@@ -221,7 +221,7 @@
 
     WTF::Observer<void*()> m_opaqueRootProvider;
 
-    Vector<unsigned char> m_pendingAppendData;
+    RefPtr<SharedBuffer> m_pendingAppendData;
     Timer m_appendBufferTimer;
 
     RefPtr<VideoTrackList> m_videoTracks;

Modified: trunk/Source/WebCore/PAL/ChangeLog (282864 => 282865)


--- trunk/Source/WebCore/PAL/ChangeLog	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/PAL/ChangeLog	2021-09-22 06:25:33 UTC (rev 282865)
@@ -1,3 +1,14 @@
+2021-09-21  Jean-Yves Avenard  <j...@apple.com>
+
+        Use SharedMemory for transferring appended buffers from SourceBuffer to the GPU process
+        https://bugs.webkit.org/show_bug.cgi?id=230329
+        rdar://problem/83291495
+
+        Reviewed by Jer Noble.
+
+        * pal/cf/CoreMediaSoftLink.cpp:
+        * pal/cf/CoreMediaSoftLink.h: Add required CoreMedia methods.
+
 2021-09-21  Per Arne Vollan  <pvol...@apple.com>
 
         [Mac Catalyst] Fix build issue

Modified: trunk/Source/WebCore/PAL/pal/cf/CoreMediaSoftLink.cpp (282864 => 282865)


--- trunk/Source/WebCore/PAL/pal/cf/CoreMediaSoftLink.cpp	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/PAL/pal/cf/CoreMediaSoftLink.cpp	2021-09-22 06:25:33 UTC (rev 282865)
@@ -113,6 +113,8 @@
 
 #if PLATFORM(COCOA)
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBlockBufferCreateContiguous, OSStatus, (CFAllocatorRef structureAllocator, CMBlockBufferRef sourceBuffer, CFAllocatorRef blockAllocator, const CMBlockBufferCustomBlockSource* customBlockSource, size_t offsetToData, size_t dataLength, CMBlockBufferFlags flags, CMBlockBufferRef* blockBufferOut), (structureAllocator, sourceBuffer, blockAllocator, customBlockSource, offsetToData, dataLength, flags, blockBufferOut), PAL_EXPORT)
+SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBlockBufferAppendBufferReference, OSStatus, (CMBlockBufferRef theBuffer, CMBlockBufferRef targetBBuf, size_t offsetToData, size_t dataLength, CMBlockBufferFlags flags), (theBuffer, targetBBuf, offsetToData, dataLength, flags), PAL_EXPORT)
+SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMBlockBufferCreateEmpty, OSStatus, (CFAllocatorRef structureAllocator, uint32_t subBlockCapacity, CMBlockBufferFlags flags, CMBlockBufferRef* blockBufferOut), (structureAllocator, subBlockCapacity, flags, blockBufferOut), PAL_EXPORT)
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMFormatDescriptionGetMediaSubType, FourCharCode, (CMFormatDescriptionRef desc), (desc), PAL_EXPORT)
 SOFT_LINK_FUNCTION_FOR_SOURCE_WITH_EXPORT(PAL, CoreMedia, CMFormatDescriptionGetMediaType, CMMediaType, (CMFormatDescriptionRef desc), (desc), PAL_EXPORT)
 

Modified: trunk/Source/WebCore/PAL/pal/cf/CoreMediaSoftLink.h (282864 => 282865)


--- trunk/Source/WebCore/PAL/pal/cf/CoreMediaSoftLink.h	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/PAL/pal/cf/CoreMediaSoftLink.h	2021-09-22 06:25:33 UTC (rev 282865)
@@ -175,6 +175,10 @@
 
 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBlockBufferCreateContiguous, OSStatus, (CFAllocatorRef structureAllocator, CMBlockBufferRef sourceBuffer, CFAllocatorRef blockAllocator, const CMBlockBufferCustomBlockSource* customBlockSource, size_t offsetToData, size_t dataLength, CMBlockBufferFlags flags, CMBlockBufferRef* blockBufferOut), (structureAllocator, sourceBuffer, blockAllocator, customBlockSource, offsetToData, dataLength, flags, blockBufferOut))
 #define CMBlockBufferCreateContiguous softLink_CoreMedia_CMBlockBufferCreateContiguous
+SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBlockBufferAppendBufferReference, OSStatus, (CMBlockBufferRef theBuffer, CMBlockBufferRef targetBBuf, size_t offsetToData, size_t dataLength, CMBlockBufferFlags flags), (theBuffer, targetBBuf, offsetToData, dataLength, flags))
+#define CMBlockBufferAppendBufferReference softLink_CoreMedia_CMBlockBufferAppendBufferReference
+SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMBlockBufferCreateEmpty, OSStatus, (CFAllocatorRef structureAllocator, uint32_t subBlockCapacity, CMBlockBufferFlags flags, CMBlockBufferRef* blockBufferOut), (structureAllocator, subBlockCapacity, flags, blockBufferOut))
+#define CMBlockBufferCreateEmpty softLink_CoreMedia_CMBlockBufferCreateEmpty
 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMFormatDescriptionGetMediaSubType, FourCharCode, (CMFormatDescriptionRef desc), (desc))
 #define CMFormatDescriptionGetMediaSubType softLink_CoreMedia_CMFormatDescriptionGetMediaSubType
 SOFT_LINK_FUNCTION_FOR_HEADER(PAL, CoreMedia, CMFormatDescriptionGetMediaType, CMMediaType, (CMFormatDescriptionRef desc), (desc))

Modified: trunk/Source/WebCore/platform/SharedBuffer.cpp (282864 => 282865)


--- trunk/Source/WebCore/platform/SharedBuffer.cpp	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/platform/SharedBuffer.cpp	2021-09-22 06:25:33 UTC (rev 282865)
@@ -52,6 +52,12 @@
     m_segments.append({0, DataSegment::create(WTFMove(fileData))});
 }
 
+SharedBuffer::SharedBuffer(DataSegment::Provider&& provider)
+    : m_size(provider.size())
+{
+    m_segments.append({0, DataSegment::create(WTFMove(provider))});
+}
+
 SharedBuffer::SharedBuffer(Vector<uint8_t>&& data)
 {
     append(WTFMove(data));
@@ -236,6 +242,11 @@
     return clone;
 }
 
+Ref<SharedBuffer> SharedBuffer::create(DataSegment::Provider&& provider)
+{
+    return adoptRef(*new SharedBuffer(WTFMove(provider)));
+}
+
 void SharedBuffer::forEachSegment(const Function<void(const Span<const uint8_t>&)>& apply) const
 {
     auto segments = m_segments;
@@ -267,12 +278,40 @@
 
 void SharedBuffer::copyTo(void* destination, size_t length) const
 {
-    ASSERT(length <= size());
+    return copyTo(destination, 0, length);
+}
+
+void SharedBuffer::copyTo(void* destination, size_t offset, size_t length) const
+{
+    ASSERT(length + offset <= size());
+    if (offset >= size())
+        return;
+    auto remaining = std::min(length, size() - offset);
+    if (!remaining)
+        return;
+
+    auto segment = begin();
+    if (offset >= segment->segment->size()) {
+        auto comparator = [](const size_t& position, const DataSegmentVectorEntry& entry) {
+            return position < entry.beginPosition;
+        };
+        segment = std::upper_bound(segment, end(), offset, comparator);
+        segment--; // std::upper_bound gives a pointer to the segment that is greater than offset. We want the segment just before that.
+    }
     auto destinationPtr = static_cast<uint8_t*>(destination);
-    auto remaining = std::min(length, size());
-    for (auto& segment : m_segments) {
-        size_t amountToCopyThisTime = std::min(remaining, segment.segment->size());
-        memcpy(destinationPtr, segment.segment->data(), amountToCopyThisTime);
+
+    size_t positionInSegment = offset - segment->beginPosition;
+    size_t amountToCopyThisTime = std::min(remaining, segment->segment->size() - positionInSegment);
+    memcpy(destinationPtr, segment->segment->data() + positionInSegment, amountToCopyThisTime);
+    remaining -= amountToCopyThisTime;
+    if (!remaining)
+        return;
+    destinationPtr += amountToCopyThisTime;
+
+    // If we reach here, there must be at least another segment available as we have content left to be fetched.
+    for (++segment; segment != end(); ++segment) {
+        size_t amountToCopyThisTime = std::min(remaining, segment->segment->size());
+        memcpy(destinationPtr, segment->segment->data(), amountToCopyThisTime);
         remaining -= amountToCopyThisTime;
         if (!remaining)
             return;
@@ -312,7 +351,8 @@
 #if USE(GSTREAMER)
         [](const RefPtr<GstMappedOwnedBuffer>& data) { return data->data(); },
 #endif
-        [](const FileSystem::MappedFileData& data) { return static_cast<const uint8_t*>(data.data()); }
+        [](const FileSystem::MappedFileData& data) { return static_cast<const uint8_t*>(data.data()); },
+        [](const Provider& provider) { return provider.data(); }
     );
     return WTF::visit(visitor, m_immutableData);
 }
@@ -395,7 +435,8 @@
 #if USE(GSTREAMER)
         [](const RefPtr<GstMappedOwnedBuffer>& data) { return data->size(); },
 #endif
-        [](const FileSystem::MappedFileData& data) { return data.size(); }
+        [](const FileSystem::MappedFileData& data) { return data.size(); },
+        [](const Provider& provider) { return provider.size(); }
     );
     return WTF::visit(visitor, m_immutableData);
 }

Modified: trunk/Source/WebCore/platform/SharedBuffer.h (282864 => 282865)


--- trunk/Source/WebCore/platform/SharedBuffer.h	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/platform/SharedBuffer.h	2021-09-22 06:25:33 UTC (rev 282865)
@@ -29,6 +29,7 @@
 #include <JavaScriptCore/ArrayBuffer.h>
 #include <wtf/FileSystem.h>
 #include <wtf/Forward.h>
+#include <wtf/Function.h>
 #include <wtf/RefCounted.h>
 #include <wtf/Span.h>
 #include <wtf/ThreadSafeRefCounted.h>
@@ -123,6 +124,7 @@
 
     Ref<SharedBuffer> copy() const;
     void copyTo(void* destination, size_t length) const;
+    void copyTo(void* destination, size_t offset, size_t length) const;
 
     // Data wrapped by a DataSegment should be immutable because it can be referenced by other objects.
     // To modify or combine the data, allocate a new DataSegment.
@@ -148,6 +150,13 @@
 #endif
         static Ref<DataSegment> create(FileSystem::MappedFileData&& data) { return adoptRef(*new DataSegment(WTFMove(data))); }
 
+        struct Provider {
+            WTF::Function<const uint8_t*()> data;
+            WTF::Function<size_t()> size;
+        };
+
+        static Ref<DataSegment> create(Provider&& provider) { return adoptRef(*new DataSegment(WTFMove(provider))); }
+
 #if USE(FOUNDATION)
         RetainPtr<NSData> createNSData() const;
 #endif
@@ -171,6 +180,8 @@
 #endif
         DataSegment(FileSystem::MappedFileData&& data)
             : m_immutableData(WTFMove(data)) { }
+        DataSegment(Provider&& provider)
+            : m_immutableData(WTFMove(provider)) { }
 
         Variant<Vector<uint8_t>,
 #if USE(CF)
@@ -182,10 +193,13 @@
 #if USE(GSTREAMER)
             RefPtr<GstMappedOwnedBuffer>,
 #endif
-            FileSystem::MappedFileData> m_immutableData;
+            FileSystem::MappedFileData,
+            Provider> m_immutableData;
         friend class SharedBuffer;
     };
 
+    static Ref<SharedBuffer> create(DataSegment::Provider&&);
+
     void forEachSegment(const Function<void(const Span<const uint8_t>&)>&) const;
     bool startsWith(const Span<const uint8_t>& prefix) const;
 
@@ -216,6 +230,7 @@
     explicit SharedBuffer(const char*, size_t);
     explicit SharedBuffer(Vector<uint8_t>&&);
     explicit SharedBuffer(FileSystem::MappedFileData&&);
+    explicit SharedBuffer(DataSegment::Provider&&);
 #if USE(CF)
     explicit SharedBuffer(CFDataRef);
 #endif

Modified: trunk/Source/WebCore/platform/audio/cocoa/AudioFileReaderCocoa.cpp (282864 => 282865)


--- trunk/Source/WebCore/platform/audio/cocoa/AudioFileReaderCocoa.cpp	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/platform/audio/cocoa/AudioFileReaderCocoa.cpp	2021-09-22 06:25:33 UTC (rev 282865)
@@ -40,6 +40,7 @@
 #include "InbandTextTrackPrivate.h"
 #include "Logging.h"
 #include "MediaSampleAVFObjC.h"
+#include "SharedBuffer.h"
 #include "VideoTrackPrivate.h"
 #include "WebMAudioUtilitiesCocoa.h"
 #include <AudioToolbox/AudioConverter.h>
@@ -129,6 +130,7 @@
     WTF_MAKE_FAST_ALLOCATED;
 
 public:
+    Ref<SharedBuffer> m_buffer;
 #if ENABLE(MEDIA_SOURCE)
     Ref<AudioTrackPrivateWebM> m_track;
 #endif
@@ -180,6 +182,7 @@
 
 std::unique_ptr<AudioFileReaderWebMData> AudioFileReader::demuxWebMData(const uint8_t* data, size_t dataSize) const
 {
+    auto buffer = SharedBuffer::create(data, dataSize);
     auto parser = adoptRef(new SourceBufferParserWebM());
     bool error = false;
     std::optional<uint64_t> audioTrackId;
@@ -212,12 +215,12 @@
             return;
         track->setDiscardPadding(discardPadding);
     });
-    SourceBufferParser::Segment segment({ data, dataSize });
+    SourceBufferParser::Segment segment(makeRef(buffer.get()));
     parser->appendData(WTFMove(segment));
     if (!track)
         return nullptr;
     parser->flushPendingAudioBuffers();
-    return makeUnique<AudioFileReaderWebMData>(AudioFileReaderWebMData { track.releaseNonNull(), WTFMove(duration), WTFMove(samples) });
+    return makeUnique<AudioFileReaderWebMData>(AudioFileReaderWebMData { WTFMove(buffer), track.releaseNonNull(), WTFMove(duration), WTFMove(samples) });
 }
 
 struct PassthroughUserData {
@@ -342,16 +345,21 @@
     for (size_t i = 0; i < m_webmData->m_samples.size(); i++) {
         auto& sample = m_webmData->m_samples[i];
         CMSampleBufferRef sampleBuffer = sample->sampleBuffer();
-        auto buffer = PAL::CMSampleBufferGetDataBuffer(sampleBuffer);
-        ASSERT(PAL::CMBlockBufferIsRangeContiguous(buffer, 0, 0));
-        if (!PAL::CMBlockBufferIsRangeContiguous(buffer, 0, 0)) {
-            RELEASE_LOG_FAULT(WebAudio, "Unable to read sample content (not contiguous)");
-            return { };
+        auto rawBuffer = PAL::CMSampleBufferGetDataBuffer(sampleBuffer);
+        RetainPtr<CMBlockBufferRef> buffer = rawBuffer;
+        // Make sure block buffer is contiguous.
+        if (!PAL::CMBlockBufferIsRangeContiguous(rawBuffer, 0, 0)) {
+            CMBlockBufferRef contiguousBuffer = nullptr;
+            if (PAL::CMBlockBufferCreateContiguous(nullptr, rawBuffer, nullptr, nullptr, 0, 0, 0, &contiguousBuffer) != kCMBlockBufferNoErr) {
+                RELEASE_LOG_FAULT(WebAudio, "failed to create contiguous block buffer");
+                return { };
+            }
+            buffer = adoptCF(contiguousBuffer);
         }
 
-        size_t srcSize = PAL::CMBlockBufferGetDataLength(buffer);
+        size_t srcSize = PAL::CMBlockBufferGetDataLength(buffer.get());
         char* srcData = nullptr;
-        if (PAL::CMBlockBufferGetDataPointer(buffer, 0, nullptr, nullptr, &srcData) != noErr) {
+        if (PAL::CMBlockBufferGetDataPointer(buffer.get(), 0, nullptr, nullptr, &srcData) != noErr) {
             RELEASE_LOG_FAULT(WebAudio, "Unable to retrieve data");
             return { };
         }

Modified: trunk/Source/WebCore/platform/graphics/SourceBufferPrivate.cpp (282864 => 282865)


--- trunk/Source/WebCore/platform/graphics/SourceBufferPrivate.cpp	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/platform/graphics/SourceBufferPrivate.cpp	2021-09-22 06:25:33 UTC (rev 282865)
@@ -33,6 +33,7 @@
 #include "MediaSample.h"
 #include "PlatformTimeRanges.h"
 #include "SampleMap.h"
+#include "SharedBuffer.h"
 #include "SourceBufferPrivateClient.h"
 #include "TimeRanges.h"
 #include <wtf/CheckedArithmetic.h>
@@ -1315,6 +1316,16 @@
     updateHighestPresentationTimestamp();
 }
 
+void SourceBufferPrivate::append(Ref<SharedBuffer>&& buffer)
+{
+    append(buffer->extractData());
+}
+
+void SourceBufferPrivate::append(Vector<unsigned char>&&)
+{
+    RELEASE_ASSERT_NOT_REACHED();
+}
+
 } // namespace WebCore
 
 #endif

Modified: trunk/Source/WebCore/platform/graphics/SourceBufferPrivate.h (282864 => 282865)


--- trunk/Source/WebCore/platform/graphics/SourceBufferPrivate.h	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/platform/graphics/SourceBufferPrivate.h	2021-09-22 06:25:33 UTC (rev 282865)
@@ -48,6 +48,7 @@
 
 namespace WebCore {
 
+class SharedBuffer;
 class TimeRanges;
 
 enum class SourceBufferAppendMode : uint8_t {
@@ -66,7 +67,7 @@
     WEBCORE_EXPORT virtual ~SourceBufferPrivate();
 
     virtual void setActive(bool) = 0;
-    virtual void append(Vector<unsigned char>&&) = 0;
+    WEBCORE_EXPORT virtual void append(Ref<SharedBuffer>&&);
     virtual void abort() = 0;
     virtual void resetParserState() = 0;
     virtual void removedFromMediaSource() = 0;
@@ -147,6 +148,8 @@
 #endif
 
 protected:
+    // The following method should never be called directly and be overridden instead.
+    WEBCORE_EXPORT virtual void append(Vector<unsigned char>&&);
     virtual MediaTime timeFudgeFactor() const { return {2002, 24000}; }
     virtual bool isActive() const { return false; }
     virtual bool isSeeking() const { return false; }

Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferParserAVFObjC.mm (282864 => 282865)


--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferParserAVFObjC.mm	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferParserAVFObjC.mm	2021-09-22 06:25:33 UTC (rev 282865)
@@ -40,6 +40,7 @@
 #import "MediaSampleAVFObjC.h"
 #import "NotImplemented.h"
 #import "SharedBuffer.h"
+#import "SourceBufferPrivate.h"
 #import "TimeRanges.h"
 #import "VideoTrackPrivateMediaSourceAVFObjC.h"
 #import <AVFoundation/AVAssetTrack.h>
@@ -211,8 +212,8 @@
 
 void SourceBufferParserAVFObjC::appendData(Segment&& segment, CompletionHandler<void()>&& completionHandler, AppendFlags flags)
 {
-    auto sharedData = SharedBuffer::create(segment.takeVector());
-    auto nsData = sharedData->createNSData();
+    auto sharedBuffer = segment.takeSharedBuffer();
+    auto nsData = sharedBuffer->createNSData();
     if (m_parserStateWasReset || flags == AppendFlags::Discontinuity)
         [m_parser appendStreamData:nsData.get() withFlags:AVStreamDataParserStreamDataDiscontinuity];
     else

Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.h (282864 => 282865)


--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.h	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.h	2021-09-22 06:25:33 UTC (rev 282865)
@@ -149,7 +149,7 @@
     void didProvideMediaDataForTrackId(Ref<MediaSample>&&, uint64_t trackId, const String& mediaType);
 
     // SourceBufferPrivate overrides
-    void append(Vector<unsigned char>&&) final;
+    void append(Ref<SharedBuffer>&&) final;
     void abort() final;
     void resetParserState() final;
     void removedFromMediaSource() final;

Modified: trunk/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm (282864 => 282865)


--- trunk/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/platform/graphics/avfoundation/objc/SourceBufferPrivateAVFObjC.mm	2021-09-22 06:25:33 UTC (rev 282865)
@@ -572,9 +572,9 @@
     return globalQueue;
 }
 
-void SourceBufferPrivateAVFObjC::append(Vector<unsigned char>&& data)
+void SourceBufferPrivateAVFObjC::append(Ref<SharedBuffer>&& data)
 {
-    ALWAYS_LOG(LOGIDENTIFIER, "data length = ", data.size());
+    ALWAYS_LOG(LOGIDENTIFIER, "data length = ", data->size());
 
     ASSERT(!m_hasSessionSemaphore);
     ASSERT(!m_abortSemaphore);

Modified: trunk/Source/WebCore/platform/graphics/cocoa/SourceBufferParser.cpp (282864 => 282865)


--- trunk/Source/WebCore/platform/graphics/cocoa/SourceBufferParser.cpp	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/platform/graphics/cocoa/SourceBufferParser.cpp	2021-09-22 06:25:33 UTC (rev 282865)
@@ -29,6 +29,7 @@
 #if ENABLE(MEDIA_SOURCE)
 
 #include "ContentType.h"
+#include "SharedBuffer.h"
 #include "SourceBufferParserAVFObjC.h"
 #include "SourceBufferParserWebM.h"
 #include <pal/cocoa/MediaToolboxSoftLink.h>
@@ -77,8 +78,8 @@
 {
 }
 
-SourceBufferParser::Segment::Segment(Vector<uint8_t>&& segment)
-    : m_segment(WTFMove(segment))
+SourceBufferParser::Segment::Segment(Ref<SharedBuffer>&& buffer)
+    : m_segment(WTFMove(buffer))
 {
 }
 
@@ -98,9 +99,9 @@
             return clampTo<size_t>(MTPluginByteSourceGetLength(byteSource.get()));
         },
 #endif
-        [](const Vector<uint8_t>& vector)
+        [](const Ref<SharedBuffer>& buffer)
         {
-            return vector.size();
+            return buffer->size();
         }
     );
 }
@@ -119,15 +120,15 @@
             return sizeRead;
         },
 #endif
-        [&](const Vector<uint8_t>& vector)
+        [&](const Ref<SharedBuffer>& buffer)
         {
-            memcpy(destination, vector.data() + position, sizeToRead);
+            buffer->copyTo(destination, position, sizeToRead);
             return sizeToRead;
         }
     );
 }
 
-Vector<uint8_t> SourceBufferParser::Segment::takeVector()
+Ref<SharedBuffer> SourceBufferParser::Segment::takeSharedBuffer()
 {
     return WTF::switchOn(m_segment,
 #if HAVE(MT_PLUGIN_FORMAT_READER)
@@ -135,16 +136,32 @@
         {
             Vector<uint8_t> vector(size());
             vector.shrink(read(0, vector.size(), vector.data()));
-            return vector;
+            return SharedBuffer::create(WTFMove(vector));
         },
 #endif
-        [](Vector<uint8_t>& vector)
+        [&](Ref<SharedBuffer>& buffer)
         {
-            return std::exchange(vector, { });
+            return std::exchange(buffer, SharedBuffer::create());
         }
     );
 }
 
+RefPtr<SharedBuffer> SourceBufferParser::Segment::getSharedBuffer() const
+{
+    return WTF::switchOn(m_segment,
+#if HAVE(MT_PLUGIN_FORMAT_READER)
+        [&](const RetainPtr<MTPluginByteSourceRef>&) -> RefPtr<SharedBuffer>
+        {
+            return nullptr;
+        },
+#endif
+        [&](const Ref<SharedBuffer>& buffer) -> RefPtr<SharedBuffer>
+        {
+            return buffer.ptr();
+        }
+    );
+}
+
 } // namespace WebCore
 
 #endif // ENABLE(MEDIA_SOURCE)

Modified: trunk/Source/WebCore/platform/graphics/cocoa/SourceBufferParser.h (282864 => 282865)


--- trunk/Source/WebCore/platform/graphics/cocoa/SourceBufferParser.h	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/platform/graphics/cocoa/SourceBufferParser.h	2021-09-22 06:25:33 UTC (rev 282865)
@@ -43,6 +43,7 @@
 
 class ContentType;
 class MediaSample;
+class SharedBuffer;
 
 class WEBCORE_EXPORT SourceBufferParser : public ThreadSafeRefCounted<SourceBufferParser> {
 public:
@@ -66,9 +67,11 @@
 #if HAVE(MT_PLUGIN_FORMAT_READER)
         Segment(RetainPtr<MTPluginByteSourceRef>&&);
 #endif
-        Segment(Vector<uint8_t>&&);
+        Segment(Ref<SharedBuffer>&&);
         Segment(Segment&&) = default;
-        Vector<uint8_t> takeVector();
+        Ref<SharedBuffer> takeSharedBuffer();
+        // Will return nullptr if Segment's backend isn't a SharedBuffer.
+        RefPtr<SharedBuffer> getSharedBuffer() const;
 
         size_t size() const;
         size_t read(size_t position, size_t, uint8_t* destination) const;
@@ -78,7 +81,7 @@
 #if HAVE(MT_PLUGIN_FORMAT_READER)
             RetainPtr<MTPluginByteSourceRef>,
 #endif
-            Vector<uint8_t>
+            Ref<SharedBuffer>
         > m_segment;
     };
 

Modified: trunk/Source/WebCore/platform/graphics/cocoa/SourceBufferParserWebM.cpp (282864 => 282865)


--- trunk/Source/WebCore/platform/graphics/cocoa/SourceBufferParserWebM.cpp	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/platform/graphics/cocoa/SourceBufferParserWebM.cpp	2021-09-22 06:25:33 UTC (rev 282865)
@@ -273,7 +273,7 @@
 
 using namespace webm;
 
-class SourceBufferParserWebM::StreamingVectorReader final : public webm::Reader {
+class SourceBufferParserWebM::SegmentReader final : public webm::Reader {
     WTF_MAKE_FAST_ALLOCATED;
 public:
     void appendSegment(Segment&& segment)
@@ -354,6 +354,84 @@
         return Status(Status::kWouldBlock);
     }
 
+    static void FreeDataSegment(void* refcon, void*, size_t)
+    {
+        auto* buffer = reinterpret_cast<SharedBuffer::DataSegment*>(refcon);
+        buffer->deref();
+    }
+
+    Status ReadInto(std::size_t numToRead, CMBlockBufferRef outputBuffer, uint64_t* numActuallyRead)
+    {
+        ASSERT(outputBuffer && numActuallyRead);
+        if (!numActuallyRead)
+            return Status(Status::kNotEnoughMemory);
+
+        *numActuallyRead = 0;
+        if (!outputBuffer)
+            return Status(Status::kNotEnoughMemory);
+
+        while (numToRead && m_currentSegment != m_data.end()) {
+            auto& currentSegment = *m_currentSegment;
+
+            if (m_positionWithinSegment >= currentSegment.size()) {
+                advanceToNextSegment();
+                continue;
+            }
+            RefPtr<SharedBuffer> sharedBuffer = currentSegment.getSharedBuffer();
+            CMBlockBufferRef rawBlockBuffer = nullptr;
+            uint64_t lastRead = 0;
+            size_t destinationOffset = m_positionWithinSegment;
+            if (!sharedBuffer) {
+                // We could potentially allocate more memory than needed if the read is partial.
+                auto err = PAL::CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, nullptr, numToRead, kCFAllocatorDefault, nullptr, 0, numToRead, kCMBlockBufferAssureMemoryNowFlag, &rawBlockBuffer);
+                if (err != kCMBlockBufferNoErr)
+                    return Status(Status::kNotEnoughMemory);
+                uint8_t* blockBufferData = nullptr;
+                size_t segmentSizeAtPosition = 0;
+                err = PAL::CMBlockBufferGetDataPointer(rawBlockBuffer, 0, &segmentSizeAtPosition, nullptr, (char**)&blockBufferData);
+                if (err != kCMBlockBufferNoErr)
+                    return Status(Status::kNotEnoughMemory);
+                lastRead = currentSegment.read(m_positionWithinSegment, numToRead, blockBufferData);
+                destinationOffset = 0;
+            } else {
+                ASSERT(sharedBuffer->hasOneSegment(), "Can only deal with sharedBuffer containing a single DataSegment");
+                // A SharedBuffer doesn't have thread-safe refcounting, as such we must keep a reference to the DataSegment instead.
+                // TODO: could we only create a new CMBlockBuffer if the backend memory changed since the previous one?
+                auto firstSegment = sharedBuffer->begin()->segment;
+                size_t canRead = std::min<size_t>(numToRead, firstSegment->size() - m_positionWithinSegment);
+                // From CMBlockBufferCustomBlockSource documentation:
+                // Note that for 64-bit architectures, this struct contains misaligned function pointers.
+                // To avoid link-time issues, it is recommended that clients fill CMBlockBufferCustomBlockSource's function pointer fields
+                // by using assignment statements, rather than declaring them as global or static structs.
+                CMBlockBufferCustomBlockSource allocator;
+                allocator.version = 0;
+                allocator.AllocateBlock = nullptr;
+                allocator.FreeBlock = FreeDataSegment;
+                allocator.refCon = firstSegment.ptr();
+                firstSegment->ref();
+                auto err = PAL::CMBlockBufferCreateWithMemoryBlock(nullptr, static_cast<void*>(const_cast<uint8_t*>(firstSegment->data())), firstSegment->size(), nullptr, &allocator, m_positionWithinSegment, canRead, 0, &rawBlockBuffer);
+                if (err != kCMBlockBufferNoErr)
+                    return Status(Status::kNotEnoughMemory);
+                lastRead = canRead;
+            }
+            auto blockBuffer = adoptCF(rawBlockBuffer);
+            auto err = PAL::CMBlockBufferAppendBufferReference(outputBuffer, rawBlockBuffer, 0, 0, 0);
+            if (err != kCMBlockBufferNoErr)
+                return Status(Status::kNotEnoughMemory);
+            m_position += lastRead;
+            *numActuallyRead += lastRead;
+            m_positionWithinSegment += lastRead;
+            numToRead -= lastRead;
+            if (m_positionWithinSegment == currentSegment.size())
+                advanceToNextSegment();
+        }
+        if (!numToRead)
+            return Status(Status::kOkCompleted);
+        if (*numActuallyRead)
+            return Status(Status::kOkPartial);
+        return Status(Status::kWouldBlock);
+    }
+
     uint64_t Position() const final { return m_position; }
 
     void reset()
@@ -595,7 +673,7 @@
 }
 
 SourceBufferParserWebM::SourceBufferParserWebM()
-    : m_reader(WTF::makeUniqueRef<StreamingVectorReader>())
+    : m_reader(WTF::makeUniqueRef<SegmentReader>())
 {
     if (isWebmParserAvailable())
         m_parser = WTF::makeUniqueWithoutFastMallocCheck<WebmParser>();
@@ -1099,66 +1177,43 @@
 
 #define PARSER_LOG_ERROR_IF_POSSIBLE(...) if (parser().loggerPtr()) parser().loggerPtr()->error(logChannel(), WTF::Logger::LogSiteIdentifier(logClassName(), __func__, parser().logIdentifier()), __VA_ARGS__)
 
-#if ENABLE(VP9)
-void SourceBufferParserWebM::VideoTrackData::reset()
+RetainPtr<CMBlockBufferRef> SourceBufferParserWebM::TrackData::contiguousCompleteBlockBuffer(size_t offset, size_t length) const
 {
-    m_currentBlockBuffer = nullptr;
-    TrackData::reset();
+    if (!offset && !length && PAL::CMBlockBufferIsRangeContiguous(m_completeBlockBuffer.get(), 0, 0))
+        return m_completeBlockBuffer;
+    CMBlockBufferRef rawContiguousBuffer = nullptr;
+    if (PAL::CMBlockBufferCreateContiguous(nullptr, m_completeBlockBuffer.get(), nullptr, nullptr, offset, length, 0, &rawContiguousBuffer) != kCMBlockBufferNoErr) {
+        RELEASE_LOG_FAULT(WebAudio, "failed to create contiguous block buffer");
+        return nullptr;
+    }
+    return adoptCF(rawContiguousBuffer);
 }
-#endif
 
-webm::Status SourceBufferParserWebM::VideoTrackData::consumeFrameData(webm::Reader& reader, const FrameMetadata& metadata, uint64_t* bytesRemaining, const CMTime& presentationTime, int sampleCount)
+webm::Status SourceBufferParserWebM::TrackData::readFrameData(webm::Reader& reader, const webm::FrameMetadata& metadata, uint64_t* bytesRemaining)
 {
-#if ENABLE(VP9)
-    CMBlockBufferRef rawBlockBuffer = nullptr;
-
-    if (m_currentPacketSize && *m_currentPacketSize != metadata.size) {
-        // The packet's metadata doesn't match the currently pending partial packet; restart.
-        ASSERT_NOT_REACHED_WITH_MESSAGE("VideoTrackData::consumeFrameData: webm in nonsensical state");
-        m_partialBytesRead = 0;
-        m_currentBlockBuffer = nullptr;
-        m_currentPacketSize = std::nullopt;
+    if (m_completePacketSize && *m_completePacketSize != metadata.size) {
+        // The packet's metadata doesn't match the currently pending complete packet; restart.
+        ASSERT_NOT_REACHED_WITH_MESSAGE("TrackData::readFrameData: webm in nonsensical state");
+        reset();
     }
 
-    if (!m_currentPacketSize)
-        m_currentPacketSize = metadata.size;
+    if (!m_completePacketSize)
+        m_completePacketSize = metadata.size;
 
     if (!m_currentBlockBuffer) {
-        auto err = PAL::CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, nullptr, *m_currentPacketSize, kCFAllocatorDefault, nullptr, 0, *m_currentPacketSize, 0, &rawBlockBuffer);
-        if (err) {
-            PARSER_LOG_ERROR_IF_POSSIBLE("CMBlockBufferCreateWithMemoryBlock failed with error", err);
+        ASSERT(!m_partialBytesRead);
+        CMBlockBufferRef rawBlockBuffer = nullptr;
+        auto err = PAL::CMBlockBufferCreateEmpty(kCFAllocatorDefault, mMaxBlockBufferCapacity, 0, &rawBlockBuffer);
+        if (err != kCMBlockBufferNoErr || !rawBlockBuffer) {
+            PARSER_LOG_ERROR_IF_POSSIBLE("CMBlockBufferCreateEmpty failed with error", err);
             return Skip(&reader, bytesRemaining);
         }
-
         m_currentBlockBuffer = adoptCF(rawBlockBuffer);
-        m_partialBytesRead = 0;
-
-        err = PAL::CMBlockBufferAssureBlockMemory(m_currentBlockBuffer.get());
-        if (err) {
-            PARSER_LOG_ERROR_IF_POSSIBLE("CMAudioSampleBufferCreateWithPacketDescriptions failed with error", err);
-            return Skip(&reader, bytesRemaining);
-        }
     }
 
-    if (*m_currentPacketSize < m_partialBytesRead + *bytesRemaining) {
-        PARSER_LOG_ERROR_IF_POSSIBLE("Invalid frame size allocated, ignoring the entire frame");
-        m_partialBytesRead += *bytesRemaining;
-        return Skip(&reader, bytesRemaining);
-    }
     while (*bytesRemaining) {
-        size_t segmentSizeAtPosition = 0;
-        uint8_t* blockBufferData = nullptr;
-        auto err = PAL::CMBlockBufferGetDataPointer(m_currentBlockBuffer.get(), m_partialBytesRead, &segmentSizeAtPosition, nullptr, (char**)&blockBufferData);
-        if (err) {
-            PARSER_LOG_ERROR_IF_POSSIBLE("CMBlockBufferGetDataPointer failed with error", err);
-            return Skip(&reader, bytesRemaining);
-        }
-        if (*bytesRemaining < segmentSizeAtPosition) {
-            PARSER_LOG_ERROR_IF_POSSIBLE("An error occurred, destination buffer too small to contain our frame");
-            return Skip(&reader, bytesRemaining);
-        }
         uint64_t bytesRead;
-        auto status = reader.Read(*bytesRemaining, (uint8_t*)blockBufferData, &bytesRead);
+        auto status = static_cast<SourceBufferParserWebM::SegmentReader&>(reader).ReadInto(*bytesRemaining, m_currentBlockBuffer.get(), &bytesRead);
         *bytesRemaining -= bytesRead;
         m_partialBytesRead += bytesRead;
 
@@ -1166,12 +1221,33 @@
             return status;
     }
 
-    ASSERT(m_partialBytesRead <= *m_currentPacketSize);
-    if (m_partialBytesRead < *m_currentPacketSize)
+    ASSERT(m_partialBytesRead <= *m_completePacketSize);
+    if (m_partialBytesRead < *m_completePacketSize)
         return webm::Status(webm::Status::kOkPartial);
 
+    if (!m_completeBlockBuffer)
+        m_completeBlockBuffer = WTFMove(m_currentBlockBuffer);
+    else {
+        auto err = PAL::CMBlockBufferAppendBufferReference(m_completeBlockBuffer.get(), m_currentBlockBuffer.get(), 0, 0, 0);
+        if (err) {
+            PARSER_LOG_ERROR_IF_POSSIBLE("CMBlockBufferAppendBufferReference to complete block failed with error", err);
+            return Status(Status::kNotEnoughMemory);
+        }
+        m_currentBlockBuffer = nullptr;
+    }
+    m_partialBytesRead = 0;
+
+    return webm::Status(webm::Status::kOkCompleted);
+}
+
+webm::Status SourceBufferParserWebM::VideoTrackData::consumeFrameData(webm::Reader& reader, const FrameMetadata& metadata, uint64_t* bytesRemaining, const CMTime& presentationTime, int sampleCount)
+{
+#if ENABLE(VP9)
+    auto status = readFrameData(reader, metadata, bytesRemaining);
+    if (!status.completed_ok())
+        return status;
+
     createSampleBuffer(presentationTime, sampleCount, metadata);
-
     reset();
 #else
     UNUSED_PARAM(metadata);
@@ -1186,8 +1262,14 @@
 {
 #if ENABLE(VP9)
     uint8_t* blockBufferData = nullptr;
-    size_t segmentSizeAtPosition = 0;
-    auto err = PAL::CMBlockBufferGetDataPointer(m_currentBlockBuffer.get(), 0, &segmentSizeAtPosition, nullptr, (char**)&blockBufferData);
+    constexpr size_t maxHeaderSize = 32; // The maximum length of a VP9 uncompressed header is 144 bits and 11 bytes for VP8. Round high.
+    size_t segmentHeaderLength = std::min(maxHeaderSize, *m_completePacketSize);
+    auto contiguousBuffer = contiguousCompleteBlockBuffer(0, segmentHeaderLength);
+    if (!contiguousBuffer) {
+        PARSER_LOG_ERROR_IF_POSSIBLE("VideoTrackData::createSampleBuffer failed to create contiguous data block");
+        return;
+    }
+    auto err = PAL::CMBlockBufferGetDataPointer(contiguousBuffer.get(), 0, nullptr, nullptr, (char**)&blockBufferData);
     if (err) {
         PARSER_LOG_ERROR_IF_POSSIBLE("CMBlockBufferGetDataPointer failed with error", err);
         return;
@@ -1196,7 +1278,7 @@
     bool isKey = false;
     RetainPtr<CMFormatDescriptionRef> formatDescription;
     if (codec() == CodecType::VP9) {
-        if (!m_headerParser.ParseUncompressedHeader(blockBufferData, segmentSizeAtPosition))
+        if (!m_headerParser.ParseUncompressedHeader(blockBufferData, segmentHeaderLength))
             return;
 
         if (m_headerParser.key()) {
@@ -1209,7 +1291,7 @@
             setFormatDescription(WTFMove(formatDescription));
         }
     } else if (codec() == CodecType::VP8) {
-        auto header = parseVP8FrameHeader(blockBufferData, segmentSizeAtPosition);
+        auto header = parseVP8FrameHeader(blockBufferData, segmentHeaderLength);
         if (header && header->keyframe) {
             isKey = true;
             auto formatDescription = createFormatDescriptionFromVP8Header(*header, track().video.value().colour);
@@ -1222,7 +1304,7 @@
     }
 
     auto track = this->track();
-    // FIXME: A block might contain more than one frame, but only this frame has been read into `currentBlockBuffer`.
+    // FIXME: A block might contain more than one frame, but only this frame has been read into `completeBlockBuffer`.
     // Below we create sample buffers for each frame, each with the block's timecode and `num_frames` value.
     // Shouldn't we create just one sample buffer once all the block's frames have been read into `currentBlockBuffer`?
 
@@ -1231,9 +1313,8 @@
         duration = track.default_duration.value() * presentationTime.timescale / k_us_in_seconds;
 
     CMSampleBufferRef rawSampleBuffer = nullptr;
-    size_t frameSize = PAL::CMBlockBufferGetDataLength(m_currentBlockBuffer.get());
     CMSampleTimingInfo timing = { PAL::CMTimeMake(duration, presentationTime.timescale), presentationTime, presentationTime };
-    err = PAL::CMSampleBufferCreateReady(kCFAllocatorDefault, m_currentBlockBuffer.get(), this->formatDescription().get(), sampleCount, 1, &timing, 1, &frameSize, &rawSampleBuffer);
+    err = PAL::CMSampleBufferCreateReady(kCFAllocatorDefault, m_completeBlockBuffer.get(), this->formatDescription().get(), sampleCount, 1, &timing, 1, &*m_completePacketSize, &rawSampleBuffer);
     if (err) {
         PARSER_LOG_ERROR_IF_POSSIBLE("CMSampleBufferCreateReady failed with error", err);
         return;
@@ -1263,57 +1344,28 @@
 #endif // ENABLE(VP9)
 }
 
-void SourceBufferParserWebM::AudioTrackData::reset()
+void SourceBufferParserWebM::AudioTrackData::resetCompleted()
 {
+    mNumFramesInCompleteBlock = 0;
     m_packetDescriptions.clear();
-    m_packetsData.clear();
-    m_currentPacketByteOffset = std::nullopt;
-    TrackData::reset();
+    m_currentPacketByteOffset = 0;
+    TrackData::resetCompleted();
 }
 
 webm::Status SourceBufferParserWebM::AudioTrackData::consumeFrameData(webm::Reader& reader, const FrameMetadata& metadata, uint64_t* bytesRemaining, const CMTime& presentationTime, int sampleCount)
 {
-    ASSERT(sampleCount);
+    auto status = readFrameData(reader, metadata, bytesRemaining);
+    if (!status.completed_ok())
+        return status;
 
-    if (m_packetDescriptions.isEmpty()) {
-        m_byteOffset = metadata.position;
+    // Attempts to minimise the amount of memory allocations due to repetitve CMBlockBufferAppendBufferReference calls in readFrameData.
+    mNumFramesInCompleteBlock++;
+    if (mNumFramesInCompleteBlock > mMaxBlockBufferCapacity)
+        mMaxBlockBufferCapacity = mNumFramesInCompleteBlock;
+
+    if (m_packetDescriptions.isEmpty())
         m_samplePresentationTime = presentationTime;
-    }
 
-    if (m_currentPacketSize && *m_currentPacketSize != metadata.size) {
-        // The packet's metadata doesn't match the currently pending partial packet; restart.
-        ASSERT_NOT_REACHED_WITH_MESSAGE("AudioTrackData::consumeFrameData: webm in nonsensical state");
-        m_partialBytesRead = 0;
-        m_currentPacketSize = std::nullopt;
-    }
-
-    if (!m_currentPacketSize)
-        m_currentPacketSize = metadata.size;
-
-    // Ensure we have room to store the full pending frame.
-    if (m_packetsData.size() < m_packetsBytesRead + metadata.size)
-        m_packetsData.grow(m_packetsBytesRead + metadata.size);
-
-    if (!m_currentPacketByteOffset)
-        m_currentPacketByteOffset = m_packetsBytesRead;
-
-    while (*bytesRemaining) {
-        uint64_t bytesRead;
-        auto status = reader.Read(*bytesRemaining, m_packetsData.data() + m_packetsBytesRead + m_partialBytesRead, &bytesRead);
-        *bytesRemaining -= bytesRead;
-        m_partialBytesRead += bytesRead;
-
-        if (!status.completed_ok())
-            return status;
-    }
-
-    ASSERT(m_partialBytesRead <= *m_currentPacketSize);
-    if (m_partialBytesRead < *m_currentPacketSize)
-        return webm::Status(webm::Status::kOkPartial);
-
-    m_packetsBytesRead += m_partialBytesRead;
-    m_partialBytesRead = 0;
-
     if (!formatDescription()) {
         if (!track().codec_private.is_present()) {
             PARSER_LOG_ERROR_IF_POSSIBLE("Audio track missing magic cookie");
@@ -1325,8 +1377,19 @@
         if (codec() == CodecType::Vorbis)
             formatDescription = createVorbisAudioFormatDescription(privateData.size(), privateData.data());
         else if (codec() == CodecType::Opus) {
+            auto contiguousBuffer = contiguousCompleteBlockBuffer(m_currentPacketByteOffset, *m_completePacketSize);
+            if (!contiguousBuffer) {
+                PARSER_LOG_ERROR_IF_POSSIBLE("AudioTrackData::consumeFrameData: unable to create contiguous data block");
+                return Skip(&reader, bytesRemaining);
+            }
+            uint8_t* blockBufferData = nullptr;
+            auto err = PAL::CMBlockBufferGetDataPointer(contiguousBuffer.get(), 0, nullptr, nullptr, (char**)&blockBufferData);
+            if (err) {
+                PARSER_LOG_ERROR_IF_POSSIBLE("CMBlockBufferGetDataPointer failed with error", err);
+                return Skip(&reader, bytesRemaining);
+            }
             OpusCookieContents cookieContents;
-            if (!parseOpusPrivateData(privateData.size(), privateData.data(), m_packetsBytesRead, m_packetsData.data(), cookieContents)) {
+            if (!parseOpusPrivateData(privateData.size(), privateData.data(), *m_completePacketSize, blockBufferData, cookieContents)) {
                 PARSER_LOG_ERROR_IF_POSSIBLE("Failed to parse Opus private data");
                 return Skip(&reader, bytesRemaining);
             }
@@ -1355,10 +1418,21 @@
     } else if (codec() == CodecType::Opus) {
         // Opus technically allows the frame duration and frames-per-packet values to change from packet to packet.
         // CoreAudio doesn't support ASBD values like these to change on a per-packet basis, so throw an error when
-        // that kind of variablility is encountered.
+        // that kind of variability is encountered.
         OpusCookieContents cookieContents;
         auto& privateData = track().codec_private.value();
-        if (!parseOpusPrivateData(privateData.size(), privateData.data(), m_packetsBytesRead, m_packetsData.data(), cookieContents)
+        auto contiguousBuffer = contiguousCompleteBlockBuffer(m_currentPacketByteOffset, *m_completePacketSize);
+        if (!contiguousBuffer) {
+            PARSER_LOG_ERROR_IF_POSSIBLE("AudioTrackData::consumeFrameData: unable to create contiguous data block");
+            return Skip(&reader, bytesRemaining);
+        }
+        uint8_t* blockBufferData = nullptr;
+        auto err = PAL::CMBlockBufferGetDataPointer(contiguousBuffer.get(), 0, nullptr, nullptr, (char**)&blockBufferData);
+        if (err) {
+            PARSER_LOG_ERROR_IF_POSSIBLE("CMBlockBufferGetDataPointer failed with error", err);
+            return Skip(&reader, bytesRemaining);
+        }
+        if (!parseOpusPrivateData(privateData.size(), privateData.data(), *m_completePacketSize, blockBufferData, cookieContents)
             || cookieContents.framesPerPacket != m_framesPerPacket
             || cookieContents.frameDuration != m_frameDuration) {
             PARSER_LOG_ERROR_IF_POSSIBLE("Opus frames-per-packet changed within a track; error");
@@ -1366,14 +1440,15 @@
         }
     }
 
-    m_packetDescriptions.append({ static_cast<int64_t>(*m_currentPacketByteOffset), 0, static_cast<UInt32>(*m_currentPacketSize) });
-    m_currentPacketByteOffset = std::nullopt;
-    m_currentPacketSize = std::nullopt;
+    m_packetDescriptions.append({ static_cast<int64_t>(m_currentPacketByteOffset), 0, static_cast<UInt32>(*m_completePacketSize) });
+    m_currentPacketByteOffset += *m_completePacketSize;
+    m_completePacketSize = std::nullopt;
 
     auto sampleDuration = PAL::CMTimeGetSeconds(PAL::CMTimeSubtract(presentationTime, m_samplePresentationTime)) + PAL::CMTimeGetSeconds(m_packetDuration) * sampleCount;
-
-    if (sampleDuration >= m_minimumSampleDuration)
+    if (sampleDuration >= m_minimumSampleDuration) {
         createSampleBuffer(metadata.position);
+        reset();
+    }
 
     ASSERT(!*bytesRemaining);
     return webm::Status(webm::Status::kOkCompleted);
@@ -1381,27 +1456,11 @@
 
 void SourceBufferParserWebM::AudioTrackData::createSampleBuffer(std::optional<size_t> latestByteRangeOffset)
 {
-    if (m_packetDescriptions.isEmpty() || !m_packetsBytesRead)
+    if (m_packetDescriptions.isEmpty())
         return;
 
-    ASSERT(!m_packetsData.isEmpty());
-
-    CMBlockBufferRef blockBuffer = nullptr;
-    auto err = PAL::CMBlockBufferCreateWithMemoryBlock(kCFAllocatorDefault, nullptr, m_packetsBytesRead, kCFAllocatorDefault, nullptr, 0, m_packetsBytesRead, kCMBlockBufferAssureMemoryNowFlag, &blockBuffer);
-    if (err) {
-        PARSER_LOG_ERROR_IF_POSSIBLE("CMBlockBufferCreateWithMemoryBlock failed with %d", err);
-        return;
-    }
-    auto buffer = adoptCF(blockBuffer);
-
-    err = PAL::CMBlockBufferReplaceDataBytes(m_packetsData.data(), buffer.get(), 0, m_packetsBytesRead);
-    if (err) {
-        PARSER_LOG_ERROR_IF_POSSIBLE("CMBlockBufferReplaceDataBytes failed with %d", err);
-        return;
-    }
-
     CMSampleBufferRef rawSampleBuffer = nullptr;
-    err = PAL::CMAudioSampleBufferCreateReadyWithPacketDescriptions(kCFAllocatorDefault, buffer.get(), formatDescription().get(), m_packetDescriptions.size(), m_samplePresentationTime, m_packetDescriptions.data(), &rawSampleBuffer);
+    auto err = PAL::CMAudioSampleBufferCreateReadyWithPacketDescriptions(kCFAllocatorDefault, m_completeBlockBuffer.get(), formatDescription().get(), m_packetDescriptions.size(), m_samplePresentationTime, m_packetDescriptions.data(), &rawSampleBuffer);
     if (err) {
         PARSER_LOG_ERROR_IF_POSSIBLE("CMAudioSampleBufferCreateWithPacketDescriptions failed with %d", err);
         return;
@@ -1408,12 +1467,6 @@
     }
     auto sampleBuffer = adoptCF(rawSampleBuffer);
 
-    m_packetsData.remove(0, m_packetsBytesRead);
-    if (m_currentPacketByteOffset)
-        *m_currentPacketByteOffset -= m_packetsBytesRead;
-    m_packetsBytesRead = 0;
-    m_packetDescriptions.clear();
-
     auto trackID = track().track_uid.value();
     parser().provideMediaData(WTFMove(sampleBuffer), trackID, latestByteRangeOffset);
 }
@@ -1421,8 +1474,11 @@
 void SourceBufferParserWebM::flushPendingAudioBuffers()
 {
     for (auto& track : m_tracks) {
-        if (track->trackType() == SourceBufferParserWebM::TrackData::Type::Audio)
-            downcast<AudioTrackData>(track.get()).createSampleBuffer();
+        if (track->trackType() == SourceBufferParserWebM::TrackData::Type::Audio) {
+            AudioTrackData& audioTrack = downcast<AudioTrackData>(track.get());
+            audioTrack.createSampleBuffer();
+            audioTrack.resetCompleted();
+        }
     }
 }
 

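For context on the block-buffer handling above: the audio packets now live in CMBlockBuffers that reference (possibly non-contiguous) SharedBuffer segments, so the parser only takes a raw data pointer after asking CoreMedia for a contiguous view of the packet's byte range, which is what the new contiguousCompleteBlockBuffer() helper provides ahead of each CMBlockBufferGetDataPointer call. A minimal standalone sketch of that pattern, using the public CoreMedia API directly rather than the PAL:: wrappers and with hypothetical helper names, could look like this:

    #include <CoreMedia/CMBlockBuffer.h>

    // Returns a +1 CMBlockBufferRef whose bytes for [offset, offset + length) are contiguous,
    // or null on failure. CoreMedia avoids copying when the range is already contiguous.
    // The caller is responsible for CFRelease().
    static CMBlockBufferRef copyContiguousRange(CMBlockBufferRef source, size_t offset, size_t length)
    {
        CMBlockBufferRef contiguous = nullptr;
        OSStatus status = CMBlockBufferCreateContiguous(kCFAllocatorDefault, source, kCFAllocatorDefault,
            nullptr, offset, length, 0, &contiguous);
        return status == kCMBlockBufferNoErr ? contiguous : nullptr;
    }

    // Only dereference a data pointer obtained from the contiguous buffer:
    //     CMBlockBufferRef packet = copyContiguousRange(completeBlockBuffer, packetOffset, packetSize);
    //     char* bytes = nullptr;
    //     if (packet && CMBlockBufferGetDataPointer(packet, 0, nullptr, nullptr, &bytes) == kCMBlockBufferNoErr)
    //         parsePacket(reinterpret_cast<const uint8_t*>(bytes), packetSize); // parsePacket is hypothetical
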
Modified: trunk/Source/WebCore/platform/graphics/cocoa/SourceBufferParserWebM.h (282864 => 282865)


--- trunk/Source/WebCore/platform/graphics/cocoa/SourceBufferParserWebM.h	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebCore/platform/graphics/cocoa/SourceBufferParserWebM.h	2021-09-22 06:25:33 UTC (rev 282865)
@@ -58,7 +58,7 @@
 class SourceBufferParserWebM : public SourceBufferParser, private webm::Callback {
     WTF_MAKE_FAST_ALLOCATED;
 public:
-    class StreamingVectorReader;
+    class SegmentReader;
 
     static bool isWebMFormatReaderAvailable();
     static MediaPlayerEnums::SupportsType isContentTypeSupported(const ContentType&);
@@ -155,7 +155,7 @@
         void setFormatDescription(RetainPtr<CMFormatDescriptionRef>&& description) { m_formatDescription = WTFMove(description); }
 
         SourceBufferParserWebM& parser() const { return m_parser; }
-        
+
         virtual webm::Status consumeFrameData(webm::Reader&, const webm::FrameMetadata&, uint64_t*, const CMTime&, int)
         {
             ASSERT_NOT_REACHED();
@@ -162,16 +162,25 @@
             return webm::Status(webm::Status::kInvalidElementId);
         }
 
-        virtual void reset()
+        virtual void resetCompleted()
         {
-            m_currentPacketSize = std::nullopt;
+            m_completeBlockBuffer = nullptr;
+        }
+        void reset()
+        {
+            resetCompleted();
+            m_completePacketSize = std::nullopt;
             m_partialBytesRead = 0;
+            m_currentBlockBuffer = nullptr;
         }
 
     protected:
-        std::optional<size_t> m_currentPacketSize;
-        // Size of the currently parsed packet, possibly incomplete.
-        size_t m_partialBytesRead { 0 };
+        RetainPtr<CMBlockBufferRef> contiguousCompleteBlockBuffer(size_t offset, size_t length) const;
+        webm::Status readFrameData(webm::Reader&, const webm::FrameMetadata&, uint64_t* bytesRemaining);
+        RetainPtr<CMBlockBufferRef> m_completeBlockBuffer;
+        std::optional<size_t> m_completePacketSize;
+        // Initial allocation size of empty CMBlockBuffer.
+        size_t mMaxBlockBufferCapacity { 0 };
 
     private:
         CodecType m_codec;
@@ -178,7 +187,10 @@
         webm::TrackEntry m_track;
         Type m_trackType;
         RetainPtr<CMFormatDescriptionRef> m_formatDescription;
+        RetainPtr<CMBlockBufferRef> m_currentBlockBuffer;
         SourceBufferParserWebM& m_parser;
+        // Size of the currently incomplete parsed packet.
+        size_t m_partialBytesRead { 0 };
     };
 
     class VideoTrackData : public TrackData {
@@ -193,9 +205,6 @@
         {
         }
 
-#if ENABLE(VP9)
-        void reset() final;
-#endif
         webm::Status consumeFrameData(webm::Reader&, const webm::FrameMetadata&, uint64_t*, const CMTime&, int) final;
 
     private:
@@ -204,7 +213,6 @@
 
 #if ENABLE(VP9)
         vp9_parser::Vp9HeaderParser m_headerParser;
-        RetainPtr<CMBlockBufferRef> m_currentBlockBuffer;
 #endif
     };
 
@@ -222,7 +230,7 @@
         }
 
         webm::Status consumeFrameData(webm::Reader&, const webm::FrameMetadata&, uint64_t*, const CMTime&, int) final;
-        void reset() final;
+        void resetCompleted() final;
         void createSampleBuffer(std::optional<size_t> latestByteRangeOffset = std::nullopt);
 
     private:
@@ -230,15 +238,11 @@
 
         CMTime m_samplePresentationTime;
         CMTime m_packetDuration;
-        Vector<uint8_t> m_packetsData;
-        std::optional<size_t> m_currentPacketByteOffset;
-        // Size of the complete packets parsed so far.
-        size_t m_packetsBytesRead { 0 };
-        size_t m_byteOffset { 0 };
+        size_t m_currentPacketByteOffset { 0 };
         uint8_t m_framesPerPacket { 0 };
         Seconds m_frameDuration { 0_s };
         Vector<AudioStreamPacketDescription> m_packetDescriptions;
-
+        size_t mNumFramesInCompleteBlock { 0 };
         // FIXME: 0.5 - 1.0 seconds is a better duration per sample buffer, but use 2 seconds so at least the first
         // sample buffer will play until we fix MediaSampleCursor::createSampleBuffer to deal with `endCursor`.
         float m_minimumSampleDuration { 2 };
@@ -281,7 +285,7 @@
 
     State m_state { State::None };
 
-    UniqueRef<StreamingVectorReader> m_reader;
+    UniqueRef<SegmentReader> m_reader;
 
     Vector<UniqueRef<TrackData>> m_tracks;
     using BlockVariant = Variant<webm::Block, webm::SimpleBlock>;

Modified: trunk/Source/WebKit/ChangeLog (282864 => 282865)


--- trunk/Source/WebKit/ChangeLog	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebKit/ChangeLog	2021-09-22 06:25:33 UTC (rev 282865)
@@ -1,3 +1,25 @@
+2021-09-21  Jean-Yves Avenard  <j...@apple.com>
+
+        Use SharedMemory for transferring appended buffers from SourceBuffer to the GPU process
+        https://bugs.webkit.org/show_bug.cgi?id=230329
+        rdar://problem/83291495
+
+        Use SharedMemory to pass SourceBuffer content to RemoteSourceBufferProxy in GPU process.
+        This is done by wrapping a SharedMemory into a SharedBuffer.
+
+        Reviewed by Jer Noble.
+
+        * GPUProcess/media/RemoteSourceBufferProxy.cpp:
+        (WebKit::RemoteSourceBufferProxy::append):
+        * GPUProcess/media/RemoteSourceBufferProxy.h:
+        * GPUProcess/media/RemoteSourceBufferProxy.messages.in:
+        * Platform/SharedMemory.cpp:
+        (WebKit::SharedMemory::createSharedBuffer const):
+        * Platform/SharedMemory.h:
+        * WebProcess/GPU/media/SourceBufferPrivateRemote.cpp:
+        (WebKit::SourceBufferPrivateRemote::append):
+        * WebProcess/GPU/media/SourceBufferPrivateRemote.h:
+
 2021-09-21  Simon Fraser  <simon.fra...@apple.com>
 
         Change from ENABLE(RUBBER_BANDING) to HAVE(RUBBER_BANDING)

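The entry above summarizes the new transport: the WebContent process copies the appended bytes into a SharedMemory region, sends a read-only handle over IPC, and the GPU process maps that region and wraps it into a SharedBuffer without a further copy (the Provider lambdas in SharedMemory::createSharedBuffer() retain the SharedMemory, so the mapping stays alive for as long as the SharedBuffer does). Condensed from the hunks that follow -- send(...) stands in for the GPUProcessConnection message send -- the two sides look roughly like this:

    // WebContent process (SourceBufferPrivateRemote::append):
    auto sharedData = SharedMemory::copyBuffer(data);                      // single copy into shared memory
    SharedMemory::Handle handle;
    sharedData->createHandle(handle, SharedMemory::Protection::ReadOnly);
    handle.takeOwnershipOfMemory(MemoryLedger::Media);                     // attribute the pages to the media ledger
    send(Messages::RemoteSourceBufferProxy::Append(SharedMemory::IPCHandle { WTFMove(handle), sharedData->size() }));

    // GPU process (RemoteSourceBufferProxy::append):
    auto sharedMemory = SharedMemory::map(bufferHandle.handle, SharedMemory::Protection::ReadOnly);
    if (sharedMemory)
        m_sourceBufferPrivate->append(sharedMemory->createSharedBuffer(bufferHandle.dataSize)); // zero-copy view
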
Modified: trunk/Source/WebKit/GPUProcess/media/RemoteSourceBufferProxy.cpp (282864 => 282865)


--- trunk/Source/WebKit/GPUProcess/media/RemoteSourceBufferProxy.cpp	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebKit/GPUProcess/media/RemoteSourceBufferProxy.cpp	2021-09-22 06:25:33 UTC (rev 282865)
@@ -192,9 +192,13 @@
     m_connectionToWebProcess->connection().send(Messages::SourceBufferPrivateRemote::SourceBufferPrivateBufferedDirtyChanged(flag), m_identifier);
 }
 
-void RemoteSourceBufferProxy::append(const IPC::DataReference& data)
+void RemoteSourceBufferProxy::append(const SharedMemory::IPCHandle& bufferHandle)
 {
-    m_sourceBufferPrivate->append(data.vector());
+    auto sharedMemory = SharedMemory::map(bufferHandle.handle, SharedMemory::Protection::ReadOnly);
+    if (!sharedMemory)
+        return;
+
+    m_sourceBufferPrivate->append(sharedMemory->createSharedBuffer(bufferHandle.dataSize));
 }
 
 void RemoteSourceBufferProxy::abort()

Modified: trunk/Source/WebKit/GPUProcess/media/RemoteSourceBufferProxy.h (282864 => 282865)


--- trunk/Source/WebKit/GPUProcess/media/RemoteSourceBufferProxy.h	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebKit/GPUProcess/media/RemoteSourceBufferProxy.h	2021-09-22 06:25:33 UTC (rev 282865)
@@ -32,6 +32,7 @@
 #include "MessageReceiver.h"
 #include "RemoteSourceBufferIdentifier.h"
 #include "RemoteSourceBufferProxyMessagesReplies.h"
+#include "SharedMemory.h"
 #include "TrackPrivateRemoteIdentifier.h"
 #include <WebCore/MediaDescription.h>
 #include <WebCore/SourceBufferPrivate.h>
@@ -87,7 +88,7 @@
     void setActive(bool);
     void canSwitchToType(const WebCore::ContentType&, CompletionHandler<void(bool)>&&);
     void setMode(WebCore::SourceBufferAppendMode);
-    void append(const IPC::DataReference&);
+    void append(const SharedMemory::IPCHandle&);
     void abort();
     void resetParserState();
     void removedFromMediaSource();

Modified: trunk/Source/WebKit/GPUProcess/media/RemoteSourceBufferProxy.messages.in (282864 => 282865)


--- trunk/Source/WebKit/GPUProcess/media/RemoteSourceBufferProxy.messages.in	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebKit/GPUProcess/media/RemoteSourceBufferProxy.messages.in	2021-09-22 06:25:33 UTC (rev 282865)
@@ -29,7 +29,7 @@
     SetActive(bool active)
     CanSwitchToType(WebCore::ContentType contentType) -> (bool canSwitch) Synchronous
     SetMode(WebCore::SourceBufferAppendMode appendMode)
-    Append(IPC::DataReference data)
+    Append(WebKit::SharedMemory::IPCHandle data)
     Abort()
     ResetParserState()
     RemovedFromMediaSource()

Modified: trunk/Source/WebKit/Platform/SharedMemory.cpp (282864 => 282865)


--- trunk/Source/WebKit/Platform/SharedMemory.cpp	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebKit/Platform/SharedMemory.cpp	2021-09-22 06:25:33 UTC (rev 282865)
@@ -48,6 +48,19 @@
     return sharedMemory;
 }
 
+Ref<SharedBuffer> SharedMemory::createSharedBuffer(size_t dataSize) const
+{
+    ASSERT(dataSize <= size());
+    return SharedBuffer::create({
+        [protectedThis = makeRef(*this)] () -> const uint8_t* {
+            return static_cast<const uint8_t*>(protectedThis->data());
+        },
+        [dataSize] () -> size_t {
+            return dataSize;
+        }
+    });
+}
+
 #if !PLATFORM(COCOA)
 void SharedMemory::Handle::takeOwnershipOfMemory(MemoryLedger) const
 {

Modified: trunk/Source/WebKit/Platform/SharedMemory.h (282864 => 282865)


--- trunk/Source/WebKit/Platform/SharedMemory.h	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebKit/Platform/SharedMemory.h	2021-09-22 06:25:33 UTC (rev 282865)
@@ -153,6 +153,8 @@
     // Return the system page size in bytes.
     static unsigned systemPageSize();
 
+    Ref<WebCore::SharedBuffer> createSharedBuffer(size_t) const;
+
 private:
 #if OS(DARWIN)
     WTF::MachSendRight createSendRight(Protection) const;

Modified: trunk/Source/WebKit/WebProcess/GPU/media/SourceBufferPrivateRemote.cpp (282864 => 282865)


--- trunk/Source/WebKit/WebProcess/GPU/media/SourceBufferPrivateRemote.cpp	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebKit/WebProcess/GPU/media/SourceBufferPrivateRemote.cpp	2021-09-22 06:25:33 UTC (rev 282865)
@@ -78,12 +78,19 @@
     m_gpuProcessConnection->messageReceiverMap().removeMessageReceiver(Messages::SourceBufferPrivateRemote::messageReceiverName(), m_remoteSourceBufferIdentifier.toUInt64());
 }
 
-void SourceBufferPrivateRemote::append(Vector<unsigned char>&& data)
+void SourceBufferPrivateRemote::append(Ref<SharedBuffer>&& data)
 {
     if (!m_gpuProcessConnection)
         return;
 
-    m_gpuProcessConnection->connection().send(Messages::RemoteSourceBufferProxy::Append(IPC::DataReference(data)), m_remoteSourceBufferIdentifier);
+    auto sharedData = SharedMemory::copyBuffer(data);
+    SharedMemory::Handle handle;
+    sharedData->createHandle(handle, SharedMemory::Protection::ReadOnly);
+
+    // Take ownership of shared memory and mark it as media-related memory.
+    handle.takeOwnershipOfMemory(MemoryLedger::Media);
+
+    m_gpuProcessConnection->connection().send(Messages::RemoteSourceBufferProxy::Append(SharedMemory::IPCHandle { WTFMove(handle), sharedData->size() }), m_remoteSourceBufferIdentifier);
 }
 
 void SourceBufferPrivateRemote::abort()

Modified: trunk/Source/WebKit/WebProcess/GPU/media/SourceBufferPrivateRemote.h (282864 => 282865)


--- trunk/Source/WebKit/WebProcess/GPU/media/SourceBufferPrivateRemote.h	2021-09-22 04:55:56 UTC (rev 282864)
+++ trunk/Source/WebKit/WebProcess/GPU/media/SourceBufferPrivateRemote.h	2021-09-22 06:25:33 UTC (rev 282865)
@@ -72,7 +72,7 @@
 
     // SourceBufferPrivate overrides
     void setActive(bool) final;
-    void append(Vector<unsigned char>&&) final;
+    void append(Ref<WebCore::SharedBuffer>&&) final;
     void abort() final;
     void resetParserState() final;
     void removedFromMediaSource() final;