Title: [258732] trunk/Source/JavaScriptCore
Revision: 258732
Author: ysuz...@apple.com
Date: 2020-03-19 14:59:22 -0700 (Thu, 19 Mar 2020)

Log Message

[JSC] StructureStubInfo::bufferedStructures should not ref/deref UniquedStringImpl
https://bugs.webkit.org/show_bug.cgi?id=209266
<rdar://problem/60508312>

Reviewed by Saam Barati.

StructureStubInfo::bufferedStructures includes a RefPtr<UniquedStringImpl>, so destroying a StructureStubInfo in
CodeBlock::finalizeUnconditionally can access the AtomStringTable. Since CodeBlock::finalizeUnconditionally can be
executed on the heap thread, which has no AtomStringTable installed, that access finds a nullptr AtomStringTable.

Temporarily installing an AtomStringTable on the heap thread while executing the GC End phase is dangerous: a Web
worker's JSC VM releases heapAccess while waiting for the next message in its RunLoop. This means the worker's main
thread can potentially run concurrently with the worker VM's End-phase heap thread until the worker takes the JSLock.
(This is not a problem in WebCore since WebCore's JSC VM never releases heapAccess. We cannot adopt the same design
because we want the End phase to run even when the worker is not receiving any messages.)

Removing resetJITData from CodeBlock::finalizeUnconditionally would not fix this either, since CodeBlock::finalizeUnconditionally
also calls StructureStubInfo::visitWeakReferences, which removes some entries from StructureStubInfo::bufferedStructures now that
the ByVal extension has been introduced into StructureStubInfo.

This patch uses CacheableIdentifier for bufferedStructures. We add a BufferedStructure class that holds a Structure* and a CacheableIdentifier,
and StructureStubInfo now holds a HashSet<BufferedStructure>. We also visit the CacheableIdentifier in StructureStubInfo::visitAggregate. To allow
the concurrent collector to do this, we introduce m_bufferedStructuresLock in StructureStubInfo to guard m_bufferedStructures.

* bytecode/StructureStubInfo.cpp:
(JSC::StructureStubInfo::StructureStubInfo):
(JSC::StructureStubInfo::addAccessCase):
(JSC::StructureStubInfo::reset):
(JSC::StructureStubInfo::visitAggregate):
(JSC::StructureStubInfo::visitWeakReferences):
* bytecode/StructureStubInfo.h:
(JSC::StructureStubInfo::considerCaching):
(JSC::StructureStubInfo::getByIdSelfIdentifier):
(JSC::StructureStubInfo::cacheType const):
(JSC::StructureStubInfo::clearBufferedStructures):
(JSC::StructureStubInfo::BufferedStructure::BufferedStructure):
(JSC::StructureStubInfo::BufferedStructure::isHashTableDeletedValue const):
(JSC::StructureStubInfo::BufferedStructure::hash const):
(JSC::StructureStubInfo::BufferedStructure::operator==):
(JSC::StructureStubInfo::BufferedStructure::operator!=):
(JSC::StructureStubInfo::BufferedStructure::Hash::hash):
(JSC::StructureStubInfo::BufferedStructure::Hash::equal):
(JSC::StructureStubInfo::BufferedStructure::structure const):
(JSC::StructureStubInfo::BufferedStructure::byValId const):
* jit/JITOperations.cpp:
* runtime/CacheableIdentifier.h:
(JSC::CacheableIdentifier::hash const):

Modified Paths

Diff

Modified: trunk/Source/JavaScriptCore/ChangeLog (258731 => 258732)


--- trunk/Source/JavaScriptCore/ChangeLog	2020-03-19 21:57:00 UTC (rev 258731)
+++ trunk/Source/JavaScriptCore/ChangeLog	2020-03-19 21:59:22 UTC (rev 258732)
@@ -1,5 +1,55 @@
 2020-03-19  Yusuke Suzuki  <ysuz...@apple.com>
 
+        [JSC] StructureStubInfo::bufferedStructures should not ref/deref UniquedStringImpl
+        https://bugs.webkit.org/show_bug.cgi?id=209266
+        <rdar://problem/60508312>
+
+        Reviewed by Saam Barati.
+
+        StructureStubInfo::bufferedStructures includes RefPtr<UniquedStringImpl>. So destroying StructureStubInfo in
+        CodeBlock::finalizeUnconditionally can access to AtomStringTable, and get nullptr AtomStringTable since
+        CodeBlock::finalizeUnconditionally can be executed in heap-thread.
+
+        Temporarily setting AtomStringTable in the heap-thread when executing GC End phase is dangerous: Web worker's
+        JSC VM is releasing heapAccess when waiting for the next message in the RunLoop. This potentially means that
+        Web worker's main thread can run concurrently with Web worker's JSC VM's End phase heap-thread until the web
+        worker takes JSLock. (This is not a problem in WebCore since WebCore JSC VM never releases heapAccess. We cannot
+        take the same design since we would like to run End phase even if web worker is not getting any messages).
+
+        And removing resetJITData in CodeBlock::finalizeUnconditionally does not fix as well since CodeBlock::finalizeUnconditionally
+        calls StructureStubInfo::visitWeakReferences, and it removes some of entries of StructureStubInfo::bufferedStructures after
+        ByVal extension is introduced into StructureStubInfo.
+
+        This patch uses CacheableIdentifier for bufferedStructures. We make BufferedStructure class which holds Structure and CacheableIdentifier.
+        And StructureStubInfo holds HashSet<BufferedStructure>. We also visit CacheableIdentifier in StructureStubInfo::visitAggregate. To allow
+        concurrent collector to run this, we introduce m_bufferedStructuresLock in StructureStubInfo to guard m_bufferedStructures.
+
+        * bytecode/StructureStubInfo.cpp:
+        (JSC::StructureStubInfo::StructureStubInfo):
+        (JSC::StructureStubInfo::addAccessCase):
+        (JSC::StructureStubInfo::reset):
+        (JSC::StructureStubInfo::visitAggregate):
+        (JSC::StructureStubInfo::visitWeakReferences):
+        * bytecode/StructureStubInfo.h:
+        (JSC::StructureStubInfo::considerCaching):
+        (JSC::StructureStubInfo::getByIdSelfIdentifier):
+        (JSC::StructureStubInfo::cacheType const):
+        (JSC::StructureStubInfo::clearBufferedStructures):
+        (JSC::StructureStubInfo::BufferedStructure::BufferedStructure):
+        (JSC::StructureStubInfo::BufferedStructure::isHashTableDeletedValue const):
+        (JSC::StructureStubInfo::BufferedStructure::hash const):
+        (JSC::StructureStubInfo::BufferedStructure::operator==):
+        (JSC::StructureStubInfo::BufferedStructure::operator!=):
+        (JSC::StructureStubInfo::BufferedStructure::Hash::hash):
+        (JSC::StructureStubInfo::BufferedStructure::Hash::equal):
+        (JSC::StructureStubInfo::BufferedStructure::structure const):
+        (JSC::StructureStubInfo::BufferedStructure::byValId const):
+        * jit/JITOperations.cpp:
+        * runtime/CacheableIdentifier.h:
+        (JSC::CacheableIdentifier::hash const):
+
+2020-03-19  Yusuke Suzuki  <ysuz...@apple.com>
+
         Unreviewed, build fix after r258717
         https://bugs.webkit.org/show_bug.cgi?id=199295
 

Modified: trunk/Source/JavaScriptCore/bytecode/StructureStubInfo.cpp (258731 => 258732)


--- trunk/Source/JavaScriptCore/bytecode/StructureStubInfo.cpp	2020-03-19 21:57:00 UTC (rev 258731)
+++ trunk/Source/JavaScriptCore/bytecode/StructureStubInfo.cpp	2020-03-19 21:59:22 UTC (rev 258732)
@@ -42,10 +42,6 @@
 
 StructureStubInfo::StructureStubInfo(AccessType accessType)
     : accessType(accessType)
-    , m_cacheType(CacheType::Unset)
-    , countdown(1) // For a totally clear stub, we'll patch it after the first execution.
-    , repatchCount(0)
-    , numberOfCoolDowns(0)
     , bufferingCountdown(Options::repatchBufferingCountdown())
     , resetByGC(false)
     , tookSlowPath(false)
@@ -165,7 +161,7 @@
                 return result;
 
             if (!result.buffered()) {
-                bufferedStructures.clear();
+                clearBufferedStructures();
                 return result;
             }
         } else {
@@ -188,7 +184,7 @@
                 return result;
 
             if (!result.buffered()) {
-                bufferedStructures.clear();
+                clearBufferedStructures();
                 return result;
             }
             
@@ -204,7 +200,7 @@
         if (!result.buffered()) {
             if (StructureStubInfoInternal::verbose)
                 dataLog("Didn't buffer anything, bailing.\n");
-            bufferedStructures.clear();
+            clearBufferedStructures();
             return result;
         }
         
@@ -217,7 +213,7 @@
         
         // Forget the buffered structures so that all future attempts to cache get fully handled by the
         // PolymorphicAccess.
-        bufferedStructures.clear();
+        clearBufferedStructures();
         
         result = u.stub->regenerate(locker, vm, codeBlock, *this);
         
@@ -240,7 +236,7 @@
 
 void StructureStubInfo::reset(CodeBlock* codeBlock)
 {
-    bufferedStructures.clear();
+    clearBufferedStructures();
 
     if (m_cacheType == CacheType::Unset)
         return;
@@ -290,6 +286,11 @@
 
 void StructureStubInfo::visitAggregate(SlotVisitor& visitor)
 {
+    {
+        auto locker = holdLock(m_bufferedStructuresLock);
+        for (auto& bufferedStructure : m_bufferedStructures)
+            bufferedStructure.byValId().visitAggregate(visitor);
+    }
     switch (m_cacheType) {
     case CacheType::Unset:
     case CacheType::ArrayLength:
@@ -312,12 +313,13 @@
 void StructureStubInfo::visitWeakReferences(CodeBlock* codeBlock)
 {
     VM& vm = codeBlock->vm();
-    
-    bufferedStructures.removeIf(
-        [&] (auto& pair) -> bool {
-            Structure* structure = pair.first;
-            return !vm.heap.isMarked(structure);
-        });
+    {
+        auto locker = holdLock(m_bufferedStructuresLock);
+        m_bufferedStructures.removeIf(
+            [&] (auto& entry) -> bool {
+                return !vm.heap.isMarked(entry.structure());
+            });
+    }
 
     switch (m_cacheType) {
     case CacheType::GetByIdSelf:

Modified: trunk/Source/JavaScriptCore/bytecode/StructureStubInfo.h (258731 => 258732)


--- trunk/Source/JavaScriptCore/bytecode/StructureStubInfo.h	2020-03-19 21:57:00 UTC (rev 258731)
+++ trunk/Source/JavaScriptCore/bytecode/StructureStubInfo.h	2020-03-19 21:59:22 UTC (rev 258732)
@@ -99,8 +99,87 @@
     // This returns true if it has marked everything that it will ever mark.
     bool propagateTransitions(SlotVisitor&);
         
-    ALWAYS_INLINE bool considerCaching(VM& vm, CodeBlock* codeBlock, Structure* structure, UniquedStringImpl* impl = nullptr)
+    StubInfoSummary summary(VM&) const;
+    
+    static StubInfoSummary summary(VM&, const StructureStubInfo*);
+
+    CacheableIdentifier getByIdSelfIdentifier()
     {
+        RELEASE_ASSERT(m_cacheType == CacheType::GetByIdSelf);
+        return m_getByIdSelfIdentifier;
+    }
+
+    bool containsPC(void* pc) const;
+
+    uint32_t inlineSize() const
+    {
+        int32_t inlineSize = MacroAssembler::differenceBetweenCodePtr(start, doneLocation);
+        ASSERT(inlineSize >= 0);
+        return inlineSize;
+    }
+
+    CodeLocationJump<JSInternalPtrTag> patchableJump()
+    { 
+        ASSERT(accessType == AccessType::InstanceOf);
+        return start.jumpAtOffset<JSInternalPtrTag>(0);
+    }
+
+    JSValueRegs valueRegs() const
+    {
+        return JSValueRegs(
+#if USE(JSVALUE32_64)
+            valueTagGPR,
+#endif
+            valueGPR);
+    }
+
+    JSValueRegs propertyRegs() const
+    {
+        return JSValueRegs(
+#if USE(JSVALUE32_64)
+            v.propertyTagGPR,
+#endif
+            regs.propertyGPR);
+    }
+
+    JSValueRegs baseRegs() const
+    {
+        return JSValueRegs(
+#if USE(JSVALUE32_64)
+            baseTagGPR,
+#endif
+            baseGPR);
+    }
+
+    bool thisValueIsInThisGPR() const { return accessType == AccessType::GetByIdWithThis; }
+
+#if ASSERT_ENABLED
+    void checkConsistency();
+#else
+    ALWAYS_INLINE void checkConsistency() { }
+#endif
+
+    CacheType cacheType() const { return m_cacheType; }
+
+    // Not ByVal and ById case: e.g. instanceof, by-index etc.
+    ALWAYS_INLINE bool considerCachingGeneric(VM& vm, CodeBlock* codeBlock, Structure* structure)
+    {
+        return considerCaching(vm, codeBlock, structure, CacheableIdentifier());
+    }
+
+    ALWAYS_INLINE bool considerCachingById(VM& vm, CodeBlock* codeBlock, Structure* structure)
+    {
+        return considerCaching(vm, codeBlock, structure, CacheableIdentifier());
+    }
+
+    ALWAYS_INLINE bool considerCachingByVal(VM& vm, CodeBlock* codeBlock, Structure* structure, CacheableIdentifier impl)
+    {
+        return considerCaching(vm, codeBlock, structure, impl);
+    }
+
+private:
+    ALWAYS_INLINE bool considerCaching(VM& vm, CodeBlock* codeBlock, Structure* structure, CacheableIdentifier impl)
+    {
         DisallowGC disallowGC;
 
         // We never cache non-cells.
@@ -115,7 +194,7 @@
         //
         // If we determine that we should do something to the IC then the next order of business is
         // to determine if this Structure would impact the IC at all. We know that it won't, if we
-        // have already buffered something on its behalf. That's what the bufferedStructures set is
+        // have already buffered something on its behalf. That's what the m_bufferedStructures set is
         // for.
         
         everConsidered = true;
@@ -159,7 +238,11 @@
             // NOTE: This will behave oddly for InstanceOf if the user varies the prototype but not
             // the base's structure. That seems unlikely for the canonical use of instanceof, where
             // the prototype is fixed.
-            bool isNewlyAdded = bufferedStructures.add({ structure, impl }).isNewEntry;
+            bool isNewlyAdded = false;
+            {
+                auto locker = holdLock(m_bufferedStructuresLock);
+                isNewlyAdded = m_bufferedStructures.add({ structure, impl }).isNewEntry;
+            }
             if (isNewlyAdded)
                 vm.heap.writeBarrier(codeBlock);
             return isNewlyAdded;
@@ -168,17 +251,72 @@
         return false;
     }
 
-    StubInfoSummary summary(VM&) const;
-    
-    static StubInfoSummary summary(VM&, const StructureStubInfo*);
+    void setCacheType(CacheType);
 
-    bool containsPC(void* pc) const;
+    void clearBufferedStructures()
+    {
+        auto locker = holdLock(m_bufferedStructuresLock);
+        m_bufferedStructures.clear();
+    }
 
+    class BufferedStructure {
+    public:
+        static constexpr uintptr_t hashTableDeletedValue = 0x2;
+        BufferedStructure() = default;
+        BufferedStructure(Structure* structure, CacheableIdentifier byValId)
+            : m_structure(structure)
+            , m_byValId(byValId)
+        { }
+        BufferedStructure(WTF::HashTableDeletedValueType)
+            : m_structure(bitwise_cast<Structure*>(hashTableDeletedValue))
+        { }
+
+        bool isHashTableDeletedValue() const { return bitwise_cast<uintptr_t>(m_structure) == hashTableDeletedValue; }
+
+        unsigned hash() const
+        {
+            unsigned hash = PtrHash<Structure*>::hash(m_structure);
+            if (m_byValId)
+                hash += m_byValId.hash();
+            return hash;
+        }
+
+        friend bool operator==(const BufferedStructure& a, const BufferedStructure& b)
+        {
+            return a.m_structure == b.m_structure && a.m_byValId == b.m_byValId;
+        }
+
+        friend bool operator!=(const BufferedStructure& a, const BufferedStructure& b)
+        {
+            return !(a == b);
+        }
+
+        struct Hash {
+            static unsigned hash(const BufferedStructure& key)
+            {
+                return key.hash();
+            }
+
+            static bool equal(const BufferedStructure& a, const BufferedStructure& b)
+            {
+                return a == b;
+            }
+
+            static constexpr bool safeToCompareToEmptyOrDeleted = false;
+        };
+        using KeyTraits = SimpleClassHashTraits<BufferedStructure>;
+        static_assert(KeyTraits::emptyValueIsZero, "Structure* and CacheableIdentifier are empty if they are zero-initialized");
+
+        Structure* structure() const { return m_structure; }
+        const CacheableIdentifier& byValId() const { return m_byValId; }
+
+    private:
+        Structure* m_structure { nullptr };
+        CacheableIdentifier m_byValId;
+    };
+
+public:
     CodeOrigin codeOrigin;
-private:
-    CacheableIdentifier m_getByIdSelfIdentifier;
-public:
-
     union {
         struct {
             WriteBarrierBase<Structure> baseObjectStructure;
@@ -186,21 +324,14 @@
         } byIdSelf;
         PolymorphicAccess* stub;
     } u;
-
-    CacheableIdentifier getByIdSelfIdentifier()
-    {
-        RELEASE_ASSERT(m_cacheType == CacheType::GetByIdSelf);
-        return m_getByIdSelfIdentifier;
-    }
-    
 private:
+    CacheableIdentifier m_getByIdSelfIdentifier;
     // Represents those structures that already have buffered AccessCases in the PolymorphicAccess.
     // Note that it's always safe to clear this. If we clear it prematurely, then if we see the same
     // structure again during this buffering countdown, we will create an AccessCase object for it.
     // That's not so bad - we'll get rid of the redundant ones once we regenerate.
-    HashSet<std::pair<Structure*, RefPtr<UniquedStringImpl>>> bufferedStructures;
+    HashSet<BufferedStructure, BufferedStructure::Hash, BufferedStructure::KeyTraits> m_bufferedStructures;
 public:
-    
     CodeLocationLabel<JITStubRoutinePtrTag> start; // This is either the start of the inline IC for *byId caches. or the location of patchable jump for 'instanceof' caches.
     CodeLocationLabel<JSInternalPtrTag> doneLocation;
     CodeLocationCall<JSInternalPtrTag> slowPathCallLocation;
@@ -208,13 +339,6 @@
 
     RegisterSet usedRegisters;
 
-    uint32_t inlineSize() const
-    {
-        int32_t inlineSize = MacroAssembler::differenceBetweenCodePtr(start, doneLocation);
-        ASSERT(inlineSize >= 0);
-        return inlineSize;
-    }
-
     GPRReg baseGPR;
     GPRReg valueGPR;
     union {
@@ -233,56 +357,15 @@
     } v;
 #endif
 
-    CodeLocationJump<JSInternalPtrTag> patchableJump()
-    { 
-        ASSERT(accessType == AccessType::InstanceOf);
-        return start.jumpAtOffset<JSInternalPtrTag>(0);
-    }
-
-    JSValueRegs valueRegs() const
-    {
-        return JSValueRegs(
-#if USE(JSVALUE32_64)
-            valueTagGPR,
-#endif
-            valueGPR);
-    }
-
-    JSValueRegs propertyRegs() const
-    {
-        return JSValueRegs(
-#if USE(JSVALUE32_64)
-            v.propertyTagGPR,
-#endif
-            regs.propertyGPR);
-    }
-
-    JSValueRegs baseRegs() const
-    {
-        return JSValueRegs(
-#if USE(JSVALUE32_64)
-            baseTagGPR,
-#endif
-            baseGPR);
-    }
-
-    bool thisValueIsInThisGPR() const { return accessType == AccessType::GetByIdWithThis; }
-
-#if ASSERT_ENABLED
-    void checkConsistency();
-#else
-    ALWAYS_INLINE void checkConsistency() { }
-#endif
-
     AccessType accessType;
 private:
-    CacheType m_cacheType;
-    void setCacheType(CacheType);
+    CacheType m_cacheType { CacheType::Unset };
 public:
-    CacheType cacheType() const { return m_cacheType; }
-    uint8_t countdown; // We repatch only when this is zero. If not zero, we decrement.
-    uint8_t repatchCount;
-    uint8_t numberOfCoolDowns;
+    // We repatch only when this is zero. If not zero, we decrement.
+    // Setting 1 for a totally clear stub, we'll patch it after the first execution.
+    uint8_t countdown { 1 };
+    uint8_t repatchCount { 0 };
+    uint8_t numberOfCoolDowns { 0 };
 
     CallSiteIndex callSiteIndex;
 
@@ -296,6 +379,8 @@
     bool propertyIsString : 1;
     bool propertyIsInt32 : 1;
     bool propertyIsSymbol : 1;
+private:
+    Lock m_bufferedStructuresLock;
 };
 
 inline CodeOrigin getStructureStubInfoCodeOrigin(StructureStubInfo& structureStubInfo)

Modified: trunk/Source/JavaScriptCore/jit/JITOperations.cpp (258731 => 258732)


--- trunk/Source/JavaScriptCore/jit/JITOperations.cpp	2020-03-19 21:57:00 UTC (rev 258731)
+++ trunk/Source/JavaScriptCore/jit/JITOperations.cpp	2020-03-19 21:59:22 UTC (rev 258732)
@@ -213,7 +213,7 @@
     RETURN_IF_EXCEPTION(scope, encodedJSValue());
 
     CodeBlock* codeBlock = callFrame->codeBlock();
-    if (stubInfo->considerCaching(vm, codeBlock, baseValue.structureOrNull()) && !slot.isTaintedByOpaqueObject() && (slot.isCacheableValue() || slot.isCacheableGetter() || slot.isUnset()))
+    if (stubInfo->considerCachingById(vm, codeBlock, baseValue.structureOrNull()) && !slot.isTaintedByOpaqueObject() && (slot.isCacheableValue() || slot.isCacheableGetter() || slot.isUnset()))
         repatchGetBy(globalObject, codeBlock, baseValue, CacheableIdentifier::createFromIdentifierOwnedByCodeBlock(ident), slot, *stubInfo, GetByKind::Try);
 
     return JSValue::encode(slot.getPureResult());
@@ -269,7 +269,7 @@
     RETURN_IF_EXCEPTION(scope, encodedJSValue());
 
     CodeBlock* codeBlock = callFrame->codeBlock();
-    if (stubInfo->considerCaching(vm, codeBlock, baseValue.structureOrNull()))
+    if (stubInfo->considerCachingById(vm, codeBlock, baseValue.structureOrNull()))
         repatchGetBy(globalObject, codeBlock, baseValue, CacheableIdentifier::createFromIdentifierOwnedByCodeBlock(ident), slot, *stubInfo, GetByKind::Direct);
 
     RELEASE_AND_RETURN(scope, JSValue::encode(found ? slot.getValue(globalObject, ident) : jsUndefined()));
@@ -329,7 +329,7 @@
         LOG_IC((ICEvent::OperationGetByIdOptimize, baseValue.classInfoOrNull(vm), ident, baseValue == slot.slotBase()));
         
         CodeBlock* codeBlock = callFrame->codeBlock();
-        if (stubInfo->considerCaching(vm, codeBlock, baseValue.structureOrNull()))
+        if (stubInfo->considerCachingById(vm, codeBlock, baseValue.structureOrNull()))
             repatchGetBy(globalObject, codeBlock, baseValue, CacheableIdentifier::createFromIdentifierOwnedByCodeBlock(ident), slot, *stubInfo, GetByKind::Normal);
         return found ? slot.getValue(globalObject, ident) : jsUndefined();
     }));
@@ -386,7 +386,7 @@
         LOG_IC((ICEvent::OperationGetByIdWithThisOptimize, baseValue.classInfoOrNull(vm), ident, baseValue == slot.slotBase()));
         
         CodeBlock* codeBlock = callFrame->codeBlock();
-        if (stubInfo->considerCaching(vm, codeBlock, baseValue.structureOrNull()))
+        if (stubInfo->considerCachingById(vm, codeBlock, baseValue.structureOrNull()))
             repatchGetBy(globalObject, codeBlock, baseValue, CacheableIdentifier::createFromIdentifierOwnedByCodeBlock(ident), slot, *stubInfo, GetByKind::WithThis);
         return found ? slot.getValue(globalObject, ident) : jsUndefined();
     }));
@@ -468,7 +468,7 @@
     PropertySlot slot(baseObject, PropertySlot::InternalMethodType::HasProperty);
     bool found = baseObject->getPropertySlot(globalObject, ident, slot);
     CodeBlock* codeBlock = callFrame->codeBlock();
-    if (stubInfo->considerCaching(vm, codeBlock, baseObject->structure(vm)))
+    if (stubInfo->considerCachingById(vm, codeBlock, baseObject->structure(vm)))
         repatchInByID(globalObject, codeBlock, baseObject, ident, found, slot, *stubInfo);
     return JSValue::encode(jsBoolean(found));
 }
@@ -583,7 +583,7 @@
     if (accessType != static_cast<AccessType>(stubInfo->accessType))
         return;
     
-    if (stubInfo->considerCaching(vm, codeBlock, structure))
+    if (stubInfo->considerCachingById(vm, codeBlock, structure))
         repatchPutByID(globalObject, codeBlock, baseValue, structure, ident, slot, *stubInfo, NotDirect);
 }
 
@@ -614,7 +614,7 @@
     if (accessType != static_cast<AccessType>(stubInfo->accessType))
         return;
     
-    if (stubInfo->considerCaching(vm, codeBlock, structure))
+    if (stubInfo->considerCachingById(vm, codeBlock, structure))
         repatchPutByID(globalObject, codeBlock, baseValue, structure, ident, slot, *stubInfo, NotDirect);
 }
 
@@ -644,7 +644,7 @@
     if (accessType != static_cast<AccessType>(stubInfo->accessType))
         return;
     
-    if (stubInfo->considerCaching(vm, codeBlock, structure))
+    if (stubInfo->considerCachingById(vm, codeBlock, structure))
         repatchPutByID(globalObject, codeBlock, baseObject, structure, ident, slot, *stubInfo, Direct);
 }
 
@@ -674,7 +674,7 @@
     if (accessType != static_cast<AccessType>(stubInfo->accessType))
         return;
     
-    if (stubInfo->considerCaching(vm, codeBlock, structure))
+    if (stubInfo->considerCachingById(vm, codeBlock, structure))
         repatchPutByID(globalObject, codeBlock, baseObject, structure, ident, slot, *stubInfo, Direct);
 }
 
@@ -2001,7 +2001,7 @@
 
     if (baseValue.isCell() && subscript.isInt32()) {
         Structure* structure = baseValue.asCell()->structure(vm);
-        if (stubInfo->considerCaching(vm, codeBlock, structure)) {
+        if (stubInfo->considerCachingGeneric(vm, codeBlock, structure)) {
             if (profile) {
                 ConcurrentJSLocker locker(codeBlock->m_lock);
                 profile->computeUpdatedPrediction(locker, codeBlock, structure);
@@ -2018,8 +2018,9 @@
             return JSValue::encode(baseValue.getPropertySlot(globalObject, propertyName, [&] (bool found, PropertySlot& slot) -> JSValue {
                 LOG_IC((ICEvent::OperationGetByValOptimize, baseValue.classInfoOrNull(vm), propertyName, baseValue == slot.slotBase())); 
                 
-                if (stubInfo->considerCaching(vm, codeBlock, baseValue.structureOrNull(), propertyName.impl()))
-                    repatchGetBy(globalObject, codeBlock, baseValue, CacheableIdentifier::createFromCell(subscript.asCell()), slot, *stubInfo, GetByKind::NormalByVal);
+                CacheableIdentifier identifier = CacheableIdentifier::createFromCell(subscript.asCell());
+                if (stubInfo->considerCachingByVal(vm, codeBlock, baseValue.structureOrNull(), identifier))
+                    repatchGetBy(globalObject, codeBlock, baseValue, identifier, slot, *stubInfo, GetByKind::NormalByVal);
                 return found ? slot.getValue(globalObject, propertyName) : jsUndefined();
             }));
         }
@@ -2174,7 +2175,7 @@
 
         if (!parseIndex(ident)) {
             CodeBlock* codeBlock = callFrame->codeBlock();
-            if (stubInfo->considerCaching(vm, codeBlock, baseValue.structureOrNull()))
+            if (stubInfo->considerCachingById(vm, codeBlock, baseValue.structureOrNull()))
                 repatchDeleteBy(globalObject, codeBlock, slot, baseValue, oldStructure, CacheableIdentifier::createFromIdentifierOwnedByCodeBlock(ident), *stubInfo, DelByKind::Normal);
         }
     }
@@ -2236,8 +2237,9 @@
 
         if (subscript.isSymbol() || !parseIndex(propertyName)) {
             CodeBlock* codeBlock = callFrame->codeBlock();
-            if (stubInfo->considerCaching(vm, codeBlock, baseValue.structureOrNull()))
-                repatchDeleteBy(globalObject, codeBlock, slot, baseValue, oldStructure, CacheableIdentifier::createFromCell(subscript.asCell()), *stubInfo, DelByKind::NormalByVal);
+            CacheableIdentifier identifier = CacheableIdentifier::createFromCell(subscript.asCell());
+            if (stubInfo->considerCachingByVal(vm, codeBlock, baseValue.structureOrNull(), identifier))
+                repatchDeleteBy(globalObject, codeBlock, slot, baseValue, oldStructure, identifier, *stubInfo, DelByKind::NormalByVal);
         }
     }
 
@@ -2316,7 +2318,7 @@
     RETURN_IF_EXCEPTION(scope, JSValue::encode(jsUndefined()));
     
     CodeBlock* codeBlock = callFrame->codeBlock();
-    if (stubInfo->considerCaching(vm, codeBlock, value.structureOrNull()))
+    if (stubInfo->considerCachingGeneric(vm, codeBlock, value.structureOrNull()))
         repatchInstanceOf(globalObject, codeBlock, value, proto, *stubInfo, result);
     
     return JSValue::encode(jsBoolean(result));

Modified: trunk/Source/JavaScriptCore/runtime/CacheableIdentifier.h (258731 => 258732)


--- trunk/Source/JavaScriptCore/runtime/CacheableIdentifier.h	2020-03-19 21:57:00 UTC (rev 258731)
+++ trunk/Source/JavaScriptCore/runtime/CacheableIdentifier.h	2020-03-19 21:59:22 UTC (rev 258732)
@@ -69,6 +69,8 @@
 
     explicit operator bool() const { return m_bits; }
 
+    unsigned hash() const { return uid()->symbolAwareHash(); }
+
     CacheableIdentifier& operator=(const CacheableIdentifier&) = default;
     CacheableIdentifier& operator=(CacheableIdentifier&&) = default;
 