Title: [187590] branches/jsc-tailcall/Source/JavaScriptCore
Revision: 187590
Author: basile_clem...@apple.com
Date: 2015-07-30 11:21:36 -0700 (Thu, 30 Jul 2015)

Log Message

Merged r187505 from trunk.

    Simplify call linking
    https://bugs.webkit.org/show_bug.cgi?id=147363

    Reviewed by Filip Pizlo.

    Previously, we were passing both the CallLinkInfo and a
    (CodeSpecializationKind, RegisterPreservationMode) pair to the
    various call linking slow paths. However, the CallLinkInfo already
    carries all of that information, and we gain nothing by passing it
    again as additional static parameters - except possibly a very small
    performance win in the presence of inlining. Since these are already
    slow paths, losing that win (if it was ever real) will not be
    visible in practice.
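
    As a rough illustration (generic C++, not JSC's actual classes; all
    names below are made up), the change amounts to slow paths recovering
    the specialization kind and register-preservation mode from the
    CallLinkInfo instead of taking them as extra static parameters:

        #include <cstdio>

        // Hypothetical stand-ins for the real JSC types.
        enum CodeSpecializationKind { CodeForCall, CodeForConstruct };
        enum RegisterPreservationMode { RegisterPreservationNotRequired, MustPreserveRegisters };

        struct CallLinkInfoSketch {
            CodeSpecializationKind kind;
            RegisterPreservationMode registers;
        };

        // Before: kind/mode passed as extra static parameters alongside the info.
        void slowPathBefore(CallLinkInfoSketch*, CodeSpecializationKind kind,
            RegisterPreservationMode registers)
        {
            std::printf("before: kind=%d, registers=%d\n", kind, registers);
        }

        // After: the slow path reads both from the CallLinkInfo it already receives.
        void slowPathAfter(CallLinkInfoSketch* info)
        {
            std::printf("after: kind=%d, registers=%d\n", info->kind, info->registers);
        }

        int main()
        {
            CallLinkInfoSketch info { CodeForCall, RegisterPreservationNotRequired };
            slowPathBefore(&info, info.kind, info.registers);
            slowPathAfter(&info);
            return 0;
        }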

    This patch replaces the various specialized thunks and JIT operations
    for regular and polymorphic call linking with a single thunk and
    operation for each case. Moreover, it replaces the four specialized
    virtual call thunks and operations with one virtual call thunk
    generated per call link info, allowing for better branch prediction
    by the CPU and fixing a pre-existing FIXME.
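
    To see why a per-call-site virtual thunk can predict better than one
    shared thunk, here is a minimal sketch (plain C++, not the JIT code
    itself): a shared stub funnels every call site through the same
    indirect branch, while a per-site stub's branch usually sees a single
    target.

        #include <cstdio>

        using Callee = void (*)();

        void calleeA() { std::puts("A"); }
        void calleeB() { std::puts("B"); }

        // One record per call site, analogous in spirit to a CallLinkInfo.
        struct CallSite {
            Callee target;
        };

        // Shared stub: a single indirect branch services every call site,
        // so the branch predictor sees a mix of targets.
        void sharedVirtualStub(CallSite* site)
        {
            site->target();
        }

        // Per-site stub: conceptually one small stub per call site (in JSC,
        // a thunk generated per CallLinkInfo), so each stub's indirect
        // branch is usually monomorphic and easy to predict.
        struct PerSiteVirtualStub {
            CallSite* site;
            void operator()() const { site->target(); }
        };

        int main()
        {
            CallSite s1 { calleeA };
            CallSite s2 { calleeB };

            sharedVirtualStub(&s1); // the same branch serves both sites
            sharedVirtualStub(&s2);

            PerSiteVirtualStub stub1 { &s1 }; // each stub sticks to its own site
            PerSiteVirtualStub stub2 { &s2 };
            stub1();
            stub2();
            return 0;
        }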

    * bytecode/CallLinkInfo.cpp:
    (JSC::CallLinkInfo::unlink):
    (JSC::CallLinkInfo::dummy): Deleted.
    * bytecode/CallLinkInfo.h:
    (JSC::CallLinkInfo::CallLinkInfo):
    (JSC::CallLinkInfo::registerPreservationMode):
    (JSC::CallLinkInfo::setUpCallFromFTL):
    (JSC::CallLinkInfo::setSlowStub):
    (JSC::CallLinkInfo::clearSlowStub):
    (JSC::CallLinkInfo::slowStub):
    * dfg/DFGDriver.cpp:
    (JSC::DFG::compileImpl):
    * dfg/DFGJITCompiler.cpp:
    (JSC::DFG::JITCompiler::link):
    * ftl/FTLJSCallBase.cpp:
    (JSC::FTL::JSCallBase::link):
    * jit/JITCall.cpp:
    (JSC::JIT::compileCallEvalSlowCase):
    (JSC::JIT::compileOpCall):
    (JSC::JIT::compileOpCallSlowCase):
    * jit/JITCall32_64.cpp:
    (JSC::JIT::compileCallEvalSlowCase):
    (JSC::JIT::compileOpCall):
    (JSC::JIT::compileOpCallSlowCase):
    * jit/JITOperations.cpp:
    * jit/JITOperations.h:
    (JSC::operationLinkFor): Deleted.
    (JSC::operationVirtualFor): Deleted.
    (JSC::operationLinkPolymorphicCallFor): Deleted.
    * jit/Repatch.cpp:
    (JSC::generateByIdStub):
    (JSC::linkSlowFor):
    (JSC::linkFor):
    (JSC::revertCall):
    (JSC::unlinkFor):
    (JSC::linkVirtualFor):
    (JSC::linkPolymorphicCall):
    * jit/Repatch.h:
    * jit/ThunkGenerators.cpp:
    (JSC::linkCallThunkGenerator):
    (JSC::linkPolymorphicCallThunkGenerator):
    (JSC::virtualThunkFor):
    (JSC::linkForThunkGenerator): Deleted.
    (JSC::linkConstructThunkGenerator): Deleted.
    (JSC::linkCallThatPreservesRegsThunkGenerator): Deleted.
    (JSC::linkConstructThatPreservesRegsThunkGenerator): Deleted.
    (JSC::linkPolymorphicCallForThunkGenerator): Deleted.
    (JSC::linkPolymorphicCallThatPreservesRegsThunkGenerator): Deleted.
    (JSC::virtualForThunkGenerator): Deleted.
    (JSC::virtualCallThunkGenerator): Deleted.
    (JSC::virtualConstructThunkGenerator): Deleted.
    (JSC::virtualCallThatPreservesRegsThunkGenerator): Deleted.
    (JSC::virtualConstructThatPreservesRegsThunkGenerator): Deleted.
    * jit/ThunkGenerators.h:
    (JSC::linkThunkGeneratorFor): Deleted.
    (JSC::linkPolymorphicCallThunkGeneratorFor): Deleted.
    (JSC::virtualThunkGeneratorFor): Deleted.

Modified Paths

    branches/jsc-tailcall/Source/JavaScriptCore/ChangeLog
    branches/jsc-tailcall/Source/JavaScriptCore/bytecode/CallLinkInfo.cpp
    branches/jsc-tailcall/Source/JavaScriptCore/bytecode/CallLinkInfo.h
    branches/jsc-tailcall/Source/JavaScriptCore/dfg/DFGDriver.cpp
    branches/jsc-tailcall/Source/JavaScriptCore/dfg/DFGJITCompiler.cpp
    branches/jsc-tailcall/Source/JavaScriptCore/ftl/FTLJSCallBase.cpp
    branches/jsc-tailcall/Source/JavaScriptCore/jit/JITCall.cpp
    branches/jsc-tailcall/Source/JavaScriptCore/jit/JITCall32_64.cpp
    branches/jsc-tailcall/Source/JavaScriptCore/jit/JITOperations.cpp
    branches/jsc-tailcall/Source/JavaScriptCore/jit/JITOperations.h
    branches/jsc-tailcall/Source/JavaScriptCore/jit/Repatch.cpp
    branches/jsc-tailcall/Source/JavaScriptCore/jit/Repatch.h
    branches/jsc-tailcall/Source/JavaScriptCore/jit/ThunkGenerators.cpp
    branches/jsc-tailcall/Source/JavaScriptCore/jit/ThunkGenerators.h

Diff

Modified: branches/jsc-tailcall/Source/JavaScriptCore/ChangeLog (187589 => 187590)


--- branches/jsc-tailcall/Source/_javascript_Core/ChangeLog	2015-07-30 17:19:47 UTC (rev 187589)
+++ branches/jsc-tailcall/Source/_javascript_Core/ChangeLog	2015-07-30 18:21:36 UTC (rev 187590)
@@ -1,5 +1,88 @@
 2015-07-23  Basile Clement  <basile_clem...@apple.com>
 
+        Merged r187505 from trunk.
+
+            Simplify call linking
+            https://bugs.webkit.org/show_bug.cgi?id=147363
+
+            Reviewed by Filip Pizlo.
+
+            Previously, we were passing both the CallLinkInfo and a
+            (CodeSpecializationKind, RegisterPreservationMode) pair to the
+            different call linking slow paths. However, the CallLinkInfo already
+            has all of that information, and we don't gain anything by having them
+            in additional static parameters - except possibly a very small
+            performance gain in presence of inlining. However since those are
+            already slow paths, this performance loss (if it exists) will not be
+            visible in practice.
+
+            This patch removes the various specialized thunks and JIT operations
+            for regular and polymorphic call linking with a single thunk and
+            operation for each case. Moreover, it removes the four specialized
+            virtual call thunks and operations with one virtual call thunk for each
+            call link info, allowing for better branch prediction by the CPU and
+            fixing a pre-existing FIXME.
+
+            * bytecode/CallLinkInfo.cpp:
+            (JSC::CallLinkInfo::unlink):
+            (JSC::CallLinkInfo::dummy): Deleted.
+            * bytecode/CallLinkInfo.h:
+            (JSC::CallLinkInfo::CallLinkInfo):
+            (JSC::CallLinkInfo::registerPreservationMode):
+            (JSC::CallLinkInfo::setUpCallFromFTL):
+            (JSC::CallLinkInfo::setSlowStub):
+            (JSC::CallLinkInfo::clearSlowStub):
+            (JSC::CallLinkInfo::slowStub):
+            * dfg/DFGDriver.cpp:
+            (JSC::DFG::compileImpl):
+            * dfg/DFGJITCompiler.cpp:
+            (JSC::DFG::JITCompiler::link):
+            * ftl/FTLJSCallBase.cpp:
+            (JSC::FTL::JSCallBase::link):
+            * jit/JITCall.cpp:
+            (JSC::JIT::compileCallEvalSlowCase):
+            (JSC::JIT::compileOpCall):
+            (JSC::JIT::compileOpCallSlowCase):
+            * jit/JITCall32_64.cpp:
+            (JSC::JIT::compileCallEvalSlowCase):
+            (JSC::JIT::compileOpCall):
+            (JSC::JIT::compileOpCallSlowCase):
+            * jit/JITOperations.cpp:
+            * jit/JITOperations.h:
+            (JSC::operationLinkFor): Deleted.
+            (JSC::operationVirtualFor): Deleted.
+            (JSC::operationLinkPolymorphicCallFor): Deleted.
+            * jit/Repatch.cpp:
+            (JSC::generateByIdStub):
+            (JSC::linkSlowFor):
+            (JSC::linkFor):
+            (JSC::revertCall):
+            (JSC::unlinkFor):
+            (JSC::linkVirtualFor):
+            (JSC::linkPolymorphicCall):
+            * jit/Repatch.h:
+            * jit/ThunkGenerators.cpp:
+            (JSC::linkCallThunkGenerator):
+            (JSC::linkPolymorphicCallThunkGenerator):
+            (JSC::virtualThunkFor):
+            (JSC::linkForThunkGenerator): Deleted.
+            (JSC::linkConstructThunkGenerator): Deleted.
+            (JSC::linkCallThatPreservesRegsThunkGenerator): Deleted.
+            (JSC::linkConstructThatPreservesRegsThunkGenerator): Deleted.
+            (JSC::linkPolymorphicCallForThunkGenerator): Deleted.
+            (JSC::linkPolymorphicCallThatPreservesRegsThunkGenerator): Deleted.
+            (JSC::virtualForThunkGenerator): Deleted.
+            (JSC::virtualCallThunkGenerator): Deleted.
+            (JSC::virtualConstructThunkGenerator): Deleted.
+            (JSC::virtualCallThatPreservesRegsThunkGenerator): Deleted.
+            (JSC::virtualConstructThatPreservesRegsThunkGenerator): Deleted.
+            * jit/ThunkGenerators.h:
+            (JSC::linkThunkGeneratorFor): Deleted.
+            (JSC::linkPolymorphicCallThunkGeneratorFor): Deleted.
+            (JSC::virtualThunkGeneratorFor): Deleted.
+
+2015-07-23  Basile Clement  <basile_clem...@apple.com>
+
         jsc-tailcall: Repatching tail calls as jump should depend on the opcode, not the JS CallLinkInfo
         https://bugs.webkit.org/show_bug.cgi?id=147243
 

Modified: branches/jsc-tailcall/Source/JavaScriptCore/bytecode/CallLinkInfo.cpp (187589 => 187590)


--- branches/jsc-tailcall/Source/_javascript_Core/bytecode/CallLinkInfo.cpp	2015-07-30 17:19:47 UTC (rev 187589)
+++ branches/jsc-tailcall/Source/_javascript_Core/bytecode/CallLinkInfo.cpp	2015-07-30 18:21:36 UTC (rev 187590)
@@ -55,10 +55,7 @@
         return;
     }
     
-    unlinkFor(
-        repatchBuffer, *this,
-        (m_callType == Construct || m_callType == ConstructVarargs)? CodeForConstruct : CodeForCall,
-        m_isFTL ? MustPreserveRegisters : RegisterPreservationNotRequired);
+    unlinkFor(repatchBuffer, *this);
 
     // It will be on a list if the callee has a code block.
     if (isOnList())
@@ -104,12 +101,6 @@
     }
 }
 
-CallLinkInfo& CallLinkInfo::dummy()
-{
-    static NeverDestroyed<CallLinkInfo> dummy;
-    return dummy;
-}
-
 } // namespace JSC
 #endif // ENABLE(JIT)
 

Modified: branches/jsc-tailcall/Source/JavaScriptCore/bytecode/CallLinkInfo.h (187589 => 187590)


--- branches/jsc-tailcall/Source/_javascript_Core/bytecode/CallLinkInfo.h	2015-07-30 17:19:47 UTC (rev 187589)
+++ branches/jsc-tailcall/Source/_javascript_Core/bytecode/CallLinkInfo.h	2015-07-30 18:21:36 UTC (rev 187590)
@@ -86,7 +86,7 @@
     }
     
     CallLinkInfo()
-        : m_isFTL(false)
+        : m_registerPreservationMode(static_cast<unsigned>(RegisterPreservationNotRequired))
         , m_hasSeenShouldRepatch(false)
         , m_hasSeenClosure(false)
         , m_clearedByGC(false)
@@ -113,6 +113,11 @@
         return specializationKindFor(static_cast<CallType>(m_callType));
     }
 
+    RegisterPreservationMode registerPreservationMode() const
+    {
+        return static_cast<RegisterPreservationMode>(m_registerPreservationMode);
+    }
+
     bool isLinked() { return m_stub || m_callee; }
     void unlink(RepatchBuffer&);
 
@@ -135,7 +140,7 @@
         CodeLocationNearCall callReturnLocation, CodeLocationDataLabelPtr hotPathBegin,
         CodeLocationNearCall hotPathOther, unsigned calleeGPR)
     {
-        m_isFTL = true;
+        m_registerPreservationMode = static_cast<unsigned>(MustPreserveRegisters);
         m_callType = callType;
         m_codeOrigin = codeOrigin;
         m_callReturnLocation = callReturnLocation;
@@ -207,9 +212,9 @@
         return m_stub.get();
     }
 
-    void setSlowStub(PassRefPtr<GCAwareJITStubRoutine> newStub)
+    void setSlowStub(PassRefPtr<JITStubRoutine> newSlowStub)
     {
-        m_slowStub = newStub;
+        m_slowStub = newSlowStub;
     }
 
     void clearSlowStub()
@@ -217,7 +222,7 @@
         m_slowStub = nullptr;
     }
 
-    GCAwareJITStubRoutine* slowStub()
+    JITStubRoutine* slowStub()
     {
         return m_slowStub.get();
     }
@@ -304,8 +309,6 @@
 
     void visitWeak(RepatchBuffer&);
 
-    static CallLinkInfo& dummy();
-
 private:
     CodeLocationNearCall m_callReturnLocation;
     CodeLocationDataLabelPtr m_hotPathBegin;
@@ -313,8 +316,8 @@
     JITWriteBarrier<JSFunction> m_callee;
     WriteBarrier<JSFunction> m_lastSeenCallee;
     RefPtr<PolymorphicCallStubRoutine> m_stub;
-    RefPtr<GCAwareJITStubRoutine> m_slowStub;
-    bool m_isFTL : 1;
+    RefPtr<JITStubRoutine> m_slowStub;
+    unsigned m_registerPreservationMode : 1; // Real type is RegisterPreservationMode
     bool m_hasSeenShouldRepatch : 1;
     bool m_hasSeenClosure : 1;
     bool m_clearedByGC : 1;

Modified: branches/jsc-tailcall/Source/JavaScriptCore/dfg/DFGDriver.cpp (187589 => 187590)


--- branches/jsc-tailcall/Source/_javascript_Core/dfg/DFGDriver.cpp	2015-07-30 17:19:47 UTC (rev 187589)
+++ branches/jsc-tailcall/Source/_javascript_Core/dfg/DFGDriver.cpp	2015-07-30 18:21:36 UTC (rev 187590)
@@ -81,19 +81,8 @@
     // make sure that all JIT code generation does finalization on the main thread.
     vm.getCTIStub(osrExitGenerationThunkGenerator);
     vm.getCTIStub(throwExceptionFromCallSlowPathGenerator);
-    if (mode == DFGMode) {
-        vm.getCTIStub(linkCallThunkGenerator);
-        vm.getCTIStub(linkConstructThunkGenerator);
-        vm.getCTIStub(linkPolymorphicCallThunkGenerator);
-        vm.getCTIStub(virtualCallThunkGenerator);
-        vm.getCTIStub(virtualConstructThunkGenerator);
-    } else {
-        vm.getCTIStub(linkCallThatPreservesRegsThunkGenerator);
-        vm.getCTIStub(linkConstructThatPreservesRegsThunkGenerator);
-        vm.getCTIStub(linkPolymorphicCallThatPreservesRegsThunkGenerator);
-        vm.getCTIStub(virtualCallThatPreservesRegsThunkGenerator);
-        vm.getCTIStub(virtualConstructThatPreservesRegsThunkGenerator);
-    }
+    vm.getCTIStub(linkCallThunkGenerator);
+    vm.getCTIStub(linkPolymorphicCallThunkGenerator);
     
     if (vm.typeProfiler())
         vm.typeProfilerLog()->processLogEntries(ASCIILiteral("Preparing for DFG compilation."));

Modified: branches/jsc-tailcall/Source/JavaScriptCore/dfg/DFGJITCompiler.cpp (187589 => 187590)


--- branches/jsc-tailcall/Source/_javascript_Core/dfg/DFGJITCompiler.cpp	2015-07-30 17:19:47 UTC (rev 187589)
+++ branches/jsc-tailcall/Source/_javascript_Core/dfg/DFGJITCompiler.cpp	2015-07-30 18:21:36 UTC (rev 187590)
@@ -243,9 +243,7 @@
     for (unsigned i = 0; i < m_jsCalls.size(); ++i) {
         JSCallRecord& record = m_jsCalls[i];
         CallLinkInfo& info = *record.m_info;
-        linkBuffer.link(
-            record.m_slowCall,
-            FunctionPtr(linkCallThunk(vm(), info, info.specializationKind(), RegisterPreservationNotRequired).code().executableAddress()));
+        linkBuffer.link(record.m_slowCall, FunctionPtr(m_vm->getCTIStub(linkCallThunkGenerator).code().executableAddress()));
         info.setCallLocations(linkBuffer.locationOfNearCall(record.m_slowCall),
             linkBuffer.locationOf(record.m_targetToCheck),
             linkBuffer.locationOfNearCall(record.m_fastCall));

Modified: branches/jsc-tailcall/Source/JavaScriptCore/ftl/FTLJSCallBase.cpp (187589 => 187590)


--- branches/jsc-tailcall/Source/_javascript_Core/ftl/FTLJSCallBase.cpp	2015-07-30 17:19:47 UTC (rev 187589)
+++ branches/jsc-tailcall/Source/_javascript_Core/ftl/FTLJSCallBase.cpp	2015-07-30 18:21:36 UTC (rev 187590)
@@ -69,11 +69,8 @@
 
 void JSCallBase::link(VM& vm, LinkBuffer& linkBuffer)
 {
-    MacroAssemblerCodeRef codeRef =
-        linkCallThunk(&vm, *m_callLinkInfo, CallLinkInfo::specializationKindFor(m_type), MustPreserveRegisters);
-    
     linkBuffer.link(
-        m_slowCall, FunctionPtr(codeRef.code().executableAddress()));
+        m_slowCall, FunctionPtr(vm.getCTIStub(linkCallThunkGenerator).code().executableAddress()));
 
     m_callLinkInfo->setUpCallFromFTL(m_type, m_origin, linkBuffer.locationOfNearCall(m_slowCall),
         linkBuffer.locationOf(m_targetToCheck), linkBuffer.locationOfNearCall(m_fastCall),

Modified: branches/jsc-tailcall/Source/JavaScriptCore/jit/JITCall.cpp (187589 => 187590)


--- branches/jsc-tailcall/Source/_javascript_Core/jit/JITCall.cpp	2015-07-30 17:19:47 UTC (rev 187589)
+++ branches/jsc-tailcall/Source/_javascript_Core/jit/JITCall.cpp	2015-07-30 18:21:36 UTC (rev 187590)
@@ -106,14 +106,19 @@
 
 void JIT::compileCallEvalSlowCase(Instruction* instruction, Vector<SlowCaseEntry>::iterator& iter)
 {
+    CallLinkInfo* info = m_codeBlock->addCallLinkInfo();
+    info->setUpCall(CallLinkInfo::Call, CodeOrigin(m_bytecodeOffset), regT0);
+
     linkSlowCase(iter);
     int registerOffset = -instruction[4].u.operand;
 
     addPtr(TrustedImm32(registerOffset * sizeof(Register) + sizeof(CallerFrameAndPC)), callFrameRegister, stackPointerRegister);
 
     load64(Address(stackPointerRegister, sizeof(Register) * JSStack::Callee - sizeof(CallerFrameAndPC)), regT0);
-    move(TrustedImmPtr(&CallLinkInfo::dummy()), regT2);
-    emitNakedCall(m_vm->getCTIStub(virtualCallThunkGenerator).code());
+    move(TrustedImmPtr(info), regT2);
+    MacroAssemblerCodeRef virtualThunk = virtualThunkFor(m_vm, *info);
+    info->setSlowStub(createJITStubRoutine(virtualThunk, *m_vm, nullptr, true));
+    emitNakedCall(virtualThunk.code());
     addPtr(TrustedImm32(stackPointerOffsetFor(m_codeBlock) * sizeof(Register)), callFrameRegister, stackPointerRegister);
     checkStackPointerAlignment();
 
@@ -124,8 +129,6 @@
 
 void JIT::compileOpCall(OpcodeID opcodeID, Instruction* instruction, unsigned callLinkInfoIndex)
 {
-    CallLinkInfo* info = m_codeBlock->addCallLinkInfo();
-
     int callee = instruction[2].u.operand;
 
     /* Caller always:
@@ -145,6 +148,9 @@
     COMPILE_ASSERT(OPCODE_LENGTH(op_call) == OPCODE_LENGTH(op_construct_varargs), call_and_construct_varargs_opcodes_must_be_same_length);
     COMPILE_ASSERT(OPCODE_LENGTH(op_call) == OPCODE_LENGTH(op_tail_call), call_and_tail_call_opcodes_must_be_same_length);
     COMPILE_ASSERT(OPCODE_LENGTH(op_call) == OPCODE_LENGTH(op_tail_call_varargs), call_and_tail_call_varargs_opcodes_must_be_same_length);
+    CallLinkInfo* info;
+    if (opcodeID != op_call_eval)
+        info = m_codeBlock->addCallLinkInfo();
     if (opcodeID == op_call_varargs || opcodeID == op_construct_varargs || opcodeID == op_tail_call_varargs)
         compileSetupVarargsFrame(instruction, info);
     else {
@@ -212,14 +218,9 @@
 
     linkSlowCase(iter);
 
-    MacroAssemblerCodeRef codeRef =
-        linkCallThunk(m_vm, *m_callCompilationInfo[callLinkInfoIndex].callLinkInfo,
-            opcodeID == op_construct || opcodeID == op_construct_varargs ? CodeForConstruct : CodeForCall,
-            RegisterPreservationNotRequired);
-
     move(TrustedImmPtr(m_callCompilationInfo[callLinkInfoIndex].callLinkInfo), regT2);
 
-    m_callCompilationInfo[callLinkInfoIndex].callReturnLocation = emitNakedCall(codeRef.code());
+    m_callCompilationInfo[callLinkInfoIndex].callReturnLocation = emitNakedCall(m_vm->getCTIStub(linkCallThunkGenerator).code());
 
     if (opcodeID == op_tail_call || opcodeID == op_tail_call_varargs) {
         // We must never come back here

Modified: branches/jsc-tailcall/Source/JavaScriptCore/jit/JITCall32_64.cpp (187589 => 187590)


--- branches/jsc-tailcall/Source/_javascript_Core/jit/JITCall32_64.cpp	2015-07-30 17:19:47 UTC (rev 187589)
+++ branches/jsc-tailcall/Source/_javascript_Core/jit/JITCall32_64.cpp	2015-07-30 18:21:36 UTC (rev 187590)
@@ -186,6 +186,9 @@
 
 void JIT::compileCallEvalSlowCase(Instruction* instruction, Vector<SlowCaseEntry>::iterator& iter)
 {
+    CallLinkInfo* info = m_codeBlock->addCallLinkInfo();
+    info->setUpCall(CallLinkInfo::Call, CodeOrigin(m_bytecodeOffset), regT0);
+
     linkSlowCase(iter);
 
     int registerOffset = -instruction[4].u.operand;
@@ -194,10 +197,12 @@
 
     loadPtr(Address(stackPointerRegister, sizeof(Register) * JSStack::Callee - sizeof(CallerFrameAndPC)), regT0);
     loadPtr(Address(stackPointerRegister, sizeof(Register) * JSStack::Callee - sizeof(CallerFrameAndPC)), regT1);
-    move(TrustedImmPtr(&CallLinkInfo::dummy()), regT2);
+    move(TrustedImmPtr(info), regT2);
 
     emitLoad(JSStack::Callee, regT1, regT0);
-    emitNakedCall(m_vm->getCTIStub(virtualCallThunkGenerator).code());
+    MacroAssemblerCodeRef virtualThunk = virtualThunkFor(m_vm, *info);
+    info->setSlowStub(createJITStubRoutine(virtualThunk, *m_vm, nullptr, true));
+    emitNakedCall(virtualThunk.code());
     addPtr(TrustedImm32(stackPointerOffsetFor(m_codeBlock) * sizeof(Register)), callFrameRegister, stackPointerRegister);
     checkStackPointerAlignment();
 
@@ -208,7 +213,6 @@
 
 void JIT::compileOpCall(OpcodeID opcodeID, Instruction* instruction, unsigned callLinkInfoIndex)
 {
-    CallLinkInfo* info = m_codeBlock->addCallLinkInfo();
     int callee = instruction[2].u.operand;
 
     /* Caller always:
@@ -223,7 +227,9 @@
         - Caller initializes ReturnPC; CodeBlock.
         - Caller restores callFrameRegister after return.
     */
-    
+    CallLinkInfo* info;
+    if (opcodeID != op_call_eval)
+        info = m_codeBlock->addCallLinkInfo();
     if (opcodeID == op_call_varargs || opcodeID == op_construct_varargs || opcodeID == op_tail_call_varargs)
         compileSetupVarargsFrame(instruction, info);
     else {
@@ -296,14 +302,9 @@
     linkSlowCase(iter);
     linkSlowCase(iter);
 
-    MacroAssemblerCodeRef codeRef =
-        linkCallThunk(m_vm, *m_callCompilationInfo[callLinkInfoIndex].callLinkInfo,
-            opcodeID == op_construct || opcodeID == op_construct_varargs ? CodeForConstruct : CodeForCall,
-            RegisterPreservationNotRequired);
-    
     move(TrustedImmPtr(m_callCompilationInfo[callLinkInfoIndex].callLinkInfo), regT2);
 
-    m_callCompilationInfo[callLinkInfoIndex].callReturnLocation = emitNakedCall(codeRef.code());
+    m_callCompilationInfo[callLinkInfoIndex].callReturnLocation = emitNakedCall(m_vm->getCTIStub(linkCallThunkGenerator).code());
 
     if (opcodeID == op_tail_call || opcodeID == op_tail_call_varargs) {
         // We must never come back here

Modified: branches/jsc-tailcall/Source/JavaScriptCore/jit/JITOperations.cpp (187589 => 187590)


--- branches/jsc-tailcall/Source/_javascript_Core/jit/JITOperations.cpp	2015-07-30 17:19:47 UTC (rev 187589)
+++ branches/jsc-tailcall/Source/_javascript_Core/jit/JITOperations.cpp	2015-07-30 18:21:36 UTC (rev 187590)
@@ -755,12 +755,11 @@
     return encodeResult(vm->getCTIStub(throwExceptionFromCallSlowPathGenerator).code().executableAddress(), reinterpret_cast<void*>(0));
 }
 
-inline SlowPathReturnType linkFor(
-    ExecState* execCallee, CallLinkInfo* callLinkInfo, CodeSpecializationKind kind,
-    RegisterPreservationMode registers)
+SlowPathReturnType JIT_OPERATION operationLinkCall(ExecState* execCallee, CallLinkInfo* callLinkInfo)
 {
     ExecState* exec = execCallee->callerFrame();
     VM* vm = &exec->vm();
+    CodeSpecializationKind kind = callLinkInfo->specializationKind();
     NativeCallFrameTracer tracer(vm, exec);
     
     JSValue calleeAsValue = execCallee->calleeAsValue();
@@ -779,7 +778,7 @@
     MacroAssemblerCodePtr codePtr;
     CodeBlock* codeBlock = 0;
     if (executable->isHostFunction())
-        codePtr = executable->entrypointFor(*vm, kind, MustCheckArity, registers);
+        codePtr = executable->entrypointFor(*vm, kind, MustCheckArity, callLinkInfo->registerPreservationMode());
     else {
         FunctionExecutable* functionExecutable = static_cast<FunctionExecutable*>(executable);
 
@@ -799,42 +798,22 @@
             arity = MustCheckArity;
         else
             arity = ArityCheckNotRequired;
-        codePtr = functionExecutable->entrypointFor(*vm, kind, arity, registers);
+        codePtr = functionExecutable->entrypointFor(*vm, kind, arity, callLinkInfo->registerPreservationMode());
     }
     if (!callLinkInfo->seenOnce())
         callLinkInfo->setSeen();
     else
-        linkFor(execCallee, *callLinkInfo, codeBlock, callee, codePtr, kind, registers);
-
+        linkFor(execCallee, *callLinkInfo, codeBlock, callee, codePtr);
+    
     return encodeResult(codePtr.executableAddress(), reinterpret_cast<void*>(1));
 }
 
-SlowPathReturnType JIT_OPERATION operationLinkCall(ExecState* execCallee, CallLinkInfo* callLinkInfo)
-{
-    return linkFor(execCallee, callLinkInfo, CodeForCall, RegisterPreservationNotRequired);
-}
-
-SlowPathReturnType JIT_OPERATION operationLinkConstruct(ExecState* execCallee, CallLinkInfo* callLinkInfo)
-{
-    return linkFor(execCallee, callLinkInfo, CodeForConstruct, RegisterPreservationNotRequired);
-}
-
-SlowPathReturnType JIT_OPERATION operationLinkCallThatPreservesRegs(ExecState* execCallee, CallLinkInfo* callLinkInfo)
-{
-    return linkFor(execCallee, callLinkInfo, CodeForCall, MustPreserveRegisters);
-}
-
-SlowPathReturnType JIT_OPERATION operationLinkConstructThatPreservesRegs(ExecState* execCallee, CallLinkInfo* callLinkInfo)
-{
-    return linkFor(execCallee, callLinkInfo, CodeForConstruct, MustPreserveRegisters);
-}
-
 inline SlowPathReturnType virtualForWithFunction(
-    ExecState* execCallee, CodeSpecializationKind kind, RegisterPreservationMode registers,
-    JSCell*& calleeAsFunctionCell)
+    ExecState* execCallee, CallLinkInfo* callLinkInfo, JSCell*& calleeAsFunctionCell)
 {
     ExecState* exec = execCallee->callerFrame();
     VM* vm = &exec->vm();
+    CodeSpecializationKind kind = callLinkInfo->specializationKind();
     NativeCallFrameTracer tracer(vm, exec);
 
     JSValue calleeAsValue = execCallee->calleeAsValue();
@@ -860,56 +839,27 @@
         }
     }
     return encodeResult(executable->entrypointFor(
-        *vm, kind, MustCheckArity, registers).executableAddress(), reinterpret_cast<void*>(1));
+        *vm, kind, MustCheckArity, callLinkInfo->registerPreservationMode()).executableAddress(),
+        reinterpret_cast<void*>(1));
 }
 
-inline SlowPathReturnType virtualFor(
-    ExecState* execCallee, CodeSpecializationKind kind, RegisterPreservationMode registers)
-{
-    JSCell* calleeAsFunctionCellIgnored;
-    return virtualForWithFunction(execCallee, kind, registers, calleeAsFunctionCellIgnored);
-}
-
 SlowPathReturnType JIT_OPERATION operationLinkPolymorphicCall(ExecState* execCallee, CallLinkInfo* callLinkInfo)
 {
+    ASSERT(callLinkInfo->specializationKind() == CodeForCall);
     JSCell* calleeAsFunctionCell;
-    SlowPathReturnType result = virtualForWithFunction(execCallee, CodeForCall, RegisterPreservationNotRequired, calleeAsFunctionCell);
+    SlowPathReturnType result = virtualForWithFunction(execCallee, callLinkInfo, calleeAsFunctionCell);
 
-    linkPolymorphicCall(execCallee, *callLinkInfo, CallVariant(calleeAsFunctionCell), RegisterPreservationNotRequired);
+    linkPolymorphicCall(execCallee, *callLinkInfo, CallVariant(calleeAsFunctionCell));
     
     return result;
 }
 
-SlowPathReturnType JIT_OPERATION operationVirtualCall(ExecState* execCallee, CallLinkInfo*)
-{    
-    return virtualFor(execCallee, CodeForCall, RegisterPreservationNotRequired);
-}
-
-SlowPathReturnType JIT_OPERATION operationVirtualConstruct(ExecState* execCallee, CallLinkInfo*)
+SlowPathReturnType JIT_OPERATION operationVirtualCall(ExecState* execCallee, CallLinkInfo* callLinkInfo)
 {
-    return virtualFor(execCallee, CodeForConstruct, RegisterPreservationNotRequired);
+    JSCell* calleeAsFunctionCellIgnored;
+    return virtualForWithFunction(execCallee, callLinkInfo, calleeAsFunctionCellIgnored);
 }
 
-SlowPathReturnType JIT_OPERATION operationLinkPolymorphicCallThatPreservesRegs(ExecState* execCallee, CallLinkInfo* callLinkInfo)
-{
-    JSCell* calleeAsFunctionCell;
-    SlowPathReturnType result = virtualForWithFunction(execCallee, CodeForCall, MustPreserveRegisters, calleeAsFunctionCell);
-
-    linkPolymorphicCall(execCallee, *callLinkInfo, CallVariant(calleeAsFunctionCell), MustPreserveRegisters);
-    
-    return result;
-}
-
-SlowPathReturnType JIT_OPERATION operationVirtualCallThatPreservesRegs(ExecState* execCallee, CallLinkInfo*)
-{    
-    return virtualFor(execCallee, CodeForCall, MustPreserveRegisters);
-}
-
-SlowPathReturnType JIT_OPERATION operationVirtualConstructThatPreservesRegs(ExecState* execCallee, CallLinkInfo*)
-{
-    return virtualFor(execCallee, CodeForConstruct, MustPreserveRegisters);
-}
-
 size_t JIT_OPERATION operationCompareLess(ExecState* exec, EncodedJSValue encodedOp1, EncodedJSValue encodedOp2)
 {
     VM* vm = &exec->vm();

Modified: branches/jsc-tailcall/Source/JavaScriptCore/jit/JITOperations.h (187589 => 187590)


--- branches/jsc-tailcall/Source/_javascript_Core/jit/JITOperations.h	2015-07-30 17:19:47 UTC (rev 187589)
+++ branches/jsc-tailcall/Source/_javascript_Core/jit/JITOperations.h	2015-07-30 18:21:36 UTC (rev 187590)
@@ -267,13 +267,6 @@
 SlowPathReturnType JIT_OPERATION operationLinkCall(ExecState*, CallLinkInfo*) WTF_INTERNAL;
 SlowPathReturnType JIT_OPERATION operationLinkPolymorphicCall(ExecState*, CallLinkInfo*) WTF_INTERNAL;
 SlowPathReturnType JIT_OPERATION operationVirtualCall(ExecState*, CallLinkInfo*) WTF_INTERNAL;
-SlowPathReturnType JIT_OPERATION operationVirtualConstruct(ExecState*, CallLinkInfo*) WTF_INTERNAL;
-SlowPathReturnType JIT_OPERATION operationLinkConstruct(ExecState*, CallLinkInfo*) WTF_INTERNAL;
-SlowPathReturnType JIT_OPERATION operationLinkCallThatPreservesRegs(ExecState*, CallLinkInfo*) WTF_INTERNAL;
-SlowPathReturnType JIT_OPERATION operationLinkPolymorphicCallThatPreservesRegs(ExecState*, CallLinkInfo*) WTF_INTERNAL;
-SlowPathReturnType JIT_OPERATION operationVirtualCallThatPreservesRegs(ExecState*, CallLinkInfo*) WTF_INTERNAL;
-SlowPathReturnType JIT_OPERATION operationVirtualConstructThatPreservesRegs(ExecState*, CallLinkInfo*) WTF_INTERNAL;
-SlowPathReturnType JIT_OPERATION operationLinkConstructThatPreservesRegs(ExecState*, CallLinkInfo*) WTF_INTERNAL;
 
 size_t JIT_OPERATION operationCompareLess(ExecState*, EncodedJSValue, EncodedJSValue) WTF_INTERNAL;
 size_t JIT_OPERATION operationCompareLessEq(ExecState*, EncodedJSValue, EncodedJSValue) WTF_INTERNAL;
@@ -357,68 +350,6 @@
 
 } // extern "C"
 
-inline Sprt_JITOperation_ECli operationLinkFor(
-    CodeSpecializationKind kind, RegisterPreservationMode registers)
-{
-    switch (kind) {
-    case CodeForCall:
-        switch (registers) {
-        case RegisterPreservationNotRequired:
-            return operationLinkCall;
-        case MustPreserveRegisters:
-            return operationLinkCallThatPreservesRegs;
-        }
-        break;
-    case CodeForConstruct:
-        switch (registers) {
-        case RegisterPreservationNotRequired:
-            return operationLinkConstruct;
-        case MustPreserveRegisters:
-            return operationLinkConstructThatPreservesRegs;
-        }
-        break;
-    }
-    RELEASE_ASSERT_NOT_REACHED();
-    return 0;
-}
-
-inline Sprt_JITOperation_ECli operationVirtualFor(
-    CodeSpecializationKind kind, RegisterPreservationMode registers)
-{
-    switch (kind) {
-    case CodeForCall:
-        switch (registers) {
-        case RegisterPreservationNotRequired:
-            return operationVirtualCall;
-        case MustPreserveRegisters:
-            return operationVirtualCallThatPreservesRegs;
-        }
-        break;
-    case CodeForConstruct:
-        switch (registers) {
-        case RegisterPreservationNotRequired:
-            return operationVirtualConstruct;
-        case MustPreserveRegisters:
-            return operationVirtualConstructThatPreservesRegs;
-        }
-        break;
-    }
-    RELEASE_ASSERT_NOT_REACHED();
-    return 0;
-}
-
-inline Sprt_JITOperation_ECli operationLinkPolymorphicCallFor(RegisterPreservationMode registers)
-{
-    switch (registers) {
-    case RegisterPreservationNotRequired:
-        return operationLinkPolymorphicCall;
-    case MustPreserveRegisters:
-        return operationLinkPolymorphicCallThatPreservesRegs;
-    }
-    RELEASE_ASSERT_NOT_REACHED();
-    return 0;
-}
-
 } // namespace JSC
 
 #endif // ENABLE(JIT)

Modified: branches/jsc-tailcall/Source/JavaScriptCore/jit/Repatch.cpp (187589 => 187590)


--- branches/jsc-tailcall/Source/_javascript_Core/jit/Repatch.cpp	2015-07-30 17:19:47 UTC (rev 187589)
+++ branches/jsc-tailcall/Source/_javascript_Core/jit/Repatch.cpp	2015-07-30 18:21:36 UTC (rev 187590)
@@ -587,9 +587,7 @@
             patchBuffer.locationOfNearCall(fastPathCall));
 
         patchBuffer.link(
-            slowPathCall, FunctionPtr(linkCallThunk(vm, *callLinkInfo, CodeForCall, RegisterPreservationNotRequired).code().executableAddress()));
-        // This should always use a global stub from the VM
-        ASSERT(!callLinkInfo->slowStub());
+            slowPathCall, CodeLocationLabel(vm->getCTIStub(linkCallThunkGenerator).code()));
     }
     
     MacroAssemblerCodeRef code = FINALIZE_CODE_FOR(
@@ -1601,18 +1599,29 @@
 }
 
 static void linkSlowFor(
-    RepatchBuffer& repatchBuffer, VM* vm, CallLinkInfo& callLinkInfo,
-    CodeSpecializationKind kind, RegisterPreservationMode registers)
+    RepatchBuffer& repatchBuffer, VM*, CallLinkInfo& callLinkInfo, MacroAssemblerCodeRef codeRef)
 {
     repatchBuffer.relink(
-        callLinkInfo.callReturnLocation(),
-        virtualThunk(vm, callLinkInfo, kind, registers).code());
+        callLinkInfo.callReturnLocation(), codeRef.code());
 }
 
+static void linkSlowFor(
+    RepatchBuffer& repatchBuffer, VM* vm, CallLinkInfo& callLinkInfo, ThunkGenerator generator)
+{
+    linkSlowFor(repatchBuffer, vm, callLinkInfo, vm->getCTIStub(generator));
+}
+
+static void linkSlowFor(
+    RepatchBuffer& repatchBuffer, VM* vm, CallLinkInfo& callLinkInfo)
+{
+    MacroAssemblerCodeRef virtualThunk = virtualThunkFor(vm, callLinkInfo);
+    linkSlowFor(repatchBuffer, vm, callLinkInfo, virtualThunk);
+    callLinkInfo.setSlowStub(createJITStubRoutine(virtualThunk, *vm, nullptr, true));
+}
+
 void linkFor(
     ExecState* exec, CallLinkInfo& callLinkInfo, CodeBlock* calleeCodeBlock,
-    JSFunction* callee, MacroAssemblerCodePtr codePtr, CodeSpecializationKind kind,
-    RegisterPreservationMode registers)
+    JSFunction* callee, MacroAssemblerCodePtr codePtr)
 {
     ASSERT(!callLinkInfo.stub());
     
@@ -1632,38 +1641,34 @@
     if (calleeCodeBlock)
         calleeCodeBlock->linkIncomingCall(exec->callerFrame(), &callLinkInfo);
     
-    if (kind == CodeForCall) {
-        repatchBuffer.relink(
-            callLinkInfo.callReturnLocation(),
-            linkPolymorphicCallThunk(vm, callLinkInfo, registers).code());
+    if (callLinkInfo.specializationKind() == CodeForCall) {
+        linkSlowFor(
+            repatchBuffer, vm, callLinkInfo, linkPolymorphicCallThunkGenerator);
         return;
     }
     
-    ASSERT(kind == CodeForConstruct);
-    ASSERT(!CallLinkInfo::isTailCallType(callLinkInfo.callType()));
-    linkSlowFor(repatchBuffer, vm, callLinkInfo, CodeForConstruct, registers);
+    ASSERT(callLinkInfo.specializationKind() == CodeForConstruct);
+    linkSlowFor(repatchBuffer, vm, callLinkInfo);
 }
 
 void linkSlowFor(
-    ExecState* exec, CallLinkInfo& callLinkInfo, CodeSpecializationKind kind,
-    RegisterPreservationMode registers)
+    ExecState* exec, CallLinkInfo& callLinkInfo)
 {
     CodeBlock* callerCodeBlock = exec->callerFrame()->codeBlock();
     VM* vm = callerCodeBlock->vm();
     
     RepatchBuffer repatchBuffer(callerCodeBlock);
     
-    linkSlowFor(repatchBuffer, vm, callLinkInfo, kind, registers);
+    linkSlowFor(repatchBuffer, vm, callLinkInfo);
 }
 
 static void revertCall(
-    RepatchBuffer& repatchBuffer, CallLinkInfo& callLinkInfo, MacroAssemblerCodeRef codeRef)
+    RepatchBuffer& repatchBuffer, VM* vm, CallLinkInfo& callLinkInfo, MacroAssemblerCodeRef codeRef)
 {
     repatchBuffer.revertJumpReplacementToBranchPtrWithPatch(
         RepatchBuffer::startOfBranchPtrWithPatchOnRegister(callLinkInfo.hotPathBegin()),
         static_cast<MacroAssembler::RegisterID>(callLinkInfo.calleeGPR()), 0);
-    repatchBuffer.relink(
-        callLinkInfo.callReturnLocation(), codeRef.code());
+    linkSlowFor(repatchBuffer, vm, callLinkInfo, codeRef);
     callLinkInfo.clearSeen();
     callLinkInfo.clearCallee();
     callLinkInfo.clearStub();
@@ -1673,24 +1678,18 @@
 }
 
 void unlinkFor(
-    RepatchBuffer& repatchBuffer, CallLinkInfo& callLinkInfo,
-    CodeSpecializationKind kind, RegisterPreservationMode registers)
+    RepatchBuffer& repatchBuffer, CallLinkInfo& callLinkInfo)
 {
     if (Options::showDisassembly())
         dataLog("Unlinking call from ", callLinkInfo.callReturnLocation(), " in request from ", pointerDump(repatchBuffer.codeBlock()), "\n");
     
-    revertCall(
-        repatchBuffer, callLinkInfo,
-        linkCallThunk(repatchBuffer.codeBlock()->vm(), callLinkInfo, kind, registers));
+    VM* vm = repatchBuffer.codeBlock()->vm();
+    revertCall(repatchBuffer, vm, callLinkInfo, vm->getCTIStub(linkCallThunkGenerator));
 }
 
 void linkVirtualFor(
-    ExecState* exec, CallLinkInfo& callLinkInfo,
-    CodeSpecializationKind kind, RegisterPreservationMode registers)
+    ExecState* exec, CallLinkInfo& callLinkInfo)
 {
-    // FIXME: We could generate a virtual call stub here. This would lead to faster virtual calls
-    // by eliminating the branch prediction bottleneck inside the shared virtual call thunk.
-
     CodeBlock* callerCodeBlock = exec->callerFrame()->codeBlock();
     VM* vm = callerCodeBlock->vm();
 
@@ -1698,7 +1697,9 @@
         dataLog("Linking virtual call at ", *callerCodeBlock, " ", exec->callerFrame()->codeOrigin(), "\n");
     
     RepatchBuffer repatchBuffer(callerCodeBlock);
-    revertCall(repatchBuffer, callLinkInfo, virtualThunk(vm, callLinkInfo, kind, registers));
+    MacroAssemblerCodeRef virtualThunk = virtualThunkFor(vm, callLinkInfo);
+    revertCall(repatchBuffer, vm, callLinkInfo, virtualThunk);
+    callLinkInfo.setSlowStub(createJITStubRoutine(virtualThunk, *vm, nullptr, true));
 }
 
 namespace {
@@ -1709,13 +1710,12 @@
 } // annonymous namespace
 
 void linkPolymorphicCall(
-    ExecState* exec, CallLinkInfo& callLinkInfo, CallVariant newVariant,
-    RegisterPreservationMode registers)
+    ExecState* exec, CallLinkInfo& callLinkInfo, CallVariant newVariant)
 {
     // Currently we can't do anything for non-function callees.
     // https://bugs.webkit.org/show_bug.cgi?id=140685
     if (!newVariant || !newVariant.executable()) {
-        linkVirtualFor(exec, callLinkInfo, CodeForCall, registers);
+        linkVirtualFor(exec, callLinkInfo);
         return;
     }
     
@@ -1758,7 +1758,7 @@
             // If we cannot handle a callee, assume that it's better for this whole thing to be a
             // virtual call.
             if (exec->argumentCountIncludingThis() < static_cast<size_t>(codeBlock->numParameters()) || CallLinkInfo::isVarargsCallType(callLinkInfo.callType())) {
-                linkVirtualFor(exec, callLinkInfo, CodeForCall, registers);
+                linkVirtualFor(exec, callLinkInfo);
                 return;
             }
         }
@@ -1773,7 +1773,7 @@
     else
         maxPolymorphicCallVariantListSize = Options::maxPolymorphicCallVariantListSize();
     if (list.size() > maxPolymorphicCallVariantListSize) {
-        linkVirtualFor(exec, callLinkInfo, CodeForCall, registers);
+        linkVirtualFor(exec, callLinkInfo);
         return;
     }
     
@@ -1873,7 +1873,7 @@
         ASSERT(variant.executable()->hasJITCodeForCall());
         MacroAssemblerCodePtr codePtr =
             variant.executable()->generatedJITCodeForCall()->addressForCall(
-                *vm, variant.executable(), ArityCheckNotRequired, registers);
+                *vm, variant.executable(), ArityCheckNotRequired, callLinkInfo.registerPreservationMode());
         
         if (fastCounts) {
             stubJit.add32(
@@ -1903,7 +1903,7 @@
         
     LinkBuffer patchBuffer(*vm, stubJit, callerCodeBlock, JITCompilationCanFail);
     if (patchBuffer.didFailToAllocate()) {
-        linkVirtualFor(exec, callLinkInfo, CodeForCall, registers);
+        linkVirtualFor(exec, callLinkInfo);
         return;
     }
     
@@ -1916,11 +1916,7 @@
         patchBuffer.link(done, callLinkInfo.callReturnLocation().labelAtOffset(0));
     else
         patchBuffer.link(done, callLinkInfo.hotPathOther().labelAtOffset(0));
-    // This can set the callLinkInfo's slow path stub while this does
-    // not technically takes the slow path. But we repatch the slow
-    // path to take this as well.
-    MacroAssemblerCodePtr slowPathCodePtr = linkPolymorphicCallThunk(vm, callLinkInfo, registers).code();
-    patchBuffer.link(slow, CodeLocationLabel(slowPathCodePtr));
+    patchBuffer.link(slow, CodeLocationLabel(vm->getCTIStub(linkPolymorphicCallThunkGenerator).code()));
     
     RefPtr<PolymorphicCallStubRoutine> stubRoutine = adoptRef(new PolymorphicCallStubRoutine(
         FINALIZE_CODE_FOR(
@@ -1936,13 +1932,10 @@
     repatchBuffer.replaceWithJump(
         RepatchBuffer::startOfBranchPtrWithPatchOnRegister(callLinkInfo.hotPathBegin()),
         CodeLocationLabel(stubRoutine->code().code()));
-
-    // The real slow path is unreachable on 64 bit platforms, but not
-    // on 32 bit platform, since the cell check is performed before
-    // taking the polymorphic path on 32 bit platforms
-    repatchBuffer.relink(
-        callLinkInfo.callReturnLocation(),
-        slowPathCodePtr);
+    // The original slow path is unreachable on 64-bits, but still
+    // reachable on 32-bits since a non-cell callee will always
+    // trigger the slow path
+    linkSlowFor(repatchBuffer, vm, callLinkInfo);
     
     // If there had been a previous stub routine, that one will die as soon as the GC runs and sees
     // that it's no longer on stack.

Modified: branches/jsc-tailcall/Source/JavaScriptCore/jit/Repatch.h (187589 => 187590)


--- branches/jsc-tailcall/Source/_javascript_Core/jit/Repatch.h	2015-07-30 17:19:47 UTC (rev 187589)
+++ branches/jsc-tailcall/Source/_javascript_Core/jit/Repatch.h	2015-07-30 18:21:36 UTC (rev 187590)
@@ -40,11 +40,11 @@
 void repatchPutByID(ExecState*, JSValue, Structure*, const Identifier&, const PutPropertySlot&, StructureStubInfo&, PutKind);
 void buildPutByIdList(ExecState*, JSValue, Structure*, const Identifier&, const PutPropertySlot&, StructureStubInfo&, PutKind);
 void repatchIn(ExecState*, JSCell*, const Identifier&, bool wasFound, const PropertySlot&, StructureStubInfo&);
-void linkFor(ExecState*, CallLinkInfo&, CodeBlock*, JSFunction* callee, MacroAssemblerCodePtr, CodeSpecializationKind, RegisterPreservationMode);
-void linkSlowFor(ExecState*, CallLinkInfo&, CodeSpecializationKind, RegisterPreservationMode);
-void unlinkFor(RepatchBuffer&, CallLinkInfo&, CodeSpecializationKind, RegisterPreservationMode);
-void linkVirtualFor(ExecState*, CallLinkInfo&, CodeSpecializationKind, RegisterPreservationMode);
-void linkPolymorphicCall(ExecState*, CallLinkInfo&, CallVariant, RegisterPreservationMode);
+void linkFor(ExecState*, CallLinkInfo&, CodeBlock*, JSFunction* callee, MacroAssemblerCodePtr);
+void linkSlowFor(ExecState*, CallLinkInfo&);
+void unlinkFor(RepatchBuffer&, CallLinkInfo&);
+void linkVirtualFor(ExecState*, CallLinkInfo&);
+void linkPolymorphicCall(ExecState*, CallLinkInfo&, CallVariant);
 void resetGetByID(RepatchBuffer&, StructureStubInfo&);
 void resetPutByID(RepatchBuffer&, StructureStubInfo&);
 void resetIn(RepatchBuffer&, StructureStubInfo&);

Modified: branches/jsc-tailcall/Source/JavaScriptCore/jit/ThunkGenerators.cpp (187589 => 187590)


--- branches/jsc-tailcall/Source/_javascript_Core/jit/ThunkGenerators.cpp	2015-07-30 17:19:47 UTC (rev 187589)
+++ branches/jsc-tailcall/Source/_javascript_Core/jit/ThunkGenerators.cpp	2015-07-30 18:21:36 UTC (rev 187590)
@@ -77,7 +77,7 @@
 }
 
 static void slowPathFor(
-    CCallHelpers& jit, VM* vm, Sprt_JITOperation_ECli slowPathFunction, CallLinkInfo::CallType callType)
+    CCallHelpers& jit, VM* vm, Sprt_JITOperation_ECli slowPathFunction)
 {
     jit.emitFunctionPrologue();
     jit.storePtr(GPRInfo::callFrameRegister, &vm->topCallFrame);
@@ -94,28 +94,23 @@
     // 1) Exception throwing thunk.
     // 2) Host call return value returner thingy.
     // 3) The function to call.
-    // The second return value GPR will hold a zero value in case 1
-    // (we must not trash our own frame, since we won't ever perform
-    // the actual call) and a non-zero value in all other cases (we
-    // must replace our own frame).
+    // The second return value GPR will hold a non-zero value for tail calls.
 
     emitPointerValidation(jit, GPRInfo::returnValueGPR);
     jit.emitFunctionEpilogue();
 
     CCallHelpers::Jump doNotTrash = jit.branchTestPtr(CCallHelpers::Zero, GPRInfo::returnValueGPR2);
 
-    if (CallLinkInfo::isTailCallType(callType)) {
-        jit.preserveReturnAddressAfterCall(GPRInfo::nonPreservedNonReturnGPR);
-        jit.move(GPRInfo::returnValueGPR, GPRInfo::regT4); // FIXME
-        prepareForTailCall(jit);
-        jit.jump(GPRInfo::regT4);
-    }
+    jit.preserveReturnAddressAfterCall(GPRInfo::nonPreservedNonReturnGPR);
+    jit.move(GPRInfo::returnValueGPR, GPRInfo::regT4); // FIXME
+    prepareForTailCall(jit);
+    jit.jump(GPRInfo::regT4);
+
     doNotTrash.link(&jit);
     jit.jump(GPRInfo::returnValueGPR);
 }
 
-static MacroAssemblerCodeRef linkThunkFor(
-    VM* vm, CallLinkInfo& callLinkInfo, CodeSpecializationKind kind, RegisterPreservationMode registers)
+MacroAssemblerCodeRef linkCallThunkGenerator(VM* vm)
 {
     // The return address is on the stack or in the link register. We will hence
     // save the return address to the call frame while we make a C++ function call
@@ -124,132 +119,26 @@
     // been adjusted, and all other registers to be available for use.
     CCallHelpers jit(vm);
     
-    slowPathFor(jit, vm, operationLinkFor(kind, registers), callLinkInfo.callType());
+    slowPathFor(jit, vm, operationLinkCall);
     
     LinkBuffer patchBuffer(*vm, jit, GLOBAL_THUNK_ID);
-    return FINALIZE_CODE(
-        patchBuffer,
-        ("Link %s%s%s slow path thunk",
-            CallLinkInfo::isTailCallType(callLinkInfo.callType()) ? "tail " : "",
-            kind == CodeForCall ? "call" : "construct",
-            registers == MustPreserveRegisters ? " that preserves registers" : ""));
+    return FINALIZE_CODE(patchBuffer, ("Link call slow path thunk"));
 }
 
-MacroAssemblerCodeRef linkCallThunk(
-    VM* vm, CallLinkInfo& callLinkInfo, CodeSpecializationKind kind, RegisterPreservationMode registers)
-{
-    if (CallLinkInfo::isTailCallType(callLinkInfo.callType())) {
-        RefPtr<GCAwareJITStubRoutine> stubRoutine =
-            adoptRef(new GCAwareJITStubRoutine(
-                linkThunkFor(vm, callLinkInfo, kind, registers), *vm));
-        callLinkInfo.setSlowStub(stubRoutine.release());
-        return callLinkInfo.slowStub()->code();
-    }
-
-    callLinkInfo.clearSlowStub();
-
-    ThunkGenerator generator;
-
-    switch (kind) {
-    case CodeForCall:
-        switch (registers) {
-        case RegisterPreservationNotRequired:
-            generator = linkCallThunkGenerator;
-            break;
-        case MustPreserveRegisters:
-            generator = linkCallThatPreservesRegsThunkGenerator;
-            break;
-        }
-        break;
-    case CodeForConstruct:
-        switch (registers) {
-        case RegisterPreservationNotRequired:
-            generator = linkConstructThunkGenerator;
-            break;
-        case MustPreserveRegisters:
-            generator = linkConstructThatPreservesRegsThunkGenerator;
-            break;
-        }
-        break;
-    }
-
-    return vm->getCTIStub(generator);
-}
-
-MacroAssemblerCodeRef linkCallThunkGenerator(VM* vm)
-{
-    return linkThunkFor(vm, CallLinkInfo::dummy(), CodeForCall, RegisterPreservationNotRequired);
-}
-
-MacroAssemblerCodeRef linkConstructThunkGenerator(VM* vm)
-{
-    return linkThunkFor(vm, CallLinkInfo::dummy(), CodeForConstruct, RegisterPreservationNotRequired);
-}
-
-MacroAssemblerCodeRef linkCallThatPreservesRegsThunkGenerator(VM* vm)
-{
-    return linkThunkFor(vm, CallLinkInfo::dummy(), CodeForCall, MustPreserveRegisters);
-}
-
-MacroAssemblerCodeRef linkConstructThatPreservesRegsThunkGenerator(VM* vm)
-{
-    return linkThunkFor(vm, CallLinkInfo::dummy(), CodeForConstruct, MustPreserveRegisters);
-}
-
 // For closure optimizations, we only include calls, since if you're using closures for
 // object construction then you're going to lose big time anyway.
-static MacroAssemblerCodeRef linkPolymorphicCallThunkFor(VM* vm, CallLinkInfo& callLinkInfo, RegisterPreservationMode registers)
+MacroAssemblerCodeRef linkPolymorphicCallThunkGenerator(VM* vm)
 {
     CCallHelpers jit(vm);
     
-    slowPathFor(jit, vm, operationLinkPolymorphicCallFor(registers), callLinkInfo.callType());
+    slowPathFor(jit, vm, operationLinkPolymorphicCall);
     
     LinkBuffer patchBuffer(*vm, jit, GLOBAL_THUNK_ID);
-    return FINALIZE_CODE(patchBuffer,
-        ("Link polymorphic %s%s slow path thunk",
-            CallLinkInfo::isTailCallType(callLinkInfo.callType()) ? "tail call" : "call",
-            registers == MustPreserveRegisters ? " that preserves registers" : ""));
+    return FINALIZE_CODE(patchBuffer, ("Link polymorphic call slow path thunk"));
 }
 
-MacroAssemblerCodeRef linkPolymorphicCallThunk(VM* vm, CallLinkInfo& callLinkInfo, RegisterPreservationMode registers)
+MacroAssemblerCodeRef virtualThunkFor(VM* vm, CallLinkInfo& callLinkInfo)
 {
-    if (CallLinkInfo::isTailCallType(callLinkInfo.callType())) {
-        RefPtr<GCAwareJITStubRoutine> stubRoutine =
-            adoptRef(new GCAwareJITStubRoutine(
-                linkPolymorphicCallThunkFor(vm, callLinkInfo, registers), *vm));
-        callLinkInfo.setSlowStub(stubRoutine.release());
-        return callLinkInfo.slowStub()->code();
-    }
-
-    callLinkInfo.clearSlowStub();
-
-    ThunkGenerator generator;
-
-    switch (registers) {
-    case RegisterPreservationNotRequired:
-        generator = linkPolymorphicCallThunkGenerator;
-        break;
-    case MustPreserveRegisters:
-        generator = linkPolymorphicCallThatPreservesRegsThunkGenerator;
-        break;
-    }
-
-    return vm->getCTIStub(generator);
-}
-
-MacroAssemblerCodeRef linkPolymorphicCallThunkGenerator(VM* vm)
-{
-    return linkPolymorphicCallThunkFor(vm, CallLinkInfo::dummy(), RegisterPreservationNotRequired);
-}
-
-MacroAssemblerCodeRef linkPolymorphicCallThatPreservesRegsThunkGenerator(VM* vm)
-{
-    return linkPolymorphicCallThunkFor(vm, CallLinkInfo::dummy(), MustPreserveRegisters);
-}
-
-static MacroAssemblerCodeRef virtualThunkFor(
-    VM* vm, CallLinkInfo& callLinkInfo, CodeSpecializationKind kind, RegisterPreservationMode registers)
-{
     // The callee is in regT0 (for JSVALUE32_64, the tag is in regT1).
     // The return address is on the stack, or in the link register. We will hence
     // jump to the callee, or save the return address to the call frame while we
@@ -294,7 +183,8 @@
         GPRInfo::regT4);
     jit.loadPtr(
         CCallHelpers::Address(
-            GPRInfo::regT4, ExecutableBase::offsetOfJITCodeWithArityCheckFor(kind, registers)),
+            GPRInfo::regT4, ExecutableBase::offsetOfJITCodeWithArityCheckFor(
+                callLinkInfo.specializationKind(), callLinkInfo.registerPreservationMode())),
         GPRInfo::regT4);
     slowCase.append(jit.branchTestPtr(CCallHelpers::Zero, GPRInfo::regT4));
     
@@ -312,79 +202,17 @@
     slowCase.link(&jit);
     
     // Here we don't know anything, so revert to the full slow path.
-
-    slowPathFor(jit, vm, operationVirtualFor(kind, registers), callLinkInfo.callType());
+    slowPathFor(jit, vm, operationVirtualCall);
     
     LinkBuffer patchBuffer(*vm, jit, GLOBAL_THUNK_ID);
     return FINALIZE_CODE(
         patchBuffer,
-        ("Virtual %s%s%s slow path thunk",
-            CallLinkInfo::isTailCallType(callLinkInfo.callType()) ? "tail " : "",
-            kind == CodeForCall ? "call" : "construct",
-            registers == MustPreserveRegisters ? " that preserves registers" : ""));
+        ("Virtual %s%s slow path thunk at CodePtr(%p)",
+        callLinkInfo.specializationKind() == CodeForCall ? "call" : "construct",
+        callLinkInfo.registerPreservationMode() == MustPreserveRegisters ? " that preserves registers" : "",
+        callLinkInfo.callReturnLocation().dataLocation()));
 }
 
-MacroAssemblerCodeRef virtualThunk(
-    VM* vm, CallLinkInfo& callLinkInfo, CodeSpecializationKind kind, RegisterPreservationMode registers)
-{
-    if (CallLinkInfo::isTailCallType(callLinkInfo.callType())) {
-        RefPtr<GCAwareJITStubRoutine> stubRoutine =
-            adoptRef(new GCAwareJITStubRoutine(
-                virtualThunkFor(vm, callLinkInfo, kind, registers), *vm));
-        callLinkInfo.setSlowStub(stubRoutine.release());
-        return callLinkInfo.slowStub()->code();
-    }
-
-    callLinkInfo.clearSlowStub();
-
-    ThunkGenerator generator;
-
-    switch (kind) {
-    case CodeForCall:
-        switch (registers) {
-        case RegisterPreservationNotRequired:
-            generator = virtualCallThunkGenerator;
-            break;
-        case MustPreserveRegisters:
-            generator = virtualCallThatPreservesRegsThunkGenerator;
-            break;
-        }
-        break;
-    case CodeForConstruct:
-        switch (registers) {
-        case RegisterPreservationNotRequired:
-            generator = virtualConstructThunkGenerator;
-            break;
-        case MustPreserveRegisters:
-            generator = virtualConstructThatPreservesRegsThunkGenerator;
-            break;
-        }
-        break;
-    }
-
-    return vm->getCTIStub(generator);
-}
-
-MacroAssemblerCodeRef virtualCallThunkGenerator(VM* vm)
-{
-    return virtualThunkFor(vm, CallLinkInfo::dummy(), CodeForCall, RegisterPreservationNotRequired);
-}
-
-MacroAssemblerCodeRef virtualConstructThunkGenerator(VM* vm)
-{
-    return virtualThunkFor(vm, CallLinkInfo::dummy(), CodeForConstruct, RegisterPreservationNotRequired);
-}
-
-MacroAssemblerCodeRef virtualCallThatPreservesRegsThunkGenerator(VM* vm)
-{
-    return virtualThunkFor(vm, CallLinkInfo::dummy(), CodeForCall, MustPreserveRegisters);
-}
-
-MacroAssemblerCodeRef virtualConstructThatPreservesRegsThunkGenerator(VM* vm)
-{
-    return virtualThunkFor(vm, CallLinkInfo::dummy(), CodeForConstruct, MustPreserveRegisters);
-}
-
 enum ThunkEntryType { EnterViaCall, EnterViaJump };
 
 static MacroAssemblerCodeRef nativeForGenerator(VM* vm, CodeSpecializationKind kind, ThunkEntryType entryType = EnterViaCall)

Modified: branches/jsc-tailcall/Source/JavaScriptCore/jit/ThunkGenerators.h (187589 => 187590)


--- branches/jsc-tailcall/Source/_javascript_Core/jit/ThunkGenerators.h	2015-07-30 17:19:47 UTC (rev 187589)
+++ branches/jsc-tailcall/Source/_javascript_Core/jit/ThunkGenerators.h	2015-07-30 18:21:36 UTC (rev 187590)
@@ -38,21 +38,10 @@
 
 MacroAssemblerCodeRef throwExceptionFromCallSlowPathGenerator(VM*);
 
-MacroAssemblerCodeRef linkCallThunk(VM*, CallLinkInfo&, CodeSpecializationKind, RegisterPreservationMode);
 MacroAssemblerCodeRef linkCallThunkGenerator(VM*);
-MacroAssemblerCodeRef linkConstructThunkGenerator(VM*);
-MacroAssemblerCodeRef linkCallThatPreservesRegsThunkGenerator(VM*);
-MacroAssemblerCodeRef linkConstructThatPreservesRegsThunkGenerator(VM*);
-
-MacroAssemblerCodeRef linkPolymorphicCallThunk(VM*, CallLinkInfo&, RegisterPreservationMode);
 MacroAssemblerCodeRef linkPolymorphicCallThunkGenerator(VM*);
-MacroAssemblerCodeRef linkPolymorphicCallThatPreservesRegsThunkGenerator(VM*);
 
-MacroAssemblerCodeRef virtualThunk(VM*, CallLinkInfo&, CodeSpecializationKind, RegisterPreservationMode);
-MacroAssemblerCodeRef virtualCallThunkGenerator(VM*);
-MacroAssemblerCodeRef virtualConstructThunkGenerator(VM*);
-MacroAssemblerCodeRef virtualCallThatPreservesRegsThunkGenerator(VM*);
-MacroAssemblerCodeRef virtualConstructThatPreservesRegsThunkGenerator(VM*);
+MacroAssemblerCodeRef virtualThunkFor(VM*, CallLinkInfo&);
 
 MacroAssemblerCodeRef nativeCallGenerator(VM*);
 MacroAssemblerCodeRef nativeConstructGenerator(VM*);