It seems I've hit the same case without Maglev:
==== C stack trace ===============================
std::_Vector_iterator<std::_Vector_val<std::_Simple_types<std::pair<int,v8::internal::Tagged<v8::internal::HeapObject> > > > >::operator-> [0x00007FFB8C978EB1+369]
v8::MemorySpan<v8::internal::Handle<v8::internal::Map> >::to_address<std::_Vector_iterator<std::_Vector_val<std::_Simple_types<v8::internal::Handle<v8::internal::Map> > > > >,void> [0x00007FFB8D13EC83+19]
v8::MemorySpan<v8::internal::Handle<v8::internal::Map> >::MemorySpan<v8::internal::Handle<v8::internal::Map> ><std::_Vector_iterator<std::_Vector_val<std::_Simple_types<v8::internal::Handle<v8::internal::Map> > > > >,1> [0x00007FFB8D13E704+52]
v8::internal::compiler::JSHeapBroker::ProcessFeedbackMapsForElementAccess [0x00007FFB8E57704A+714]
v8::internal::compiler::JSHeapBroker::ReadFeedbackForPropertyAccess [0x00007FFB8E5788E1+1841]
v8::internal::compiler::JSHeapBroker::GetFeedbackForPropertyAccess [0x00007FFB8E573848+88]
v8::internal::compiler::JSNativeContextSpecialization::ReducePropertyAccess [0x00007FFB8EB83319+681]
v8::internal::compiler::JSNativeContextSpecialization::ReduceJSSetKeyedProperty [0x00007FFB8EB7EF21+321]
v8::internal::compiler::JSNativeContextSpecialization::Reduce [0x00007FFB8EB73019+649]
v8::internal::compiler::Reducer::Reduce [0x00007FFB8E93D1EC+60]
v8::internal::compiler::GraphReducer::Reduce [0x00007FFB8E93CEBE+190]
v8::internal::compiler::GraphReducer::ReduceTop [0x00007FFB8E93D708+600]
v8::internal::compiler::GraphReducer::ReduceNode [0x00007FFB8E93D32E+174]
v8::internal::compiler::GraphReducer::ReduceGraph [0x00007FFB8E93D278+40]
v8::internal::compiler::InliningPhase::Run [0x00007FFB8E4E7CBE+1950]
v8::internal::compiler::PipelineImpl::Run<v8::internal::compiler::InliningPhase> [0x00007FFB8E49B71B+123]
v8::internal::compiler::PipelineImpl::CreateGraph [0x00007FFB8E4D03C8+168]
v8::internal::compiler::PipelineCompilationJob::ExecuteJobImpl [0x00007FFB8E4D205C+428]
v8::internal::OptimizedCompilationJob::ExecuteJob [0x00007FFB8CB5E11B+299]
v8::internal::OptimizingCompileDispatcher::CompileNext [0x00007FFB8D0390A3+67]
v8::internal::OptimizingCompileDispatcher::CompileTask::Run [0x00007FFB8D03A2F9+633]
v8::platform::DefaultJobWorker::Run [0x00007FFB8CD835F9+185]
v8::platform::DefaultWorkerThreadsTaskRunner::WorkerThread::Run [0x00007FFB8CD83E72+194]
v8::base::Thread::NotifyStartedAndRun [0x00007FFB8C6D8904+52]
v8::base::OS::StrNCpy [0x00007FFB8C6D964D+205]
thread_start<unsigned int (__cdecl*)(void *),1> [0x00007FFB8F67B6B5+165] (minkernel\crts\ucrt\src\appcrt\startup\thread.cpp:97)
BaseThreadInitThunk [0x00007FFCBDDA7374+20]
RtlUserThreadStart [0x00007FFCBFDBCC91+33]
I suspect this thread is what triggered it:
0 # NtWaitForAlertByThreadId in ntdll+0xa0f24
1 # RtlAcquireSRWLockExclusive in ntdll+0x29205
2 # v8::base::SharedMutex::LockExclusive in app+0x59258f
3 # `v8::internal::ParkedSharedMutexGuardIf<1,0>::ParkedSharedMutexGuardIf<1,0>'::`25'::<lambda_2>::operator() in app+0xea0a99
4 # v8::internal::LocalHeap::ParkAndExecuteCallback<`v8::internal::ParkedSharedMutexGuardIf<1,0>::ParkedSharedMutexGuardIf<1,0>'::`25'::<lambda_2> > in app+0xe9f7c8
5 # `v8::internal::LocalHeap::ExecuteWhileParked<`v8::internal::ParkedSharedMutexGuardIf<1,0>::ParkedSharedMutexGuardIf<1,0>'::`25'::<lambda_2> >'::`2'::<lambda_1>::operator() in app+0xea0749
6 # heap::base::Stack::SetMarkerAndCallbackImpl<`v8::internal::LocalHeap::ExecuteWhileParked<`v8::internal::ParkedSharedMutexGuardIf<1,0>::ParkedSharedMutexGuardIf<1,0>'::`25'::<lambda_2> >'::`2'::<lambda_1> > in app+0xe9f99b
7 # PushAllRegistersAndIterateStack in app+0xf65abd
8 # heap::base::Stack::TrampolineCallbackHelper in app+0x7f3737
9 # heap::base::Stack::SetMarkerAndCallback<`v8::internal::LocalHeap::ExecuteWhileParked<`v8::internal::ParkedSharedMutexGuardIf<1,0>::ParkedSharedMutexGuardIf<1,0>'::`25'::<lambda_2> >'::`2'::<lambda_1> > in app+0xe9f8d4
10 # v8::internal::LocalHeap::ExecuteWithStackMarker<`v8::internal::LocalHeap::ExecuteWhileParked<`v8::internal::ParkedSharedMutexGuardIf<1,0>::ParkedSharedMutexGuardIf<1,0>'::`25'::<lambda_2> >'::`2'::<lambda_1> > in app+0xe9edfe
11 # v8::internal::LocalHeap::ExecuteWhileParked<`v8::internal::ParkedSharedMutexGuardIf<1,0>::ParkedSharedMutexGuardIf<1,0>'::`25'::<lambda_2> > in app+0xe9ec55
12 # v8::internal::ParkedSharedMutexGuardIf<0,0>::ParkedSharedMutexGuardIf<0,0> in app+0xea01dd
13 # v8::internal::ParkedSharedMutexGuardIf<0,0>::ParkedSharedMutexGuardIf<0,0> in app+0xea022a
14 # v8::internal::MapUpdater::ReconfigureToDataField in app+0xeaaa4d
15 # v8::internal::Map::Update in app+0x80f4c7
16 # v8::internal::Map::TransitionToDataProperty in app+0x80cf20
17 # v8::internal::LookupIterator::PrepareTransitionToDataProperty in app+0x9d3cc5
18 # v8::internal::Object::TransitionAndWriteDataProperty in app+0x642167
19 # v8::internal::Object::AddDataProperty in app+0x5fc92e
20 # v8::internal::JSObject::DefineOwnPropertyIgnoreAttributes in app+0x754a99
21 # v8::internal::JSObject::DefineOwnPropertyIgnoreAttributes in app+0x754b5e
22 # v8::internal::JSObject::SetOwnPropertyIgnoreAttributes in app+0x778e02
23 # v8::internal::CastTraits<v8::internal::ObjectBoilerplateDescription>::AllowFrom in app+0x1fd8252
24 # v8::internal::CastTraits<v8::internal::ObjectBoilerplateDescription>::AllowFrom in app+0x1fd6f4a
25 # v8::internal::Cast<v8::internal::ObjectBoilerplateDescription,v8::internal::Object> in app+0x1fd6c66
26 # v8::internal::Cast<v8::internal::ObjectBoilerplateDescription,v8::internal::Object> in app+0x1fd65d7
27 # v8::internal::AllocationSiteUsageContext::ShouldCreateMemento in app+0x1fe14a8
28 # v8::internal::Runtime_CreateObjectLiteral in app+0x1fd93b4
On Wednesday, 25 June 2025 at 17:16:00 UTC+1 Audrius Butkevicius wrote:
> I've actually posted stack traces of other threads on the user list (
> https://groups.google.com/g/v8-users/c/iaD_4IGqIyI), which hints that this
> is a race condition.
> The code at head hasn't changed around this, so it might still be a bug
> now; but confirmed: the issue goes away by switching off Maglev.
>
> On Wednesday, 25 June 2025 at 13:55:06 UTC+1 [email protected] wrote:
>
>> On Wed, Jun 25, 2025 at 2:11 PM Audrius Butkevicius
>> <[email protected]> wrote:
>> >
>> > Hi
>> >
>> > I'm running my application in debug mode, and I noticed that it
>> sometimes fails with this assert:
>> >
>> > C:\Program Files\Microsoft Visual Studio\2022\Community\VC\Tools\MSVC\14.43.34808\include\vector(280) : Assertion failed: can't dereference out of range vector iterator
>> >
>> > ...
>> >
>> > 3 # `DllMain'::`5'::<lambda_1>::operator() at dllmain.cpp:598 (app+0x371a7cd)
>> > 4 # `DllMain'::`5'::<lambda_1>::<lambda_invoker_cdecl> at dllmain.cpp:614 (app+0x371a668)
>> > 5 # _VCrtDbgReportA at dbgrptt.cpp:391 (app+0x361df8f)
>> > 6 # _CrtDbgReport at dbgrpt.cpp:263 (app+0x35ee779)
>> > 7 # std::_Vector_iterator<std::_Vector_val<std::_Simple_types<std::pair<int,v8::internal::Tagged<v8::internal::HeapObject> > > > >::operator-> in app+0x92054c
>> > 8 # v8::MemorySpan<v8::internal::Handle<v8::internal::Map> >::to_address<std::_Vector_iterator<std::_Vector_val<std::_Simple_types<v8::internal::Handle<v8::internal::Map> > > > >,void> in app+0x10e5643
>> > 9 # v8::MemorySpan<v8::internal::Handle<v8::internal::Map> >::MemorySpan<v8::internal::Handle<v8::internal::Map> ><std::_Vector_iterator<std::_Vector_val<std::_Simple_types<v8::internal::Handle<v8::internal::Map> > > > >,1> in app+0x10e50c4
>> > 10 # v8::internal::compiler::JSHeapBroker::ProcessFeedbackMapsForElementAccess in app+0x251e77a
>> > 11 # v8::internal::compiler::JSHeapBroker::ReadFeedbackForPropertyAccess in app+0x2520011
>> > 12 # v8::internal::compiler::JSHeapBroker::GetFeedbackForPropertyAccess in app+0x251af78
>> > 13 # v8::internal::maglev::MaglevGraphBuilder::VisitStaInArrayLiteral in app+0x2862834
>> > 14 # v8::internal::maglev::MaglevGraphBuilder::VisitSingleBytecode in app+0x2343e8f
>> > 15 # v8::internal::maglev::MaglevGraphBuilder::BuildBody in app+0x230b567
>> > 16 # v8::internal::maglev::MaglevGraphBuilder::Build in app+0x230b385
>> > 17 # v8::internal::maglev::MaglevCompiler::Compile in app+0x230bd91
>> > 18 # v8::internal::maglev::MaglevCompilationJob::ExecuteJobImpl in app+0xfe89b8
>> > 19 # v8::internal::OptimizedCompilationJob::ExecuteJob in app+0xb0583b
>> > 20 # v8::internal::maglev::MaglevConcurrentDispatcher::JobTask::Run in app+0xfe9c23
>> > 21 # v8::platform::DefaultJobWorker::Run in app+0xd2a949
>> > 22 # v8::platform::DefaultWorkerThreadsTaskRunner::WorkerThread::Run in app+0xd2b1c2
>> > 23 # v8::base::Thread::NotifyStartedAndRun in app+0x681104
>> > 24 # v8::base::OS::StrNCpy in app+0x681e4d
>> > 25 # thread_start<unsigned int (__cdecl*)(void *),1> at thread.cpp:97 (app+0x3622e45)
>> > 26 # BaseThreadInitThunk in KERNEL32+0x17374
>> > 27 # RtlUserThreadStart in ntdll+0x4cc91
>> >
>> > It's possible that I'm doing something wrong, but it's not very clear
>> what.
>> >
>> > Sadly, this is version 12.9.202, as I still need a static build that
>> uses MSVC.
>> >
>> > Any suggestions would be welcome, as to what I'm doing wrong.
>> >
>> > Thanks.
>>
>> Maybe try building with v8_enable_maglev=false. In node, we had maglev
>> disabled until at least 12.8 because of various crashes.
>>
>
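In case it helps the next reader: the workaround suggested above corresponds to a single GN build argument. A minimal args.gn sketch; only the v8_enable_maglev flag comes from this thread, and the rest of your build arguments stay whatever they already are:

```gn
# args.gn (sketch): disable Maglev, as suggested in the reply above.
# Only this flag is taken from the thread; keep your existing arguments.
v8_enable_maglev = false
```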
--
v8-dev mailing list
[email protected]
http://groups.google.com/group/v8-dev