[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-05-09 Thread dr0ptp4kt
dr0ptp4kt added a comment.


On the gaming-class 2018 desktop, although the `bufferCapacity` value of 
1,000,000 sped things up as described earlier on this ticket, the CPU governor 
change did not seem to have any additional effect (the run took 2.47 days, 
compared with the previous record of 2.44). It's possible that the existing 
BIOS configuration of the gaming-class 2018 desktop (which was already set to a 
high performance mode) was already squeezing out optimal performance, for 
example, or that something else about the processor architecture's interaction 
with the rest of the hardware and operating system is simply different from the 
data center server's. In any case, it's nice to see that the data center server 
is faster!
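  
  (For reference, a governor change of this kind is typically checked and 
applied along these lines on Linux; this is a sketch assuming the `cpupower` 
tool is installed, not necessarily the exact commands used on these machines.)
  
# show the current governor on each core
cat /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor
# switch every core to the "performance" governor
sudo cpupower frequency-set -g performance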
  
  One theory is that the gaming-class desktop's 64GB of total RAM plays some 
role; the hardware provider has indicated that although more memory can be 
installed, the machine will only run with 64GB RAM and can't jump to 128GB RAM. 
Another is that the default memory swappiness (60) on the gaming-class desktop 
could play a role. However, I find this less likely: memory spikes haven't 
seemed to be a problem on this machine while loading data, and the drive is an 
NVMe, so paging is less likely to manifest problematically anyway. Maybe 
something to check another day, as we generally use a swappiness of 0 in the 
data center, as on the WDQS hosts.
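  
  (If we do check it, the usual sketch would be something like the following; 
the persistence step is illustrative.)
  
# read the current swappiness (60 is the common default)
sysctl vm.swappiness
# set it to 0 for the running system, as on the WDQS hosts
sudo sysctl -w vm.swappiness=0
# persist across reboots
echo 'vm.swappiness=0' | sudo tee /etc/sysctl.d/99-swappiness.conf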

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, S8321414, 
Astuthiodit_1, karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, 
Dringsim, Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, 
LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-05-09 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  And for the second run in T362920: Benchmark Blazegraph import with increased 
buffer capacity (and other factors), we saw that this took about 3089 minutes, 
or about 2.15 days, for the scholarly article entity graph with the CPU 
governor change (described in T336443#9726600) plus the bufferCapacity at 
1,000,000 on wdqs2023.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, S8321414, 
Astuthiodit_1, karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, 
Dringsim, Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, 
LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-05-06 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  In T362920: Benchmark Blazegraph import with increased buffer capacity (and 
other factors) we saw that this took about 3702 minutes, or about 2.57 days, 
for the scholarly article entity graph with the CPU governor change (described 
in T336443#9726600) alone on wdqs2023.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, S8321414, 
Astuthiodit_1, karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, 
Dringsim, Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, 
LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-04-10 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  Good news. With the N-triples style scholarly entity graph files, with a 
buffer capacity of 1,000,000, a write retention queue capacity of 4000, and a 
heap size of 31g, on the gaming-class desktop, it took about 2.40 days. Recall 
that with a buffer capacity of 100,000 it took about 3.25 days on this desktop 
(and again, recall that it was 5.875 days on wdqs1024). So there was about a 
35% speed increase (3.25 / 2.40 ≈ 1.35, minus 1) with the higher buffer 
capacity here on this gaming-class desktop.
  
  It appears, then, that the combination of faster CPU, NVMe, and a higher 
buffer capacity is somewhere around 144% faster (5.875 / 2.40 ≈ 2.44; 2.44 
minus 1 = 1.44) than what we observed on a target data center machine.
  
  It will likely be somewhat less dramatic on 10B triples, if the previous 
munged file runs are any clue. I'm going to think on how to check this notion - 
it could be done by using the scholarly graph plus a portion of the main graph, 
which would probably be close enough for our purposes.
  
  A high speed NVMe is being acquired so that we can verify on wdqs2024 the 
level of speedup achievable on a server similar to the graph split test 
servers. wdqs2024 currently has a hardware profile similar to wdqs1024.
  
  Some stuff from the terminal from the import on the gaming-class desktop:
  
ubuntu22:~$ head -9 ~/rdf/dist/target/service-0.3.138-SNAPSHOT/loadData.log
Sun Apr  7 12:03:19 PM CDT 2024
Processing part-0-46f26ac6-0b21-4832-be79-d7c8709f33fb-c000.ttl.gz
totalElapsed=64069ms, elapsed=64024ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=71897ms, commitTime=1712509470732, mutationCount=7349689
Sun Apr  7 12:04:31 PM CDT 2024
Processing part-1-46f26ac6-0b21-4832-be79-d7c8709f33fb-c000.ttl.gz

# screen output at the end:

Processing part-01023-46f26ac6-0b21-4832-be79-d7c8709f33fb-c000.ttl.gz
totalElapsed=51703ms, elapsed=51703ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=181013ms, commitTime=1712716306763, mutationCount=7946575
Tue Apr  9 09:31:50 PM CDT 2024
File /mnt/firehose/split_0/nt_wd_schol/part-01024-46f26ac6-0b21-4832-be79-d7c8709f33fb-c000.ttl.gz not found, terminating

real    3447m18.542s

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, S8321414, 
Astuthiodit_1, karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, 
Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, 
LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-04-08 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  Update: With the buffer capacity at 1,000,000, file number 550 of the 
scholarly graph was imported as of `Mon Apr  8 03:22:08 PM CDT 2024`. So, under 
28 hours so far (with the buffer capacity at 100,000 it was more than 36 hours 
to this point).
  
Processing part-00550-46f26ac6-0b21-4832-be79-d7c8709f33fb-c000.ttl.gz
totalElapsed=51018ms, elapsed=51018ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=245278ms, commitTime=1712607725882, mutationCount=7414497
Mon Apr  8 03:22:08 PM CDT 2024
  
  Will update when it completes.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, S8321414, 
Astuthiodit_1, karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, 
Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, 
LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-04-07 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  With the bufferCapacity at 1,000,000, kicked it off again with the scholarly 
article entity graph files (the first `loadData.sh` invocation loads file 0 
alone via `-s 0 -e 0`; the second then picks up the remaining files):
  
ubuntu22:~/rdf/dist/target/service-0.3.138-SNAPSHOT$ date | tee loadData.log; time ./loadData.sh -n wdq -d /mnt/firehose/split_0/nt_wd_schol -s 0 -e 0 2>&1 | tee -a loadData.log; time ./loadData.sh -n wdq -d /mnt/firehose/split_0/nt_wd_schol 2>&1 | tee -a loadData.log
Sun Apr  7 12:03:19 PM CDT 2024
Processing part-0-46f26ac6-0b21-4832-be79-d7c8709f33fb-c000.ttl.gz

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, S8321414, 
Astuthiodit_1, karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, 
Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, 
LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-04-07 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  Update. On the gaming-class machine it took about 3.25 days to import the 
scholarly article entity graph using a buffer capacity of 100,000 (compare this 
with 5.875 days on wdqs1024). This resulted in 7_643_858_078 triples, as 
expected. Next up will be a buffer capacity of 1,000,000, to see if there is 
any obvious difference in import time.
  
Sun Apr  7 03:34:59 AM CDT 2024
Processing part-01023-46f26ac6-0b21-4832-be79-d7c8709f33fb-c000.ttl.gz
totalElapsed=181901ms, elapsed=181901ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=226511ms, commitTime=1712479122009, mutationCount=7946575
Sun Apr  7 03:38:46 AM CDT 2024
File /mnt/firehose/split_0/nt_wd_schol/part-01024-46f26ac6-0b21-4832-be79-d7c8709f33fb-c000.ttl.gz not found, terminating

real    4684m49.905s

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, S8321414, 
Astuthiodit_1, karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, 
Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, 
LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-04-05 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  Just updating on how far along this run is: file 550 of the scholarly article 
entity side of the graph is being processed. There are files 0 through 1023 for 
this side of the graph. Note that I remembered to `tee` the output this time 
around, so hopefully there's more information available for reviewing output, 
stack traces (although hopefully there are none), and so on, should it be 
needed.
  
Processing part-00549-46f26ac6-0b21-4832-be79-d7c8709f33fb-c000.ttl.gz
totalElapsed=299675ms, elapsed=299675ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=392531ms, commitTime=1712329890306, mutationCount=7032172
Fri Apr  5 10:11:32 AM CDT 2024
Processing part-00550-46f26ac6-0b21-4832-be79-d7c8709f33fb-c000.ttl.gz
  
  Sidebar: the "non"-scholarly article entity graph also has files 0-1023 and 
is similarly sized in terms of triples, but the way its nodes are 
interconnected naturally differs somewhat, given the types of entities 
involved, the kinds of data attached to them, and so on.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, S8321414, 
Astuthiodit_1, karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, 
Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, 
LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-04-04 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  Following roughly the procedure in P54284 to rename the Spark-produced graph 
files (and updating `loadData.sh` with 
`FORMAT=part-%05d-46f26ac6-0b21-4832-be79-d7c8709f33fb-c000.ttl.gz`, still with 
a `date` call after each `curl` in it), I kicked off an import of the scholarly 
article entity graph like so, to see how it goes with a buffer capacity of 
100,000:
  
ubuntu22:~/rdf/dist/target/service-0.3.138-SNAPSHOT$ date; time ./loadData.sh -n wdq -d /mnt/firehose/split_0/nt_wd_schol -s 0 -e 0 2>&1 | tee loadData.log; time ./loadData.sh -n wdq -d /mnt/firehose/split_0/nt_wd_schol 2>&1 | tee -a loadData.log
Wed Apr  3 09:32:54 PM CDT 2024
Processing part-0-46f26ac6-0b21-4832-be79-d7c8709f33fb-c000.ttl.gz
totalElapsed=55629ms, elapsed=55584ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=61598ms, commitTime=1712198035155, mutationCount=7349689
Wed Apr  3 09:33:56 PM CDT 2024

real    1m1.702s
user    0m0.004s
sys     0m0.006s
Processing part-1-46f26ac6-0b21-4832-be79-d7c8709f33fb-c000.ttl.gz
totalElapsed=61251ms, elapsed=61251ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=71925ms, commitTime=1712198106800, mutationCount=7774048
Wed Apr  3 09:35:08 PM CDT 2024
Processing part-2-46f26ac6-0b21-4832-be79-d7c8709f33fb-c000.ttl.gz
  
  This is with the following values in `RWStore.properties`:
  
com.bigdata.btree.writeRetentionQueue.capacity=4000
com.bigdata.rdf.sail.bufferCapacity=100000
  
  and the following variable in `loadData.sh`:
  
HEAP_SIZE=${HEAP_SIZE:-"31g"}
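  
  Because of the `${HEAP_SIZE:-...}` default idiom, the heap can also be 
overridden per run without editing the script, for example:
  
HEAP_SIZE=31g ./loadData.sh -n wdq -d /mnt/firehose/split_0/nt_wd_schol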

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, S8321414, 
Astuthiodit_1, karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, 
Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, 
LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-04-03 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  This morning of April 3, around 6:25 AM, I had SSH'd in to check progress. It 
was working, but going slowly, similar to the day before. It was on a file 
number in the 1200s, but I didn't write down the number or copy terminal 
output; I do remember one of the files taking around 796 seconds at that time. 
Looking at the previous comment, you'll see those were going slowly too; not 
surprising, as we know imports of these munged files slow down as more data is 
imported.
  
  I checked several hours later, in the middle of a meeting, and it had gone 
into a bad spiral.
  
  I've been able to use `screen` backscrolling to recover much of the stack 
trace, but could not scroll back far enough to tell for sure which file was the 
last to import successfully without a stack trace. What we can say is that 
//probably// the last somewhat stable commit was on file 1302 at about 7:24 AM. 
Probably file 1303, and definitely 1304 and 1305, have been failing badly and 
taking a really long time to do so; this would probably continue indefinitely 
without killing the process. Just a slice of the paste here to give an idea of 
things (notice `lastCommitTime` and `commitCounter` in the stack trace).
  
Wed Apr  3 02:05:26 PM CDT 2024
Processing wikidump-01305.ttl.gz
SPARQL-UPDATE: updateStr=LOAD 
java.util.concurrent.ExecutionException: java.util.concurrent.ExecutionException: org.openrdf.query.UpdateExecutionException: java.lang.RuntimeException: Problem with entry at -83289912769511002: lastRootBlock=rootBlock{ rootBlock=0, challisField=1302, version=3, nextOffset=47806576684846562, localTime=1712147044389 [Wednesday, April 3, 2024 7:24:04 AM CDT], firstCommitTime=1711737574896 [Friday, March 29, 2024 1:39:34 PM CDT], lastCommitTime=1712147041973 [Wednesday, April 3, 2024 7:24:01 AM CDT], commitCounter=1302, commitRecordAddr={off=NATIVE:-140859033,len=422}, commitRecordIndexAddr={off=NATIVE:-93467508,len=220}, blockSequence=34555, quorumToken=-1, metaBitsAddr=26754033649714513, metaStartAddr=11989126, storeType=RW, uuid=f993598d-497c-46a7-8434-d25c8859a0b8, offsetBits=42, checksum=1600335692, createTime=1711737574192 [Friday, March 29, 2024 1:39:34 PM CDT], closeTime=0}
  
  Unfortunately `jstack` seems to hiccup.
  
ubuntu22:~$ sudo jstack -m 987870
[sudo] password: 
Attaching to process ID 987870, please wait...
Debugger attached successfully.
Server compiler detected.
JVM version is 25.402-b06
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sun.tools.jstack.JStack.runJStackTool(JStack.java:140)
at sun.tools.jstack.JStack.main(JStack.java:106)
Caused by: java.lang.RuntimeException: Unable to deduce type of thread from address 0x7fecb400b800 (expected type JavaThread, CompilerThread, ServiceThread, JvmtiAgentThread, or SurrogateLockerThread)
at sun.jvm.hotspot.runtime.Threads.createJavaThreadWrapper(Threads.java:169)
at sun.jvm.hotspot.runtime.Threads.first(Threads.java:153)
at sun.jvm.hotspot.tools.PStack.initJFrameCache(PStack.java:200)
at sun.jvm.hotspot.tools.PStack.run(PStack.java:71)
at sun.jvm.hotspot.tools.PStack.run(PStack.java:58)
at sun.jvm.hotspot.tools.PStack.run(PStack.java:53)
at sun.jvm.hotspot.tools.JStack.run(JStack.java:66)
at sun.jvm.hotspot.tools.Tool.startInternal(Tool.java:260)
at sun.jvm.hotspot.tools.Tool.start(Tool.java:223)
at sun.jvm.hotspot.tools.Tool.execute(Tool.java:118)
at sun.jvm.hotspot.tools.JStack.main(JStack.java:92)
... 6 more
Caused by: sun.jvm.hotspot.types.WrongTypeException: No suitable match for type of address 0x7fecb400b800
at sun.jvm.hotspot.runtime.InstanceConstructor.newWrongTypeException(InstanceConstructor.java:62)
at sun.jvm.hotspot.runtime.VirtualConstructor.instantiateWrapperFor(VirtualConstructor.java:80)
at sun.jvm.hotspot.runtime.Threads.createJavaThreadWrapper(Threads.java:165)
... 16 more
ubuntu22:~$ sudo jstack -Flm 987870
Usage:
jstack [-l] <pid>
(to connect to running process)
jstack -F [-m] [-l] <pid>
(to connect to a hung process)
jstack [-m] [-l] <executable> <core>
(to connect to a core file)
jstack [-m] [-l] [server_id@]<remote server IP or hostname>
(to connect to a remote debug server)

Options:
-F  to force a thread dump. Use when jstack <pid> does not respond (process is hung)
-m  to print both java and native frames (mixed mode)
   

[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-04-02 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  Now this is interesting: we're now past 4 days (about 4 days and 1 hour) of 
this running with the buffer capacity at 100,000 instead of 1,000,000 (but this 
time without any gap between the batches of files), and there's still a good 
way to go yet.
  
Processing wikidump-01177.ttl.gz
totalElapsed=612796ms, elapsed=612796ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=689208ms, commitTime=1712085811545, mutationCount=12297407
Tue Apr  2 02:23:35 PM CDT 2024
Processing wikidump-01178.ttl.gz
totalElapsed=850122ms, elapsed=850121ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=950693ms, commitTime=1712086762086, mutationCount=16659867
Tue Apr  2 02:39:26 PM CDT 2024
Processing wikidump-01179.ttl.gz
  
  It's possible this means that the higher buffer capacity actually makes a 
difference. I will let this run complete so we can see the percentage 
difference.
  
  After this I will check whether this sort of behavior is reproducible, and to 
what extent, with one side of the graph split when using these two different 
buffer sizes.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, S8321414, 
Astuthiodit_1, karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, 
Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, 
LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-04-01 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  The run with the buffer at 1,000,000, heap size at 31g, and queue capacity at 
4000 on the gaming-class desktop completed.
  
Processing wikidump-01332.ttl.gz
totalElapsed=13580ms, elapsed=13580ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=266483ms, commitTime=1711304860167, mutationCount=4772590
Sun Mar 24 01:27:45 PM CDT 2024

real    5690m30.371s
  
  ... which is 3.95 days. I'm trying again, but going back to the buffer 
capacity of 100,000 instead of 1,000,000 for one last comparison among these 
runs on this subset of munged data, and without any longer pause between 
batches of files. (Remember, the previous run with the buffer capacity at 
100,000, 31g heap, and queue capacity at 4000 was done by first running files 
1-150 and then, after coming back to the terminal sometime later, resuming from 
file 151; but in the real world we usually hope to just let this thing run one 
file after another without any pause... in practice it could be that allowing 
the JVM time to heal itself created some artificial speed gains, but we'll see.)
  
  Starting on Friday, March 29, 2024 at 1:40 PM CT...
  
ubuntu22:~/rdf/dist/target/service-0.3.138-SNAPSHOT$ time ./loadData.sh -n wdq -d /mnt/firehose/munge_on_later_data_set -s 1 -e 1332
Processing wikidump-1.ttl.gz
  
  I'll update when it's done. It should complete presumably sometime in the 
next 24 hours.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, S8321414, 
Astuthiodit_1, karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, 
Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, 
LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-21 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  **AWS EC2 servers**
  
  After exploring a battery of EC2 servers, four instance types were selected 
and the commands posted were run.
  
  The configuration most like our `wdqs1021-1023` servers (third generation 
Intel Xeon) is listed first. The fastest option among the four was a Graviton3 
ARM-based configuration from Amazon.
  
  | Time Disk ➡️ Disk | Time RAMdisk ➡️ RAMdisk | Instance Type | Cost Per Hour | HD Transfer | Processor Comment | RAM Comment |
  | --- | --- | --- | --- | --- | --- | --- |
  | 26m46.651s | 26m26.923s | m6id.16xlarge | $3.7968 | EBS ➡️ NVMe | 64 vCPU @ "Up to 3.5 GHz 3rd Generation Intel Xeon Scalable processors (Ice Lake 8375C)" | 256 GB @ DDR4 |
  | 22m5.442s | 20m31.244s | m5zn.metal | $3.9641 | EBS ➡️ EBS | 48 vCPU @ "2nd Generation Intel Xeon Scalable Processors (Cascade Lake 8252C) with an all-core turbo frequency up to 4.5 GHz" | 192 GiB @ DDR4 |
  | 21m40.537s | 20m57.268s | c5d.12xlarge | $2.304 | EBS ➡️ NVMe | 48 vCPU @ "C5 and C5d 12xlarge, 24xlarge, and metal instance sizes feature custom 2nd generation Intel Xeon Scalable Processors (Cascade Lake 8275CL) with a sustained all core Turbo frequency of 3.6GHz and single core turbo frequency of up to 3.9GHz." | 96 GiB @ DDR4 |
  | 19m18.825s | 19m23.868s | c7gd.16xlarge | $2.903 | EBS ➡️ NVMe | 64 vCPU @ "Powered by custom-built AWS Graviton3 processors" | 128 GiB @ DDR5 |
  
  **2018 gaming desktop**
  
  Commands were then run against a gaming-class desktop from 2018. This 
outperformed the fastest Graviton3 configuration in AWS.
  
  The Blazegraph `bufferCapacity` configuration variable was also tested: 
increasing it from 100,000 to 1,000,000 yielded a sizable performance 
improvement.
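  
  Concretely, the two `RWStore.properties` settings being compared are the 
following (a sketch; the full property lines appear in other comments on this 
thread):
  
# baseline
com.bigdata.rdf.sail.bufferCapacity=100000
# increased
com.bigdata.rdf.sail.bufferCapacity=1000000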
  
  | Time Disk ➡️ Disk | Instance Type | bufferCapacity | HD Transfer | Processor Comment | RAM Comment |
  | --- | --- | --- | --- | --- | --- |
  | 18m31.647s | Alienware Aurora R7 (upgraded) i7-8700 | 100,000 | SATA SSD ➡️ NVMe | 6 CPU @ up to 4.6 GHz (i7-8700 page) | 64 GB @ DDR4 |
  | 18m3.798s | Alienware Aurora R7 (upgraded) i7-8700 | 100,000 | NVMe ➡️ same NVMe | 6 CPU @ up to 4.6 GHz (i7-8700 page) | 64 GB @ DDR4 |
[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-21 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  By the way, I'm attempting a run of the first 1332 munged files (one shy of 
file 1333, where the last run terminated) with the buffer at 1,000,000, heap 
size at 31g, and queue capacity at 4000 on the gaming-class desktop, to see 
whether this imports smoothly and whether performance gains are noticeable.
  
ubuntu22:~/rdf/dist/target/service-0.3.138-SNAPSHOT$ date
Wed Mar 20 02:36:59 PM CDT 2024
ubuntu22:~/rdf/dist/target/service-0.3.138-SNAPSHOT$ time ./loadData.sh -n wdq -d /mnt/firehose/munge_on_later_data_set -s 1 -e 1332
  
  ...screen'ing in to check:
  
Processing wikidump-00505.ttl.gz
totalElapsed=13452ms, elapsed=13452ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=167329ms, commitTime=1711041930967, mutationCount=4566497
Thu Mar 21 12:25:35 PM CDT 2024
Processing wikidump-00506.ttl.gz
totalElapsed=15405ms, elapsed=15405ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=203202ms, commitTime=1711042135111, mutationCount=5262167
Thu Mar 21 12:28:58 PM CDT 2024
Processing wikidump-00507.ttl.gz
totalElapsed=14701ms, elapsed=14700ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=178754ms, commitTime=1711042314114, mutationCount=5005853
Thu Mar 21 12:31:57 PM CDT 2024

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, 
karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, 
Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-20 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  The run to check with a heap size of 31g, a queue capacity of 8000, and the 
buffer at 1,000,000 stalled at file 107.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, 
karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, 
Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-20 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  Attempting a run with a **queue capacity of 8000**, a buffer of 1,000,000, 
and a heap size of 16g on the gaming-class desktop (to mimic the MacBook Pro), 
things were slower than with a queue capacity of 4000, a buffer of 1,000,000, 
and a heap size of 31g on the gaming-class desktop.
  
ubuntu22:~/rdf/dist/target/service-0.3.138-SNAPSHOT$ time ./loadData.sh -n wdq -d /mnt/firehose/munge_on_later_data_set -s 1 -e 150
...
real    280m46.264s
  
  A run is in progress to verify whether there's anything noticeable when the 
heap size is set to 31g but the queue capacity stays at 8000 and the buffer at 
1,000,000 when processing the first 150 files on the gaming-class desktop.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, 
karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, 
Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-19 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  **About Amazon Neptune**
  
  Amazon Neptune was set to import using the simpler N-Triples file format, 
with its serverless configuration at 128 NCUs (about 256 GB of RAM with some 
attendant CPU). We don't use N-Triples files in our existing import process, 
but it is the sort of format used in the graph split imports.
  
curl -v -X POST \
-H 'Content-Type: application/json' \
https://db-neptune-1.cluster-cnim20k6c0mh.us-west-2.neptune.amazonaws.com:8182/loader -d '
{
  "source" : "s3://blazegraphdump/latest-lexemes.nt.bz2",
  "format" : "ntriples",
  "iamRoleArn" : "arn:aws:iam::ACCOUNTID:role/NeptuneLoadFromS3",
  "region" : "us-west-2",
  "failOnError" : "FALSE",
  "parallelism" : "HIGH",
  "updateSingleCardinalityProperties" : "FALSE",
  "queueRequest" : "TRUE"
}'
  
  This required a bunch of grants, and I had to make my personal bucket hosting 
the file listable and readable, as well as the objects within it (it's possible 
to do chained IAM grants, but it is a bit of work and requires somewhat 
complicated STS setups). It also appeared necessary to create the VPC endpoint 
as described in the documentation.
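  
  (For illustration only, the kind of minimal bucket policy that makes the 
bucket listable and its objects readable to the loader role; the bucket and 
role names come from the commands below, while the policy itself is a generic 
sketch rather than the actual policy used.)
  
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": { "AWS": "arn:aws:iam::ACCOUNTID:role/NeptuneLoadFromS3" },
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::blazegraphdump"
        },
        {
            "Effect": "Allow",
            "Principal": { "AWS": "arn:aws:iam::ACCOUNTID:role/NeptuneLoadFromS3" },
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::blazegraphdump/*"
        }
    ]
}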
  
  This was started at 1:30 PM CT on Monday, February 26, 2024. Note that this 
is the lexemes dump. I'm trying here to verify that with 128 NCUs it goes 
faster than with 32 NCUs; if it does, that will be useful for the bigger dump.
  
curl -v -X POST \
-H 'Content-Type: application/json' \
https://db-neptune-1-instance-1.cwnhpfsf87ne.us-west-2.neptune.amazonaws.com:8182/loader -d '
{
  "source" : "s3://blazegraphdump/latest-lexemes.nt.bz2",
  "format" : "ntriples",
  "iamRoleArn" : "arn:aws:iam::ACCOUNTID:role/NeptuneLoadFromS3Attempt",
  "region" : "us-west-2",
  "failOnError" : "FALSE",
  "parallelism" : "OVERSUBSCRIBE",
  "updateSingleCardinalityProperties" : "FALSE",
  "queueRequest" : "TRUE"
}'


{
    "status" : "200 OK",
    "payload" : {
        "loadId" : "8ace45ed-2989-4fd4-aa19-d13b9a59e824"
    }
}

curl -G 'https://db-neptune-1-instance-1.cwnhpfsf87ne.us-west-2.neptune.amazonaws.com:8182/loader/8ace45ed-2989-4fd4-aa19-d13b9a59e824'


{
    "status" : "200 OK",
    "payload" : {
        "feedCount" : [
            {
                "LOAD_COMPLETED" : 1
            }
        ],
        "overallStatus" : {
            "fullUri" : "s3://blazegraphdump/latest-lexemes.nt.bz2",
            "runNumber" : 1,
            "retryNumber" : 0,
            "status" : "LOAD_COMPLETED",
            "totalTimeSpent" : 2142,
            "startTime" : 1708975752,
            "totalRecords" : 163715491,
            "totalDuplicates" : 141148,
            "parsingErrors" : 0,
            "datatypeMismatchErrors" : 0,
            "insertErrors" : 0
        }
    }
}
  
  Now, for the full Wikidata load. This was started at about 2:20 PM CT on 
Monday, February 26, 2024.
  
curl -v -X POST \
-H 'Content-Type: application/json' \
https://db-neptune-1-instance-1.cwnhpfsf87ne.us-west-2.neptune.amazonaws.com:8182/loader -d '
{
  "source" : "s3://blazegraphdump/latest-all.nt.bz2",
  "format" : "ntriples",
  "iamRoleArn" : "arn:aws:iam::ACCOUNTID:role/NeptuneLoadFromS3Attempt",
  "region" : "us-west-2",
  "failOnError" : "FALSE",
  "parallelism" : "OVERSUBSCRIBE",
  "updateSingleCardinalityProperties" : "FALSE",
  "queueRequest" : "TRUE"
}'

{
    "status" : "200 OK",
    "payload" : {
        "loadId" : "54dc9f5a-6e3c-428d-8897-180e10c96dbf"
    }
}

curl -G 'https://db-neptune-1-instance-1.cwnhpfsf87ne.us-west-2.neptune.amazonaws.com:8182/loader/54dc9f5a-6e3c-428d-8897-180e10c96dbf'
  
  As a frame of reference, over 9B records imported in a bit over 26 hours, 
which is very roughly 96,000 records per second. This is in the ballpark of 
what's discussed in https://hal.science/hal-03132794/document, and remember 
that this is serverless and probably using lower clock speed vCPUs and possibly 
lower rated RAM. A higher powered provisioned instance is likely to go faster.
  
{
    "status" : "200 OK",
    "payload" : {
        "feedCount" : [
            {
                "LOAD_IN_PROGRESS" : 1
            }
        ],
        "overallStatus" : {
            "fullUri" : "s3://blazegraphdump/latest-all.nt.bz2",
            "runNumber" : 1,
            "retryNumber" : 0,
            "status" : "LOAD_IN_PROGRESS",
            "totalTimeSpent" : 101809,
            "startTime" : 1708978871,
            "totalRecords" : 931260,

[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-19 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  **Going for the full import**
  
  Further import commenced from there with a `bufferCapacity` of 1,000,000:
  
ubuntu22:~/rdf/dist/target/service-0.3.138-SNAPSHOT$ date
Mon Mar  4 06:31:06 PM CST 2024

ubuntu22:~/rdf/dist/target/service-0.3.138-SNAPSHOT$ time ./loadData.sh -n wdq -d /mnt/firehose/munge_on_later_data_set -s 151 -e 2202
Processing wikidump-00151.ttl.gz
  
  Munge files 151 through 1333 were processed, stopping at Friday, March 8, 
2024 12:07:23 AM CST.
  
  So, we have about 4 hours for files 1-150, then another 77.6 hours for files 
151-1333; that means about 66% of the full dump was processed in roughly 3.4 
days.
  
  As noted earlier, there may be an opportunity to set the queue capacity 
higher and squeeze out even better performance. This will need to wait until 
I'm physically at the gaming-class desktop.
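  
  For reference, that knob is the write retention queue setting in 
`RWStore.properties`; 8000 is the illustrative higher value that the runs dated 
2024-03-20 above ended up trying:
  
com.bigdata.btree.writeRetentionQueue.capacity=8000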

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, 
karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, 
Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-19 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  **More about bufferCapacity**
  
  Similarly, the run with 150 munged files was attempted with the buffer in 
RWStore.properties increased from 100,000 to 1,000,000, with the target as the 
NVMe.
  
  com.bigdata.rdf.sail.bufferCapacity=1000000
  
ubuntu22:~/rdf/dist/target/service-0.3.138-SNAPSHOT$ time ./loadData.sh -n wdq -d /mnt/firehose/munge_on_later_data_set -s 1 -e 150
...
real    240m5.344s
  
  Remember, for //nine// munged files the difference in performance for NVMe ➡️ 
same NVMe between a `bufferCapacity` of 100,000 versus 1,000,000 was about 34% 
(~1.3412 minus 1), and what we see here for //150// munged files is somewhat 
consistent at about 33%.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, 
karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, 
Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-19 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  **More about NVMe versus SSD**
  
  Runs were also done to see the effects on 150 munged files (out of a set of 
2202 files) from the full Wikidata import, which exercises more of the 
disk-related pieces. This was tried with both types of target disk - SATA SSD 
and M.2 NVMe - on the 2018 gaming desktop, with the `bufferCapacity` at 
100,000.
  
  The M.2 NVMe target was somewhere between 16% and 19% faster.
  
  Notice in the following commands the paths
  
  - `~/rdf`, which is part of a mount on the NVMe
  - `/mnt/t`, which is a copy of `~/rdf`, but on a SATA SSD
  - `/mnt/firehose/`, yet another SATA SSD, bearing the full set of munged files
  
  **Target is NVMe**
  
ubuntu22:~/rdf/dist/target/service-0.3.138-SNAPSHOT$ time ./loadData.sh -n wdq -d /mnt/firehose/munge_on_later_data_set -s 1 -e 150

...

Processing wikidump-00150.ttl.gz
totalElapsed=33999ms, elapsed=33999ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=76005ms, commitTime=1709099819611, mutationCount=3098484

real    319m50.828s
  
  **Target is SATA SSD, run attempt 1**
  
  Now, the SATA SSD as the target (as before, the source has been a different 
SATA SSD).
  
ubuntu22:/mnt/t/rdf/dist/target/service-0.3.138-SNAPSHOT$ time ./loadData.sh -n wdq -d /mnt/firehose/munge_on_later_data_set -s 1 -e 150

Processing wikidump-00150.ttl.gz
totalElapsed=45665ms, elapsed=45665ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=114606ms, commitTime=1709141576293, mutationCount=3098484

real    381m19.703s
  
  So, the SATA SSD as target yielded a result about 19% slower.
  
  **Target is SATA SSD, run attempt 2**
  
  The SATA SSD target was tried again from the same directory (as always, first 
stopping Blazegraph and deleting the journal), just to get a feeling for 
whether the first result was a fluke on the SATA SSD.
  
ubuntu22:/mnt/t/rdf/dist/target/service-0.3.138-SNAPSHOT$ time ./loadData.sh -n wdq -d /mnt/firehose/munge_on_later_data_set -s 1 -e 150

totalElapsed=46490ms, elapsed=46490ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=120472ms, commitTime=1709169683880, mutationCount=3098484

real    373m52.079s
  
  Still, some 16.5% slower on the SSD.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, 
karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, 
Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-19 Thread bking
bking closed subtask T358727: Reclaim recently-decommed CP host for WDQS (see 
T352253) as Resolved.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt, bking
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, 
karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, 
Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-08 Thread bking
bking added a comment.


  @ssingh @dr0ptp4kt hold up on the testing on your hosts for now... we might 
be able to get an NVMe into this year's budget, will let you know.
  
  @dr0ptp4kt If you want to run I/O tests on the existing hosts, I recommend 
the approach detailed in this wikitech page. Brendan Gregg is considered an 
expert on computer performance and benchmarking; you might try his approach as 
well.
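  
  (For example, a quick sequential-write check with `fio`, in the spirit of 
those approaches; the parameters here are illustrative and not taken from 
either reference.)
  
# 10 GiB sequential write with direct I/O, bypassing the page cache
fio --name=seqwrite --rw=write --bs=1M --size=10g --direct=1 --ioengine=libaio --iodepth=16 --directory=/srv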

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt, bking
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, 
karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, 
Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-08 Thread dr0ptp4kt
dr0ptp4kt added a subscriber: ssingh.
dr0ptp4kt added a comment.


  @ssingh would you mind if the following command is run on one of the newer 
cp hosts with a new higher write throughput NVMe? If so, got a recommended 
node? I don't have access, but I think @bking may.
  
  `sudo sync; sudo dd if=/dev/zero of=tempfile bs=25M count=1024; sudo sync`
  
  Heads up, I'm out for the rest of the day.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: ssingh, bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, 
karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, 
Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-08 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  Thanks @bking ! It looks like the NVMe in this one is not a higher speed one 
for writes, and I'm also wondering if perhaps its write performance has 
degraded with age. I'll paste the results in here; ironically, this was slower 
than the other servers (although that's not surprising given the slower NVMe 
and slightly slower processor). This slower write speed is atypical of the 
other NVMes I've encountered; I believe the newer model ones are rated for 6000 
MB/s writes. I'm going to ping on task to see if we can get a comparative read 
of disk throughput from one of the newer and faster cp NVMes.
  
dr0ptp4kt@wdqs1025:/srv/deployment/wdqs/wdqs-cache$ ls /srv/wdqs/
aliases.map  wikidata.jnl       wikidump-2.ttl.gz  wikidump-4.ttl.gz  wikidump-6.ttl.gz  wikidump-8.ttl.gz
dumps        wikidump-1.ttl.gz  wikidump-3.ttl.gz  wikidump-5.ttl.gz  wikidump-7.ttl.gz  wikidump-9.ttl.gz
dr0ptp4kt@wdqs1025:/srv/deployment/wdqs/wdqs-cache$ cd cache
dr0ptp4kt@wdqs1025:/srv/deployment/wdqs/wdqs-cache/cache$ time ./loadData.sh -n wdq -d /srv/wdqs -s 1 -e 9
Processing wikidump-1.ttl.gz
totalElapsed=214282ms, elapsed=214279ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=233942ms, commitTime=1709910647417, mutationCount=22829952
Processing wikidump-2.ttl.gz
totalElapsed=196470ms, elapsed=196469ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=227786ms, commitTime=1709910874952, mutationCount=15807617
Processing wikidump-3.ttl.gz
totalElapsed=183111ms, elapsed=183110ms, connFlush=0ms, batchResolve=0, whereClause=0ms, deleteClause=0ms, insertClause=0ms
COMMIT: totalElapsed=213965ms, commitTime=1709911089170, mutationCount=12654001
Processing wikidump-4.ttl.gz
^C

real    14m4.855s
user    0m0.084s
sys     0m0.053s
dr0ptp4kt@wdqs1025:/srv/deployment/wdqs/wdqs-cache/cache$ cd /srv
dr0ptp4kt@wdqs1025:/srv$ df .
Filesystem      1K-blocks    Used  Available Use% Mounted on
/dev/nvme0n1   1537157352 9508448 1449491832   1% /srv
dr0ptp4kt@wdqs1025:/srv$ sudo sync; sudo dd if=/dev/zero of=tempfile bs=25M count=1024; sudo sync
1024+0 records in
1024+0 records out
26843545600 bytes (27 GB, 25 GiB) copied, 27.1995 s, 987 MB/s
dr0ptp4kt@wdqs1025:/srv$ sudo sync; sudo dd if=/dev/zero of=tempfile bs=25M count=1024; sudo sync
1024+0 records in
1024+0 records out
26843545600 bytes (27 GB, 25 GiB) copied, 37.5448 s, 715 MB/s
dr0ptp4kt@wdqs1025:/srv$ lsblk -o MODEL,SERIAL,SIZE,STATE --nodeps
MODELSERIAL SIZE STATE
...
Dell Express Flash PM1725a 1.6TB SFF   S39XNX0JC01060   1.5T

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, 
karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, 
Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-07 Thread Maintenance_bot
Maintenance_bot removed a project: Patch-For-Review.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt, Maintenance_bot
Cc: bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, 
karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, 
Gq86, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331, Isabelladantes1983, 
Themindcoder, Adamm71, Jersione, Hellket777, LisafBia6531, 786, Biggs657, 
Juan90264, Alter-paule, Beast1978, Un1tY, Hook696, Kent7301, joker88john, 
CucyNoiD, Gaboe420, Giuliamocci, Cpaulf30, Af420, Bsandipan, Lewizho99, 
Maathavan, Neuronton
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-07 Thread bking
bking added a comment.


  @dr0ptp4kt `wdqs1025` should be ready for your I/O tests. Let us know how it 
goes!

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt, bking
Cc: bking, dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Isabelladantes1983, 
Themindcoder, Adamm71, Jersione, Hellket777, LisafBia6531, Astuthiodit_1, 786, 
Biggs657, karapayneWMDE, Invadibot, maantietaja, Juan90264, Alter-paule, 
Beast1978, ItamarWMDE, Un1tY, Akuckartz, Hook696, Kent7301, joker88john, 
CucyNoiD, Nandana, Gaboe420, Giuliamocci, Cpaulf30, Lahi, Gq86, Af420, 
Bsandipan, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, 
Lewizho99, Maathavan, _jensen, rosalieper, Neuronton, Scott_WUaS, 
Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-07 Thread gerritbot
gerritbot added a comment.


  Change 1009574 **merged** by Bking:
  
  [operations/puppet@production] wdqs: move monitoring logic into role 
declaration
  
  https://gerrit.wikimedia.org/r/1009574

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt, gerritbot
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Isabelladantes1983, 
Themindcoder, Adamm71, Jersione, Hellket777, LisafBia6531, Astuthiodit_1, 786, 
Biggs657, karapayneWMDE, Invadibot, maantietaja, Juan90264, Alter-paule, 
Beast1978, ItamarWMDE, Un1tY, Akuckartz, Hook696, Kent7301, joker88john, 
CucyNoiD, Nandana, Gaboe420, Giuliamocci, Cpaulf30, Lahi, Gq86, Af420, 
Bsandipan, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, 
Lewizho99, Maathavan, _jensen, rosalieper, Neuronton, Scott_WUaS, 
Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-07 Thread gerritbot
gerritbot added a project: Patch-For-Review.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt, gerritbot
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Isabelladantes1983, 
Themindcoder, Adamm71, Jersione, Hellket777, LisafBia6531, Astuthiodit_1, 786, 
Biggs657, karapayneWMDE, Invadibot, maantietaja, Juan90264, Alter-paule, 
Beast1978, ItamarWMDE, Un1tY, Akuckartz, Hook696, Kent7301, joker88john, 
CucyNoiD, Nandana, Gaboe420, Giuliamocci, Cpaulf30, Lahi, Gq86, Af420, 
Bsandipan, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, 
Lewizho99, Maathavan, _jensen, rosalieper, Neuronton, Scott_WUaS, 
Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-07 Thread gerritbot
gerritbot added a comment.


  Change 1009574 had a related patch set uploaded (by Bking; author: Bking):
  
  [operations/puppet@production] wdqs: make "monitoring_tier" var optional
  
  https://gerrit.wikimedia.org/r/1009574

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt, gerritbot
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, karapayneWMDE, 
Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-07 Thread dr0ptp4kt
dr0ptp4kt updated the task description.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, karapayneWMDE, 
Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-07 Thread dr0ptp4kt
dr0ptp4kt updated the task description.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: dr0ptp4kt
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, karapayneWMDE, 
Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-07 Thread dr0ptp4kt
dr0ptp4kt added a comment.


  First, adding some commands that were used for Blazegraph imports on Ubuntu 
22.04. I had originally tried a good number of EC2 instance types, and then 
went back to focus on just four of them with a sequence of repeatable commands 
(this wasn't scripted, as I didn't want to spend time automating and also 
wanted to make sure I got the systems' feedback along the way). I forgot to 
grab RAM clock speed as a routine step when running these commands (I recall 
checking it on maybe one server in the original checks, and did look at my 
Alienware), but generally the servers are DDR4 unless the AWS documentation 
says DDR5 (my 2018 Alienware and 2019 MacBook Pro are DDR4, BTW).
  
# get the specs, get the software, ready the mount
lscpu
free -h
lsblk
sudo fdisk /dev/nvme1n1
 n      # new partition
 p      # primary
 1      # partition number 1
 ENTER  # accept default first sector
 ENTER  # accept default last sector
 w      # write the partition table
lsblk
sudo mkfs.ext4 /dev/nvme1n1p1
mkdir rdf
sudo mount -t auto -v /dev/nvme1n1p1 /home/ubuntu/rdf
sudo chown ubuntu:ubuntu rdf
git clone https://gerrit.wikimedia.org/r/wikidata/query/rdf rdfdownload
cp -r rdfdownload/. rdf
cd rdf
df -h .
sudo apt update
sudo apt install openjdk-8-jdk-headless
./mvnw package -DskipTests

# ready Blazegraph and run a partial import
sudo mkdir /var/log/wdqs
sudo chown ubuntu:ubuntu /var/log/wdqs
touch /var/log/wdqs/wdqs-blazegraph.log
cd /home/ubuntu/rdf/dist/target/
tar xzvf service-0.3.138-SNAPSHOT-dist.tar.gz
cd service-0.3.138-SNAPSHOT/
# using logback.xml like prod:
mv ~/logback.xml .
# using runBlazegraph.sh like prod, 31g heap and pointer to logback.xml:
mv ~/runBlazegraph.sh .
vi runBlazegraph.sh
screen
 ./runBlazegraph.sh
# CTRL-a d to detach and leave the screen session running
time ./loadData.sh -n wdq -d /home/ubuntu/ -s 1 -e 9
screen -r
# CTRL-c to stop Blazegraph, then exit the screen session
 exit
ls -alh wikidata.jnl
rm wikidata.jnl

# try it with a ramdisk
# rd_size is in KiB, so 50331648 KiB = 48 GiB
sudo modprobe brd rd_size=50331648 max_part=1 rd_nr=1
sudo mkfs -t ext4 /dev/ram0
mkdir /home/ubuntu/rdfram
sudo mount /dev/ram0 /home/ubuntu/rdfram
sudo chown ubuntu:ubuntu /home/ubuntu/rdfram
cd
cp -r rdf/. rdfram
cd rdfram/dist/target/service-0.3.138-SNAPSHOT/
cp /home/ubuntu/wikidump-* /home/ubuntu/rdfram
df -h ./
screen
 ./runBlazegraph.sh
 # press Ctrl-a then d to detach, leaving Blazegraph running in the screen
time ./loadData.sh -n wdq -d /home/ubuntu/rdfram -s 1 -e 9
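
  For what it's worth, if this gets repeated across more instance types, a 
small wrapper along the following lines could capture the specs and the 
import timing in one pass. This is a hypothetical sketch, not something that 
was run: it assumes the same layout as above, i.e. loadData.sh in the current 
directory, dump parts 1-9 under the data directory, and Blazegraph already 
started in a screen session.

#!/bin/bash
# Hypothetical helper: record host specs, then time a partial import.
# Assumes ./loadData.sh is in the current directory and Blazegraph is
# already running (started via ./runBlazegraph.sh in a screen session).
set -euo pipefail

DATA_DIR="${1:-/home/ubuntu}"            # where the wikidump-* parts live
OUT="import-$(date +%Y%m%dT%H%M%S).log"  # per-run log file

{
  echo "== specs =="
  lscpu
  free -h
  lsblk
  echo "== import =="
  time ./loadData.sh -n wdq -d "$DATA_DIR" -s 1 -e 9
} 2>&1 | tee "$OUT"
echo "Results saved to $OUT"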

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

To: dr0ptp4kt
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, karapayneWMDE, 
Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-06 Thread dr0ptp4kt
dr0ptp4kt updated the task description.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

To: dr0ptp4kt
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, karapayneWMDE, 
Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-06 Thread bking
bking reopened subtask T358727: Reclaim recently-decommed CP host for WDQS (see 
T352253) as Open.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

To: dr0ptp4kt, bking
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, karapayneWMDE, 
Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-05 Thread Maintenance_bot
Maintenance_bot removed a project: Patch-For-Review.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

To: dr0ptp4kt, Maintenance_bot
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, karapayneWMDE, 
Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331, Isabelladantes1983, 
Themindcoder, Adamm71, Jersione, Hellket777, LisafBia6531, 786, Biggs657, 
Juan90264, Alter-paule, Beast1978, Un1tY, Hook696, Kent7301, joker88john, 
CucyNoiD, Gaboe420, Giuliamocci, Cpaulf30, Af420, Bsandipan, Lewizho99, 
Maathavan, Neuronton


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-05 Thread gerritbot
gerritbot added a comment.


  Change rOPUP100894305575 **merged** by Bking:
  
  [operations/puppet@production] partman: configure wdqs1025 partioning
  
  https://gerrit.wikimedia.org/r/1008943

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

To: dr0ptp4kt, gerritbot
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Isabelladantes1983, 
Themindcoder, Adamm71, Jersione, Hellket777, LisafBia6531, Astuthiodit_1, 786, 
Biggs657, karapayneWMDE, Invadibot, maantietaja, Juan90264, Alter-paule, 
Beast1978, ItamarWMDE, Un1tY, Akuckartz, Hook696, Kent7301, joker88john, 
CucyNoiD, Nandana, Gaboe420, Giuliamocci, Cpaulf30, Lahi, Gq86, Af420, 
Bsandipan, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, 
Lewizho99, Maathavan, _jensen, rosalieper, Neuronton, Scott_WUaS, 
Wikidata-bugs, aude, Mbch331


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-05 Thread gerritbot
gerritbot added a comment.


  Change rOPUP100894305575 had a related patch set uploaded (by Bking; author: 
Bking):
  
  [operations/puppet@production] partman: configure wdqs1025 partioning
  
  https://gerrit.wikimedia.org/r/1008943

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

To: dr0ptp4kt, gerritbot
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, karapayneWMDE, 
Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-05 Thread gerritbot
gerritbot added a project: Patch-For-Review.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

To: dr0ptp4kt, gerritbot
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Isabelladantes1983, 
Themindcoder, Adamm71, Jersione, Hellket777, LisafBia6531, Astuthiodit_1, 786, 
Biggs657, karapayneWMDE, Invadibot, maantietaja, Juan90264, Alter-paule, 
Beast1978, ItamarWMDE, Un1tY, Akuckartz, Hook696, Kent7301, joker88john, 
CucyNoiD, Nandana, Gaboe420, Giuliamocci, Cpaulf30, Lahi, Gq86, Af420, 
Bsandipan, GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, 
Lewizho99, Maathavan, _jensen, rosalieper, Neuronton, Scott_WUaS, 
Wikidata-bugs, aude, Mbch331


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-05 Thread bking
bking added a subtask: T358727: Reclaim recently-decommed CP host for WDQS (see 
T352253).

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

To: dr0ptp4kt, bking
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, karapayneWMDE, 
Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-04 Thread dr0ptp4kt
dr0ptp4kt updated the task description.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

To: dr0ptp4kt
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, karapayneWMDE, 
Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-04 Thread dr0ptp4kt
dr0ptp4kt moved this task from Incoming to Current work on the 
Wikidata-Query-Service board.
dr0ptp4kt removed a project: Wikidata-Query-Service.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

WORKBOARD
  https://phabricator.wikimedia.org/project/board/891/

To: dr0ptp4kt
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, karapayneWMDE, 
Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, EBjune, KimKelting, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331, AWesterinen, Namenlos314, 
Lucas_Werkmeister_WMDE, merbst, Jonas, Xmlizer, jkroll, Jdouglas, Tobias1984, 
Manybubbles


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-04 Thread dr0ptp4kt
dr0ptp4kt updated the task description.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

To: dr0ptp4kt
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, AWesterinen, 
karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, 
Namenlos314, Lahi, Gq86, Lucas_Werkmeister_WMDE, GoranSMilovanovic, QZanden, 
EBjune, KimKelting, merbst, LawExplorer, _jensen, rosalieper, Scott_WUaS, 
Jonas, Xmlizer, jkroll, Wikidata-bugs, Jdouglas, aude, Tobias1984, Manybubbles, 
Mbch331


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-04 Thread Maintenance_bot
Maintenance_bot added a project: Wikidata.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

To: dr0ptp4kt, Maintenance_bot
Cc: dr0ptp4kt, Aklapper, Danny_Benjafield_WMDE, Astuthiodit_1, AWesterinen, 
karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, 
Namenlos314, Lahi, Gq86, Lucas_Werkmeister_WMDE, GoranSMilovanovic, QZanden, 
EBjune, KimKelting, merbst, LawExplorer, _jensen, rosalieper, Scott_WUaS, 
Jonas, Xmlizer, jkroll, Wikidata-bugs, Jdouglas, aude, Tobias1984, Manybubbles, 
Mbch331


[Wikidata-bugs] [Maniphest] T359062: Assess Wikidata dump import hardware

2024-03-04 Thread dr0ptp4kt
dr0ptp4kt changed the task status from "Open" to "In Progress".
dr0ptp4kt triaged this task as "Medium" priority.
dr0ptp4kt claimed this task.
dr0ptp4kt added projects: Wikidata-Query-Service, Discovery-Search (Current 
work).
dr0ptp4kt updated the task description.

TASK DETAIL
  https://phabricator.wikimedia.org/T359062

To: dr0ptp4kt
Cc: dr0ptp4kt, Aklapper, AWesterinen, Namenlos314, Gq86, 
Lucas_Werkmeister_WMDE, EBjune, KimKelting, merbst, Jonas, Xmlizer, jkroll, 
Wikidata-bugs, Jdouglas, aude, Tobias1984, Manybubbles