thiemowmde added a comment.

  I ran a more complex benchmark using an item with 1000 labels, 1000 
descriptions, 1000 aliases (1 per language), and 1000 statements, each 
statement with 1 main snak, 1 qualifier, and 1 reference (all StringValues). 
That is a total of 3000 snaks.
  
  - This entity requires 4.2 MiB in memory.
  - Cloning it via `unserialize( serialize( … ) )` requires an additional 
5.6 MiB. This number fluctuates with every clone, between 4 and 6 MiB. I think 
this is because of how `serialize` and `unserialize` are implemented: calling 
these methods allocates temporary buffers on top of what the cloned entity 
occupies in the end.
  - Cloning it with https://github.com/wmde/WikibaseDataModel/pull/626 applied 
requires only an additional 0.5 MiB.
  
  So in this example, which should be fairly close to the worst case we have in 
the wikidata.org database, the patch reduces our memory footprint by a solid 5 
MiB.
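  The measurement methodology can be sketched as follows. This is a Python 
analog (not the actual PHP benchmark): `pickle.loads(pickle.dumps(…))` stands 
in for `unserialize( serialize( … ) )`, `copy.deepcopy` stands in for the 
`__clone`-based approach from the patch, and the toy `entity` structure is a 
hypothetical stand-in for the real 3000-snak item.

```python
import copy
import pickle
import tracemalloc

# Hypothetical stand-in for the benchmarked item: many small strings
# spread over labels and statements (not the real data model classes).
entity = {
    "labels": {f"lang{i}": f"label {i}" for i in range(1000)},
    "statements": [
        {"mainsnak": f"value {i}", "qualifier": f"q {i}", "reference": f"r {i}"}
        for i in range(1000)
    ],
}

def peak_kib(clone_fn):
    """Peak memory (KiB) allocated while clone_fn() runs."""
    tracemalloc.start()
    clone_fn()
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return peak / 1024

# Serialize round-trip: builds an intermediate byte string plus
# all-new objects, analogous to serialize()/unserialize() in PHP.
roundtrip = peak_kib(lambda: pickle.loads(pickle.dumps(entity)))

# Direct deep copy: no intermediate buffer, analogous to the
# __clone-based cloning in the patch.
direct = peak_kib(lambda: copy.deepcopy(entity))

print(f"round-trip peak: {roundtrip:.0f} KiB, deep-copy peak: {direct:.0f} KiB")
```

  The absolute numbers differ between runtimes, but the shape of the result is 
the same: the round-trip pays for a temporary serialization buffer that a 
direct clone never allocates.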

TASK DETAIL
  https://phabricator.wikimedia.org/T126795

To: thiemowmde
Cc: Aklapper, StudiesWorld, thiemowmde, daniel, Bene, D3r1ck01, Izno, 
Wikidata-bugs, aude, JeroenDeDauw, Mbch331



_______________________________________________
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs