matmarex added a comment.

  As I understand it, these are stored as regular MediaWiki pages now, so they
have a maximum length of 2 MB. Even naive queries that pull the whole thing
into memory would be fast enough at these scales. If we want to think about
performance for large data, we should first think about overcoming the length
limitation :)
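
  For illustration, a naive query over one of these pages could look like the
sketch below. This is a minimal Python sketch, not how any existing service
does it; it assumes the JsonConfig tabular format (column definitions under
schema.fields, rows under data) and uses a hypothetical page name and a
made-up filter predicate.

  import json
  import urllib.parse
  import urllib.request

  # Hypothetical page name; any Data:*.tab page on Commons would do.
  PAGE = "Data:Example.tab"
  url = ("https://commons.wikimedia.org/w/index.php?title="
         + urllib.parse.quote(PAGE) + "&action=raw")

  # action=raw returns the page's JSON content; at a 2 MB cap it is
  # cheap to read the whole thing into memory.
  with urllib.request.urlopen(url) as resp:
      page = json.load(resp)

  # JsonConfig tabular pages keep column definitions under schema.fields
  # and the rows as a list of lists under data.
  columns = [field["name"] for field in page["schema"]["fields"]]
  rows = page["data"]

  # The "naive query": a linear scan over every row. The predicate here
  # (match on the first column) is just an example.
  matches = [dict(zip(columns, row)) for row in rows if row[0] == "foo"]
  print(len(matches), "matching rows out of", len(rows))

  Even a full scan like this stays well under typical request latencies at
2 MB, which is the point: the page length limit, not query strategy, is the
binding constraint today.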

TASK DETAIL
  https://phabricator.wikimedia.org/T120452
