[13:49:34] urbanecm: after being annoyed by it for a year, I just went into the settings and... yes, I can just turn it off ;)
[13:49:41] :D
[13:54:03] i just never discovered that feature
[15:15:53] [[Tech]]; 5.191.129.129; /* Mugdat 212@gmail.com */ new section; https://meta.wikimedia.org/w/index.php?diff=25823027&oldid=25820218&rcid=28995387
[15:16:13] [[Tech]]; Tegel; Reverted changes by [[Special:Contributions/5.191.129.129|5.191.129.129]] ([[User talk:5.191.129.129|talk]]) to last version by JJMC89; https://meta.wikimedia.org/w/index.php?diff=25823028&oldid=25823027&rcid=28995388
[15:16:52] [[Tech]]; 5.191.129.129; /* Mugdat 212@gmail.com */ new section; https://meta.wikimedia.org/w/index.php?diff=25823029&oldid=25823028&rcid=28995389
[15:16:59] [[Tech]]; Tegel; Reverted changes by [[Special:Contributions/5.191.129.129|5.191.129.129]] ([[User talk:5.191.129.129|talk]]) to last version by Tegel; https://meta.wikimedia.org/w/index.php?diff=25823030&oldid=25823029&rcid=28995390
[18:32:13] is anyone good with php performance
[18:32:24] trying to debug the wikistats wikia page
[18:32:41] which it seems is too big for the way the code is written
[18:32:41] do you have a profiler or such?
[18:32:58] katia: no, i haven't profiled it
[18:33:26] that might help unless you're sure where the problem is
[18:33:46] katia: i know what the problem is, there's 70,000 wikia wikis logged
[18:34:08] and the code is trying to load all of them into one html page to generate the table
[18:34:19] https://gitlab.wikimedia.org/cloudvps-repos/wikistats/-/blob/master/var/www/wikistats/display.php?ref_type=heads
[18:34:39] https://stats.wikimedia.org is it this page?
[18:34:44] which causes aw snap on https://wikistats.wmcloud.org/display.php?t=wi
[18:35:05] i tried https://gitlab.wikimedia.org/cloudvps-repos/wikistats/-/merge_requests/4#diff-content-37521090198c076027c9d4593edfe0a462fb6ccc and deployed it to https://wikistats.wmcloud.org/display.php?t=wi which just OOMs
[18:35:13] katia: no, the other wikistats
[18:35:51] the PR is https://wikistats.wmcloud.org/display2.php?t=wi
[18:36:51] hmm that works for me RhinosF1
[18:36:58] takes a while to load
[18:37:10] katia: which, display.php or display2.php?
[18:37:17] display.php
[18:37:27] but yeah, you are putting a lot of data in the DOM i suppose
[18:37:35] katia: it doesn't for multiple others, has the browser actually finished?
[18:37:41] the aw snap is a problem with the browser OOMing
[18:37:50] it takes a long time for me before the browser goes aw snap
[18:37:57] so the problem is not actually in the php, more in how the table is made
[18:38:02] yes i guessed that
[18:38:12] but how do we make a less shit way of rendering this
[18:38:29] so a way could be to make the initial php page empty, and use a table library
[18:38:38] and fill it with multiple requests to an API, paginating
[18:39:20] doesn't need to be a library either really. perhaps just a way to load more once you are at the bottom of the page would work, depending on your use of this page
[18:40:54] another way could be to just fill the table with n rows, and at the bottom have a 'next page' link? then you can start at the offset of page * 'n of items in table' (see the sketch just below)
[18:41:12] that wouldn't require you to write JS
[18:41:18] it has pages
[18:41:26] but they're all built in html
[18:41:30] * RhinosF1 is not a web dev
[18:42:11] where are the pages?
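A minimal sketch of the no-JS 'next page' idea discussed above, in PHP since that is what wikistats uses. The `wikis` table, its `name` and `good` (article count) columns, and the credentials are hypothetical placeholders, not the real wikistats schema:

    <?php
    // Server-side pagination sketch: emit one page of rows per request
    // instead of all 70,000 wikis in a single table.
    // Assumed schema: table `wikis` with columns `name` and `good`.
    $pdo = new PDO('mysql:host=localhost;dbname=wikistats', 'user', 'pass');

    $perPage = 500;                               // n rows per page
    $page    = max(1, (int)($_GET['page'] ?? 1)); // 1-based page number
    $offset  = ($page - 1) * $perPage;            // start row of this page

    // Only one page of rows ever leaves the database.
    $stmt = $pdo->prepare(
        'SELECT name, good FROM wikis ORDER BY good DESC LIMIT :lim OFFSET :off'
    );
    $stmt->bindValue(':lim', $perPage, PDO::PARAM_INT);
    $stmt->bindValue(':off', $offset, PDO::PARAM_INT);
    $stmt->execute();

    echo "<table>\n";
    foreach ($stmt as $row) {
        printf("<tr><td>%s</td><td>%d</td></tr>\n",
            htmlspecialchars($row['name']), $row['good']);
    }
    echo "</table>\n";

    // A plain link re-enters the script at the next offset; no JS required.
    printf('<a href="?page=%d">next page</a>', $page + 1);

Plain LIMIT/OFFSET gets slower at very large offsets, but it keeps each response small enough that neither PHP nor the browser hits its memory limit.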
[18:42:57] look at what the file does on gitlab
[18:43:02] as to how it is building it
[18:44:34] this is using the bootstrap framework
[18:44:37] to create the tables
[18:44:43] bootstrap has a pagination feature
[18:44:59] imho this is something to check upstream with the bootstrap community
[18:45:25] but before that even becomes relevant.. first let's focus on how to get the 250k current fandom wikis into the mysql table
[18:45:37] so more like "do they even provide an API to fetch the list of wikis"
[18:45:55] then "can we fetch stats from 250k wikis in a reasonable time with the current update.php"
[18:46:05] and only if that is working.. do we have to worry about the display issue
[18:46:21] and we can still just LIMIT the mysql query to only fetch the 1000 largest ones or something (see the sketch at the end of the log)
[18:46:53] if the other things are already a no-go then we don't need to worry about the display issue
[18:47:02] the current 70k wikis in the DB are outdated anyways
[18:47:24] the "next page" link etc.. that is all included in bootstrap
[18:47:35] and it works for tables that are not as extreme as this one
[18:47:48] this one just pushes the limits somehow somewhere
[18:48:24] mutante: it's not a bootstrap issue imho
[18:48:36] it's that the way the code is doing it is not fit for purpose
[18:48:55] the bootstrap stuff never gets called on the templates because it can't even render the html
[18:49:16] it can't even load an html file that big properly, never mind execute it
[18:50:07] bootstrap is generating the HTML it can't render
[18:50:09] afaict
[18:50:28] no
[18:50:37] it was specifically introduced to generate the table code
[18:50:44] and add things like pagination
[18:50:58] as you can see on working tables
[18:50:58] it's 1,131,660 lines of html
[18:51:09] that's never going to load
[18:51:11] or 94M
[18:51:19] sure, but that's not a contradiction
[18:51:33] we agree it's just too many wikis in one table
[18:51:37] when I changed it to generate the full display before sending it, PHP OOM'd
[18:51:43] it's too much html
[18:52:13] yea, not enough RAM for that
[18:52:14] https://www.irccloud.com/pastebin/Id42hfpT/
[18:52:23] mutante: there's enough ram
[18:52:28] i can generate it locally
[18:52:32] well, you showed me the OOM error
[18:52:36] just not enough for a web request
[18:53:11] it's not the machine killing php
[18:53:54] i am testing something
[18:54:19] i pre-generated the html to see how long loading just the HTML takes in the browser
[18:54:21] https://wikistats.wmcloud.org/wikia.html
[18:55:17] it's not bootstrap causing the OOM though
[18:55:28] it's the fact it's 1 million lines of html
[18:55:47] chrome still OOMs on pregenerated html
[19:15:10] wfm on firefox :D
[19:50:03] Nemo_bis: weird
[19:50:11] although it's memory so it'll vary
[19:50:16] it might be better soon
[19:50:20] i'm updating the data
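For completeness, a sketch of the LIMIT fallback mutante suggested at 18:46: fetch and render only the 1000 largest wikis, so the page stays far below the ~94M / 1.1-million-line HTML that OOM'd both PHP and Chrome. Same hypothetical `wikis`/`good` schema as the pagination sketch above:

    <?php
    // Render only the 1000 largest wikis instead of the full table.
    // Assumed schema as above: table `wikis`, article-count column `good`.
    $pdo  = new PDO('mysql:host=localhost;dbname=wikistats', 'user', 'pass');
    $rows = $pdo->query(
        'SELECT name, good FROM wikis ORDER BY good DESC LIMIT 1000'
    )->fetchAll(PDO::FETCH_ASSOC);

    echo "<table>\n";
    foreach ($rows as $row) {
        printf("<tr><td>%s</td><td>%d</td></tr>\n",
            htmlspecialchars($row['name']), $row['good']);
    }
    echo "</table>\n";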