[00:23:07] If I already initially created a GitHub repo for a new MediaWiki extension, is there a convenient way to sync the Gerrit repository with the GitHub one? For example, https://gerrit.wikimedia.org/r/admin/repos/mediawiki%2Fextensions%2FPhotoSwipe is outdated compared to the latest pushes to https://github.com/jasonkhanlar/mediawiki-extensions-PhotoSwipe/ and I would like to push the same Git updates to the Gerrit repo as well.
[08:27:44] ryzenda: I think you'll need something like this: https://stackoverflow.com/questions/6882017/git-automatic-post-receive-hook
[09:55:16] Also, if I start working on a new extension from scratch, is it acceptable to request a Git repo on Gerrit to maintain the development history there, instead of using another Git hosting provider and mirroring from it later?
[10:09:00] ryzenda: yes, that's acceptable
[10:33:32] re: https://mediawiki.org/wiki/Manual:$wgAllowExternalImages are there any Wikimedia sites that have enabled hotlinking images?
[10:35:01] no
[10:35:27] On my MediaWiki site, I don't think I will have adequate disk space to host images/videos/uploads, so I am implementing the site to handle content from external sources. With a recent extension I wrote for my site, I am curious whether any other MediaWiki sites also have that capability. Notably, I am curious whether there are any places where I could show a demonstration of it in use, so others can see how it works.
[10:35:58] you can see settings for all Wikimedia wikis here: https://noc.wikimedia.org/conf/highlight.php?file=InitialiseSettings.php
[10:37:06] ryzenda, maybe InstantCommons is what you want/need? https://www.mediawiki.org/wiki/InstantCommons
[10:39:19] I was thinking about that and glanced at it a little; however, for some of the content that I may cite in pages, I am not sure about its copyright status, and how that would affect uploading or hosting the images somewhere where I would not have to worry about copyright concerns
[10:40:40] "Each of these files is available under a free content license or in the public domain"
[10:43:59] I could try to consult every single copyright owner of all the images that will be cited/hotlinked to see if they would be willing to have them published on InstantCommons, but even a single "no" would make it difficult to display that image the same way in wikitext without falling back on $wgAllowExternalImages
[11:00:01] Also, for externally hosted image URLs in wikitext, I was thinking I could prepare an extension that makes them load as if they were locally hosted images: the server requests the remote URL and then delivers the content to the user as if it were local, which sidesteps data-privacy concerns and the like
[11:01:03] Have you looked at potentially changing your hosting environment compared to writing a whole extension?
[11:01:42] I am currently using DreamHost, but otherwise I can't afford to upgrade to handle much right now.
[11:08:49] and answering the question, I most certainly have spent many hours over many days looking almost exhaustively at changing my hosting environment, glancing through many dozens of providers and all their features, across various different ways to host MediaWiki websites, and I keep doing this every few days/weeks too, but it's been a couple of weeks since I last checked.
[11:25:58] Oh, also, instead of an extension, maybe https://mediawiki.org/wiki/Manual:$wgHTTPProxy can handle converting the external image URLs in wikitext so they are presented as locally queried. I'll have to look into how this works.
[11:54:56] [[Tech]]; Hogweard; /* Code typing problem on the Old English Wikipedia */ Reply; https://meta.wikimedia.org/w/index.php?diff=23216594&oldid=23215546&rcid=23905204
[12:44:28] hi
[13:13:37] Hi there! Want to ask: is it possible to download Ukrainian wiki data?
[13:13:38] I found this
[13:13:38] https://dumps.wikimedia.org/backup-index.html
[13:13:38] but there's only "uawikimedia" and I assume that this is not wiki data
[13:14:30] uawikimedia is the Wikimedia Ukraine chaptet wiki
[13:14:34] You need ukwiki
[13:14:41] The language code for Ukrainian is uk
[13:14:49] UA is the country code for Ukraine
[13:14:57] *chapter
[13:15:22] That is assuming that what you want is to download a dump of Ukrainian Wikipedia
[13:15:55] Ohh, I see... I'm stupid ) thank you ..
[13:16:09] by the way, my surname is also Melnychuk ))
[13:16:44] Can you advise some articles on how to parse it, maybe?
[13:17:40] BrownQuetzal81: see "Using and re-using the dumps" on https://meta.wikimedia.org/wiki/Data_dumps
[13:21:14] legoktm ok, I'll check it .. thank you
[13:57:38] Is there a way to get a list of all edits to a project in some time period? Quarry, maybe? (I imagine Base might know … 😊 )
[13:58:54] I only really need 3 things: page title, username and date for what I want to do
[13:59:04] yes, you can use the revision table
[13:59:23] (may not include Flow edits, not sure)
[13:59:42] Flow edits missing would not be a problem at all
[14:01:16] SELECT page_title, actor_name, rev_timestamp FROM revision inner join page on rev_page = page_id inner join actor on rev_actor = actor_id WHERE rev_timestamp BETWEEN '20200101000000' AND '20200301000000';
[14:02:00] have not tested that yet
[14:03:03] thanks! I'm testing it now :)
[14:03:43] Of course, replacing the dates with the ones you are interested in (format YYYYMMDDHHMMSS)
[14:04:50] might want to include page_namespace as well if you care about namespace
[14:09:21] I'm trying it for a 1-day span, but it's taking a long time. What I eventually want to do is run it for a weekly or monthly span, but then maybe I risk a timeout or something?
[14:10:03] which wiki are you targeting?
[14:10:15] (it taking a long time isn't a problem in itself, I'm more afraid of timeouts)
[14:10:33] Lucas_WMDE, incubatorwiki. so not the most high-traffic in terms of edits, but not the lowest either
[14:10:46] hm, yeah, that feels like it should be okay
[14:12:17] the difference in the oldid between two edits I made roughly 24 hours apart is 777, so not a huge amount
[14:12:25] but would that work when you multiply it by 7 or 30?
[14:12:26] I would expect this all to be indexed
[14:13:01] so I'm surprised it's taking a long time just for a day
[14:14:32] The optimizer says that your query is using the rev_timestamp index and looking at about 1000 rows. I'm surprised it's not instant
[14:15:38] yeah, it's not clear to me what would make it take so long
[14:15:45] Jhs: Maybe something is wrong with Quarry. When I run it from the Toolforge command line it takes 0.01 seconds
[14:15:56] it'll have to check rev_deleted, but even so, it shouldn't need to load too many rows
[14:18:09] Jhs: if you have ssh access to Toolforge, I would suggest using that instead
[14:18:16] when I hit Stop, it gives me an alert() with a 500 error
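For reference, the 14:01:16 query written out in full, with the page_namespace column added as suggested at 14:04:50. This is only a lightly annotated restatement of the query discussed above, not something tested in the log; the timestamp bounds are placeholders to replace with whatever period you are interested in, and it can be pasted into Quarry or a Toolforge `sql incubatorwiki` session as-is.

```sql
-- Page title, namespace, username, and timestamp for every edit in a time window.
-- Timestamps use the MediaWiki format YYYYMMDDHHMMSS; adjust the bounds as needed.
SELECT
    page_namespace,
    page_title,
    actor_name,
    rev_timestamp
FROM revision
INNER JOIN page  ON rev_page  = page_id
INNER JOIN actor ON rev_actor = actor_id
WHERE rev_timestamp BETWEEN '20200101000000' AND '20200301000000';
```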
[15:48:50] I do have ssh access to Toolforge, but how can I run queries there?
[15:49:31] (I have almost zero experience with SQL queries; all I've ever done is copy and adapt some in Quarry)
[15:54:38] `sql enwiki` will give you a mysql prompt, and then you can just type in a query
[15:54:53] (or whatever wiki you want)
[16:01:48] oh, nice. will give that a shot now
[16:04:52] yeah, that was almost instantaneous, so there must be some problem with Quarry
[16:35:12] bawolff, the query worked and was almost instantaneous when I ran it on Toolforge 👍
[16:35:31] woo
[16:36:43] I guess I should also add that the query will not include any edits where the user has been revision-deleted/oversighted
[16:36:49] if that matters
[16:37:28] nah, doesn't matter – my idea is to make a dashboard for the Incubator to see which wikis have activity (how many edits, how many unique users) over time
[16:50:20] Jhs: Maybe the following query would be helpful for that: SELECT regexp_substr(page_title, "^[^/]*/[^/]*") 'Wiki', count(rev_id) 'Edits', count(distinct rev_actor) 'Unique users' FROM revision inner join page on rev_page = page_id WHERE rev_timestamp BETWEEN '20200101000000' AND '20200301000000' and page_namespace <= 1 group by 1;
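Since the stated goal at 16:37:28 is activity over time, one possible variation on the 16:50:20 query is to add a month bucket to the grouping. This is an untested sketch along the same lines as the query above, not something run in the log: it assumes the same Incubator prefix convention that the REGEXP_SUBSTR pattern relies on, a MariaDB replica that supports REGEXP_SUBSTR (10.0.5+), and the same placeholder date range as above.

```sql
-- Edits and unique editors per incubating wiki, bucketed by calendar month.
-- LEFT(rev_timestamp, 6) turns the YYYYMMDDHHMMSS timestamp into YYYYMM.
SELECT
    REGEXP_SUBSTR(page_title, '^[^/]*/[^/]*') AS `Wiki`,
    LEFT(rev_timestamp, 6)                    AS `Month`,
    COUNT(rev_id)                             AS `Edits`,
    COUNT(DISTINCT rev_actor)                 AS `Unique users`
FROM revision
INNER JOIN page ON rev_page = page_id
WHERE rev_timestamp BETWEEN '20200101000000' AND '20200301000000'
  AND page_namespace <= 1
GROUP BY 1, 2;
```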