[02:03:36] Hi, I would like to create a page on a wiki from an external JavaScript app. I have the page title and content. How can I shove the suggested content into the article body when creating the page? Like https://test.wikipedia.org/w/index.php?title=Gryllida2&preload=Gryllida&action=edit but in my case I don't have the luxury of the content already existing on a wiki page.
[02:58:50] gry: you can't
[02:59:09] moonmoon: how do others do it then? like apps which upload from Flickr to Commons?
[02:59:09] well, let me clarify
[02:59:25] you can't pre-populate page content for a user to then amend and submit via the UI
[02:59:53] you *can* make a POST request to api.php and specify the content directly, provided you have the requisite tokens for the user you're editing as
[03:00:29] (the api.php method will create/save the page without user interaction/intervention)
[03:08:28] moonmoon: OK, which API is this POST for?
[03:08:53] https://www.mediawiki.org/wiki/API:Edit
[03:14:41] moonmoon: I go to https://test.wikipedia.org/w/api.php?action=query&meta=tokens, take the token to https://test.wikipedia.org/wiki/Special:ApiSandbox#action=edit&format=json&title=Gryllida2&text=Hello%20world%0AThis%20is%20a%20test.&summary=Created%20the%20page%20via%20api%20sandbox.&token=&formatversion=2 and it says invalid token
[03:15:26] you either used the wrong token, or you aren't presenting cookies
[03:15:48] Special:ApiSandbox is going to be your friend here
[03:15:52] or apps like Postman
[03:19:29] moonmoon: I'm using the API sandbox, it provides me with these URLs. You can try it if you like. I couldn't get the right token from it.
[03:22:42] what does the token you're putting in end with?
[03:22:54] (last 3 characters)
[03:28:52] the token should end with +\ (one backslash).
The JSON representation contains +\\ because backslashes need to be escaped in JSON, but passing +\\ back verbatim will be invalid
[03:29:13] if the token is +\ without anything before it, then you're being treated as logged out/anonymous
[03:29:18] ah, two backslashes
[03:29:19] ok
[03:30:00] works now, thanks
[03:30:11] (the choice of +\ was intentional back in the day, to act as a trap for misconfigured or miscoded scripts)
[03:30:37] it's a pain for my script, I'll chop off the last symbol :-(
[03:30:47] use a proper JSON parser, and then it'll work just fine
[03:31:03] literally every programming language that isn't C has one
[03:31:12] right; for example, https://test.wikipedia.org/w/api.php?action=query&meta=tokens&format=xml works ok
[03:31:17] (C has one too, it just isn't in the stdlib)
[03:34:56] thanks for your help moonmoon, this wasn't obvious
[03:59:13] if you have an example I'd appreciate it; I'm working outside of the wiki, so mw.api stuff is not available
[04:33:31] example scripts are on the linked page, or you can use e.g. pywikipediabot
[04:37:12] looking for JS, thanks, I will check out the page
[20:36:31] dcausse: I finally have access to the server again where I saw all those cirrusSearchElasticaWrite jobs… and are you sure those only get created when there are errors? (our last discussion on this was on 10 March ^^)
[20:36:57] I see various cirrus jobs, including cirrusSearchLinksUpdatePrioritized, cirrusSearchIncomingLinkCount and cirrusSearchElasticaWrite, and they all seem to succeed
[20:37:33] and looking at the code in includes/Updater.php, it looks to me like cirrusSearchElasticaWrite should be queued in normal situations, including updatePages()
[20:37:50] (both in REL1_39 and REL1_43)
[20:47:42] lucaswerkmeister: weird...
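The api.php edit flow discussed earlier (fetch a CSRF token, keep its trailing +\ intact, then POST to action=edit) can be sketched in plain JavaScript with fetch. This is a sketch only: the helper names (extractCsrfToken, buildEditBody, createPage) are illustrative, not part of any MediaWiki client library, and it assumes the login session is carried by cookies (same-origin page or credentials: 'include').

```javascript
// Sketch of the token + edit flow against the MediaWiki Action API.
// Assumes session cookies are sent with the request; helper names are illustrative.
const API = 'https://test.wikipedia.org/w/api.php';

// Pull the CSRF token out of an action=query&meta=tokens&format=json response.
// A proper JSON parser already unescapes the "\\" on the wire, so the token
// ends in a single backslash -- do not trim characters by hand.
function extractCsrfToken(response) {
  const token = response.query.tokens.csrftoken;
  if (token === '+\\') {
    // A bare "+\" token means the request was treated as logged-out/anonymous.
    throw new Error('Anonymous token received - cookies/login missing?');
  }
  return token;
}

// Build the POST body; URLSearchParams percent-encodes the token correctly.
function buildEditBody(title, text, summary, token) {
  return new URLSearchParams({
    action: 'edit',
    format: 'json',
    title,
    text,
    summary,
    token,
  });
}

async function createPage(title, text, summary) {
  const res = await fetch(`${API}?action=query&meta=tokens&format=json`, {
    credentials: 'include',
  });
  const token = extractCsrfToken(await res.json());
  const edit = await fetch(API, {
    method: 'POST',
    credentials: 'include',
    body: buildEditBody(title, text, summary, token),
  });
  return edit.json();
}
```

Usage would look like createPage('Gryllida2', 'Hello world\nThis is a test.', 'Created via the API'); on success the API responds with an object whose edit.result field is "Success".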
in https://gerrit.wikimedia.org/g/mediawiki/extensions/CirrusSearch/+/553d9214143c0574cd0a47c3b8762554cf576bec/includes/Updater.php#464 ElasticaWrite jobs are only queued if $clusterSettings->isIsolated(), otherwise they're run inline
[20:47:58] haven't checked older mw versions though, but I don't think that behavior is new
[20:54:19] 1.39 should run inline (https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/CirrusSearch/+/refs/heads/REL1_39/includes/Updater.php#426), not sure what's happening...
[21:12:08] oh, I see, I didn't look at that line
[21:12:39] let me see if I can figure out whether it's isolated or not for me
[21:17:46] dcausse: the default for $wgCirrusSearchWriteIsolateClusters is null, and according to https://gerrit.wikimedia.org/r/plugins/gitiles/mediawiki/extensions/CirrusSearch/+/refs/heads/REL1_39/includes/ClusterSettings.php#98 null apparently means “all clusters are isolated”, not “no clusters are isolated”
[21:18:31] whether that's correct or a bug, I can't say :)
[21:19:49] lucaswerkmeister: oh, my bad, sorry about that, I did not realize it was behaving like that...
[21:20:23] no problem! :D
[21:20:24] that means I'm a bit puzzled about your jobqueue issue :)
[21:20:43] hm, why? I thought that means everything is basically fine
[21:21:15] except that the server isn't running jobs quite fast enough, which https://www.mediawiki.org/wiki/Manual:Hooks/ApiMaxLagInfo#Example helps with a bit
[21:21:25] (and I'll also try to add another job runner or two)
[21:22:02] but if I understand correctly, the cirrusSearchElasticaWrite jobs are expected (since it's treating the clusters as isolated ⇒ jobs are queued rather than run inline)
[21:22:38] lucaswerkmeister: if it's "just" a throughput issue, sure
[21:23:33] lucaswerkmeister: yes, they are, but I'm unsure if this is what we intended...
I wonder if we could skip this additional cirrusSearchElasticaWrite for "simple" setups
[21:23:43] I think it's just throughput, yeah
[21:23:50] I can file a task if you like ^^
[21:23:59] lucaswerkmeister: please do :)
[21:28:39] filed T389429 :)
[21:28:40] T389429: Investigate whether it's intentional / correct that default CirrusSearch setups run cirrusSearchElasticaWrite as separate jobs - https://phabricator.wikimedia.org/T389429
[21:28:51] thanks a lot for looking into it!
[21:31:46] thanks! I'm pretty sure that extra hop is not necessary and will save some resources on non-WMF wikis
[21:32:34] there's a lack of good 3D extensions for MediaWiki
[21:33:23] 3D extension being...?
[21:33:56] extensions that can render 3D objects (.stl, .glb, etc. files)
[21:34:02] https://www.mediawiki.org/wiki/Extension:3D ?
[21:34:25] I checked Extension:3D, I suppose it is the best one
[21:34:52] the thing is, Extension:3D has a laborious installation... it asks you to download 3d2png
[21:35:22] 3d2png only runs in Docker, so you have to set up Docker and everything (yes, there is no alternative way without a Docker container)
[21:35:22] https://commons.wikimedia.org/wiki/File:Puchar_Wikipedia_10_cm.stl#/media/File:Puchar_Wikipedia_10_cm.stl
[21:35:56] the other issue with it is that .stl does not support textures
[21:36:17] e.g., I am trying to render a Minecraft build, and it doesn't look as pleasing without the textures (blocks can't be differentiated)
[21:36:51] .glb/.gltf on the other hand seems rather the standard these days (WebGL)
[21:36:52] you don't need Docker to run 3d2png
[21:37:54] > The deploy repository needs to be built on a system as similar to the production hosts as possible. For this reason, we use the service-runner package, which spins up a Docker container based on the definition provided in the deploy stanza of package.json, installs the distribution packages needed, builds the node_modules directory and updates the source repo submodule.
[21:37:55] I suppose I misread the instructions
[21:37:55] it's just node.js
[21:38:00] i see
[21:38:08] yeah, service-runner should take care of all that
[21:38:34] but yeah, STL-only support is the other drawback
[21:38:47] patches welcome
[21:39:12] yes, I am currently working on either improving the existing 3D extension or forking it
[21:39:29] Google has this really nice JavaScript library (modelviewer.dev); you can visit the site and see nice examples
[21:39:39] and it supports WebGL models
[21:40:30] the plan is to use the JavaScript library for rendering 3D models, but I have yet to get a grasp of the MediaWiki extension landscape :(
[21:41:15] from what I gathered, you have to register a MediaHandler of sorts?
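For the rendering half of the extension idea above, one option is to emit markup for Google's <model-viewer> web component (modelviewer.dev), which renders .glb/.gltf files with textures in the browser. This is only a sketch: buildModelViewerHtml and escapeAttr are hypothetical helpers, not part of Extension:3D, and the PHP-side MediaHandler/ResourceLoader wiring (plus loading the model-viewer module script on the page) is not shown.

```javascript
// Sketch: generate the <model-viewer> markup a hypothetical 3D MediaHandler
// could emit for a .glb/.gltf file. Helper names are illustrative only.

// Minimal HTML attribute escaping so file names/captions can't break the markup.
function escapeAttr(value) {
  return String(value)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/"/g, '&quot;');
}

// camera-controls is a standard <model-viewer> attribute enabling drag-to-orbit;
// the page must separately load the model-viewer module script.
function buildModelViewerHtml(srcUrl, altText) {
  return (
    `<model-viewer src="${escapeAttr(srcUrl)}" ` +
    `alt="${escapeAttr(altText)}" camera-controls></model-viewer>`
  );
}
```

For example, buildModelViewerHtml('/images/build.glb', 'A Minecraft build') yields a single <model-viewer> tag pointing at the uploaded file, which keeps the server-side handler trivial and leaves all rendering to the client.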