[00:00:58] Okay
[00:01:24] or you can use the option Reedy presented above, but that's more risky of course
[00:01:51] Yeah
[00:02:10] Would making myself the only interface admin and setting a very good password on my account help?
[00:02:23] Or a separate interface admin account with a password not in my password manager
[00:02:50] whichever account can make interface admins needs good protection too
[00:10:06] A bureaucrat account can give itself any other rights, right?
[00:10:13] yes
[00:13:15] Okay, I took bureaucrat and interface admin rights from myself and the other sysop, and created a new account with those rights. The new account has a ten-word xkcd password which is not stored on any computer other than MediaWiki's DB. Is it safe to enable CSS on restricted pages now?
[00:15:57] You could just use a one-character password if you wanted
[00:16:01] The risk is up to you
[00:16:16] No one is going to tell you that account couldn't be compromised
[00:16:47] Yeah, I know that.
[00:17:15] 2FA/MFA probably provides more protection
[00:17:23] Okay
[13:02:31] o/ mediawiki
[13:02:48] which of these versions does MediaWiki use by default?
[13:02:50] if a page is called MyPage, we'd have MyPage.textile and Mypage//
[13:02:50] or maybe MyPage/attachments//
[13:02:50] or Mypage-attachments//
[13:02:55] thanks Irelativism
[13:17:17] that doesn't sound like what MediaWiki does at all
[13:50:07] Hi everyone, I am a developer and would like to start programming for a Wikimedia project
[14:18:50] taavi: it's not MediaWiki-specific
[14:18:55] it's just an example
[14:19:09] to understand the attachment structure
[14:19:25] btw, something is going on with the IRC bridge
[14:19:42] it keeps redirecting to newly created rooms
[14:19:56] is there a way to add JavaScript code per wiki page to enable some small interactive games / visualizations (controlled access / no security concerns)?
[14:32:13] SeriousFun01[m]: No, you'll have to add the script globally, and check the current page name to decide whether to load or execute it.
[14:34:09] I also found this Extension:UseResource, which allows a userscript / userstyle per page, but there is little in terms of documentation. The main thing missing is whether I can insert a new div element that the JS can start manipulating
[14:34:59] ^^... (full message at https://libera.ems.host/_matrix/media/r0/download/libera.chat/a1e1054d47b2c8ca2d50f5858083ec32983c41c7)
[14:35:34] *MyPage.mw
[14:35:40] not textile
[14:45:22] SeriousFun01[m]: You can insert a div with a specific id or class on the page, then get the current page title with mw.config.get('wgPageName') and execute the script. Or simply search for that id/class on every page
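A minimal sketch of the site-wide script approach described above, e.g. placed in MediaWiki:Common.js: it activates only on one chosen page and inserts a container div for the interactive content. The page name 'Portal/Games' and the id 'game-root' are placeholder assumptions, not names from the discussion.

$(function () {
    // Only run on the one page that hosts the game/visualization.
    // 'Portal/Games' is a hypothetical page name.
    if (mw.config.get('wgPageName') !== 'Portal/Games') {
        return;
    }
    // Insert a div that the interactive script can manipulate.
    var root = document.createElement('div');
    root.id = 'game-root';
    document.getElementById('mw-content-text').appendChild(root);
    // ...initialize the game/visualization inside #game-root here...
});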
[14:47:25] M3RELATIVISM[m]: you are not talking about MediaWiki at all
[15:28:12] Remilia: indeed I am
[15:28:39] MediaWiki does not use the URL scheme you mentioned
[15:28:46] my question is: when there are files inside a MediaWiki article page, what does the path structure look like?
[15:28:54] is it in some directory
[15:28:55] MediaWiki does not have 'attachments'
[15:29:16] you could try https://www.mediawiki.org/wiki/Help:Images
[15:29:21] yes I know, attachments/media, call it what you wish :)
[15:29:36] my point is when there are files inside a page
[15:29:43] there are no files inside a page
[15:29:48] what does the path structure look like?
[15:30:01] "/file"
[15:30:13] "file"
[15:30:16] the web server path to uploads is defined in LocalSettings.php
[15:30:24] or like "/media/file"
[15:30:25] wikitext uses the File: namespace
[15:30:41] please refer to the link I just posted
[15:31:36] you can literally open any MW-based wiki and inspect an image
[15:31:47] and then look at the wikitext source
[15:32:10] I don't have my instance running yet
[15:32:26] also there might be non-default configurations for that, Remilia ;)
[15:32:57] can you explain what exactly you are missing in the Installation and Configuration manual?
[15:33:24] https://www.mediawiki.org/wiki/Manual:Configuring_file_uploads
[15:34:38] "can you explain what exactly you are missing in the Installation and Configuration manual?" that is kind of unrelated to my question
[15:34:58] M3RELATIVISM[m]: see https://pbarcwiki.kj7rrv.com/index.php?title=User:KJ7RRV https://pbarcwiki.kj7rrv.com/index.php?title=User:KJ7RRV&action=edit and https://pbarcwiki.kj7rrv.com/index.php?title=File:Leo.jpg
[15:34:59] Remilia: I don't seem to find any info regarding paths on https://www.mediawiki.org/wiki/Manual:Configuring_file_uploads
[15:35:15] M3RELATIVISM[m]: have you considered reading through it?
[15:35:38] I did :)
[15:35:45] have you read the "Upload directory" section?
[15:36:00] English is not my native language though
[15:36:36] click the button that says English in the top right
[15:36:55] select your native language
[15:37:18] thanks
[15:38:45] English is not my native language either
[15:39:32] ok then, so if I understood it correctly
[15:39:43] hashes are used instead of IDs like in Redmine
[15:40:02] and files are stored in a directory called /images
[15:40:45] * M3RELATIVISM[m] goes to look where that /images would be
[15:42:28] M3RELATIVISM[m]: that section tells you that there is $wgUploadDirectory and links to https://www.mediawiki.org/wiki/Manual:$wgUploadDirectory
[15:42:55] yes, I'm reading it
[15:43:06] looks like the first 2 characters are used
[15:43:57] in separate subdirectories
[15:44:13] Remilia: why is hashing necessary?
[15:44:15] this will complicate our migration extensively
[15:44:16] we would need to add said function to our migration scripts :s
[15:45:17] the manual page I linked also links to https://www.mediawiki.org/wiki/Manual:$wgHashedUploadDirectory
[15:45:27] have you considered reading that?
[15:45:41] I have read it already
[15:46:12] including "If false, all images are uploaded in $wgUploadDirectory itself. (e.g. $IP/images/foo.jpg)"?
[15:46:41] cool, cheers
[15:46:57] "we would need to add said function to our migration scripts" I cannot see why
[15:47:08] you can use maintenance scripts to upload media
[15:47:08] where is $IP/? the server root directory, right?
[15:47:19] or the API itself
[15:47:24] "maintenance scripts"?
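A sketch of the layout implied by $wgHashedUploadDirectory, for readers following along: MediaWiki derives the storage path from the MD5 of the file's name in database-key form (spaces become underscores; the first letter may also be capitalized depending on wiki config), using the first one and first two hex characters as nested subdirectories. A migration script could predict paths this way, though the API and maintenance-script routes discussed next avoid the need. A Node.js sketch, not MediaWiki's own code:

// Predict where MediaWiki would store an upload when
// $wgHashedUploadDirectory is true (the default).
const crypto = require('crypto');

function hashedUploadPath(filename) {
    const key = filename.replace(/ /g, '_'); // database-key form of the name
    const md5 = crypto.createHash('md5').update(key).digest('hex');
    // e.g. images/a/ab/Some_file.jpg, where "a" and "ab" come from the hash
    return `images/${md5[0]}/${md5.slice(0, 2)}/${key}`;
}

console.log(hashedUploadPath('Leo.jpg'));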
[15:47:33] background
[15:47:58] in the replicant.us project we are migrating our wiki off Redmine
[15:48:02] and onto MediaWiki
[15:48:06] let me type "mediawiki maintenance scripts" into a search engine for you
[15:48:19] https://www.mediawiki.org/wiki/Manual:Maintenance_scripts
[15:48:40] Remilia: your condescension is amusing me
[15:49:01] I thought you were referring to migration scripts as maintenance scripts, btw
[15:49:13] back on topic, so we can avoid duplication with others
[15:49:24] > background
[15:49:42] I think your developers need to look into the MW API themselves
[15:49:49] it is very straightforward and rather stable
[15:50:44] because you can have a direct link to an image hosted on MW by using https://www.mediawiki.org/wiki/Help:Images#Direct_links_from_external_sites
[15:50:53] "there are no 'your developers'", I'm trying to help with it btw. anyway, in the replicant.us project we are migrating our wiki off Redmine into MediaWiki, and we are creating a script to export all edits into pages without losing history
[15:51:04] we are using git for this
[15:51:29] so that we can use a git remote later to import things into our own MediaWiki instance
[15:52:38] therefore we need to understand what structure our MediaWiki will have when it comes to attachments/files/images, so we can translate that into a git repository upon export from Redmine
[15:53:18] Remilia: now you probably understand better why I'm asking all these seemingly unnecessary questions ;)
[15:54:08] MediaWiki tracks all uploaded files in the database. They should not only be placed on the filesystem, but also "registered" in MediaWiki. Otherwise, MediaWiki can't tell what dimensions a file has, its format, number of pages (if PDF or TIFF), whether it is animated, etc.
[15:54:14] you are not supposed to use git for that
[15:54:32] MW will not be able to pick up your files
[15:55:00] this is why I said you could use the API to upload them
[15:55:12] you could also use maintenance scripts to import files
[15:55:52] https://www.mediawiki.org/wiki/Manual:ImportImages.php
[15:56:49] Remilia: git != git mediawiki:remote
[15:56:50] sounds good, Vulpix, thanks
[15:56:57] how do you suggest, or do you have any idea, what would be the best way of doing it in our situation then?
[15:57:51] so one would need to use the API after the fact for a proper import
[15:57:53] you did not provide enough information on what the requirements are
[15:58:19] do you need to use files hosted on the wiki somewhere else?
[15:58:38] yes, that is where I really wanted to get to
[15:58:51] [18:50:43] because you can have a direct link to an image hosted on MW by using https://www.mediawiki.org/wiki/Help:Images#Direct_links_from_external_sites
[15:58:53] So in the future, once the infrastructure is set up
[15:59:08] we wish to release, together with official replicantOS images
[15:59:28] an offline client for wikimedia; we were thinking of Kiwix
[16:00:06] so our thinking is that the optimal solution is to distribute files through said clients
[16:00:34] so we don't rely on a single central media server
[16:00:36] wikimedia is not mediawiki
[16:00:52] has anyone ever experimented with this kind of idea
[16:01:12] Remilia: true, wikimedia != mediawiki
[16:01:23] wikimedia foundation -> wikipedia
[16:01:29] anyway, moving on
[16:01:51] well, I am not sure what you meant when you said 'an offline client for wikimedia'
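A hedged sketch of the API upload route suggested above (action=upload), which registers the file in MediaWiki's database instead of writing into images/ directly. It assumes Node 18+ (global fetch, FormData, Blob) and a hypothetical `session` fetch wrapper that preserves login cookies; authentication is omitted for brevity, and the API URL is a placeholder:

const API = 'https://wiki.example.org/w/api.php'; // placeholder URL

async function uploadFile(session, filename, bytes) {
    // 1. Fetch a CSRF token for the logged-in session.
    const tokenRes = await session(API + '?action=query&meta=tokens&format=json');
    const token = (await tokenRes.json()).query.tokens.csrftoken;

    // 2. POST the file as multipart/form-data.
    const form = new FormData();
    form.append('action', 'upload');
    form.append('format', 'json');
    form.append('filename', filename);
    form.append('token', token);
    form.append('file', new Blob([bytes]), filename);
    const res = await session(API, { method: 'POST', body: form });
    return res.json();
}

For a bulk migration, the importImages.php maintenance script linked above does the same registration server-side.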
[16:02:04] if we would like to have a distributed-like system, is something like that implemented for MediaWiki?
[16:02:08] or not really
[16:02:23] and we should use the main server for that for starters
[16:02:45] Remilia: remember, English is not my native language
[16:02:51] we will use Kiwix
[16:03:18] I call it a client; you can call it whatever you prefer, or believe is technically more correct :)
[16:03:25] so
[16:03:52] when you say 'wikimedia client', you mean a client for all wikis under the Wikimedia Foundation umbrella?
[16:03:59] offline?
[16:04:09] that is an insane amount of data
[16:04:34] no, just our wiki
[16:04:56] I meant this: [18:59:28] an offline client for wikimedia
[16:04:58] we want to package an offline version of our wiki with our image
[16:05:14] sure, for our wikimedia
[16:05:25] I see
[16:05:26] not wikimedia in general
[16:05:53] by "our" I mean our Replicant instance
[16:05:59] docs.replicant.us
[16:06:02] I am not sure what you did not understand here then: [19:00:36] wikimedia is not mediawiki
[16:06:08] or wiki.replicant.us
[16:06:33] Remilia: you are going around in circles :P
[16:06:37] MediaWiki is the software, right? are you talking about your instance of the MediaWiki software?
[16:06:49] or your instance of the Wikimedia Foundation?
[16:07:11] no, like I said, English is not my primary language so I am trying to understand
[16:07:12] I'm talking about docs.replicant.us
[16:07:19] yes, the Replicant MediaWiki instance
[16:07:32] Remilia: ohh, ok, my bad, sorry
[16:07:33] if you want an offline version
[16:07:40] just package a Docker image that runs locally
[16:07:47] yes, we are thinking of using Kiwix
[16:07:53] with all the files and whatnot
[16:08:11] Replicant will probably never package or run Docker
[16:08:18] given it is an FSDG distribution
[16:08:36] what are you trying to accomplish?
[16:08:47] I'm getting serious XY Problem vibes here
[16:09:04] so the plan is to use Kiwix for an offline Replicant MediaWiki
[16:09:18] and given we are doing that
[16:09:41] we intend to distribute files over those peers
[16:09:53] instead of relying on a central media server
[16:10:04] ok... that's a thing that Kiwix supports, yes
[16:10:15] ok, nice, thanks
[16:10:21] * M3RELATIVISM[m] goes to look into it
[16:10:42] just use your online wiki normally, including uploading files normally. mwoffliner will take care of pulling all of that down and packaging it for you when given the correct flags
[16:23:44] "SeriousFun01: You can insert any..." <- looks like what I want to do is possible without being too hacky. will give it a try. many thanks
[16:30:43] sorry, my VMs crashed. moonmoon, the issue there is that there is a single root of trust
[16:30:51] prone to failure
[16:31:18] are there any efforts or forks of MediaWiki released in a more distributed way?
[16:31:19] not sure what you mean by single root of trust?
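Related to the "direct links from external sites" suggestion earlier: rather than hard-coding the hashed images/ path, a client can ask the Action API for a file's canonical URL. A small sketch along the same lines as above; the API URL is again a placeholder:

const API = 'https://wiki.example.org/w/api.php';

async function directFileUrl(fileTitle) { // e.g. 'File:Leo.jpg'
    const params = new URLSearchParams({
        action: 'query',
        format: 'json',
        titles: fileTitle,
        prop: 'imageinfo',
        iiprop: 'url',
    });
    const data = await (await fetch(API + '?' + params)).json();
    const page = Object.values(data.query.pages)[0];
    return page.imageinfo[0].url; // direct URL to the original file
}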
[16:31:38] kiwix/mwoffliner works by pointing it at a specific wiki and making an offline version of it
[16:31:41] you are dependent on a main media server
[16:31:57] when files are originally uploaded locally
[16:32:03] so you're limited to the pages and files visible to that wiki
[16:32:18] for the uploaded files, those can come from potentially many different sources, see $wgForeignFileRepos
[16:32:20] sure, that is not what I'm discussing
[16:32:41] media server != MediaWiki Replicant instance
[16:32:58] for example, Wikipedia has access to files uploaded to both Wikipedia itself as well as files uploaded to Commons
[16:33:21] that is what I mean: whether there have been any experiments using a distributed media server
[16:33:40] instead of the centralized one normally used in MediaWiki instances
[16:33:46] that's something you'll configure on the wiki's end
[16:33:59] sure, true
[16:34:04] https://www.mediawiki.org/wiki/Manual:$wgForeignFileRepos
[16:34:06] but has anyone tried this
[16:34:12] I literally gave one example
[16:34:24] there are extensions for uploading and obtaining files from S3, etc. as well
[16:34:31] or has anyone even tried to distribute MediaWiki itself, for example
[16:34:39] with edits stored locally
[16:34:43] you can distribute it as well as any web application
[16:35:06] you are not getting my point :S
[16:35:07] run a database cluster, run a web server cluster, have a load balancer/reverse proxy in front to send requests to the backend
[16:35:23] in a source tree, not just manually?
[16:35:49] distributed infrastructure doesn't magically spring up, you'll need to set it up yourself...
[16:36:02] indeed
[16:36:32] but MediaWiki is used by very high-traffic websites, so it does support a fair amount of scalability
[16:37:00] if you're asking about decentralized rather than distributed (i.e. there is no one main/central authority in control of the data), then no, MediaWiki doesn't and cannot do that
[16:37:34] decentralized != distributed
[16:37:46] yes, I just said as much
[16:38:04] yes, so why did you assume I said decentralized?
[16:38:21] seeing as I answered questions about distributed infra and you said it wasn't what you were talking about, I also answered a somewhat similar topic
[16:38:36] I never stated such
[16:38:47] you are not getting my point :S
[16:38:56] anyway
[16:39:05] for others
[16:39:18] given moonmoon seems to be going in circles
[16:40:38] Are there any efforts or forks in the MediaWiki ecosystem that focus on a distributed offline version of the wiki, especially as regards the media server, or has that not been tried?
[16:41:45] like async distribution, not just federation
[16:42:03] Cheers for feedback in advance
[16:52:14] MediaWiki doesn't support async distribution. Maybe you're trying to use the wrong software for your needs
[16:52:59] Vulpix: wouldn't that make a lot of sense?
[16:53:12] like distributing load over all peers
[16:54:32] Wikipedia is one of the biggest sites using MediaWiki, and it isn't "distributed" as you need it to be.
So I guess it doesn't make sense :)
[17:00:29] it is sad that MediaWiki is so dependent on the Wikimedia Foundation
[17:00:43] but good at the same time :)
[17:01:25] ok, so no distributed MediaWiki implementation in sight, I see
[17:09:15] https://en.wikipedia.org/wiki/Goanna_(software)
[17:09:38] *delete
[17:09:42] *wrong channel
[17:35:03] for a truly peer-to-peer wiki system you would need to replace the RDBMS back-end with something else
[17:35:36] since an RDBMS makes it centralised by default
[17:39:03] also, you will have to deal with the elephant in the room that is conflict resolution
[17:57:58] * M3RELATIVISM[m] looks into RDBMS
[17:58:21] I'm sure NLnet would support a grant for a distributed MediaWiki
[17:59:15] I think it would be a much smarter move than the intention to "modernize" MediaWiki presented at 36C3
[17:59:29] modularization is already happening
[17:59:49] it would exponentially decrease the load on the main sync server
[18:06:34] "for a truly peer to peer wiki..." <- there are those federating projects on the fediverse using ActivityPub and related protocols that manage to share at least some data
[19:19:20] M3RELATIVISM[m]: what is a 'main sync server'?
[19:21:13] SeriousFun01[m]: yes, I know about those; it does not quite help when you want to essentially get rid of the traditional DBMS dependency (you cannot expect users to set up their own DB clusters when, as this person is inclined, solutions like Docker are out of the question)
[19:22:13] and MW is too deeply dependent on a DBMS as the back-end anyway, so anything ActivityPub-like is a moonshot at best
[19:22:40] the DB layer does not enforce any abstractions as it is
[19:23:46] M3RELATIVISM[m]: you could get the database dumps released by WMF and store them in IPFS or something, if you just want to ensure they exist even if all WMF servers were to disappear
[19:24:12] but also, dumps are not backups
[19:27:51] yeah, there is not one main server; the db servers for Wikipedia are separated into sections like this: https://noc.wikimedia.org/db.php
[19:28:22] there is one section that is just "english wikipedia" though
[20:02:29] Remilia: your condescension has reached a record level; from my experience with you so far you seem to have a big ego, but you should try to get it under control. Multiple assumptions and preconceptions are also not helping
[20:03:48] "what is a 'main sync server'?" it is the root of the source tree for the distribution; look into Yggdrasil for that type of distributed system
[20:05:19] Also, Remilia, Docker is not out of the question because of its difficulty or whatever nonsense you are going on about, but because Docker has non-free bits, and because Replicant needs to be FSDG-compliant; even if we wanted to package it, we couldn't
[20:06:17] you are asking for something fundamentally different from MediaWiki at this point
[20:06:26] thank you very much, mutante
[20:06:27] consider that MediaWiki is not the right solution for whatever it is you're trying to do
[20:06:38] that was quite useful information, mutante: "you could get the database dumps released by WMF and store them in IPFS or something,
if you just want to ensure they exist even if all WMF servers were to disappear
[20:06:38] but also, dumps are not backups
[20:06:38] yeah, there is not one main server; the db servers for Wikipedia are separated into sections like this: https://noc.wikimedia.org/db.php
[20:06:38] there is one section that is just "english wikipedia" though"
[20:06:42] I appreciate it
[20:07:11] moonmoon: "you are asking for something fundamentally different from mediawiki at this point" I understand that
[20:07:11] you're welcome
[20:10:26] moonmoon: and it's all good, because in the end we will find a solution. the point of frustration here is having a peer who repeatedly shows a need to display superiority; I'm respectful of said peer's knowledge and appreciate the help from that peer nonetheless
[20:11:08] I don't see any of that above; people are trying to understand what you're talking about, answering your questions, and pointing out pitfalls in the things you're after
[20:11:28] but then, Remilia is not obliged to answer my questions, so if said peer is having a bad day or not feeling so good, it is best that said condescending attitude not be shown :)
[20:11:29] try to assume good faith on the part of others
[20:11:53] Remilia has been going in circles
[20:12:07] with comments like "have you read it..."
[20:12:33] when obviously I had, given I had explained previously what I was accused of not knowing :S
[20:12:51] it is just a matter of humility
[20:13:35] all of us were once in my position, and no one here is obliged to share their knowledge ;)
[20:13:44] Cheers regardless
[20:13:52] for all the help I received here today
[20:14:11] it truly helped me figure out the structure of MediaWiki better
[20:14:32] I also have a couple of topics to study, so truly, dearly, thanks
[20:14:51] through this discussion I was also able to fix the attachment issue
[20:15:03] so thanks
[20:15:12] * M3RELATIVISM[m] goes to implement the topics discussed here
[20:15:19] Cheers
[20:15:29] that's good at least! It's a very typical webapp in that it assumes centralized control over all of the infrastructure. You can scale/distribute components of it, but it still assumes you have central control over all of those components.
[20:15:34] The offline/Kiwix portion is meant primarily for cases where you want access to a wiki's contents but have zero internet access
[20:15:44] that's what it was designed for and caters to
[20:15:56] * tn pats moonmoon
[20:16:53] sounds good, I will keep that in mind, thank you very much
[20:16:55] M3RELATIVISM[m]: there are old-school wikis that were around before MediaWiki existed, like UseMod, the original one: http://www.usemod.com/cgi-bin/wiki.pl and that does not use a database backend. it stores everything in files. that means it won't scale well, but it also means you might have a better chance of turning it into something truly federated, like if you get all your peers to rsync the data dir or have some kind of shared storage for it.
[20:17:28] it will be pretty rudimentary compared to MediaWiki though :)
[20:17:37] * Izno_ pats tn.
[20:19:18] ^.^
[20:19:42] :/
[20:20:29] I will definitely look into that, mutante, for personal use; for Replicant though we need to use MediaWiki, given one of the big reasons for migration is Wikidata, and to avoid duplication of articles between the FSDG distributions > https://www.gnu.org/distros/free-distros.html
[20:21:11] we will find a compromise, I'm sure
[20:21:57] Wikimedia stopped using UseModWiki for a reason
[20:22:20] NobodyLikedCamelCaseLinking
[20:22:48] I tend to recommend DokuWiki for people who want a databaseless/file-backed wiki
[20:23:06] Not as bad as DemocracY
[20:23:43] moonmoon: SQLite is file-backed
[20:23:52] *technically*, sure
[20:24:24] soo... just put the SQLite file into a torrent... and you have a "distributed wiki"? :P
[20:24:40] ooh, or check it into git
[20:24:42] You are torrenting a wiki?
[20:25:01] ah yes, SQLite, well known for its ability to have multiple people trying to change it at the same time, and well known for robust clustering support
[20:25:02] oh wait
[20:25:19] :P
[20:25:48] As opposed to text files, which do all those things?
[20:25:58] If anything, it suits SQLite even more
[20:26:20] AntiComposite hinted at the solution for text files: version control
[20:27:00] sure, at that point you aren't using the wiki's UI for conflict resolution, etc., but it's doable in theory
[20:27:13] in practice... you'd probably want something purpose-built for the task
[20:27:15] ;) I guess it has to use https://golem.network then /me hides
[20:27:50] I hope it's not too scary
[20:28:00] I am on my phone in Washington right now
[20:29:39] Well, this discussion took a turn.
[20:29:48] Did it?
[20:30:51] M3RELATIVISM[m] was asking about storage, or at least that's the part I have caught up on
[20:31:18] I feel like it should start over with "what is being fixed".
[20:31:28] if it was "not dependent on WMF so much", that's another story
[20:31:36] *shrug*
[20:32:11] I thought we moved on to other topics