PmWiki's original developer retired more than ten years ago and, as far as I know, isn't really programming these days. It has had the same maintainer for something like 15 years, though, and still gets regular releases. The last time I saw TVT, years ago (I haven't visited since the year we forked), it had diverged so significantly from upstream that I'd consider it a different wiki software by that point.
Most of our extensions don't track continuous updates; they sit on versioned release branches, which means someone has taken the time to backport new features from the development branch (in Gerrit, the development branch targets MW 1.43, which hasn't been released yet). Some extensions are set to pull from a master branch when no matching version branch is available, but I believe none of this is fully automatic: someone still has to run the command to update extensions.
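For a feel of what that manual step involves, here's a minimal sketch, not Miraheze's actual tooling: the directory layout, the path, and the branch-naming convention (REL1_43-style release branches, as Gerrit uses) are my assumptions. It pins each extension checkout to the release branch matching the MediaWiki version and falls back to master when no such branch exists.

```python
#!/usr/bin/env python3
"""Hypothetical sketch: pin each extension repo to the REL branch matching
our MediaWiki version, falling back to master when that branch is missing.
The extensions path and version constant are illustrative assumptions."""
import subprocess
from pathlib import Path

MEDIAWIKI_VERSION = "1.43"                                # assumed target release
REL_BRANCH = "REL" + MEDIAWIKI_VERSION.replace(".", "_")  # e.g. REL1_43
EXTENSIONS_DIR = Path("/srv/mediawiki/extensions")        # hypothetical path

def remote_branch_exists(repo: Path, branch: str) -> bool:
    """True if origin has the branch (ls-remote --exit-code returns
    non-zero when no matching ref is found)."""
    result = subprocess.run(
        ["git", "ls-remote", "--exit-code", "--heads", "origin", branch],
        cwd=repo, capture_output=True,
    )
    return result.returncode == 0

def update_extension(repo: Path) -> None:
    # Prefer the versioned release branch; fall back to master.
    branch = REL_BRANCH if remote_branch_exists(repo, REL_BRANCH) else "master"
    subprocess.run(["git", "fetch", "origin", branch], cwd=repo, check=True)
    subprocess.run(["git", "checkout", branch], cwd=repo, check=True)
    subprocess.run(["git", "pull", "--ff-only", "origin", branch],
                   cwd=repo, check=True)
    print(f"{repo.name}: now tracking {branch}")

if __name__ == "__main__":
    for repo in sorted(EXTENSIONS_DIR.iterdir()):
        if (repo / ".git").exists():  # skip anything that isn't a git checkout
            update_extension(repo)
```

The fallback order matters: an extension stuck on master may assume unreleased MediaWiki APIs, which is one reason none of this is safe to run unattended.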
I guess the thing is that Miraheze has, from its beginning, decided to be on the bleeding edge, probably because it was founded by developers who like doing that kind of thing. I've wondered whether it would be better to stick to the long-term support releases, which come out every two years or so. People say that switching to that schedule would just mean 4x the work at one time rather than four incremental updates, but I'm not sure about that.
But I don't think the updates have anything to do with the downtime, honestly. It's some combination of too few servers, inexperienced volunteers pushing the wrong button, and, mainly, a lack of redundancy and a lack of people paid to keep the servers in tip-top shape. Our just-(re)launched fundraiser plans to increase our server count by 50%, which will help, for a little while anyway.
I also disagree that MediaWiki is the most reliable engine out there. It's slow, it has some crazy extensions, and it has a lot of layers, like Varnish and Elasticsearch, that can get out of sync. I mean, it even runs on a MySQL variant! (Which doesn't lose as much data as it used to, but still.) No, MediaWiki's strength is that it's powerful: you can do crazy things like process external JSON into layers of semantic data templates, and it all mostly works.
"Kitto daijoubu da yo." - Sakura Kinomoto