Using the "secure server" protocol, https, to communicate with a website (web server) has long been considered essential when editing from unsecured networks or other untrusted locations. Using https encrypts communications between a user's computer and the Wikimedia servers (for example), preventing the interception of plaintext username-password combinations during a browsing session. In the fallout from the release of the Firesheep Firefox extension (see previous Signpost coverage), however, it became clear that many felt this solution alone to be insufficient, since editors often forgot to switch from http to https when the need arose. As a result, there were calls to make https the default for all editors and, in preparation for such a switch, the process of making Wikimedia more https-friendly began.
This week, work on switching to https took a leap forward with the introduction of "protocol-relative" URLs onto a test wiki. This means that internal links (both hyperlinks and file references, for example for images) no longer point to locations prefixed with a specific protocol; instead, they specify no protocol at all. The user's browser is then expected to fulfil the request using the same protocol it used for the originating page: links on a page loaded using the https protocol will point to the https (secure) site, while links on an http page will point to the http (insecure) site. According to the Wikimedia Foundation blog, the benefits are obvious:
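The resolution mechanism can be illustrated with Python's standard `urllib.parse.urljoin`, which follows the same relative-reference rules browsers use (a sketch; the URLs are illustrative, not actual Wikimedia links):

```python
from urllib.parse import urljoin

# A protocol-relative ("network-path") reference omits the scheme entirely:
link = "//upload.wikimedia.org/wikipedia/commons/Example.png"

# It is resolved using whatever protocol loaded the current page:
secure = urljoin("https://en.wikipedia.org/wiki/Main_Page", link)
plain = urljoin("http://en.wikipedia.org/wiki/Main_Page", link)

print(secure)  # https://upload.wikimedia.org/wikipedia/commons/Example.png
print(plain)   # http://upload.wikimedia.org/wikipedia/commons/Example.png
```

The same link markup thus serves both browsing modes without any per-protocol rewriting on the server side.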
[Before,] if you were browsing the site in https mode, and another user was browsing the same pages in HTTP mode, two versions of those pages would be stored in our cache, as the links are different between the two modes. This splits our cache, which makes it less efficient and more expensive to operate.
When browsing in https mode, we want to ensure links point to the correct protocol. When pages are parsed, things like interwiki links are created by the parser. If we do not use protocol relative URLs, then links will point to either https or http, which cause users to switch modes randomly.
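The cache-splitting point in the quote above can be sketched with a toy example (the `render` function and page fragment are hypothetical, not actual MediaWiki code): with scheme-specific links the HTML served to http and https readers differs byte-for-byte, so the cache must store two copies of every page, while protocol-relative links make the two identical.

```python
def render(scheme_prefix: str) -> str:
    """Render a toy page fragment whose links start with the given prefix
    ("http:", "https:", or "" for protocol-relative)."""
    return f'<a href="{scheme_prefix}//en.wikipedia.org/wiki/HTTPS">HTTPS</a>'

# Scheme-specific links: http and https readers receive different HTML,
# so a shared cache needs two entries per page ("splitting" the cache).
assert render("http:") != render("https:")

# Protocol-relative links: every reader receives the same bytes,
# so one cached copy serves both browsing modes.
print(render(""))  # <a href="//en.wikipedia.org/wiki/HTTPS">HTTPS</a>
```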
Google Summer of Code students reach halfway point
In addition to factual information, the post also disclosed thoughts from the students about what they had learned so far. "True learning can happen only in an open environment and with a highly supportive community", noted Akshay Agarwal, whilst fellow student Devayon Das commented that "A 30 second chat with a community member can save you 30 minutes of scratching your head in frustration". Salvatore Ingala chose to highlight the importance of unit tests (see previous Signpost coverage): "unit testing is boooooring, but ends up saving you a lot of time!", he wrote.
Some academics have already proposed using SMW on Wikipedia to tackle the problem of the many lists that have to be created manually, but according to Wikimedia Foundation Deputy Director Erik Möller it's still unclear whether SMW is up to the task of supporting a web site on the scale of Wikipedia. So while Semantic MediaWiki already powers a lot of web sites and is quite user-friendly, it remains to be seen whether it will eventually bring semantics to the ultimate wiki, Wikipedia.
However, volunteer developer Simetrical used the opportunity to clarify that SMW's adoption by Wikimedia projects was not just unclear, but impossible:
The problem with deploying SMW on Wikimedia sites like Wikipedia has always been that it's a big codebase (tens of thousands of lines), which shares few to no active developers with MediaWiki proper, and which has never had thorough review by core MediaWiki developers for security or performance. ... it's an awesome project, and its functionality is absolutely make-or-break for countless small to medium MediaWiki installs. But it's not possible for a project of this scale to be usable on a site as large as Wikipedia unless it was written that way to begin with, and (like almost all software) it wasn't.
Not all fixes may have gone live to WMF sites at the time of writing; some may not be scheduled to go live for many weeks.
How you can help
Give your views on <math>
This week, developers appealed for views on the rendering options available for <math> tags. Does it affect you? Comment now!
Sumana Harihareswara, the Foundation's Volunteer Development Coordinator, investigated how the open source project Launchpad handles its own development workflow to see if MediaWiki could learn any lessons from it. Her report drew some useful comparisons, although, by its own admission, the lessons may not transfer easily: MediaWiki relies on a subtle interplay between paid and volunteer developers, while Launchpad has only a very small volunteer developer community (wikitech-l mailing list). In unrelated news, Harihareswara also called for developers due to be present at this year's Wikimania conference to help plan coding "sprints" (wikitech-l mailing list).
Email notifications will no longer be sent to unconfirmed email addresses, to prevent accusations of spam (bug #17866).
This week saw the first testing of a new release system, known as "Heterogeneous", which will enable different WMF wikis to run different versions of MediaWiki software. This would allow for selective testing of new versions, for example, on right-to-left wikis (more software developments).
Wikimedia has joined the Unicode Consortium as a liaison member, putting it in the same category as the GNOME Foundation and Mozilla. Membership, which is free but requires approval, allows Wikimedia to contribute to official discussions relating to the Unicode character encoding standard (wikitech-l).
There was a discussion on the wikitech-l mailing list following the news that the "default assignees" for different categories of Bugzilla bugs had been reset to wikibugs, the "no-one in particular" superuser.
Five bot tasks were approved this week, but a number of proposals are still open, including one that would see a bot tag files as eligible for being moved to Wikimedia Commons.
The Signpost is written by editors like you — join in!