An Introduction

Operation



Wikimedia Foundation and the Wikimedia chapters

Wikipedia is hosted and funded by the Wikimedia Foundation, a non-profit organization which also operates Wikipedia-related projects such as Wikibooks. The Wikimedia chapters, local associations of Wikipedia users, also participate in the promotion, development, and funding of the project.

Software and hardware

The operation of Wikipedia depends on MediaWiki, a custom-made, free and open-source wiki software platform written in PHP and built upon the MySQL database.[113] The software incorporates programming features such as a macro language, variables, a transclusion system for templates, and URL redirection. MediaWiki is licensed under the GNU General Public License and is used by all Wikimedia projects, as well as many other wiki projects. Several MediaWiki extensions are installed[114] to extend its functionality.

Originally, Wikipedia ran on UseModWiki (Phase I), written in Perl by Clifford Adams, which initially required CamelCase for article hyperlinks; the present double-bracket style was introduced later. In January 2002 (Phase II), Wikipedia began running on a PHP wiki engine with a MySQL database, custom-made for Wikipedia by Magnus Manske. The Phase II software was repeatedly modified to accommodate the exponentially increasing demand. In July 2002 (Phase III), Wikipedia shifted to the third-generation software, MediaWiki, originally written by Lee Daniel Crocker.

In April 2005, a Lucene extension[115][116] was added to MediaWiki's built-in search, and Wikipedia switched from MySQL to Lucene for searching. Lucene Search 2.1,[117] written in Java and based on the Lucene library 2.3,[118] is currently used.
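As an illustration of the double-bracket link style mentioned above, the following sketch converts wiki links into HTML anchors. This is a deliberately simplified, hypothetical example (the function name and URL scheme are illustrative); MediaWiki's actual parser is far more elaborate.

```python
import re

def render_links(wikitext: str) -> str:
    """Convert [[Target]] and [[Target|Label]] wiki links into HTML anchors."""
    def repl(match: re.Match) -> str:
        target, _, label = match.group(1).partition("|")
        href = "/wiki/" + target.replace(" ", "_")
        return f'<a href="{href}">{label or target}</a>'
    # Match the double-bracket style that replaced CamelCase linking.
    return re.sub(r"\[\[([^\[\]]+)\]\]", repl, wikitext)

print(render_links("Written in [[PHP]] upon [[MySQL|the MySQL database]]."))
# → Written in <a href="/wiki/PHP">PHP</a> upon <a href="/wiki/MySQL">the MySQL database</a>.
```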

Wikipedia currently runs on dedicated clusters of Linux servers (mainly Ubuntu),[119][120] with a few OpenSolaris machines for ZFS. As of February 2008, there were 300 servers in Florida, 26 in Amsterdam, and 23 in Yahoo!'s Korean hosting facility in Seoul.[121] Wikipedia ran on a single server until 2004, when the setup was expanded into a distributed multitier architecture. In January 2005, the project ran on 39 dedicated servers in Florida: a single master database server running MySQL, multiple slave database servers, 21 web servers running the Apache HTTP Server, and seven Squid cache servers.
Wikipedia receives between 25,000 and 60,000 page requests per second, depending on time of day.[122] Page requests are first passed to a front-end layer of Squid caching servers.[123] Requests that cannot be served from the Squid cache are sent to load-balancing servers running the Linux Virtual Server software, which in turn pass the request to one of the Apache web servers for page rendering from the database. The web servers deliver pages as requested, performing page rendering for all the language editions of Wikipedia. To increase speed further, rendered pages are cached in a distributed memory cache until invalidated, allowing page rendering to be skipped entirely for most common page accesses. Two larger clusters in the Netherlands and Korea now handle much of Wikipedia's traffic load.
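The render-and-cache path described above amounts to a read-through cache with invalidation on edit. The following is a minimal sketch, with hypothetical names standing in for the real infrastructure; the actual distributed memory cache and rendering pipeline are far more complex.

```python
cache: dict[str, str] = {}  # stands in for the distributed memory cache

def render_from_database(title: str) -> str:
    # Placeholder for the expensive Apache/database page-rendering step.
    return f"<html>rendered page for {title}</html>"

def get_page(title: str) -> str:
    if title not in cache:                  # miss: render and store
        cache[title] = render_from_database(title)
    return cache[title]                     # hit: rendering is skipped entirely

def invalidate(title: str) -> None:
    cache.pop(title, None)                  # an edit forces a re-render on next read
```

In the real deployment, the Squid layer sits in front of this cache, so many requests never reach the web servers at all.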

Delivery media

Originally, Wikipedia was read and edited through any standard web browser over a fixed internet connection. Wikipedia content is now also accessible through offline media and through the mobile web.

Access to Wikipedia from mobile phones was possible as early as 2004, through the Wireless Application Protocol (WAP), via the Wapedia service. In June 2007, Wikipedia launched an official website for wireless devices. In 2009, a newer mobile service was officially released,[124] catering to more advanced mobile devices such as the iPhone, Android-based devices, and the Palm Pre. Several other methods of mobile access to Wikipedia have emerged (see Wikipedia:Mobile access). Several devices and applications optimise or enhance the display of Wikipedia content for mobile devices, while some also incorporate additional features such as the use of Wikipedia metadata (see Wikipedia:Metadata), including geoinformation.[125]
Collections of Wikipedia articles have been published on optical disks. An English version, 2006 Wikipedia CD Selection, contained about 2,000 articles.[126][127] The Polish version contains nearly 240,000 articles.[128] There are also German versions.[129]

License and language editions

[Figure: number of articles by language family (06/09/09)]

Until June 2009, all text in Wikipedia was covered by the GNU Free Documentation License (GFDL), a copyleft license permitting the redistribution, creation of derivative works, and commercial use of content while authors retain copyright of their work.[130] In June 2009, the site switched to the Creative Commons Attribution-ShareAlike (CC-BY-SA) 3.0 license.[131] Wikipedia had been working on the switch because the GFDL, initially designed for software manuals, is not well suited to online reference works, and because the two licenses were incompatible.[132] At the Wikimedia Foundation's request, the Free Software Foundation (FSF) released, in November 2008, a new version of the GFDL designed specifically to allow Wikipedia to relicense its content under CC-BY-SA by August 1, 2009. Wikipedia and its sister projects held a community-wide referendum on the license switch,[133] which took place from April 9 to 30, 2009.[134] The results were 75.8% "Yes", 10.5% "No", and 13.7% "No opinion".[135] In consequence, the Wikimedia Board of Trustees voted to change to the Creative Commons license, effective June 15, 2009.[135] The position that Wikipedia is merely a hosting service has been successfully used as a defense in court.[136][137]

[Figure: percentage of all Wikipedia articles in English (red) and in the ten largest other language editions (blue); as of July 2007, less than 23% of Wikipedia articles are in English.]

The handling of media files (e.g., image files) varies across language editions. Some, such as the English Wikipedia, include non-free image files under the fair use doctrine, while others have opted not to. This is in part because of differences in copyright law between countries; for example, the notion of fair use does not exist in Japanese copyright law. Media files covered by free content licenses (e.g., the Creative Commons CC-BY-SA license) are shared across language editions via the Wikimedia Commons repository, a project operated by the Wikimedia Foundation.
There are currently 262 language editions of Wikipedia; of these, 24 have over 100,000 articles and 81 have over 1,000 articles.[1] According to Alexa, the English subdomain (the English Wikipedia) receives approximately 54% of Wikipedia's cumulative traffic, with the remainder split among the other languages (Japanese: 10%, German: 8%, Spanish: 5%, Russian: 4%, French: 4%, Italian: 3%).[3] As of July 2008, the five largest language editions are (in order of article count) the English, German, French, Polish, and Japanese Wikipedias.[138]
Since Wikipedia is web-based and therefore worldwide, contributors to the same language edition may use different dialects or come from different countries (as is the case for the English edition). These differences may lead to conflicts over spelling (e.g., color vs. colour)[139] or points of view.[140] Though the various language editions are held to global policies such as "neutral point of view", they diverge on some points of policy and practice, most notably on whether images that are not freely licensed may be used under a claim of fair use.[141][142][143]
[Figure: contributors to the English Wikipedia by country, as of September 2006.[144]]

Jimmy Wales has described Wikipedia as "an effort to create and distribute a free encyclopedia of the highest possible quality to every single person on the planet in their own language".[145] Though each language edition functions more or less independently, some efforts are made to supervise them all. They are coordinated in part by Meta-Wiki, the Wikimedia Foundation's wiki devoted to maintaining all of its projects (Wikipedia and others).[146] For instance, Meta-Wiki provides important statistics on all language editions of Wikipedia,[147] and maintains a list of articles every Wikipedia should have.[148] The list covers basic content by subject: biography, history, geography, society, culture, science, technology, foodstuffs, and mathematics. Beyond that, it is not rare for articles strongly tied to a particular language to lack counterparts in other editions; for example, articles about small towns in the United States might be available only in English.

Translated articles represent only a small portion of articles in most editions, in part because automated translation of articles is disallowed.[149] Articles available in more than one language may offer "InterWiki" links, which link to the counterpart articles in other editions.
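Interlanguage links are written in the same wikitext syntax as ordinary links, prefixed with a language code (e.g., [[fr:Chat]] on an English article about cats). A simplified, hypothetical sketch of extracting them follows; real language-code handling in MediaWiki is stricter than this pattern.

```python
import re

def interwiki_links(wikitext: str) -> dict[str, str]:
    """Map language codes to counterpart article titles found in wikitext."""
    # Two- or three-letter code, a colon, then the target title.
    return dict(re.findall(r"\[\[([a-z]{2,3}):([^\[\]|]+)\]\]", wikitext))

print(interwiki_links("Article text [[fr:Chat]] more text [[de:Katze]]"))
# → {'fr': 'Chat', 'de': 'Katze'}
```

Ordinary links such as [[Cat]] carry no colon-separated prefix, so the pattern leaves them alone.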
