Ethiopia’s web filtering | Advanced technology, hypocritical criticisms, bleeding constitution

Thus far, Ethiopia’s internet filtering had been characterized by technical limitations and hesitance. That is reflected both in the filtering methods employed and in the reluctance to admit the practice publicly. Presumably, the hesitance of decision-makers is behind the randomized application of the filtering and the slow build-up of technical capacity.

Be that as it may, dear friends, we are no longer ‘learners’ in internet filtering. The nation has made ‘technological strides’. If the VOA episode is any guide, an official admission may be on the horizon, to be followed by a wave of ideological condemnations. Ironically, that would only make the whole discourse convenient for the government.

Sadly, behind the high-flown ideological and partisan debate, the Constitution would be bleeding silently.

Foreseeing that, it would be irresponsible to keep up the elusive quest for more data, more observation and more reading. In fact, the only conclusive proof lies in an official admission yet to come. But, by then, it would be too little, too late. With my favorite article of the Constitution at stake, I would rather be wrong than let the discourse take a wrong turn one more time.

In the Beginning

As most observers would agree, internet filtering was prompted by the post-election violence of 2005, at least at a ‘noticeable’ scale. However, as there had been no preparation in advance, it had to be conducted through crude and elementary methods. Thus, until last June, the filtering was apparently conducted through the two methods that require little or no additional skill and equipment: what the experts call ‘DNS tampering/poisoning’ and ‘IP header filtering’. The two methods are, in simple terms, blocking the URL address of the targeted site or the IP address/addresses of the server that hosts it.

As you may have guessed, the easiest and most accurate method is adjusting the DNS server of the ISP (Internet Service Provider) to block the URL address of the targeted site when users try to access it. That means, for instance, if this blog is deemed ‘objectionable’, the ISP can simply add the URL address (https://hornaffairs.com/) to the ‘blocked sites’ list on its DNS server. Then, when users request this URL address, the DNS server of the ISP will return an error message, instead of forwarding the request to other DNS servers (located outside Ethiopia) as it normally does. This is tampering with, or poisoning, the Domain Name System (DNS).
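For illustration only, here is a minimal Python sketch of the logic such a tampered resolver would follow. The domain list, function name and error behaviour are hypothetical, not a description of any actual ISP configuration.

```python
# Hypothetical sketch of DNS tampering at an ISP-side resolver:
# blocklisted domain names get an error instead of normal resolution.

BLOCKED_DOMAINS = {"hornaffairs.com"}  # illustrative blocklist

def resolve(domain: str, upstream_lookup) -> str:
    """Return an IP address, or fail as if the name did not exist."""
    name = domain.lower().removeprefix("www.")
    if name in BLOCKED_DOMAINS:
        # A poisoned resolver answers with an error (or a bogus address)
        # rather than forwarding the query to upstream DNS servers.
        raise LookupError(f"blocked: {domain}")
    return upstream_lookup(domain)  # normal recursive resolution
```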

But what if there are hundreds of similarly ‘objectionable’ blogs on WordPress? And, more problematically, what if dozens of new similar blogs pop up every week? The authorities may find it ‘unfeasible’ to identify all ‘objectionable’ sites and update the server periodically, and thus decide to block WordPress entirely. Of course, it is possible to block all URL addresses containing ‘wordpress.com’ using the same method, but not all blogs/sites hosted there contain ‘wordpress.com’ in their URL, since it can be removed for a few bucks. Due to this and other considerations, the authorities may resort to the other easily applicable and cost-effective method of filtering: blocking the Internet Protocol (IP) address, that is, the address of the web server hosting the targeted site.

Blocking the IP address is performed by specialized computers known as routers. In this case, the DNS server of the ISP functions normally. Normally, when a user requests a site, the DNS server of the ISP sends the request to other DNS servers to look up the IP address. It is only after the IP address is identified that the routers send the specific request to the respective web server to get the specific page requested and deliver it to the user. However, if the IP address is blocked, the routers will simply drop the request and send an error message to the user. For example, if the authorities blocked only the IP address of this blog, 72.233.2.58, a user’s request to access this site would be denied once the routers identify that this blog is located at 72.233.2.58.
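Purely as an illustration, the router-side logic amounts to something like the sketch below; the packet representation and the function name are hypothetical simplifications of what real routing hardware does.

```python
# Hypothetical sketch of IP header filtering: only the destination address
# in the packet header is examined; the content of the request is never read.

BLOCKED_IPS = {"72.233.2.58"}  # example address cited above

def forward_packet(packet: dict) -> bool:
    """Return True if the packet is forwarded, False if silently dropped."""
    if packet["dst_ip"] in BLOCKED_IPS:
        return False  # drop the request; the user gets an error or a timeout
    return True       # otherwise, pass the packet on toward the web server

# forward_packet({"src_ip": "<user>", "dst_ip": "72.233.2.58"})  -> False
```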

Since this IP address is shared by most sites hosted on WordPress, the blockade will affect them too. WordPress is hosted on several web servers; to block it entirely, you need to block some half a dozen IP addresses. Once the routers are programmed to deny access to these IP addresses, even if this blog is renamed danielberhane.com, while still being hosted at WordPress, it will remain blocked.

However, WordPress may not be the only site hosted on those blocked IP addresses or web servers, thus the blocking will make several other not-so-objectionable sites inaccessible. In other words, it results in ‘over-blocking’. As one study from 2003 suggests, there is a high probability that blocking the IP address of one site will make other sites inaccessible, since ‘more than 87% of active domain names share their IP addresses with one or more additional domains.’

Recent Developments

However, Ethiopia’s internet filtering was neither comprehensive nor consistent. I am not talking about slips, such as when the server is overloaded, or about users circumventing the blockade through other tools. Rather, accessibility was a frequent occurrence. There were even ‘window hours’, when blocked sites were accessible almost daily for a few hours. For example, Ginbot7.org had been accessible almost daily from 3-5 pm, 3-5 am and sometimes around 6 am. More importantly, the contents of most blocked sites were available on Google web-cache. Not to mention that some sites that were inaccessible for most of the year would suddenly become accessible for several weeks, if not months.

This may not necessarily be a result of technical limitations of the two internet filtering methods described above; rather, it may be a tactical move. Since few would go to great lengths to obtain the materials, irregular blocking suffices to discourage most viewers and limit the reach of those sites. On the other hand, such a non-comprehensive and irregular blockade might also be intended to maintain a level of deniability. Perhaps the uncertainty caused by the randomization is part of the reason the OpenNet Initiative labeled the Ethiopian case ‘substantial filtering’ rather than ‘pervasive filtering’. It may also be the reason Reporters Without Borders did not place Ethiopia among the ‘enemies of the Internet’ and, in March 2006, pondered whether technical problems were the cause of the inaccessibility of some sites.

Now things have changed.

Last May, days before the polling began, on May 19 or 20 to be specific, the ‘window hours’ ceased to exist, and blocked sites became inaccessible 24/7. Many assumed the change might be part of the precautionary measures to mitigate post-electoral violence. Yet the situation remained unchanged months after the possibility of violence subsided, except for a few occasions when those sites were accessible for several hours, even consecutive days. Though there were ‘newer’ problems in accessing Google web-cache and sites that had ‘not been blocked until then’, it would have required months of observation to attribute them to internet filtering, given their irregular occurrence and the usual patchiness of the internet connection.

In retrospect, that must have been a trial period, because it soon became evident that newer and more advanced filtering tools were in town, once they became fully operational, or at least operational at a clearly observable intensity.

By late June, obtaining the contents of blocked sites through Google web-cache became impossible and, more strikingly, Facebook users became unable to visit specific pages, namely the addisneger Facebook page (and, very recently, the ethiopianreview Facebook page). There were stages, of course. At times, the new blockade would ‘let through’ users who made persistent attempts to reach the newly blocked sites/pages, especially those closing other windows and/or using a speedier PC and/or modem. At other times, the blockade remained inescapable except for the ‘window hours’. For a couple of weeks, the newly blocked web addresses were accessible at night and on weekends. Again, there were weeks when one group of users was able to access them while others could not. There are several possible explanations, including technical hitches of a newly installed technology and a possible intent to observe usage rates among different user groups. But, by the end of the first week of July, the blockade became fully operational, except for the ‘deliberately’ provided ‘window hours’, which have become less and less frequent anyway.

What is the new technology? It is easier to explain what it does. As explained above, the two methods, blocking the URL address or the IP address, result in over-blocking while being unable to prevent access to those materials through other means. For instance, users could access the contents of the blocked sites on Google web-cache, since neither the domain name nor the IP address of http://webcache.googleusercontent.com is blocked. And the authorities cannot afford to entirely block sites that provide cached documents, file-sharing services, etc., given their wide-ranging utility. Not to mention the likely reaction from the western politicians who are in the pockets of those sites.

But there is a catch here. Though the cached document page is stored on a non-blocked site, its URL contains the address of the blocked site. For example, the URL for the contents of this blog on Google web-cache would likely be (http://webcache.googleusercontent.com/search?………www.danielberhane.wordpress.com…..). Thus, if you could filter the contents of URLs and block only those containing danielberhane.wordpress.com, you would be able to consistently block all posts from this blog on Google web-cache, without affecting other pages. In similar fashion, it would be possible to block posts on the addisneger Facebook page automatically, without blocking Facebook entirely and without having to register all those URLs manually, since the URL of every post (notes, pictures, etc.) on the page would contain the word ‘addis-neger’. This is exactly what the new technology, known as ‘IP content filtering’, does.
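To make the idea concrete, here is a hypothetical Python sketch of such URL keyword filtering; the keyword list and the sample URLs are illustrative, not the actual blocklist.

```python
# Hypothetical sketch of keyword-based URL filtering: any requested URL whose
# path or query string contains a blocklisted keyword is refused, even when
# the hosting domain (e.g. webcache.googleusercontent.com) is not blocked.

URL_KEYWORDS = {"danielberhane.wordpress.com", "addis-neger", "ethiopianreview"}

def is_blocked(url: str) -> bool:
    u = url.lower()
    return any(keyword in u for keyword in URL_KEYWORDS)

# A cached copy of a blocked blog is caught because its address embeds the
# original domain in the query string:
print(is_blocked("http://webcache.googleusercontent.com/search?q=cache:danielberhane.wordpress.com/post"))  # True
print(is_blocked("http://webcache.googleusercontent.com/search?q=cache:example.org"))                       # False
```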

While it is certain that an advanced technology performing ‘IP content filtering’ is in place, it is difficult to determine the exact equipment being used, at least at my level. However, based on readings on internet filtering, one can rest assured that the new equipment is placed at the ISP or somewhere along the way. It could be an accessory to the server (specifically, the router) that enables deep inspection. Normally, the router checks only the IP header, which indicates the origin and destination of a request (request origin: Addis Ababa; destination: 72.233.2.58). The additional tool affixed to the router, however, would check the contents of the request, that is, which specific page or document is being requested, inspecting it for the presence of ‘keywords’.
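Assuming the add-on works roughly as described, the contrast between plain routing and deep inspection can be sketched as follows; the addresses, keywords and function names are hypothetical.

```python
# Hypothetical sketch: an ordinary router sees only the IP header, while a
# deep-inspection add-on also reads the request itself for keywords.

BLOCKED_IPS = {"198.51.100.7"}                              # illustrative only
KEYWORDS = {"danielberhane.wordpress.com", "addis-neger"}   # illustrative only

def header_only_decision(header: dict) -> bool:
    # Sees nothing but origin/destination, so it can only block whole servers.
    return header["dst_ip"] not in BLOCKED_IPS

def deep_inspection_decision(header: dict, http_request: str) -> bool:
    # Also reads the request line, e.g. "GET /search?q=cache:... HTTP/1.1",
    # so it can block a single page by keyword rather than a whole server.
    if not header_only_decision(header):
        return False
    return not any(k in http_request.lower() for k in KEYWORDS)
```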

Another possibility is that full-fledged ‘proxy-based filtering’ might be in place. In that case, an additional machine, referred to as an ‘HTTP proxy server’, is installed to relay all or part of the requests from the ISP server. This machine boosts speed and minimizes cost: since it temporarily stores webpages, it serves repeatedly requested pages from its cache rather than sending the request to the international web server every time. (A crude analogy would be the cache the browser on our PC stores.) On the flip side, since this proxy server handles entire webpages with their complete URL addresses, it makes it possible to inspect, and thus accurately block, specific ‘objectionable’ webpages without blocking the entire domain or IP address.
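A rough, hypothetical sketch of such a caching-and-filtering proxy loop is given below; the blocked page, cache structure and fetch helper are made up for illustration.

```python
# Hypothetical sketch of an HTTP proxy that both caches and filters. Because
# the proxy sees the complete URL of every request, it can block a single
# page precisely, without touching the rest of the domain or its IP address.

CACHE: dict[str, bytes] = {}
BLOCKED_PAGES = {"http://example-host.com/objectionable-page"}  # illustrative only

def handle_request(url: str, fetch_upstream) -> bytes:
    if url in BLOCKED_PAGES:
        return b"403 Forbidden"      # precise, per-page blocking
    if url in CACHE:
        return CACHE[url]            # served from cache: faster and cheaper
    page = fetch_upstream(url)       # otherwise fetch from the origin server
    CACHE[url] = page                # store it for later repeat requests
    return page
```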

However, if the HTTP proxy server is to inspect the entire internet traffic, it will either cause a significant slowdown of the connection speed, or the system will have to be set up so that it can inspect traffic in real time without holding it up, which is an expensive option. Thus, IP address filtering and an HTTP proxy may be used in combination, a method known as ‘Hybrid TCP/IP and HTTP Proxy’ or, in short, ‘Hybrid HTTP Proxy’. In this case, the HTTP proxy inspects only the traffic to and from a specified list of IP addresses. For instance, assuming this is the method currently employed in Ethiopia, it would be inspecting traffic to the IP addresses of Facebook and Google web-cache, looking for URL paths that contain addis-neger, ethiopianreview, etc., while the traffic to and from, say, MySpace would pass normally. Though installing this hybrid filtering system is costly, its running cost is minimal, and the system serves multiple purposes. [By the way, the HTTP proxy filtering system can even be programmed to inspect the body of webpages, for instance, pages containing the formula for homemade explosives. But no ISP has been observed using that approach yet, according to the literature on the issue.]
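Assuming the hybrid setup described above, the routing decision might be sketched like this; the watched IP addresses, keyword list and fetch helper are hypothetical placeholders.

```python
# Hypothetical sketch of hybrid TCP/IP + HTTP proxy filtering: only traffic
# bound for a watch-list of IP addresses is diverted to the URL-inspecting
# proxy; everything else is routed straight through, keeping the load small.

WATCHED_IPS = {"203.0.113.10", "203.0.113.11"}   # e.g. cache/social-media hosts (illustrative)
URL_KEYWORDS = {"addis-neger", "ethiopianreview"}

def route(dst_ip: str, url: str, fetch_upstream) -> bytes:
    if dst_ip not in WATCHED_IPS:
        return fetch_upstream(url)   # unwatched traffic passes normally
    # Watched traffic goes through the proxy, which inspects the full URL.
    if any(keyword in url.lower() for keyword in URL_KEYWORDS):
        return b"403 Forbidden"      # block only the matching pages
    return fetch_upstream(url)
```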

Are you depressed? Don’t be.

Whichever specific equipment is in use, Ethiopia’s internet filtering is now at an advanced stage that makes it possible to block a webpage based on a keyword found in its URL.

Lest you take the ideological demands of cyber-libertarians to heart, the technological progress is not a worrisome development per se, since it can be run at minimal cost, since the concerns of cyber security are indisputable, and since there are a number of legitimate grounds for blocking sites, accepted even in the western hemisphere. Indeed, the repulsiveness and ethnic divisiveness of some of the blocked sites are so blatant that they have been publicly condemned even by prominent opposition figures, like Birtukan Mideksa and Lidetu Ayalew. [A video clip of the condemnatory speeches will be posted in the next part of this article.]

Yet that by itself doesn’t necessarily ensure its proper application, nor its constitutionality.

[To be continued.]

To read the next and last part – CLICK HERE.

****************

P.S.: The term ‘internet filtering’ also covers other methods, such as surveillance of users, attacks on servers, deregistering a website, and other means. Moreover, internet filtering is not conducted solely by government organs.
This article focuses on the inaccessibility of opposition media sites. But that doesn’t mean government and ‘pro-government’ sites are immune from attacks. If you have not observed that while visiting mfa.gov.et, minfo.gov.et, ethio-channel.net.et, etc., read Afrol News’s report from last March, ‘Virus attack on Ethiopian websites’ (Link), which speculates that Eritrea or exiled opposition groups are behind it, and also the Wikileaks item (Link). Though the two stories discuss cases from 2009 and 2010, I observed the problem whenever I visited mfa.gov.et from mid-2007 to mid-2010. However, unlike the Afrol News reporter, I suffered no harm to my PC when I visited in disregard of the warning. It amazes me that the so-called rights activists chose to ignore this, though it is part of the freedom of information.

Daniel Berhane
