Draft essay (rev. 23sept07) for: "The Power of Ideas: Internet Governance in a Global Multistakeholder Environment" (Wolfgang Kleinwächter, ed.), Nov. 2007 (forthcoming).

 

 

The Next Internet Governance Battles

 

By Kenneth Neil Cukier

 

 

I. Introduction: Tomorrow's Network Will Be Different from Today's

 

The current debate over Internet governance risks becoming obsolete because the technology, architecture and use of the network are undergoing radical change. Yet policymakers are largely unaware of these changes. As a result, they are "fighting the last war," so to speak. Rather than looking at the new challenges of naming and numbering in the next decade and beyond, they presume that the Internet that existed in 1998 when ICANN was created, and which operates today, will be the one that exists tomorrow.

 

But this is just not so. Devising policies on that presumption would be like establishing rules for the telegraph just as the age of the telephone begins. Three forces are transforming how the Internet works: ubiquitous networking, new technical architectures and the developing world's telecoms growth. This essay provides an overview of the changes taking place and considers their impact on the management of critical Internet resources, noting shortcomings with existing approaches along the way.

 

The first force is technology: the Internet is going from a network of PCs to one in which all manner of devices -- from cars to washing machines to sensors on buildings, bridges, trees and inside people -- communicate over a network. It may sound like science fiction, but initial versions already exist and the technology will be commonplace in about ten years' time. This "ubiquitous networking" will place new demands on naming and numbering policy. It may even be the case that the most efficient approach is to bypass the current domain name system altogether.

 

The second change is in the architecture of the network: Internet engineers are redesigning the underlying protocols of the Internet so that it can mature to support more robust uses. In so doing, the engineers are calling into question certain tenets of the network dating back 35 years that are enshrined in the way Internet names and numbers are managed. Although it is too early to say how the new network might look, it is clear that it will be different. Indeed, the very "uniformity" of the Internet's architecture may be among the first sacrosanct principles to go.

 

The third change concerns international development: the most impressive network growth has been in developing countries, not the West, and via the mobile phone, not the PC. New devices are being designed especially for this market, and new applications and uses are emerging. So far, the Internet has been created and used by "the first one billion" users -- and its infrastructure coordination naturally reflects this. Yet as the second and third billion users from developing countries join the information society, as they are starting to do, certain naming and numbering policies are called into question. Issues germane to the developing world will need to be taken better into account.

 

The result of these changes is that when governments discuss Internet governance and the management of critical Internet resources, they do so in a time capsule. Ultimately, by trying to assert more control, governments may find they have planted their flagpoles in a sandbar -- for the network is in the midst of a dramatic transformation for which the Internet-governance community is unprepared.

 

 

II.  The Era of Ubiquitous Networking

 

The Internet today has slightly more than 1 billion users, and mobile phone subscribers number around 2.7 billion. But this is nothing compared with the number of things that can be attached to a network, to send information about their status, location and operation, as well as to link with other devices to do new things. Over the next ten years, the Internet will be characterized by all manner of machines, structures, environments and people's bodies connected to a network at all times. Engineers estimate the network will need to accommodate a trillion devices.

 

The groundwork for this has already been laid. Consider: around 10 billion microprocessors will be sold this year, embedded in everything from computers and coffee-makers to cars. Today, most of them “think” but do not “talk” -- that is, they do certain tasks but do not communicate. Yet this is changing. As the cost, size and power requirements of chips decline, and their performance increases, communications functions are being integrated into processors, mainly with wireless technology. At the same time, the wireless industry is investing billions of dollars to deploy 3G and nascent 4G (WiMax) high-speed mobile networks.

 

The technologies that already exist are staggering. For instance, a wireless chip for mobile phones that cost $50 in 2003 costs $5 today. Chips used for the Global Positioning System or Bluetooth wireless connections now cost as little as $1 and are the size of a match head. Chips for ZigBee technology, used for short-range sensors, currently cost around $4 and are the size of a fingernail; they are expected to shrink to a quarter of the price and size within five years. A far simpler kind of chip, the radio-frequency identification (RFID) tag, which sends a tiny amount of data over a short range when activated, can already be manufactured for 4 cents apiece. Hitachi has a prototype chip that fits into the groove of a thumbprint. In 2006 one billion RFID chips were sold, and the figure is expected to almost double in 2007. And RFIDs are becoming more sophisticated, developing into true two-way communications systems.

 

These technologies enable all sorts of things to connect to a network. For example, industrial building companies are preparing to commercialize products that add a small wireless node to every light fixture. This would enable lights to be turned on and off remotely, as well as to do new things, such as act as networked smoke detectors and security alarms. Cars are going beyond satellite navigation systems to include wireless modules that alert emergency services in case of an accident, handle electronic toll payments and monitor traffic. Consumer-electronics makers are adding networking modules as a way to sell content services. Appliance manufacturers are looking at embedding communications in their products to regulate power consumption, upgrade software and provide "preventive maintenance."

 

Meanwhile, bridges and buildings are getting sensors to continually monitor their structural health -- an important issue in light of the devastating bridge collapse in the United States in the summer of 2007. The environment, too, is being monitored by sensors, both for climate change and for more efficient farming. Amazingly, new networking technologies are being introduced inside people's bodies for medical purposes, such as to scan the intestinal tract or to monitor the fluid inside a person's heart to detect and prevent congestive heart failure. And it bears emphasizing that these technologies are not scribbles on paper in R&D labs but products undergoing regulatory approval and already being sold by major companies such as General Electric, Philips and Honeywell.

 

This will change the Internet governance debate in profound ways. The Internet addressing system was designed for individuals to locate content from somewhat central repositories. In the future, it will be called on to allow billions of autonomous, self-organizing devices to interconnect with one another on the fly. Basic things like ensuring identity and security -- tricky on today’s far simpler Internet -- will become exponentially more difficult.

 

The current approach to Internet coordination is not perfectly suited to this environment. For example, the Internet Protocol version 4 (IPv4) addressing system was designed with around 4.3 billion unique addresses. The stock of addresses will be depleted between 2010 and 2013, according to officials at IP address registries, and a "gray market" in addresses has already formed. Moves to transition to IPv6, which offers vastly more addresses (2^128, or around 3.4 x 10^38), have been slow. And despite their huge number, even IPv6 addresses are finite: unless care is exercised in their allocation, a similar shortage may one day emerge.
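
Some back-of-the-envelope arithmetic makes the mismatch concrete. The figures below are simple powers of two, not registry data; the trillion-device figure is the engineers' estimate cited above.

```python
# Address-space arithmetic: IPv4 vs IPv6 against a trillion devices.
# Illustrative only -- simple powers of two, not registry allocation data.

ipv4_total = 2 ** 32      # 4,294,967,296: about 4.3 billion addresses
ipv6_total = 2 ** 128     # about 3.4e38 addresses

devices = 10 ** 12        # the trillion networked devices engineers anticipate

print(f"IPv4 addresses: {ipv4_total:,}")
print(f"IPv6 addresses: {ipv6_total:.2e}")

# IPv4 offers far less than one address per anticipated device...
print(f"IPv4 addresses per device: {ipv4_total / devices:.3f}")   # ~0.004

# ...while IPv6 offers an astronomical surplus -- though, as noted
# above, careless allocation could still squander it.
print(f"IPv6 addresses per device: {ipv6_total / devices:.2e}")   # ~3.4e26
```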

 

Moreover, current policies make presumptions about how the Internet is used, treating what is a variable as a constant. For instance, ICANN's rules covering domain-name registries take for granted that names are used to identify websites, as they were in 1998 when ICANN was created and the Web was less than a decade old. The idea that a name might refer not to a site per se but to a continuously changing "instantiation" of information is not envisaged. Furthermore, in an environment of "Web 2.0" data flows, a web address may represent nothing more than the commingling of numerous discrete operations from different servers into a single service. Thus a domain name might be automatically generated and "alive" for only a day, or even a few seconds. Who is to say? Yet ICANN's policy of taking a portion of registration fees to support its operations disrupts these potential uses.

 

This example is not imaginary. At the March 2003 ICANN meeting in Rome, a representative of SITA, the airline consortium that operates .aero, explained that the group wanted to create a specific domain name for every commercial flight every day, so that the aviation industry as well as consumers could obtain information about it, from ground maintenance to flight delays. But SITA could not deploy the idea because of ICANN's fee structure. Add to this a world in which every plane engine has 20 different sensors, all generating data in real time, and the extent of the problem only grows. The point of this example is not to remedy the issue per se, but to underscore how policies can unwittingly stifle innovation.
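
To see how such a scheme might work -- and why per-name fees make it untenable -- here is a hypothetical sketch. The naming pattern and the "flights.aero" label are invented for illustration; SITA never published a final design.

```python
from datetime import date

def flight_domain(carrier: str, flight_no: int, day: date) -> str:
    """Build a hypothetical per-flight, per-day domain name.

    The scheme (carrier code + flight number + date under an
    invented 'flights.aero' label) is illustrative only.
    """
    return f"{carrier.lower()}{flight_no}.{day:%Y%m%d}.flights.aero"

print(flight_domain("BA", 284, date(2007, 9, 23)))
# -> ba284.20070923.flights.aero

# With tens of thousands of commercial flights a day, such a scheme
# would mint millions of short-lived names a year -- each one
# attracting a fixed registration fee under ICANN's current model.
```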

 

With the need to provide identifiers to every networked object, there will probably be an engineering incentive to bypass the Internet's domain name system altogether. If this happened, it would mark an ironic twist. Just as governments got their heads around what Internet governance means, by way of venues like the World Summit on the Information Society and the Internet Governance Forum, the very nature of what they debated changed shape, rendering their huffing and puffing rather moot.

 

 

III. Re-Engineering the Network’s Design

 

The Internet is not a series of tubes. It evolved like sedimentary rock, with newer technologies layered upon older ones. This has worked so far, but it does not scale well. To meet the future demands of a trillion connected devices, efforts are underway among Internet engineers to redesign the Internet. The redesign is a chance to strip out superfluous elements, as well as to incorporate features that were not initially a priority but are today regarded as important, such as better identity-authentication to minimize spam and hacking.

 

Two initiatives are taking place under the US National Science Foundation. One is the Global Environment for Network Innovations (GENI), to build an advanced test-bed network for piloting new protocols and applications. The second is Future Internet Design (FIND), which considers specific ways the Internet can be changed to address future needs. There is also a European Union initiative called Euro-NF (for "Network of the Future"), with around 35 European institutions participating, mainly universities. A number of research proposals in the US have come forward that would change the way the Internet works, and with it, aspects of Internet governance.

 

One technique is "Internet indirection infrastructure." It would overlay an addressing system atop current Internet Protocol addresses, better enabling mobility and multicast applications by bypassing the current point-to-point approach in circumstances where routing traffic that way is inefficient. A second idea is called "active networks" or "metanets." It would permit diversity at the core of the network, not just at the edge, by replacing routers with devices that can dynamically load new protocols. Applications would be able to reprogram the devices through the network with a protocol optimized for the communication at hand. Each device would partition itself internally to support multiple small private networks.
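
For a flavor of how indirection might work, here is a minimal sketch of a rendezvous layer, loosely modeled on the "i3" research proposal. The class and method names are invented for illustration; a real design would distribute the trigger table across the network rather than keep it in one place.

```python
class IndirectionLayer:
    """Toy rendezvous layer, loosely modeled on the Internet
    indirection infrastructure (i3) proposal: senders address
    packets to opaque identifiers, and receivers register
    'triggers' mapping identifiers to their current location."""

    def __init__(self):
        self.triggers = {}  # identifier -> set of receiver addresses

    def insert_trigger(self, ident, addr):
        # A receiver (or each member of a multicast group) registers
        # here; a mobile host simply re-registers after moving networks.
        self.triggers.setdefault(ident, set()).add(addr)

    def send(self, ident, packet):
        # The sender never learns the receivers' addresses: the layer
        # forwards to every address currently registered for the id.
        for addr in self.triggers.get(ident, set()):
            print(f"deliver to {addr}: {packet!r}")  # stand-in for IP delivery

layer = IndirectionLayer()
layer.insert_trigger("news-feed", "10.0.0.7")
layer.insert_trigger("news-feed", "10.0.1.9")  # two receivers = multicast
layer.send("news-feed", b"headline")
```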

 

How the network will ultimately look as a result of these proposals is unclear -- but change it will. It may require that things like IP address assignments be done differently -- or change the nature of IP addresses themselves. Would the institutions built around the current DNS be comfortable ceasing operations because of changes in technology? Or would their first inclination be to resist the technical changes under the banner of upholding the Internet's "stability"? The larger point is that the Internet's underlying technology is dynamic, not static, yet officials considering Internet governance policy treat it as settled.

 

 

IV. The Developing World Joins the Network

 

In 1995, when the US government first hosted discussions that would eventually lead to the creation of ICANN, around 94% of Internet hosts were located in the 31 industrialized countries of the OECD. Today the figure is closer to 50%. China has the most broadband subscribers in the world, with over 100 million users, and Chinese has surpassed English as the dominant language on the Web. There are more than 35 million bloggers in China alone. China also has the most mobile phone subscribers, with more than 500 million users. India is coming up fast, and together with China its companies are the biggest owners of undersea fiber-optic Internet cables.

 

At the same time, the Gulf states are pouring some of their enormous oil wealth into major IT initiatives, allocating mobile phone licenses and even buying mobile networks around the world. Saudi Arabia -- where mobile phone penetration has gone from 6% to 60% and Internet use has grown three-fold since 2000 -- is investing $200 billion to create six new "economic cities" to attract international businesses, particularly high-tech firms.

 

Meanwhile, Africa has the highest rate of new mobile phone subscriptions in the world; in many countries the number of new users more than doubles annually. Even in countries where many people live on less than $1 a day and gross domestic product is actually falling, mobile phone adoption keeps increasing. Worldwide, 1.6 million new mobile phone subscribers are added every day. More broadly, in 2006, for the first time in history, more than half of the world's gross domestic product came from developing countries.

 

The striking thing about these trends is that the developing world is joining the information society using a different model from the West's. Instead of one person, one PC, as in industrialized countries, computers are more commonly shared among many users, and the mobile phone is the device through which most people participate on the network. Today it is used mainly for phone calls -- the networks and devices do not support much Internet access, and illiteracy is a major issue. But the variety and richness of mobile services are increasing rapidly, tailored to local needs. In time, the phones will in effect be primitive Internet devices.

 

Moreover, they may operate in ways that differ from today's Internet. For instance, in the One Laptop Per Child project, the networking modules for the $100 laptops are being designed to enable peer-to-peer communications rather than just linking onto the Internet backbone. This means that more network traffic may move off the public Internet and be privately routed. New addressing systems might be created to make this smoother, bypassing the traditional DNS.
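
As a toy illustration of peer routing -- invented for this essay, not the OLPC project's actual mesh protocol -- consider how a message might hop laptop-to-laptop without ever touching a backbone:

```python
from collections import deque

def mesh_route(links, src, dst):
    """Find a hop-by-hop path over direct laptop-to-laptop radio
    links using breadth-first search. A toy stand-in for mesh
    routing: node names are local identifiers, not DNS names,
    and no packet touches the public Internet backbone."""
    frontier = deque([[src]])
    seen = {src}
    while frontier:
        path = frontier.popleft()
        if path[-1] == dst:
            return path
        for peer in links.get(path[-1], set()) - seen:
            seen.add(peer)
            frontier.append(path + [peer])
    return None  # no chain of peers reaches the destination

# Four laptops, each in radio range of its neighbors only:
classroom = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"c"}}
print(mesh_route(classroom, "a", "d"))  # -> ['a', 'b', 'c', 'd']
```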

 

Furthermore, mobile phone numbers, rather than ICANN's domain names, may become the most common identifiers people use online. This would give developing nations more control over information than they could ever enjoy via ICANN, since phone numbers and networks fall under national telecoms regulators. It is a power China exercised when it censored mobile phone SMS messages during the SARS outbreak in 2003.
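
A bridge between the two identifier worlds already exists: the ENUM standard (RFC 3761) maps phone numbers into the DNS, with delegation that follows national numbering plans. A minimal sketch of the mapping, using a fictitious number:

```python
def enum_domain(e164_number):
    """Map an E.164 phone number to its ENUM domain name per
    RFC 3761: keep the digits, reverse them, separate them with
    dots and append the e164.arpa suffix."""
    digits = [c for c in e164_number if c.isdigit()]
    return ".".join(reversed(digits)) + ".e164.arpa"

# A fictitious UK number: control of the resulting DNS subtree
# follows the national numbering plan, not ICANN's registries.
print(enum_domain("+44 20 7946 0123"))
# -> 3.2.1.0.6.4.9.7.0.2.4.4.e164.arpa
```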

 

The rise of the developing world online affects how the Internet's infrastructure is managed. For instance, a wholesale rate of a few dollars for a domain name is prohibitively expensive in many countries -- an issue to which ICANN is sensitive. It also throws a spotlight on Internet governance in a world in which the 5 billion people who live in poor countries must share network resources with the 1 billion already connected. (As Chinese officials used to grumble in the late 1990s, there were more IP addresses at Stanford University than in all of the Middle Kingdom.) How IP addresses are allocated and how root servers are maintained and deployed may come under scrutiny. Most importantly, it poses embarrassing questions to ICANN about why it is taking so long to introduce "internationalized" domain names, so that people can use local scripts to send emails and navigate the Web.

 

 

V. Conclusion: The Heraclitean Internet

 

Taken together, the forces of ubiquitous networking, new Internet architecture and the developing world's network growth render today's Internet governance discussions somewhat passé. The magnitude of these changes is on a scale similar to the revolution of the Internet itself relative to the telephone system -- a change that is still being digested by the telecoms industry, policymakers and society.

 

The Internet is only 35 years old and, as a mainstream medium, not much older than a decade. Yet already there have been many iterations. In 1969 the national backbone ran at 56 kilobits per second; by 1997 that speed was possible on a home modem; in 2007 users in Japan, Korea and Hong Kong enjoy 100-megabit-per-second access. When the network was first built, it linked a handful of research computers at American universities and institutes and supported several hundred users, each of whom had to be approved to go online. Commercial traffic was forbidden. Domain names were not created until some 15 years later, in 1985, and today seem an archaic technology. There is no reason to believe they must continue to exist in the future.

 

This Internet history bears remembering, since it highlights the degree to which the network we use today is not set in stone but mutable, plastic, ever-changing. Likewise its "governance," viewed in historical perspective, is a series of changing rules and rulers. First, officials from DARPA, the US military's research arm, called the shots -- though they largely let the engineers from academia do what they considered best, a process referred to as "Internet self-governance" (later, the term "self" would get left out). Then the academic funding agency, the NSF, had control, but again deferred to the "Internet community." This grouping of researchers and network operators from academia and industry created around a dozen organizational structures over two decades, each with a new abbreviation: ICCP, NWG, IAB, IESG, IETF and IANA, to name a few. What they actually stand for is not so important; the end result was that a set of institutions and mechanisms was established to manage the network.

 

Yet they never lasted long. One notable feature of the history of Internet governance is that institutions sometimes did not adapt to changes in the network. Instead, they became obsolete and were superseded by new ones. By 1998, because these self-governance processes were considered too informal and relied too much on the US government, the US privatized and internationalized the system -- by creating ICANN. Ironically, the group then spent most of its time fighting off criticism that it was too informal and too American.

 

If the past offers a lesson, it is that both the network and its governance system are in a constant state of transformation, not something static to which a fixed definition or set of rules can be applied for ever after. In the Internet's early stages, both protocols and policy were made on the fly by engineers addressing concerns as they emerged. But in trying to formalize this with ICANN, policy became "ex ante" (ie, fixed in advance) rather than "emergent" (ie, continually revised in light of changing circumstances).

 

The mismatch is that while ICANN (like any administrative institution) sets rigid policies, the technology remains emergent and ever-changing -- as witnessed by the rise of ubiquitous networking, new Internet architecture and telecoms in the developing world. This tension is inherent to ICANN and a reason why it is by nature a conservative force. Ultimately, the Internet is like Heraclitus's river: just as we never step into the same stream twice, so too we never log onto the same network twice. Will today's Internet governance institutions prove as fluid? What are the consequences if they do not?

 

_______________

 

Kenneth Neil Cukier is a correspondent for The Economist and is writing a book on the history of Internet governance.