The Internet™: A Solution for Openness Through Closedness

 

By Kenneth Neil Cukier *

 

 

Summary: The Internet is facing numerous challenges to its management and architecture. In reaction to those who advocate change, proponents of the current system are trying to hold its existing design in place (be it policy over DNS management, or laws for network neutrality). However, codifying how the Internet ought to function may inhibit its natural evolution. This paper proposes that trademark law be used to define what constitutes “The Internet.” This may ensure that the basic principles that account for the Internet’s success are preserved, while also permitting diversity and experimentation with new models of data-networking.

 

 

I. Introduction

 

In 1990, few had heard of the Internet; in 2000, everyone praised the Internet; in 2010, there is a risk that the Internet will cease to exist. Obviously, this is a melodramatic prediction, but if one defines the Internet by its chief attributes -- its end-to-end character; its famous dot-com and other addresses -- it could turn out to be true.

 

This is because in a number of small but important ways, the Internet’s design and policy institutions are being challenged by the twin forces of large telecoms operators on one side, and governments around the world on the other. In half a decade’s time, a network that we shall still call the Internet will surely exist -- but it may not embody the same principles of openness that we enjoy today.

 

This might happen because telecom operators or governments change the underlying tenets of the Internet to suit their commercial or political interests. But it could just as easily happen because the very people who are most supportive of the Internet’s current design may try to preserve it in a way that essentially encases it in amber.

 

This conundrum can be seen in the two biggest disputes pitting the current conception of the Internet against those who call for change: the control of the domain name system (DNS), and network neutrality (NN). There are many other matters, such as peering, network-address translation and efforts by carriers to build “next generation networks,” which are being discussed at the International Telecommunication Union. But DNS management and NN have generated the most tension, because any change in the current setup is perceived as a revocation of the Internet’s original design, which is responsible for its success. It would undermine the Internet’s universality (with the DNS) and its capacity for innovation (with NN), say the critics of change.

 

The DNS issue has been controversial for a decade. The US oversees how Internet addressing works, a vestige of having funded the network’s initial creation. Increasingly, other countries believe that a global resource such as the Internet should be managed by the international community. Meanwhile, the question of network neutrality has emerged in recent years as telecom carriers (mainly in the US but increasingly elsewhere) have considered charging customers based on the type of traffic carried over their lines rather than simply offering neutral pipes for content. Such a commercial strategy would centralize a network that has proven so innovative precisely because of its historic decentralization.
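
To make concrete why control of the root matters, consider the first step of any DNS lookup: a resolver must ask whichever root servers it is configured to trust where a top-level domain lives. The sketch below -- a minimal illustration only, which assumes the third-party dnspython package rather than anything in the governance debate itself -- asks one of the ICANN root servers for the delegation of the “com” zone. A network built on an alternative root would simply point this first query at different servers.

    # Minimal sketch of the first step of DNS resolution: asking a root
    # server who is authoritative for a top-level domain.
    # Requires the third-party "dnspython" package (pip install dnspython).
    import dns.message
    import dns.query

    ICANN_ROOT = "198.41.0.4"  # a.root-servers.net, one of the ICANN root servers

    # Ask the root: which nameservers hold the "com" zone?
    query = dns.message.make_query("com.", "NS")
    response = dns.query.udp(query, ICANN_ROOT, timeout=5)

    # The referral returned here is the delegation -- the point of control
    # that the DNS-governance dispute is ultimately about.
    for rrset in response.answer + response.authority:
        print(rrset)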

 

 

II. The Problem

 

On the surface, the two matters are unrelated: one a concern of national sovereignty and international relations; the other a commercial question of market power by private firms. But seen in another way, they are two heads of the same hydra: in both cases, the reaction by those who oppose the changes is deeply conservative. They seek to retain what they regard as the network’s current beneficial attributes by force of law and policy, rather than permit an openness that would let the Internet evolve in different ways and leave it to users, through the market, to decide which network approach is best.

 

This is ironic. Just as free speech is not intended for views with which one agrees but for those one finds reprehensible, so too an open network should conceivably be open to change, even if it may winnow some of that very openness in return for other benefits (such as quality of service, security, diversity, or “fairness” in its management). Instead, the response by opponents of change is to enshrine the status quo in policy (in the case of the DNS) and federal law (in the case of NN in the United States).

 

Even if the proposed changes to the Internet are troubling, the reaction is problematic. The Internet’s most successful feature and defining characteristic is its ability to evolve dynamically. Tying down any part of it -- be it its protocols, architecture or institutional framework -- is rather “un-Internet-like.” It risks harming the very thing that its supporters most want to preserve: the Internet’s ability to innovate. It would probably even create new, unforeseeable problems. Moreover, it comes at a time when the earliest tenets of the Internet are being reexamined by the engineering community, which is investigating new network architectures with a clean-slate approach (e.g., the US National Science Foundation initiatives called FIND and GENI).

 

The dilemma, therefore, is how to reconcile the inherent tension between keeping the Internet as it currently is (with the ICANN-sanctioned DNS, and with NN as its underlying principle) and retaining the network’s ability to evolve and mature (even if this throws those very attributes into doubt). The Internet must be stable at the same time as it remains open and flexible -- and invites change, using market forces to determine how its development takes place.

 

Neither the proponents of change nor the supporters of the status quo have come up with a good way to let this evolution happen. Locking the Internet in place (for example, by mandating only one DNS system, or obligating NN) represents a “closed” rather than “open” solution and undermines the Internet’s generativity. Yet the opposite approach, casting the Internet to the winds of the market alone or government regulation alone, risks serious problems too.

 

On the one hand, markets sometimes suffer failures that become known only after harm is done, when it is usually too late to reverse the damage. On the other hand, legal remedies often pre-judge how technologies ought to function, which freezes the current system in place and makes subsequent innovations harder to implement. History offers a powerful lesson in this respect: the public switched telephone network evolved slowly precisely because of government oversight of telecommunications -- it took the Internet, a revolution in communications design, to overthrow that regime.

 

 

 

 

III. The Solution

 

In such a polarized situation, the goals appear mutually exclusive: keeping the Internet as it is, while enabling the Internet to change. It would seem that the world can’t have it both ways; there is only one Internet, after all. However, this is actually a presumption, not a fact -- and the presumption is faulty. Why not let many different “internets” (with a lower-case “i”) blossom? That is, let numerous approaches to data-networking using the underlying TCP/IP protocol prove themselves in the market, at any networking layer, and see which are worth keeping and which should fall by the wayside.

 

To “have it both ways” in this respect -- retaining the Internet of today while encouraging the potential improvements of tomorrow -- trademark law could be used: “The Internet™”. This would represent the Internet as the network currently exists (with NN and the ICANN-sanctioned DNS), but clearly delineated and protected by intellectual property rights. A trademark mechanism would not just usefully “lock in” what many see as the Internet’s positive attributes; it would also enable the network to co-exist easily with alternative approaches, rather than preclude them or treat them as hostile threats (as is the case today, where what constitutes “the Internet” is amorphous).

 

Putting this idea into practice would not be especially hard. It would be necessary to define what the Internet is, based on its current principles. Although this sounds difficult, there is actually already much consensus on the topic: the philosophy behind the network has been the subject of enormous amounts of discussion over decades, from the “end-to-end” arguments articulated by David Clark and his colleagues in the 1980s to definitions offered by people as diverse as Robert Kahn and Karl Auerbach; even the US Federal Communications Commission and other regulatory bodies have defined the Internet, with only narrow differences. From this, a baseline definition could emerge.

 

An institution would be needed to carry out this process, so that a definition can be established and evolve over time. The organization would also need to hold the trademark legally and uphold it in cases of misuse. The risk is that the group’s process simply re-creates the problems that bedeviled ICANN (e.g., the demands of being open, transparent, accountable, internationally representative, multi-stakeholder, etc.). But it need not be so controversial, since its legitimacy would be founded only on the basis of those who choose to recognize it, rather than on any formal power it holds (much like the Internet’s technical process of “rough consensus” in the era of Jon Postel).

 

One organization that could take on this role is the Internet Society (ISOC), which was founded by the network’s earliest engineers in 1992 for just this type of “political” purpose: to serve as a bridge between regulation and technology. ISOC already serves as the custodian of the Internet technical-standards documents produced by the IETF. It also has experience in this very area: in the late 1990s, ISOC funded a legal battle to strip a private company of a trademark on the term “Internet,” which it had been granted by the US Patent and Trademark Office. (The firm applied the term to its network of automatic cash machines, which post-dated TCP/IP; thus there was clear prior use.)

 

Once “The Internet™” is established, the market can do its job. First, this would spark disclosure by service providers as to whether they adhere to the concept of NN and route traffic using the ICANN DNS (of course, they could always supplement this with additional online addressing and navigational systems). Second, network operators that advertise themselves as providing “The Internet™” service would have to uphold those principles. If not, they would likely be in breach of consumer-protection rules such as those against false advertising or deceptive trading practices, and regulatory agencies could take action. Telecom operators that misled customers might also open themselves to liability from shareholder lawsuits if they were publicly traded.
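
Such disclosure would also be testable in practice. As a purely hypothetical illustration -- the probe names, the “.web” alternative-root suffix and the use of the third-party dnspython package are all assumptions of this sketch, not part of any existing regime -- a consumer or regulator could check whether a provider’s resolver answers only for ICANN-rooted names:

    # Hypothetical probe: does the provider's resolver answer for names
    # that exist only in an alternative root? If so, it is not serving
    # the pure ICANN DNS. Assumes dnspython 2.x (pip install dnspython).
    import dns.exception
    import dns.resolver

    def answers_for(name: str, resolver: dns.resolver.Resolver) -> bool:
        """Return True if the resolver returns an A record for the name."""
        try:
            resolver.resolve(name, "A")
            return True
        except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer,
                dns.resolver.NoNameservers, dns.exception.Timeout):
            return False

    provider = dns.resolver.Resolver()  # the resolver the ISP hands out
    icann_name = "example.com."         # exists in the ICANN root
    alt_name = "example.web."           # hypothetical alternative-root name

    if answers_for(alt_name, provider):
        print("Resolver includes a namespace beyond the ICANN DNS")
    elif answers_for(icann_name, provider):
        print("Resolver behaves like the pure ICANN DNS for these probes")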

 

Also, those network operators would be contravening the terms of use of the trademarked term, which could carry penalties of its own. Yet the disclosure principle alone might be enough to compel network operators to comply with good “Internet” practices, since educated consumers might refuse to use operators that did otherwise.

 

The benefit of this approach is that network service providers, like the AOLs and CompuServes of yesteryear, could adopt new approaches to data-networking that differ from the current design of the Internet, such as routing alternative domains or charging users based on traffic type. Doing so would not be illegal, as it would be under currently proposed legislation. Additionally, the world’s first commercial “Darknet” (a private, anonymous Internet-like network) is poised to start operation in Sweden in 2006, which suggests that the day is nearing when a definitional understanding of what constitutes the Internet will be needed. If it is not defined on a private-sector basis, the Internet may become subsumed as a “public” telecommunications service, which would create a new set of regulatory concerns.

 

Over time, “The Internet™” might emerge as a sort of brand, such as Dolby Noise Reduction or the “CE” logo that appears on many electrical products cleared for use in the European Union. Furthermore, it would most likely remain the default or de facto standard, as it is today (since for the moment no real alternatives exist, which itself speaks to the utility of the existing Internet’s design, as well as the difficulty of modifying it). “The Internet™” might become like “Fair Trade” products and appeal to a niche of customers. But it is more probable that it would be like publicly traded companies in the US having their financial accounts certified by one of the major accountancy firms rather than an unknown one. That is, “The Internet™” might be adhered to as a matter of course by operators, to the degree that it is taken for granted by consumers.

 

Under this system, alternative approaches would not be presumptively prohibited or precluded from being introduced -- as they might be in the absence of a way for different network deployments to co-exist peacefully. Trademark law, in this respect, may be the way to make such co-existence possible.

 

 

IV. Conclusion

 

This is a “Gordian Knot” approach -- a single, swift action that constitutes a clean break to solve the problem. It presumes that holding on so firmly to the current state of the Internet actually risks strangling it, akin to the tale of the man who loved a bird so much that he grasped it tightly, only to discover that he had crushed it.

 

The use of trademark law is permissive rather than restrictive -- it allows the new, while clearly demarcating the traditional approach and providing a formal way for it to remain intact. This is particularly important as the Internet moves forward. Where in the past the chief threat to the Internet came from the potential for a plethora of variants that jeopardized its universality, today the central risk is a monolithic uniformity that undermines the network’s ability to change.

 

One fear of this approach may be that multiple variants of the network would “fracture” the net; this is the argument against differentiated service (which erases the benefit of NN) and alternative roots (which dilute the authority of the ICANN DNS). Yet these concerns are overblown. There are so many examples of multiple technical standards in everything from wireless communications (CDMA versus UMTS, as well as differing frequency bands) to electrical voltage (110, 220 and others) that heterogeneity is the norm, and the Internet’s uniformity the exception. Moreover, technology is able to mold itself to human needs. Mechanisms that act as translators, converters, gateways and bridges emerge in other domains, be it currency trading or language; they would certainly develop in this situation, too. Indeed, such bridging is often easier in technology than elsewhere.
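
What might such a bridge look like in the narrow case of multiple DNS roots? A minimal sketch follows, assuming the third-party dnspython package and an invented alternative-root resolver (the 203.0.113.53 address is a documentation placeholder, not a real service): a resolver shim that consults the ICANN namespace first and falls back to the alternative one.

    # Minimal sketch of a "bridging" resolver for a world with several
    # DNS roots: try the ICANN namespace first, then a hypothetical
    # alternative root. Assumes dnspython 2.x; the alternative-root
    # address is an invented placeholder from the documentation range.
    import dns.exception
    import dns.resolver

    ICANN_RESOLVER = "8.8.8.8"          # a public resolver on the ICANN root
    ALT_ROOT_RESOLVER = "203.0.113.53"  # hypothetical alternative-root resolver

    def bridged_lookup(name: str) -> list[str]:
        """Resolve a name against each root in turn, ICANN first."""
        for server in (ICANN_RESOLVER, ALT_ROOT_RESOLVER):
            resolver = dns.resolver.Resolver(configure=False)
            resolver.nameservers = [server]
            try:
                return [rr.to_text() for rr in resolver.resolve(name, "A")]
            except dns.exception.DNSException:
                continue  # not in this namespace; try the next root
        return []

    print(bridged_lookup("example.com."))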

 

Ultimately, “The Internet™” is a mechanism of closedness that preserves openness. That is, it uses trademark law’s restrictiveness to maintain the current Internet approach as the standard, but does this at the same time as it welcomes newer approaches. Moreover, it leaves it to users themselves to determine which system they wish to use. The Internet, like markets, has always relied on experimentation, diversity and free choice as the way to decide which technologies get deployed, rather than deferring to decisions made before the fact by a central authority that presumes to know best.

 

It would be peculiar if a network characterized by continual change had elements that could never change. Yet the risk is that a well-meaning attempt to preserve the Internet’s openness could actually render it closed. Enshrining the Internet in trademark law would enable it to evolve rather than remain static. In the past, we spelled the Internet with a capital “I”; now we may wish to capitalize the “T” as well: “The Internet™”.

 

________________________________

 

* Technology and Telecommunications Correspondent, The Economist, London, UK.

Contact: KennethCukier@economist.com

 

 

# # #