There was a problem, however (there always is): UNIX belonged to AT&T, and AT&T had licensed it to multiple manufacturers over the years, in addition to allowing the source code to circulate more or less with abandon throughout the world and to be ported to a wide variety of different machine architectures. Such proliferation, albeit haphazard, was a dream come true: a single, interoperable operating system running on all kinds of hardware. Unfortunately, proliferation would also undo that dream, because it meant that as the markets for workstations and operating systems heated up, the existing versions of UNIX hardened into distinct and incompatible versions with different features and interfaces. By the mid-1980s, there were multiple competing efforts to standardize UNIX, an endeavour that eventually went haywire, resulting in the so-called UNIX wars, in which "gangs" of vendors (some on both sides of the battle) teamed up to promote competing standards. The story of how this happened is instructive, for it is a story that has been reiterated several times in the computer industry.13

As a hybrid commercial-academic system, UNIX never entered the market as a single thing. It was licensed in various ways to different people, both academic and commercial, and contained additions and tools and other features that may or may not have originated at (or been returned to) Bell Labs. By the early 1980s, the Berkeley Software Distribution was in fact competing with the AT&T version, even though BSD was a sublicensee-and it was not the only one. By the late 1970s and early 1980s, a number of corporations had licensed UNIX from AT&T for use on new machines. Microsoft licensed it (and called it Xenix, rather than licensing the name UNIX as well) to be installed on Intel-based machines. IBM, Unisys, Amdahl, Sun, DEC, and Hewlett-Packard all followed suit and created their own versions and names: HP-UX, A/UX, AIX, Ultrix, and so on. Given the ground rules of trade secrecy and intellectual property, each of these licensed versions needed to be made legally distinct-if they were to compete with each other. Even if "UNIX" remained conceptually pure in an academic or pedagogical sense, every manufacturer would nonetheless have to tweak, to extend, to optimize in order to differentiate. After all, "if companies do nothing but adhere to standards, then all computers will become commodities, and nobody will be able to make any money."14

It was thus unlikely that any of these corporations would contribute the changes they made to UNIX back into a common pool, and certainly not back to AT&T, which, subsequent to the 1984 divestiture, finally released its own commercial version of UNIX, called UNIX System V. Very quickly, the promising "open" UNIX of the 1970s became a slough of alternative operating systems, each incompatible with the next thanks to the addition of market-differentiating features and hardware-specific tweaks. According to Pamela Gray, "By the mid-1980s, there were more than 100 versions in active use" centered around the three market leaders, AT&T's System V, Microsoft/SCO Xenix, and the BSD.15 By 1984, the differences in systems had become significant-as in the case of the BSD additions of the TCP/IP protocols, the vi editor, and the Pascal compiler-and created not only differentiation in terms of quality but also incompatibility at both the software and networking levels.

Different systems of course had different user communities, based on who was the customer of whom. Eric Raymond suggests that in the mid-1980s, independent hackers, programmers, and computer scientists largely followed the fortunes of BSD: "The divide was roughly between longhairs and shorthairs; programmers and technical people tended to line up with Berkeley and BSD, more business-oriented types with AT&T and System V. The longhairs, repeating a theme from Unix's early days ten years before, liked to see themselves as rebels against a corporate empire; one of the small companies put out a poster showing an X-wing-like space fighter marked "BSD" speeding away from a huge AT&T 'death star' logo left broken and in flames."16

So even though UNIX had become the standard operating system of choice for time-sharing, multi-user, high-performance computers by the mid-1980s, there was no such thing as UNIX. Competitors in the UNIX market could hardly expect the owner of the system, AT&T, to standardize it and compete with them at the same time, and the rest of the systems were in some legal sense still derivations from the original AT&T system. Indeed, in its licensing pamphlets, AT&T even insisted that UNIX was not a noun, but an adjective, as in "the UNIX system."17

The dawning realization that the proliferation of systems was not only spreading UNIX around the world but also spreading it thin and breaking it apart led to a series of increasingly startling and high-profile attempts to "standardize" UNIX. Given that the three major branches (BSD, which would become the industry darling as Sun's Solaris operating system; Microsoft, and later SCO Xenix; and AT&T's System V) all emerged from the same AT&T and Berkeley work done largely by Thompson, Ritchie, and Joy, one would think that standardization would be a snap. It was anything but.

Figuring Out Goes Haywire.

Figuring out the moral and technical order of open systems went haywire around 1986-88, when there were no fewer than four competing international standards, represented by huge consortia of computer manufacturers (many of whom belonged to multiple consortia): POSIX, the X/Open consortium, the Open Software Foundation, and UNIX International. The blind spot of open systems had much to do with this crazy outcome: academics, industry, and government could not find ways to agree on standardization. One goal of standardization was to afford customers choice; another was to allow competition unconstrained by "artificial" means. A standard body of source code was impossible; a standard "interface definition" was open to too much interpretation; government and academic standards were too complex and expensive; no particular corporation's standard could be trusted (because they could not be trusted to reveal it in advance of their own innovations); and worst of all, customers kept buying, and vendors kept shipping, and the world was increasingly filled with diversity, not standardization.

UNIX proliferated quickly because of porting, leading to multiple instances of an operating system with substantially similar source code shared by academics and licensed by AT&T. But it differentiated just as quickly because of forking, as particular features were added to different ports. Some features were reincorporated into the "main" branch-the one Thompson and Ritchie worked on-but the bulk of these mutations spread in a haphazard way, shared through users directly or implemented in newly formed commercial versions. Some features were just that, features, but others could extend the system in ways that might make an application possible on one version, but not on another.

The proliferation and differentiation of UNIX, the operating system, had peculiar effects on the emerging market for UNIX, the product: technical issues entailed design and organizational issues. The original UNIX looked the way it did because of the very peculiar structure of the organization that created and sustained UNIX: Bell Labs and the worldwide community of users and developers. The newly formed competitors, conceiving of UNIX as a product distinct from the original UNIX, adopted it precisely because of its portability and because of the promise of open systems as an alternative to "big iron" mainframes. But as UNIX was funneled into existing corporations with their own design and organizational structures, it started to become incompatible with itself, and the desire for competition in open systems necessitated efforts at UNIX standardization.

The first step in the standardization of open systems and UNIX was the creation of what was called an "interface definition," a standard that enumerated the minimum set of functions that any version of UNIX should support at the interface level, meaning that any programmer who wrote an application could expect to interact with any version of UNIX on any machine in the same way and get the same response from the machine (regardless of the specific implementation of the operating system or the source code that was used). Interface definitions, and extensions to them, were ideally to be published and freely available.
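
The idea of an interface definition can be made concrete with a small sketch, not drawn from any of the standards documents themselves: a program that touches the system only through calls a definition such as POSIX specifies (open, read, write, close) should compile and behave the same on any conforming UNIX, however the underlying implementation is written. The file path here is an arbitrary illustration.

    /* A minimal sketch of application-level portability: the program relies only
     * on calls specified at the interface level, so it should build and behave
     * the same on any UNIX that implements the standard, regardless of how the
     * kernel underneath is written. The file path is arbitrary. */
    #include <fcntl.h>
    #include <unistd.h>

    int main(void)
    {
        char buf[4096];
        ssize_t n;
        int fd = open("/etc/hosts", O_RDONLY);   /* a standardized call, not vendor-specific */

        if (fd == -1)
            return 1;

        while ((n = read(fd, buf, sizeof buf)) > 0) {   /* same semantics on every conforming system */
            if (write(STDOUT_FILENO, buf, (size_t)n) != n) {
                close(fd);
                return 1;
            }
        }

        close(fd);
        return 0;
    }

The point of the standard was that nothing in such a program names a particular vendor's UNIX; the differentiation the manufacturers sought would have to live below this interface.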

The interface definition was a standard that emphasized portability, not at the source-code or operating-system level, but at the application level, allowing applications built on any version of UNIX to be installed and run on any other. The push for such a standard came first from a UNIX user group founded in 1980 by Bob Marsh and called, after the convention of file hierarchies in the UNIX interface, "/usr/group" (later renamed Uniforum). The 1984 /usr/group standard defined a set of system calls, which, however, "was immediately ignored and, for all practical purposes, useless."18 It seemed the field was changing too fast and UNIX proliferating and innovating too widely for such a standard to work.

The /usr/group standard nevertheless provided a starting point for more traditional standards organizations-the Institute of Electrical and Electronics Engineers (IEEE) and the American National Standards Institute (ANSI)-to take on the task. Both institutions took the /usr/group standard as a basis for what would be called IEEE P1003 Portable Operating System Interface for Computer Environments (POSIX). Over the next three years, from 1984 to 1987, POSIX would work diligently at providing a standard interface definition for UNIX.

Alongside this development, the AT&T version of UNIX became the basis for a different standard, the System V Interface Definition (SVID), which attempted to standardize a set of functions similar but not identical to the /usr/group and POSIX standards. Thus emerged two competing definitions for a standard interface to a system that was rapidly proliferating into hundreds of tiny operating-system fiefdoms.19 The danger of AT&T setting the standard was not lost on any of the competing manufacturers. Even if they created a thoroughly open standard-interface definition, AT&T's version of UNIX would be the first to implement it, and they would continually have privileged knowledge of any changes: if they sought to change the implementation, they could change the standard; if they received demands that the standard be changed, they could change their implementation before releasing the new standard.

In response to this threat, a third entrant into the standards race emerged: X/Open, which comprised a variety of European computer manufacturers (including AT&T!) and sought to develop a standard that encompassed both SVID and POSIX. The X/Open initiative grew out of European concern about the dominance of IBM and originally included Bull, Ericsson, ICL, Nixdorf, Olivetti, Philips, and Siemens. In keeping with a certain 1980s taste for the integration of European economic activity vis-a-vis the United States and Japan, these manufacturers banded together both to distribute a unified UNIX operating system in Europe (based initially on the BSD and Sun versions of UNIX) and to attempt to standardize it at the same time.

X/Open represented a subtle transformation of standardization efforts and of the organizational definition of open systems. While the /usr/group standard was developed by individuals who used UNIX, and the POSIX standard by an acknowledged professional society (IEEE), the X/Open group was a collective of computer corporations that had banded together to fund an independent entity to help further the cause of a standard UNIX. This paradoxical situation-of a need to share a standard among all the competitors and the need to keep the details of that standardized product secret to maintain an advantage-was one that many manufacturers, especially the Europeans with their long experience of IBM's monopoly, understood as mutually destructive. Hence, the solution was to engage in a kind of organizational innovation, to create a new form of metacorporate structure that could strategically position itself as at least temporarily interested in collaboration with other firms, rather than in competition. Thus did stories and promises of open systems wend their way from the details of technical design to those of organizational design to the moral order of competition and collaboration, power and strategy. "Standards" became products that corporations sought to "sell" to their own industry through the intermediary of the consortium.

In 1985 and 1986 the disarrayed state of UNIX was also frustrating to the major U.S. manufacturers, especially to Sun Microsystems, which had been founded on the creation of a market for UNIX-based "workstations," high-powered networked computers that could compete with mainframes and personal computers at the same time. Founded by Bill Joy, Vinod Khosla, and Andreas Bechtolsheim, Sun had very quickly become an extraordinarily successful computer company. The business pages and magazines were keen to understand whether workstations were viable competitors to PCs, in particular to those of IBM and Microsoft, and the de facto standard DOS operating system, for which a variety of extremely successful business-, personal-, and home-computer applications were written.

Sun seized on the anxiety around open systems, as is evident in the ad it ran during the summer of 1987 (figure 4). The ad plays subtly on two anxieties: the first is directed at the consumer and suggests that only with Sun can one actually achieve interoperability among all of one business' computers, much less across a network or industry; the second is more subtle and plays to fears within the computer industry itself, the anxiety that Sun might merge with one of the big corporations, AT&T or Unisys, and corner the market in open systems by producing the de facto standard.

4a and 4b. Open systems anxiety around mergers and compatibility. Sun Microsystems advertisement, Wall Street Journal, 9 July 1987.

In fact, in October 1987 Sun announced that it had made a deal with AT&T. AT&T would distribute a workstation based on Sun's SPARC line of workstations and would acquire 20 percent of Sun.20 As part of this announcement, Sun and AT&T made clear that they intended to merge two of the dominant versions of UNIX on the market: AT&T's System V and the BSD-derived Solaris. This move clearly frightened the rest of the manufacturers interested in UNIX and open systems, as it suggested a kind of super-power alignment that would restructure (and potentially dominate) the market. A 1988 article in the New York Times quotes an industry analyst who characterizes the merger as "a matter of concern at the highest levels of every major computer company in the United States, and possibly the world," and it suggests that competing manufacturers "also fear that AT&T will gradually make Unix a proprietary product, usable only on AT&T or Sun machines."21 The industry anxiety was great enough that in March Unisys (a computer manufacturer, formerly Burroughs-Sperry) announced that it would work with AT&T and Sun to bring UNIX to its mainframes and to make its business applications run on UNIX. Such a move was tantamount to Unisys admitting that there would be no future in proprietary high-end computing-the business on which it had hitherto built its reputation-unless it could be part of the consortium that could own the standard.22

In response to this perceived collusion, a group of U.S. and European companies banded together to form another rival organization-one that partially overlapped with X/Open but now included IBM-this one called the Open Software Foundation. A nonprofit corporation, the foundation included IBM, Digital Equipment, Hewlett-Packard, Bull, Nixdorf, Siemens, and Apollo Computer (Sun's most direct competitor in the workstation market). Their goal was explicitly to create a "competing standard" for UNIX that would be available on the hardware they manufactured (and based, according to some newspaper reports, on IBM's AIX, which was to be called OSF/1). AT&T appeared at first to support the foundation, suggesting that if the Open Software Foundation could come up with a standard, then AT&T would make System V compatible with it. Thus, 1988 was the summer of open love. Every major computer manufacturer in the world was now part of some consortium or another, and some were part of two-each promoting a separate standard.

Of all the corporations, Sun did the most to brand itself as the originator of the open-systems concept. They made very broad claims for the success of open-systems standardization, as for instance in an ad from August 1988 (figure 5), which stated in part: But what's more, those sales confirm a broad acceptance of the whole idea behind Sun.

The Open Systems idea. Systems based on standards so universally accepted that they allow combinations of hardware and software from literally thousands of independent vendors. . . . So for the first time, you're no longer locked into the company who made your computers. Even if it's us.

The ad goes on to suggest that "in a free market, the best products win out," even as Sun played both sides of every standardization battle, cooperating both with AT&T and with the Open Software Foundation. But by October of that year, it was clear to Sun that

5. It pays to be open: Sun's version of profitable and successful open systems. Sun Microsystems advertisement, New York Times, 2 August 1988.

the idea hadn't really become "so universal" just yet. In that month AT&T and Sun banded together with seventeen other manufacturers and formed a rival consortium: Unix International, a coalition of the willing that would back the AT&T UNIX System V version as the one true open standard. In a full-page advertisement from Halloween of 1988 (figure 6), run simultaneously in the New York Times, the Washington Post, and the Wall Street Journal, the rhetoric of achieved success remained, but now instead of "the Open Systems idea," it was "your demand for UNIX System V-based solutions that ushered in the era of open architecture." Instead of a standard for all open systems, it was a war of all against all, a war to assure customers that they had made, not the right choice of hardware or software, but the right choice of standard.

The proliferation of standards and standards consortia is often referred to as the UNIX wars of the late 1980s, but the creation of such consortia did not indicate clearly drawn lines. Another metaphor that seems to have been very popular in the press at the time was that of "gang" warfare (no doubt helped along by the creation of another industry consortium, informally called the Gang of Nine, which was involved in a dispute over whether MicroChannel or EISA buses should be installed in PCs). The idea of a number of companies forming gangs to fight with each other, Bloods-and-Crips style-or perhaps more Jets-and-Sharks style, minus the singing-was no doubt an appealing metaphor at the height of Los Angeles's very real and high-profile gang warfare. But as one article in the New York Times pointed out, these were strange gangs: "Since 'openness' and 'cooperation' are the buzzwords behind these alliances, the gang often asks its enemy to join. Often the enemy does so, either so that it will not seem to be opposed to openness or to keep tabs on the group. IBM was invited to join the Corporation for Open Systems, even though the clear if unstated motive of the group was to dilute IBM's influence in the market. AT&T negotiated to join the Open Software Foundation, but the talks collapsed recently. Some companies find it completely consistent to be members of rival gangs. . . . About 10 companies are members of both the Open Software Foundation and its archrival Unix International."23

The proliferation of these consortia can be understood in various ways. One could argue that they emerged at a time-during the Reagan administration-when antitrust policing had diminished to

6. The UNIX Wars, Halloween 1988. UNIX International advertisement, Wall Street Journal and New York Times, 31 October 1988.

the point where computer corporations did not see such collusion as a risky activity vis-a-vis antitrust policing. One could also argue that these consortia represented a recognition that the focus on hardware control (the meaning of proprietary) had been replaced with a focus on the control of the "open standard" by one or several manufacturers, that is, that competition was no longer based on superior products, but on "owning the standard." It is significant that the industry consortia quickly overwhelmed national efforts, such as the IEEE POSIX standard, in the media, an indication that no one was looking to government or nonprofits, or to university professional societies, to settle the dispute by declaring a standard, but rather to industry itself to hammer out a standard, de facto or otherwise. Yet another way to understand the emergence of these consortia is as a kind of mutual policing of the market, a kind of paranoid strategy of showing each other just enough to make sure that no one would leapfrog ahead and kill the existing, fragile competition.

What this proliferation of UNIX standards and consortia most clearly represents, however, is the blind spot of open systems: the difficulty of having collaboration and competition at the same time in the context of intellectual-property rules that incompletely capture the specific and unusual characteristics of software. For participants in this market, the structure of intellectual property was unassailable-without it, most participants assumed, innovation would cease and incentives disappear. Despite the fact that secrecy haunted the industry, its customers sought both openness and compatibility. These conflicting demands proved irresolvable.

Denouement.

Ironically, the UNIX wars ended not with the emergence of a winner, but with the reassertion of proprietary computing: Microsoft Windows and Windows NT. Rather than open systems emerging victorious, ushering in the era of seamless integration of diverse components, the reverse occurred: Microsoft managed to grab a huge share of computer markets, both desktop and high-performance, by leveraging its brand, the ubiquity of DOS, and application-software developers' dependence on the "Wintel" monster (Windows plus Intel chips). Microsoft triumphed, largely for the same reasons the open-systems dream failed: the legal structure of intellectual property favored a strong corporate monopoly on a single, branded product over a weak array of "open" and competing components. There was no large gain to investors, or to corporations, from an industry of nice guys sharing the source code and making the components work together. Microsoft, on the other hand, had decided to do so internal to itself; it did not necessarily need to form consortia or standardize its operating systems, if it could leverage its dominance in the market to spread the operating system far and wide. It was, as standards observers like to say, the triumph of de facto standardization over de jure. It was a return to the manacled wretches of IBM's monopoly-but with a new dungeon master.

The denouement of the UNIX standards story was swift: AT&T sold its UNIX System Labs (including all of the original source and rights) to Novell in 1993, which sold it in turn to SCO two years later. Novell sold (or transferred) the trademark name UNIX to the X/Open group, which continued to fight for standardization, including a single universal UNIX specification. In 1996 X/Open and the Open Software Foundation merged to form the Open Group.24 The Open Group eventually joined forces with IEEE to turn POSIX into a single UNIX specification in 2001. They continue to push the original vision of open systems, though they carefully avoid using the name or concept, referring instead to the trademarked mouthful "Boundaryless Information Flow" and employing an updated and newly inscrutable rhetoric: "Boundaryless Information Flow, a shorthand representation of 'access to integrated information to support business process improvements' represents a desired state of an enterprise's infrastructure and is specific to the business needs of the organization."25

The Open Group, as well as many other participants in the history of open systems, recognize the emergence of "open source" as a return to the now one true path of boundaryless information flow. Eric Raymond, of course, sees continuity and renewal (not least of which in his own participation in the Open Source movement) and in his Art of UNIX Programming says, "The Open Source movement is building on this stable foundation and is creating a resurgence of enthusiasm for the UNIX philosophy. In many ways Open Source can be seen as the true delivery of Open Systems that will ensure it continues to go from strength to strength."26 This continuity, of course, deliberately disavows the centrality of the legal component, just as Raymond and the Open Source Initiative had in 1998. The distinction between a robust market in UNIX operating systems and a standard UNIX-based infrastructure on which other markets and other activities can take place still remains unclear to even those closest to the money and machines. It does not yet exist, and may well never come to exist.

The growth of Free Software in the 1980s and 1990s depended on openness as a concept and component that was figured out during the UNIX wars. It was during these wars that the Free Software Foundation (and other groups, in different ways) began to recognize the centrality of the issue of intellectual property to the goal of creating an infrastructure for the successful creation of open systems.27 The GNU (GNU's Not Unix) project in particular, but also the X Window System at MIT, the Remote Procedure Call and Network File System (NFS) systems created by Sun, and tools like sendmail and BIND were each in their own way experiments with alternative licensing arrangements and were circulating widely on a variety of the UNIX versions in the late 1980s. Thus, the experience of open systems, while technically a failure as far as UNIX was concerned, was nonetheless a profound learning experience for an entire generation of engineers, hackers, geeks, and entrepreneurs. Just as the UNIX operating system had a pedagogic life of its own, inculcating itself into the minds of engineers as the paradigm of an operating system, open systems had much the same effect, realizing an inchoate philosophy of openness, interconnection, compatibility, interoperability-in short, availability and modifiability-that was in conflict with intellectual-property structures as they existed. To put it in Freudian terms: the neurosis of open systems wasn't cured, but the structure of its impossibility had become much clearer to everyone. UNIX, the operating system, did not disappear at all-but UNIX, the market, did.

Open Systems Two: Networks.

The struggle to standardize UNIX as a platform for open systems was not the only open-systems struggle; alongside the UNIX wars, another "religious war" was raging. The attempt to standardize networks-in particular, protocols for the inter-networking of multiple, diverse, and autonomous networks of computers-was also a key aspect of the open-systems story of the 1980s.28 The war between TCP/IP and OSI was also a story of failure and surprising success: the story of a successful standard with international approval (the OSI protocols) eclipsed by the experimental, military-funded TCP/IP, which exemplified an alternative and unusual standards process. The moral-technical orders expressed by OSI and TCP/IP are, like that of UNIX, on the border between government, university, and industry; they represent conflicting social imaginaries in which power and legitimacy are organized differently and, as a result, expressed differently in the technology.

OSI and TCP/IP started with different goals: OSI was intended to satisfy everyone, to be the complete and comprehensive model against which all competing implementations would be validated; TCP/IP, by contrast, emphasized the easy and robust interconnection of diverse networks. TCP/IP is a protocol developed by bootstrapping between standard and implementation, a mode exemplified by the Requests for Comments system that developed alongside them as part of the Arpanet project. OSI was a "model" or reference standard developed by internationally respected standards organizations.

In the mid-1980s OSI was en route to being adopted internationally, but by 1993 it had been almost completely eclipsed by TCP/IP. The success of TCP/IP is significant for three reasons: (1) availability-TCP/IP was itself available via the network and its development was open to anyone, whereas OSI was a bureaucratically confined and expensive standard and participation was confined to state and corporate representatives, organized through ISO in Geneva; (2) modifiability-TCP/IP could be copied from an existing implementation (such as the BSD version of UNIX) and improved, whereas OSI was a complex standard that had few existing implementations available to copy; and (3) serendipity-new uses that took advantage of availability and modifiability sprouted, including the "killer app" that was the World Wide Web, which was built to function on existing TCP/IP-based networks, convincing many manufacturers to implement that protocol instead of, or in addition to, OSI.

The success of TCP/IP over OSI was also significant because of the difference in the standardization processes that it exemplified. The OSI standard (like all official international standards) is conceived and published as an aid to industrial growth: it was imagined according to the ground rules of intellectual property and as an attempt to facilitate the expansion of markets in networking. OSI would be a "vendor-neutral" standard: vendors would create their own, secret implementations that could be validated by OSI and thereby be expected to interoperate with other OSI-validated systems. By stark contrast, the TCP/IP protocols were not published (in any conventional sense), nor were the implementations validated by a legitimate international-standards organization; instead, the protocols are themselves represented by implementations that allow connection to the network itself (where the TCP/IP protocols and implementations are themselves made available). The fact that one can only join the network if one possesses or makes an implementation of the protocol is generally seen as the ultimate in validation: it works.29 In this sense, the struggle between TCP/IP and OSI is indicative of a very familiar twentieth-century struggle over the role and extent of government planning and regulation (versus entrepreneurial activity and individual freedom), perhaps best represented by the twin figures of Friedrich Hayek and Maynard Keynes. In this story, it is Hayek's aversion to planning and the subsequent privileging of spontaneous order that eventually triumphs, not Keynes's paternalistic view of the government as a neutral body that absorbs or encourages the swings of the market.

Bootstrapping Networks.

The "religious war" between TCP/IP and OSI occurred in the context of intense compet.i.tion among computer manufacturers and during a period of vibrant experimentation with computer networks worldwide. As with most developments in computing, IBM was one of the first manufacturers to introduce a networking system for its machines in the early 1970s: the System Network Architecture (SNA). DEC followed suit with Digital Network Architecture (DECnet or DNA), as did Univac with Distributed Communications Architecture (DCA), Burroughs with Burroughs Network Architecture (BNA), and others. These architectures were, like the proprietary operating systems of the same era, considered closed networks, networks that interconnected a centrally planned and specified number of machines of the same type or made by the same manufacturer. The goal of such networks was to make connections internal to a firm, even if that involved geographically widespread systems (e.g., from branch to headquarters). Networks were also to be products.

The 1970s and 1980s saw extraordinarily vibrant experimentation with academic, military, and commercial networks. Robert Metcalfe had developed Ethernet at Xerox PARC in the mid-1970s, and IBM later created a similar technology called "token ring." In the 1980s the military discovered that the Arpanet was being used predominantly by computer scientists and not just for military applications, and decided to break it into MILNET and CSNET.30 Bulletin Board Services, which connected PCs to each other via modems to download files, appeared in the late 1970s. Out of this grew Tom Jennings's very successful experiment called FidoNet.31 In the mid-1980s an existing social network of university faculty on the East Coast of the United States started a relatively successful network called BITNET (Because It's There Network).32 The Unix to Unix Copy Protocol (uucp), which initially enabled the Usenet, was developed in the late 1970s and widely used until the mid-1980s to connect UNIX computers together. In 1984 the NSF began a program to fund research in networking and created the first large backbones for NSFNet, successor to the CSNET and Arpanet.33 In the 1970s telecommunications companies and spin-off start-ups experimented widely with what were called "videotex" systems, of which the most widely implemented and well-known is Minitel in France.34 Such systems were designed for consumer users and often provided many of the now widespread services available on the Internet in a kind of embryonic form (from comparison shopping for cars, to directory services, to pornography).35 By the late 1970s, videotex systems were in the process of being standardized by the Comité Consultatif International Télégraphique et Téléphonique (CCITT) at the International Telecommunications Union (ITU) in Geneva. These standards efforts would eventually be combined with work of the International Organization for Standardization (ISO) on OSI, which had originated from work done at Honeywell.36

One important feature united almost all of these experiments: the networks of the computer manufacturers were generally piggybacked, or bootstrapped, onto existing telecommunications infrastructures built by state-run or regulated monopoly telecommunications firms. This situation inevitably spelled grief, for telecommunications providers are highly regulated entities, while the computer industry has been almost totally unregulated from its inception. Since an increasingly core part of the computer industry's business involved transporting signals through telecommunications systems without being regulated to do so, the telecommunications industry naturally felt itself at a disadvantage.37 Telecommunications companies were not slow to respond to the need for data communications, but their ability to experiment with products and practices outside the scope of telephony and telegraphy was often hindered by concerns about antitrust and monopoly.38 The unregulated computer industry, by contrast, saw the tentativeness of the telecommunications industry (or national PTTs) as either bureaucratic inertia or desperate attempts to maintain control and power over existing networks-though no computer manufacturer relished the idea of building their own physical network when so many already existed.

TCP/IP and OSI have become emblematic of the split between the worlds of telecommunications and computing; the metaphors of religious wars or of blood feuds and cold wars were common.39 A particularly arch account from this period is Carl Malamud's Exploring the Internet: A Technical Travelogue, which documents Malamud's (physical) visits to Internet sites around the globe, discussions (and beer) with networking researchers on technical details of the networks they have created, and his own typically geeky, occasionally offensive takes on cultural difference.40 A subtheme of the story is the religious war between Geneva (in particular the ITU) and the Internet: Malamud tells the story of asking the ITU to release its 19,000-page "blue book" of standards on the Internet, to facilitate its adoption and spread.

The resistance of the ITU and Malamud's heroic if quixotic attempts are a parable of the moral-technical imaginaries of openness-and indeed, his story draws specifically on the usable past of Giordano Bruno.41 The "bruno" project demonstrates the gulf that exists between two models of legitimacy-those of ISO and the ITU-in which standards represent the legal and legitimate consensus of a regulated industry, approved by member nations, paid for and enforced by governments, and implemented and adhered to by corporations.

Opposite ISO is the ad hoc, experimental style of Arpanet and Internet researchers, in which standards are freely available and implementations represent the mode of achieving consensus, rather than the outcome of the consensus. In reality, such a rhetorical opposition is far from absolute: many ISO standards are used on the Internet, and ISO remains a powerful, legitimate standards organization. But the clash of established (telecommunications) and emergent (computer-networking) industries is an important context for understanding the struggle between OSI and TCP/IP.

The need for standard networking protocols is unquestioned: interoperability is the bread and butter of a network. Nonetheless, the goals of the OSI and the TCP/IP protocols differed in important ways, with profound implications for the shape of that interoperability. OSI's goals were completeness, control, and comprehensiveness. OSI grew out of the telecommunications industry, which had a long history of confronting the vicissitudes of linking up networks and facilitating communication around the world, a problem that required a strong process of consensus and negotiation among large, powerful, government-run entities, as well as among smaller manufacturers and providers. OSI's feet were firmly planted in the international standardization organizations like ISO and the ITU (an organization as old as telecommunications itself, dating to the 1860s).

Even if they were oft-mocked as slow, bureaucratic, or cumbersome, the processes of ISO and ITU-based in consensus, international agreement, and thorough technical specification-are processes of unquestioned legitimacy. The representatives of nations and corporations who attend ISO and ITU standards discussions, and who design, write, and vote on these standards, are usually not bureaucrats, but engineers and managers directly concerned with the needs of their constituency. The consensus-oriented process means that ISO and ITU standards attempt to satisfy all members' goals, and as such they tend to be very large, complex, and highly specific documents. They are generally sold to corporations and others who need to use them, rather than made freely available, a fact that until recently reflected their legitimacy, rather than lack thereof.

TCP/IP, on the other hand, emerged from very different conditions.42 These protocols were part of a Department of Defense-funded experimental research project: Arpanet. The initial Arpanet protocols (the Network Control Protocol, or NCP) were insufficient, and TCP/IP was an experiment in interconnecting two different "packet-switched networks": the ground-line-based Arpanet network and a radio-wave network called Packet Radio.43 The problem facing the designers was not how to accommodate everyone, but merely how to solve a specific problem: interconnecting two technically diverse networks, each with autonomous administrative boundaries, but forcing neither of them to give up the system or the autonomy.

Until the mid-1980s, the TCP/IP protocols were resolutely research-oriented, and not the object of mainstream commercial interest. Their development reflected a core set of goals shared by researchers and ultimately promoted by the central funding agency, the Department of Defense. The TCP/IP protocols are often referred to as enabling packet-switched networks, but this is only partially correct; the real innovation of this set of protocols was a design for an "inter-network," a system that would interconnect several diverse and autonomous networks (packet-switched or circuit-switched), without requiring them to be transformed, redesigned, or standardized-in short, by requiring only standardization of the intercommunication between networks, not standardization of the network itself. In the first paper describing the protocol Robert Kahn and Vint Cerf motivated the need for TCP/IP thus: "Even though many different and complex problems must be solved in the design of an individual packet-switching network, these problems are manifestly compounded when dissimilar networks are interconnected. Issues arise which may have no direct counterpart in an individual network and which strongly influence the way in which Internetwork communication can take place."44

The explicit goal of TCP/IP was thus to share computer resources, not necessarily to connect two individuals or firms together, or to create a competitive market in networks or networking software. Sharing between different kinds of networks implied allowing the different networks to develop autonomously (as their creators and maintainers saw best), but without sacrificing the ability to continue sharing. Years later, David Clark, chief Internet engineer for several years in the 1980s, gave a much more explicit explanation of the goals that led to the TCP/IP protocols. In particular, he suggested that the main overarching goal was not just to share resources but "to develop an effective technique for multiplexed utilization of existing interconnected networks," and he more explicitly stated the issue of control that faced the designers: "Networks represent administrative boundaries of control, and it was an ambition of this project to come to grips with the problem of integrating a number of separately administrated entities into a common utility."45 By placing the goal of expandability first, the TCP/IP protocols were designed with a specific kind of simplicity in mind: the test of the protocols' success was simply the ability to connect.

By setting different goals, TCP/IP and OSI thus differed in terms of technical details; but they also differed in terms of their context and legitimacy, one being a product of international-standards bodies, the other of military-funded research experiments. The technical and organizational differences imply different processes for standardization, and it is the peculiar nature of the so-called Requests for Comments (RFC) process that gave TCP/IP one of its most distinctive features. The RFC system is widely recognized as a unique and serendipitous outcome of the research process of Arpanet.46 In a thirty-year retrospective (published, naturally, as an RFC: RFC 2555), Vint Cerf says, "Hiding in the history of the RFCs is the history of human institutions for achieving cooperative work." He goes on to describe their evolution over the years: "When the RFCs were first produced, they had an almost 19th century character to them-letters exchanged in public debating the merits of various design choices for protocols in the ARPANET. As email and bulletin boards emerged from the fertile fabric of the network, the far-flung participants in this historic dialog began to make increasing use of the online medium to carry out the discussion-reducing the need for documenting the debate in the RFCs and, in some respects, leaving historians somewhat impoverished in the process. RFCs slowly became conclusions rather than debates."47

Increasingly, they also became part of a system of discussion and implementation in which participants created working software as part of an experiment in developing the standard, after which there was more discussion, then perhaps more implementation, and finally, a standard. The RFC process was a way to condense the process of standardization and validation into implementation; which is to say, the proof of open systems was in the successful connection of diverse networks, and the creation of a standard became a kind of ex post facto rubber-stamping of this demonstration. Any further improvement of the standard hinged on an improvement on the standard implementation because the standards that resulted were freely and widely available: "A user could request an RFC by email from his host computer and have it automatically delivered to his mailbox. . . . RFCs were also shared freely with official standards bodies, manufacturers and vendors, other working groups, and universities. None of the RFCs were ever restricted or classified. This was no mean feat when you consider that they were being funded by DoD during the height of the Cold War."48 The OSI protocols were not nearly so freely available. The ironic reversal-the transparency of a military-research program versus the opacity of a Geneva-based international-standards organization-goes a long way toward explaining the reasons why geeks might find the story of TCP/IP's success to be so appealing. It is not that geeks are secretly militaristic, but that they delight in such surprising reversals, especially when those reversals exemplify the kind of ad hoc, clever solution to problems of coordination that the RFC process does.
The RFC process is not the only alternative to a consensus-oriented model of standardization pioneered in the international organizations of Geneva, but it is a specific response to a reorientation of power and knowledge that was perhaps more "intuitively obvious" to the creators of Arpanet and the Internet, with its unusual design goals and context, than it would have been to the purveyors of telecommunications systems with over a hundred years of experience in connecting people in very specific and established ways.

Success as Failure.

By 1985, OSI was an official standard, one with widespread acceptance by engineers, by the government and military (the "GOSIP" standard), and by a number of manufacturers, the most significant of which was General Motors, with its Manufacturing Automation Protocol (MAP). In textbooks and handbooks of the late 1980s and early 1990s, OSI was routinely referred to as the inevitable standard-which is to say, it had widespread legitimacy as the standard that everyone should be implementing-but few implementations existed. Many of the textbooks on networking from the late 1980s, especially those slanted toward a theoretical introduction, give elaborate detail of the OSI reference model-a generation of students in networking was no doubt trained to understand the world in terms of OSI-but the ambivalence continued. Indeed, the most enduring legacy of the creation of the OSI protocols is not the protocols themselves (some of which, like ASN.1, are still widely used today), but the pedagogical model: the "7 layer stack" that is as ubiquitous in networking classes and textbooks as UNIX is in operating-systems classes.49

But in the late 1980s, the ambivalence turned to confusion. With OSI widely recognized as the standard, TCP/IP began to show up in more and more actually existing systems. For example, in Computer Network Architectures and Protocols, Carl Sunshine says, "Now in the late 1980s, much of the battling seems over. CCITT and ISO have aligned their efforts, and the research community seems largely to have resigned itself to OSI." But immediately afterward he adds: "It is ironic that while a consensus has developed that OSI is indeed inevitable, the TCP/IP protocol suite has achieved widespread deployment, and now serves as a de facto interoperability standard. . . . It appears that the vendors were unable to bring OSI products to market quickly enough to satisfy the demand for interoperable systems, and TCP/IP were there to fill the need."50 The more implementations that appeared, the less secure the legitimate standard seemed to be. By many accounts the OSI specifications were difficult to implement, and the yearly networking-industry "Interop" conferences became a regular locale for the religious war between TCP/IP and OSI. The success of TCP/IP over OSI reflects the reorientation of knowledge and power to which Free Software is also a response. The reasons for the success are no doubt complex, but the significance of the success of TCP/IP illustrates three issues: availability, modifiability, and serendipity.

Availability The TCP/IP standards themselves were free to anyone and available over TCP/IP networks, exemplifying one of the aspects of a recursive public: that the only test of participation in a TCP/IP-based internetwork is the fact that one possesses or has created a device that implements TCP/IP. Access to the network is contingent on the interoperability of the networks. The standards were not "published" in a conventional sense, but made available through the network itself, without any explicit intellectual property restrictions, and without any fees or restrictions on who could access them. By contrast, ISO standards are generally not circulated freely, but sold for relatively high prices, as a source of revenue, and under the general theory that only legitimate corporations or government agencies would need access to them.

Related to the availability of the standards is the fact that the standards process that governed TCP/IP was itself open to anyone, whether corporate, military or academic. The structure of governance of the Internet Engineering Task Force (the IETF) and the Internet Society (ISOC) allowed for anyone with the means available to attend the "working group" meetings that would decide on the standards that would be approved. Certainly this does not mean that the engineers and defense contractors responsible actively sought out corporate stakeholders or imagined the system to be "public" in any dramatic fashion; however, compared to the system in place at most standards bodies (in which members are usually required to be the representatives of corporations or governments), the IETF allowed individuals to participate qua individuals.51

Modifiability Implementations of TCP/IP were widely available, bootstrapped from machine to machine along with the UNIX operating system and other tools (e.g., the implementation of TCP/IP in BSD 4.2, the BSD version of UNIX), generally including the source code. An existing implementation is a much more expressive and usable object than a specification for an implementation, and though ISO generally prepares reference implementations for such standards, in the case of OSI there were many fewer implementations to work with or build on. Because multiple implementations of TCP/IP already existed, it was easy to validate: did your (modified) implementation work with the other existing implementations? By contrast, OSI would provide independent validation, but the in situ validation through connection to other OSI networks was much harder to achieve, there being too few of them, or access being restricted. It is far easier to build on an existing implementation and to improve on it piecemeal, or even to rewrite it completely, using its faults as a template (so to speak), than it is to create an implementation based solely on a standard. The existence of the TCP/IP protocols in BSD 4.2 not only meant that people who installed that operating system could connect to the Internet easily, at a time when it was by no means standard to be able to do so, but it also meant that manufacturers or tinkerers could examine the implementation in BSD 4.2 as the basis for a modified, or entirely new, implementation.
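
To make the point about expressiveness concrete, the following is a minimal sketch of the Berkeley sockets interface through which programmers actually used a TCP/IP implementation: create a socket, connect to a remote address, write bytes, read the reply. It is written in the modern POSIX spelling of the calls rather than as historical BSD 4.2 code, and the address, port, and message are placeholders for illustration only.

    #include <stdio.h>
    #include <string.h>
    #include <unistd.h>
    #include <arpa/inet.h>
    #include <netinet/in.h>
    #include <sys/socket.h>

    int main(void)
    {
        /* Create a TCP endpoint. */
        int fd = socket(AF_INET, SOCK_STREAM, 0);
        if (fd == -1) {
            perror("socket");
            return 1;
        }

        /* Fill in a placeholder destination: 192.0.2.1 is a documentation
         * address, and port 7 (the classic "echo" service) is arbitrary. */
        struct sockaddr_in addr;
        memset(&addr, 0, sizeof addr);
        addr.sin_family = AF_INET;
        addr.sin_port = htons(7);
        inet_pton(AF_INET, "192.0.2.1", &addr.sin_addr);

        /* Connect across whatever networks lie in between; the application
         * neither knows nor cares how those networks are built. */
        if (connect(fd, (struct sockaddr *)&addr, sizeof addr) == -1) {
            perror("connect");
            close(fd);
            return 1;
        }

        const char *msg = "hello, internetwork\r\n";
        write(fd, msg, strlen(msg));            /* bytes out over TCP */

        char buf[512];
        ssize_t n = read(fd, buf, sizeof buf);  /* and back the same way */
        if (n > 0)
            fwrite(buf, 1, (size_t)n, stdout);

        close(fd);
        return 0;
    }

Having the source code behind such calls in hand, as BSD users did, meant that a tinkerer could trace what each call actually did and then modify or reimplement it, which is precisely the kind of piecemeal improvement described above.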

Serendipity Perhaps most significant, the appearance of widespread and popular applications that were dependent on TCP/IP gave those protocols an inertia that OSI, with relatively few such applications, did not have. The most important of these by far was the World Wide Web (the http protocol, the HTML mark-up language, and implementations of both servers, such as libwww, and clients, such as Mosaic and Netscape). The basic components of the Web were made to work on top of the TCP/IP networks, like other services that had already been designed (ftp, telnet, gopher, archie, etc.); thus, Tim Berners-Lee, who co-invented the World Wide Web, could also rely on the availability and openness of previous work for his own protocols. In addition, Berners-Lee and CERN (the European Organization for Nuclear Research) dedicated their work to the public domain more or less immediately, essentially allowing anyone to do anything they wished with the system they had cobbled together.52 From the perspective of the tension between TCP/IP and OSI, the World Wide Web was thus what engineers call a "killer app," because its existence actually drove individuals and corporations to make decisions (in favor of TCP/IP) that they might not have made otherwise.

Conclusion.

Openness and open systems are key to understanding the practices of Free Software: the open-systems battles of the 1980s set the context for Free Software, leaving in their wake a partially articulated infrastructure of operating systems, networks, and markets that resulted from figuring out open systems. The failure to create a standard UNIX operating system opened the door for Microsoft Windows and Windows NT, but it also set the stage for the Linux operating-system kernel to emerge and spread. The success of the TCP/IP protocols forced multiple competing networking schemes into a single standard-and a singular entity, the Internet-which carried with it a set of built-in goals that mirror the moral-technical order of Free Software.

This "infrastructure" is at once technical (protocols and standards and implementations) and moral (expressing ideas about the proper order and organization of commercial efforts to provide high-tech software, networks, and computing power). As with the invention of UNIX, the opposition commercial-noncommercial (or its doppelgangers public-private, profit-nonprofit, capitalist-socialist, etc.) doesn't capture the context. Constraints on the ability to collaborate, compete, or withdraw are in the making here through the technical and moral imaginations of the actors involved: from the corporate behemoths like IBM to (onetime) startups like Sun to the independent academics and amateurs and geeks with stakes in the new high-tech world of networks and software.

The creation of a UNIX market failed. The creation of a legitimate international networking standard failed. But they were local failures only. They opened the doors to new forms of commercial practice (exemplified by Netscape and the dotcom boom) and new kinds of politicotechnical fractiousness (ICANN, IPv6, and "net neutrality"). But the blind spot of open systems-intellectual property-at the heart of these failures also provided the impetus for some geeks, entrepreneurs, and lawyers to start figuring out the legal and economic aspects of Free Software, and it initiated a vibrant experimentation with copyright licensing and with forms of innovative coordination and collaboration built on top of the rapidly spreading protocols of the Internet.

6. Writing Copyright Licenses.

To protect your rights, we need to make restrictions that forbid anyone to deny you these rights or to ask you to surrender the rights. -Preamble to the GNU General Public License

The use of novel, unconventional copyright licenses is, without a doubt, the most widely recognized and exquisitely refined component of Free Software. The GNU General Public License (GPL), written initially by Richard Stallman, is often referred to as a beautiful, clever, powerful "hack" of intellectual-property law-when it isn't being denounced as a viral, infectious object threatening the very fabric of economy and society. The very fact that something so boring, so arcane, and so legalistic as a copyright license can become an object of both devotional reverence and bilious scorn means there is much more than fine print at stake.

By the beginning of the twenty-first century, there were hundreds of different Free Software licenses, each with subtle legal and technical differences, and an enormous legal literature to explain their details, motivation, and impact.1 Free Software licenses differ from conventional copyright licenses on software because they usually restrict only the terms of distribution, while so-called End User License Agreements (EULAs) that accompany most proprietary software restrict what users can do with the software. Ethnographically speaking, licenses show up everywhere in the field, and contemporary hackers are some of the most legally sophisticated non-lawyers in the world. Indeed, apprenticeship in the world of hacking is now impossible, as Gabriella Coleman has shown, without a long, deep study of intellectual-property law.2 But how did it come to be this way? As with the example of sharing UNIX source code, Free Software licenses are often explained as a reaction to expanding intellectual-property laws and resistance to rapacious corporations. The text of the GPL itself begins deep in such assumptions: "The licenses for most software are designed to take away your freedom to share and change it."3 But even if corporations are rapacious, sharing and modifying software are by no means natural human activities. The ideas of sharing and of common property and its relation to freedom must always be produced through specific practices of sharing, before being defended. The GPL is a precise example of how geeks fit together the practices of sharing and modifying software with the moral and technical orders-the social imaginaries-of freedom and autonomy. It is at once an exquisitely precise legal document and the expression of an idea of how software should be made available, shareable, and modifiable.

In this chapter I tell the story of the creation of the GPL, the first Free Software license, during a controversy over EMACS, a very widely used and respected piece of software; the controversy concerned the reuse of bits of copyrighted source code in a version of EMACS ported to UNIX. There are two reasons to retell this story carefully. The first is simply to articulate the details of the origin of the Free Software license itself, as a central component of Free Software, details that should be understood in the context of changing copyright law and the UNIX and open-systems struggles of the 1980s. Second, although the story of the GPL is also an oft-told story of the "hacker ethic," the GPL is not an "expression" of this ethic, as if the ethic were genotype to a legal phenotype. Opposite the familiar story of ethics, I explain how the GPL was "figured out" in the controversy over EMACS, how it was formed in response to a complicated state of affairs, both legal and technical, and in a medium new to all the participants: the online mailing lists and discussion lists of Usenet and Arpanet.4 The story of the creation of the GNU General Public License ultimately affirms the hacker ethic, not as a story of the ethical hacker genius, but as a historically specific event with a duration and a context, as something that emerges in response to the reorientation of knowledge and power, and through the active modulation of existing practices among both human and nonhuman actors. While hackers themselves might understand the hacker ethic as an unchanging set of moral norms, their practices belie this belief and demonstrate how ethics and norms can emerge suddenly and sharply, undergo repeated transformations, and bifurcate into ideologically distinct camps (Free Software vs. Open Source), even as the practices remain stable relative to them. The hacker ethic does not descend from the heights of philosophy like the categorical imperative-hackers have no Kant, nor do they want one. Rather, as Manuel DeLanda has suggested, the philosophy of Free Software is the fact of Free Software itself, its practices and its things. If there is a hacker ethic, it is Free Software itself, it is the recursive public itself, which is much more than a list of norms.5 By understanding it in this way, it becomes possible to track the proliferation and differentiation of the hacker ethic into new and surprising realms, instead of assuming its static universal persistence as a mere procedure that hackers execute.

Free Software Licenses, Once More with Feeling.

In lecturing on liberalism in 1935, John Dewey said the following of Jeremy Bentham: "He was, we might say, the first great muck-raker in the field of law . . . but he was more than that, whenever he saw a defect, he proposed a remedy. He was an inventor in law and administration, as much so as any contemporary in mechanical production."6 Dewey's point was that the liberal reforms attributed to Bentham came not so much from his theories as from his direct involvement in administrative and legal reform-his experimentation. Whether or not Bentham's influence is best understood this way, it nonetheless captures an important component of liberal reform in Europe and America that is also a key component in the story of Free Software: that the route to achieving change is through direct experiment with the system of law and administration.

A similar story might be told of Richard Stallman, hacker hero and founder of the Free Software Foundation, creator of (among many other things) the GNU C Compiler and GNU EMACS, two of the most widely used and tested Free Software tools in the world. Stallman is routinely abused for holding what many perceive to be "dogmatic" or "intractable" ideological positions about freedom and the right of individuals to do what they please with software. While it is no doubt quite true that his speeches and writings clearly betray a certain fervor and fanaticism, it would be a mistake to assume that his speeches, ideas, or belligerent demands concerning word choice constitute the real substance of his reform. In fact, it is the software he has created and the licenses he has written and rewritten which are the key to his Bentham-like inventiveness. Unlike Bentham, however, Stallman is not a creator of law and administrative structure, but a hacker.

Stallman's GNU General Public License "hacks" the federal copyright law, as is often pointed out. It does this by taking advantage of the very strong rights granted by federal law to actually loosen the restrictions normally associated with ownership. Because the statutes grant owners strong powers to create restrictions, Stallman's GPL contains the restriction that anybody can use the licensed material, for any purpose, so long as they subsequently offer the same restriction. Hacks (after which hackers are named) are clever solutions to problems or shortcomings in technology. Hacks are work-arounds, clever, shortest-path solutions that take advantage of characteristics of the system that may or may not have been obvious to the people who designed it. Hacks range from purely utilitarian to mischievously pointless, but they always depend on an existing system or tool through which they achieve their point. To call Free Software a hack is to point out that it would be nothing without the existence of intellectual-property law: it relies on the structure of U.S. copyright law (USC17) in order to subvert it. Free Software licenses are, in a sense, immanent to copyright laws-there is nothing illegal or even legally arcane about what they accomplish-but there is nonetheless a kind of lingering sense that this particular use of copyright was not how the law was intended to function.
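
In practice, this hack is attached directly to the code. What follows is a minimal illustrative sketch, not drawn from the original text: the file name, author, and program are hypothetical, and the notice wording follows the form suggested in later versions of the GPL rather than the license as Stallman first wrote it. A GPL-covered source file carries a copyright statement plus a notice pointing at the license, so that the terms travel with every copy and every derivative:

/* frob.c -- a hypothetical, trivially small GPL-covered work.
 *
 * Copyright (C) 2008  J. Hacker
 *
 * This program is free software: you can redistribute it and/or modify
 * it under the terms of the GNU General Public License as published by
 * the Free Software Foundation, either version 3 of the License, or
 * (at your option) any later version.
 *
 * This program is distributed in the hope that it will be useful,
 * but WITHOUT ANY WARRANTY; without even the implied warranty of
 * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
 * GNU General Public License for more details.
 *
 * You should have received a copy of the GNU General Public License
 * along with this program.  If not, see <https://www.gnu.org/licenses/>.
 */

#include <stdio.h>

/* Anyone who redistributes this file, modified or not, must pass the
 * same license terms on to their own recipients. */
int main(void)
{
    printf("hello, copyleft\n");
    return 0;
}

The legal force comes from copyright itself: since copying and modification are restricted by default, the only permission anyone has is the one the notice grants, and that permission is conditioned on passing the same terms along.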

Like all software since the 1980 copyright amendments, Free Software is copyrightable-and what's more, automatically copyrighted as it is written (there is no longer any requirement to register). Copyright law grants the author (or the employer of the author) a number of strong rights over the dispensation of what has been written: rights to copy, distribute, and change the work.7 Free Software's hack is to immediately make use of these rights in order to abrogate the rights the programmer has been given, thus granting all subsequent licensees rights to copy, distribute, modify, and use the copyrighted software. Some licenses, like the GPL, add the further requirement that every licensee offer exactly the same terms to anyone to whom the software is subsequently distributed, so that the rights travel with every copy and every modified version of the work.
