Competitive dynamics of technological standardization: The case of third generation cellular communications
Glimstedt, Henrik
This paper is concerned with the processes through which technological standards are developed, and how these provide platforms for the expansion and development of new markets. The case investigated is that of third generation mobile communications. This is of interest not just in itself but also because the standardization process here is contributing to the creation of an explosive worldwide market, and because it is so intimately connected to competitive struggles between the major players, both individual firms and major regions of the world economy. At the beginning of the 21st century the market for mobile cellular communications appears to be about to take off in a very big way, due at least in part to the fact that a package of global standards has been developed that allows for modular innovation within standardized interfaces, and for regional variations. It is no longer a “winner-take-all” standardization approach, which in itself reflects the competing but collaborating interests of the parties involved.
The research took as its starting point a review of arguments for the continued importance of international standardization in the face of widespread deregulation. It linked this approach to three sets of initial assumptions, namely (a) that a sustained long-term trend towards open standards in information and communication technology is occurring (Solomon 1998; National Research Council 1999); (b) that standards have frequently been used as a preferred method of controlling information markets (Shapiro and Varian 1999); and (c) the idea that successful restructuring and innovation processes in electronics are spatially embedded (Saxenian 1994).
Following these lines of thought, it was also assumed at the outset of this research project that in third generation (3G) mobile telecommunications the European system suppliers like Ericsson and Nokia would be trying to repeat their success in developing the dominant worldwide second generation standard, GSM. Such a development path would result in one dominant 3G standard. And there was, in the late 1990s, ample evidence of a holy war in next generation cellular telecommunications. Herschel Shosteck Associates, a well-known firm analyzing the wireless market, reported in 1998: “We [Europe] can’t let Japan and US get ahead; they might do to us what we did to them with GSM. So the ETSI [European Telecommunications Standards Institute] is charging ahead to introduce their third generation wireless proposals to assure the Europeans have a defensible position” (Herschel Shosteck, quoted in Emmett 1998). Furthermore, a European Commission Communication on the issue of standardization in the area of ICT technologies bluntly states, for example, that “standards form a vital part of European industrial competitiveness policy”. The same report concludes that the European ICT industry will depend on taking a lead in the process of formulating the technical specifications for future dominant technologies (European Union 1996a: 2).
By contrast, the empirical investigation reveals a much more complex reality, in which standards are treated in modular fashion and regional compromises have been forced on the major players. The European camp eventually found itself united behind a technological compromise, in alliance with the Japanese, while the Americans favored more market-determined outcomes, led by the dominant system supplier, Qualcomm of San Diego. The standardization process started with regional efforts to sponsor a singular technology in order to achieve a competitive technological edge. However, well into the process some of the key actors began to change their conceptual framework, embracing concepts like a “family” of standards. A revealing case in point is provided by the changes in Ericsson’s standardization strategy. Having sponsored the technology it developed for Japan and the NTT mobile operator DoCoMo, which was based on a standard known as W-CDMA, Ericsson moved towards sponsoring a “family of standards”. Evidently, Ericsson found it expedient to promote technological interfaces which facilitated roaming by users between cellular systems using different technologies (Nilsson 1999). Thus compatibility has been maintained, up to a point, and the scope for market expansion has taken priority over any single company’s interests.
Based on these insights, the paper develops a chronological account of the competitive dynamics that lie behind the emergence of first generation (1G), second generation (2G) and third generation (3G) cellular communications standards. It focuses mainly on the European experience, comparing and contrasting it with the American and Japanese experiences, and latterly the worldwide efforts to develop a global package of 3G standards.
3G CELLULAR TELECOM SERVICES, STANDARDS AND MARKET POSITION
3G thrives on the convergence of switched telephony and TCP/IP communications. From the early analogue systems established in the 1970s and early 1980s until today’s 2G digital systems, mobile telephony has represented an extension of the fixed telephone systems. As has been noted in connection with the debate on the Internet and various National Information Infrastructure initiatives, seamless integration of telecom systems is not the same thing as seamless integration of computer communications systems. Seamless interconnect in a telecom system simply means that a caller in, say, Sweden is able to establish a connection to a subscriber in a foreign telecom system. The system switches, or connects, the call, but the telecommunications system is really a one-trick pony: it does nothing beyond connecting the two persons. Communication between the two parties to the telephone call thus ultimately relies on the ability of the two persons to communicate in a common language.
In the world of computer communications and wireless Internet, having interconnect is not enough. To establish meaningful communication the system has to take connectivity to a higher order of interoperability. Because computers need to communicate, the computer industry has had to tackle network-to-network communications issues in a more complex way. In essence, seamless interoperability is achieved when, for example, different PCs and Macs are able to read, download and change database files on a mainframe computer. Another case of seamless interoperability is web services, such as e-commerce and other Internet applications (Bar and Borrus 1999). As the basis for formal or informal standardization, the ICT industry relies on a so-called “layered functional model”.
The layered model shown in Figure 1 enables engineers and companies to handle this increasingly complex technology as a modular design (Baldwin and Clark 1997). A modular system is composed of units (modules) that may be designed independently but still function as an integrated whole. System designers achieve modularity by partitioning information into visible design rules and hidden design parameters. Visible design rules fall into three categories:
1. An architecture that specifies what modules will be part of the system and what their function will be.
2. Interfaces that describe in detail how the modules will interact, including how the modules will fit together, connect and communicate.
3. Testing rules, for testing a module’s conformity to design rules and for testing one module’s performance relative to another.
Engineers tend to lump these elements of visible information together, referring to them simply as the architecture, the interfaces, or the standard. The hidden design parameters (also known as the hidden information) are decisions that do not affect the design beyond the local module.
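To make the three categories of visible design rules concrete, consider the following minimal Python sketch. It is purely illustrative: the codec module, its names and its parameters are invented for this example and do not come from any cellular specification.

```python
from abc import ABC, abstractmethod

# Visible design rule (interface): what every codec module must expose.
class SpeechCodec(ABC):
    @abstractmethod
    def encode(self, samples: list) -> bytes: ...

    @abstractmethod
    def decode(self, payload: bytes) -> list: ...

# A module: its internals are hidden design parameters that other
# modules never depend on and that can be changed locally.
class SimpleCodec(SpeechCodec):
    _QUANT_LEVELS = 256  # hidden parameter, free to vary inside the module

    def encode(self, samples):
        # Map samples in [-1.0, 1.0] onto one byte each.
        return bytes(int((s + 1.0) / 2.0 * (self._QUANT_LEVELS - 1)) for s in samples)

    def decode(self, payload):
        return [b / (self._QUANT_LEVELS - 1) * 2.0 - 1.0 for b in payload]

# Testing rule: conformance is checked against the visible interface,
# never against a module's hidden internals.
def conforms(codec: SpeechCodec, tolerance: float = 0.05) -> bool:
    probe = [0.0, 0.5, -0.5]
    out = codec.decode(codec.encode(probe))
    return all(abs(a - b) <= tolerance for a, b in zip(probe, out))

print(conforms(SimpleCodec()))  # True: the module honours the visible rules
```

The point of the sketch is that a competing codec with entirely different internals would pass the same conformance test, which is exactly what lets modules be designed independently yet function as an integrated whole.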
Figure 1 illustrates the basic types of logical interfaces in the converged 3G environment. For example, a communications and networking role will require many middleware functions in order to have the resources to offer the range of service components associated with this role. The interoperability of computer systems hinges on a far broader range of interfaces than that of the telecom system. Achieving interoperability is not only a question of PC-to-network and network-to-network connectivity, but also a matter of (a) how machines operate under different operating systems and applications, (b) client-server operating system compatibility, and (c) file formats.
Historically, a common and centralized control of technical standards was exerted by national PTTs or by otherwise regulated monopolies. In this environment, formal ex-ante standardization was the regulator’s chief instrument for establishing the technical rules of telecommunications. Today, the world of mobile telecommunications is turning to the “market”, or more precisely to the software and hardware manufacturers and to the inventors of new services, to set the ex-post or de facto standards. As is generally understood, open network protocols are expected to resolve the technical problems of connectivity and interoperability without resorting to the earlier central planning approach, with business services leading the way. But even when all firms benefit from making their systems compatible, firms face a coordination problem in selecting which technology to standardize on. A third approach, voluntary formal recognition of existing industry standards (also known as Publicly Available Specifications, PASs), has recently emerged as a way of setting standards.
Standards as Market Domination Strategies
In mobile telecommunications the underlying economic rationale has been associated with the ideas of “network effects” and “natural monopolies”. While bandwagon effects may create lock-in, a similar effect may be caused by biased formal standardization processes in which actors try to set standards that favor certain national industrial goals by promoting domestic technologies as international standards. Following the same logic, participation in voluntary standardization groups is an important part of product development in the information technology sector (Weiss and Sirbu 1990). In telecom technology more generally, participation in standardization processes remains an integral part of product development and marketing strategies (Funk 1998). Along the same line of reasoning, a recent European Union sponsored research project concluded that the early promotion of the Nordic Mobile Telephone (NMT) standard “gave Sweden and Finland a competitive advantage and crucially assisted Nokia’s and Ericsson’s entry into mobile telephony”. Early standardization is thus held to be critical to competitiveness in network markets (Edquist et al. 1998a: 29; Edquist et al. 1998b).
THE FIRST GENERATION
Early mobile systems, introduced between the 1950s and the mid-1970s, faced a severe shortage of radiotelephone frequencies. The new cellular radio technology provided the answer to the problem of limited frequencies. The cellular structure splits the geographic area into slightly overlapping “cells”, each served by a radio base station. At the core of the system are the database which keeps track of the movements of subscribers and the “handover” function, which in practice means that a subscriber can move from one cell into a neighboring cell without losing the connection to the network (National Research Council 1997). By the early 1980s at least two different standards seemed to provide viable solutions for an international standard, but both the US and the Nordic solutions faced a politically complicated reality.
In the USA, Bell Labs had worked since the early post-war years on a cellular analogue system, resulting in the establishment of the Advanced Mobile Phone Service (AMPS) standard in 1970. AMPS solves the problem of frequency limitation through so-called narrow sub-band channels, each carrying one phone circuit in a system where any mobile phone can access any of the frequencies. The system relies on 824-849 MHz for receiving signals and 869-894 MHz for sending signals. The generic technology behind AMPS is hence called FDMA (Frequency Division Multiple Access). The shape of cellular wireless services in the USA has been greatly influenced by features unique to the country’s history of telecommunications, e.g. the position of telecommunications as a regulated private monopoly rather than as a public administration, and the fragmentation of decision-making processes. First, early work on cellular systems, undertaken by AT&T in the 1960s, was blocked from commercialization for more than two decades. Cellular technology was stymied by regulatory delays. The fear among US regulatory powers, mainly the FCC, was that AT&T would become an even more powerful monopoly if it were allowed to turn its critical research into commercial services. Only in 1982 did the FCC allow mobile communication technology onto the telecom market. The American deregulation process also crippled the operators’ access to advanced cellular technologies. Under the terms of the consent decree that broke up AT&T in 1984, the emerging cellular operators had no access to the abundant technical resources of Bellcore, the research unit of the regional Bell companies (National Research Council 1997).
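To give a feel for the FDMA arithmetic, a minimal Python sketch follows. The 30 kHz channel spacing used here is the commonly cited AMPS figure and is an assumption on our part; the text above gives only the band edges.

```python
# Illustrative FDMA arithmetic for the AMPS bands quoted above.
# ASSUMPTION: 30 kHz channel spacing (the commonly cited AMPS value);
# the text itself specifies only the two 25 MHz bands.
CHANNEL_SPACING_HZ = 30_000

def sub_band_channels(low_mhz: float, high_mhz: float) -> int:
    """How many narrow sub-band channels fit into one band."""
    return int((high_mhz - low_mhz) * 1e6 // CHANNEL_SPACING_HZ)

# Each duplex phone circuit pairs one channel from each of the two bands.
print(sub_band_channels(824, 849))  # 833 raw slots in the receive band
print(sub_band_channels(869, 894))  # 833 raw slots in the send band
# (The deployed AMPS plan reserved some slots for control signaling,
# so the commonly quoted figure is 832 usable channels.)
```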
The American standard was, furthermore, never really a contender in the European setting. The Conference on European Post and Telecommunications (CEPT) operated on the basis of clear industrial policy preferences: regardless of technological advantages, CEPT had the principle of excluding non-European technologies from the standardization process (Ruottu 1998). While the development of the AMPS standard suffered from poor institutional support, a more successful attempt at an international mobile standard was made in Sweden and the Nordic countries. Sweden’s early efforts in mobile telephony standards during the 1970s should be seen against the backdrop of the particular Swedish regulatory framework, which allowed private operation of mobile telephony networks.1 The emergence of a handful of private networks stimulated both early moves by suppliers of mobile telephony systems and efforts to specify an efficient mobile standard. In 1971, a total of 13 operators offered mobile services, based on a total of 45 privately owned base stations. Besides these private networks, the Swedish PTT (Sweden Telecom) experimented with analogue mobile networks from the 1950s onwards. These activities resulted in the development of a plethora of technological solutions and (preliminary) air interface standards.
On the private side, efforts to integrate the financially struggling private networks resulted in a consolidation of the private operators. This process stimulated a number of entrepreneurs to explore the commercial possibilities of mobile telephony, involving the development of radio base stations, antenna technology, switches and handsets. On the public side, individuals within the national telecom operator began to investigate the idea of a national mobile network. In essence, the chief engineer at Swedish Telecom’s radio division proposed in 1967 that Swedish Telecom should supply a nation-wide mobile network, drawing on the early experimental standards. However, since mobile terminals were expensive and clumsy pieces of equipment, normally fitted in the trunk of a car, the domestic market was considered too limited for profitable operation of a mobile network. Hence the idea of a Nordic standard. This idea materialized within Swedish Telecom’s radio lab, and the subsequent initiative resulted in the formation of a joint Nordic group, the Nordic Mobile Telephony (NMT) group, consisting of members of the public telecom operators in Sweden, Norway, Denmark and Finland. The task was to introduce a fully automated mobile network covering the Nordic area.
Although the NMT group managed to sell the NMT450 standard to a relatively large number of European countries, Germany, France and the UK chose to remain outside the NMT community. France and Germany wanted to develop their own standards, which would offer protection for their respective national telecom manufacturing sectors. It was a question, then, of industrial policy and political control of the national telecom market. Although the NMT system was selected by a number of European operators, the Nordic effort to create an international standard for cellular telephony was met with scepticism throughout Europe. France, Germany and the UK were involved in various attempts at establishing a European standard, a plan that collapsed under industrial tensions between Alcatel and Siemens. France Telecom promoted a non-cellular technology, while Germany backed its so-called C-Netz, launched in 1985 (Ruottu 1998).
By the early 1980s, the European mobile telephony landscape consisted of several incompatible standards. Fragmentation thus prevailed: no standard seemed able to “tip” the market towards itself. The Nordic NMT450/900 standards were the most widely deployed, in use in Sweden, Denmark, Norway, Finland, Iceland, France, Spain, Belgium and The Netherlands. Beyond that, the picture remained divided because of the inability of CEPT to bridge national interests. Thus each national monopoly telecom operator promoted its own national standard, concerned to protect its own national telecom manufacturer.
THE SECOND GENERATION: GSM
The European countries tried to establish a common standard for mobile telephony in the early 1980s. At the supra-national level, the European CEPT assumed in 1982 the responsibility of creating a single European standard for mobile telecommunications.2 Having a membership of around 25 of Europe’s postal, telephone and telegraph operators (PTTs), the CEPT commissioned a sub-body called Groupe Speciale Mobile (GSM) to set a pan-European standard. This body consisted mainly of representatives of PTTs with responsibility for the development of national mobile telephony systems and research engineers from the PTT research labs.
The objective behind the CEPT’s 1982 effort was to achieve “harmonization”, i.e. to coordinate an effort to develop clear specifications for the interface between basic elements that make up a mobile telecom system, such as base station, switches and terminals. A political vision drove the harmonization effort within CEPT. In particular, CEPT’s leadership wanted to create not just a mobile telecom system covering all major European countries. It also wanted to increase the size of equipment markets for the European manufacturers (Ruottu 1998).
Although efforts were made throughout the 1980s, the European actors remained unable to agree on a single European standard. While the GSM group’s efforts in the early 1980s lacked impact, new cards were dealt by the mid-1980s. During 1985 and 1986 the GSM group’s efforts were fueled by the anticipation of digital technologies in mobile telecommunications. The work on specifications of a digital system peaked in 1986/87, when the blueprints for the future digital GSM standard were laid out (Ruottu 1998). As we will see in the following account of the European standardization process, it was only after a complete reshuffling of the political and institutional leadership in European telecommunications that an international standard could be realized.
The Struggle over the European GSM Standard
The initial decision to create a pan-European cellular telecommunication system naturally prompted various efforts among both manufacturers and operators to specify and invent the technological platform for the future GSM services. An analysis by Bekkers et al. (2000) of the 140 or so essential patents in GSM technology shows that the first phase of the technological development was dominated by Motorola, AT&T, Philips and Bull. For Bull, these technological achievements stemmed from its know-how in the encryption field, whereas the other early mover patents were based on advances in fields such as switching, radio transmission and speech coding. Philips’ patenting in these fields, however, was not based on a clearly defined strategy as to what extent the company should enter the GSM industry.3 From the early patenting activity, however, it is possible to conclude that the most formidable actor in the GSM field was not European but based in the United States. Across the essential GSM technologies, Motorola held a dominant position, with the notable exception of digital switching.
In the early period of the history of GSM, however, there was huge technological uncertainty, although much was expected from digital technology. As for technological paths, the GSM group contemplated two versions of the same basic technology. At an early stage, the need for more bandwidth in the air made the so-called TDMA technology attractive. The other option available at this time, FDMA technology, which was mainly deployed as the basis for digital mobile telephony on the American continent, was seen by the Europeans as much inferior to TDMA.
In spite of their shared opinion of the advantages of TDMA over FDMA technology, the Europeans remained apart on the choice between two versions of TDMA, i.e. wideband-TDMA and narrowband-TDMA. The backdrop to this division within Europe is hardly surprising. Germany and France strongly subsidized development of digital technology, hoping to ensure a dominant position for their national suppliers. The large European countries, Germany, France, the UK and Italy, joined forces in the mid-1980s in an effort to draft the specifications for a contender for the European digital GSM system. This effort was based on wideband-TDMA. All in all, substantial resources in equipment and man-hours were sunk into the pan-European project. The operators and telecom vendors who put their money on wideband-TDMA in the mid-1980s were determined to reap the profits from these investments. France Telecom, backed by Alcatel, and the German public operator DPT, backed by Siemens, championed the wideband alternative (Ruottu 1998: 257-258). Both France and Germany were determined to promote their respective national vendors as clear leaders in European telecom technology. For the other key actors, such as Nokia and Ericsson, the French-German proposal was simply considered too proprietary, i.e. based on French and German intellectual property rights (Cattano 1994).
On the other side of the fence, the Nordic countries were able to create some unity around narrowband technology. On this side, Sweden played a central role, providing key input both in the discovery of the principles for the specification of narrowband technology and in the political leadership of the narrowband camp (McKelvey et al. 1998; Ruottu 1998; Molleryd 1999). But the specification of the narrowband platform for the European standard, achieved in the main by engineers at Ericsson and Telia, was dependent on a set of already existing technologies. Rather than being a radical innovation, the narrowband specification utilized basic cellular technologies protected by a number of established intellectual property rights stemming mainly from three sources: Motorola, AT&T, and Bull and Philips (Bekkers et al. 2000).
Motorola’s management expected the European wireless market to be well protected by European political interests. Hence, Motorola decided to devise an aggressive IPR policy in radio base station technology, which would generate long-term revenue streams from key patents if direct sales should fail.4 Motorola also expected that key patents in radio technology could be traded against Ericsson’s and Siemens’s patents in digital switching in a cross-licensing scheme (Iversen 2000).
The claim of Bekkers et al. is that, using the bargaining power that came with its dominant technological position, Motorola “imposed a market structure by conditioning exclusive cross-licensing agreements with a selected number of other parties on the market” (Bekkers et al. 2000). Because Motorola selectively licensed a number of key technologies to Nokia and Ericsson, the narrowband alternative gained even more steam. Ericsson succeeded in securing a cross-license agreement with Motorola by virtue of having developed know-how in digital switching and having developed the selected technical proposal for narrowband GSM. Nokia was also successful in its efforts to win a cross-licensing agreement with Motorola, further adding technological strength to the narrowband camp.5
The two main European camps, the Franco-German group and the narrowband camp championed by actors in the Nordic region (Ericsson, Nokia, Swedish Telecom), both faced the end-game beginning with a CEPT meeting in Madeira in February 1987, which aimed at agreeing on which of the two proposed technologies should be selected as the technological standard for the future GSM system. As the reports on the Madeira meeting show, the initial outcome of that meeting reflected the strong division within Europe. While the choice of TDMA as the basis for digital technology remained uncontested, the choice between wide- and narrowband technology was hotly debated. Two strong contenders were presented to CEPT, one wideband solution from the Franco-German group and one narrowband solution from the Nordic group. Of the 15 member countries making up CEPT, 13 supported the Nordic narrowband solution, whereas the wideband alternative received backing only from its initial promoters, France and Germany (Ruottu 1998; Bach 2000).
European Liberalization and Institutional Changes, 1987-1992
At this point, however, the political weight of the EC in the telecommunications sector had increased considerably (Sandholtz 1992, 1993). A key event was certainly the deregulation and liberalization of telecommunications markets in the USA. After the divestiture of the telephone giant AT&T and the liberalization of the remaining core of US telecommunications, AT&T entered European markets through joint ventures with Philips and Olivetti. At the same time, IBM diversified into telecommunications by purchasing stock in MCI. In the eyes of EC industrial policy-makers, this was an alarming development. They feared that US multinationals, in addition to their hegemony in information technology, would also conquer Europe’s communication market (Dang-Nguyen et al. 1993). Thus a Special Task Force was created in 1983 within DG III (Internal Market) in order to implement the actions that Commissioner Davignon had in mind. To increase the competitiveness of the European telecommunications industry, the Commission proposed opening the internal telecommunications market by means of national liberalization and deregulation measures (Dang-Nguyen et al. 1993; Sandholtz 1993).
In the wake of European deregulation, the future of the GSM system was also becoming a critical issue for the Commission of the European Communities, to the extent that the Commission was beginning to conceive of European telecom infrastructure as linked to the European Single Market project. Naturally, the Commission viewed with discomfort Europe gearing up for a repeat performance of the stalemate over the standardization of analogue mobile telecom systems. Lacking a voice within CEPT, the Commission opted for a radical solution, made public in the Green Paper published at the time of the Madeira meeting (European Union 1987). The Green Paper outlined the Commission’s challenge to PTT dominance of European telecom markets by suggesting community-wide competition in the areas of network equipment, terminals and communication services. In addition, the Commission proposed the creation of a European Telecommunications Standards Institute (ETSI), a crucial institutional innovation with far-reaching consequences for the implementation of GSM.
In this process, CEPT lost its initiative and leadership of the European telecom scene to the Commission, which supported ETSI rather than CEPT as the key locus of European telecom standardization. ETSI, with DG XIII’s firm backing, hence started to play a more active role in the European standardization process. The Commission’s leadership was highly instrumental in the success of European telecommunications policy. The Commission acquired competences, arguing that while it had started out coordinating policy, it was now primus inter pares. European institutions thus do not remain mere aggregates of their members’ interests but develop institutional self-interests and become corporate actors in their own right (Sandholtz 1993; Schmidt 1997).
In 1988, under great pressure from the Commission, the GSM project was removed from CEPT and transferred to the newly created ETSI. The new supra-national body laid a new framework for the European telecom scene, which undermined the ties between national industrial policy issues and standardization. In many respects, ETSI’s institutional design is a direct response to the deficiencies of CEPT. Having ensured itself sufficient voice in ETSI, the Commission focused on three moves to minimize the influx of national industrial policy interests into the decision process (Ruottu 1998; Bach 2000):
* ETSI was opened to the emerging private operators and vendors.
* ETSI distanced itself from the consensus-based decision-making process typifying CEPT. Qualified majority was required for only a limited number of questions, while simple majority principles applied to most voting procedures.
* The Commission was also able to prevent ETSI from adopting standards that it believed would block European trade.
Given the changes in membership and voting procedures, Germany and France found themselves not only isolated but also on the brink of defeat in the European standardization wars. But what ultimately mattered, as also pointed out by David Bach, was the fact that the Commission had already adopted a Council Decision of 22 December 1986 on standardization in the field of information technology and telecommunications, which required EU members and their telecommunications administrations to use official European standards in public procurement. To this effect, the Commission publishes Commission Decisions on Technical Regulation, alerting members to a new ETSI standard and requiring its use in public procurement. This institutional arrangement ensures that ETSI standards will be the basis of public networks in all member states, as ETSI has become the EU’s principal standard-setting body in the area of telecommunications (Bach 2000).
Put differently, Germany and France were unable to block the GSM process once the Commission changed voting procedures to qualified majority principles. If ETSI adopted GSM without German or French support, the governments of those two countries would nevertheless be forced under EU law to use GSM as the basis for their public mobile telecom networks. Also, since the Commission was able to block new standard proposals that it deemed would have a negative effect on European trade, any new proposal of French or German origin in the area of mobile telecommunications could be stopped by the Commission.
Thus, after successfully isolating the Franco-German camp and moving key issues in the European standardization process from CEPT to ETSI, the EU Commission paved the way for the Nordic narrowband solution. The underlying rationale of the GSM Memorandum of Understanding (MoU) signed by the prospective operators, as argued by Bekkers et al., was that manufacturers were forced to serve the whole GSM community, both suppliers and operators, on fair, reasonable and non-discriminatory conditions. Companies that refused to accept this condition were not entitled to supply equipment to a number of operators within the European Union (Ruottu 1998; Bach 2000; Bekkers et al. 2000).
The Development of GSM Relative to Other Digital Standards: Overview
Following the successful standardization of the 2G mobile telecom system, the boom in mobile telephony has been astounding (Table 1), not least in Europe with its integrated GSM system. In less than 10 years, subscribers soared from c. 11 million in 1990 to some 300 million in 1998. As a percentage of total telephone subscribers, mobile subscribers consequently increased rapidly from 2 to 38 percent over the same period. While GSM has fueled the growth of European subscribers, the other continents remain less unified.
Parallel to the European march from analogue systems to the digital GSM system, the US authorities approved a digital standard. Because US wireless was thriving with a single analogue standard, AMPS, the CTIA (Cellular Telecommunications Industry Association) approved a similar digital standard, D-AMPS or IS-54, but required it to be compatible with AMPS. The relative homogeneity of the European market, by contrast, was achieved under heavy pressure from the European Union, which, rather than just promoting the idea of a common European infrastructure, effectively undermined the role of national industrial policy interests.
THE THIRD GENERATION: IMT-2000 AND UMTS
In order to play a catalytic role in the development of 3G services, the ITU initiated the evolution of the third generation by defining the requirements for the Future Public Land Mobile Telecommunications System (FPLMTS). Under the aegis of the ITU, work began on the definition of the FPLMTS standard, and in 1992 the World Radio Conference identified spectrum for FPLMTS in the 2 GHz band. As the momentum of the global cellular market grew and new 2G standards were deployed, the ITU created the IMT-2000 concept for the third generation, anticipating the increasingly sophisticated requirements of users for broadband capabilities. The ITU realized at an early stage in the development of the global cellular market that there was a need to create an evolutionary path from the 2G digital mobile systems (GSM) to the 3G mobile multimedia systems.
Initially, ITU’s initiative received the strongest support from Japan. One possible explanation for Japan’s support of the IMT-2000 process is that the Japanese were running out of spectrum for their mobile and personal communications systems. Another explanation is that the Japanese had learned the GSM lesson very well.
From a technical point of view, the new telecom services are based on two standards: TCP/IP computer communication, including voice over TCP/IP, and mobile telephony. The ITU parameters for the third generation specify the capability to deliver data at 144 kbit/s to fast moving subscribers, at 384 kbit/s to slow moving subscribers and at 2 Mbit/s in a stationary environment, a remarkable advance on today’s mobile data services. As envisioned by the ITU, IMT-2000 was to encompass three dimensions: first, it was to be a new wireless technology rather than an enhancement of present ones. Second, it was to be deployed on virgin 1.9-2.1 GHz spectrum. Third, it was to provide national regulatory authorities with the option of licensing new wireless carriers and thereby expanding wireless competition. This technology base will be the carrier of the much discussed new value-added services: wireless access to office local area networks, wireless access to the Internet and, ultimately, wireless multimedia.
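As a compact restatement of the ITU rate targets just quoted, a minimal Python sketch follows; the mobility cut-offs in km/h are our own illustrative assumptions, since the text specifies only the three rate classes.

```python
# The three IMT-2000 bearer-rate targets quoted above.
IMT2000_TARGET_BPS = {
    "fast_moving": 144_000,   # fast moving subscribers
    "slow_moving": 384_000,   # slow moving subscribers
    "stationary": 2_000_000,  # stationary environment
}

def target_rate(speed_kmh: float) -> int:
    """Target data rate for a subscriber moving at speed_kmh.
    ASSUMPTION: the 0 and 10 km/h thresholds are illustrative only;
    the ITU parameters in the text name the classes, not the cut-offs."""
    if speed_kmh == 0:
        return IMT2000_TARGET_BPS["stationary"]
    if speed_kmh <= 10:
        return IMT2000_TARGET_BPS["slow_moving"]
    return IMT2000_TARGET_BPS["fast_moving"]

print(target_rate(100))  # 144000 bit/s for a subscriber in a moving car
```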
The structure of the selection of the air interface standard for IMT-2000. The aim of IMT-2000 is to achieve, through the medium of ITU standardization, the goal of enabling customers to roam globally and have anytime, anywhere connectivity. This connectivity will extend to roaming onto multiple networks: fixed and mobile, cordless, cellular and satellite. The introduction of a single global third generation standard will enable massive economies of scale in the production of equipment, bringing the capability of global communications within the reach of everyone on the planet. Increased competition will drive down tariffs, and the third generation technology will enable the deployment of new functionality, services and applications. A global standard for 3G wireless services will also have considerable benefits for developing countries, advancing the ITU’s goal of erasing the gap in access to communications and information between developed and developing nations.
As envisioned in the IMT-2000 project, the third generation would have a common radio interface and network. The ITU anticipated an international competition leading to a radio interface that could be developed and deployed by the year 2000. According to the ITU’s principles, the standardization of 3G mobile telephony was supposed to follow this pattern: first, the ITU issues a request for standard proposals, which initiates activities within regional standardization bodies in Asia, the USA and Europe. The standardization bodies then submit their different proposals for standards. Upon this response, the ITU starts a process to build consensus on which standard to select and implement.
There will then be a lengthy period of specification activity by specialist committees working under Task Group 8/1, during which the outline proposals will be developed into draft ITU recommendations. These recommendations will be given final consideration by the ITU late in 2001 for approval. As the development of the IMT-2000 standards will be closely tracked by manufacturers and potential operators, most of whom will in fact play an integral role in the development process as members of the ITU, the launch of commercial 3G services should take place within a reasonable period following approval of the appropriate standards. The initial submissions are now being evaluated by 16 independent evaluation groups from around the world. The final selection of IMT-2000 radio transmission technologies will be made by the end of 2000 on the basis of an iterative, consensus-building approach to achieve the best result for 3G implementation.
The main contenders. The IMT-2000 selection process has already begun. By June 1998, the ITU had received 15 submissions for consideration as candidate radio transmission technologies for IMT-2000. Of these, two main submissions stood out as key contenders (Table 2).
Other technologies represented in the submissions were: Wideband Time Division Multiple Access (W-TDMA), and three satellite solutions. Sponsoring organizations for these came from Europe, the USA and Korea.
In brief, the IMT process resulted in three different technical standards, mutually incompatible in terms of radio access standard and type of core network signaling. For users, the radio access incompatibility simply means that a user with a W-CDMA mobile phone will not be able to connect to a cell in a network operating on the basis of CDMA2000 technology, and vice versa. The other type of incompatibility concerns network-to-network compatibility: different signaling principles mean that the networks’ interfaces are not harmonized, which limits roaming and interoperability between the networks.
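The two dimensions of incompatibility can be expressed schematically as below. The sketch is purely illustrative: only the W-CDMA versus CDMA2000 air-interface mismatch comes from the text, while GSM-MAP and ANSI-41 are the core-network signaling families conventionally associated with the two camps, used here as assumed example values.

```python
from dataclasses import dataclass

@dataclass
class Network:
    air_interface: str    # e.g. "W-CDMA" or "CDMA2000"
    core_signaling: str   # e.g. "GSM-MAP" or "ANSI-41" (assumed examples)

def can_attach(handset_air_interface: str, network: Network) -> bool:
    """A handset can only use a cell whose air interface it speaks."""
    return handset_air_interface == network.air_interface

def can_roam(home: Network, visited: Network) -> bool:
    """Network-to-network roaming requires harmonized core signaling."""
    return home.core_signaling == visited.core_signaling

home = Network("W-CDMA", "GSM-MAP")
visited = Network("CDMA2000", "ANSI-41")
print(can_attach("W-CDMA", visited))  # False: air interfaces differ
print(can_roam(home, visited))        # False: signaling not harmonized
```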
These observations raise a number of questions: how did the key promoters behind the two main alternatives (W-CDMA vs. CDMA2000) act in order to promote their proposals; to what extent were they willing to pursue an “end-game” aiming at tipping the market to their advantage through standardization processes; could this conflict of interest be resolved, and if so, what institutional and conceptual innovations were created to ensure roaming and interoperability of the 3G system?
In providing some answers to these questions, we have to go back to the political and institutional background of the UMTS initiative as well as to the way that the US actors, given their institutional and conceptual contingencies, interpreted the nature of the European efforts to create a 3G standard within the IMT-2000 process.
BUILDING A NEW GLOBAL MARKET AND COMPETITION REGIME FOR 3G SERVICES
In many respects, telephony has been underpinned by the idea of universal service. This almost universally applied policy ensures citizens fairly priced access to basic telecommunication services. Traditionally, data communications and mobile telephony have not been part of universal service policy. But with the Internet revolution, which sparked the notion of the Global Information Society (GIS) and various National Information Infrastructure (NII) initiatives, most national governments as well as the European Union are about to include access to the Internet (fixed and wireless) in the concept of universal service. Integrating the Internet and related access technologies into a universal service offering is complicated because it requires that the Internet be more regulated and that governments have the right to set certain technological standards.
This section thus argues that the standardization of 3G services is not just another war between players in the wireless business trying to dominate the market. It tells the story of how the 3G standardization issue became entangled with the GIS/NII debates over how to achieve interoperability on the basis of standards. Keeping the traditional telecom institutions at bay, key actors in the USA such as the FCC have pointed towards the importance of technological development and market driven standards. The European Union has, on the other hand, proposed a more active role for the traditional telecom institutions, preferring ex-ante standardization.
One major result of these debates on standardization and interoperability is that the standardization issue could not be resolved in the closed rooms of international standardization bodies such as the ITU; it was left to the major vendors to lay down the rules and principles for how the basic 3G standards are to be set in the future. As indicated by our account of the shifting relations between two major vendors of competing cellular systems, Ericsson and Qualcomm, the manufacturers are in the process of moving away from the winner-takes-all standardization strategy and are developing a cooperative attitude, supporting a modular system that will include several compatible regional standards.
In the EU regulatory context, prospects for the 3G services have been shaped by different policies. In the main, two sets of policies are directly relevant: universal service and the European GII policies. The former has its roots in the telecommunication industry and embodies the idea that every citizen shall have equal access to telecommunication services at a fair price regardless of geographical location. The latter plays a role in this context because it integrates a host of policies (such as patent and copyright laws) which will matter to the IT sector’s innovativeness and profitability.
European coordination of R&D and standard promotion
The European UMTS, which has deep roots in European politics, dates back to the first part of the 1990s. As called for by the Council and the European Parliament, the Commission issued a Green Paper setting out EU policy for the development of mobile and personal communications (European Union 1994). This paper connects back to the European GSM experience, but it certainly set the parameters for things to come. Captured in one word, what the Commission strove for was coordination, in order to counterbalance US dominance in PC and Internet related technologies.6
The idea of coordinated technological consortia is a deeply rooted European tradition. As noted by Hawkins (1999), the first real prototype of the consortium model appeared in Europe. The European Computer Manufacturers Association (ECMA) was founded in 1963 to deal with growing concern in Europe that the mainframe computer market was too concentrated amongst a few, mostly US-based companies. ECMA established many of the basic organizational characteristics of the consortium. Membership is voluntary, industry-wide and international (within the European region), the focus is on specific technical problems, the process relies on voluntary technical committees, and financial support comes from member contributions. ECMA was among the first ICT industry organizations to issue PASs. Moreover, ECMA established close relationships very early on with the International Organization for Standardization (ISO). In the late 1960s and early 1970s, ECMA was one of the main advisors to ISO Technical Committee 97 in setting up the first major international computer standardization program, Open Systems Interconnection. From the beginning, ISO reciprocated by placing ECMA recommendations on a fast track whenever they were presented for consideration as international standards.
Persuaded that mobile telecommunications was a specific area in which Europe enjoyed a technological advantage, key policy-makers within the European Union saw good reason to set up a new mechanism for coordinating the development of technology, market regulations and standardization issues. These sentiments led up to the Commission’s 1994 Green Paper on the future of European mobile technology and services, with several far-reaching consequences. On the basis of the consultation process that followed the 1994 Green Paper, the Commission issued a proposal including full deregulation and support for the development of advanced mobile services. In response to these developments, the European Parliament adopted in May 1995 a Resolution taking a very liberal and forward-looking view of the mobile sector, which served to reinforce the Commission’s position in this matter. These and subsequent decisions taken by the European Council in 1996 laid the institutional framework for future efforts in European mobile technology and services.
In accordance with the mandate given by the Parliament and the Council, the Commission spearheaded four separate initiatives. First, it assisted in the creation of the UMTS Forum, an organization consisting mainly of private and public market actors. The main objective of this Forum is to contribute to the elaboration of a European mobile policy based on industry-wide consensus on regulatory issues. Second, in parallel with the work carried out within the UMTS Forum, the Commission organized a broad European consultation process on 3G communication issues. Third, on the basis of the consultation process and the reports by the UMTS Forum, the Commission issued a proposal for an Action Plan aiming to shape the regulatory environment so as to ensure the successful development and launch of UMTS-based 3G services. Fourth, the Commission was also instrumental in instigating technological research on mobile technology, mainly related to the development of CDMA, within the RACE program under the 4th Framework program.
ETSI and the Standardization of W-CDMA Technology
Following the new approach to market driven standards described above, the Commission identified a need for regulatory action to be coordinated at the European level. Standardization of UMTS was, accordingly, to be worked out in close cooperation between ETSI, operators, manufacturers and national regulators. The Commission concluded that the standardization work should in particular aim at ensuring end-to-end interoperability in a pan-European UMTS environment (European Union 1997).
Accordingly, a special role was henceforth identified for ETSI as a key player in European standardization. The Commission also considered that UMTS could not be conceived as an isolated island, but needed to be seen as part of the global communication infrastructure, the Internet. Thus the Commission called upon the member states to promote the “UMTS standard under development within ETSI as a key element of the IMT-2000 recommendations currently in preparation at the ITU” (European Union 1997: 18).
According to the notion of coordinated European action, ETSI’s role on the European 3G scene was to set a formal standard based on the vendors’ proposals. But according to the same notion, the European Union should also help coordinate industrial R&D efforts across Europe. Hence, for almost a decade, the European Union developed a stake in W-CDMA technology. An intensive effort to develop a fully 3G compatible W-CDMA based wireless technology was carried out within the RACE/CODIT program during the first part of the 1990s. This project also served as a testbed for the major wireless manufacturers, such as Ericsson, which fashioned its Ericsson Wideband Testbed technology on the RACE/CODIT program. Another example of an EU sponsored program is the FRAMES project, which served the purpose of developing the basic technological concepts for the terrestrial radio access network elements of the UMTS system. The results of the EU programs were subsequently merged and submitted to ETSI as a candidate technology for UMTS.
Resulting from activities related to the R&D programs mentioned above, two main proposals for the UMTS standard, both based on different versions of CDMA technology, were submitted to ETSI as technological candidates for the future UMTS specification. One was led by Ericsson and Nokia, proposing the wideband W-CDMA concept; the other was led by Alcatel, Siemens, Nortel and others, who tabled time division CDMA (TD-CDMA). After a number of efforts to reach agreement on the standard through voting in 1998, ETSI was caught in the battle between the two European groups. But after a vote in January 1999, the two contesting groups arrived at a compromise: ETSI member companies would jointly support one consistent air interface for wideband wireless telephony, a merger of standards incorporating W-CDMA from Ericsson and TD-CDMA from Siemens and Alcatel. W-CDMA will be used for the bulk of wide-area applications (normal outdoor wireless calls), whereas TD-CDMA will be used primarily for low-mobility indoor applications. In a joint statement, the four major European companies, Ericsson, Nokia, Siemens and Alcatel, pledged their support for the combination of technologies as the basis for future 3G services. That compromise and the subsequent joint announcement were an important milestone in Ericsson’s quest for market dominance.
Towards Inclusion of the Internet in EU Universal Service Policy
The official formulation of universal service policy by the European Union is a byproduct of the recent liberalization and privatization of telecommunications. Prior to the 1990s, universal service was one of many policies guiding national monopolies. European policy-makers recognized strong synergies between competition and universal service goals; competition was expected to drive down prices and increase the incentives for new ICT services as well as the investments in the technologies on whose back they would be piggybacked. But these new services were expected to be unevenly distributed. Explicit universal service policies were thus issued by the European Union to balance the negative sides of EU driven liberalization of the telecommunications sector.
The concept of social needs has been critical to the formulation of European universal service policy. In the early policy documents, policy-makers defined the scope of universal service as almost exclusively related to traditional telecommunication services (access to fixed voice telephony, fax and low-bandwidth modems). Universally available and fairly priced access to fixed voice telephony is thus the core of the universal service offering, not least because it constitutes a pathway to public social services, including the police, etc. Until the late 1990s, one result of this stance was that the Internet and related broadband services were not included in universal service policy (European Union 1996a).
More recently, however, the Internet revolution has changed our understanding of communications and hence altered the basis for universal service policies. Widespread use of e-mail, web-based information, and even official services such as Internet distribution of public documents, has forced policy-makers to rethink the notion of information infrastructures. In the EU context, this has been most sharply expressed in the Green Paper on Public Sector Information in the Information Society.
Dissemination of public sector information on the Internet does not automatically imply that all citizens have an equal access to it. Substantial differences exist in access to the tools of the Information Society (computers/modems, etc.) and the ability to use them. In this context the Report on job opportunities in the Information Society stresses that the access to such tools and the skills to use them are prerequisites for job creation and need to be prioritised. (European Union 1998)

Interoperability and standardization. As was concluded in the section on 3G technology above, the convergence of telecommunications and the Internet calls for network interoperability, which can be achieved if critical interfaces between network elements are clearly defined. This was also spelled out in the 1997 Green Paper on the Convergence of the Telecommunications, Media and Information Technology Sectors (European Union 1997), stating that
… one of the most important consequences of the blurring of technological borders between information technology, telecommunications and consumer electronics is the increasing globalisation of services. The inherently global nature of the Information Society calls for any standardisation in support of its development to be similarly global. Users may want access from any terminal to any service, independently of the technology used, or the geographical point of such access, within a multi-vendor environment. A major objective for standardisation therefore is to achieve interoperability between networks and services. Technological harmonisation is not an objective. However, standardisation is a tool which can reinforce both general policy objectives, such as the creation of an Internal Market for communications services, and the regulatory framework. Encouraging best business practices in areas related to data protection and security of digital signatures may be supported by standardisation and consensus-building within an appropriate regulatory framework. (emphasis added)
To illustrate the need for open standards, the European Commission has on several occasions pointed to the role of browser technology (e.g. Netscape and Microsoft Internet Explorer) as a gateway to Internet services.
The Internet community is trying to build on open standards that allow both interoperability and competition. Open standards are particularly important with regard to hardware and software tools for Internet use and access. Items such as browser software are in a way the “entry ramps” to the information superhighway, and it is important that they be based on open standards so that all users may have equal access to the Internet. Otherwise proprietary standards and their attendant licensing schemes will control access to content and electronic commerce transactions, and will adversely influence licensing and other market behaviour. (European Union 1998: 56)
ETSI’s special role as a consequence of the quest for interoperability
In formulating a policy that includes access to Internet services in the concept of universal service, the European Union has also recognized that interoperability is the key. In order to set out a strategy and policy orientations for Europe’s information infrastructure, the Commission issued a communication on the benefits of the Wireless Information Society (European Union 1997). It advanced the notion that an overall strategy was urgently needed to provide technological and regulatory certainty for wireless communications beyond voice services.
This called for a new approach to standardization issues, which resulted in the EC Communication on Standardization and the Global Information Society (European Union 1996a). Here, the policy-makers involved in the making of a new European standardization policy for the ICT area saw problems in the earlier system of formalized standards, owing to its slow working methods and endless processes for reaching consensus. Market driven standardization implies, as also recognized in the Communication, that several common technical specifications may emerge in parallel, followed by a shakeout in which only one or possibly two common technical standards, representing the technology dominant in the market, survive. Whilst the ideal standardization process consists of an open consensus, “… it is not unusual for dominant market players to reinforce, by means of technological specifications, their dominant position in the market place” (European Union 1996a: 3).
Today, the Commission observed, this often takes the form of private standardization consortia, which draw up common technical specifications with the aim of establishing a de facto standard, what the authors of the Communication call PAS (Publicly Available Specifications). The tendency of industry to form consortia to formulate technological specifications for de facto standards has, according to the Communication, to be seen in connection with the slowness of the formal standardization process. In a study of the rise of consortia in Europe, Hawkins (1999) outlines an intriguing approach to monitoring emerging ICT industry dynamics that is useful in interpreting these findings. This approach classifies groupings of ICT interests into incumbents, existing suppliers of telecommunication and computer products and services with an extensive installed base of technology linked to an established customer base; insurgents, newer firms seeking to build market shares for goods and services based on new technology; and virtual communities, centered around emerging configurations of dominant users of networked services, especially in an Internet environment. A look at the founder members of the consortia makes it clear that incumbent and insurgent perspectives drove most of the consortia formation that occurred throughout the 1990s. In setting up consortia, incumbents were looking for new ways to maintain and increase revenues by maximizing and enhancing existing investments in network facilities in order to exploit the commercial possibilities of the new markets for electronic services. Consortia like ADSL-F and ATM-F emerged predominantly among incumbents with these objectives. Insurgents sought to use consortia to develop market share quickly by breaking up some of the vertical integration that still existed among incumbents. This yielded consortia like OMG, IMA and the Open Group, which focused more on the articulations between software, digitized content and networked services than on platforms and network facilities as such. In some cases these agendas overlapped, yielding high degrees of joint participation (as with DAVIC, OMG and W3C). Other consortia tended to gravitate towards specific incumbent interests or competencies, particularly those oriented mostly to telecommunications, like TINA-C, ADSL-F or EURESCOM (Hawkins 1999).
While PASs in many ways take on the form of standards, they are an instance of cooperation between private companies. As such they have to be assessed according to the principles of European and national competition law, i.e. Articles 85-86 of the European Treaty. From this vantage point a PAS does not constitute a standard, since it is an entirely private arrangement between two or more firms. This assessment led the Commission to the position that a PAS could be turned into a legally defined standard if it was confirmed by a formal standardization organization, such as ETSI. Formal standards therefore carry a particular form of legitimacy which distinguishes them from de facto standards and PASs, and which allows national and Community law to have recourse to them (Hawkins 1999). The emerging idea was that PASs developed by private actors could be recognized as proper standards if they were adopted by ETSI (European Union 1996a).
Building on the formula of the European technology consortia, the European Commission tried to revitalize ETSI and helped it carve out a new niche in the standardization process. To reinforce this middle way between de facto and formalized standardization, the Commission was instrumental in setting up a new European standardization body, the ICT Standards Board, serving the purpose of coordinating standardization processes within private industry and the formal standardization processes within ETSI (ICT Standards Board 1999).
THE USA: FOCUS ON MARKET-DRIVEN EX-POST STANDARDS
European wireless services are typified by far-reaching convergence around the GSM standard. In the USA, by contrast, three systems offer rival and incompatible technological platforms for cellular telephony. They are incompatible in the sense that subscribers to one service cannot place a call through a competitor's network without buying a new phone with a compatible radio access interface. They are, however, compatible at the level of network-to-network connection. Roaming thus only works between "islands of compatibility", e.g. when a GSM subscriber places a call through a GSM network which is switched to a CDMA network and on to a subscriber with a CDMA-compatible mobile phone.
The 3G system has emerged as one of the most promising new markets. Agreement on the general objectives, e.g. the need for interconnection and interoperability, conceals substantial disagreement over how to stimulate and regulate the new information infrastructure. Although there is a significant difference between the governance structures for the second and third generations of mobile telecommunications, the idea of international ex-ante standardization still looms large in the European viewpoint. The Europeans hence advocated the UMTS system as a unified international standard, set by ETSI and confirmed by the ITU, before the services are implemented as fully operating, commercial services. While the European effort to regulate the future 3G system builds on an attempt to increase the role of the traditional telecommunications institutions, the USA has embarked on a route that goes in the opposite direction. The European strategy encountered two types of roadblocks.
First, the US approach to communication technology R&D is far more competitive. Individual firms perform much of the research in the context of their market planning. Both regulatory and private industry actors are determined to uphold the ideas that guided the development of the Internet: open networks, cumulative user-driven innovation and fierce competition. Since the 1984 deregulation, the regulatory steps taken by the FCC made it impossible for the cellular operators to draw on Bell Labs' (later Lucent's) vast resources, which helped foster a highly competitive cellular equipment manufacturing sector, with the result that proprietary rather than open systems standards have emerged in the US cellular industry (National Research Council 1997; West 2000).
Within this regulatory line of thinking, the notion of truly market-driven, bandwagon standardization processes plays an important role. This is a far cry from the image of universal service that animates the regulated vision of the European wireless information infrastructure. Coordination takes place in a diverse setting of standards-setting organizations. But for the most part standardization is a competitive rather than a cooperative process, with each company or group of companies striving to protect its commercial interests. Consequently, rather than supporting the idea of a common 3G standard as the basis for the globalization of the 3G market, the FCC and other key policy-makers on the US scene pushed for diversity in terms of competition between multiple but compatible standards. Key actors in the USA thus began to observe the European initiative to create a global 3G standard with increasing distrust. Secretary of Commerce William Daley and Secretary of State Madeleine Albright accused the European Union of being driven by "industrial policy considerations" and asked that the European Union support "converged or multiple standards as deemed necessary by ITU participants" (Emmett 1998).
One implication of the US legacy was that the decisions of concern were partly driven out of the formal standardization processes. In particular, the position taken by the FCC put the intellectual property rights issues connected to standardization in focus. Thus, the European strategy for UMTS was up against Qualcomm, the Californian defense contractor and commercial developer of CDMA technology, which has persistently promoted CDMA technology and, since the mid-1990s, has opted for a pre-emptive strategy aimed at reaping profits from its first-mover advantages in CDMA.
Universal service vs. technological development in the US context
In the eyes of the Americans, it was a textbook case of how standards were used to tip the market. What struck the US-based industry as particularly questionable was ETSI's position that market-driven standards, e.g. standards that grew out of the industry's capacity to meet the demand side's preferences, were compatible with ex-ante standards. As for the standardization of next generation wireless services, the FCC's standpoint has been heavily influenced by its historical past, particularly its interpretation of the effects of the regulatory efforts to shelter the Internet from traditional ex-ante standardization and the "bundled-system" or black-box approach to innovation traditionally favored by the telecom industry.
Two historical references still play a particular role in the FCC's collective memory (Neuman et al. 1997). Under the 1956 Consent Decree, the final judgment in the landmark case brought by the US Justice Department against Western Electric in 1949, AT&T agreed to close its business activities outside telecommunications. The agreement between AT&T and the regulatory institutions drastically restricted AT&T from expanding into the emerging computer industry and allied technologies; AT&T could not develop freestanding skills in software or digital computers. Already in the wake of the first real breakthroughs in computer communication, the FCC promoted the idea of an open infrastructure. Rather than strictly regulating the data communication sector, the FCC created a regulatory framework that unbundled network elements. In sharp contrast to the ideas that guided the regulation of AT&T's monopoly, it became conceptually and practically possible to isolate the basic network elements, improve different parts of the network and recombine these parts in innovative ways; from the late 1960s onwards this open access policy helped foster a critical group of innovations contributing to increased network performance. From the 1960s onwards the FCC successfully freed the data communications sector from the rigidities of telecom regulations. In pursuing the policy of open network architecture, the FCC allowed rapid innovation and subsequent commercialization of data communication technologies, such as time-sharing, which provided many of the basic technological principles of the Internet.
As noted above, under the terms that broke up AT&T, the emerging Baby Bell operating companies were awarded mobile telephony licenses but were cut off from the abundant technological resources of Bell Labs (later Lucent). Very little cellular technology was transferred from Bell Labs to Bellcore, which was to serve the Baby Bells. Under these regulatory conditions, the new cellular operators had to turn to established system vendors, such as Ericsson and Motorola. In the longer run, the operating companies expanded their connections beyond the already established vendors into the Californian computer and communication industry (National Research Council 1997).
The US preference for market driven ex-post standards
To be sure, the FCC did not arrive at the formula of open architecture and user-driven innovation in one stroke. It has, however, been rather consistent in its support for open network architecture and the unbundling of network elements. In later years, the FCC has unfailingly supported user-driven innovation by keeping the critical network elements (such as "the last mile") open to new services on cost-effective terms. The objective has been to prevent the phone companies from closing the architecture of the network and hence blocking future innovation. The FCC firmly holds that it was the open architecture that set the trends that created the Internet. Had the FCC been less protective of innovations in different network elements, technology would have remained subject to detailed regulation by the monopolist AT&T (Bar and Borrus 1999). As will be discussed below, this interpretation of its own past has inspired the FCC to strongly oppose the ex-ante standardization processes preferred by key European actors, such as ETSI.
That particular kind of historical experience is held in high regard by the American regulators. The FCC's position on standardization seems to be fundamentally in accord with the view of standardization that has emerged within the White House in connection with the GII Framework. The Framework document clearly recognizes the importance of technical standards for the long-term success of the Internet and related access systems in areas such as electronic payment, security, high-speed network technologies and data interchange. But the White House also concludes that "there need not be one standard for every product associated with the GII, and technical standards need not be mandated" (p. 18).
In a response to the European Commission Directorate General (DG) XIII communication on the need for strengthened international coordination of the GII (discussed above in the section on the European Union), the American National Standards Institute (ANSI) replied that it would be a serious mistake to include browser technology (Explorer and Netscape) in the list of essential open interfaces since
… [browsers] are commercial and competitive products. Their success or lack of success may or may not depend on the use of open standards. [Advocating that successful GIS products take on the status of public utility] goes against the basic concept of a market led Global Information Society … (American National Standards Institute 2000)
Following this line of argument, the FCC developed a passive stance towards the European invitation to participate in ex-ante standardization of wireless services. As some observers suggest, the FCC was tied by the fact that the US participants in the 3G technology race represented different technological alternatives, implying that the FCC had to remain neutral in the standardization process.7 But the issue of 3G standards was also affected by the more general question of how to standardize the National Information Infrastructure (NII) initiative. According to most observers, the regulation of the new information infrastructure has gravitated towards a clearer recognition of market-driven standards. As the worlds of mobile telecommunications and computer communication (the Internet) collide, the clear trend is for direct regulation to withdraw from the market. This means, according to Bar and Borrus, "an almost inevitable move away from traditional telecom approach to interconnection, and toward the competitive mechanisms characteristic of the computer industry, checked by the possibility of court intervention" (Bar and Borrus 1999). Although wireless communications were not initially included in the NII initiative, later discussions reveal an interest in "information nomads", clearly referring to wireless access to the Internet and 3G services (National Research Council 1997).
In line with the more general policy towards datacom/telecom regulation within the framework of the Internet, the FCC quite clearly expressed the view that the ITU's IMT-2000 should, contrary to the European standpoint, be based on a set of compatible standards. "I am concerned," FCC Chairman William Kennard said in an interview, "that Europe may be effectively bypassing the ITU consensus process by prematurely adopting a particular standard without regard to the market-based needs of service providers in other countries" (quoted in Emmett 1998). In explicitly addressing 3G standards and US trade policy, the President's Export Council endorsed the view that the European efforts to set a single standard should be blocked. In its advice to the President on this matter, the Council warned the White House against accepting 3G technical convergence:
The members of the President’s Export Council want to take this opportunity to emphasize the importance of maintaining and expanding international markets for U.S. telecommunications and related businesses. The only effective means of accomplishing this critical objective in the context of the forthcoming ITU determination is for this Administration to promote aggressively, in all available domestic and international fora, a multiple standards regime. That is, given the fact that U.S. telecommunications companies have invested billions of dollars in multiple, market-accepted 3G technologies, it would be sharply counter to the interests of U.S. industry as a whole either to advance any particular U.S. technology over another, or to fail to promote a multiple standards regime and thus allow our global competitors to assume that multiple standards are not a U.S. Government priority. In this sense, moreover, we urge you to oppose government-mandated "convergence" of any particular U.S. 3G standard with the standard of any other region, since doing so would, again, relegate other U.S.-adopted, market-accepted technologies to a lesser status and risk stranding billions of dollars in U.S. investments around the world.
The FCC's strategy yielded two significant results. First, by supporting the ITU's vision of "a family of standards", it blocked the route towards 3G convergence. ETSI's vision of using the ITU process to set a global standard based on the European version of W-CDMA technology thus had no future. Secondly, with ETSI's strategy hindered, it was left to the vendors of mobile systems to battle out who controlled the key intellectual property rights to CDMA technology.
FROM COMPETITIVE STANDARDIZATION TO AN ACCORD: ERICSSON VS. QUALCOMM
As could be expected, ETSI's UMTS proposal received limited support in the USA. First, ETSI's proposed framework involved ex-ante standardization, meaning that the standardization bodies would work out a single standard for 3G services. This was seen as being at odds with the importance of an open and competitive network architecture. Secondly, the main US manufacturers of wireless systems were unwilling to place their future in the hands of a standardization body that would make a final decision between competing technologies.
The two sides clashed over intellectual property rights in the standardization process. Here Sweden's Ericsson and Qualcomm, the California-based pioneer in CDMA technology, battled out both future standardization principles and the balance of power between the two main system technology suppliers. Instead of seeking to put forward CDMA as a national standard, Qualcomm embarked on a performance play strategy (Shapiro and Varian 1999). Since the early 1990s, when few believed in the feasibility of spread spectrum technology, Qualcomm has successfully promoted CDMA as a superior technology for digital cellular services, with huge advantages over GSM and D-AMPS for both service providers and their customers. The industry was taken by surprise when large cellular operators, such as Bell Atlantic and AirTouch, picked CDMA technology. By bringing a handful of large operators onboard, Qualcomm forced equipment manufacturers to start producing CDMA equipment based on Qualcomm's patents in the technology.
Beginning in 1998, Qualcomm, the main US manufacturer of CDMA systems and holder of several key patents in CDMA technology, began to signal that it no longer wanted to play along. In essence, Qualcomm aggressively pursued the standpoint that it held several key patents in CDMA technology and was henceforth in a position to choose to whom to grant licenses. Qualcomm thus made it known that it could shape, if not entirely block, the future use of the CDMA air interface as the basis for the 3G system (Shankar 1998). This threat sent a shock wave through the international telecom community. In reaction, Ericsson turned to the ITU with warnings that it, too, held what it described as "essential" IPRs in CDMA technology, covering both Ericsson's own W-CDMA and the version of CDMA championed by Qualcomm (International Telecommunication Union 2000).
Nominally, the divisions spring from arcane technical issues of "chip rate" and "synchronization" of the proposed 3G standards and their backward compatibility with current GSM and CDMA/IS-95. Ericsson and the GSM camp argued for ETSI's UMTS standard. Qualcomm and the CDMA/IS-95 camp argued for a single 3G standard, based on their IPR, which would be backward compatible with all current 2G technologies, including CDMA/IS-95 (Emmett 1998). According to both ITU and company sources,8 Ericsson claims that it holds patents (and/or pending patent applications) that are essential to the two different proposed 3G standards based on W-CDMA and CDMA2000. Ericsson stated that it was fully prepared to grant licenses to these patents on fair, reasonable and non-discriminatory terms, subject, however, to conditions of reciprocity, which are required to create fairness in a multi-standard environment. It adds: "Ericsson wants each country, or region, to be able to choose among the alternative global standards without being hindered by unequal IPR policies. Ericsson will therefore grant licenses to the alternative 3G standards on the basis of full reciprocity on a global scale between treatment of essential IPRs for these standards." This means, according to Ericsson, that it is not prepared to offer licenses if some other company does not apply such reciprocity in its licensing commitments and, by such non-reciprocal action, hinders free choice on equal terms between available standards.
According to the same source, Qualcomm's position is that it holds essential IPRs in CDMA2000 technology; here it is not willing to waive its rights, but it is willing to negotiate licenses with other parties on a non-discriminatory basis on reasonable terms and conditions. On most other CDMA proposals, most significantly Europe's UMTS/W-CDMA and Japan's ARIB W-CDMA, Qualcomm was neither willing to waive the IPR rights it says it holds, nor willing to agree to negotiate licenses with other parties on a non-discriminatory basis on reasonable terms and conditions. It was prepared to license its IPR for these proposals only if the following three principles were met:
* a single, converged worldwide CDMA standard should be selected for 3G;
* the converged CDMA standard must accommodate equally the two dominant network standards in use today (ANSI-41 and GSM MAP); and
* disputes on specific technological points should be resolved by selecting the proposal that either is demonstrably superior in terms of performance, features, or cost, or, in the case of alternatives with no demonstrable material difference, the choice that is most compatible with existing technology.
These technical divisions and IPR questions mask the more fundamental political maneuvering by Ericsson and Qualcomm. Backwards compatibility of 2G and 3G networks is imperative for established carriers to minimize their initial investment in 3G technology. Without backwards compatibility, carriers would be compelled to build entirely new 3G networks rather than building gradually, beginning in high traffic urban areas.
Ericsson's proposed 3G standard would mean different, and incompatible, 3G technologies for GSM on the one hand and CDMA/IS-95 on the other. Separate standards would, in effect, prevent current CDMA/IS-95 infrastructure vendors, in particular Qualcomm and Samsung, from providing 3G infrastructure compatible with current GSM networks. In theory, nothing would preclude these companies from providing two types of 3G: one compatible with current CDMA/IS-95 networks and the other compatible with current TDMA/IS-136 and GSM networks. However, according to industry commentators and financial analysts, neither company has the experience or the engineering resources to do so. Thus, given multiple 3G standards, Qualcomm and Samsung, and the carriers currently committed to 2G CDMA/IS-95, would become isolated islands of technology. They would be limited to, at most, the approximately 15-17 percent share of 2G world subscribers which CDMA/IS-95 systems currently serve or will serve in the near future.
Reconciliation
In order to promote its strategy successfully in the USA and the ITU, Ericsson and the ETSI camp needed an American ally. This time Ericsson chose to act through GSM operators in the USA. On February 9, 1999, the Universal Wireless Communications Consortium (UWCC) and the North American GSM Alliance (GSMA), trade groups for carriers with TDMA/IS-136 and GSM networks, respectively, announced a "TDMA-GSM Interoperability Agreement". This agreement envisions full interoperability among future TDMA/IS-136, GSM and AMPS networks and terminals, and pledges cooperation among TDMA/IS-136 and GSM carriers and manufacturers to reach that goal. As demonstrated by Herschel Shosteck, this agreement tilted the balance of power between Ericsson and Qualcomm (Herschel Shosteck Associates 1998).
Interoperability, according to the UWCC-GSMA agreement, will provide three benefits for TDMA/IS-136 and GSM carriers and their end-users. First, with the exceptions of Japan and Korea, interoperability will form a combined global footprint for TDMA/IS-136 and GSM carriers. This will require “translations” between the MAP signaling of GSM networks and the ANSI-41 signaling of TDMA/IS-136 networks. However, the necessary translation protocols are already in operation. Second, interoperability will enable a fully integrated set of features and services across the two digital technologies. This will allow end-users full access to advanced services wherever they may roam on a TDMA/IS-136 or GSM network. Such access will become increasingly important as advanced services proliferate.
Third, over the long term, interoperability will increase economies of scale in R&D and manufacturing. This will give increasing advantage to TDMA/IS-136 and GSM, and their future enhancements (including 3G), in providing as yet unknown services and features.
The announced agreement does not commit the parties to integrating TDMA/IS-136 and GSM into a single 3G technology. Nonetheless, it forms a possible framework for doing so. Notwithstanding that the announcement amounts to "cohabitation" rather than marriage, it is clear that the majority of carriers and manufacturers affiliated with the UWCC and GSMA intend the future paths of TDMA/IS-136 and GSM to move closer together. The agreement facilitates optimum compatibility of anticipated advances in current TDMA/IS-136 and GSM, in particular EDGE and GPRS.
In March 1999, the two main opponents announced that Ericsson would buy Qualcomm's infrastructure business, involving the latter's manufacturing of CDMA systems. The deal implied that Qualcomm would concentrate on revenues from CDMA patents and its CDMA chip production. Through the deal, Ericsson put itself in a position to jumpstart sales of CDMAOne and CDMA2000 systems. In addition, the two firms announced that they had also entered into a long-term cross-licensing agreement in order to end the ongoing "IPR war" between the firms.
More than anything else, the deal resulted from US lobbying and Chinese trade policy goals. As widely reported in the business press, it was dawning on Ericsson's chief strategists that Qualcomm's market share might be about to take a giant leap forward in China. In particular, the Chinese government was beginning to respond positively to the strong political lobby aiming at opening the Chinese market to Qualcomm's CDMA technology, which the second Chinese operator was considering. It was already well established that US foreign policy had targeted the Chinese market for Qualcomm's version of the cellular technology, but by early spring the Chinese seemed more open than before. At this point, the Chinese were using Qualcomm as a pawn in a much bigger game. In fact, the Chinese indicated that the door could be opened for Qualcomm if the US government would support China's effort to enter the World Trade Organization (Biers and Wilhelm 2000; Lynch and Chou 2000; Robertsson 2000).
Furthermore, Ericsson and Qualcomm made it known that they "also have agreed to jointly support approval by the ITU and other standard bodies, including the U.S. Telecommunications Industry Association (TIA) and the ETSI, of a single CDMA 3G standard" (Ericsson Press Release 99.3.25). The agreement commits both groups "to work cooperatively for the acceptance of each other's Third Generation … submissions to the ITU". Of particular importance, both groups "continue to support the view that multiple Third Generation standards are necessary …". While this supports separate 3G standards submissions to the ITU for each technology, it does not preclude eventual standards convergence. These tendencies have later been confirmed by independent reports. Observers covering the standardization process seem inclined to conclude that the IMT-2000 standard will, in fact, consist of two or three compatible versions of the CDMA air interface, involving most notably Ericsson's W-CDMA and Qualcomm's CDMA2000 (Nilsson 1999).
CONCLUDING REMARKS
The European strategy of seeking a dominant technology was questioned both by the US federal authorities and by the wireless industry in North America. The US side neither supported the idea that the future integration of mobile telecommunications systems and the Internet should be regulated by the old telecom institutions, nor accepted the idea that the 3G system should be based on a single universal standard. Instead it wanted to impose the US notion of market-led standards, connecting directly to the US experience of regulating the Internet. In 1998, the two sides clashed over air interface standards and intellectual property rights to CDMA technology. The way this conflict was resolved indicates to outside observers that certain key European actors, such as Ericsson and Nokia, are moving away from their original idea of a single standard for 3G services and towards the position that the new mobile telecom services should be based on several different but compatible standards. Thus, Ericsson and Nokia seem less inclined to uphold their original standardization strategy, which was designed to "tip the market" in their favor.
The alternative now favored by the European industry is akin to the open architecture of the Internet. Network architecture should be as open as possible, allowing user-led innovation and new combinations of radical technologies. In the area of air interfaces, Ericsson and Nokia were ready to accept the idea that the new 3G services could be based on several compatible standards, constituting “a family of standards”. From a technical perspective, this means that the network is open to modular innovations. From a business perspective the standardization war resulted in a mutual recognition that different standards could co-exist in the same network and that market shares would vary from region to region.
ACKNOWLEDGEMENTS
The author thanks Hakan Ledin for thoughtful criticism and useful advice. Thanks for comments on earlier versions are also due to Steven Casper, Roger Hollingsworth, Bart Noteboom, David Soskice and Richard Whitley, and, as usual, to Jonathan Zeitlin.
1 For good overviews of the Swedish development, see: McKelvey et al. (1998) and Molleryd (1999).
2 The other important standardization body, the International Telecommunication Union (ITU), initially stayed out of the standardization of mobile telecom systems.
3 In fact, some of Philips' essential intellectual property rights in GSM, such as the speech coder, are licensed at no cost. See Bekkers et al. (2000).
4 The author thanks Mr Hakan Ledin for pointing this out.
5 The firms that were left outside the cross-licensing agreements suffered from a lack of centrality and connections to the European nexus of GSM technology. Motorola, for example, refused to license its key patents to DanCall, the Danish manufacturer of GSM telephones that had shown the first working prototype of a GSM phone, because of DanCall's pre-existing ties with large Japanese firms with proven capacity to develop cellular systems. The French electronics specialist Matra was another victim of Motorola's licensing strategy. It made several attempts to secure licenses but was judged to have too close ties with Motorola's main US and Japanese opponents.
6 For this and the overview of European communication policy below, see European Union (1999) Status Report on European Union Electronic Communications Policy, Update December 1999, Brussels, 22 December 1999 (http://www.ispo.cec.be/infosec/telecompolicy/tcstatus.htm).
7 Interview with Tom Lindstrom, Ericsson, April 11, Washington, DC.
8 The following account of the IPR conflict is based on press releases from Ericsson, Qualcomm and the ITU, inter alia.
REFERENCES
American National Standards Institute 2000: Comments by the Information Infrastructure Panel on the need for coordination, http://web.ansi.org/public/iisp/intlcharter.htm.
Bach, David 2000: International cooperation and the logic of networks: Europe and the global system for mobile communications (GSM). Berkeley: BRIE Working Papers No. 19.
Baldwin, Carliss Y. and Clark, Kim B. 1997: Managing in the age of modularity, Harvard Business Review, September-October.
Bar, Francois and Borrus, Michael 1999: Islands in the bit-stream: charting the NII interoperability debate. Berkeley: BRIE Working Papers.
Bekkers, Rudi, Duysters, Geert and Verspagen, Bart 2000: Intellectual property rights, strategic technology agreements and market structure: the case of GSM. Paper presented at the Swedish International Symposium on Economics, Law and Intellectual Property Rights, Stockholm, June.
Biers, Dan and Wilhelm, Kathy 2000: A cautious courtship, Far Eastern Economic Review, 7 December.
Cattano, G. 1994: The making of a pan-European network as a path-dependency process: the case of GSM versus integrated broadband communication, in P. Pogorel (ed.), Global Telecommunication Strategies. Amsterdam: Elsevier Science.
Dang-Nguyen, Godefroy, Schneider, Volker and Werle, Raymund 1993: Corporate actor networks in European policy making: harmonizing telecommunications policy. Köln: Max-Planck-Institut für Gesellschaftsforschung.
Edquist, Charles, Hommen, Leif et al. 1998a: The ISE Policy Statement: The Innovation Policy Implications of the Innovation Systems and European Integration. Linkoping: ISE Research Project, University of Linkoping. http://www.tema.liu.se/tema-t/sirp/PDF/34_.pdf
Edquist, Charles, Hommen, Leif and Tsipouri, Lena 1998b: Findings and Conclusions of the ISE Case Studies on Public Technology Procurement. Linkoping: ISE Research Project, University of Linkoping. http://www.tema.liu.se/tema-t/sirp/abstract/322-12.htm
Emmett, Arielle 1998: 3G wireless: will the best technology win?, America’s Network, 102(3): 34-38.
European Union 1987: Green Paper on the development of the Common Market for telecommunications services and equipment, COM (87) 290.
European Union 1994: Green Paper on a common approach in the field of mobile and personal communications in the European Union, COM (94) 145.
European Union 1996a: Communication from the Commission to the Council and the Parliament on "Standardization and the Global Information Society", COM (96) 359.
European Union 1996b: Communication of 27 November 1996 on assessment criteria for national schemes for the costing and financing of universal service in telecommunications and guidelines for the Member States on the operation of such schemes, COM (96) 608.
European Union 1997: Communication from the Commission to the Council, the European Parliament, the Economic and Social Committee and the Committee of the Regions: Strategy and policy orientations with regard to the further development of mobile and wireless communications (UMTS). Outcome of the public consultation and proposals for creating a favourable environment, COM (97) 513.
European Union 1998: Communication on globalisation and the information society, COM (98) 50.
European Union 1999: Green Paper on public sector information in the information society, COM (1998) 585.
Funk, Jeffrey L. 1998: Competition between regional standards and the success and failure of firms in the world-wide mobile communication market, Telecommunications Policy, 22(4-5): 419-441.
Hawkins, Richard 1999: The rise of consortia in the information and communication technology industries: emerging implications for policy, Telecommunications Policy, 23(2): 159-173.
Herschel Shosteck Associates 1998: Rekindling of the Religious RF wars and the economic and market consequence, E-mail briefing. Chicago: Herschel Shosteck Associates.
Herschel Shosteck Associates 1999: End of the Ericsson-Qualcomm IPR WAR-what does it mean?, E-mail briefing. Chicago: Herschel Shosteck Associates.
ICT Standards Board 1999: Standards for a New Age: ICT Standardization in Europe. Sophia Antipolis: European Telecommunications Standards Institute.
International Telecommunication Union 2000: ITU warns that CDMA-based RTT proposals for IMT-2000 could be excluded from further consideration if IPR stalemate is not resolved, Press Release. Geneva: International Telecommunication Union.
Iversen, Eric J. 2000: Standardization and intellectual property rights: conflicts between innovation and diffusion in new telecommunications systems, in K. Jacobs (ed.), Information Technology Standards and Standardization: A Global Perspective. Hershey, USA and London, UK: IDEA Group Publishing.
Lynch, Grahame and Chou, Fiona 2000: Surprise twist in CDMA’s future, America’s Network, 15 November.
McKelvey, Maureen, Texier, Francois and Alm, Hakan 1998: The dynamics of high tech industry: Swedish firms developing mobile telecommunication systems. Linkoping: University of Linkoping, Department of Technology and Social Change, Systems of Innovation Research Program (SIRP).
Molleryd, Bengt G. 1999: Entrepreneurship in technological systems: the development of mobile telephony in Sweden. PhD dissertation, Stockholm School of Economics, Stockholm.
National Research Council 1997: The Evolution of Untethered Communications. Washington, DC: National Academy Press.
National Research Council 1999: Funding a Revolution: Government Support for Computing Research. Washington, DC: National Academy Press.
Neuman, W Russell, McKnight, Lee W. and Solomon, Richard Jay 1997: The Gordian Knot: Political Gridlock on the Information Highway. Cambridge, MA: MIT Press.
Nilsson, Mats 1999: Third generation radio access standards, Ericsson Review, No. 3.
Robertsson, Jack 2000: China continues on long (cell) march, Electronic Buyer's News, 11 December.
Ruottu, Annina 1998: Governance within the European Television and Mobile Communications Industries: PALplus and GSM, a case study of Nokia. PhD dissertation, University of Sussex, Sussex.
Sandholtz, Wayne 1992: High-Tech Europe. Berkeley: University of California Press.
Sandholtz, Wayne 1993: Institutions and collective action: the new telecommunications in Western Europe, World Politics, 45(2): 242-270.
Saxenian, AnnaLee 1994: Regional Advantage: Culture and Competition in Silicon Valley and Route 128. Cambridge, MA: Harvard University Press.
Schmidt, Suzanne K. 1997: Sterile debates and dubious generalisations: European integration theory tested by telecommunications and electricity, Journal of Public Policy, 16(3): 233-271.
Shankar, Bhawani 1998: 3rd generation, 4th dimension, Telecommunications, 32(3): 36-40.
Shapiro, Carl and Varian, Hal R. 1999: Information Rules: A Strategic Guide to the Network Economy. Boston, MA: Harvard Business School Press.
Solomon, James D. 1998: Mobile IP: The Internet Unplugged, Prentice-Hall Series in Computer Networking and Distributed Systems. Upper Saddle River, NJ: PTR Prentice-Hall.
Weiss, Martin and Sirbu, Marvin 1990: Technological choice in voluntary standard committees: an empirical analysis, Economics of Innovation and New Technology, 1: 111-134.
West, Joel 2000: Institutional constraints in the initial deployment of cellular telephone services on three continents, in K. Jacobs (ed.), Information Technology Standards and Standardization: A Global Perspective. Hershey, USA and London, UK: IDEA Group Publishing.
Henrik Glimstedt is an Assistant Professor at the Institute of International Business, Stockholm School of Economics. He was awarded a PhD from Goteborg University in history. Professor Glimstedt has mainly published in areas such as comparative business systems, historical construction of markets, and historical patterns of globalization and fragmentation of industrial activities, contributing chapters to such books as The Americanisation of European Business (edited by Kipping and Bjarnar) and Americanization and its Limits (edited by Zeitlin and Herrigel). In December 2000, he joined the International Editorial Advisory Board of Industry and Innovation. He can be contacted at: henrik.glimstedt@hhs.se.