The subject of interest in this book is the use of Digital Subscriber Line (DSL) technology to increase the rate and improve the quality of data communications over copper cable. It is an important topic both within the context of data communications today and into the future. All, or almost all, aspects of this subject will be explored. However, it seems rather forbidding just to jump into this topic. Rather, it is more appropriate to take a step back and talk about the nature of communications first, in order to introduce some needed terminology. Such a step back will also provide us with a broader perspective on the subject of DSL technology as a transmission facilitator. In short, it will help us to answer the question, "Why should we be interested in DSL?"
The reader well-versed in data communications may, of course, choose to skip this introduction and suffer no real penalty.
The subject of communications really begins with the situation shown in Figure 1-1. Here is an entity called the Source and one called the User - located remotely from the Source. The Source generates Information, and the User desires to learn what this Information is.
Examples of this situation abound. However, let us focus our attention on the case illustrated in Figure 1-2. Here, the Information is a sequence of binary digits - 0s and 1s, commonly called "bits." Information in this case is termed "data." Information of this type is generally associated with computers, computing-type devices, and peripherals - equipment shown in Figure 1-3. Limiting Information to data presents no real limitation. Voices, images, indeed most other types of Information can be processed to look like data by carrying out sampling and Analog-to-Digital conversion.
In practice, it is impossible for the User to obtain the Information without the chance of error. Such errors may spring from a variety of deleterious effects, which we will examine in greater detail later in this chapter.
The possibility of error means that the User seeking the Information - that is, the binary sequence - must be content in learning it to within a given fidelity. The fidelity measure usually employed is the Bit Error Rate (BER). The BER is the probability that a specific generated binary digit at the Source, a bit, is received in error, opposite to what it is, at the User.
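To make the BER concrete, here is a minimal sketch, with all names and numbers our own invention for illustration, that measures the error rate empirically on a simulated link that flips each bit independently with a small probability:

```python
import random

def bit_error_rate(sent, received):
    """Fraction of received bits that differ from what the Source generated."""
    errors = sum(1 for s, r in zip(sent, received) if s != r)
    return errors / len(sent)

# Model a link that flips each bit independently with probability 1e-3.
random.seed(0)
sent = [random.randint(0, 1) for _ in range(100_000)]
p_flip = 1e-3
received = [b ^ (random.random() < p_flip) for b in sent]

print(bit_error_rate(sent, received))  # close to the underlying flip probability, 1e-3
```

With enough bits, the measured rate converges on the per-bit flip probability, which is exactly what the BER figure of merit expresses.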
There are some real questions as to how appropriate this fidelity measure is in certain applications. Nonetheless, it is so widely employed in practice that further discussion is not warranted.
The question then arises as to how to send the binary data stream from the Source to the User. We refer to any physical entity used for this purpose as a Transmission Medium.
As shown in Figure 1-4, the Transmission Medium is located between the Source and the User, accessible to both. The Transmission Medium has a set of properties described by physical parameters. This set of properties exists in a quiescent state; however, at least one of these properties can be stressed or disturbed at the Source end. This is accomplished by imparting energy in order to stress the property. The disturbance affects the parts of the Transmission Medium around it, then travels from the Source end to the User end. Once the disturbance or stressed property reaches the User end, it can be sensed and measured. This propagation of a disturbance by the Transmission Medium is illustrated in Figure 1-5.
There are many types of transmission media. The Transmission Medium could be air, with the stressed property being the air pressure - the carrier of sound waves. It could be an electromagnetic field set up in space by the current put on an antenna - a radio or wireless system. It could be a pair of electrical conductors, with the stressed property being the potential difference (the voltage) between the conductors - an electrical transmission line. It could be a cylindrical glass tube with the stressed property being the intensity of light in the tube - a fiber optic cable. Even written communication can be interpreted in this fashion: a sheet of writing paper provides the Transmission Medium, with the stressed property being the light-dark pattern on the paper.
The Source can have a disturbance to the Transmission Medium generated in sympathy to the Information - that is, it can generate a disturbance which varies in time exactly as the Information. This encoded disturbance will propagate to the User. The User can then sense the disturbance and decide the identity of the Information that it represents. The process of the Source generating a disturbance in sympathy with the Information and launching it into the Transmission Medium is referred to as "modulation and transmission." The process of the User sensing the received disturbance and deciding what Information it represents is referred to as "reception and demodulation." In this work, we will refer to the device that carries out modulation and transmission as the Transmitter. We will refer to the device that carries out reception and demodulation as the Receiver.
The whole of data communications then devolves to the model illustrated in Figure 1.6. Here, the Source generates bits as Information. The User wants to learn the identity of this Information, these bits. The entities used to get the Information from the Source to the User are the Transmitter, the Transmission Medium and the Receiver. The fundamental problem of communications is to choose the terminal equipment - the Transmitter and Receiver - and to choose the Transmission Medium so as to satisfy the requirements for a given Source-User pair.
The fundamental problem of communications is one of design. Collectively, the combination of Transmitter, Transmission Medium and Receiver is known as the "communication link" or "data link" - the latter term deriving from the limitation placed on the Information to the form of a sequence of bits. The disturbance launched into the Transmission Medium by the Transmitter is usually referred to as the "input data signal." The resulting disturbance at the Receiver is termed the "output data signal." In the context of our discussion, the fundamental problem is to design a data link appropriate for connecting a given Source-User pair.
There is no cookbook method to solve this design problem and come up with the best unique solution. While there is science here, there is also art. There are always alternative solutions. Each solution has its own particular twist, which in turn provides some additional attractive feature to the solution. However, the feature is peripheral to Source-User requirements.
Most exercises in obtaining the design solution usually begin with choosing a Transmission Medium to meet the general requirements of the Source-User pair. In other words, the data link design process pivots on choosing the Transmission Medium. Every Transmission Medium has constraints on its operation, on its performance. It is these constraints that truly decide which Transmission Medium will be employed for the data link design.
Have a Transmitter launch a disturbance, an input data signal, into a Transmission Medium. As the disturbance propagates down the Transmission Medium to the Receiver, its amplitude will decrease, growing weaker and weaker. The disturbance is said to suffer attenuation, a situation illustrated in Figure 1-7.
One immediate question that arises is why attenuation occurs. There are several reasons. It would be worthwhile to point out and describe two of them: spatial dispersion and loss due to heat.
Spatial dispersion can best be considered by revisiting Figure 1-7, which illustrates a one-dimensional propagation of the disturbance. However, often, this disturbance may propagate in two or even three dimensions. The User/Receiver may be located in a small solid angle relative to the Source/Transmitter. The received disturbance, the output data signal, appears attenuated relative to the transmitted disturbance because, in fact, it represents only a small fraction of the overall energy imparted in the disturbance when it was launched. This is exactly the situation with free space propagation of waves through an electromagnetic field transmission medium, such as that which occurs in any sort of radio transmission.
Loss due to heat refers to the basic interaction of the disturbance with the material of which the Transmission Medium is composed. As the disturbance propagates, a portion of the energy is transferred into the Transmission Medium and heats it. For a mechanical analogy, consider rolling a ball down a cement lane. The ball is the disturbance launched into the lane, which represents the Transmission Medium. As the ball rolls along, it encounters friction. It loses part of its kinetic energy to heating the cement lane and begins to slow down. The disturbance becomes attenuated. This is the situation when using the potential difference between a pair of electrical conductors as the Transmission Medium.
Attenuation increases with the distance through the Transmission Medium. In fact, the amplitude attenuation is measured in dB/km. As propagation continues, attenuation increases. Ultimately, the propagating signal is attenuated to a minimal detectable level. That is, the signal is attenuated until it can just be sensed by the Receiver - in the presence of whatever interference is expected. The distance at which the signal reaches this minimal level could be quite significant. The Transmission Medium has to be able to deliver at least the minimal detectable level of output signal to the Receiver by the User. If it cannot, communications between the Source and User cannot take place.
There are some tricks to getting around this. Suppose the disturbance has been attenuated to the minimal detectable level, yet it has still not arrived at the Receiver/User. The output signal at this location can then be regenerated. The signal can be boosted back up to its original energy level. It can be repeated and continue to propagate on its way to the Receiver/User. This is shown in Figure 1-8.
Nonetheless, the attenuation characteristics are an item of significance. The Transmission Medium selected in the design must have its attenuation characteristics matched to the Source-User separation. The lower the attenuation in dB/km, the greater advantage a Transmission Medium has.
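The attenuation arithmetic just described can be sketched in a few lines. The function names and the figures below (10 dB/km, a -40 dB detection floor) are invented for illustration, not drawn from any particular medium:

```python
def received_level_db(tx_level_db, atten_db_per_km, distance_km):
    """Signal level at the Receiver: attenuation accumulates linearly, in dB, with distance."""
    return tx_level_db - atten_db_per_km * distance_km

def max_reach_km(tx_level_db, min_detectable_db, atten_db_per_km):
    """Distance at which the signal falls to the minimal detectable level,
    i.e. where a regenerator (Figure 1-8) would be needed."""
    return (tx_level_db - min_detectable_db) / atten_db_per_km

# Illustrative numbers only:
print(received_level_db(0.0, 10.0, 3.0))  # -30.0 dB after 3 km at 10 dB/km
print(max_reach_km(0.0, -40.0, 10.0))     # 4.0 km before regeneration is needed
```

The sketch makes the design point plain: halving the dB/km figure doubles the reach between regenerators, which is why a lower attenuation rate is such an advantage.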
Have a Transmitter launch an input data signal into a Transmission Medium. As it propagates down the Transmission Medium, the disturbance will encounter all sorts of deleterious effects, which are termed "noise" or "interference." In the simplest example, that of one person speaking to another person, what we refer to as noise really is what we commonly understand noise to be.
What is noise/interference? It is some extraneous signal that is usually generated outside of the Transmission Medium. Somehow, it gets inside of the Transmission Medium and realizes its effect - usually by adding itself to the propagating signal, but sometimes by multiplying the propagating signal. The term noise is generally used when this extraneous signal appears to have random amplitude parameters, like background static in AM radio. The term interference is used when this extraneous signal has a more deterministic structure, like 60-cycle hum on a TV set. In any case, when the Receiver obtains the output data signal, it must make its decision about what Information it represents - and demodulate the signal - in the presence of this noise/interference.
Noise/interference may originate from a variety of sources. It may come from the signals generated by equipment located near the Transmitter/Transmission Medium/Receiver. This may be equipment that has nothing at all to do with the data link, such as motors on air conditioners or automated tools. Noise/interference may also come from atmospheric effects or from the use of multiple electric grounds. It may be generated by active circuitry in the Transmitter or the Receiver, or it may come from the operation of other data links.
In obtaining the design solution, noise/interference makes its effect best known through the BER. The level of noise/interference drives the BER. Of course, this can be countered by having the Transmitter inject a stronger input signal. It can also be countered by making the Receiver capable of detecting lower minimal output signals. However, this comes with greater expense. Neither of these solutions hides the fact that there is concern with noise/interference because of its impact on the BER.
The susceptibility to noise/interference varies from Transmission Medium to Transmission Medium. Consequently, during the design process, the designer must pay attention to the application underlying the communication needed by the Source-User pair and to the BER required by this application. The designer must then select the Transmission Medium that has a noise/interference level capable of delivering the required BER.
Consider again the model illustrated in Figure 1-6. Suppose the input signal the Transmitter sends to the Transmission Medium is the simple cosinusoidal signal of amplitude '1' at frequency 'f0' Hz. The output response to this at the Receiver is designated 'T(f0)'. Now consider the cosinusoidal test input signal frequency f0 to be varied from 0 Hz on up to ∞. The resulting output signal as a function of frequency is T(f0) - or, suppressing the subscript, T(f). This is generally referred to as the transfer function of the Transmission Medium. Generally, the ordinate value 'T(f)' for a given frequency 'f' is referred to as the transfer function gain - although, in fact, it is a loss - and is expressed logarithmically in dB relative to the amplitude '1' of the input signal.
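The logarithmic "gain" just described can be computed directly. This is a generic sketch (the function name is ours); for amplitudes, the conventional dB formula is 20 times the base-10 logarithm of the amplitude ratio:

```python
import math

def gain_db(t_f, input_amplitude=1.0):
    """Transfer function 'gain' in dB: the output amplitude T(f) expressed
    logarithmically relative to the unit-amplitude input cosine."""
    return 20.0 * math.log10(t_f / input_amplitude)

print(gain_db(1.0))  # 0.0 dB: the output amplitude equals the input amplitude
print(gain_db(0.5))  # about -6 dB: a negative "gain," i.e. a loss, as the text notes
```

Since any physical medium delivers T(f) less than the input amplitude at most frequencies, the computed values are negative, which is why the text cautions that the "gain" is really a loss.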
One example transfer function is illustrated in Figure 1-9. Though it is just an example, not to be taken as typical in any sense, it illustrates a feature common to the transfer function of any Transmission Medium that is obtainable in the real, physical world. The transfer function rolls off with frequency. The transfer function shown here oscillates, but the maximum value of its oscillation becomes less and less. However, the transfer function itself never rolls off completely to become dead flat zero beyond a certain frequency. This roll off with frequency means that the Transmission Medium attenuates the cosinusoidal signals of the higher frequencies that are given to it as inputs. The energy of these higher frequency signals is somehow lost, usually as heat, in traversing the Transmission Medium. The greater the distance through the Transmission Medium, the more high frequency signals get attenuated. This is a consequence of the greater interaction between the propagating signals and the material comprising the Transmission Medium.
This roll off feature of the transfer function is present in every Transmission Medium regardless of how it is derived. It is present in sound waves, in electrical conductors, in fiber optic cables, in CDs, in audio or videotapes, and even in a sheet of writing paper.
The transfer function shown rolls off with frequency. However, most of its activity, most of its area, most of its mass, most of its spread, seems to be below a given frequency. In this example, it looks like the frequency 'F.' The frequency spread of the transfer function is referred to as its bandwidth. As mentioned above, bandwidth decreases with the propagation distance through the Transmission Medium.
As the notion of frequency spread is very subjective, so too is the measure of bandwidth. When you discuss communications with someone and they mention bandwidth, it would be wise to ask exactly how they are defining it. There is a definition in the glossary at the back of this book, but this is only one such definition. There are many. For example, there is the 3 dB bandwidth, mean square bandwidth, first lobe bandwidth, brick wall bandwidth and so on. In a study carried out seventeen years ago, Dr. Kenneth S. Schneider identified over twenty-five separate definitions of bandwidth. All have validity. Whether one definition is meaningful or not depends on the context in which it is applied. One definition may be appropriate for describing satellite communication links and another more appropriate for an FCC official considering the request for a broadcast AM radio license.
In any case, a Transmission Medium has a transfer function, and the frequency spread of this transfer function is measured by the bandwidth. The bandwidth parameter has implications with respect to the performance of the data link being designed.
Consider the illustration shown in Figure 1-10. Here, the Source is generating data, '0s' and '1s', every T seconds. Let T=1/R, in which case the Source generates data at R bits per second (BPS). To send this data to the User, the Transmitter generates either a positive or a negative impulse every T seconds. What is an impulse? It is an infinitesimally narrow pulse that is also infinitely high, so that it has energy of '1.'
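The relationships in this setup, T = 1/R and the mapping of bits to impulse polarities, can be sketched as follows. The names here are ours, purely for illustration:

```python
def bit_period_seconds(rate_bps):
    """T = 1/R: the time between successive bits from the Source."""
    return 1.0 / rate_bps

def to_impulse_polarities(bits):
    """Map each bit to the polarity of the impulse the Transmitter launches:
    +1 for a '1', -1 for a '0' (as in Figure 1-10)."""
    return [1 if b == 1 else -1 for b in bits]

print(bit_period_seconds(9600))          # about 1.04e-4 seconds per bit at 9,600 BPS
print(to_impulse_polarities([1, 0, 1]))  # [1, -1, 1]
```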
Now what comes out at the Receiver in response to the positive impulse sent at time zero to represent the binary data bit '1?' An example result is illustrated in Figure 1-11. Notice that this response out of the Transmission Medium to the input impulse is a pulse spread out in time, with its center at some time t not equal to 0 seconds. While this example output cannot be called typical, it does indicate a property typical of all output signals received from the Transmission Medium: the time spreading of the output pulse, called "time dispersion." Time dispersion is a result of the finite bandwidth of the Transmission Medium. To be exact, it is due to the fact that the transfer function of the Transmission Medium - indeed, of any Transmission Medium - attenuates the higher frequency signals.
Look closely at the output signal pulse shown in Figure 1-11. Because it is spread in time, it will interfere with the output pulses due to input data signals that come after it. These do not appear in the illustration, but the implication should be clear. Likewise, these subsequent data signals will generate output pulses that will also be spread in time. Each will also interfere with both the pulses coming before it and after it. This type of interference is called "intersymbol interference." It is not just a consequence of the input signals being impulses. An input signal, of finite duration and of any shape, will generate an output signal with time dispersion.
As the data rate from the Source increases, the intersymbol interference problem grows worse. Output pulses with time dispersion get squeezed next to one another. The growing level of intersymbol interference makes it increasingly harder for the Receiver to demodulate these signals.
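This squeezing effect can be illustrated numerically. Everything below is invented for the sketch: a Gaussian stands in for the dispersed output pulse of Figure 1-11, and we sample the middle bit of a three-bit pattern at two different bit periods:

```python
import math

def dispersed_pulse(t, center, spread):
    """A time-dispersed output pulse; the Gaussian shape is purely illustrative."""
    return math.exp(-((t - center) / spread) ** 2)

def received_signal(bits, bit_period, spread, t):
    """Superpose one dispersed pulse per bit; each pulse leaks into its neighbors' slots."""
    total = 0.0
    for k, b in enumerate(bits):
        polarity = 1.0 if b == 1 else -1.0
        total += polarity * dispersed_pulse(t, center=k * bit_period, spread=spread)
    return total

bits = [1, 0, 1]
# Sample the middle bit (a '0', ideally -1.0) at its own center time.
slow = received_signal(bits, bit_period=1.0, spread=0.3, t=1.0)
fast = received_signal(bits, bit_period=0.5, spread=0.3, t=0.5)
print(slow, fast)  # the sample drifts away from the ideal -1.0 as the bit period shrinks
```

With the wide bit period the neighboring pulses have decayed to almost nothing by the sampling instant; halving the period lets their tails bleed into the sample, which is precisely the intersymbol interference the Receiver must fight.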
To some extent, the intersymbol interference can be undone by sophisticated signal processing in the Receiver. This usually goes under the name of "equalization." However, in many cases equalization still cannot deliver the data from the Receiver with the BER required by the Source-User pair. In other cases, the data rate generated by the Source, say R BPS, is so high that an equalizer fast enough to keep up with the output signals cannot be obtained.
In considering the data link design task, the first line of defense against time dispersion and intersymbol interference lies in the proper selection of the Transmission Medium. The larger the bandwidth of the Transmission Medium, the fewer high frequency components will be attenuated during propagation and the smaller the time dispersion. As a result, there will be less interference between different output pulses. Make no mistake. Intersymbol interference will not disappear. Rather, it will be lessened and made more tolerable as the bandwidth grows larger. In particular, to lessen the intersymbol interference the bandwidth of the Transmission Medium must get larger in relation to the Source's generated bit rate, R BPS.
The Transmission Medium must be selected to accommodate the bit rate generated by the Source. It must have sufficient bandwidth so that it will generate tolerable intersymbol interference at the Receiver. This means selecting a Transmission Medium that has a bandwidth that is some multiple of the bit rate, R. A number of rules of thumb are often used to do this. However, they are too specific and not worth discussing at this point, particularly as the measure of bandwidth is subjective. The important point is that the selection of Transmission Medium candidates is limited to those matched to the data rate requirement, R. This means that as R increases, the selection of Transmission Medium candidates becomes more limited.
The information technology explosion in the world has made this selection task ever more challenging. PCs are continually becoming more powerful. More complex application programs can be run, and they are finding their way into easily usable software. As a result, the Source bit rate requirement is growing geometrically every few years. To put this in perspective, consider that just ten years ago a Transmission Medium would have been quite acceptable if it had a bandwidth matched to a Source bit rate of 9,600 BPS. This source rate was typical of that generated by most data equipment applications. Today, with the growing demand for video services and the plethora of graphics in computer applications, the demand more often than not is for a Transmission Medium with a bandwidth requirement matched to Source bit rates well upwards of 1 MBPS, possibly 1 GBPS.
You may be able to find the ideal Transmission Medium relative to attenuation, interference and bandwidth. Yet you still may not be able to select it as part of the solution to the data link design problem for the simple reason that it costs too much. It presents an expense beyond the budget allowed for the Source-User communications.
This is nothing new or revolutionary. Money does not drive the world, but it does have a tremendous influence on the ultimate choice of solution to any problem based in technology. This is as true today at the turn of the millennium as it was at the turn of the twentieth century.
As a case in point, let us examine briefly the fiber optic solution to the problems of attenuation, interference and bandwidth. Fiber optic cable - at least, that of the pure glass-silica variety (glass core with glass cladding) - has a far lower attenuation rate than coaxial cable. Whether it is fabricated fully from glass or uses plastic cladding, fiber optic cable can carry signals with full immunity from electromagnetic-based forms of noise and interference. In terms of bandwidth, fiber optic cable has superiority over copper of several orders of magnitude - transmitting well above 10 MHz for up to 4 km. In some cases, dependent on distance and repeaters, it can transmit data at rates measurable in gigabits per second (1 billion bits per second - GBPS) or even terabits per second (1 trillion bits per second - TBPS). To put this in perspective, unshielded twisted pair copper cable transmitting over a distance of 4 km can support up to 100 MBPS, while coaxial cable can support about 20 MBPS over the same distance. Thus in terms of attenuation, interference and bandwidth, fiber optic cable beats copper, hands down.
Fiber optic cable, however, has problems of its own, and cost ranks chief among them. As illustrated in Figure 1-12, fiber is far more expensive than Unshielded Twisted Pair (UTP) copper. If you are starting from scratch, building your premises and its communications infrastructure from the ground up, fiber presents a worthwhile investment - a large investment, to be sure, but one that will eventually pay for itself.
Suppose, however, that you are not starting from scratch. In this case, you would have to rip out the old copper infrastructure before you could lay down your new fiber optic cable. Herein lies the problem. UTP copper cable has been the Transmission Medium of choice for nearly one hundred twenty years. There is a tremendous amount of copper infrastructure already in place at every level, from the home office to the global communications network. The local loop connecting business premises and the telephone central office (CO) runs on copper pairs. For the same reason, copper provides the most common Transmission Medium for Internet access. Simply put: copper is everywhere. As a result, the cost of replacing copper with fiber is often prohibitively high.
Another drawback of fiber lies in one of its strengths. Fiber transmits data via lightwaves rather than electrical signals. This cuts down on interference, but it also eliminates one of the benefits that copper grants: the ability to transmit DC voltage along with the signal. This additional voltage allows telephones to continue functioning during a power outage. Without the additional voltage, you risk losing your phones as well as your PCs and peripherals when the lights go out. For this reason alone, it is unlikely fiber will ever replace copper entirely for desktop communications.
This appears to place us at an impasse. Traditional copper is too slow and too vulnerable to cope with the increasingly steep demands of data transmission. Fiber can be too expensive to make a shift practicable, even without its own vulnerability. If only there were a way to marry the cost benefits of copper to the technological advantages of fiber, we would have a really attractive Transmission Medium.
Thankfully, there is and we do. It is called Digital Subscriber Line (DSL) technology.
Necessity is the mother of invention. In the case of DSL, that necessity took the form of the need to eliminate interference - particularly in the form of noise generated by inclement weather, to which analog signals transmitted along copper wire are so vulnerable. Shortly before World War II, a British engineer working for ITT in France grew so annoyed by this analog line noise that he set to work on the problem of how to digitize analog voice signals. The war soon put an end to these experiments, but the increasing globalization of the economy that followed the war led to a demand for constant improvement in telecommunications quality.
AT&T, in conjunction with IBM, carried out much of the basic postwar research on digital telephone technology. These experiments came to focus on a technique of sharing bandwidth in time slots known as "time division multiplexing" (TDM) - a method long considered too expensive and technically impractical for analog transmission. By the early 1960s, this led to the development of the T-carrier system - the basis of which was a local loop digital system known as T1 (T-carrier, level 1 multiplexing). The T-carrier system led to the development of digital trunk lines. By the mid-1970s, digital trunk lines had become commonplace and digital switches made their first appearances. Through this period, T1 remained under the control of the sole public switched telephone network (PSTN), AT&T.
Early digital telecommunications enthusiasts predicted the growth of an integrated digital network, a technology that later came to be called Integrated Services Digital Network (ISDN). Skeptics, noting the failure of the digital promise to produce through the 1970s, joked that the acronym really stood for "It Still Does Nothing." By 1981, however, ISDN began meeting initial expectations, and 1982 saw ISDN form the core of the original DSL technology: IDSL (ISDN DSL).
Two years later, the US Government ordered the divestiture of AT&T. With the breakup of the PSTN, T1 first became available for customer installation and DSL technological development exploded. This explosion fed and was itself nourished by the rapid advances in computer technology and the development of the Internet over the next decade, both of which demanded increasingly higher rates of data transmission. ISDN, so long in coming, soon found itself surpassed by newer flavors of DSL, particularly High-bit-rate DSL (HDSL, developed from 1988 to 1991), Asymmetric DSL (ADSL, developed from 1991 to 1995), and Very-high-bit-rate DSL (VDSL, under development since 1995). The universe of DSL technology, referred to collectively as xDSL, now forms a key ingredient of the asphalt that makes up the Information Superhighway.
This book has been written so that each chapter stands on its own. There is no need to read the chapters in order. While there may occasionally be cross-references from one chapter to another, the information can be gleaned easily without going back to the very beginning.
A brief summary of the succeeding chapters follows:

Chapter 2 - We examine first the basic technological architecture underlying the DSL modem. With this foundation, we shall follow with a study of the various flavors of xDSL modems that have appeared over the past two decades, along with their specifications and uses. These flavors include: ISDN DSL (IDSL), Asymmetric DSL (ADSL), Single-line or Symmetric DSL (SDSL), Rate Adaptive DSL (RADSL), Universal ADSL (UADSL, also known as G.Lite or DSL Lite), Consumer DSL (CDSL), Moderate Speed DSL (MSDSL), High-bit-rate DSL and High-bit-rate DSL 2 (HDSL/HDSL2), and Very-high-bit-rate DSL (VDSL).
Chapter 3 - The DSL modem is only half the system that allows you to convert your copper local loop into a high-speed data transmission channel. The other half lies at the CO, the platform that gathers together signals from DSL and traditional POTS, combines them into a digital signal and dispatches them down the line to the destination CO. This is the Multiservices Digital Subscriber Line Access Multiplexer (DSLAM), which we shall discuss here.
Chapter 4 - This section deals with the testing of DSL modems and, as such, should be of special interest to manufacturers. Theoretically, there are several ways to test your modem's specifications: testing on the local loop itself, on a wire coil, and on a DSL simulator. The simulator overcomes difficulties of space and distance constraints, crosstalk and mutual inductance - difficulties that afflict the first two methods of testing - by combining real world operational factors with laboratory convenience. As such, it presents the manufacturer with the most realistic testing environment for a DSL product.
Chapter 5 - This section enumerates standards that cover the use of DSL technology, providing the names and contact information of organizations from which these standards can be ordered.
Chapter 6 - A glossary that covers the subject of digital subscriber line technology. It provides terminology specifically covered in this book, as well as terminology that may be encountered in the broader realm of communications in general.