Not surprisingly, one of the key results of signal processing is the sampling theorem for bandlimited functions, often attributed to Shannon since it appears in the above-mentioned paper; it is the theorem which single-handedly enabled the digital revolution. The strength of the sampling theorem and its variations, however, extends to a wealth of powerful applications. In order to showcase such applications, the last chapter is entirely devoted to developing an end-to-end communication system, namely a modem for communicating digital information (or bits) over an analog channel.
This real-world application, present in all modern communication devices from mobile phones to ADSL boxes, nicely brings together many of the concepts and designs studied in the previous chapters. Being less formal, more abstract and application-driven seems almost like moving simultaneously in several, possibly opposite, directions, but we believe we have struck the right balance. Ultimately, of course, the readers and students are the judges!
A last and very important issue is the online access to the text and supplementary material. A full HTML version, together with the unavoidable errata and other complementary material, is available at www. A solution manual is available to teachers upon request. As a closing word, we hope you will enjoy the text, and we welcome your feedback. Let signal processing begin, and be fun!

The current book is the result of several iterations of a yearly signal processing undergraduate class, and the authors would like to thank the students in Communication Systems at EPFL who survived the early versions of the manuscript and who greatly contributed, with their feedback, to improving and refining the text over the years.
Invaluable help was also provided by the numerous teaching assistants who not only volunteered constructive criticism but came up with many of the exercises which appear at the end of each chapter, together with their solutions. In no particular order: Andrea Ridolfi provided insightful mathematical remarks and also introduced us to the wonders of PSTricks while designing figures.
Olivier Roy and Guillermo Barrenetxea have been indefatigable ambassadors between the teaching and student bodies, helping shape the exercises into a hopefully more user-friendly form. Luciano Sbaiz always lent an indulgent ear and an insightful answer to all the doubts and worries which plague scientific writers.

Conceptually, it is important to note that signal processing operates on an abstract representation of a physical quantity and not on the quantity itself. At the same time, the type of abstract representation we choose for the physical phenomenon of interest determines the nature of a signal processing unit.
A temperature regulation device, for instance, is not a signal processing system as a whole: only the internal unit which operates on an abstract representation of the temperature is. The physical nature of this unit depends on the temperature model: a simple design is that of a mechanical device based on the dilation of a metal sensor; more likely, the temperature signal is a voltage generated by a thermocouple, and in this case the matched signal processing unit is an operational amplifier.
Digital signal processing is a flavor of signal processing in which everything, including time, is described in terms of integer numbers; in other words, the abstract representation of choice is a one-size-fits-all countability. Probably the earliest recorded example of digital signal processing dates back to the 25th century BC, when the ancient Egyptians kept yearly records of the height of the Nile's floods. After a flood, the river banks would be left covered with a thin layer of nutrient-rich silt capable of supporting a full agricultural cycle.
The floods of the Nile, however, were a rather capricious meteorological phenomenon, with scant or absent floods resulting in little or no yield from the land.
As a consequence, studying and predicting the trend of the floods, and therefore the expected agricultural yield, was of paramount importance in order to determine the operating point of a very dynamic taxation and redistribution mechanism. The Palermo Stone, which records precisely such yearly flood levels, is arguably the first recorded digital signal which is still of relevance today.

As soon as the interaction with the world becomes more complex, so necessarily do the models used to interpret the world itself.
Geometry, for instance, is born of the necessity of measuring and subdividing land property.
This mathematics, heavily steeped in its geometric roots, is the mathematics of the continuum. In the continuum, the infinitely big and the infinitely small dance together in complex patterns which often defy our intuition and which required almost two thousand years to be properly mastered. This is of course not the place to delve deeper into this extremely fascinating epistemological domain; suffice it to say that the apparent incompatibility between the digital and the analog world views appeared right from the start, as in Zeno's famous paradoxes of motion. Zeno, of course, was well aware of the empirical evidence to the contrary, but he was brilliantly pointing out the extreme trickery of a model of the world which had not yet formally defined the concept of infinity.
A first-year calculus student may be tempted to offhandedly dismiss the problem by stating that an infinite sum of ever-decreasing terms can converge to a finite value. The two competing models for the world, digital and analog, coexisted quite peacefully for many centuries, one as the tool of the trade for farmers, merchants and bankers, the other as an intellectual pursuit for mathematicians and astronomers. Slowly but surely, however, the increasing complexity of an expanding world spurred the more practically-oriented minds to pursue science as a means to solve very tangible problems, besides describing the motion of the planets.
If only it were that simple: as Cauchy unsurpassably explained later, everything in calculus is a limit and therefore everything in calculus is a celebration of the power of the continuum.
Still, in order to apply the calculus machinery to the real world, the real world has to be modeled as something calculus understands, namely a function of a real (i.e. continuous) variable. As mentioned before, there are vast domains of research well behaved enough to admit such an analytical representation; astronomy is the first one to come to mind, but so is ballistics, for instance. If we go back to our temperature measurement example, however, we run into the first difficulty of the analytical paradigm: we now need to model our measured temperature as a function of continuous time, which means that the value of the temperature should be available at any given instant and not just once per day.
Even in the rare cases in which an analytical model of the phenomenon is available, a second difficulty arises when the practical application of calculus involves the use of functions which are only available in tabulated form. The trigonometric and logarithmic tables are a typical example of how a continuous model needs to be made countable again in order to be put to real use.
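The way a printed trigonometric table makes the continuous sine function usable in practice can be mimicked in a few lines of code. The sketch below (the number of series terms and the grid of tabulated points are arbitrary choices for illustration) approximates the sine with a truncated Taylor series and tabulates it at a countable set of points:

```python
import math

def sin_taylor(x, terms=10):
    """Approximate sin(x) with a truncated Taylor series -- the same
    finite, computable recipe that once produced printed tables."""
    return sum((-1) ** n * x ** (2 * n + 1) / math.factorial(2 * n + 1)
               for n in range(terms))

# A small "table": the sine tabulated on a countable grid of points.
table = {k / 10: sin_taylor(k / 10) for k in range(16)}

# For |x| <= 1.5 the truncation error is far below any table's precision.
print(abs(table[1.0] - math.sin(1.0)))
```

The continuous function has been replaced by a finite procedure and a countable list of values, which is exactly what made such tables practically useful.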
Algorithmic procedures such as series expansions and numerical integration methods are other ways to bring analytic results within the realm of the practically computable.

One of the fundamental problems in signal processing is to obtain a permanent record of the signal itself.
Think back to the ambient temperature example, or to the floods of the Nile: in both cases a description of the phenomenon was gathered by a naive sampling operation, i.e. by measuring the quantity at regular intervals. Performed manually, this operation is clearly quite slow, but it is conceivable to speed it up mechanically so as to obtain a much larger number of measurements per unit of time. Consider for instance a thermograph: this is a mechanical device in which temperature deflects an ink-tipped metal stylus in contact with a slowly rolling paper-covered cylinder; a phonograph works along similar lines, with a sound pressure wave carving a groove into a rotating medium. The problem with these analog recordings is that they are not abstract signals but a conversion of a physical phenomenon into another physical phenomenon: the temperature, for instance, is converted into the amount of ink on paper while the sound pressure wave is converted into the physical depth of the groove.
The advent of electronics did not change the concept: an audio tape, for instance, is obtained by converting a pressure wave into an electrical current and then into a magnetic deflection. The fundamental consequence is that, for analog signals, a different signal processing system needs to be designed explicitly for each specific form of recording.
Consider for instance the problem of computing the average temperature over a certain time interval. With an analog recording, this requires a dedicated physical device matched to the particular recording medium; with a set of numerical samples, it is just a sum and a division. In spite of the simplicity of the problem, we can instantly see the practical complications and the degree of specialization needed to achieve something as simple as an average for an analog signal.
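For the digital version of the signal the computation is indeed trivial; here is a minimal sketch, where the daily readings are made-up values for illustration:

```python
# Hypothetical daily temperature readings in degrees Celsius.
samples = [14.2, 15.1, 15.8, 16.0, 15.5, 14.9, 14.4]

# For a sampled signal, the average is just a sum and a division --
# no specialized measuring apparatus is required.
average = sum(samples) / len(samples)
print(round(average, 2))  # -> 15.13
```

The same three lines work unchanged whether the numbers represent temperatures, flood levels or sound pressure: this independence from the physical medium is precisely the advantage of the digital representation.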
Another way to look at the problem is to ask ourselves how much information we are discarding by only keeping samples of a continuous-time function. Let us put the necessary proviso (that the function be bandlimited) aside for the time being and concentrate instead on the good news: first, the analog and the digital world can perfectly coexist; second, we actually possess a constructive way to move between worlds: the sampling theorem, discovered and rediscovered by many at the beginning of the 20th century, tells us that the continuous-time function can be obtained from the samples as a sum of sinc functions centered on the sampling instants and weighted by the sample values.
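This reconstruction can be sketched numerically. The code below implements the standard sinc interpolation formula x(t) = Σ x[n] sinc((t − nT)/T); the test signal, the sampling rate and the evaluation instant are arbitrary illustrative choices, and with only finitely many samples the reconstruction is necessarily approximate:

```python
import math

def sinc(x):
    """Normalized sinc: sin(pi*x) / (pi*x), with sinc(0) = 1."""
    return 1.0 if x == 0 else math.sin(math.pi * x) / (math.pi * x)

def reconstruct(samples, T, t):
    """Shannon interpolation: sum of sincs centered on the sampling
    instants n*T and weighted by the sample values."""
    return sum(s * sinc((t - n * T) / T) for n, s in enumerate(samples))

# Sample a 1 Hz sine at 8 Hz, well above the Nyquist rate of 2 Hz...
T = 1 / 8
samples = [math.sin(2 * math.pi * n * T) for n in range(64)]

# ...and recover its value at an instant that falls between two samples.
# (Finitely many samples make this an approximation of sin(2*pi*4.05).)
value = reconstruct(samples, T, 4.05)
print(value)
```

At the sampling instants themselves the interpolation is exact, since every shifted sinc vanishes there except the one centered on that instant.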
So, in theory, once we have a set of measured values, we can build the continuous-time representation and use the tools of calculus. Quantitatively, the sampling theorem links the speed at which we need to repeatedly measure the signal to the maximum frequency contained in its spectrum. Spectra are calculated using the Fourier transform which, interestingly enough, was originally devised as a tool to break periodic functions into a countable set of building blocks. Everything comes together.
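A quick numerical experiment illustrates the quantitative side of the theorem: a sinusoid sampled at less than twice its frequency becomes indistinguishable from a lower-frequency alias. The frequencies and the sampling rate below are arbitrary illustrative values:

```python
import math

fs = 4.0  # sampling rate in Hz: too slow for a 3 Hz sinusoid (we would need > 6 Hz)

# Samples of a 3 Hz sine taken at 4 Hz...
too_fast = [math.sin(2 * math.pi * 3 * n / fs) for n in range(8)]
# ...coincide with the samples of a (3 - fs) = -1 Hz sine: the alias.
alias = [math.sin(2 * math.pi * (3 - fs) * n / fs) for n in range(8)]

print(all(abs(a - b) < 1e-9 for a, b in zip(too_fast, alias)))  # -> True
```

Once the samples are taken, no amount of processing can tell the two sinusoids apart; this is why the maximum frequency in the spectrum dictates the minimum sampling speed.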
While it appears that the time continuum has been tamed by the sampling theorem, we are nevertheless left with another pesky problem: the precision of our measurements. Consider our temperature example once more: we can use a mercury thermometer and decide to write down just the number of degrees; maybe we can be more precise and note the half-degrees as well; with a magnifying glass we could try to record the tenths of a degree — but we would most likely have to stop there. With a more sophisticated thermocouple we could reach a precision of one hundredth of a degree and possibly more but, still, we would have to settle on a maximum number of decimal places.
Now, if we know that our measurements have a fixed number of digits, the set of all possible measurements is actually countable and we have effectively mapped the codomain of our temperature function onto the set of integer numbers. This process is called quantization and it is, together with sampling, the method to obtain a fully digital signal. But why settle for finite precision in the first place? There is a very good reason, and it goes under the name of noise. The mechanical recording devices we just saw, such as the thermograph or the phonograph, give the illusion of analytical precision but are in practice subject to severe mechanical limitations.
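Quantization can be sketched as a simple rounding onto a fixed grid of levels; the step size and the readings below are arbitrary illustrative values:

```python
def quantize(value, step=0.5):
    """Uniform quantizer: map a real-valued measurement to the nearest
    multiple of `step`, i.e. onto a countable set of levels."""
    return round(value / step) * step

# Hypothetical thermometer readings, recorded to the nearest half-degree.
readings = [21.13, 21.68, 22.02]
print([quantize(r) for r in readings])  # -> [21.0, 21.5, 22.0]
```

Whatever the real-valued input, the output always belongs to the countable grid {..., 21.0, 21.5, 22.0, ...}, which is exactly the mapping onto the integers described above (each level is an integer multiple of the step).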
Noise is a fact of nature that cannot be eliminated, hence our acceptance of a finite, i.e. countable, precision. Noise is not just a problem in measurement but also in processing (see Figure 1). An analog signal processing system, much like the slide rule, uses the displacement of physical quantities (gears or electric charge) to perform its task; each element in the system, however, acts as a source of noise, so that complex or, more importantly, cheap designs introduce imprecisions in the final result (good slide rules used to be very expensive).
Digital signal processing works with countable sequences of integers, so that in a digital architecture no processing noise is introduced. A classic example is the problem of reproducing a signal: with an analog medium such as vinyl, each copy adds noise, so that basically only first-generation copies of the purchased record were of acceptable quality on home equipment. With digital formats, on the other hand, duplication is really equivalent to copying down a very long list of integers, and even very cheap equipment can do that without error. Finally, a short remark on terminology: in the following we will often use the term "digital" loosely for discrete-time signals, even when their amplitude has not been quantized.
Neglecting quantization will allow us to obtain very general results, but care must be exercised: in practice, actual implementations will have to deal with the effects of finite precision, sometimes with very disruptive consequences.

Consider, as an example, the chain of events in a mobile phone call: the voice is converted into a digital signal and then modulated onto an analog radio wave; the radio wave travels to the base station, in which it is demodulated and converted to digital format to recover the voice signal.
The call, as a digital signal, continues through a switch and then is injected into an optical fiber as an analog light wave.