Sub-Nyquist sampling: fiction or reality?


Bandwidth crisis

The first new resource crisis of the millennium may be the bandwidth crisis [1]. Technically, bandwidth is best defined as the capacity to move information through a channel: the more information you move through the channel, the more bandwidth you use. Video, for instance, uses far more bandwidth than email. A bandwidth shortage occurs when the demand to move information exceeds the capacity of the channel.

Wireless devices

The real bandwidth shortages will be in wireless, where demand is growing fast and supply is limited. The industry cannot keep up with wireless demand, and we are already seeing more and more slow connections. Jams will become the norm instead of the exception. If we want the pleasure and convenience of a high-bandwidth society, someone will need to solve the bandwidth dilemma soon [1].

Storage bottleneck

The amount of data generated worldwide is growing by 58% per year [3]. In contrast, the world's total storage capacity in hard drives, memory chips, and tape is growing at only 40% per year. A milestone was reached in 2007, when the world produced more data than could fit in all of the world's storage; by 2011 we were already producing over twice as much data as can be stored. This widening gap means we will face a deluge of data that is unavailable later for further analysis [2].

Sparse signal processing

Throughout computational science and engineering, much effort has gone into representing data in a parsimonious or sparse way. A representation is considered sparse if it captures most or all of the information in the data using a combination of only a few generating elements, or atoms.
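
As a toy illustration of this idea (a hypothetical example using NumPy, not taken from the article), a signal built from just three sinusoids spreads over hundreds of time samples, yet in the Fourier basis it is captured by only six nonzero atoms:

```python
import numpy as np

N = 256
t = np.arange(N)

# A signal made of 3 bin-aligned sinusoids: dense in time, sparse in frequency.
x = (np.sin(2 * np.pi * 10 * t / N)
     + 0.5 * np.sin(2 * np.pi * 30 * t / N)
     + 0.25 * np.sin(2 * np.pi * 70 * t / N))

X = np.fft.fft(x)

# Count the Fourier coefficients that are not numerically negligible.
significant = int(np.sum(np.abs(X) > 1e-6 * np.max(np.abs(X))))
print(f"{significant} of {N} Fourier atoms carry all the information")
```

Each real sinusoid occupies two conjugate frequency bins, so the 256-sample signal is fully described by six Fourier coefficients: a sparse representation.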

Because of the bandwidth and storage issues, sparsity has become a true priority. A sparser model means higher compression, less data to collect, store, or transmit, and reduced model complexity. The ultimate goal is to determine a sparse representation of a digital signal directly from only a few data samples, rather than first acquiring a massive amount of data only to compress it at a later stage.
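
One standard way to realize this goal is compressed sensing. The following sketch (a hypothetical NumPy example with made-up dimensions, not the article's own method) takes far fewer random linear measurements than the signal length and recovers a sparse signal with orthogonal matching pursuit, one common greedy recovery algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 128, 64, 3              # signal length, measurements, sparsity

# A k-sparse signal: only k of its n entries are nonzero.
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = [2.0, -1.5, 1.0]

# Take m << n random linear measurements instead of all n samples.
A = rng.normal(size=(m, n)) / np.sqrt(m)
y = A @ x

# Orthogonal matching pursuit: greedily pick the atom most correlated
# with the residual, then least-squares re-fit on the chosen support.
idx, residual = [], y.copy()
for _ in range(k):
    idx.append(int(np.argmax(np.abs(A.T @ residual))))
    coef, *_ = np.linalg.lstsq(A[:, idx], y, rcond=None)
    residual = y - A[:, idx] @ coef

x_hat = np.zeros(n)
x_hat[idx] = coef
print("max reconstruction error:", np.max(np.abs(x_hat - x)))
```

Here 64 measurements suffice to recover a 128-sample signal exactly, precisely because the signal is sparse: this is the sense in which sampling below the classical Nyquist rate can work.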

Currently, non-sparse data sampling still underlies nearly all signal acquisition in modern consumer electronics, biomedical monitoring, and medical imaging devices. Yet many signals are sparse or compressible when expressed in terms of the proper atoms. Sparse representations are being investigated in power systems, sonar, electrical and electronic engineering, astronomy, signal processing, hyperspectral imaging, MRI, telecommunication, telemonitoring, sensor networks, ambient assisted living, high-frequency radar, econophysics, and so on. There is almost no area in modern technology that is not impacted by signal processing!


[1]   T. Wu, "Bandwidth Is the New Black Gold", Time Magazine, March 11, 2010.
[2]   R. G. Baraniuk, "More Is Less: Signal Processing and the Data Deluge", Science 331 (2011) p. 717.
[3]   J. Gantz, D. Reinsel, "The Digital Universe Decade. Are You Ready?", IDC White Paper, May 2010.