

Oct 11, 2010

Comments on information theory

Cambridge. Recently I moved to Cambridge, UK, to join the Signal Processing and Communications Lab for three months. Here I have started to work on some topics related to information theory. Reading Gallager's book [G69] I discovered a view of information theory quite different from the one taught at Stanford, which is based on Cover and Thomas's book [CT06]. Here is why.

The Stanford courses EE376A (taught by Cover) and EE479 (a course on multiuser information theory taught by El Gamal) offer a clean view of information theory. While the first course focuses on the intuition behind information theory, developing some basic tools to deal with simple channels, the second builds a solid framework for the achievability and converse proofs of most capacity problems. Both courses, however, are the result of many years of iteration, each round polishing the proofs and concepts until everything is simple and elegant. In fact the notion of typicality, while useful for deriving capacity in an elegant way, hides many concepts and much machinery behind it.
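As a toy illustration of what typicality captures, here is a small sketch (my own, not from either course) that checks the asymptotic equipartition property empirically for a Bernoulli source: the per-symbol log-probability of a long sample concentrates around the entropy H(p), which is exactly what makes the sample "typical".

```python
import math
import random

def entropy(p):
    """Binary entropy H(p) in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def empirical_rate(p, n, seed=0):
    """-(1/n) log2 P(x^n) for one sample x^n of a Bernoulli(p) source."""
    rng = random.Random(seed)
    logp = 0.0
    for _ in range(n):
        x = 1 if rng.random() < p else 0
        logp += math.log2(p if x == 1 else 1 - p)
    return -logp / n

p, n = 0.2, 100_000
print(entropy(p))            # H(0.2) ≈ 0.722 bits
print(empirical_rate(p, n))  # close to H(p): the sample is typical
```

With high probability the observed sequence has probability close to 2^(-nH(p)); the elegance of the typicality machinery is that this single fact already carries most of the weight in the standard capacity proofs.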

Of course, simple proofs make it possible to solve complicated problems. However, there is a risk that students remain unaware of the concepts hidden behind them. For example, I had the impression that typicality arguments in a random coding scheme arise naturally when trying to derive capacity. It was surprising to find in [G69] that capacity is also the outcome of applying Maximum Likelihood decoding to a point-to-point communication scheme. Although a couple of apparently magic bounds are used in the derivation, it exposes the concept of the error exponent, which links the purely theoretical results to more practical schemes.
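To make the error exponent concrete, here is a sketch (my own, for the special case of a binary symmetric channel with uniform inputs, not taken from [G69]) of Gallager's random coding exponent E_r(R) = max over ρ in [0,1] of E_0(ρ) - ρR. It is positive for every rate below capacity, so the ML error probability of a random code decays exponentially in the block length:

```python
import math

def E0(rho, eps):
    """Gallager's E0(rho) in bits for a BSC(eps) with uniform inputs."""
    s = eps ** (1 / (1 + rho)) + (1 - eps) ** (1 / (1 + rho))
    return rho - (1 + rho) * math.log2(s)

def random_coding_exponent(R, eps, steps=1000):
    """E_r(R) = max over rho in [0,1] of E0(rho) - rho*R (grid search)."""
    return max(E0(k / steps, eps) - (k / steps) * R for k in range(steps + 1))

eps = 0.1
C = 1 - (-eps * math.log2(eps) - (1 - eps) * math.log2(1 - eps))  # capacity of BSC(0.1)
print(random_coding_exponent(0.25, eps))  # positive: error decays as 2^(-n*E_r(R))
print(random_coding_exponent(C, eps))     # the exponent vanishes at capacity
```

The slope E_0'(0) equals the mutual information, which is how capacity falls out of the ML analysis without ever mentioning typical sets.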

To complete this post I have to mention [H03], a book that tries to generalize many aspects of classical information theory through the concept of the information spectrum. While I have not read it in depth, the approach it follows also gives interesting insights into the problem.

[CT06] Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, 2nd ed. Wiley, 2006.

[G69] Robert G. Gallager, Information Theory and Reliable Communication. Wiley, 1968.

[H03] Te Sun Han, Information-Spectrum Methods in Information Theory. Springer, 2003. (Original Japanese edition published by Baifukan, Tokyo, 1998.)



Oct 2, 2010

ECC Report 159: European white space devices

ECC. While in the US the FCC pushed the "final" rules for unlicensed access to the television band, in Europe the ECC approved a similar report, although still in a preliminary phase. This report, published as ECC Report 159, can be downloaded from the CEPT meeting documentation area (by selecting group 43, year 2010, folder SE43#7-1009-Biel>>Minutes and document number SE43(10)126-Annex 3). In it the working group SE43 studies both the protection requirements of the licensed users of the 470-790 MHz band (and its neighboring bands) and the operational characteristics of the unlicensed devices, referred to in the document as white space devices.

The document studies three candidate techniques to be implemented by the cognitive radio devices, namely sensing, geolocation database and beacon. However, along the same lines as the FCC's conclusion, the report indicates that current technology is not adequate for standalone sensing-based systems:
"The sensing thresholds were derived for a limited number of scenarios using the methodology developed within this report and taking into account a range of potential DTT receiver configurations. Some of the values so obtained (being in the range from -91 to -165 dBm depending on the DTT planning scenario) appear to be too low to be implemented using the current technology. Moreover, in some scenarios, even these low values for the detection threshold do not guarantee a reliable detection of the presence/absence of the broadcasting signals at the distance corresponding to the interference potential of a WSD."
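To put those thresholds in perspective, a quick back-of-the-envelope check (my own numbers, not from the report): the thermal noise floor of an 8 MHz DTT channel at room temperature is about -105 dBm, so the lower end of the quoted range sits roughly 60 dB below the noise.

```python
import math

def thermal_noise_floor_dbm(bandwidth_hz):
    """Thermal noise power kTB at T = 290 K, in dBm.
    kT at 290 K is the usual -174 dBm/Hz figure."""
    return -174 + 10 * math.log10(bandwidth_hz)

floor = thermal_noise_floor_dbm(8e6)  # 8 MHz DTT channel
print(round(floor, 1))                # ≈ -105.0 dBm
print(round(-165 - floor, 1))         # ≈ -60.0 dB below the noise floor
```

Even before accounting for receiver noise figure and implementation losses, reliably deciding the presence or absence of a signal that far below the noise requires very long integration times or exploiting known signal features, which is essentially the report's point.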

While ECC Report 159 also studies the combination of sensing and a geolocation database to assure the required protection of primary users, the ECC will probably conclude that geolocation-based devices offer enough protection without additional sensing. This may look like bad news for the companies that invested in spectrum sensing research. However, as Roberto points out in a comment on the last post, spectrum sensing devices may be useful to build and keep up to date the information stored in the database.

