Is the PHY Layer Dead?
Last post closed the series devoted to ICASSP 2011. Today I want to refer to an article I read some time ago, "Is the PHY Layer Dead?" [DH+10], coauthored by M. Dohler, R. W. Heath Jr., A. Lozano, C. Papadias and R. A. Valenzuela. The paper originated in a discussion held at IEEE VTC Spring 2009 about the relevance of current research on the physical layer (PHY). The article is really interesting and worth reading for any researcher working in the field.
Some of the questions raised there can be particularized to cognitive radio. Here are a couple of thoughts:
The cognitive radio research community has developed an extensive set of detectors for a variety of system models. Have we achieved detection performance close to what we can expect from a cognitive radio device?
As James Neel argued in one of his posts,
"there’s waaay too many signal classification / detection papers and effort would be better spent on other aspects of learning about a radio’s environment."
In my opinion the answer is not so clear. First, in most practical detection problems there is no clear performance limit that can serve as a reference for the remaining margin of improvement. The optimal detector, the Neyman-Pearson likelihood-ratio test, could in principle be used as a benchmark. However, it is not implementable in the presence of nuisance parameters, and thus its performance cannot be guaranteed to be achievable.
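To make the benchmark argument concrete (the notation below is mine, not taken from [DH+10]): writing $\mathcal{H}_0$ for "channel free" and $\mathcal{H}_1$ for "primary user present", the Neyman-Pearson detector declares $\mathcal{H}_1$ when

$$\Lambda(\mathbf{x}) = \frac{p(\mathbf{x} \mid \mathcal{H}_1)}{p(\mathbf{x} \mid \mathcal{H}_0)} > \gamma,$$

with the threshold $\gamma$ fixed by the target false-alarm probability. When the likelihoods depend on unknown nuisance parameters $\boldsymbol{\theta}_i$ (noise power, channel gains, etc.), $\Lambda$ cannot be evaluated, and the GLRT replaces them by their maximum-likelihood estimates:

$$\Lambda_{\text{GLRT}}(\mathbf{x}) = \frac{\max_{\boldsymbol{\theta}_1} p(\mathbf{x} \mid \boldsymbol{\theta}_1, \mathcal{H}_1)}{\max_{\boldsymbol{\theta}_0} p(\mathbf{x} \mid \boldsymbol{\theta}_0, \mathcal{H}_0)},$$

which is no longer guaranteed to be optimal.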
Second, in certain scenarios the analysis of well-performing detectors, such as the GLRT, may offer insight into the information a learning algorithm requires. As a simple example, if the GLRT statistic is a function only of the largest eigenvalue of the empirical covariance matrix, then this eigenvalue is a good input for a learning algorithm: the algorithm does not need to process the whole data set, which may be computationally infeasible.
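As a rough illustration of that last point, here is a minimal Python sketch (a toy example of mine, not a detector from any particular paper): the learning algorithm only needs one scalar per sensing window instead of the full $M \times N$ block of samples.

```python
import numpy as np

def max_eig_statistic(X):
    """Largest-eigenvalue statistic for multi-antenna spectrum sensing.

    X: complex array of shape (M, N) -- N snapshots from M antennas.
    Returns the top eigenvalue of the empirical covariance matrix,
    normalized by the average per-antenna power so the statistic is
    blind to an unknown overall scale (one common convention).
    """
    M, N = X.shape
    R = (X @ X.conj().T) / N           # empirical covariance, M x M
    eigvals = np.linalg.eigvalsh(R)    # real eigenvalues, ascending
    return eigvals[-1] / (np.trace(R).real / M)

# Toy usage: noise only (H0) vs. a rank-one signal in noise (H1).
rng = np.random.default_rng(0)
M, N = 4, 1000
noise = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
print("H0 statistic:", max_eig_statistic(noise))  # slightly above 1

h = rng.standard_normal((M, 1)) + 1j * rng.standard_normal((M, 1))   # channel
s = (rng.standard_normal((1, N)) + 1j * rng.standard_normal((1, N))) / np.sqrt(2)
print("H1 statistic:", max_eig_statistic(noise + 0.5 * h @ s))  # noticeably larger
```

Of course, setting the threshold on this scalar is exactly where the lack of a clean benchmark from the previous point reappears.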
The cognitive radio community has focused mainly on clean, idealized problems, which has led to an extensive set of algorithms and mathematical tools. Can these be translated to more sophisticated system problems, such as the ones we expect to find in real environments?
As Volkan pointed out, Wi-Fi can always deal with simple scenarios; however, when there are 570 Wi-Fi base stations operating in one room, all these uncoordinated networks crash. In my opinion this "worst case" should always be taken into account when thinking about cognitive radio algorithms. Moreover, empirical results from test-beds are so far quite limited, and this kind of experimental work should be promoted.
These are just some ideas. Several other questions come to mind: for example, whether we are focusing too much on a specific application (why cognitive radio?), or the connections between academia and industry (is there already an industry around cognitive radio?)... What do you think?
[DH+10] M. Dohler, R. W. Heath Jr., A. Lozano, C. Papadias, and R. A. Valenzuela, "Is the PHY Layer Dead?", IEEE Communications Magazine, 2010.
Labels: cognitive radio, phy, research