Estimating an unknown signal in Wireless Sensor Networks (WSNs) requires sensor nodes to transmit their observations of the signal over a multiple access channel to a Fusion Center (FC). The FC uses the received observations, which are corrupted by observation noise as well as by channel fading and channel noise, to compute the minimum Mean Square Error (MSE) estimate of the signal. In this paper, we investigate the effect of the source-node correlation (the correlation between the sensor node observations and the source signal) and the inter-node correlation (the correlation among the sensor node observations) on the performance of the Linear Minimum Mean Square Error (LMMSE) estimator for three correlation models in the presence of channel fading. First, we analyze the asymptotic behavior of the achieved distortion (i.e., the MSE) resulting from both the observation and channel noise over a non-fading channel. Then, the effect of channel fading is considered, and the corresponding distortion outage probability, the probability that the distortion exceeds a given threshold, is derived. By representing the distortion as a ratio of indefinite quadratic forms, a closed-form expression is obtained for the outage probability that makes its dependence on the correlation explicit. Finally, this new representation of the outage probability allows us to propose an iterative solution to the power allocation problem of minimizing the outage probability under total and individual power constraints. Numerical simulations are provided to verify our analytical results. © 2013 IEEE.
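The abstract's core setup can be illustrated with a minimal numerical sketch. The model below is an assumption, not the paper's exact formulation: a scalar source with uncorrelated observation noise at each sensor (one possible inter-node correlation model), unit amplification gains, and Rayleigh fading magnitudes. It computes the LMMSE weights at the FC and the achieved distortion D for one channel realization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters (illustrative, not taken from the paper):
# K sensors observe a scalar source theta with variance sigma_t2;
# sensor i sees x_i = theta + v_i and the FC receives
# y_i = h_i * g_i * x_i + w_i over a fading channel.
K = 5
sigma_t2 = 1.0   # source variance
sigma_v2 = 0.1   # observation-noise variance per sensor
sigma_w2 = 0.2   # channel-noise variance at the FC
h = rng.rayleigh(scale=1.0, size=K)  # fading magnitudes
g = np.ones(K)                       # unit amplification gains

a = h * g  # effective end-to-end gain per sensor

# Cross-covariance c = E[theta * y] and received covariance C_y
c = sigma_t2 * a
Cy = sigma_t2 * np.outer(a, a) + np.diag(a**2 * sigma_v2 + sigma_w2)

# LMMSE combining weights and achieved distortion D = sigma_t2 - c^T Cy^{-1} c
w_lmmse = np.linalg.solve(Cy, c)
D = sigma_t2 - c @ w_lmmse
print(f"achieved distortion D = {D:.4f}")
```

Under fading, D is a random variable through h, which is what makes the distortion outage probability Pr(D > threshold) the natural performance metric studied in the paper.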
ASJC Scopus subject areas
- Signal Processing
- Electrical and Electronic Engineering