Abstract:
This paper presents the theoretical model and simulation results of a communication system representing the physical layer of a wireless sensor network. The spreading sequences used are generated according to the standard specification for low-rate wireless sensor networks. Because fading severely affects signal transmission in this kind of network, the chip interleaving technique is investigated as a means of fading mitigation in the channel. Closed-form expressions for the bit error rate (BER) are derived for the cases when the communication system operates with noise only, with noise and fading, and with noise and fading when interleavers are present. It is proved that the BER in the fading channel can be significantly improved using the interleaver/deinterleaver technique. The theoretical analysis, the derivation of the closed-form BER expressions, and the simulations are all based on a discrete-time-domain representation of the signals in the system. This discrete-time representation allows direct implementation of the developed system in digital technology, which was one of the aims of this research. Following the theoretical model, simulators were developed, and the simulation results confirmed the derived theoretical expressions.
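To illustrate the chip interleaving technique discussed above, the following is a minimal sketch of a block chip interleaver and its matching deinterleaver. Chips are written into a matrix row by row and read out column by column, so a burst of consecutive chips corrupted by a fade is dispersed across the deinterleaved sequence. The function names and the matrix dimensions are illustrative assumptions, not taken from the paper.

```python
def interleave(chips, rows, cols):
    """Block interleaver: write chips row-wise into a rows x cols
    matrix, read them out column-wise."""
    assert len(chips) == rows * cols
    return [chips[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(chips, rows, cols):
    """Inverse permutation of interleave(): write column-wise,
    read row-wise, restoring the original chip order."""
    assert len(chips) == rows * cols
    return [chips[c * rows + r] for r in range(rows) for c in range(cols)]

if __name__ == "__main__":
    seq = list(range(12))          # 12 chips as a toy example
    tx = interleave(seq, rows=3, cols=4)
    # A 3-chip fade burst in tx hits chips that are 4 positions
    # apart in the recovered sequence.
    assert deinterleave(tx, rows=3, cols=4) == seq
```

Because the deinterleaver is the exact inverse permutation, the scheme adds no redundancy; it only converts burst chip errors into isolated ones that the despreading correlator can tolerate.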