Subject: Re: CVS commit: src/sys/dev/ic
To: None <rpaulo@netbsd.org>
From: Greg Troxel <gdt@ir.bbn.com>
List: source-changes
Date: 06/09/2006 10:18:17

Rui Paulo <rpaulo@netbsd.org> writes:

> Module Name:	src
> Committed By:	rpaulo
> Date:		Thu Jun  8 20:56:41 UTC 2006
>
> Modified Files:
> 	src/sys/dev/ic: rt2661.c rt2661var.h
>
> Log Message:
> Bring the following change from OpenBSD:
>
>   Keep track of the average RSSI using an Exponential Moving Average (EMA).
>   Use it to dynamically tune radio receive sensitivity.
>
>   The idea is simple:
>   - increase sensitivity when the RSSI is bad to optimize throughput on
>     long distance to the AP, and
>   - decrease sensitivity when the RSSI is good to reduce noise level and
>     optimize throughput on short distance to the AP
>
>   The EMA smooths out RSSI variations so we don't end up changing the
>   sensitivity too frequently.  We check whether it is worth updating
>   the sensitivity once every second.
>   RSSI thresholds were taken from the Ralink Tech. Linux driver.
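
For concreteness, the update being described is just a weighted blend
of each new sample into a running average, plus a periodic threshold
check.  A minimal sketch in C (the weights, thresholds, and names are
illustrative, not the driver's actual values):

/*
 * Illustrative only: integer EMA of RSSI plus a once-per-second
 * sensitivity decision.  All constants are invented; the real
 * thresholds were lifted from the Ralink Linux driver.
 */
#define RSSI_LOW_THRESH		(-80)	/* dBm, invented */
#define RSSI_HIGH_THRESH	(-50)	/* dBm, invented */

struct rssi_ema {
	int	avg;		/* smoothed RSSI, dBm */
	int	sens;		/* current receive sensitivity setting */
};

static void
rssi_ema_sample(struct rssi_ema *e, int rssi)
{
	/* new avg = 0.9 * old avg + 0.1 * sample, in integer math */
	e->avg = (9 * e->avg + rssi) / 10;
}

/* run from a one-second callout */
static void
rssi_ema_tune(struct rssi_ema *e)
{
	if (e->avg < RSSI_LOW_THRESH)
		e->sens++;	/* weak signal: raise sensitivity */
	else if (e->avg > RSSI_HIGH_THRESH)
		e->sens--;	/* strong signal: lower it, cut noise */
}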

In Amateur Radio receivers there is commonly a preamplifier that can
be enabled or disabled, and somewhat less often an attenuator that
can be switched in (the latter is sometimes called "Intercept Point
Optimization").  The basic issue is that amplifiers are nonlinear, and
processing signals at higher levels leads to more nonlinearity.

Broadly, nonlinearity hurts in two ways:

  distortion because the intended signal is very strong

  distortion from off-channel signals resulting in on-channel energy

I don't know what the "sensitivity" adjustment in this chipset
actually does.  But lowering gain is likely to reduce distortion from
strong signals, and for a single intended receiver this approach seems
sensible (although it may not be helpful depending on the chipset
design).

For IBSS or hostap modes, the right approach is far less clear (as
others have already commented).  I expect the basic approach is to
find the gain setting that results in the least total impairment from
off-channel distortion and on-channel lack of sensitivity.  I don't
think this can be done from RSSI alone, since one is also worried
about hearing from previously unknown nodes.  Perhaps one could adapt
SampleRate to the receive side....

At BBN I am working on a project that is developing a link state ad
hoc subnet layer and MAC intended for use with software radios.  At
all levels parameters are exposed for cognitive control.  So my
reaction is that this algorithm should be optional with a sysctl, and
that the sensitivity should be settable.
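
Something like the following would do it (a sketch only; the node
names, defaults, and placement are my invention, not a concrete
proposal):

#include <sys/param.h>
#include <sys/sysctl.h>

/* Hypothetical knobs; names invented for illustration. */
static int rt2661_dyn_sens = 1;	/* enable automatic tuning */
static int rt2661_sens;		/* manual setting when disabled */

SYSCTL_SETUP(sysctl_rt2661_setup, "rt2661 sensitivity controls")
{
	const struct sysctlnode *node;

	sysctl_createv(clog, 0, NULL, &node,
	    CTLFLAG_PERMANENT,
	    CTLTYPE_NODE, "rt2661",
	    SYSCTL_DESCR("rt2661 driver controls"),
	    NULL, 0, NULL, 0,
	    CTL_HW, CTL_CREATE, CTL_EOL);
	if (node == NULL)
		return;
	sysctl_createv(clog, 0, &node, NULL,
	    CTLFLAG_PERMANENT | CTLFLAG_READWRITE,
	    CTLTYPE_INT, "dyn_sensitivity",
	    SYSCTL_DESCR("dynamically tune receive sensitivity"),
	    NULL, 0, &rt2661_dyn_sens, 0,
	    CTL_CREATE, CTL_EOL);
	sysctl_createv(clog, 0, &node, NULL,
	    CTLFLAG_PERMANENT | CTLFLAG_READWRITE,
	    CTLTYPE_INT, "sensitivity",
	    SYSCTL_DESCR("receive sensitivity when tuning is off"),
	    NULL, 0, &rt2661_sens, 0,
	    CTL_CREATE, CTL_EOL);
}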

It's not clear how RSSI and sensitivity interact.
Ideally, the assessment of received signal level would be correct in
dBm regardless of sensitivity.  If RSSI is average energy at some
point in the receive chain, I'd expect it to decrease when sensitivity
is decreased.
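
To make that concrete, a toy illustration (all numbers invented; this
is not a claim about how the chipset actually measures RSSI):

/*
 * Toy example: if the RSSI register measures energy after a
 * variable-gain stage, the current gain must be backed out to get
 * a sensitivity-independent level referred to the antenna.
 */
static int
rssi_to_dbm(int raw_rssi, int rx_gain_db)
{
	int at_adc_dbm = raw_rssi - 100;	/* invented register scale */

	return at_adc_dbm - rx_gain_db;
}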

While I was reading the code: why is the raw RSSI sent to the 802.11
layer, instead of the decoded RSSI via rt2661_get_rssi?

--
        Greg Troxel <gdt@ir.bbn.com>
