Current-Users archive


Re: ntpd stratum 1 funny offset with NetBSD 6 branch



Frank Kardel wrote:

> The NMEA driver has a section that checks the relation between
> the time code and the PPS time stamps (refclock_ppsrelate).
> 
> This code attempts to determine if the last PPS time stamp matches
> the received timecode within bounds. It is sensitive to
> time1 (the PPS offset) and time2 (the end-of-line offset).
> If the end-of-line offset (default 0) is far from the true
> end-of-line offset, this code may come to the wrong conclusions.
> 
> Maybe a short debugging session can help there - without more
> information, analysis is a bit elaborate here.
> 
> ntpd -d -d should shed some light on the actual
> receive time stamps. These should be compensated by time2.
> 
> I have not checked whether refclock_ppsrelate is correct.

I'm not convinced that toying with time2 is the answer, as the offset
past the second is relatively random: e.g. 0.650 one time, 0.640 another,
and 0.625 another.  But I'll change my mind by the end of this email. :)

I've changed the GPS config so that it only emits RMC sentences, and now
at least it's producing a sentence every second instead of a group of
sentences every two seconds.  The driver doco even goes as far as to say
"The driver expects the receiver to be set up to transmit at least one
supported sentence every second." so I think this was something that I
needed to do anyway.  Also, for kicks, I changed the PPS pulse to 100ms
instead of 40ms (more to prove that it would work!).

I've put some wide (wider than 80 columns, I'm afraid) output below.
Lines starting with "p" are PPS state changes (similar to pps-api's
output) and lines starting with "n" are NMEA sentences.  The timestamps
consistently show that the NMEA output appears about half a second
after the second.

Here's a timestamp from "ntpd -d -d":

refclock_gtraw: fd 5 time 3583402981.660202 timecode 73 $GPRMC,134301,A,3810.2921,S,14418.4055,E,000.0,000.0,210713,011.5,E,A*0F

It looks like the timestamp is added when the whole sentence has been
received.  At 4800 baud, that sentence should take about 0.15 seconds.
This, combined with the sentence starting around 0.45 past the second,
adds up nicely to the 0.600 - 0.650 offset I regularly see.

A little further digging shows that we always return from
refclock_ppsrelate() here:

        if (fabs(delta) > 0.45)
                return PPS_RELATE_EDGE; /* cannot PLL with atom code */

This is the 0.45-second case that Frank pointed out earlier, so we're
always ignoring the PPS timestamp.

Going back to time2 - is it just a case of "get it close enough that the
PPS is valid and we'll use the PPS anyway"?  I'm still concerned that
I don't see the same offset all the time, so any value for time2 isn't
going to be entirely accurate.
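For reference, the refclock stanza I'm experimenting with looks roughly
like the following (driver 20 is the NMEA driver; mode 1 selects $GPRMC
only and flag1 enables PPS, per the driver doco - the exact time2 value
is what's being tuned here):

```
server 127.127.20.0 mode 1 minpoll 4     # NMEA driver, mode 1 = $GPRMC only
fudge  127.127.20.0 flag1 1 time2 0.650  # flag1 1 = use PPS; time2 = end-of-line offset
```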

With "fudge time2 -0.650" I get:

     remote           refid      st t when poll reach   delay   offset  jitter
==============================================================================
 127.127.20.0    .PPS.            0 l   11   16  377    0.000  -1000.0   0.017
+192.168.0.1     27.50.90.253     3 u   30   64  177    0.491   -1.617   0.059
 192.168.0.42    192.168.0.1      4 u   28   64  177    0.463   -4.895   0.096
 128.184.218.53  169.254.0.1      3 u   28   64  177   22.199   -2.344   2.684
*116.66.160.39   130.234.255.83   2 u   32   64  177   18.926   -0.821  33.310
 202.127.210.37  118.143.17.82    2 u   32   64  177   20.045   -0.473  30.881

so we went the wrong way, but with "fudge time2 0.650" I get:

     remote           refid      st t when poll reach   delay   offset  jitter
==============================================================================
 127.127.20.0    .PPS.            0 l    7   16  377    0.000   -0.033   0.009
-192.168.0.1     27.50.90.253     3 u    7   64  377    0.452   -1.649   0.072
-192.168.0.42    192.168.0.1      4 u    9   64  377    0.461   -4.633   0.145
+202.191.108.73  47.187.174.51    2 u   12   64  377   26.623   -2.724   5.725
+202.60.94.15    116.66.160.39    3 u    4   64  377   40.921   -4.530  22.255
*27.54.95.11     218.100.43.70    2 u    5   64  377   53.308   -4.057  25.255

Aha - we at last seem to be on a winner!

I'll let it run overnight, but things are finally looking sane.

Cheers,
Simon.
--

p 1374413286.100783 > assert = 0.000000000  clear = 1374413286.100696632  0  1 1374413286.100696564
n 1374413286.485185 > $GPRMC,132806,A,3810.2917,S,14418.4056,E,000.0,000.0,210713,011.5,E,A*03
p 1374413287.000744 > assert = 1374413287.000694025  clear = 1374413286.100696632  1  1 -0.899997393
p 1374413287.100742 > assert = 1374413287.000694025  clear = 1374413287.100696459  1  2  0.100002434
n 1374413287.468114 > $GPRMC,132807,A,3810.2917,S,14418.4056,E,000.0,000.0,210713,011.5,E,A*02
p 1374413288.000781 > assert = 1374413288.000693037  clear = 1374413287.100696459  2  2 -0.899996578
p 1374413288.100790 > assert = 1374413288.000693037  clear = 1374413288.100695657  2  3  0.100002620
n 1374413288.468119 > $GPRMC,132808,A,3810.2917,S,14418.4056,E,000.0,000.0,210713,011.5,E,A*0D
p 1374413289.000765 > assert = 1374413289.000693531  clear = 1374413288.100695657  3  3 -0.899997874
p 1374413289.100790 > assert = 1374413289.000693531  clear = 1374413289.100694447  3  4  0.100000916
n 1374413289.473172 > $GPRMC,132809,A,3810.2917,S,14418.4056,E,000.0,000.0,210713,011.5,E,A*0C
p 1374413290.000770 > assert = 1374413290.000692025  clear = 1374413289.100694447  4  4 -0.899997578
p 1374413290.100739 > assert = 1374413290.000692025  clear = 1374413290.100694496  4  5  0.100002471
n 1374413290.469093 > $GPRMC,132810,A,3810.2917,S,14418.4056,E,000.0,000.0,210713,011.5,E,A*04
p 1374413291.000769 > assert = 1374413291.000692407  clear = 1374413290.100694496  5  5 -0.899997911
p 1374413291.100778 > assert = 1374413291.000692407  clear = 1374413291.100693879  5  6  0.100001472
n 1374413291.477791 > $GPRMC,132811,A,3810.2917,S,14418.4056,E,000.0,000.0,210713,011.5,E,A*05
p 1374413292.000780 > assert = 1374413292.000690642  clear = 1374413291.100693879  6  6 -0.899996763
p 1374413292.100748 > assert = 1374413292.000690642  clear = 1374413292.100693187  6  7  0.100002545
n 1374413292.468145 > $GPRMC,132812,A,3810.2917,S,14418.4056,E,000.0,000.0,210713,011.5,E,A*06
p 1374413293.000783 > assert = 1374413293.000690469  clear = 1374413292.100693187  7  7 -0.899997282
p 1374413293.100760 > assert = 1374413293.000690469  clear = 1374413293.100692644  7  8  0.100002175
n 1374413293.462064 > $GPRMC,132813,A,3810.2917,S,14418.4056,E,000.0,000.0,210713,011.5,E,A*07
p 1374413294.000750 > assert = 1374413294.000689852  clear = 1374413293.100692644  8  8 -0.899997208
p 1374413294.100759 > assert = 1374413294.000689852  clear = 1374413294.100692545  8  9  0.100002693
n 1374413294.525224 > $GPRMC,132814,A,3810.2917,S,14418.4056,E,000.0,000.0,210713,011.5,E,A*00
p 1374413295.000787 > assert = 1374413295.000689234  clear = 1374413294.100692545  9  9 -0.899996689
p 1374413295.100771 > assert = 1374413295.000689234  clear = 1374413295.100691817  9 10  0.100002583
n 1374413295.467089 > $GPRMC,132815,A,3810.2917,S,14418.4056,E,000.0,000.0,210713,011.5,E,A*01
p 1374413296.000735 > assert = 1374413296.000688135  clear = 1374413295.100691817 10 10 -0.899996318
p 1374413296.100752 > assert = 1374413296.000688135  clear = 1374413296.100690459 10 11  0.100002324


