OK, I ran two irtt tests overnight at 4 am local time. SQM was disabled and the network was quiet.

The first test was run with --dscp=0xfe -i3ms -d5m, i.e. the default packet size (small packets):

Total RTT latency is red, receive latency is blue, send latency is green, and packet loss % is orange (I used a shifted rolling average here for visualization). The small brown dots along the bottom mark the seconds of the Starlink optimization/satellite shift (although it's pretty obvious from the graph).


                         Min     Mean   Median      Max   Stddev
                         ---     ----   ------      ---   ------
                RTT  57.76ms  84.43ms  83.12ms    200ms  12.71ms
         send delay  15.19ms  33.28ms  31.52ms  146.5ms  10.49ms
      receive delay  39.45ms  51.15ms   51.2ms  105.7ms   5.78ms
                                                                
      IPDV (jitter)    142µs   4.12ms   2.99ms  83.71ms   3.75ms
          send IPDV   6.54µs   3.96ms   2.99ms  81.36ms   3.29ms
       receive IPDV       0s    732µs   26.5µs  46.18ms    2.3ms
                                                                
     send call time   5.68µs   54.6µs            1.77ms     25µs
        timer error       0s   29.3µs           14.58ms   76.8µs
  server proc. time    620ns      3µs            86.5µs   3.48µs

                duration: 5m1s (wait 599.9ms)
   packets sent/received: 99670/87038 (12.67% loss)
 server packets received: 87089/99670 (12.62%/0.06% loss up/down)
     bytes sent/received: 5980200/5222280
       send/receive rate: 159.5 Kbps / 139.3 Kbps
           packet length: 60 bytes
             timer stats: 328/99998 (0.33%) missed, 0.98% error
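As a sanity check on that up/down split: the percentages reproduce from the raw packet counts, and irtt appears to compute downstream loss against the packets that actually reached the server rather than against the packets sent. A quick sketch with the first run's numbers:

```python
# Numbers copied from the first run's summary above.
sent = 99670             # packets the client sent
server_received = 87089  # packets that reached the server (upstream survivors)
client_received = 87038  # replies that made it back to the client

total_loss = (sent - client_received) / sent * 100               # 12.67%
up_loss = (sent - server_received) / sent * 100                  # 12.62%
# downstream loss is taken out of the packets the server actually echoed
down_loss = (server_received - client_received) / server_received * 100  # 0.06%

print(f'{total_loss:.2f}% total, {up_loss:.2f}% up, {down_loss:.2f}% down')
# -> 12.67% total, 12.62% up, 0.06% down
```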

Then here is the second run, with the packet size at its maximum: --dscp=0xfe -i3ms -d5m -l 1472

                         Min     Mean   Median      Max   Stddev
                         ---     ----   ------      ---   ------
                RTT  59.64ms  115.7ms  82.63ms  501.4ms   97.9ms
         send delay  14.22ms  66.15ms  33.48ms  448.7ms  96.96ms
      receive delay  40.54ms  49.55ms  49.24ms  107.7ms   5.38ms
                                                                
      IPDV (jitter)   35.6µs   4.01ms   2.98ms  111.2ms   3.63ms
          send IPDV     60ns   3.61ms   2.97ms  99.64ms    3.4ms
       receive IPDV       0s    997µs   48.1µs  52.83ms   2.16ms
                                                                
     send call time   9.68µs   66.7µs             2.5ms   22.5µs
        timer error      1ns   36.8µs            8.12ms   51.6µs
  server proc. time    620ns   2.85µs             199µs    2.8µs

                duration: 5m2s (wait 1.5s)
   packets sent/received: 99704/94276 (5.44% loss)
 server packets received: 94335/99704 (5.38%/0.06% loss up/down)
     bytes sent/received: 146764288/138774272
       send/receive rate: 3.91 Mbps / 3.70 Mbps
           packet length: 1472 bytes
             timer stats: 296/100000 (0.30%) missed, 1.23% error

In hindsight I'm wondering if I should have used a smaller packet size than the maximum. From my Starlink data logger I can see it was uploading a little over 4 Mbps at the time, so perhaps it was hitting the upper limit of the upload bandwidth available on a couple of satellites, hence the send-latency spikes? It's also really interesting that toward the end the latency spiked for one 15-second block and then immediately came back down.
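The ~4 Mbps from the data logger lines up with the offered load: one 1472-byte payload every 3 ms is almost exactly an MTU-sized packet per interval. A back-of-the-envelope check, assuming plain IPv4/UDP headers with no options:

```python
interval_s = 0.003   # -i3ms
payload = 1472       # -l 1472 (UDP payload bytes)
headers = 20 + 8     # IPv4 + UDP headers -> 1500-byte IP packets

payload_mbps = payload * 8 / interval_s / 1e6
wire_mbps = (payload + headers) * 8 / interval_s / 1e6
print(f'offered load: {payload_mbps:.2f} Mbps payload, {wire_mbps:.2f} Mbps on the wire')
# -> offered load: 3.93 Mbps payload, 4.00 Mbps on the wire
```

That matches the reported 3.91 Mbps send rate (irtt divides by the actual run duration), and puts the on-wire load right at the ~4 Mbps the logger saw.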

What if Starlink assigns bandwidth at those 15-second intervals, so that when dishy switches to a satellite it gets a fixed bandwidth allocation (based on TDMA timeslots, or however Starlink operates) for that entire 15-second interval?
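One way to poke at that hypothesis with the captured data would be to bucket every sample into the quarter-minute slot it falls in and compare the slots. A sketch on synthetic data (the :12/:27/:42/:57 slot boundaries are taken from the brown dots in my graph, and the send delays here are random placeholders):

```python
import numpy as np
import pandas as pd

# Fake 5 minutes of send-delay samples, one every 150 ms.
rng = np.random.default_rng(0)
idx = pd.date_range('2022-01-01 04:00', periods=2000, freq='150ms')
send_ms = pd.Series(rng.normal(33, 5, len(idx)), index=idx)

# Shift by 12 s so the slot boundaries land on :12/:27/:42/:57,
# then integer-divide into four 15-second slots per minute.
slot = ((send_ms.index.second - 12) % 60) // 15
print(send_ms.groupby(slot).agg(['mean', 'max']))
```

If the per-slot means (or maxima) differ consistently on the real data, that would support the fixed-allocation-per-slot idea.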

If you or anyone else wants to play with the data more, you can download the irtt JSON output here, along with the script I used to generate these graphs:

import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
import matplotlib.dates as mdates
import json
import sys


if len(sys.argv) < 3:
    print('Usage:', sys.argv[0], 'input_filename.json output_filename.svg')
    sys.exit(1)

input_filename = sys.argv[1]
output_filename = sys.argv[2]

with open(input_filename) as f:
    data = json.load(f)

round_trips = data['round_trips']
rtts = []
receive_latency = []
send_latency = []
ts = []
index = []
lost_packet = []
count = 0
for round_trip in round_trips:
    ts.append(round_trip['timestamps']['client']['send']['wall'])
    # irtt's 'lost' field is a string: 'false', 'true', 'true_up' or 'true_down'
    if round_trip['lost'] == 'false':
        rtts.append(round_trip['delay']['rtt'] / 1000000)  # ns -> ms
        receive_latency.append(round_trip['delay']['receive'] / 1000000)
        send_latency.append(round_trip['delay']['send'] / 1000000)
        lost_packet.append(0)
    else:
        rtts.append(np.nan)  # NaN leaves a gap in the scatter instead of a fake point
        receive_latency.append(np.nan)
        send_latency.append(np.nan)
        lost_packet.append(1)
    index.append(count)
    count = count + 1

df = pd.DataFrame()
df['rtts'] = rtts
df['receive_latency'] = receive_latency
df['send_latency'] = send_latency
df['ts'] = ts
df['lost_packet'] = lost_packet
df['rolling_min'] = df['rtts'].rolling(100, 10).min().shift(-100)
df['rolling_max'] = df['rtts'].rolling(100, 10).max().shift(-100)
df['rolling_mean'] = df['rtts'].rolling(100, 10).mean().shift(-100)
df['date'] = df['ts'].astype('datetime64[ns]')  # wall clock is ns since the epoch
df['usecs_past_minute'] = df['date'].dt.microsecond
df['secs_past_minute'] = df['date'].dt.second
df['tenths_past_minute'] = df['secs_past_minute'] + round(df['usecs_past_minute'] / 1000000, 1)

# seconds past the minute where the Starlink satellite switch lands (the brown dots)
df.loc[df['secs_past_minute'].isin([12, 27, 42, 57]), 'starlink_switch'] = 1

print(df)

timeData = df.groupby('tenths_past_minute')[['rtts', 'lost_packet', 'receive_latency', 'send_latency']].mean()
with pd.option_context('display.max_rows', None,):
    print(timeData)

plt.figure()
plt.scatter(df['date'], df['rtts'], s=0.5, color='red')
plt.scatter(df['date'], df['receive_latency'], s=0.5, color='blue')
plt.scatter(df['date'], df['send_latency'], s=0.5, color='green')
# sum of loss flags over 100 packets = loss % per window, shifted to align with its window
plt.scatter(df['date'], df['lost_packet'].rolling(100).sum().shift(-100), s=0.5, color='orange')
plt.scatter(df['date'], df['starlink_switch'], s=2, color='brown')
plt.title('gba Atlanta Starlink RTT')
plt.xlabel('Time')
plt.ylabel('Latency (ms)')
#plt.xticks(rotation=45)
plt.grid()
ax = plt.gca()
ax.xaxis.set_major_locator(mdates.SecondLocator(interval=1))
plt.savefig(output_filename)
plt.show()
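If you grab the zipped capture rather than the raw .json, a small helper reads the JSON straight out of the archive without extracting it first (the archive layout here is an assumption — adjust for however the download is actually packaged):

```python
import json
import zipfile

def load_irtt_json(zip_path):
    """Parse the first member of a zip archive as irtt JSON output."""
    with zipfile.ZipFile(zip_path) as zf:
        with zf.open(zf.namelist()[0]) as member:
            return json.load(member)
```

Then something like `data = load_irtt_json('irtt_output.zip')` (hypothetical filename) gives you the same `data['round_trips']` structure the script's loop consumes.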
