Summary
Feed latency is the elapsed time, in seconds, since new data for a channel was last written to disk or memory at the IRIS DMC. It is measured once every four hours for all real-time data. Feed latency will be at or near zero while data are being steadily written to the data center disk buffer. Channels with lower sample rates generally show larger feed latencies because it takes longer to fill a SEED record.
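The sample-rate effect mentioned above can be illustrated with a quick calculation. The figure of roughly 400 samples per 512-byte miniSEED record is a hypothetical average chosen for illustration (actual capacity depends on compression); it is not taken from the metric specification.

```python
# Illustrative only: why lower sample rates imply larger feed latencies.
# Assume (hypothetically) ~400 compressed samples fit in one 512-byte
# miniSEED record; the record cannot be written until it is full.
SAMPLES_PER_RECORD = 400  # assumed average, not a SEED-mandated value

def record_fill_time(sample_rate_hz: float) -> float:
    """Seconds needed to accumulate one full record of samples."""
    return SAMPLES_PER_RECORD / sample_rate_hz

# A 100 sps broadband channel fills a record in seconds...
print(record_fill_time(100.0))  # 4.0
# ...while a 1 sps long-period channel needs several minutes.
print(record_fill_time(1.0))    # 400.0
```

Under these assumptions, the low-rate channel shows a baseline feed latency two orders of magnitude larger even when the feed is healthy.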
Uses
A steadily increasing feed latency may indicate a data feed loss on the sending end or a problem writing data on the receiving end. When data are backfilling, data latency may be significant while feed latency remains near zero.
Data Analyzed
Traces – one N.S.L.C (Network.Station.Location.Channel) per measurement
Window – zero; each measurement represents an instantaneous measurement
Data Source – IRIS BUD near-real time data cache
SEED Channel Types – All Time Series Channels
Algorithm
- For a specified N.S.L.C, request the current values of:
  - Ta: the time the latest sample became available on disk or in memory at the DMC
  - Tm: the time at which the latency metric measurement is calculated
- Calculate and report the feed latency:
  feed_latency = Tm - Ta
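The calculation above can be sketched in a few lines, assuming Ta and Tm are timezone-aware UTC timestamps. The function name and types are illustrative, not the MUSTANG implementation.

```python
from datetime import datetime, timezone

def feed_latency(ta: datetime, tm: datetime) -> float:
    """feed_latency = Tm - Ta, in seconds."""
    return (tm - ta).total_seconds()

# Hypothetical values: the latest sample was written 30 s before the
# measurement time.
ta = datetime(2024, 1, 1, 12, 0, 0, tzinfo=timezone.utc)   # Ta
tm = datetime(2024, 1, 1, 12, 0, 30, tzinfo=timezone.utc)  # Tm
print(feed_latency(ta, tm))  # 30.0
```

A healthy, steadily flowing channel would yield values at or near zero; a growing result across successive four-hour measurements suggests the feed problems described under Uses.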
Metric Values Returned
value – feed latency in seconds
target – the trace analyzed, labeled as N.S.L.C.Q (Network.Station.Location.Channel.Quality)
start – Tm: time that the latency measurement was taken (UTC)
end – same as start
lddate – date/time the measurement was made and loaded into the MUSTANG database (UTC)
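A returned measurement carries the five fields listed above. The sketch below parses one row of a CSV-style result into those fields; the sample text is fabricated illustrative data in that shape, not an exact MUSTANG service response.

```python
import csv
import io

# Hypothetical sample row using the documented fields:
# value, target (N.S.L.C.Q), start (Tm), end (= start), lddate.
SAMPLE = """value,target,start,end,lddate
12.0,IU.ANMO.00.BHZ.D,2024/01/01 12:00:00,2024/01/01 12:00:00,2024/01/01 12:05:00
"""

def parse_measurements(text: str) -> list[dict]:
    """Parse a CSV measurements result into one dict per row."""
    return list(csv.DictReader(io.StringIO(text)))

rows = parse_measurements(SAMPLE)
print(rows[0]["target"], rows[0]["value"])  # IU.ANMO.00.BHZ.D 12.0
```

Note that start and end are identical because each measurement is instantaneous (the window is zero), and lddate may trail start slightly since loading into the database happens after the measurement is taken.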
Notes
Data that were not archived as real-time data will not have latency metrics. All data in the real-time feeds carry SEED quality indicators of ‘R’ or ‘D’.