Synchronizing Time Taggers via White Rabbit Technology

Processing data obtained from multiple remote locations is crucial in numerous high-precision applications, such as distributed scientific experiments, large-scale networks, financial operations and global positioning systems. In these contexts, achieving precise time synchronization is essential for accurate data fusion and analysis. Synchronizing multiple Time Taggers over a Network is therefore highly desirable.

Each Time Tagger has its own internal clock, which is very stable and allows the device to reach its specified timing jitter. However, the frequency of each device is likely to deviate slightly from the others. Once we attempt to correlate signals from different Time Taggers, this small frequency difference becomes very visible, as it defines the time base of the recording. We can still perform such correlations if we ensure that the time bases of all devices are equal. This can be achieved by supplying a common external clock signal to each Time Tagger via its external clock feature. In the case of remote locations, one needs to either use stable and well-known reference frequencies from sources like atomic clocks or deliver a common clock signal by other means.

In this solution, we provide further guidance on setting up a simple Network of synchronized Time Taggers, supporting the information and the results presented in our recent Application Note.

What do we need?

We synchronize our remote Time Taggers using White Rabbit (WR) technology. Namely, we leverage the White Rabbit Lite Embedded Node (WR-LEN) components from Safran, which locally generate output signals (1PPS, 10 MHz) that the Time Taggers take as input for synchronization. The user can still follow this guide with any other component relying on a different technology (e.g. GNSSDO), as long as it outputs a time signal (e.g. 1PPS) and a frequency signal (e.g. 10 MHz). In our specific case, the WR-LEN is a solution that brings subnanosecond accuracy and picosecond precision to WR daisy chains, where each WR-LEN board receives synchronization from a higher level of the hierarchy and provides it to a lower level. Transferring time and phase over 1 Gb optical fiber Ethernet, the WR-LEN provides dynamic calibration over distances of up to tens of kilometers, and it scales even further. Moreover, the WR-LEN's versatility allows the user to transfer time and frequency in Master-Slave and Master-Master modes.

  • 2x Time Taggers along with USB cables, Power Supplies and SMA cables.
  • 2x PCs and Ethernet cables.
  • 2x White Rabbit Lite Embedded Node (WR-LEN) boards from Safran (link) and 5 V / 1.2 A DC power supplies.
  • 1x SFP Module for 1000BASE-BX10 fiber optic interface (LC Simplex for 9/125 µm cable) with 1 Gb networks. ONU transceiver (1310 nm Tx, 1490 nm Rx): e.g. AXGE-1254-0531 (blue) in link
  • 1x SFP Module for 1000BASE-BX10 fiber optic interface (LC Simplex for 9/125 µm cable) with 1 Gb networks. OLT transceiver (1490 nm Tx, 1310 nm Rx): e.g. AXGE-3454-0531 (purple) in link
  • Fiber Optic Cable Single-mode, Simplex, LC/APC-LC/APC (<50 m for test: link)

We demonstrate synchronization between Time Taggers using a test signal generated by our Pulse Streamer 8/2 and split with a power splitter (e.g. link).

Experimental Protocol

First of all, the WR-LEN nodes need to be synchronized. To this end, we link them with the above-mentioned SFP transceivers and fiber cable. It is really important to use the proper SFP at the right port, i.e., the blue SFP on the slave port and the purple one on the master side.

To set up the basic scheme, perform the following steps:

  • Firstly, we connect the SFP transceivers and the fiber cable to synchronize the WR-LENs. Please notice that by default the WR-LEN is configured as slave at port 1 and as master at port 2 (see Figure 1 in the Application Note).
  • Then we power the WR-LENs through their +5 V connectors.
  • We wait about a minute until the slave device reaches synchronization. The green LED above the SFP ports will start to blink.

We connect the 10 MHz and 1PPS signals from each node to the corresponding Time Tagger, as shown in the figure of the Application Note. We ensure the input voltages of these signals are within the specified signal input range of your Time Tagger model.

To demonstrate the synchronization between the Time Taggers, the idea is to input correlated test signals to both devices and measure the cross-correlation. In our test, the test signals originate from a split signal generated by a single Pulse Streamer 8/2 and are therefore free of relative jitter.

Measurement

Our Time Tagger software is currently (version <= 2.17) unable to treat multiple remote Time Taggers (not connected with our Synchronizer) as a single synchronized group. Therefore, when analyzing data from multiple remote Time Taggers, it is required to first save all incoming data and merge them into a single file. Only then is it possible to process the data using our TimeTaggerVirtual class, which takes as input a file containing all the merged data.

In this section, we use the White Rabbit Network for clock distribution, while we use our local Network for data transfer. Please see the dedicated section if you want to transmit data via the optical link instead.

The first step is to start a Time Tagger server on the server PC. This can be done using our GUI Time Tagger Lab or the Web Interface, or programmatically, e.g. employing Python. In our installation folder, we provide a dedicated example (server.py). The default path is: C:\Program Files\Swabian Instruments\Time Tagger\examples\python\8-NetworkTimeTagger. The same example is also available for MATLAB and LabVIEW.
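For illustration, a minimal programmatic sketch of the server side could look like the following. The access mode and port used here are assumptions on our part; please refer to the bundled server.py example for the exact configuration shipped with your software version.

# Minimal sketch: expose a locally connected Time Tagger as a network server
from TimeTagger import createTimeTagger, freeTimeTagger, AccessMode

tagger = createTimeTagger()  # connect to the locally attached Time Tagger
tagger.startServer(access_mode=AccessMode.Control, port=41101)  # assumed access mode and default port
input('Server running, press Enter to stop...')
tagger.stopServer()
freeTimeTagger(tagger)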

Once the server is running, we create a TimeTagger instance and a TimeTaggerNetwork instance on the client PC. We can also use the scanTimeTaggerServer() function to search for Time Tagger servers in the local network, as done in the client.py example.


from TimeTagger import *  # provides all Time Tagger classes and functions used below

# Create a local Time Tagger instance (master)
tt_master = createTimeTagger('', Resolution.Standard)
# Create a TimeTaggerNetwork instance and connect to the remote server (slave)
tt_slave = createTimeTaggerNetwork('172.16.42.236:41101')

At this stage, we generate our test signal. We stream an endless sequence of rectangular pulses with a 100 ns period (10 MHz). Next, we declare the input channels of the Time Taggers used, adjust the hardware settings and enable the software clock.
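As a minimal sketch of this step, the test signal could be generated with the Pulse Streamer Python client as shown below; the IP address and the digital output channel are placeholders and need to be adapted to your setup.

# Sketch: stream an endless 10 MHz square wave (50 ns high / 50 ns low) on one output
from pulsestreamer import PulseStreamer

ps = PulseStreamer('192.168.1.100')  # placeholder IP address of the Pulse Streamer
pattern = [(50, 1), (50, 0)]         # (duration in ns, level): 100 ns period
seq = ps.createSequence()
seq.setDigital(0, pattern)           # placeholder: digital output channel 0
ps.stream(seq, PulseStreamer.REPEAT_INFINITELY)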


# Test signal assumed to be streaming (e.g. generated as sketched above)

# Use the same channels for the two Time Taggers
test_signal = 14 # Split test signal
WR_PPS = 9 # 1PPS signal
WR_clock = 5 # 10 MHz clock

# Adjust the Hardware settings as you like
tt_master.setTriggerLevel(test_signal, 0.3)
tt_slave.setTriggerLevel(test_signal, 0.3)

# Define the time base using the software clock on the WR clock channels
tt_master.setSoftwareClock(input_channel=WR_clock, input_frequency=1e7, averaging_periods=1e3)
tt_slave.setSoftwareClock(input_channel=WR_clock, input_frequency=1e7, averaging_periods=1e3)

Since we are not interested in permanently storing the raw time tags, we create a temporary folder as storage for the FileWriter files. Then, we dump the time tags.


import os
import tempfile
from time import sleep

tempdir = tempfile.TemporaryDirectory()
filename_master = tempdir.name + os.sep + "tagger_master.ttbin"
filename_slave = tempdir.name + os.sep + "tagger_slave.ttbin"
filename_merged = tempdir.name + os.sep + "tagger_merged.ttbin"
filenames = [filename_master, filename_slave]

# Dump the time tags into the two ttbin files
fw_master = FileWriter(tagger=tt_master, filename=filenames[0], channels=[test_signal, WR_PPS])
fw_slave = FileWriter(tagger=tt_slave, filename=filenames[1], channels=[test_signal, WR_PPS])

print('Writing')
sleep(10)  # record for 10 seconds

# Stop writing by deleting the FileWriter instances
del fw_master
del fw_slave

The timestamp of each event is relative to the initialization of the Time Tagger detecting it. Since we create the two Time Tagger instances at the software level at two different instants, we need to re-align the corresponding time bases. To determine the time offset between the time bases, we use the first occurrence of the 1PPS signal on each device.


import numpy as np

# Read the first occurrence of the 1PPS signal in each file
first_PPS = []
for file in filenames:
    fr = FileReader(file)
    data = fr.getData(int(1e8))
    channels, timestamps = data.getChannels(), data.getTimestamps()
    index = np.argmax(channels == WR_PPS)  # Find the first occurrence of the PPS signal
    first_PPS.append(timestamps[index])
    del fr
print("First occurrences of 1PPS are respectively at: ", first_PPS, "ps")

Now we are ready to merge the ttbin files, shifting the timestamps of each of them by the corresponding timestamp of the first 1PPS occurrence.


# Merge the ttbin files using the 1PPS signal offsets
channel_offsets = [0, 100]
mergeStreamFiles(filename_merged,
                 input_filenames=filenames,
                 channel_offsets=channel_offsets,
                 time_offsets=[-t for t in first_PPS],
                 overlap_only=False)

# In the merged file, the slave channel numbers are shifted by its channel offset
test_signal_master = test_signal + channel_offsets[0]
test_signal_slave = test_signal + channel_offsets[1]

# Release the hardware Time Taggers, only the dumped data is needed from now on
freeTimeTagger(tt_master)
freeTimeTagger(tt_slave)

Finally, we replay the time tags to measure the cross-correlation using the TimeTaggerVirtual.


# Measure the correlation offline using a TimeTaggerVirtual
virtual_tagger = createTimeTaggerVirtual()

corr_replay = Correlation(tagger=virtual_tagger,
                          channel_1=test_signal_slave,
                          channel_2=test_signal_master,
                          binwidth=1, n_bins=1000)

# Replay as fast as possible
virtual_tagger.setReplaySpeed(speed=-1)
# Start the replay
virtual_tagger.replay(file=filename_merged)

# Wait until the whole file has been processed, then retrieve the result
virtual_tagger.waitForCompletion()
index = corr_replay.getIndex()
data = corr_replay.getData()

Please note that remote synchronization of Time Taggers can also be achieved in other ways, such as with GPS-disciplined oscillators. In this case, the accuracy is above 1 ns, so the PPS signals can be tens of ns apart. Moreover, using different cable lengths gives rise to an external delay that needs to be compensated. In these scenarios, it is more convenient to first replay a coarse correlation with a larger binwidth in order to quantify the inter-channel delay as the position of the correlation peak. This value can then be used to align the correlated signals using the setInputDelay() method. Afterward, a finer correlation can be replayed using a smaller binwidth.
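A possible sketch of this two-step procedure, reusing the variables defined above, could look as follows; the coarse binwidth and number of bins are arbitrary choices and should be adapted to the expected delay range.

# Coarse correlation to locate the inter-channel delay (1 ns bins over +/- 10 us)
coarse = Correlation(tagger=virtual_tagger,
                     channel_1=test_signal_slave,
                     channel_2=test_signal_master,
                     binwidth=1000, n_bins=20000)
virtual_tagger.replay(file=filename_merged)
virtual_tagger.waitForCompletion()

# Take the peak position as the delay and compensate for it
delay = coarse.getIndex()[np.argmax(coarse.getData())]
virtual_tagger.setInputDelay(test_signal_slave, -int(delay))

# Fine correlation with a smaller binwidth after the delay compensation
fine = Correlation(tagger=virtual_tagger,
                   channel_1=test_signal_slave,
                   channel_2=test_signal_master,
                   binwidth=1, n_bins=1000)
virtual_tagger.replay(file=filename_merged)
virtual_tagger.waitForCompletion()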

Time jitter using different Time Tagger models

We quantify the time error across Time Taggers 300 m apart, synchronized using WR-LEN end nodes, for different models and Resolution options. For this test, we apply a 1 MHz square wave (1 Vpp, 1 ns rise time) to one input channel of each Time Tagger using a power splitter, and we set the trigger level at 50% of the amplitude. The standard deviation of the correlation distribution measures the combined jitter of the two remote input channels. The RMS jitter of each individual channel is σ/√2.
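As an illustration with hypothetical numbers: if the correlation histogram between the two remote channels had a standard deviation of 4.2 ps, the per-channel RMS jitter would be 4.2 ps/√2 ≈ 3.0 ps.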

RMS jitter        | Standard | HighResA | HighResB | HighResC
Time Tagger Ultra | 6.8 ps   | 6.6 ps   | 5.9 ps   | 5.4 ps
Time Tagger X     | 3.0 ps   | -        | 2.7 ps   | -
Table: Results obtained using Time Tagger Ultra (TTU) units with hardware revisions 1.7 and 1.8, while both Time Tagger X (TTX) units have hardware revision 1.2.

Figure 1: Correlation between two channels of remote Time Tagger Ultra, synchronized using WR-LEN, obtained using the parameters specified in the main text.

Data Transfer over the Local Area Network/White Rabbit Network

The White Rabbit network integrates clock distribution and data transmission into a single layer, presenting a distinct advantage over other technologies, e.g. GPS-disciplined clock oscillators. This unified approach enhances precision and reliability while simplifying the infrastructure, making it more efficient and easier to manage. The ability to synchronize time and transmit data concurrently sets the White Rabbit network apart, providing superior performance and reducing the complexity typically associated with separate systems.

The data transmission over the optical link (WR) can be tested by disconnecting the PCs from the local network and connecting each WR-LEN node to its PC with an Ethernet cable. It is necessary to assign static IP addresses within the same subnet to the PCs. According to our test, based on the transfer_rate.py example that is in the installation folder (default path: C:\Program Files\Swabian Instruments\Time Tagger\examples\python\6-Various-Examples), the maximum data rate over the White Rabbit Network is around 13 MTags/s.
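Independently of that example, a rough client-side sanity check of the rate actually received over the link can be done by monitoring the count rate of a transferred channel. The server address and channel number below are taken from the example above and may need to be adapted.

# Rough check of the tag rate received by the client over the network link
from time import sleep
from TimeTagger import createTimeTaggerNetwork, Countrate, freeTimeTagger

tt = createTimeTaggerNetwork('172.16.42.236:41101')  # server address used above
rate = Countrate(tagger=tt, channels=[14])           # split test signal channel
sleep(1)                                             # average over one second
print('Received tag rate:', rate.getData()[0], 'tags/s')
freeTimeTagger(tt)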

Higher data rates can be achieved by using a local network for data transmission. If the local network uses 1 Gbit/s technology, the total outgoing data rate achievable from the server is roughly 25-30 MTags/s. This bandwidth is shared between all the clients.

If the local network uses 10 Gbit/s technology, our tests show that each client gets up to 40 MTags/s. In this case, the full bandwidth is about 300 MTags/s, so multiple clients can receive data at a 40 MTags/s rate simultaneously.

Include a White Rabbit Switch for a larger Network

Including a White Rabbit Switch (WRS) is essential for expanding the network to accommodate more components. However, it is important to consider that adding additional layers of White Rabbit components, such as switches and nodes, can increase time jitter. This increase results from the cumulative effects of multiple phase-locked loops (PLLs) and other synchronization mechanisms in each layer. While White Rabbit technology maintains subnanosecond accuracy and picosecond precision, these added layers can introduce minor delays that impact overall timing precision.

In this experiment, we use the White Rabbit Switch (Low Jitter version) from Safran to distribute the clock to two WR-LEN end nodes over two 300 m optical fibers. In this topology, the WRS acts as the master, while the nodes act as slaves. As before, the 10 MHz and 1PPS signals are used to synchronize the two remote Time Taggers. For two Time Tagger X, we achieve an RMS time jitter of 3.5 ps per channel. This demonstrates that adding a second layer only very slightly degrades the time precision, while allowing a network of up to 18 nodes directly connected to the Switch.

Figure 2: Correlation between two channels of remote Time Tagger X (HighRes), synchronized using two WR-LEN, connected to a WRS with 300 m optical link. The parameters of the test signal and the settings of the Time Taggers are the same as in the previous paragraphs.
