CNES Subnet Compression on SWOT data
This dataset contains the raw data from the execution of the testbench on the SWOT mission products. It consists of four HDF5 files:

• signal 1 (s1): synthetic data (427.8 MB)
• signal 3D (s3D): synthetic data (4.2 MB)
• 12 fields of real-world measurements (10.4 MB), pixel_cloud: classification, coherent_power, cross_track, dheight_dphase, dlatitude_dphase, dlongitude_dphase, height, illumination_time, incidence_angle, latitude, longitude, pixel_area
• 7 fields of experimental data (11.4 MB), SWOT_L2: cross_track, height, illumination_time, latitude, longitude, pixel_area, range_index
• 9 fields of experimental data (5.2 MB), pixel_cloud: continuous_classification, num_med_looks, num_rare_looks, phase_noise_std, power_left, power_right, sigma0, x_factor_left, x_factor_right
• 1 field of experimental data (10.4 MB): ifgram

To estimate the compression performance on each type of data, every field was extracted and compression/decompression was performed on each of them in a testbench derived from LZBench (see the "Related Links" section). Field extraction is performed with HDFtools and generates a binary file encoded in the original format of the related field. The testbench, which was modified to add algorithms such as SLx, is then run on each extracted field. The results are compared to the in-memory data copy memcpy, which provides no compression but maximum throughput (4.2 GB/s measured).

The testbench was executed on a workstation with an Intel(R) Xeon(R) CPU E5-2620 v3 processor running at 2.40 GHz and 64 GB of memory. The "Related Links" section provides the additional codes and articles used to perform this work.
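The per-field measurement can be sketched as follows. This is a minimal illustration, not the actual testbench: it uses Python's stdlib zlib as a stand-in for the benchmarked codecs (SLx and the LZBench algorithms are not reproduced here), and the sample buffer stands in for a binary field extracted with HDFtools.

```python
import time
import zlib

def benchmark_field(raw: bytes, level: int = 6):
    """Compress/decompress one extracted field; return ratio and throughput."""
    t0 = time.perf_counter()
    compressed = zlib.compress(raw, level)
    t1 = time.perf_counter()
    restored = zlib.decompress(compressed)
    assert restored == raw  # lossless round trip, as in the testbench
    ratio = len(raw) / len(compressed)          # uncompressed / compressed
    mb_per_s = (len(raw) / 1e6) / max(t1 - t0, 1e-9)  # compression throughput
    return ratio, mb_per_s

# Stand-in for one extracted binary field (e.g. pixel_cloud/height):
field = bytes(range(256)) * 4096  # 1 MiB of synthetic, compressible data
ratio, speed = benchmark_field(field)
print(f"ratio={ratio:.2f}, compress throughput={speed:.0f} MB/s")
```

In the actual testbench, the same loop is run once per codec and per field, with memcpy as the no-compression baseline.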
Steps to reproduce
To make it easier to dig into the data, we advise storing the results in an SQL database, which allows the use of GROUP BY SQL queries.
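A minimal sketch of this approach, using Python's built-in sqlite3. The schema, column names, and sample rows below are assumptions for illustration, not the actual result format of the testbench.

```python
import sqlite3

# Hypothetical schema for the testbench results (column names are assumed):
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE results (
        field      TEXT,   -- e.g. 'pixel_cloud/height'
        algorithm  TEXT,   -- e.g. 'memcpy', 'zlib'
        ratio      REAL,   -- uncompressed size / compressed size
        speed_mbps REAL    -- compression throughput in MB/s
    )
""")
rows = [  # illustrative values only
    ("pixel_cloud/height", "memcpy", 1.00, 4200.0),
    ("pixel_cloud/height", "zlib",   1.80,  120.0),
    ("SWOT_L2/latitude",   "memcpy", 1.00, 4200.0),
    ("SWOT_L2/latitude",   "zlib",   2.10,  110.0),
]
conn.executemany("INSERT INTO results VALUES (?, ?, ?, ?)", rows)

# GROUP BY query: mean compression ratio per algorithm across all fields
for algo, avg_ratio in conn.execute(
    "SELECT algorithm, AVG(ratio) FROM results "
    "GROUP BY algorithm ORDER BY algorithm"
):
    print(f"{algo}: {avg_ratio:.2f}")
```

The same pattern (GROUP BY field, or GROUP BY algorithm) lets you aggregate the raw measurements per data type or per codec.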