Artefact for: Automatic Synthesis of Transiently Correct Network Updates via Petri Games

  • Martin Didriksen (Author)
  • Peter Gjøl Jensen (Author)
  • Jonathan F. Jønler (Author)
  • Andrei-Ioan Katona (Author)
  • Sangey D.L. Lama (Author)
  • Frederik B. Lottrup (Author)
  • Shahab Shajarat (Author)
  • Jiří Srba (Author)

Dataset

Description

This is the artefact for the paper "Automatic Synthesis of Transiently Correct Network Updates via Petri Games".

Requirements

The setup is built for an x86_64 machine running Ubuntu 20.04 with python3 and the following pip3 package installed: networkx. The original experiments were conducted on AMD EPYC 7551 processors with hyperthreading disabled and limited to 25 GB of memory.

Setup

To create the folder structure needed for running the experiments, start by running mkdirs.sh. The networkx library can be installed (in user space) by executing:

  pip3 install --user networkx

All binaries are included, and for repeatability we also include the source code of the verifypn version used in these experiments. We use the following revision of the verifypn tool: https://bazaar.launchpad.net/~verifypn-cpn/verifypn/verifypn-games/revision/269. The source code of revision 269 is located in the verifypn subfolder.

Running experiments

All models come pre-built. The experiments can be executed sequentially by the following commands:

  ./solve_zoo.sh        # experiments on the original, non-nested zoo topologies
  ./solve_nested.sh     # nested zoo topology experiments
  ./solve_synthetic.sh  # experiments on synthetic networks

Memory, time, and execution environment can be configured by setting the MEMORY, TIME, and EXECUTOR variables in bash prior to execution. For instance, to limit each single execution to 60 seconds and 500 MB of memory, and to run using sbatch, the following can be used:

  export MEMORY=$((500*1024))  # memory in KB
  export TIME=60               # time in seconds
  export EXECUTOR=sbatch       # to run on a slurm-enabled cluster
  ./solve_zoo.sh               # experiments on the original, non-nested zoo topologies
  ./solve_nested.sh            # nested zoo topology experiments
  ./solve_synthetic.sh         # experiments on synthetic networks

Completing the entire experimental setup with a one-hour timeout on a single core takes more than 24 hours.
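How the solve scripts enforce the limits internally is not documented here; a minimal sketch of how a MEMORY (in KB) and TIME (in seconds) limit can be applied to a single invocation, using standard ulimit and coreutils timeout (the run_limited helper is hypothetical, not part of the artefact):

```shell
MEMORY=$((500*1024))   # 500 MB expressed in KB, matching the artefact's convention
TIME=60                # per-run timeout in seconds

# Run a single command under the virtual-memory and wall-clock limits.
# The subshell keeps the ulimit from affecting the calling shell.
run_limited() {
  ( ulimit -v "$MEMORY"; exec timeout "$TIME" "$@" )
}

run_limited echo "hello"
```

The actual scripts may apply limits differently (e.g. via sbatch resource flags when EXECUTOR=sbatch); this only illustrates the meaning of the two variables.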
Using a 10-second timeout, a substantial portion of the experiments can be repeated within a few hours.

Data collection

After the execution, data can be collected using the ./extract_all.sh script, which generates .csv files for each subfigure of Figure 8 of the paper. The data reported in the .csv files is in milliseconds for time and kilobytes for memory. Notice that the data collection by ./extract_all.sh can take a few minutes to complete. Notice also that the data folder is pre-populated with the results from the paper; these will be overwritten by a subsequent execution.

These .csv files can be turned into the graphs of the paper using the scripts provided in the plot subfolder (this requires that gnuplot is installed). From there, plots can be generated by running plot_all.sh for time plots, or plot_all_mem.sh for memory plots (not shown in the paper). Notice that Figure 9 from the paper corresponds to (a) cactus-disjoint.pdf, (b) cactus-dependent_single.pdf, (c) cactus-dependent_10.pdf, (d) cactus-dependent_5.pdf, (e) shared-single.pdf and (f) cactus-nested.pdf. The remaining plots are from experiments not presented in the paper, as they show trends similar to those included in the paper.

Notice that the concrete values given in the paper were found by manual inspection of the generated .csv files. We used standard spreadsheet software, e.g. the COUNTIF and AVERAGEIF functions of LibreOffice Calc, to help with this task.

The raw result output is postfixed with the engine generating the given result file. Computed update sequences can be found directly in these raw outputs for netsynth, while the sequences for verifypn are currently written to /dev/null, as the verbose output format of verifypn makes several of the strategies (implicitly update sequences) take up several gigabytes of space. The strategies of verifypn are still constructed and converted into a textual representation, and the overhead of doing so is included in the reported runtimes.
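The COUNTIF/AVERAGEIF spreadsheet step can also be scripted. A minimal python3 sketch of the same aggregation, using only the standard library; the column names and sample values below are hypothetical and may not match the actual layout of the generated .csv files:

```python
import csv
import io

# Hypothetical sample in the spirit of the generated .csv files;
# real column names and units (ms for time) may differ.
SAMPLE = """engine,time,answer
verifypn,120,solved
verifypn,340,solved
netsynth,900,unsolved
"""

def countif(rows, column, value):
    """COUNTIF-style: number of rows whose `column` equals `value`."""
    return sum(1 for r in rows if r[column] == value)

def averageif(rows, cond_col, value, avg_col):
    """AVERAGEIF-style: mean of `avg_col` over rows where `cond_col` equals `value`."""
    selected = [float(r[avg_col]) for r in rows if r[cond_col] == value]
    return sum(selected) / len(selected)

rows = list(csv.DictReader(io.StringIO(SAMPLE)))
print(countif(rows, "answer", "solved"))              # 2
print(averageif(rows, "engine", "verifypn", "time"))  # 230.0
```

For the real files, replace io.StringIO(SAMPLE) with open("some-result.csv") and adjust the column names to those produced by ./extract_all.sh.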
Notice that there is currently no uniform output format for the update sequences. However, providing an API for communicating directly with verifypn is fairly straightforward. Lastly, the consistency of the answers between netsynth and verifypn on the "nested" experiment can be checked by the consistent.sh script.

Modifying the experiments

Several parts of the experiments can be modified by changing the values of the generators. Notice that the artefact comes pre-loaded with a pre-generated set of models, so this step is optional.

Generate_Synthetic.py facilitates the generation of all the synthetic models of the paper. This script fills the data/synthethic_json folder.

Generate_Nested.py constructs nested topologies from an existing set of .gml files. Specifically, it iterates through the contents of data/gml/ and randomly "subnets" networks into each other. The output is a new set of .gml files located in data/nested_gml/.

Generate_Json.py reads the folders data/gml/ and data/nested_gml/ and randomly creates a set of synthesis problems based on the input topologies. By default, the generated "source" and "target" routes are appended up to n times to make harder instances, where n ranges from 1 to 5. Notice that Generate_Json.py can take more than a day to execute, given the rather brute-force nature of the optimization problem solved for generating the random examples. The results of Generate_Json.py are placed in data/{zoo,nested}_json.

Notice also that it is not guaranteed that a "sane" update synthesis problem is generated for every input .gml file. To be a "sane" update synthesis problem, at least one waypoint must exist. The Generate_Json.py script makes several attempts to randomly generate such sane problems; however, this process can fail.

The last step is translation. The Translate.py script converts the .json files into both .pnml files for verifypn and .ltl files for netsynth.
The results are placed in the data/{synthethic,zoo,nested}_{ltl,pn}/ folders.
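As an illustration of the "sanity" condition above, the following pure-python sketch tests whether some candidate node works as a waypoint for both the source and the target route of a generated problem. All names here are hypothetical, and the actual checks inside Generate_Json.py may differ:

```python
def passes_waypoint(route, waypoint):
    """A route is a list of node names; a waypoint must lie strictly
    between the first (ingress) and last (egress) node of the route."""
    return waypoint in route[1:-1]

def is_sane(source_route, target_route, candidates):
    """Hypothetical sanity check: at least one candidate node is a
    waypoint on both the source route and the target route."""
    return any(passes_waypoint(source_route, w) and
               passes_waypoint(target_route, w)
               for w in candidates)

source = ["s", "a", "b", "t"]
target = ["s", "c", "b", "t"]
print(is_sane(source, target, ["a", "b", "c"]))  # True: "b" is on both routes
```

When no such node exists (e.g. when the two routes are disjoint except for their endpoints), the generator has to retry with a different random instance, which is why the generation process can fail.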
Date made available: 2021
Publisher: Zenodo
