**Background**

To reduce the cost of SoS model verification, slicing techniques can be considered. We proposed SoS GaP slicer, a slicing framework for SoS goal models and simulation models that is based on SoS changes. The changed parts and change-related parts of the SoS model are extracted by backward slicing, and the slices are passed to the verification engine. SoS GaP slicer focuses on SoS change analysis, so it uses only change-related verification properties as its slicing criteria. However, SoS managers also need to analyze SoS models against other properties, such as existence, absence, universality, transient states, steady states, and minimum duration.

Applying dynamic slicing to the probabilistic SoS simulation model can increase the efficiency of statistical verification by reducing the size of the model. SMC and dynamic slicing are both performed on actual model executions. SMC verifies the model not by checking every execution path but by sampling several executions of the model. Slicing the probabilistic SoS simulation model based on several execution samples reduces the scope of the model that must be sliced. By contrast, if we performed static slicing on the entire SoS model, the cost of analysis would be substantially higher. However, existing dynamic slicing approaches do not generate executable and accurate slices of probabilistic SoS simulation models. JavaSlicer, a dynamic slicing tool, cannot generate executable slices because the tool was designed to profile Java programs for parallelism. Another slicing approach, observation-based slicing (ORBS), compares the intended simulation trajectories of the subject programs to the trajectories of their modified versions. When applying ORBS directly to probabilistic SoS simulation models, which contain uncertainty, we cannot compare the simulation trajectories because they vary between executions due to that uncertainty.

We propose a general analysis approach that slices probabilistic SoS simulation models based on user-specified variables, statements, and verification properties. We utilize dynamic backward slicing to generate accurate slices and make those slices executable through modified observation-based slicing. This hybrid approach complements the limitations of the JavaSlicer and ORBS approaches. We have implemented the proposed hybrid approach as a model slicer module of Simulation-based Verification and Analysis (SIMVA) for SoS, an integrated tool for the statistical verification of SoS. The input of the proposed approach, the probabilistic SoS simulation model, is a Java program, whereas the output is an executable slice of that model that enables the efficient use of SMC.

**Overall Approach/Method**

*Figure: Overall approach*

The goal of the proposed approach is to generate an SoS simulation model slice based on the SoS verification properties or analysis criteria. The above figure shows the overall approach of the SoS simulation model slicer, which comprises two parts: 1) dynamic backward slicing (the upper parts, indicated by dotted lines) and 2) modified observation-based slicing (the lower parts, indicated by solid lines). After performing modified observation-based slicing (mORBS) repeatedly, we obtain executable probabilistic SoS simulation model slices as the output. Dynamic backward slicing produces an intermediate slice, which serves as the slicing criterion for mORBS. mORBS deletes instructions (inst.) according to the intermediate slice information rather than trajectory information.

Dynamic backward slicing is divided into four steps: inserting probes into the probabilistic SoS simulation model, extracting the simulated traces, slicing the traces based on the specified properties, and generating a closure slice of the probabilistic SoS simulation model.

The proposed approach inserts probes into the probabilistic SoS simulation model to analyze the control and data dependency graphs and to observe which instructions are executed while the model runs. Probe insertion into the probabilistic SoS simulation model is performed mainly at the Java bytecode level. When the probabilistic SoS simulation model with the probes is executed, JavaSlicer outputs a simulation trace in the form of a log. Backward slicing is then performed on the extracted simulation traces. The slicing criteria for the simulation traces can be either variables or statements for model analysis, or verification properties for statistical verification. Slicing based on variables or statements follows the general backward slicing technique. Slicing based on verification properties additionally involves extracting the variables that appear in the verification properties before performing backward slicing. The slicing yields an intermediate result (a closure slice), which becomes the input for the next step. As described earlier in Section III, ORBS repeats the deleting-observing-reverting procedure. The difference between the original ORBS and mORBS lies in the deleting and observing steps. Unlike ORBS, mORBS is guided by the closure slice as to which statements to delete in the deleting step. In the observing step, mORBS checks only for compilation errors instead of comparing the trajectories of an SoS model. In this way, mORBS accounts for the random variables' dependencies and can still generate executable models.
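As a concrete illustration, the backward pass over a simulation trace can be sketched as follows. This is a minimal toy, not SIMVA's or JavaSlicer's actual API: the trace-record shape and method names are hypothetical, and only data dependences are followed, whereas the real approach also uses control dependences.

```java
import java.util.*;

// Toy sketch of dynamic backward slicing over an execution trace.
// TraceEntry and backwardSlice are illustrative names, not the tool's API.
public class DynamicBackwardSlice {

    // One probe record: which statement ran, what it defined, what it used.
    record TraceEntry(int stmtId, String def, Set<String> uses) {}

    // Walk the trace backwards from the slicing criterion (a variable of
    // interest), collecting every statement the criterion transitively
    // depends on via data dependences. The result is the closure slice.
    static Set<Integer> backwardSlice(List<TraceEntry> trace, String criterion) {
        Set<String> relevant = new HashSet<>(Set.of(criterion));
        Set<Integer> slice = new TreeSet<>();
        for (int i = trace.size() - 1; i >= 0; i--) {
            TraceEntry e = trace.get(i);
            if (e.def() != null && relevant.contains(e.def())) {
                slice.add(e.stmtId());
                relevant.remove(e.def());   // most recent definition found
                relevant.addAll(e.uses());  // its operands become relevant
            }
        }
        return slice;
    }

    public static void main(String[] args) {
        // Trace of a toy simulation run: patients is derived from rand,
        // result from patients; logCount is unrelated to the criterion.
        List<TraceEntry> trace = List.of(
            new TraceEntry(1, "rand", Set.of()),
            new TraceEntry(2, "patients", Set.of("rand")),
            new TraceEntry(3, "logCount", Set.of()),
            new TraceEntry(4, "result", Set.of("patients")));
        // Slicing on "result" keeps statements 1, 2, 4 and drops 3.
        System.out.println(backwardSlice(trace, "result")); // [1, 2, 4]
    }
}
```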

In the mORBS part, instructions are deleted one by one, starting from the last instruction of the probabilistic SoS simulation model. If an instruction exists in the closure slice, the proposed approach skips its deletion and moves on to the next instruction. After each deletion, the probabilistic SoS simulation model slice is compiled; if a compilation error occurs, the proposed approach reverts the deleted instruction. This compile-and-revert step is necessary because the compiled Java bytecode loses some source-level information, such as where curly brackets ({, }) were written in the model, so the closure slice does not contain that information. The process repeats until there are no more instructions to delete. After completing all the steps of the slicing approach, an executable SoS simulation model slice is created with respect to the verification property.
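The delete-compile-revert loop can be sketched as below. This is a toy under stated assumptions, not the actual mORBS implementation: real mORBS compiles the Java slice, whereas here compilation is stood in for by a simple brace-balance check, and the closure slice is represented as a set of line indices.

```java
import java.util.*;

// Toy sketch of the mORBS delete-compile-revert loop. The closure slice
// from dynamic backward slicing tells the loop which lines must be kept;
// checkCompiles is a stand-in for real compilation (hypothetical helper).
public class MOrbsSketch {

    // Stand-in for "the slice still compiles": braces stay balanced.
    static boolean checkCompiles(List<String> lines) {
        int depth = 0;
        for (String line : lines)
            for (char c : line.toCharArray()) {
                if (c == '{') depth++;
                if (c == '}' && --depth < 0) return false;
            }
        return depth == 0;
    }

    // Delete lines one by one from the last line upwards, skipping lines
    // that the closure slice marks as needed; revert any deletion that
    // breaks compilation (e.g. a structurally required brace).
    static List<String> slice(List<String> model, Set<Integer> closure) {
        List<String> work = new ArrayList<>(model);
        for (int i = work.size() - 1; i >= 0; i--) {
            if (closure.contains(i)) continue;              // keep: in closure slice
            String removed = work.remove(i);
            if (!checkCompiles(work)) work.add(i, removed); // revert deletion
        }
        return work;
    }

    public static void main(String[] args) {
        List<String> model = List.of(
            "class Sim {",            // 0: brace needed, revert protects it
            "  int logCount = 0;",    // 1: deletable, not in closure slice
            "  int patients = 2;",    // 2: kept, in closure slice
            "}");                     // 3: brace needed, revert protects it
        // Only the logCount line is deleted; the rest survive.
        System.out.println(slice(model, Set.of(2)));
    }
}
```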

**Results**

*Figure: Slicing result*

We compared the sliced model with a manually generated model slice, produced per the static slicing algorithm [13], line by line. Because ORBS operates on a line-by-line basis, differences between the models can arise from the deletion of lines. We used the "compare" function of Notepad++ to perform the comparison. In the above figure, the left model is the manually generated model slice, and the right model is the SoS model slice generated by the proposed technique. Lines that are absent from the slice on the right, compared with the manually generated model on the left, are colored red. The formula for the accuracy measure was as follows:

Accuracy (%) = (number of matching lines / number of lines in the manually generated slice) × 100

As a result, 96.55% of the lines matched the manually generated model slice. The remaining 3.45% was caused by the nature of sampling-based execution. Although slicing the SoS model based on simulation traces improves efficiency, it can also degrade accuracy to some extent. For example, lines 18 to 22 in the above figure show that the number of patients is doubled with a one-in-ten-thousand chance. With the number of patients determined, the simulation is executed in lines 23 and 24. Per the static backward slicing algorithm, line 21 should not be deleted (left-hand side in the above figure). However, the SoS model slice generated by the proposed technique deletes line 21 because that line is executed only with a rare probability and therefore does not appear in the simulation trace.
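Under the reading that accuracy is the fraction of the manually generated slice's lines that also appear in the generated slice, the measure can be sketched as follows. The class and method names are hypothetical; the actual comparison was performed with Notepad++.

```java
import java.util.*;

// Sketch of the line-by-line accuracy measure: the percentage of the
// manual slice's lines that also appear in the generated slice.
public class SliceAccuracy {

    static double accuracy(List<String> manual, List<String> generated) {
        Set<String> gen = new HashSet<>(generated);
        long matched = manual.stream().filter(gen::contains).count();
        return 100.0 * matched / manual.size();
    }

    public static void main(String[] args) {
        List<String> manual = List.of("a", "b", "c", "d");
        // One rare-path line (as in the doubling example) is missing.
        List<String> generated = List.of("a", "b", "c");
        System.out.println(accuracy(manual, generated)); // 75.0
    }
}
```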

When slicing for verification rather than for analysis, it is necessary to confirm that the verification result of the model slice matches that of the original model. After slicing the original model based on the slicing criteria and verifying both the original and sliced models against the six verification properties, we compared the verification results; the maximum difference was 2%. We attribute this difference to the characteristics of statistical verification: because the input model of the statistical model checker contains uncertainty, the verification results may vary slightly across simulation executions.

**Related Publications**

- Song, Jiyoung, et al. “Slicing Executable System-of-Systems Models for Efficient Statistical Verification.”
*2019 SESoS*. IEEE, 2019.

**Related Tools**

- See <SoSSimulationModelSlicer>