Commit 401179ea authored by Alexandre Duret-Lutz

Delete the cutscc algorithms.

These were used in old experiments, but have not proved useful in
practice.  Not worth keeping and maintaining.

* src/tgbaalgos/cutscc.cc, src/tgbaalgos/cutscc.hh: Delete.
* bench/scc-stats/, bench/split-product/: Delete.
* configure.ac, src/tgbaalgos/Makefile.am, README, bench/Makefile.am:
Adjust.
parent 1a93166d
@@ -181,8 +181,6 @@ bench/ Benchmarks for ...
ltl2tgba/ ... LTL-to-Büchi translation algorithms,
ltlcounter/ ... translation of a class of LTL formulae,
ltlclasses/ ... translation of more classes of LTL formulae,
scc-stats/ ... SCC statistics after translation of LTL formulae,
split-product/ ... parallelizing gain after splitting LTL automata,
spin13/ ... compositional suspension and other improvements,
wdba/ ... WDBA minimization (for obligation properties).
wrap/ Wrappers for other languages.
## Copyright (C) 2008, 2009, 2010, 2012, 2013 Laboratoire de Recherche
## Copyright (C) 2008, 2009, 2010, 2012, 2013, 2014 Laboratoire de Recherche
## et Développement de l'Epita (LRDE).
## Copyright (C) 2005 Laboratoire d'Informatique de Paris 6 (LIP6),
## département Systèmes Répartis Coopératifs (SRC), Université Pierre
@@ -19,5 +19,4 @@
## You should have received a copy of the GNU General Public License
## along with this program. If not, see <http://www.gnu.org/licenses/>.
SUBDIRS = emptchk ltl2tgba scc-stats split-product ltlcounter \
ltlclasses wdba spin13 dtgbasat
SUBDIRS = emptchk ltl2tgba ltlcounter ltlclasses wdba spin13 dtgbasat
## Copyright (C) 2009 Laboratoire de Recherche et Développement
## de l'Epita (LRDE).
##
## This file is part of Spot, a model checking library.
##
## Spot is free software; you can redistribute it and/or modify it
## under the terms of the GNU General Public License as published by
## the Free Software Foundation; either version 3 of the License, or
## (at your option) any later version.
##
## Spot is distributed in the hope that it will be useful, but WITHOUT
## ANY WARRANTY; without even the implied warranty of MERCHANTABILITY
## or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public
## License for more details.
##
## You should have received a copy of the GNU General Public License
## along with this program. If not, see <http://www.gnu.org/licenses/>.
AM_CPPFLAGS = -I$(srcdir)/../../src $(BUDDY_CPPFLAGS)
AM_CXXFLAGS = $(WARNING_CXXFLAGS)
LDADD = ../../src/libspot.la
noinst_PROGRAMS = \
stats
stats_SOURCES = stats.cc
bench: $(noinst_PROGRAMS)
./stats $(srcdir)/formulae.ltl
This directory contains the input files and the test program used to compute
basic statistics on TGBAs.
==========
CONTENTS
==========
This directory contains:
* formulae.ltl
A list of 96 handwritten formulae with their negations. They come
from three sources:
@InProceedings{ dwyer.98.fmsp,
author = {Matthew B. Dwyer and George S. Avrunin and James C.
Corbett},
title = {Property Specification Patterns for Finite-state
Verification},
booktitle = {Proceedings of the 2nd Workshop on Formal Methods in
Software Practice (FMSP'98)},
publisher = {ACM Press},
address = {New York},
editor = {Mark Ardis},
month = mar,
year = {1998},
pages = {7--15}
}
@InProceedings{ etessami.00.concur,
author = {Kousha Etessami and Gerard J. Holzmann},
title = {Optimizing {B\"u}chi Automata},
booktitle = {Proceedings of the 11th International Conference on
Concurrency Theory (Concur'00)},
pages = {153--167},
year = {2000},
editor = {C. Palamidessi},
volume = {1877},
series = {Lecture Notes in Computer Science},
address = {Pennsylvania, USA},
publisher = {Springer-Verlag}
}
@InProceedings{ somenzi.00.cav,
author = {Fabio Somenzi and Roderick Bloem},
title = {Efficient {B\"u}chi Automata for {LTL} Formul{\ae}},
booktitle = {Proceedings of the 12th International Conference on
Computer Aided Verification (CAV'00)},
pages = {247--263},
year = {2000},
volume = {1855},
series = {Lecture Notes in Computer Science},
address = {Chicago, Illinois, USA},
publisher = {Springer-Verlag}
}
* full.ltl
A list of 1000 large LTL formulae randomly generated with ./randtgba.
=======
USAGE
=======
Use the stats program:
Usage: ./stats ltl_file
where ltl_file is a file with a single LTL formula per line.
==========================
INTERPRETING THE RESULTS
==========================
Results can be found in file 'results'.
Here is the list of the measured values:
- Accepting Strongly Connected Components.
Total number of accepting SCCs.
- Dead Strongly Connected Components.
Total number of dead SCCs.
An SCC is dead if no accepting SCC is reachable from it
(see the sketch after this list).
- Accepting Paths.
Number of maximal accepting paths.
A path is maximal and accepting if it ends in an accepting
SCC that has only dead (i.e. non-accepting) successors, or no
successors at all.
- Dead Paths.
Number of paths to a terminal dead SCC.
A terminal dead SCC is a dead SCC without successors.
- Max Effective Splitting.
An indicator of the potential effectiveness of splitting the formula.
This is the maximum number of sub-automata that each contain at least
one state of their own.  Beyond this threshold, more sub-automata can
be generated, but all their states are already included in some of the
previous automata.
- Self Loops per State.
Number of self loops divided by the number of states.
A self loop is a transition from a state to itself.
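To make the notion of a dead SCC concrete, here is a small self-contained
sketch (it is not part of the benchmark, and every name in it is made up for
illustration): it marks each SCC of a condensation graph as dead or not,
assuming the SCC graph is given as an adjacency list with an accepting flag
per SCC and that successor SCCs always have larger indices.

  #include <iostream>
  #include <vector>

  // Mark each SCC of a condensation graph as dead, i.e. such that no
  // accepting SCC is reachable from it.  succ[i] lists the successor SCCs
  // of SCC i; accepting[i] tells whether SCC i is accepting.
  std::vector<bool> dead_sccs(const std::vector<std::vector<int> >& succ,
                              const std::vector<bool>& accepting)
  {
    int n = succ.size();
    std::vector<bool> dead(n);
    // The condensation graph is acyclic; since we assume successors have
    // larger indices, scanning SCCs backward processes them in reverse
    // topological order.
    for (int s = n - 1; s >= 0; --s)
      {
        bool d = !accepting[s];
        for (int t : succ[s])
          d = d && dead[t];
        dead[s] = d;
      }
    return dead;
  }

  int main()
  {
    // SCC 0 -> {1, 2}; SCC 1 is accepting; SCC 2 -> {3}; SCC 3 has no
    // successor.
    std::vector<std::vector<int> > succ = { {1, 2}, {}, {3}, {} };
    std::vector<bool> accepting = { false, true, false, false };
    std::vector<bool> dead = dead_sccs(succ, accepting);
    for (int i = 0; i < 4; ++i)
      std::cout << "SCC " << i
                << (dead[i] ? " is dead" : " is not dead") << '\n';
  }

With the example graph in main, SCCs 2 and 3 are reported dead while 0 and 1
are not, matching the definition above.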
For each of these measured values, we provide the following statistics,
computed over all the formulae in ltl_file (a small sketch of how they can
be computed follows the list):
- Min
- Max
- Mean
- Median
- Standard Deviation
\ No newline at end of file
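For illustration only, the following self-contained sketch shows how such
summary statistics can be computed from a sample; the summarize helper is
hypothetical and simply mirrors what the stats program's compute_and_print
function does.

  #include <algorithm>
  #include <cmath>
  #include <iostream>
  #include <vector>

  // Summarize a sample: minimum, maximum, mean, median, standard deviation.
  struct summary
  {
    double min, max, mean, median, stddev;
  };

  summary summarize(std::vector<double> v)
  {
    std::sort(v.begin(), v.end());
    unsigned n = v.size();
    double sum = 0.0;
    for (double x : v)
      sum += x;
    double mean = sum / n;
    double var = 0.0;
    for (double x : v)
      var += (x - mean) * (x - mean);
    var /= n;
    // Median of the sorted sample: the middle value, or the mean of the
    // two middle values when the sample size is even.
    double median = (n % 2) ? v[n / 2] : (v[n / 2 - 1] + v[n / 2]) / 2;
    return { v.front(), v.back(), mean, median, std::sqrt(var) };
  }

  int main()
  {
    summary s = summarize({ 1, 3, 3, 6, 7, 8, 9 });
    std::cout << "min=" << s.min << " max=" << s.max << " mean=" << s.mean
              << " median=" << s.median << " stddev=" << s.stddev << '\n';
  }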
[](!p0)
<>p1 -> (!p0 U p1)
[](p2 -> [](!p0))
[]((p2 & !p1 & <>p1) -> (!p0 U p1))
[](p2 & !p1 -> (!p0 U (p1 | []!p0)))
<>(p0)
!p1 U ((p0 & !p1) | []!p1)
[](!p2) | <>(p2 & <>p0)
[](p2 & !p1 -> (!p1 U ((p0 & !p1) | []!p1)))
[](p2 & !p1 -> (!p1 U (p0 & !p1)))
<>p1 -> ((!p0 & !p1) U (p1 | ((p0 & !p1) U (p1 | ((!p0 & !p1) U (p1 | ((p0 & !p1) U (p1 | (!p0 U p1)))))))))
[]((p2 & <>p1) -> ((!p0 & !p1) U (p1 | ((p0 & !p1) U (p1 | ((!p0 & !p1) U (p1 | ((p0 & !p1) U (p1 | (!p0 U p1))))))))))
[](p2 -> ((!p0 & !p1) U (p1 | ((p0 & !p1) U (p1 | ((!p0 & !p1) U (p1 | ((p0 & !p1) U (p1 | (!p0 U (p1 | []!p0)) | []p0)))))))))
[](p0)
<>p1 -> (p0 U p1)
[](p2 -> [](p0))
[]((p2 & !p1 & <>p1) -> (p0 U p1))
[](p2 & !p1 -> (p0 U (p1 | [] p0)))
!p0 U (p3 | []!p0)
<>p1 -> (!p0 U (p3 | p1))
[]!p2 | <>(p2 & (!p0 U (p3 | []!p0)))
[]((p2 & !p1 & <>p1) -> (!p0 U (p3 | p1)))
[](p2 & !p1 -> (!p0 U ((p3 | p1) | []!p0)))
[](p0 -> <>p3)
<>p1 -> (p0 -> (!p1 U (p3 & !p1))) U p1
[](p2 -> [](p0 -> <>p3))
[]((p2 & !p1 & <>p1) -> (p0 -> (!p1 U (p3 & !p1))) U p1)
[](p2 & !p1 -> ((p0 -> (!p1 U (p3 & !p1))) U (p1 | [](p0 -> (!p1 U (p3 & !p1))))))
<>p0 -> (!p0 U (p3 & !p0 & X(!p0 U p4)))
<>p1 -> (!p0 U (p1 | (p3 & !p0 & X(!p0 U p4))))
([]!p2) | (!p2 U (p2 & <>p0 -> (!p0 U (p3 & !p0 & X(!p0 U p4)))))
[]((p2 & <>p1) -> (!p0 U (p1 | (p3 & !p0 & X(!p0 U p4)))))
[](p2 -> (<>p0 -> (!p0 U (p1 | (p3 & !p0 & X(!p0 U p4))))))
(<>(p3 & X<>p4)) -> ((!p3) U p0)
<>p1 -> ((!(p3 & (!p1) & X(!p1 U (p4 & !p1)))) U (p1 | p0))
([]!p2) | ((!p2) U (p2 & ((<>(p3 & X<>p4)) -> ((!p3) U p0))))
[]((p2 & <>p1) -> ((!(p3 & (!p1) & X(!p1 U (p4 & !p1)))) U (p1 | p0)))
[](p2 -> (!(p3 & (!p1) & X(!p1 U (p4 & !p1))) U (p1 | p0) | [](!(p3 & X<>p4))))
[] (p3 & X<> p4 -> X(<>(p4 & <> p0)))
<>p1 -> (p3 & X(!p1 U p4) -> X(!p1 U (p4 & <> p0))) U p1
[] (p2 -> [] (p3 & X<> p4 -> X(!p4 U (p4 & <> p0))))
[] ((p2 & <>p1) -> (p3 & X(!p1 U p4) -> X(!p1 U (p4 & <> p0))) U p1)
[] (p2 -> (p3 & X(!p1 U p4) -> X(!p1 U (p4 & <> p0))) U (p1 | [] (p3 & X(!p1 U p4) -> X(!p1 U (p4 & <> p0)))))
[] (p0 -> <>(p3 & X<>p4))
<>p1 -> (p0 -> (!p1 U (p3 & !p1 & X(!p1 U p4)))) U p1
[] (p2 -> [] (p0 -> (p3 & X<> p4)))
[] ((p2 & <>p1) -> (p0 -> (!p1 U (p3 & !p1 & X(!p1 U p4)))) U p1)
[] (p2 -> (p0 -> (!p1 U (p3 & !p1 & X(!p1 U p4)))) U (p1 | [] (p0 -> (p3 & X<> p4))))
[] (p0 -> <>(p3 & !p5 & X(!p5 U p4)))
<>p1 -> (p0 -> (!p1 U (p3 & !p1 & !p5 & X((!p1 & !p5) U p4)))) U p1
[] (p2 -> [] (p0 -> (p3 & !p5 & X(!p5 U p4))))
[] ((p2 & <>p1) -> (p0 -> (!p1 U (p3 & !p1 & !p5 & X((!p1 & !p5) U p4)))) U p1)
[] (p2 -> (p0 -> (!p1 U (p3 & !p1 & !p5 & X((!p1 & !p5) U p4)))) U (p1 | [] (p0 -> (p3 & !p5 & X(!p5 U p4)))))
!p0 U ((p0 U ((!p0 U ((p0 U ([]!p0 | []p0)) | []!p0)) | []!p0)) | []!p0)
<>p2 -> (!p2 U (p2 & (!p0 U ((p0 U ((!p0 U ((p0 U ([]!p0 | []p0)) | []!p0)) | []!p0)) | []!p0))))
\ No newline at end of file
// -*- coding: utf-8 -*-
// Copyright (C) 2009, 2010, 2012, 2013 Laboratoire de Recherche et
// Développement de l'Epita (LRDE).
//
// This file is part of Spot, a model checking library.
//
// Spot is free software; you can redistribute it and/or modify it
// under the terms of the GNU General Public License as published by
// the Free Software Foundation; either version 3 of the License, or
// (at your option) any later version.
//
// Spot is distributed in the hope that it will be useful, but WITHOUT
// ANY WARRANTY; without even the implied warranty of MERCHANTABILITY
// or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public
// License for more details.
//
// You should have received a copy of the GNU General Public License
// along with this program. If not, see <http://www.gnu.org/licenses/>.
#include <queue>
#include <vector>
#include <unordered_set>
#include <fstream>
#include <iostream>
#include <algorithm>
#include <math.h>
#include "tgbaalgos/scc.hh"
#include "tgbaalgos/cutscc.hh"
#include "ltlparse/ltlfile.hh"
#include "tgbaalgos/ltl2tgba_fm.hh"
namespace spot
{
unsigned tgba_size(const tgba* a)
{
typedef std::unordered_set<const state*,
state_ptr_hash, state_ptr_equal> hash_type;
hash_type seen;
std::queue<state*> tovisit;
// Perform breadth-first search.
state* init = a->get_init_state();
tovisit.push(init);
seen.insert(init);
unsigned count = 0;
// While there are still states to visit.
while (!tovisit.empty())
{
++count;
state* cur = tovisit.front();
tovisit.pop();
tgba_succ_iterator* sit = a->succ_iter(cur);
for (sit->first(); !sit->done(); sit->next())
{
state* dst = sit->current_state();
// Is it a new state ?
if (seen.find(dst) == seen.end())
{
// Yes, register the successor for later processing.
tovisit.push(dst);
seen.insert(dst);
}
else
// No, free dst.
dst->destroy();
}
delete sit;
}
hash_type::iterator it2;
// Free visited states.
for (it2 = seen.begin(); it2 != seen.end(); it2++)
(*it2)->destroy();
return count;
}
}
void compute_and_print(std::vector<double>& v,
int count, std::ofstream& output)
{
int i;
double sum = 0.;
double mean;
double median;
double variance = 0.;
sum = 0;
// Compute mean: sigma(Xi)/n for i=0..n-1.
for (i = 0; i < count; i++)
sum += v[i];
mean = sum / count;
// Compute variance: sigma((Xi - mean)*(Xi - mean))/n for i=0..n-1.
for (i = 0; i < count; i++)
variance += (v[i] - mean)*(v[i] - mean);
variance = variance / count;
// Compute median: with the values sorted, this is the middle value if
// n is odd, or the mean of the two middle values if n is even.
if (count % 2 == 0)
median = (v[count/2 - 1] + v[count/2]) / 2;
else
median = v[count/2];
output << "\tMin = " << v[0] << std::endl;
output << "\tMax = " << v[count-1] << std::endl;
output << "\tMean = " << mean << std::endl;
output << "\tMedian = " << median << std::endl;
output << "\tStandard Deviation = " << sqrt(variance) << std::endl;
output << std::endl;
}
int main (int argc, char* argv[])
{
if (argc != 2)
{
std::cout << "Usage : ./stats file_name" << std::endl;
std::cout << "There must be one LTL formula per line." << std::endl;
return 1;
}
std::ofstream output;
output.open("results");
spot::bdd_dict* dict = new spot::bdd_dict();
unsigned count = 0;
std::vector<double> acc_scc;
std::vector<double> dead_scc;
std::vector<double> acc_paths;
std::vector<double> dead_paths;
std::vector<double> spanning_paths;
std::vector<double> self_loops;
unsigned k = 0;
// Get each LTL formula.
spot::ltl::ltl_file formulae(argv[1]);
while (const spot::ltl::formula* f = formulae.next())
{
++k;
spot::tgba* a = ltl_to_tgba_fm(f, dict, /* exprop */ true);
f->destroy();
// Get number of spanning paths.
spot::scc_map m (a);
m.build_map();
spot::state* initial_state = a->get_init_state();
unsigned init = m.scc_of_state(initial_state);
initial_state->destroy();
std::vector<std::vector<spot::sccs_set* > >* paths = find_paths(a, m);
unsigned spanning_count = spot::max_spanning_paths(&(*paths)[init], m);
spanning_paths.push_back(double(spanning_count));
// Get characteristics from automaton.
spot::scc_stats stat;
stat = build_scc_stats(a);
// Add those characteristics to our arrays.
acc_scc.push_back(double(stat.acc_scc));
dead_scc.push_back(double(stat.dead_scc));
acc_paths.push_back(double(stat.acc_paths));
dead_paths.push_back(double(stat.dead_paths));
self_loops.push_back(double(stat.self_loops)/tgba_size(a));
++count;
delete a;
unsigned i;
unsigned j;
for (i = 0; i < paths->size(); ++i)
for (j = 0; j < (*paths)[i].size(); ++j)
delete (*paths)[i][j];
delete paths;
}
if (count == 0)
{
std::cerr << "Nothing read." << std::endl;
exit(1);
}
// We could have inserted at the right place instead of
// sorting at the end.
// Sorting allows us to find the extrema and
// the median of the distribution.
sort(acc_scc.begin(), acc_scc.end());
sort(dead_scc.begin(), dead_scc.end());
sort(acc_paths.begin(), acc_paths.end());
sort(spanning_paths.begin(), spanning_paths.end());
sort(dead_paths.begin(), dead_paths.end());
sort(self_loops.begin(), self_loops.end());
output << "Parsed Formulae : " << count << std::endl << std::endl;
// Accepting SCCs
output << "Accepting SCCs:" << std::endl;
compute_and_print(acc_scc, count, output);
// Dead SCCs
output << "Dead SCCs:" << std::endl;
compute_and_print(dead_scc, count, output);
// Accepting Paths
output << "Accepting Paths:" << std::endl;
compute_and_print(acc_paths, count, output);
// Dead Paths
output << "Dead Paths:" << std::endl;
compute_and_print(dead_paths, count, output);
// Max Effective Splitting
output << "Max effective splitting:" << std::endl;
compute_and_print(spanning_paths, count, output);
// Self loops
output << "Self loops per State:" << std::endl;
compute_and_print(self_loops, count, output);
std::cout << "Statistics generated in file results." << std::endl;
output.close();
delete dict;
return 0;
}
## Copyright (C) 2009, 2010 Laboratoire de Recherche et Développement
## de l'Epita (LRDE).
##
## This file is part of Spot, a model checking library.
##
## Spot is free software; you can redistribute it and/or modify it
## under the terms of the GNU General Public License as published by
## the Free Software Foundation; either version 3 of the License, or
## (at your option) any later version.
##
## Spot is distributed in the hope that it will be useful, but WITHOUT
## ANY WARRANTY; without even the implied warranty of MERCHANTABILITY
## or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public
## License for more details.
##
## You should have received a copy of the GNU General Public License
## along with this program. If not, see <http://www.gnu.org/licenses/>.
PML2TGBA = $(PERL) $(srcdir)/pml2tgba.pl
RES = cut-results
AM_CPPFLAGS = -I$(srcdir)/../../src $(BUDDY_CPPFLAGS)
AM_CXXFLAGS = $(WARNING_CXXFLAGS)
LDADD = ../../src/libspot.la
dist_noinst_SCRIPTS = \
pml2tgba.pl
noinst_PROGRAMS = \
cutscc
cutscc_SOURCES = cutscc.cc
nodist_noinst_DATA = \
models/cl3serv1.tgba \
models/cl3serv1R.tgba \
models/cl3serv3.tgba \
models/cl3serv3R.tgba \
models/eeaean1.tgba \
models/eeaean1R.tgba \
models/eeaean2.tgba \
models/eeaean2R.tgba \
models/leader.tgba \
models/leaderR.tgba \
models/mobile1.tgba \
models/mobile1R.tgba \
models/mobile2.tgba \
models/mobile2R.tgba \
models/zune.tgba \
models/zuneR.tgba
models/cl3serv1.tgba: $(srcdir)/models/cl3serv1.pml
$(mkdir_p) models
$(PML2TGBA) $(srcdir)/models/cl3serv1.pml w1 s1 >$@
models/cl3serv1R.tgba: $(srcdir)/models/cl3serv1.pml
$(mkdir_p) models
$(PML2TGBA) -r $(srcdir)/models/cl3serv1.pml w1 s1 >$@
models/cl3serv3.tgba: $(srcdir)/models/cl3serv3.pml
$(mkdir_p) models
$(PML2TGBA) $(srcdir)/models/cl3serv3.pml w1 s1 >$@
models/cl3serv3R.tgba: $(srcdir)/models/cl3serv3.pml
$(mkdir_p) models
$(PML2TGBA) -r $(srcdir)/models/cl3serv3.pml w1 s1 >$@
models/eeaean1.tgba: $(srcdir)/models/eeaean1.pml
$(mkdir_p) models
$(PML2TGBA) $(srcdir)/models/eeaean1.pml \
noLeader zeroLeads oneLeads twoLeads threeLeads >$@
models/eeaean1R.tgba: $(srcdir)/models/eeaean1.pml
$(mkdir_p) models
$(PML2TGBA) -r $(srcdir)/models/eeaean1.pml \
noLeader zeroLeads oneLeads twoLeads threeLeads >$@
models/eeaean2.tgba: $(srcdir)/models/eeaean2.pml
$(mkdir_p) models
$(PML2TGBA) $(srcdir)/models/eeaean2.pml \
noLeader zeroLeads oneLeads twoLeads threeLeads >$@
models/eeaean2R.tgba: $(srcdir)/models/eeaean2.pml
$(mkdir_p) models
$(PML2TGBA) -r $(srcdir)/models/eeaean2.pml \
noLeader zeroLeads oneLeads twoLeads threeLeads >$@
models/leader.tgba: $(srcdir)/models/leader.pml
$(mkdir_p) models
$(PML2TGBA) $(srcdir)/models/leader.pml \
elected noLeader oneLeader >$@
models/leaderR.tgba: $(srcdir)/models/leader.pml
$(mkdir_p) models
$(PML2TGBA) -r $(srcdir)/models/leader.pml \
elected noLeader oneLeader >$@
models/mobile1.tgba: $(srcdir)/models/mobile1.pml
$(mkdir_p) models
$(PML2TGBA) $(srcdir)/models/mobile1.pml \
p q r >$@
models/mobile1R.tgba: $(srcdir)/models/mobile1.pml
$(mkdir_p) models
$(PML2TGBA) -r $(srcdir)/models/mobile1.pml \
p q r >$@
models/mobile2.tgba: $(srcdir)/models/mobile2.pml
$(mkdir_p) models
$(PML2TGBA) $(srcdir)/models/mobile2.pml \
p q r >$@
models/mobile2R.tgba: $(srcdir)/models/mobile2.pml
$(mkdir_p) models
$(PML2TGBA) -r $(srcdir)/models/mobile2.pml \
p q r >$@
models/zune.tgba: $(srcdir)/models/zune.pml
$(mkdir_p) models
$(PML2TGBA) $(srcdir)/models/zune.pml \
zune_at_S zune_at_E >$@
models/zuneR.tgba: $(srcdir)/models/zune.pml
$(mkdir_p) models
$(PML2TGBA) -r $(srcdir)/models/zune.pml \
zune_at_S zune_at_E >$@
CLEANFILES = $(nodist_noinst_DATA)
bench: $(noinst_PROGRAMS)
mkdir cut-results 2> /dev/null || true
./cutscc models/clserv.ltl 4 models/cl3serv1.tgba > $(RES)/cl3serv1
./cutscc models/clserv.ltl 4 models/cl3serv1R.tgba > $(RES)/cl3serv1R
./cutscc models/clserv.ltl 4 models/cl3serv3.tgba > $(RES)/cl3serv3
./cutscc models/clserv.ltl 4 models/cl3serv3R.tgba > $(RES)/cl3serv3R
./cutscc models/eeaean.ltl 4 models/eeaean1.tgba > $(RES)/eeaean1
./cutscc models/eeaean.ltl 4 models/eeaean1R.tgba > $(RES)/eeaean1R
./cutscc models/eeaean.ltl 4 models/eeaean2.tgba > $(RES)/eeaean2
./cutscc models/eeaean.ltl 4 models/eeaean2R.tgba > $(RES)/eeaean2R
./cutscc models/leader.ltl 4 models/leader.tgba > $(RES)/leader
./cutscc models/leader.ltl 4 models/leaderR.tgba > $(RES)/leaderR
./cutscc models/mobile1.ltl 4 models/mobile1.tgba > $(RES)/mobile1
./cutscc models/mobile1.ltl 4 models/mobile1R.tgba > $(RES)/mobile1R
./cutscc models/mobile2.ltl 4 models/mobile2.tgba > $(RES)/mobile2
./cutscc models/mobile2.ltl 4 models/mobile2R.tgba > $(RES)/mobile2R
./cutscc models/zune.ltl 4 models/zune.tgba > $(RES)/zune
./cutscc models/zune.ltl 4 models/zuneR.tgba > $(RES)/zuneR
This directory contains the input files and the test program used to produce
the measurements for our new method, which splits the automaton corresponding
to the formula in order to obtain smaller synchronised products.  This method
could allow the emptiness checks to be run in parallel, for possibly faster
results.
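As a rough, hypothetical sketch of the idea (all types and function names
below are invented for illustration; none of this is the deleted cutscc
code): since the language of the formula automaton is covered by the
languages of the sub-automata, the product of the model with each
sub-automaton can be checked for emptiness independently, e.g. one check
per task, and the full product is nonempty iff at least one sub-product is.

  #include <functional>
  #include <future>
  #include <iostream>
  #include <vector>

  // Stand-ins for the real objects: a model and one piece of the split
  // property automaton.  Both are placeholders for illustration.
  struct model {};
  struct sub_automaton {};

  // Placeholder for a real emptiness check of the product
  // model x sub-automaton.
  bool product_is_nonempty(const model&, const sub_automaton&)
  {
    return false;
  }

  // Run one emptiness check per sub-automaton, each in its own task.
  // The full product is nonempty iff at least one sub-product is.
  bool any_product_nonempty(const model& m,
                            const std::vector<sub_automaton>& subs)
  {
    std::vector<std::future<bool> > jobs;
    for (const sub_automaton& a : subs)
      jobs.push_back(std::async(std::launch::async, product_is_nonempty,
                                std::cref(m), std::cref(a)));
    bool nonempty = false;
    for (std::future<bool>& j : jobs)
      nonempty = j.get() || nonempty;
    return nonempty;
  }

  int main()
  {
    model m;
    std::vector<sub_automaton> subs(4);   // e.g., the automaton split in 4
    std::cout << (any_product_nonempty(m, subs)
                  ? "some sub-product is nonempty\n"
                  : "all sub-products are empty\n");
  }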
==========
CONTENTS
==========
This directory contains:
* models/cl3serv1.pml
* models/cl3serv3.pml
Two simple client/server promela examples.
* models/clserv.ltl
An LTL formula to verify on these examples.
* models/eeaean1.pml
* models/eeaean2.pml
Variations of the leader election protocol with extinction from
Tel, Introduction to Distributed Algorithms, 1994, Chapter 7. The
network in the model consists of three nodes. In Variant 1, the
same node wins every time; in Variant 2, each node gets a turn at
winning the election. These specifications were originally
distributed alongside
@InProceedings{ schwoon.05.tacas,
author = {Stefan Schwoon and Javier Esparza},
title = {A note on on-the-fly verification algorithms.},
booktitle = {Proceedings of the 11th International Conference
on Tools and Algorithms for the Construction and
Analysis of Systems
(TACAS'05)},
year = {2005},
series = {Lecture Notes in Computer Science},
publisher = {Springer-Verlag},
month = apr
}
* models/eeaean.ltl
Sample properties for the leader election protocols. These come from
@InProceedings{ geldenhuys.04.tacas,
author = {Jaco Geldenhuys and Antti Valmari},
title = {Tarjan's Algorithm Makes On-the-Fly {LTL} Verification
More Efficient},
booktitle = {Proceedings of the 10th International Conference on
Tools and Algorithms for the Construction and Analysis
of Systems
(TACAS'04)},
editor = {Kurt Jensen and Andreas Podelski},
pages = {205--219},
year = {2004},
publisher = {Springer-Verlag},
series = {Lecture Notes in Computer Science},
volume = {2988},