User Guide

What is toulbar2

toulbar2 is an exact black box discrete optimization solver targeted at solving cost function networks (CFN), thus solving the so-called “weighted Constraint Satisfaction Problem” or WCSP. A cost function network can be simply described by a set of discrete variables, each with a specific finite domain, and a set of integer cost functions, each involving some of the variables. The WCSP asks for an assignment of all variables such that the sum of all cost functions is minimum and less than a given upper bound, often denoted \(k\) or \(\top\). Functions are typically specified by sparse or full tables, but also more concisely as specific functions called “global cost functions” [Schiex2016a].
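The definition above can be illustrated with a toy example. The sketch below is not the toulbar2 API; it merely spells out the WCSP objective (minimize the sum of all cost functions, subject to the upper bound \(k\)) by brute-force enumeration on a hypothetical three-variable network:

```python
from itertools import product

# A toy cost function network: 3 Boolean variables, one unary and
# two binary integer cost functions. Illustrative only; toulbar2
# reads such networks from .cfn/.wcsp files instead.
domains = {"x": [0, 1], "y": [0, 1], "z": [0, 1]}
top = 100  # the forbidden-cost upper bound, often written k

def c_xy(x, y):  # binary cost function on (x, y)
    return 0 if x != y else 3

def c_yz(y, z):  # binary cost function on (y, z)
    return 0 if y == z else 2

def c_x(x):      # unary cost function on x
    return 1 if x == 0 else 0

def total_cost(a):
    return c_xy(a["x"], a["y"]) + c_yz(a["y"], a["z"]) + c_x(a["x"])

# Solving the WCSP = finding the assignment minimizing the sum of
# all cost functions, kept only if it stays strictly below top.
names = list(domains)
best = min(
    (dict(zip(names, vals)) for vals in product(*domains.values())),
    key=total_cost,
)
assert total_cost(best) < top
print(best, total_cost(best))  # optimum: {'x': 1, 'y': 0, 'z': 0} with cost 0
```

Of course, toulbar2 avoids such exhaustive enumeration: it combines branch-and-bound with soft local consistencies to prune the search space.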

Using on-the-fly translation, toulbar2 can also directly solve optimization problems on other graphical models, such as Maximum Probability Explanation (MPE) on Bayesian networks [koller2009] and Maximum A Posteriori (MAP) on Markov random fields [koller2009]. It can also read partial weighted MaxSAT problems and Quadratic Pseudo-Boolean problems (MAXCUT), as well as Linkage .pre pedigree files for genotyping error detection and correction.

toulbar2 is exact: it reports an optimal solution only when it has both identified the solution and proved its optimality. Because it relies only on integer operations (addition and subtraction), it does not suffer from rounding errors. In the general case, WCSP, MPE/BN, MAP/MRF, PWMaxSAT, QPBO and MAXCUT are all NP-hard problems, so toulbar2 may take exponential time to prove optimality. This is however a worst-case behavior: toulbar2 has been shown to solve to optimality problems with half a million non-Boolean variables defining a search space as large as \(2^{829,440}\). It may also fail to solve in reasonable time problems with a search space smaller than \(2^{264}\).

toulbar2 provides, and uses by default, an “anytime” algorithm [Katsirelos2015a] that tries to quickly provide good solutions together with an upper bound on the gap between the cost of each solution and the (unknown) optimal cost. Thus, even if it is unable to prove optimality, it bounds the quality of the solutions it provides. It can also apply a variable neighborhood search algorithm exploiting a problem decomposition [Ouali2017]. This algorithm is complete (given enough CPU time), and a parallel version, using OpenMPI, also exists [Beldjilali2022].

Beyond providing optimal solutions, toulbar2 can also find a greedy sequence of diverse solutions [Ruffini2019a], exhaustively enumerate solutions below a cost threshold, and perform guaranteed approximate weighted counting of solutions. For stochastic graphical models, this means that toulbar2 can compute the partition function (the normalizing constant \(Z\)). Since these problems are #P-complete, toulbar2 runtimes can increase quickly on such instances.
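The connection between weighted counting and the partition function can be made concrete. Viewing integer costs as energies (negated log-potentials), \(Z\) is the sum over all assignments of \(e^{-\mathrm{cost}}\). The sketch below computes \(Z\) by enumeration on a hypothetical two-variable model; it is illustrative only, as toulbar2 computes guaranteed bounds on \(Z\) without enumerating the search space:

```python
import math
from itertools import product

# Toy energy-based model on two binary variables (illustrative
# only; the cost function is made up for this example).
def cost(x, y):
    return (x + y) + (2 if x == y else 0)

# Partition function: Z = sum over all assignments of exp(-cost).
Z = sum(math.exp(-cost(x, y)) for x, y in product([0, 1], repeat=2))
print(Z)  # exp(-2) + 2*exp(-1) + exp(-4)
```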

Using the toulbar2 Python interface, with its incremental solving capabilities, it is possible to learn a CFN from data and to combine it with mandatory constraints [Schiex2020b]. See examples at https://forgemia.inra.fr/thomas.schiex/cfn-learn.

How do I install it?

toulbar2 is an open-source solver distributed under the MIT license, as a set of C++ sources managed with git at http://github.com/toulbar2/toulbar2. If you want to use a released version, you can download source archives of a specific release there; they should be easy to compile on most Linux systems.

If you want to compile the latest sources yourself, you will need a modern C++ compiler, CMake, the GNU MP Bignum library, a recent version of the Boost libraries and, optionally, the jemalloc memory management and OpenMPI libraries (for more information, see Installation from sources). You can then clone toulbar2 on your machine and compile it by executing:

git clone https://github.com/toulbar2/toulbar2.git
cd toulbar2
mkdir build
cd build
# ccmake ..
cmake ..
make

Finally, toulbar2 is available in the debian-science section of the unstable/sid Debian distribution. It should therefore be directly installable using:

sudo apt-get install toulbar2

If you want to try toulbar2 on crafted, random, or real problems, please look for benchmarks in the Cost Function Benchmark section. Other benchmarks, coming from various discrete optimization languages, are available at Genotoul EvalGM [Hurley2016b].

How do I test it?

Some problem examples are available in the directory toulbar2/validation. After compilation with cmake, it is possible to run a series of tests using:

make test

For debugging toulbar2 (compile with flag CMAKE_BUILD_TYPE="Debug"), more test examples are available in the Cost Function Library. The following commands run toulbar2 (the executable must be found on your system path) on every problem with a 1-hour time limit and compare the optimum found with the known optima (stored in .ub files).

cd toulbar2
git clone https://forgemia.inra.fr/thomas.schiex/cost-function-library.git
./misc/script/runall.sh ./cost-function-library/trunk/validation

Other tests on randomly generated problems can be run, where optimal solutions are verified using the older solver toolbar (the executable must be found on your system path):

cd toulbar2
git clone https://forgemia.inra.fr/thomas.schiex/toolbar.git
cd toolbar/toolbar
make toolbar
cd ../..
./misc/script/rungenerate.sh

Using it as a black box

Using toulbar2 is just a matter of having a properly formatted input file describing the cost function network, graphical model, PWMaxSAT, PBO, or Linkage .pre file, and executing:

toulbar2 [option parameters] <file>

and toulbar2 will start solving the optimization problem described in its file argument. By default, the extension of the file (either .cfn, .cfn.gz, .cfn.bz2, .cfn.xz, .wcsp, .wcsp.gz, .wcsp.bz2, .wcsp.xz, .wcnf, .wcnf.gz, .wcnf.bz2, .wcnf.xz, .cnf, .cnf.gz, .cnf.bz2, .cnf.xz, .qpbo, .qpbo.gz, .qpbo.bz2, .qpbo.xz, .opb, .opb.gz, .opb.bz2, .opb.xz, .lp, .lp.gz, .lp.bz2, .lp.xz, .uai, .uai.gz, .uai.bz2, .uai.xz, .LG, .LG.gz, .LG.bz2, .LG.xz, .xml, .xml.gz, .xml.bz2, .xml.xz, .pre or .bep) is used to determine the nature of the file (see Input formats). There is no specific order for the options and problem file. toulbar2 comes with decently optimized default parameters. It is however often possible to set it up for a different target than pure optimization, or to tune it for faster operation, using specific command-line options.

Quick start

  • Download a binary weighted constraint satisfaction problem (WCSP) file example.wcsp.xz. Solve it with default options:

    toulbar2 EXAMPLES/example.wcsp.xz
    
    Read 25 variables, with 5 values at most, and 63 cost functions, with maximum arity 2.
    Cost function decomposition time : 1.6e-05 seconds.
    Reverse DAC dual bound: 20 (+10.000%)
    Preprocessing time: 0.001 seconds.
    24 unassigned variables, 116 values in all current domains (med. size:5, max size:5) and 62 non-unary cost functions (med. arity:2, med. degree:5)
    Initial lower and upper bounds: [20, 64] 68.750%
    New solution: 28 (0 backtracks, 6 nodes, depth 8)
    New solution: 27 (5 backtracks, 15 nodes, depth 5)
    Optimality gap: [21, 27] 22.222 % (8 backtracks, 18 nodes)
    Optimality gap: [22, 27] 18.519 % (21 backtracks, 55 nodes)
    Optimality gap: [23, 27] 14.815 % (49 backtracks, 122 nodes)
    Optimality gap: [24, 27] 11.111 % (63 backtracks, 153 nodes)
    Optimality gap: [25, 27] 7.407 % (81 backtracks, 217 nodes)
    Optimality gap: [27, 27] 0.000 % (89 backtracks, 240 nodes)
    Node redundancy during HBFS: 25.417 %
    Optimum: 27 in 89 backtracks and 240 nodes ( 460 removals by DEE) and 0.006 seconds.
    end.
    
  • Solve a WCSP using INCOP, a local search method [idwalk:cp04] applied just after preprocessing, in order to find a good upper bound before a complete search:

    toulbar2 EXAMPLES/example.wcsp.xz -i
    
    Read 25 variables, with 5 values at most, and 63 cost functions, with maximum arity 2.
    Cost function decomposition time : 1.6e-05 seconds.
    Reverse DAC dual bound: 20 (+10.000%)
    Preprocessing time: 0.001 seconds.
    New solution: 27 (0 backtracks, 0 nodes, depth 1)
    INCOP solving time: 0.254 seconds.
    24 unassigned variables, 116 values in all current domains (med. size:5, max size:5) and 62 non-unary cost functions (med. arity:2, med. degree:5)
    Initial lower and upper bounds: [20, 27] 25.926%
    Optimality gap: [21, 27] 22.222 % (4 backtracks, 8 nodes)
    Optimality gap: [22, 27] 18.519 % (42 backtracks, 95 nodes)
    Optimality gap: [23, 27] 14.815 % (93 backtracks, 209 nodes)
    Optimality gap: [24, 27] 11.111 % (111 backtracks, 253 nodes)
    Optimality gap: [25, 27] 7.407 % (121 backtracks, 280 nodes)
    Optimality gap: [27, 27] 0.000 % (128 backtracks, 307 nodes)
    Node redundancy during HBFS: 16.612 %
    Optimum: 27 in 128 backtracks and 307 nodes ( 647 removals by DEE) and 0.263 seconds.
    end.
    
  • Solve a WCSP with an initial upper bound and save its (first) optimal solution in filename “example.sol”:

    toulbar2 EXAMPLES/example.wcsp.xz -ub=28 -w=example.sol
    
    Read 25 variables, with 5 values at most, and 63 cost functions, with maximum arity 2.
    Cost function decomposition time : 1.6e-05 seconds.
    Reverse DAC dual bound: 20 (+10.000%)
    Preprocessing time: 0.001 seconds.
    24 unassigned variables, 116 values in all current domains (med. size:5, max size:5) and 62 non-unary cost functions (med. arity:2, med. degree:5)
    Initial lower and upper bounds: [20, 28] 28.571%
    New solution: 27 (0 backtracks, 4 nodes, depth 6)
    Optimality gap: [21, 27] 22.222 % (6 backtracks, 14 nodes)
    Optimality gap: [22, 27] 18.519 % (25 backtracks, 61 nodes)
    Optimality gap: [23, 27] 14.815 % (56 backtracks, 133 nodes)
    Optimality gap: [24, 27] 11.111 % (60 backtracks, 148 nodes)
    Optimality gap: [25, 27] 7.407 % (83 backtracks, 228 nodes)
    Optimality gap: [27, 27] 0.000 % (89 backtracks, 265 nodes)
    Node redundancy during HBFS: 32.453 %
    Optimum: 27 in 89 backtracks and 265 nodes ( 441 removals by DEE) and 0.007 seconds.
    end.
    
  • … and see this saved “example.sol” file:

    cat example.sol
    # each value corresponds to one variable assignment in problem file order
    
    1 0 2 2 2 2 0 4 2 0 4 1 0 0 3 0 3 1 2 4 2 1 2 4 1
    
  • Download a larger WCSP file scen06.wcsp.xz. Solve it using a limited discrepancy search strategy [Ginsberg1995] with a VAC integrality-based variable ordering [Trosser2020a] in order to speed up the search for good upper bounds first (by default, toulbar2 uses another diversification strategy based on hybrid best-first search [Katsirelos2015a]):

    toulbar2 EXAMPLES/scen06.wcsp.xz -l -vacint
    
    Read 100 variables, with 44 values at most, and 1222 cost functions, with maximum arity 2.
    Cost function decomposition time : 0.000133 seconds.
    Preprocessing time: 0.154752 seconds.
    82 unassigned variables, 3273 values in all current domains (med. size:44, max size:44) and 327 non-unary cost functions (med. arity:2, med. degree:6)
    Initial lower and upper bounds: [0, 248338] 100.000%
    --- [1] LDS 0 --- (0 nodes)
    c 2097152 Bytes allocated for long long stack.
    c 4194304 Bytes allocated for long long stack.
    New solution: 7771 (0 backtracks, 101 nodes, depth 3)
    --- [1] LDS 1 --- (101 nodes)
    c 8388608 Bytes allocated for long long stack.
    New solution: 5848 (1 backtracks, 282 nodes, depth 4)
    New solution: 5384 (3 backtracks, 397 nodes, depth 4)
    New solution: 5039 (4 backtracks, 466 nodes, depth 4)
    New solution: 4740 (8 backtracks, 640 nodes, depth 4)
    --- [1] LDS 2 --- (738 nodes)
    New solution: 4675 (37 backtracks, 966 nodes, depth 5)
    New solution: 4633 (44 backtracks, 1113 nodes, depth 5)
    New solution: 4509 (45 backtracks, 1165 nodes, depth 5)
    New solution: 4502 (51 backtracks, 1226 nodes, depth 5)
    New solution: 4344 (54 backtracks, 1291 nodes, depth 5)
    New solution: 4258 (135 backtracks, 1864 nodes, depth 5)
    New solution: 4118 (136 backtracks, 1907 nodes, depth 5)
    New solution: 4107 (138 backtracks, 1965 nodes, depth 4)
    New solution: 4101 (147 backtracks, 2040 nodes, depth 5)
    New solution: 4099 (150 backtracks, 2057 nodes, depth 4)
    New solution: 4037 (152 backtracks, 2080 nodes, depth 5)
    New solution: 3853 (157 backtracks, 2171 nodes, depth 5)
    New solution: 3800 (209 backtracks, 2475 nodes, depth 5)
    New solution: 3781 (222 backtracks, 2539 nodes, depth 5)
    New solution: 3769 (226 backtracks, 2559 nodes, depth 5)
    New solution: 3750 (227 backtracks, 2568 nodes, depth 5)
    New solution: 3748 (229 backtracks, 2575 nodes, depth 5)
    --- [1] LDS 4 --- (2586 nodes)
    New solution: 3615 (663 backtracks, 5086 nodes, depth 7)
    New solution: 3614 (698 backtracks, 5269 nodes, depth 6)
    New solution: 3599 (704 backtracks, 5310 nodes, depth 6)
    New solution: 3594 (708 backtracks, 5335 nodes, depth 7)
    New solution: 3591 (709 backtracks, 5343 nodes, depth 6)
    New solution: 3580 (710 backtracks, 5354 nodes, depth 7)
    New solution: 3578 (716 backtracks, 5374 nodes, depth 6)
    New solution: 3551 (988 backtracks, 6456 nodes, depth 7)
    New solution: 3539 (996 backtracks, 6522 nodes, depth 7)
    New solution: 3516 (1000 backtracks, 6554 nodes, depth 7)
    New solution: 3507 (1002 backtracks, 6573 nodes, depth 7)
    New solution: 3483 (1037 backtracks, 6718 nodes, depth 7)
    New solution: 3464 (1038 backtracks, 6739 nodes, depth 7)
    New solution: 3438 (1047 backtracks, 6806 nodes, depth 7)
    New solution: 3412 (1049 backtracks, 6824 nodes, depth 7)
    --- [1] Search with no discrepancy limit --- (9443 nodes)
    New solution: 3404 (4415 backtracks, 14613 nodes, depth 27)
    New solution: 3402 (4416 backtracks, 14615 nodes, depth 25)
    New solution: 3400 (4417 backtracks, 14619 nodes, depth 24)
    New solution: 3391 (4419 backtracks, 14630 nodes, depth 28)
    New solution: 3389 (4420 backtracks, 14632 nodes, depth 26)
    Optimality gap: [100, 3389] 97.049 % (21663 backtracks, 49099 nodes)
    Optimality gap: [300, 3389] 91.148 % (24321 backtracks, 54415 nodes)
    Optimality gap: [957, 3389] 71.762 % (37965 backtracks, 81703 nodes)
    Optimality gap: [1780, 3389] 47.477 % (39060 backtracks, 83893 nodes)
    Optimality gap: [1999, 3389] 41.015 % (39252 backtracks, 84277 nodes)
    Optimum: 3389 in 39276 backtracks and 84325 nodes ( 444857 removals by DEE) and 36.293 seconds.
    end.
    
  • Download a cluster decomposition file scen06.dec (each line corresponds to a cluster of variables; clusters may overlap). Solve the previous WCSP using a variable neighborhood search algorithm (UDGVNS) [Ouali2017] for 10 seconds:

    toulbar2 EXAMPLES/scen06.wcsp.xz EXAMPLES/scen06.dec -vns -time=10
    
    Read 100 variables, with 44 values at most, and 1222 cost functions, with maximum arity 2.
    Cost function decomposition time : 9.2e-05 seconds.
    Preprocessing time: 0.152035 seconds.
    82 unassigned variables, 3273 values in all current domains (med. size:44, max size:44) and 327 non-unary cost functions (med. arity:2, med. degree:6)
    Initial lower and upper bounds: [0, 248338] 100.000%
    c 2097152 Bytes allocated for long long stack.
    c 4194304 Bytes allocated for long long stack.
    c 8388608 Bytes allocated for long long stack.
    New solution: 7566 (0 backtracks, 109 nodes, depth 110)
    Problem decomposition in 55 clusters with size distribution: min: 1 median: 5 mean: 4.782 max: 12
    ****** Restart 1 with 1 discrepancies and UB=7566 ****** (109 nodes)
    New solution: 7555 (0 backtracks, 109 nodes, depth 1)
    New solution: 7545 (0 backtracks, 111 nodes, depth 2)
    New solution: 7397 (0 backtracks, 114 nodes, depth 2)
    New solution: 7289 (0 backtracks, 118 nodes, depth 2)
    New solution: 7287 (0 backtracks, 118 nodes, depth 1)
    New solution: 7277 (0 backtracks, 118 nodes, depth 1)
    New solution: 5274 (0 backtracks, 118 nodes, depth 1)
    New solution: 5169 (0 backtracks, 118 nodes, depth 1)
    New solution: 5159 (0 backtracks, 118 nodes, depth 1)
    New solution: 5158 (0 backtracks, 118 nodes, depth 1)
    New solution: 5105 (1 backtracks, 120 nodes, depth 1)
    New solution: 4767 (2 backtracks, 140 nodes, depth 2)
    New solution: 4667 (2 backtracks, 140 nodes, depth 1)
    New solution: 4655 (8 backtracks, 164 nodes, depth 2)
    New solution: 4588 (8 backtracks, 171 nodes, depth 2)
    New solution: 4543 (8 backtracks, 172 nodes, depth 2)
    New solution: 4541 (8 backtracks, 172 nodes, depth 1)
    New solution: 4424 (8 backtracks, 174 nodes, depth 2)
    New solution: 4423 (8 backtracks, 174 nodes, depth 1)
    New solution: 4411 (8 backtracks, 174 nodes, depth 1)
    New solution: 4401 (8 backtracks, 174 nodes, depth 1)
    New solution: 4367 (8 backtracks, 175 nodes, depth 2)
    New solution: 4175 (9 backtracks, 177 nodes, depth 1)
    New solution: 4174 (9 backtracks, 177 nodes, depth 1)
    New solution: 4173 (9 backtracks, 177 nodes, depth 1)
    New solution: 4171 (9 backtracks, 177 nodes, depth 1)
    New solution: 4152 (9 backtracks, 177 nodes, depth 1)
    New solution: 4142 (12 backtracks, 187 nodes, depth 2)
    New solution: 4001 (43 backtracks, 562 nodes, depth 2)
    New solution: 3900 (43 backtracks, 562 nodes, depth 1)
    New solution: 3891 (78 backtracks, 779 nodes, depth 1)
    New solution: 3890 (80 backtracks, 788 nodes, depth 1)
    New solution: 3816 (130 backtracks, 1192 nodes, depth 2)
    New solution: 3768 (137 backtracks, 1217 nodes, depth 1)
    New solution: 3740 (205 backtracks, 1660 nodes, depth 2)
    New solution: 3738 (205 backtracks, 1660 nodes, depth 1)
    New solution: 3730 (229 backtracks, 1780 nodes, depth 1)
    New solution: 3723 (230 backtracks, 1786 nodes, depth 2)
    New solution: 3721 (230 backtracks, 1786 nodes, depth 1)
    New solution: 3711 (236 backtracks, 1819 nodes, depth 1)
    New solution: 3633 (239 backtracks, 1850 nodes, depth 2)
    New solution: 3628 (245 backtracks, 1941 nodes, depth 2)
    New solution: 3621 (245 backtracks, 1943 nodes, depth 2)
    New solution: 3609 (245 backtracks, 1943 nodes, depth 1)
    New solution: 3608 (411 backtracks, 3079 nodes, depth 2)
    New solution: 3600 (518 backtracks, 3775 nodes, depth 2)
    New solution: 3598 (525 backtracks, 3806 nodes, depth 2)
    New solution: 3597 (525 backtracks, 3806 nodes, depth 1)
    New solution: 3587 (525 backtracks, 3806 nodes, depth 1)
    New solution: 3565 (534 backtracks, 3846 nodes, depth 2)
    New solution: 3554 (536 backtracks, 3856 nodes, depth 1)
    New solution: 3534 (538 backtracks, 3860 nodes, depth 1)
    New solution: 3522 (538 backtracks, 3861 nodes, depth 2)
    New solution: 3507 (560 backtracks, 3987 nodes, depth 2)
    New solution: 3505 (584 backtracks, 4130 nodes, depth 2)
    New solution: 3500 (598 backtracks, 4255 nodes, depth 2)
    New solution: 3498 (600 backtracks, 4281 nodes, depth 2)
    New solution: 3493 (657 backtracks, 4648 nodes, depth 2)
    ****** Restart 2 with 2 discrepancies and UB=3493 ****** (6206 nodes)
    New solution: 3492 (1406 backtracks, 9011 nodes, depth 3)
    ****** Restart 3 with 2 discrepancies and UB=3492 ****** (10128 nodes)
    New solution: 3389 (1652 backtracks, 10572 nodes, depth 3)
    ****** Restart 4 with 2 discrepancies and UB=3389 ****** (11566 nodes)
    
    Time limit expired... Aborting...
    
  • Download another difficult instance scen07.wcsp.xz. Solve it using a variable neighborhood search algorithm (UDGVNS) with a maximum cardinality search cluster decomposition and absorption [Ouali2017] for 5 seconds:

    toulbar2 EXAMPLES/scen07.wcsp.xz -vns -O=-1 -E -time=5
    
    Read 200 variables, with 44 values at most, and 2665 cost functions, with maximum arity 2.
    Cost function decomposition time : 0.000303 seconds.
    Reverse DAC dual bound: 10001 (+0.010%)
    Preprocessing time: 0.351 seconds.
    162 unassigned variables, 6481 values in all current domains (med. size:44, max size:44) and 764 non-unary cost functions (med. arity:2, med. degree:8)
    Initial lower and upper bounds: [10001, 436543501] 99.998%
    c 2097152 Bytes allocated for long long stack.
    c 4194304 Bytes allocated for long long stack.
    c 8388608 Bytes allocated for long long stack.
    New solution: 1455221 (0 backtracks, 232 nodes, depth 233)
    Tree decomposition time: 0.003 seconds.
    Problem decomposition in 25 clusters with size distribution: min: 3 median: 10 mean: 10.360 max: 38
    ****** Restart 1 with 1 discrepancies and UB=1455221 ****** (232 nodes)
    New solution: 1445522 (0 backtracks, 232 nodes, depth 1)
    New solution: 1445520 (0 backtracks, 232 nodes, depth 1)
    New solution: 1445320 (0 backtracks, 232 nodes, depth 1)
    New solution: 1445319 (0 backtracks, 232 nodes, depth 1)
    New solution: 1435218 (0 backtracks, 232 nodes, depth 1)
    New solution: 1425218 (0 backtracks, 232 nodes, depth 1)
    New solution: 1425217 (0 backtracks, 232 nodes, depth 1)
    New solution: 1415216 (0 backtracks, 232 nodes, depth 1)
    New solution: 1405218 (0 backtracks, 232 nodes, depth 1)
    New solution: 1405216 (9 backtracks, 286 nodes, depth 2)
    New solution: 1395016 (9 backtracks, 286 nodes, depth 1)
    New solution: 1394815 (9 backtracks, 289 nodes, depth 2)
    New solution: 1394716 (9 backtracks, 289 nodes, depth 1)
    New solution: 394818 (13 backtracks, 300 nodes, depth 1)
    New solution: 394816 (13 backtracks, 300 nodes, depth 1)
    New solution: 394716 (15 backtracks, 307 nodes, depth 1)
    New solution: 394715 (26 backtracks, 361 nodes, depth 1)
    New solution: 394713 (26 backtracks, 361 nodes, depth 1)
    New solution: 384515 (30 backtracks, 379 nodes, depth 2)
    New solution: 384513 (30 backtracks, 379 nodes, depth 1)
    New solution: 384313 (30 backtracks, 379 nodes, depth 1)
    New solution: 384213 (33 backtracks, 390 nodes, depth 1)
    New solution: 384211 (33 backtracks, 390 nodes, depth 1)
    New solution: 384208 (42 backtracks, 426 nodes, depth 1)
    New solution: 384207 (42 backtracks, 427 nodes, depth 2)
    New solution: 364206 (42 backtracks, 427 nodes, depth 1)
    New solution: 353705 (42 backtracks, 438 nodes, depth 2)
    New solution: 353703 (42 backtracks, 443 nodes, depth 2)
    New solution: 353702 (44 backtracks, 450 nodes, depth 1)
    New solution: 353701 (52 backtracks, 482 nodes, depth 1)
    New solution: 343898 (88 backtracks, 705 nodes, depth 1)
    New solution: 343698 (91 backtracks, 717 nodes, depth 1)
    New solution: 343593 (94 backtracks, 726 nodes, depth 1)
    ****** Restart 2 with 2 discrepancies and UB=343593 ****** (1906 nodes)
    New solution: 343592 (319 backtracks, 2203 nodes, depth 3)
    ****** Restart 3 with 2 discrepancies and UB=343592 ****** (3467 nodes)
    
    Time limit expired... Aborting...
    
  • Download file 404.wcsp.xz. Solve it using Depth-First Branch and Bound with Tree Decomposition and HBFS (BTD-HBFS) [Schiex2006a] based on a min-fill variable ordering:

    toulbar2 EXAMPLES/404.wcsp.xz -O=-3 -B=1
    
    Read 100 variables, with 4 values at most, and 710 cost functions, with maximum arity 3.
    Cost function decomposition time : 6.6e-05 seconds.
    Reverse DAC dual bound: 64 (+35.938%)
    Reverse DAC dual bound: 66 (+3.030%)
    Reverse DAC dual bound: 67 (+1.493%)
    Preprocessing time: 0.008 seconds.
    88 unassigned variables, 228 values in all current domains (med. size:2, max size:4) and 591 non-unary cost functions (med. arity:2, med. degree:13)
    Initial lower and upper bounds: [67, 155] 56.774%
    Tree decomposition width  : 19
    Tree decomposition height : 43
    Number of clusters        : 47
    Tree decomposition time: 0.002 seconds.
    New solution: 124 (20 backtracks, 35 nodes, depth 3)
    Optimality gap: [70, 124] 43.548 % (20 backtracks, 35 nodes)
    New solution: 123 (34 backtracks, 64 nodes, depth 3)
    Optimality gap: [77, 123] 37.398 % (34 backtracks, 64 nodes)
    New solution: 119 (173 backtracks, 348 nodes, depth 3)
    Optimality gap: [88, 119] 26.050 % (173 backtracks, 348 nodes)
    Optimality gap: [91, 119] 23.529 % (202 backtracks, 442 nodes)
    New solution: 117 (261 backtracks, 609 nodes, depth 3)
    Optimality gap: [97, 117] 17.094 % (261 backtracks, 609 nodes)
    New solution: 114 (342 backtracks, 858 nodes, depth 3)
    Optimality gap: [98, 114] 14.035 % (342 backtracks, 858 nodes)
    Optimality gap: [100, 114] 12.281 % (373 backtracks, 984 nodes)
    Optimality gap: [101, 114] 11.404 % (437 backtracks, 1123 nodes)
    Optimality gap: [102, 114] 10.526 % (446 backtracks, 1152 nodes)
    Optimality gap: [103, 114] 9.649 % (484 backtracks, 1232 nodes)
    Optimality gap: [104, 114] 8.772 % (521 backtracks, 1334 nodes)
    Optimality gap: [105, 114] 7.895 % (521 backtracks, 1353 nodes)
    Optimality gap: [106, 114] 7.018 % (525 backtracks, 1364 nodes)
    Optimality gap: [107, 114] 6.140 % (525 backtracks, 1379 nodes)
    Optimality gap: [109, 114] 4.386 % (534 backtracks, 1539 nodes)
    Optimality gap: [111, 114] 2.632 % (536 backtracks, 1559 nodes)
    Optimality gap: [113, 114] 0.877 % (536 backtracks, 1564 nodes)
    Optimality gap: [114, 114] 0.000 % (536 backtracks, 1598 nodes)
    HBFS open list restarts: 0.000 % and reuse: 11.080 % of 352
    Node redundancy during HBFS: 34.355 %
    Optimum: 114 in 536 backtracks and 1598 nodes ( 21 removals by DEE) and 0.031 seconds.
    end.
    
  • Solve the same problem using Russian Doll Search exploiting BTD [Sanchez2009a]:

    toulbar2 EXAMPLES/404.wcsp.xz -O=-3 -B=2
    
    Read 100 variables, with 4 values at most, and 710 cost functions, with maximum arity 3.
    Cost function decomposition time : 6.6e-05 seconds.
    Reverse DAC dual bound: 64 (+35.938%)
    Reverse DAC dual bound: 66 (+3.030%)
    Reverse DAC dual bound: 67 (+1.493%)
    Preprocessing time: 0.008 seconds.
    88 unassigned variables, 228 values in all current domains (med. size:2, max size:4) and 591 non-unary cost functions (med. arity:2, med. degree:13)
    Initial lower and upper bounds: [67, 155] 56.774%
    Tree decomposition width  : 19
    Tree decomposition height : 43
    Number of clusters        : 47
    Tree decomposition time: 0.002 seconds.
    --- Solving cluster subtree 5 ...
    New solution: 0 (0 backtracks, 0 nodes, depth 2)
    ---  done  cost = [0,0] (0 backtracks, 0 nodes, depth 2)
    
    --- Solving cluster subtree 6 ...
    New solution: 0 (0 backtracks, 0 nodes, depth 2)
    ---  done  cost = [0,0] (0 backtracks, 0 nodes, depth 2)
    
    --- Solving cluster subtree 7 ...
    
    ...
    
    --- Solving cluster subtree 44 ...
    New solution: 42 (420 backtracks, 723 nodes, depth 7)
    New solution: 39 (431 backtracks, 743 nodes, depth 9)
    New solution: 35 (447 backtracks, 785 nodes, depth 22)
    ---  done  cost = [35,35] (557 backtracks, 960 nodes, depth 2)
    
    --- Solving cluster subtree 46 ...
    New solution: 114 (557 backtracks, 960 nodes, depth 2)
    ---  done  cost = [114,114] (557 backtracks, 960 nodes, depth 2)
    
    Optimum: 114 in 557 backtracks and 960 nodes ( 50 removals by DEE) and 0.026 seconds.
    end.
    
  • Solve another WCSP using the original Russian Doll Search method [Verfaillie1996] with a static variable ordering (following the problem file order) and soft arc consistency:

    toulbar2 EXAMPLES/505.wcsp.xz -B=3 -j=1 -svo -k=1
    
    Read 240 variables, with 4 values at most, and 2242 cost functions, with maximum arity 3.
    Cost function decomposition time : 0.000911 seconds.
    Preprocessing time: 0.013967 seconds.
    233 unassigned variables, 666 values in all current domains (med. size:2, max size:4) and 1966 non-unary cost functions (med. arity:2, med. degree:16)
    Initial lower and upper bounds: [2, 34347] 99.994%
    Tree decomposition width  : 59
    Tree decomposition height : 233
    Number of clusters        : 239
    Tree decomposition time: 0.017 seconds.
    --- Solving cluster subtree 0 ...
    New solution: 0 (0 backtracks, 0 nodes, depth 2)
    ---  done  cost = [0,0] (0 backtracks, 0 nodes, depth 2)
    
    --- Solving cluster subtree 1 ...
    New solution: 0 (0 backtracks, 0 nodes, depth 2)
    ---  done  cost = [0,0] (0 backtracks, 0 nodes, depth 2)
    
    --- Solving cluster subtree 2 ...
    
    ...
    
    --- Solving cluster subtree 3 ...
    New solution: 21253 (26963 backtracks, 48851 nodes, depth 3)
    New solution: 21251 (26991 backtracks, 48883 nodes, depth 4)
    ---  done  cost = [21251,21251] (26992 backtracks, 48883 nodes, depth 2)
    
    --- Solving cluster subtree 238 ...
    New solution: 21253 (26992 backtracks, 48883 nodes, depth 2)
    ---  done  cost = [21253,21253] (26992 backtracks, 48883 nodes, depth 2)
    
    Optimum: 21253 in 26992 backtracks and 48883 nodes ( 0 removals by DEE) and 6.180 seconds.
    end.
    
  • Solve the same WCSP using a parallel variable neighborhood search algorithm (UPDGVNS) with a min-fill cluster decomposition [Ouali2017], using 4 cores for 5 seconds:

    mpirun -n 4 toulbar2 EXAMPLES/505.wcsp.xz -vns -O=-3 -time=5
    
    Read 240 variables, with 4 values at most, and 2242 cost functions, with maximum arity 3.
    Cost function decomposition time : 0.002201 seconds.
    Reverse DAC dual bound: 11120 (+81.403%)
    Reverse DAC dual bound: 11128 (+0.072%)
    Preprocessing time: 0.079 seconds.
    233 unassigned variables, 666 values in all current domains (med. size:2, max size:4) and 1966 non-unary cost functions (med. arity:2, med. degree:16)
    Initial lower and upper bounds: [11128, 34354] 67.608%
    Tree decomposition time: 0.017 seconds.
    Problem decomposition in 89 clusters with size distribution: min: 4 median: 11 mean: 11.831 max: 23
    New solution: 26266 (0 backtracks, 59 nodes, depth 60)
    New solution: 26265 in 0.038 seconds.
    New solution: 26264 in 0.046 seconds.
    New solution: 25266 in 0.047 seconds.
    New solution: 25265 in 0.060 seconds.
    New solution: 25260 in 0.071 seconds.
    New solution: 24262 in 0.080 seconds.
    New solution: 23262 in 0.090 seconds.
    New solution: 23260 in 0.098 seconds.
    New solution: 23259 in 0.108 seconds.
    New solution: 22262 in 0.108 seconds.
    New solution: 22261 in 0.110 seconds.
    New solution: 22260 in 0.113 seconds.
    New solution: 22259 in 0.118 seconds.
    New solution: 22258 in 0.128 seconds.
    New solution: 22257 in 0.138 seconds.
    New solution: 22255 in 0.154 seconds.
    New solution: 22254 in 0.170 seconds.
    New solution: 22252 in 0.206 seconds.
    New solution: 21257 in 0.227 seconds.
    New solution: 21256 in 0.256 seconds.
    New solution: 21254 in 0.380 seconds.
    New solution: 21253 in 0.478 seconds.
    --------------------------------------------------------------------------
    MPI_ABORT was invoked on rank 1 in communicator MPI_COMM_WORLD
    with errorcode 0.
    
    NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
    You may or may not see output from other processes, depending on
    exactly when Open MPI kills them.
    --------------------------------------------------------------------------
    
    Time limit expired... Aborting...
    
  • Download a cluster decomposition file example.dec (each line corresponds to a cluster of variables; clusters may overlap). Solve a WCSP using a variable neighborhood search algorithm (UDGVNS) with a given cluster decomposition:

    toulbar2 EXAMPLES/example.wcsp.xz EXAMPLES/example.dec -vns
    
    Read 25 variables, with 5 values at most, and 63 cost functions, with maximum arity 2.
    Cost function decomposition time : 1.6e-05 seconds.
    Reverse DAC dual bound: 20 (+10.000%)
    Preprocessing time: 0.001 seconds.
    24 unassigned variables, 116 values in all current domains (med. size:5, max size:5) and 62 non-unary cost functions (med. arity:2, med. degree:5)
    Initial lower and upper bounds: [20, 64] 68.750%
    New solution: 28 (0 backtracks, 6 nodes, depth 7)
    Problem decomposition in 7 clusters with size distribution: min: 11 median: 15 mean: 15.143 max: 17
    ****** Restart 1 with 1 discrepancies and UB=28 ****** (6 nodes)
    New solution: 27 (0 backtracks, 6 nodes, depth 1)
    ****** Restart 2 with 2 discrepancies and UB=27 ****** (57 nodes)
    ****** Restart 3 with 4 discrepancies and UB=27 ****** (143 nodes)
    ****** Restart 4 with 8 discrepancies and UB=27 ****** (418 nodes)
    ****** Restart 5 with 16 discrepancies and UB=27 ****** (846 nodes)
    Optimum: 27 in 521 backtracks and 1156 nodes ( 3066 removals by DEE) and 0.039 seconds.
    end.
    
  • Solve a WCSP using a parallel variable neighborhood search algorithm (UPDGVNS) with the same cluster decomposition:

    mpirun -n 4 toulbar2 EXAMPLES/example.wcsp.xz EXAMPLES/example.dec -vns
    
    Read 25 variables, with 5 values at most, and 63 cost functions, with maximum arity 2.
    Cost function decomposition time : 2.7e-05 seconds.
    Reverse DAC dual bound: 20 (+10.000%)
    Preprocessing time: 0.002 seconds.
    24 unassigned variables, 116 values in all current domains (med. size:5, max size:5) and 62 non-unary cost functions (med. arity:2, med. degree:5)
    Initial lower and upper bounds: [20, 64] 68.750%
    Problem decomposition in 7 clusters with size distribution: min: 11 median: 15 mean: 15.143 max: 17
    New solution: 28 (0 backtracks, 7 nodes, depth 8)
    New solution: 27 in 0.001 seconds.
    Optimum: 27 in 0 backtracks and 7 nodes ( 36 removals by DEE) and 0.064 seconds.
    Total CPU time = 0.288 seconds
    Solving real-time = 0.071 seconds (not including preprocessing time)
    end.
    
  • Download file example.order. Solve a WCSP using BTD-HBFS based on a given (min-fill) reverse variable elimination ordering:

    toulbar2 EXAMPLES/example.wcsp.xz EXAMPLES/example.order -B=1
    
    Read 25 variables, with 5 values at most, and 63 cost functions, with maximum arity 2.
    Cost function decomposition time : 1.5e-05 seconds.
    Reverse DAC dual bound: 20 (+10.000%)
    Reverse DAC dual bound: 21 (+4.762%)
    Preprocessing time: 0.001 seconds.
    24 unassigned variables, 116 values in all current domains (med. size:5, max size:5) and 62 non-unary cost functions (med. arity:2, med. degree:5)
    Initial lower and upper bounds: [21, 64] 67.188%
    Tree decomposition width  : 8
    Tree decomposition height : 16
    Number of clusters        : 18
    Tree decomposition time: 0.000 seconds.
    New solution: 29 (19 backtracks, 30 nodes, depth 3)
    New solution: 28 (37 backtracks, 62 nodes, depth 3)
    Optimality gap: [22, 28] 21.429 % (37 backtracks, 62 nodes)
    Optimality gap: [23, 28] 17.857 % (309 backtracks, 629 nodes)
    New solution: 27 (328 backtracks, 672 nodes, depth 3)
    Optimality gap: [23, 27] 14.815 % (328 backtracks, 672 nodes)
    Optimality gap: [24, 27] 11.111 % (347 backtracks, 724 nodes)
    Optimality gap: [25, 27] 7.407 % (372 backtracks, 819 nodes)
    Optimality gap: [26, 27] 3.704 % (372 backtracks, 829 nodes)
    Optimality gap: [27, 27] 0.000 % (372 backtracks, 873 nodes)
    HBFS open list restarts: 0.000 % and reuse: 10.769 % of 65
    Node redundancy during HBFS: 16.724 %
    Optimum: 27 in 372 backtracks and 873 nodes ( 463 removals by DEE) and 0.020 seconds.
    end.
    
  • Download file example.cov. Solve a WCSP using BTD-HBFS based on a given explicit (min-fill path-) tree-decomposition:

    toulbar2 EXAMPLES/example.wcsp.xz EXAMPLES/example.cov -B=1
    
    Read 25 variables, with 5 values at most, and 63 cost functions, with maximum arity 2.
    Warning! Cannot apply variable elimination during search with a given tree decomposition file.
    Warning! Cannot apply functional variable elimination with a given tree decomposition file.
    Cost function decomposition time : 1.6e-05 seconds.
    Reverse DAC dual bound: 20 (+15.000%)
    Reverse DAC dual bound: 22 (+9.091%)
    Preprocessing time: 0.001 seconds.
    25 unassigned variables, 120 values in all current domains (med. size:5, max size:5) and 63 non-unary cost functions (med. arity:2, med. degree:5)
    Initial lower and upper bounds: [22, 64] 65.625%
    Tree decomposition width  : 16
    Tree decomposition height : 24
    Number of clusters         : 9
    Tree decomposition time: 0.000 seconds.
    New solution: 29 (23 backtracks, 29 nodes, depth 3)
    New solution: 28 (32 backtracks, 46 nodes, depth 3)
    Optimality gap: [23, 28] 17.857 % (37 backtracks, 58 nodes)
    New solution: 27 (61 backtracks, 122 nodes, depth 3)
    Optimality gap: [23, 27] 14.815 % (61 backtracks, 122 nodes)
    Optimality gap: [24, 27] 11.111 % (132 backtracks, 269 nodes)
    Optimality gap: [25, 27] 7.407 % (177 backtracks, 395 nodes)
    Optimality gap: [26, 27] 3.704 % (189 backtracks, 467 nodes)
    Optimality gap: [27, 27] 0.000 % (189 backtracks, 482 nodes)
    HBFS open list restarts: 0.000 % and reuse: 25.926 % of 27
    Node redundancy during HBFS: 25.519 %
    Optimum: 27 in 189 backtracks and 482 nodes ( 95 removals by DEE) and 0.010 seconds.
    end.
    
  • Download a Markov Random Field (MRF) file pedigree9.uai.xz in UAI format. Solve it using bounded variable elimination (of degree at most 8) enhanced by cost function decomposition in preprocessing [Favier2011a], followed by BTD-HBFS exploiting only small separators (fewer than four variables):

    toulbar2 EXAMPLES/pedigree9.uai.xz -O=-3 -p=-8 -B=1 -r=4
    
    Read 1118 variables, with 7 values at most, and 1118 cost functions, with maximum arity 4.
    No evidence file specified. Trying EXAMPLES/pedigree9.uai.xz.evid
    No evidence file.
    Generic variable elimination of degree 4
    Maximum degree of generic variable elimination: 4
    Cost function decomposition time : 0.003733 seconds.
    Preprocessing time: 0.073664 seconds.
    232 unassigned variables, 517 values in all current domains (med. size:2, max size:4) and 415 non-unary cost functions (med. arity:2, med. degree:6)
    Initial lower and upper bounds: [553902779, 13246577453] 95.819%
    Tree decomposition width  : 227
    Tree decomposition height : 230
    Number of clusters        : 890
    Tree decomposition time: 0.047 seconds.
    New solution: 865165767 energy: 298.395 prob: 2.564e-130 (72 backtracks, 140 nodes, depth 3)
    New solution: 844685630 energy: 296.347 prob: 1.987e-129 (128 backtracks, 254 nodes, depth 3)
    New solution: 822713386 energy: 294.149 prob: 1.789e-128 (188 backtracks, 373 nodes, depth 3)
    New solution: 809800912 energy: 292.858 prob: 6.506e-128 (327 backtracks, 665 nodes, depth 3)
    New solution: 769281277 energy: 288.806 prob: 3.742e-126 (383 backtracks, 771 nodes, depth 3)
    New solution: 755317979 energy: 287.410 prob: 1.512e-125 (714 backtracks, 1549 nodes, depth 3)
    New solution: 755129381 energy: 287.391 prob: 1.540e-125 (927 backtracks, 2038 nodes, depth 3)
    New solution: 711184893 energy: 282.997 prob: 1.248e-123 (1249 backtracks, 2685 nodes, depth 3)
    HBFS open list restarts: 0.000 % and reuse: 39.620 % of 1474
    Node redundancy during HBFS: 22.653 %
    Optimum: 711184893 energy: 282.997 prob: 1.248e-123 in 21719 backtracks and 56124 nodes ( 72435 removals by DEE) and 4.310 seconds.
    end.
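The energy and probability printed on each solution line are related: in the UAI runs shown here, the reported probability matches exp(-energy) up to the rounding of the log. This relation is inferred from the log values above, not from toulbar2's documentation:

```python
import math

# Values taken from the optimum line of the pedigree9.uai run above.
energy = 282.997
prob = 1.248e-123

# exp(-energy) reproduces the printed probability up to log rounding.
assert abs(math.exp(-energy) - prob) / prob < 0.01
```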
    
  • Download another MRF file GeomSurf-7-gm256.uai.xz. Solve it using Virtual Arc Consistency (VAC) in preprocessing [Cooper2008] and exploit VAC-based value [Cooper2010a] and variable [Trosser2020a] ordering heuristics:

    toulbar2 EXAMPLES/GeomSurf-7-gm256.uai.xz -A -V -vacint
    
    Read 787 variables, with 7 values at most, and 3527 cost functions, with maximum arity 3.
    No evidence file specified. Trying EXAMPLES/GeomSurf-7-gm256.uai.xz.evid
    No evidence file.
    Cost function decomposition time : 0.001227 seconds.
    Reverse DAC dual bound: 5879065363 energy: 1074.088 (+0.082%)
    VAC dual bound: 5906374927 energy: 1076.819 (iter:486)
    Number of VAC iterations: 726
    Number of times is VAC: 240 Number of times isvac and itThreshold > 1: 234
    Preprocessing time: 1.872 seconds.
    729 unassigned variables, 4819 values in all current domains (med. size:7, max size:7) and 3128 non-unary cost functions (med. arity:2, med. degree:6)
    Initial lower and upper bounds: [5906374927, 111615200815] 94.708%
    c 2097152 Bytes allocated for long long stack.
    New solution: 5968997522 energy: 1083.081 prob: 4.204e-471 (0 backtracks, 19 nodes, depth 21)
    Optimality gap: [5920086558, 5968997522] 0.819 % (17 backtracks, 36 nodes)
    New solution: 5922481881 energy: 1078.430 prob: 4.404e-469 (17 backtracks, 48 nodes, depth 8)
    Optimality gap: [5922481881, 5922481881] 0.000 % (21 backtracks, 52 nodes)
    Number of VAC iterations: 846
    Number of times is VAC: 360 Number of times isvac and itThreshold > 1: 351
    Node redundancy during HBFS: 11.538 %
    Optimum: 5922481881 energy: 1078.430 prob: 4.404e-469 in 21 backtracks and 52 nodes ( 2749 removals by DEE) and 1.980 seconds.
    end.
    
  • Download another MRF file 1CM1.uai.xz. Solve it by first applying an initial upper bound probing, and then using a modified variable ordering heuristic based on VAC-integrality during search [Trosser2020a]:

    toulbar2 EXAMPLES/1CM1.uai.xz -A=1000 -vacint -rasps -vacthr
    
    Read 37 variables, with 350 values at most, and 703 cost functions, with maximum arity 2.
    No evidence file specified. Trying EXAMPLES/1CM1.uai.xz.evid
    No evidence file.
    Cost function decomposition time : 0.000679 seconds.
    Reverse DAC dual bound: 103988236701 energy: -12486.138 (+0.000%)
    VAC dual bound: 103988236701 energy: -12486.138 (iter:4068)
    Number of VAC iterations: 4389
    Number of times is VAC: 189 Number of times isvac and itThreshold > 1: 186
    Threshold: 2326139858 NbAssignedVar: 0 Ratio: 0.0000000
    Threshold: 2320178814 NbAssignedVar: 0 Ratio: 0.0000000
    Threshold: 21288438 NbAssignedVar: 19 Ratio: 0.0000000
    Threshold: 11823689 NbAssignedVar: 20 Ratio: 0.0000000
    Threshold: 8187968 NbAssignedVar: 21 Ratio: 0.0000001
    Threshold: 6858739 NbAssignedVar: 22 Ratio: 0.0000001
    Threshold: 6058812 NbAssignedVar: 22 Ratio: 0.0000001
    Threshold: 5504560 NbAssignedVar: 22 Ratio: 0.0000001
    Threshold: 3972336 NbAssignedVar: 23 Ratio: 0.0000002
    Threshold: 3655432 NbAssignedVar: 23 Ratio: 0.0000002
    Threshold: 3067825 NbAssignedVar: 23 Ratio: 0.0000002
    Threshold: 2174446 NbAssignedVar: 24 Ratio: 0.0000003
    Threshold: 1641827 NbAssignedVar: 24 Ratio: 0.0000004
    Threshold: 1376213 NbAssignedVar: 24 Ratio: 0.0000005
    Threshold: 208082 NbAssignedVar: 24 Ratio: 0.0000031
    Threshold: 104041 NbAssignedVar: 26 Ratio: 0.0000068
    Threshold: 52020 NbAssignedVar: 27 Ratio: 0.0000140
    Threshold: 26010 NbAssignedVar: 27 Ratio: 0.0000281
    Threshold: 13005 NbAssignedVar: 27 Ratio: 0.0000561
    Threshold: 6502 NbAssignedVar: 27 Ratio: 0.0001122
    Threshold: 3251 NbAssignedVar: 27 Ratio: 0.0002245
    Threshold: 1625 NbAssignedVar: 27 Ratio: 0.0004491
    Threshold: 812 NbAssignedVar: 27 Ratio: 0.0008987
    Threshold: 406 NbAssignedVar: 27 Ratio: 0.0017974
    Threshold: 203 NbAssignedVar: 27 Ratio: 0.0035947
    Threshold: 101 NbAssignedVar: 27 Ratio: 0.0072250
    Threshold: 50 NbAssignedVar: 27 Ratio: 0.0145946
    Threshold: 25 NbAssignedVar: 27 Ratio: 0.0291892
    Threshold: 12 NbAssignedVar: 27 Ratio: 0.0608108
    Threshold: 6 NbAssignedVar: 27 Ratio: 0.1216216
    Threshold: 3 NbAssignedVar: 27 Ratio: 0.2432432
    Threshold: 1 NbAssignedVar: 27 Ratio: 0.7297297
    RASPS/VAC threshold: 203
    Preprocessing time: 41.340 seconds.
    37 unassigned variables, 3366 values in all current domains (med. size:38, max size:331) and 626 non-unary cost functions (med. arity:2, med. degree:35)
    Initial lower and upper bounds: [103988236701, 239074057808] 56.504%
    New solution: 104206588216 energy: -12464.303 prob: inf (0 backtracks, 3 nodes, depth 6)
    RASPS done in preprocessing (backtrack: 4 nodes: 8)
    New solution: 104174014744 energy: -12467.560 prob: inf (4 backtracks, 12 nodes, depth 6)
    Optimality gap: [104174014744, 104174014744] 0.000 % (7 backtracks, 15 nodes)
    Number of VAC iterations: 4695
    Number of times is VAC: 458 Number of times isvac and itThreshold > 1: 451
    Node redundancy during HBFS: 0.000 %
    Optimum: 104174014744 energy: -12467.560 prob: inf in 7 backtracks and 15 nodes ( 937 removals by DEE) and 41.354 seconds.
    end.
    
  • Download a weighted Max-SAT file brock200_4.clq.wcnf.xz in wcnf format. Solve it using a modified variable ordering heuristic [Schiex2014a]:

    toulbar2 EXAMPLES/brock200_4.clq.wcnf.xz -m=1
    
    c Read 200 variables, with 2 values at most, and 7011 clauses, with maximum arity 2.
    Cost function decomposition time : 0.000485 seconds.
    Reverse DAC dual bound: 91 (+86.813%)
    Reverse DAC dual bound: 92 (+1.087%)
    Preprocessing time: 0.040 seconds.
    200 unassigned variables, 400 values in all current domains (med. size:2, max size:2) and 6811 non-unary cost functions (med. arity:2, med. degree:68)
    Initial lower and upper bounds: [92, 200] 54.000%
    New solution: 189 (0 backtracks, 9 nodes, depth 11)
    New solution: 188 (45 backtracks, 143 nodes, depth 37)
    New solution: 187 (155 backtracks, 473 nodes, depth 47)
    New solution: 186 (892 backtracks, 2247 nodes, depth 19)
    New solution: 185 (3874 backtracks, 8393 nodes, depth 70)
    New solution: 184 (29475 backtracks, 62393 nodes, depth 40)
    New solution: 183 (221446 backtracks, 522724 nodes, depth 11)
    Node redundancy during HBFS: 37.221 %
    Optimum: 183 in 281307 backtracks and 896184 nodes ( 9478 removals by DEE) and 25.977 seconds.
    end.
    
  • Download another WCSP file latin4.wcsp.xz. Count the number of feasible solutions:

    toulbar2 EXAMPLES/latin4.wcsp.xz -a
    
    Read 16 variables, with 4 values at most, and 24 cost functions, with maximum arity 4.
    Cost function decomposition time : 2e-06 seconds.
    Reverse DAC dual bound: 48 (+2.083%)
    Preprocessing time: 0.006 seconds.
    16 unassigned variables, 64 values in all current domains (med. size:4, max size:4) and 8 non-unary cost functions (med. arity:4, med. degree:6)
    Initial lower and upper bounds: [48, 1000] 95.200%
    Optimality gap: [49, 1000] 95.100 % (17 backtracks, 41 nodes)
    Optimality gap: [58, 1000] 94.200 % (355 backtracks, 812 nodes)
    Optimality gap: [72, 1000] 92.800 % (575 backtracks, 1309 nodes)
    Optimality gap: [1000, 1000] 0.000 % (575 backtracks, 1318 nodes)
    Number of solutions    : =  576
    Time                   :    0.306 seconds
    ... in 575 backtracks and 1318 nodes
    end.
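The count of 576 reported above matches the known number of 4x4 Latin squares, suggesting that latin4.wcsp encodes a 4x4 Latin square problem (an assumption based on the file name and the count, not on the file contents). A brute-force cross-check:

```python
from itertools import permutations

# Enumerate all 4x4 grids whose rows are permutations of {0,1,2,3}
# and whose columns contain no repeated value: exactly the Latin squares.
rows = list(permutations(range(4)))
count = 0
for r0 in rows:
    for r1 in rows:
        if any(r0[c] == r1[c] for c in range(4)):
            continue
        for r2 in rows:
            if any(r2[c] in (r0[c], r1[c]) for c in range(4)):
                continue
            count += sum(
                all(r3[c] not in (r0[c], r1[c], r2[c]) for c in range(4))
                for r3 in rows
            )
print(count)  # 576
```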
    
  • Find a greedy sequence of at most 20 diverse solutions with Hamming distance greater than 12 between any pair of solutions:

    toulbar2 EXAMPLES/latin4.wcsp.xz -a=20 -div=12
    
    Read 16 variables, with 4 values at most, and 24 cost functions, with maximum arity 4.
    Cost function decomposition time : 3e-06 seconds.
    Reverse DAC dual bound: 48 (+2.083%)
    Preprocessing time: 0.009 seconds.
    320 unassigned variables, 7968 values in all current domains (med. size:26, max size:26) and 8 non-unary cost functions (med. arity:4, med. degree:0)
    Initial lower and upper bounds: [48, 1000] 95.200%
    +++++++++ Search for solution 1 +++++++++
    New solution: 49 (0 backtracks, 7 nodes, depth 10)
    New solution: 48 (2 backtracks, 11 nodes, depth 3)
    Node redundancy during HBFS: 18.182 %
    Optimum: 48 in 2 backtracks and 11 nodes ( 0 removals by DEE) and 0.017 seconds.
    +++++++++ Search for solution 2 +++++++++
    New solution: 52 (2 backtracks, 879 nodes, depth 871)
    Optimality gap: [50, 49] -2.000 % (5 backtracks, 882 nodes)
    New solution: 51 (5 backtracks, 1748 nodes, depth 868)
    Optimality gap: [51, 49] -3.922 % (6 backtracks, 1749 nodes)
    Node redundancy during HBFS: 0.172 %
    Optimum: 51 in 6 backtracks and 1749 nodes ( 0 removals by DEE) and 0.046 seconds.
    +++++++++ Search for solution 3 +++++++++
    New solution: 74 (6 backtracks, 2569 nodes, depth 823)
    New solution: 62 (14 backtracks, 3407 nodes, depth 824)
    New solution: 58 (21 backtracks, 4245 nodes, depth 821)
    Optimality gap: [53, 49] -7.547 % (29 backtracks, 4270 nodes)
    Optimality gap: [56, 49] -12.500 % (30 backtracks, 4276 nodes)
    Optimality gap: [57, 49] -14.035 % (31 backtracks, 4292 nodes)
    New solution: 57 (31 backtracks, 5114 nodes, depth 819)
    Node redundancy during HBFS: 1.017 %
    Optimum: 57 in 31 backtracks and 5114 nodes ( 0 removals by DEE) and 0.146 seconds.
    +++++++++ Search for solution 4 +++++++++
    New solution: 73 (44 backtracks, 5923 nodes, depth 773)
    New solution: 72 (46 backtracks, 6702 nodes, depth 778)
    New solution: 58 (53 backtracks, 7485 nodes, depth 773)
    Optimality gap: [58, 49] -15.517 % (70 backtracks, 7584 nodes)
    Node redundancy during HBFS: 1.846 %
    Optimum: 58 in 70 backtracks and 7584 nodes ( 0 removals by DEE) and 0.256 seconds.
    +++++++++ Search for solution 5 +++++++++
    New solution: 80 (70 backtracks, 8307 nodes, depth 726)
    New solution: 74 (100 backtracks, 9139 nodes, depth 728)
    New solution: 66 (112 backtracks, 9896 nodes, depth 724)
    New solution: 64 (116 backtracks, 10636 nodes, depth 725)
    New solution: 58 (171 backtracks, 11654 nodes, depth 725)
    Node redundancy during HBFS: 3.484 %
    Optimum: 58 in 171 backtracks and 11654 nodes ( 0 removals by DEE) and 0.474 seconds.
    +++++++++ Search for solution 6 +++++++++
    New solution: 79 (178 backtracks, 12347 nodes, depth 677)
    New solution: 76 (207 backtracks, 13102 nodes, depth 677)
    New solution: 65 (212 backtracks, 13804 nodes, depth 680)
    Optimality gap: [59, 49] -16.949 % (251 backtracks, 14053 nodes)
    Optimality gap: [60, 49] -18.333 % (256 backtracks, 14093 nodes)
    Optimality gap: [61, 49] -19.672 % (259 backtracks, 14126 nodes)
    Optimality gap: [62, 49] -20.968 % (260 backtracks, 14165 nodes)
    New solution: 62 (260 backtracks, 14849 nodes, depth 675)
    Node redundancy during HBFS: 4.936 %
    Optimum: 62 in 260 backtracks and 14849 nodes ( 0 removals by DEE) and 0.688 seconds.
    +++++++++ Search for solution 7 +++++++++
    c 2097152 Bytes allocated for long long stack.
    New solution: 77 (267 backtracks, 15495 nodes, depth 630)
    New solution: 76 (283 backtracks, 16160 nodes, depth 629)
    New solution: 75 (334 backtracks, 16982 nodes, depth 628)
    New solution: 68 (335 backtracks, 17615 nodes, depth 628)
    Optimality gap: [64, 49] -23.438 % (383 backtracks, 17946 nodes)
    New solution: 65 (383 backtracks, 18577 nodes, depth 627)
    Optimality gap: [65, 49] -24.615 % (383 backtracks, 18581 nodes)
    Node redundancy during HBFS: 5.915 %
    Optimum: 65 in 383 backtracks and 18581 nodes ( 0 removals by DEE) and 0.963 seconds.
    +++++++++ Search for solution 8 +++++++++
    New solution: 81 (383 backtracks, 19161 nodes, depth 583)
    New solution: 80 (425 backtracks, 19865 nodes, depth 583)
    New solution: 69 (471 backtracks, 20646 nodes, depth 585)
    New solution: 68 (479 backtracks, 21273 nodes, depth 581)
    New solution: 65 (483 backtracks, 21881 nodes, depth 580)
    Node redundancy during HBFS: 6.014 %
    Optimum: 65 in 483 backtracks and 21881 nodes ( 0 removals by DEE) and 1.175 seconds.
    +++++++++ Search for solution 9 +++++++++
    New solution: 68 (483 backtracks, 22413 nodes, depth 535)
    Optimality gap: [66, 49] -25.758 % (581 backtracks, 22902 nodes)
    New solution: 66 (581 backtracks, 23434 nodes, depth 531)
    Node redundancy during HBFS: 6.900 %
    Optimum: 66 in 581 backtracks and 23434 nodes ( 0 removals by DEE) and 1.379 seconds.
    +++++++++ Search for solution 10 +++++++++
    New solution: 68 (619 backtracks, 24035 nodes, depth 484)
    Optimality gap: [67, 49] -26.866 % (686 backtracks, 24436 nodes)
    Optimality gap: [68, 49] -27.941 % (686 backtracks, 24444 nodes)
    Node redundancy during HBFS: 7.924 %
    Optimum: 68 in 686 backtracks and 24444 nodes ( 0 removals by DEE) and 1.597 seconds.
    +++++++++ Search for solution 11 +++++++++
    New solution: 72 (714 backtracks, 24958 nodes, depth 436)
    New solution: 68 (739 backtracks, 25534 nodes, depth 436)
    Node redundancy during HBFS: 8.052 %
    Optimum: 68 in 739 backtracks and 25534 nodes ( 0 removals by DEE) and 1.712 seconds.
    +++++++++ Search for solution 12 +++++++++
    c 4194304 Bytes allocated for long long stack.
    New solution: 81 (770 backtracks, 26006 nodes, depth 389)
    New solution: 78 (772 backtracks, 26399 nodes, depth 389)
    New solution: 77 (779 backtracks, 26818 nodes, depth 389)
    New solution: 76 (809 backtracks, 27354 nodes, depth 390)
    New solution: 72 (858 backtracks, 28065 nodes, depth 389)
    Optimality gap: [69, 49] -28.986 % (863 backtracks, 28122 nodes)
    Optimality gap: [70, 49] -30.000 % (864 backtracks, 28130 nodes)
    Optimality gap: [71, 49] -30.986 % (864 backtracks, 28140 nodes)
    New solution: 71 (864 backtracks, 28532 nodes, depth 387)
    Node redundancy during HBFS: 8.762 %
    Optimum: 71 in 864 backtracks and 28532 nodes ( 0 removals by DEE) and 1.981 seconds.
    +++++++++ Search for solution 13 +++++++++
    New solution: 76 (898 backtracks, 28974 nodes, depth 343)
    New solution: 72 (906 backtracks, 29334 nodes, depth 340)
    Optimality gap: [72, 49] -31.944 % (979 backtracks, 29782 nodes)
    Node redundancy during HBFS: 9.563 %
    Optimum: 72 in 979 backtracks and 29782 nodes ( 0 removals by DEE) and 2.212 seconds.
    +++++++++ Search for solution 14 +++++++++
    New solution: 86 (1062 backtracks, 30429 nodes, depth 292)
    New solution: 80 (1078 backtracks, 30768 nodes, depth 292)
    New solution: 74 (1085 backtracks, 31080 nodes, depth 292)
    Optimality gap: [74, 49] -33.784 % (1102 backtracks, 31203 nodes)
    Node redundancy during HBFS: 10.124 %
    Optimum: 74 in 1102 backtracks and 31203 nodes ( 0 removals by DEE) and 2.441 seconds.
    +++++++++ Search for solution 15 +++++++++
    New solution: 79 (1103 backtracks, 31448 nodes, depth 246)
    New solution: 78 (1122 backtracks, 31726 nodes, depth 246)
    New solution: 76 (1183 backtracks, 32087 nodes, depth 245)
    Optimality gap: [76, 49] -35.526 % (1231 backtracks, 32181 nodes)
    Node redundancy during HBFS: 9.816 %
    Optimum: 76 in 1231 backtracks and 32181 nodes ( 0 removals by DEE) and 2.603 seconds.
    +++++++++ Search for solution 16 +++++++++
    New solution: 80 (1253 backtracks, 32419 nodes, depth 197)
    New solution: 79 (1315 backtracks, 32735 nodes, depth 197)
    New solution: 78 (1336 backtracks, 32968 nodes, depth 196)
    Optimality gap: [78, 49] -37.179 % (1349 backtracks, 32993 nodes)
    Node redundancy during HBFS: 9.575 %
    Optimum: 78 in 1349 backtracks and 32993 nodes ( 0 removals by DEE) and 2.760 seconds.
    +++++++++ Search for solution 17 +++++++++
    New solution: 80 (1349 backtracks, 33141 nodes, depth 151)
    New solution: 79 (1374 backtracks, 33334 nodes, depth 149)
    Optimality gap: [79, 49] -37.975 % (1474 backtracks, 33532 nodes)
    Node redundancy during HBFS: 9.421 %
    Optimum: 79 in 1474 backtracks and 33532 nodes ( 0 removals by DEE) and 2.924 seconds.
    +++++++++ Search for solution 18 +++++++++
    New solution: 80 (1546 backtracks, 33775 nodes, depth 102)
    Optimality gap: [80, 49] -38.750 % (1592 backtracks, 33864 nodes)
    Node redundancy during HBFS: 9.328 %
    Optimum: 80 in 1592 backtracks and 33864 nodes ( 0 removals by DEE) and 3.085 seconds.
    +++++++++ Search for solution 19 +++++++++
    New solution: 80 (1687 backtracks, 34105 nodes, depth 54)
    Node redundancy during HBFS: 9.263 %
    Optimum: 80 in 1687 backtracks and 34105 nodes ( 0 removals by DEE) and 3.219 seconds.
    +++++++++ Search for solution 20 +++++++++
    Optimality gap: [1000, 49] -95.100 % (1809 backtracks, 34349 nodes)
    Node redundancy during HBFS: 9.197 %
    No solution in 1809 backtracks and 34349 nodes ( 0 removals by DEE) and 3.377 seconds.
    end.
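In the run above, each successive solution must differ from all previous ones in more than 12 of the 16 variables. The diversity measure is the plain Hamming distance between assignments, sketched here on two hypothetical assignments (illustrative values, not taken from the actual solutions):

```python
def hamming(a, b):
    """Number of variables assigned differently in two solutions."""
    return sum(x != y for x, y in zip(a, b))

# Two hypothetical assignments of the 16 latin4 variables.
s1 = [0, 1, 2, 3] * 4
s2 = [1, 2, 3, 0] * 4
print(hamming(s1, s2))  # 16, so this pair would satisfy -div=12
```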
    
  • Download a crisp CSP file GEOM40_6.wcsp.xz (initial upper bound equal to 1). Count the number of solutions using #BTD [Favier2009a] with a min-fill variable ordering (warning: BTD cannot be used to find all solutions in optimization):

    toulbar2 EXAMPLES/GEOM40_6.wcsp.xz -O=-3 -a -B=1 -ub=1 -hbfs:
    
    Read 40 variables, with 6 values at most, and 78 cost functions, with maximum arity 2.
    Cost function decomposition time : 1.1e-05 seconds.
    Preprocessing time: 0.001019 seconds.
    40 unassigned variables, 240 values in all current domains (med. size:6, max size:6) and 78 non-unary cost functions (med. arity:2, med. degree:4)
    Initial lower and upper bounds: [0, 1] 100.000%
    Tree decomposition width  : 5
    Tree decomposition height : 20
    Number of clusters        : 29
    Tree decomposition time: 0.000 seconds.
    Number of solutions    : =  411110802705928379432960
    Number of #goods       :    3993
    Number of used #goods  :    17190
    Size of sep            :    4
    Time                   :    0.055 seconds
    ... in 13689 backtracks and 27378 nodes
    end.
    
  • Get a quick approximation of the number of solutions of a CSP with Approx#BTD [Favier2009a]:

    toulbar2 EXAMPLES/GEOM40_6.wcsp.xz -O=-3 -a -B=1 -D -ub=1 -hbfs:
    
    Read 40 variables, with 6 values at most, and 78 cost functions, with maximum arity 2.
    Cost function decomposition time : 9e-06 seconds.
    Preprocessing time: 0.000997 seconds.
    40 unassigned variables, 240 values in all current domains (med. size:6, max size:6) and 78 non-unary cost functions (med. arity:2, med. degree:4)
    Initial lower and upper bounds: [0, 1] 100.000%
    
    part 1 : 40 variables and 71 constraints (really added)
    part 2 : 10 variables and 7 constraints (really added)
    --> number of parts : 2
    --> time : 0.000 seconds.
    
    Tree decomposition width  : 5
    Tree decomposition height : 17
    Number of clusters        : 33
    Tree decomposition time: 0.001 seconds.
    
    Cartesian product                  :    13367494538843734031554962259968
    Upper bound of number of solutions : <= 1719926784000000000000000
    Number of solutions    : ~= 480000000000000000000000
    Number of #goods       :    468
    Number of used #goods  :    4788
    Size of sep            :    3
    Time                   :    0.011 seconds
    ... in 3738 backtracks and 7476 nodes
    end.
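The Cartesian product line above is the size of the full search space, i.e. the product of all domain sizes, here 6^40 for 40 six-valued variables. The printed figure appears to be a floating-point approximation of the exact count (an inference from the digits, not documented solver behavior):

```python
# 40 variables with 6 values each (GEOM40_6): exact search-space size.
exact = 6 ** 40

# The value printed by toulbar2 agrees with 6**40 up to floating-point
# rounding, but is not the exact integer.
printed = 13367494538843734031554962259968
assert abs(exact - printed) / exact < 1e-15
assert exact != printed
```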
    

Command line options

If you just execute:

toulbar2

toulbar2 will print its (long) list of optional parameters, which you can also find in the ‘Available options’ part of the ToulBar2 Help Message.

To deactivate a default command line option, just use the command-line option followed by :. For example:

toulbar2 -dee: <file>

will disable the default Dead End Elimination [Givry2013a] (aka Soft Neighborhood Substitutability) preprocessing.

We now describe toulbar2’s optional parameters in more detail.

General control

-agap=[decimal]

stops the search when the absolute optimality gap drops below the given value (provides a guaranteed approximation) (default value is 0)

-rgap=[double]

stops the search when the relative optimality gap drops below the given value (provides a guaranteed approximation) (default value is 0)
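The gap percentages printed throughout the example logs are consistent with the relative gap (ub - lb) / ub; the following check reproduces two lines from the logs (the formula is inferred from the output, not from the solver's source):

```python
def relative_gap(lb, ub):
    # Relative optimality gap, in percent, as it appears in the logs.
    return 100.0 * (ub - lb) / ub

# "Initial lower and upper bounds: [20, 64] 68.750%"
assert round(relative_gap(20, 64), 3) == 68.750
# "Optimality gap: [22, 28] 21.429 % (37 backtracks, 62 nodes)"
assert round(relative_gap(22, 28), 3) == 21.429
```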

-a=[integer]

finds at most a given number of solutions with a cost strictly lower than the initial upper bound and stops, or if no integer is given, finds all solutions (or counts the number of zero-cost satisfiable solutions in conjunction with BTD)

-D

approximates the satisfiable solution count with BTD

-logz

computes log of probability of evidence (i.e. log partition function or log(Z) or PR task) for graphical models only (problem file extension .uai)

-sigma=[float]

adds a (truncated) random zero-centered Gaussian noise, for graphical models only (problem file extension .uai)

-timer=[integer]

gives a CPU time limit in seconds. toulbar2 will stop after the specified CPU time has been consumed. The limit applies to CPU user time, not wall-clock time.

-bt=[integer]

gives a limit on the number of backtracks (\(9223372036854775807\) by default)
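The default backtrack limit is simply the largest signed 64-bit integer, i.e. effectively no limit:

```python
# Default value of -bt: the maximum value of a signed 64-bit integer.
assert 9223372036854775807 == 2 ** 63 - 1
```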

-seed=[integer]

sets the random seed to the given non-negative value, or uses the current time if a negative value is given (default value is 1)

Preprocessing

-x=[(,i[\(=\#<>\)]a)*]

performs an elementary operation (’\(=\)’:assign, ‘\(\#\)’:remove, ‘\(<\)’:decrease, ‘\(>\)’:increase) with value a on variable of index i (multiple operations are separated by a comma and no space) (without any argument, it assumes a complete assignment – used as initial upper bound and as value heuristic – is read from default file “sol”, or, without the option -x, given as input filename with “.sol” extension)

-nopre

deactivates all preprocessing options (equivalent to -e: -p: -t: -f: -dec: -n: -mst: -dee: -trws:)

-p=[integer]

preprocessing only: general variable elimination of degree less than or equal to the given value (default value is -1)

-t=[integer]

preprocessing only: simulates restricted path consistency by adding ternary cost functions on triangles of binary cost functions within a given maximum space limit (in MB)

-f=[integer]

preprocessing only: variable elimination of functional (f=1) (resp. bijective (f=2)) variables (default value is 1)

-dec

preprocessing only: pairwise decomposition [Favier2011a] of cost functions with arity \(\geq 3\) into smaller arity cost functions (default option)

-n=[integer]

preprocessing only: projects n-ary cost functions on all binary cost functions if n is lower than the given value (default value is 10). See [Favier2011a].

-amo

automatically detects at-most-one constraints and adds them to existing knapsack constraints (positive value) and/or directly in the cost function network up to a given absolute number (non-zero value except -1)

-mst

finds a maximum spanning tree ordering for DAC

-S

preprocessing only: performs singleton consistency (only in conjunction with option -A)

-M=[integer]

preprocessing only: applies the Min Sum Diffusion algorithm (default is inactivated, with a number of iterations of 0). See [Cooper2010a].

-trws=[float]

preprocessing only: enforces TRW-S until a given precision is reached (default value is 0.001). See Kolmogorov 2006.

--trws-order

replaces DAC order by Kolmogorov’s TRW-S order.

--trws-n-iters=[integer]

enforces at most N iterations of TRW-S (default value is 1000).

--trws-n-iters-no-change=[integer]

stops TRW-S when N iterations did not change the lower bound up to the given precision (default value is 5, -1=never).

--trws-n-iters-compute-ub=[integer]

computes a basic upper bound every N steps during TRW-S (default value is 100)

-hve=[integer]

hidden variable encoding with a given limit on the maximum domain size of hidden variables (default value is 0). A negative size limit means restoring the original encoding after preprocessing while keeping the improved dual bound. See also option -n to limit the maximum arity of dualized n-ary cost functions.

-pwc=[integer]

pairwise consistency by hidden variable encoding plus intersection constraints, each one bounded by a given maximum space limit (in MB) (default value is 0). A negative size limit means restoring the original encoding after preprocessing while keeping the improved dual bound. See also options -minqual, -hve to limit the domain size of hidden variables, and -n to limit the maximum arity of dualized n-ary cost functions.

-minqual

finds a minimal intersection constraint graph to achieve pairwise consistency (combine with option -pwc) (default option)

Initial upper bounding

-l=[integer]

limited discrepancy search [Ginsberg1995], use a negative value to stop the search after the given absolute number of discrepancies has been explored (discrepancy bound = 4 by default)

-L=[integer]

randomized (quasi-random variable ordering) search with restart (maximum number of nodes/VNS restarts = 10000 by default)

-i=[“string”]

initial upper bound found by INCOP local search solver [idwalk:cp04]. The string parameter is optional, using “0 1 3 idwa 100000 cv v 0 200 1 0 0” by default with the following meaning: stoppinglowerbound randomseed nbiterations method nbmoves neighborhoodchoice neighborhoodchoice2 minnbneighbors maxnbneighbors neighborhoodchoice3 autotuning tracemode.

-pils=[“string”]

initial upper bound found by PILS local search solver. The string parameter is optional, using “3 0 0.333 100 500 10000 0.1 0.5 0.1 0.1” by default with the following meaning: nbruns perturb_mode perturb_strength flatMaxIter nbEvalHC nbEvalMax strengthMin strengthMax incrFactor decrFactor.

-lrbcd=[“string”]

initial upper bound found by LR-BCD local search solver. The string parameter is optional, using “5 -2 3” by default with the following meaning: maxiter rank nbroundings (a negative rank means dividing the theoretical rank by the given absolute value).

-x=[(,i[\(=\#<>\)]a)*]

performs an elementary operation (’\(=\)’:assign, ‘\(\#\)’:remove, ‘\(<\)’:decrease, ‘\(>\)’:increase) with value a on the variable of index i (multiple operations are separated by a comma and no space). Without any argument, a complete assignment, used as initial upper bound and as a value heuristic, is read from the default file “sol” taken as a certificate, or given directly as an additional input filename with the “.sol” extension (and without -x).

-ub=[decimal]

gives an initial upper bound

-rasps=[integer]

VAC-based upper bound probing heuristic (0: disable, >0: max. nb. of backtracks, 1000 if no integer given) (default value is 0)

-raspslds=[integer]

VAC-based upper bound probing heuristic using LDS instead of DFS (0: DFS, >0: max. discrepancy) (default value is 0)

-raspsdeg=[integer]

automatic threshold cost value selection for probing heuristic (default value is 10 degrees)

-raspsini

reset weighted degree variable ordering heuristic after doing upper bound probing

Tree search algorithms and tree decomposition selection

-hbfs=[integer]

hybrid best-first search [Katsirelos2015a], restarting from the root after a given number of backtracks (default value is 16384)

-hbfsmin=[integer]

hybrid best-first search compromise between BFS and DFS minimum node redundancy threshold (alpha percentage, default value is 5%)

-hbfsmax=[integer]

hybrid best-first search compromise between BFS and DFS maximum node redundancy threshold (beta percentage, default value is 10%)

-open=[integer]

hybrid best-first search limit on the number of stored open nodes (default value is -1, i.e., no limit)

-sopen=[integer]

number of visited open nodes before sorting the remaining open nodes based on weighted degree heuristics (double this limit for the next sorting) (see also option -q) (default value is 0, i.e., no sorting)

-burst

in parallel HBFS, workers send their solutions and open nodes as soon as possible (by default). To use a parallel version of HBFS, compile with the MPI option (cmake -DMPI=ON .) and run “mpirun -n [NbOfProcess] toulbar2 problem.wcsp”.

-eps=[integer|filename]

Embarrassingly parallel search mode. It outputs a given number of open nodes in -x format and exits (default value is 0). See ./misc/script/eps.sh to run them. Use this option twice to specify the output filename.

-B=[integer]

(0) HBFS, (1) BTD-HBFS [Schiex2006a] [Katsirelos2015a], (2) RDS-BTD [Sanchez2009a], (3) RDS-BTD with path decomposition instead of tree decomposition [Sanchez2009a] (default value is 0)

-O=[filename]

reads either a reverse variable elimination order (given by a list of variable indexes) from a file in order to build a tree decomposition (if BTD-like and/or variable elimination methods are used) or reads a valid tree decomposition directly (given by a list of clusters in topological order of a rooted forest, each line contains a cluster number, followed by a cluster parent number with -1 for the first/root(s) cluster(s), followed by a list of variable indexes). It is also used as a DAC ordering.

-O=[negative integer]

build a tree decomposition (if BTD-like and/or variable elimination methods are used) and also a compatible DAC ordering using

  • (-1) maximum cardinality search ordering,

  • (-2) minimum degree ordering,

  • (-3) minimum fill-in ordering,

  • (-4) maximum spanning tree ordering (see -mst),

  • (-5) reverse Cuthill-Mckee ordering,

  • (-6) approximate minimum degree ordering,

  • (-7) default file ordering,

  • (-8) lexicographic ordering of variable names

If not specified, then use the variable order in which variables appear in the problem file.
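
As an illustration of the first heuristic above, maximum cardinality search (-O=-1) repeatedly numbers the vertex of the constraint graph having the most already-numbered neighbors. A minimal Python sketch of this generic algorithm (not toulbar2’s implementation; the adjacency-dict representation is illustrative):

```python
# Maximum cardinality search: repeatedly pick the unnumbered vertex with
# the most already-numbered neighbors. Illustrative sketch, not toulbar2 code.
def mcs_order(adj):
    """Return a variable ordering for the graph given as an adjacency dict."""
    order, numbered = [], set()
    weight = {v: 0 for v in adj}       # numbered-neighbor counts
    while len(order) < len(adj):
        v = max((u for u in adj if u not in numbered), key=lambda u: weight[u])
        order.append(v)
        numbered.add(v)
        for w in adj[v]:
            if w not in numbered:
                weight[w] += 1
    return order

# toy constraint graph: a path 0 - 1 - 2 - 3
print(mcs_order({0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}))
```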

-root=[integer]

root cluster heuristic (0:largest, 1:max. size/(height-size), 2:min. size/(height-size), 3:min. height) (default value is 0)

-minheight

minimizes cluster tree height when searching for the root cluster (can be slow to perform)

-j=[integer]

splits large clusters into a chain of smaller embedded clusters with a number of proper variables less than this number (use options “-B=3 -j=1 -svo -k=1” for pure RDS, use value 0 for no splitting) (default value is 0).

-r=[integer]

limit on the maximum cluster separator size (merge cluster with its father otherwise, use a negative value for no limit) (default value is -1)

-X=[integer]

limit on the minimum number of proper variables in a cluster (merge cluster with its father otherwise, use a zero for no limit) (default value is 0)

-E=[float]

merges a leaf cluster with its father if the local treewidth is small (in conjunction with option “-e” and a positive threshold value) or if the ratio of the number of separator variables to the number of cluster variables is above a given threshold (in conjunction with option -vns) (default value is 0)

-F=[integer]

merges clusters automatically to give more freedom to variable ordering heuristic in BTD-HBFS (-1: no merging, positive value: maximum iteration value for trying to solve the same subtree given its separator assignment before considering it as unmerged) (default value is -1)

-R=[integer]

choice for a specific root cluster number

-I=[integer]

choice for solving only a particular rooted cluster subtree (with RDS-BTD only)

Variable neighborhood search algorithms

-vns

unified decomposition guided variable neighborhood search [Ouali2017] (UDGVNS). A problem decomposition into clusters can be given as *.dec, *.cov, or *.order input files or using tree decomposition options such as -O. For a parallel version (UPDGVNS), use “mpirun -n [NbOfProcess] toulbar2 -vns problem.wcsp”.

-vnsini=[integer]

initial solution for VNS-like methods found: (-1) at random, (-2) min domain values, (-3) max domain values, (-4) first solution found by a complete method, (k=0 or more) tree search with k discrepancy max (-4 by default)

-ldsmin=[integer]

minimum discrepancy for VNS-like methods (1 by default)

-ldsmax=[integer]

maximum discrepancy for VNS-like methods (number of problem variables multiplied by maximum domain size -1 by default)

-ldsinc=[integer]

discrepancy increment strategy for VNS-like methods using (1) Add1, (2) Mult2, (3) Luby operator (2 by default)
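
The Luby operator (strategy 3) grows the discrepancy along the sequence 1, 1, 2, 1, 1, 2, 4, … A minimal Python sketch of this standard sequence (the usual closed-form recursion, not toulbar2’s own code):

```python
# The standard Luby sequence 1, 1, 2, 1, 1, 2, 4, 1, 1, 2, ... used by the
# Luby increment strategy; classic recursion, not toulbar2's implementation.
def luby(i):
    """Return the i-th term of the Luby sequence (i >= 1)."""
    k = i.bit_length()
    if i == (1 << k) - 1:                # i = 2^k - 1: term is 2^(k-1)
        return 1 << (k - 1)
    return luby(i - (1 << (k - 1)) + 1)  # otherwise recurse on the suffix

print([luby(i) for i in range(1, 8)])
```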

-kmin=[integer]

minimum neighborhood size for VNS-like methods (4 by default)

-kmax=[integer]

maximum neighborhood size for VNS-like methods (number of problem variables by default)

-kinc=[integer]

neighborhood size increment strategy for VNS-like methods using: (1) Add1, (2) Mult2, (3) Luby operator (4) Add1/Jump (4 by default)

-best=[integer]

stop DFBB and VNS-like methods if a better solution is found (default value is 0)

Node processing & bounding options

-e=[integer]

performs “on the fly” variable elimination of variables with small degree (less than or equal to a specified value, default is 3, creating at most ternary cost functions). See [Larrosa2000].

-k=[integer]

soft local consistency level (NC [Larrosa2002] with Strong NIC for global cost functions=0 [LL2009], (G)AC=1 [Schiex2000b] [Larrosa2002], D(G)AC=2 [CooperFCSP], FD(G)AC=3 [Larrosa2003], (weak) ED(G)AC=4 [Heras2005] [LL2010]) (default value is 4). See also [Cooper2010a] [LL2012asa].

-A=[integer]

enforces VAC [Cooper2008] at each search node with a search depth less than a given value (default value is 0)

-V

VAC-based value ordering heuristic (default option)

-T=[decimal]

threshold cost value for VAC (any decimal cost below this threshold is considered as null by VAC, thus speeding up its convergence; default value is 1, except for the cfn format where it is equal to the decimal cost precision, e.g. 0.001 if 3 digits of precision)

-P=[decimal]

threshold cost value for VAC during the preprocessing phase only (default value is 1, except for the cfn format where it is equal to the decimal cost precision, e.g. 0.001 if 3 digits of precision)

-C=[float]

multiplies all costs internally by this number when loading the problem (cannot be done with cfn format and probabilistic graphical models in uai/LG formats) (default value is 1)

-vaclin

automatic threshold cost value selection for VAC during search (must be combined with option -A)

-vacthr

automatic threshold cost value selection for VAC during search (must be combined with option -A)

-dee=[integer]

restricted dead-end elimination [Givry2013a] (value pruning by dominance rule from EAC value (dee \(>=1\) and dee \(<=3\) )) and soft neighborhood substitutability (in preprocessing (dee=2 or dee=4) or during search (dee=3)) (default value is 1)

-o

ensures an optimal worst-case time complexity of DAC and EAC (can be slower in practice)

-kpdp=[integer]

solves knapsack constraints using dynamic programming (-2: never, -1: only in preprocessing, 0: at every search node, >0: after a given number of nodes) (default value is -2)

Branching, variable and value ordering

-svo

searches using a static variable ordering heuristic. The variable order used is the same as the DAC order.

-b

searches using binary branching (by default) instead of n-ary branching. Uses binary branching for interval domains and small domains and dichotomic branching for large enumerated domains (see option -d).

-c

searches using binary branching with last conflict backjumping variable ordering heuristic [Lecoutre2009].

-q=[integer]

use weighted degree variable ordering heuristic [boussemart2004] if the number of cost functions is less than the given value (default value is 1000000). A negative number will disconnect weighted degrees in embedded WeightedCSP constraints.

-var=[integer]

searches by branching only on the first [given value] decision variables, assuming the remaining variables are intermediate variables that will be completely assigned by the decision variables (use a zero if all variables are decision variables, default value is 0)

-m=[integer]

use a variable ordering heuristic that selects first variables such that the sum of the mean (m=1) or median (m=2) cost of all incident cost functions is maximum [Schiex2014a] (in conjunction with weighted degree heuristic -q) (default value is 0: unused).

-d=[integer]

searches using dichotomic branching. The default d=1 splits domains in the middle of the domain range, while d=2 splits domains in the middle of the sorted domain based on unary costs.
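
The two splits can be sketched as follows in Python (illustrative function names, not toulbar2’s implementation):

```python
# The two dichotomic splits of option -d; illustrative sketch, not toulbar2 code.
def split_middle(domain):
    """d=1: split a sorted domain at the middle of its value range."""
    mid = (domain[0] + domain[-1]) // 2
    return [v for v in domain if v <= mid], [v for v in domain if v > mid]

def split_by_unary_cost(domain, unary_cost):
    """d=2: sort values by increasing unary cost, then split the list in half."""
    ordered = sorted(domain, key=lambda v: unary_cost[v])
    half = len(ordered) // 2
    return ordered[:half], ordered[half:]

print(split_middle([0, 1, 2, 3]))          # ([0, 1], [2, 3])
print(split_by_unary_cost([0, 1, 2, 3], {0: 5, 1: 0, 2: 3, 3: 1}))
```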

-sortd

sorts domains in preprocessing based on increasing unary costs (works only for binary WCSPs).

-sortc

sorts constraints in preprocessing based on lexicographic ordering (1), decreasing DAC ordering (2 - default option), decreasing constraint tightness (3), DAC then tightness (4), tightness then DAC (5), randomly (6), DAC with special knapsack order (7), increasing arity (8), increasing arity then DAC (9), or the opposite order if using a negative value.

-solr

solution-based phase saving (reuse last found solution as preferred value assignment in the value ordering heuristic) (default option).

-vacint

VAC-integrality/Full-EAC variable ordering heuristic (can be combined with option -A)

-bisupport=[float]

in bi-objective optimization with the second objective encapsulated by a bounding constraint (see WeightedCSPConstraint), the value heuristic chooses between both EAC supports of first (main) and second objectives by minimum weighted regret (if parameter is non-negative, it is used as the weight for the second objective) or always chooses the EAC support of the first objective (if parameter is zero) or always chooses the second objective (if parameter is negative, -1: for choosing EAC from the lower bound constraint, -2: from the upper bound constraint, -3: to favor the smallest gap, -4: to favor the largest gap) (default value is 0)

Diverse solutions

toulbar2 can search for a greedy sequence of diverse solutions with guaranteed local optimality and minimum pairwise Hamming distance [Ruffini2019a].
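
The pairwise requirement behind the options below can be sketched in Python (illustrative names, not toulbar2’s API): a new solution is kept only if its Hamming distance to every previously kept solution reaches the bound.

```python
# Pairwise Hamming-distance check behind option -div; illustrative names,
# not toulbar2's API.
def hamming(s1, s2):
    """Number of variables assigned differently in two complete solutions."""
    return sum(a != b for a, b in zip(s1, s2))

def is_diverse(new_sol, kept, mindist):
    """Keep new_sol only if it differs from every kept solution
    in at least mindist variables."""
    return all(hamming(new_sol, s) >= mindist for s in kept)

kept = [[0, 1, 0, 1]]
print(is_diverse([1, 0, 0, 1], kept, 2))   # differs in two positions
print(is_diverse([0, 1, 0, 0], kept, 2))   # differs in one position only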

-div=[integer]

minimum Hamming distance between diverse solutions (use in conjunction with -a=integer with a limit of 1000 solutions) (default value is 0)

-divm=[integer]

diversity encoding method (0:Dual, 1:Hidden, 2:Ternary, 3:Knapsack) (default value is 3)

-mdd=[integer]

maximum relaxed MDD width for diverse solution global constraint (default value is 0)

-mddh=[integer]

MDD relaxation heuristic: 0: random, 1: high div, 2: small div, 3: high unary costs (default value is 0)

Console output

-help

shows the default help message that toulbar2 prints when it gets no argument.

-v=[integer]

sets the verbosity level (default 0).

-Z=[integer]

debug mode (saves the problem at each node if verbosity option -v=num \(>= 1\) and -Z=num \(>=3\))

-s=[integer]

shows each solution found during search. The solution is printed on one line, giving by default (-s=1) the value (integer) of each variable successively in increasing file order. For -s=2, the value name is used instead, and for -s=3, variable name=value name is printed instead.

File output

-w=[filename]

writes last/all solutions found in the specified filename (or “sol” if no parameter is given). The current directory is used as a relative path.

-w=[integer]

1: writes value numbers, 2: writes value names, 3: writes also variable names (default value is 1, this option can be used in combination with -w=filename).

-z=[filename]

saves the problem in wcsp or cfn format in filename (or “problem.wcsp”/”problem.cfn” if no parameter is given). It also writes the graphviz dot file and the degree distribution of the input problem.

-z=[integer]

1 or 3: saves original instance in 1-wcsp or 3-cfn format (1 by default), 2 or 4: saves after preprocessing in 2-wcsp or 4-cfn format, -2 or -4: saves after preprocessing but keeps initial domains (this option can be used in combination with -z=filename). If the problem is saved after preprocessing (except for -2 or -4), some variables may be lost (due to variable elimination, see -e or -p or -f).

Probability representation and numerical control

-precision=[integer]

probability/real precision is a conversion factor (a power of ten) for representing fixed-point numbers (default value is 7). It is used by the CFN/UAI/QPBO/OPB/Pedigree formats. Note that in CFN format the number of significant digits is given in the problem description by default; this option allows overriding that default.
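
For example, with 3 digits of precision the conversion factor is \(10^3\) and a decimal cost of 1.25 becomes the integer 1250 internally. A minimal Python sketch of this fixed-point conversion (illustrative, not toulbar2’s internal code):

```python
# Decimal costs scaled to integers by a power-of-ten factor, as with
# option -precision; illustrative sketch, not toulbar2's internal code.
def to_fixed_point(cost, digits=3):
    """Map a decimal cost to its integer representation (10^digits factor)."""
    return round(cost * 10 ** digits)

def from_fixed_point(icost, digits=3):
    """Map the internal integer back to a decimal cost."""
    return icost / 10 ** digits

print(to_fixed_point(1.25))        # 1250 with 3 digits of precision
print(from_fixed_point(1250))      # 1.25
```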

-epsilon=[float]

approximation factor for computing the partition function (if greater than 1, default value is infinity) or floating-point precision (if smaller than 1, default value is 1e-9)

Note that in CFN format, costs are given as decimal numbers (the same for giving an initial upper bound, an absolute optimality gap or VAC threshold values) whereas in WCSP format costs are non-negative integers only.

Random problem generation

-random=[bench profile]

The bench profile must be specified as follows.

  • n and d are respectively the number of variables and the maximum domain size of the random problem.

    bin-{n}-{d}-{t1}-{p2}-{seed}

    • t1 is the tightness in percentage % of random binary cost functions

    • p2 is the number of binary cost functions to include

    • the seed parameter is optional

    binsub-{n}-{d}-{t1}-{p2}-{p3}-{seed} binary random & submodular cost functions

    • t1 is the tightness in percentage % of random cost functions

    • p2 is the number of binary cost functions to include

    • p3 is the percentage % of submodular cost functions among p2 cost functions (plus 10 permutations of two randomly-chosen values for each domain)

    tern-{n}-{d}-{t1}-{p2}-{p3}-{seed}

    • p3 is the number of ternary cost functions

    nary-{n}-{d}-{t1}-{p2}-{p3}…-{pn}-{seed}

    • pn is the number of n-ary cost functions

    wcolor-{n}-{d}-0-{p2}-{seed} random weighted graph coloring problem

    • p2 is the number of edges in the graph

    vertexcover-{n}-{d}-{t1}-{p2}-{maxcost}-{seed} random vertex cover problem

    • t1 is the tightness (should be equal to 25)

    • p2 is the number of edges in the graph

    • maxcost: each vertex has a weight randomly chosen between 0 and maxcost

    bivertexcover-{n}-{d}-{t1}-{p2}-{maxcost}-{ub2}-{seed} random bi-objective vertex cover problem

    • t1 is the tightness (should be equal to 25)

    • p2 is the number of edges in the graph

    • maxcost: each vertex has two weights, both randomly chosen between 0 and maxcost

    • ub2: upper bound for the bounding constraint on the second objective (see the epsilon-constraint method)

    salldiff-{n}-{d}-{t1}-{p2}-{p3}…-{pn}-{seed}

    • pn is the number of salldiff global cost functions (p2 and p3 still being used for the number of random binary and ternary cost functions). salldiff can be replaced by gcc or regular keywords with three possible forms (e.g., sgcc, sgccdp, wgcc) and by knapsack.

Input formats

Introduction

The available file formats (possibly compressed by gzip, bzip2, or xz, e.g., .cfn.gz, .wcsp.xz, .opb.bz2) are:

  • Cost Function Network format (.cfn file extension)

  • Weighted Constraint Satisfaction Problem (.wcsp file extension)

  • Probabilistic Graphical Model (.uai / .LG file extension; the .LG format is identical to .uai except that log-potentials are expected)

  • Weighted Partial Max-SAT (.cnf/.wcnf file extension)

  • Quadratic Unconstrained Pseudo-Boolean Optimization (.qpbo file extension)

  • Pseudo-Boolean Optimization (.opb file extension)

  • Integer Linear Programming (.lp file extension)

  • Constraint Satisfaction and Optimization Problem (.xml file extension)

Some examples:

Notice that by default toulbar2 distinguishes file formats based on their extension. It is possible to read a file from a Unix pipe using option -stdin=[format]; e.g., cat example.wcsp | toulbar2 --stdin=wcsp

It is also possible to read and combine multiple problem files (warning: they must all be in the same format, either wcsp, cfn, or xml). Variables with the same name are merged (domains must be identical); otherwise the merge is based on variable indexes (wcsp format). Warning: the minimum of all initial upper bounds read from the problem files is used as the initial upper bound of the merged problem.

Formats details

How do I use it?

Using it as a C++ library

See toulbar2 Reference Manual which describes the libtb2.so C++ library API.

Using it from Python

A Python interface is now available. Compile toulbar2 with the cmake option PYTB2 (and without MPI options) to generate a Python module pytoulbar2 (in the lib directory). See examples in src/pytoulbar2.cpp and the web/TUTORIALS directory.

An older version of toulbar2 was integrated inside Numberjack. See https://github.com/eomahony/Numberjack.

References

[Beldjilali2022]

A Beldjilali, P Montalbano, D Allouche, G Katsirelos and S de Givry. Parallel Hybrid Best-First Search. In Proc. of CP-22, Haifa, Israel, 2022.

[Schiex2020b]

Céline Brouard and Simon de Givry and Thomas Schiex. Pushing Data in CP Models Using Graphical Model Learning and Solving. In Proc. of CP-20, Louvain-la-neuve, Belgium, 2020.

[Trosser2020a] (1,2,3)

Fulya Trösser, Simon de Givry and George Katsirelos. Relaxation-Aware Heuristics for Exact Optimization in Graphical Models. In Proc. of CP-AI-OR’2020, Vienna, Austria, 2020.

[Ruffini2019a] (1,2)

M. Ruffini, J. Vucinic, S. de Givry, G. Katsirelos, S. Barbe and T. Schiex. Guaranteed Diversity & Quality for the Weighted CSP. In Proc. of ICTAI-19, pages 18-25, Portland, OR, USA, 2019.

[Ouali2017] (1,2,3,4,5)

Abdelkader Ouali, David Allouche, Simon de Givry, Samir Loudni, Yahia Lebbah, Francisco Eckhardt, Lakhdar Loukil. Iterative Decomposition Guided Variable Neighborhood Search for Graphical Model Energy Minimization. In Proc. of UAI-17, pages 550-559, Sydney, Australia, 2017.

[Schiex2016a]

David Allouche, Christian Bessière, Patrice Boizumault, Simon de Givry, Patricia Gutierrez, Jimmy H.M. Lee, Ka Lun Leung, Samir Loudni, Jean-Philippe Métivier, Thomas Schiex and Yi Wu. Tractability-preserving transformations of global cost functions. Artificial Intelligence, 238:166-189, 2016.

[Hurley2016b]

B Hurley, B O’Sullivan, D Allouche, G Katsirelos, T Schiex, M Zytnicki and S de Givry. Multi-Language Evaluation of Exact Solvers in Graphical Model Discrete Optimization. Constraints, 21(3):413-434, 2016. Presentation at CPAIOR’16, Banff, Canada, http://www.inra.fr/mia/T/degivry/cpaior16sdg.pdf.

[Katsirelos2015a] (1,2,3,4)

D Allouche, S de Givry, G Katsirelos, T Schiex and M Zytnicki. Anytime Hybrid Best-First Search with Tree Decomposition for Weighted CSP. In Proc. of CP-15, pages 12-28, Cork, Ireland, 2015.

[Schiex2014a] (1,2)

David Allouche, Jessica Davies, Simon de Givry, George Katsirelos, Thomas Schiex, Seydou Traoré, Isabelle André, Sophie Barbe, Steve Prestwich and Barry O’Sullivan. Computational Protein Design as an Optimization Problem. Artificial Intelligence, 212:59-79, 2014.

[Givry2013a] (1,2)

S de Givry, S Prestwich and B O’Sullivan. Dead-End Elimination for Weighted CSP. In Proc. of CP-13, pages 263-272, Uppsala, Sweden, 2013.

[Ficolofo2012]

D Allouche, C Bessiere, P Boizumault, S de Givry, P Gutierrez, S Loudni, JP Métivier and T Schiex. Decomposing Global Cost Functions. In Proc. of AAAI-12, Toronto, Canada, 2012. http://www.inra.fr/mia/T/degivry/Ficolofo2012poster.pdf (poster).

[Favier2011a] (1,2,3)

A Favier, S de Givry, A Legarra and T Schiex. Pairwise decomposition for combinatorial optimization in graphical models. In Proc. of IJCAI-11, Barcelona, Spain, 2011. Video demonstration at http://www.inra.fr/mia/T/degivry/Favier11.mov.

[Cooper2010a] (1,2,3)

M. Cooper, S. de Givry, M. Sanchez, T. Schiex, M. Zytnicki and T. Werner. Soft arc consistency revisited. Artificial Intelligence, 174(7-8):449-478, 2010.

[Favier2009a] (1,2)

A. Favier, S. de Givry and P. Jégou. Exploiting Problem Structure for Solution Counting. In Proc. of CP-09, pages 335-343, Lisbon, Portugal, 2009.

[Sanchez2009a] (1,2,3)

M Sanchez, D Allouche, S de Givry and T Schiex. Russian Doll Search with Tree Decomposition. In Proc. of IJCAI’09, Pasadena (CA), USA, 2009. http://www.inra.fr/mia/T/degivry/rdsbtd_ijcai09_sdg.ppt.

[Cooper2008] (1,2)

M. Cooper, S. de Givry, M. Sanchez, T. Schiex and M. Zytnicki. Virtual Arc Consistency for Weighted CSP. In Proc. of AAAI-08, Chicago, IL, 2008.

[Schiex2006a] (1,2)

S. de Givry, T. Schiex and G. Verfaillie. Exploiting Tree Decomposition and Soft Local Consistency in Weighted CSP. In Proc. of AAAI-06, Boston, MA, 2006. http://www.inra.fr/mia/T/degivry/VerfaillieAAAI06pres.pdf (slides).

[Heras2005]

S. de Givry, M. Zytnicki, F. Heras and J. Larrosa. Existential arc consistency: Getting closer to full arc consistency in weighted CSPs. In Proc. of IJCAI-05, pages 84-89, Edinburgh, Scotland, 2005.

[Larrosa2000]

J. Larrosa. Boosting search with variable elimination. In Principles and Practice of Constraint Programming - CP 2000, volume 1894 of LNCS, pages 291-305, Singapore, September 2000.

[koller2009] (1,2)

D Koller and N Friedman. Probabilistic graphical models: principles and techniques. The MIT Press, 2009.

[Ginsberg1995] (1,2)

W. D. Harvey and M. L. Ginsberg. Limited Discrepancy Search. In Proc. of IJCAI-95, Montréal, Canada, 1995.

[Lecoutre2009]

C. Lecoutre, L. Saïs, S. Tabary and V. Vidal. Reasoning from last conflict(s) in constraint programming. Artificial Intelligence, 173:1592-1614, 2009.

[boussemart2004]

Frédéric Boussemart, Fred Hemery, Christophe Lecoutre and Lakhdar Sais. Boosting systematic search by weighting constraints. In ECAI, volume 16, page 146, 2004.

[idwalk:cp04] (1,2)

Bertrand Neveu, Gilles Trombettoni and Fred Glover. ID Walk: A Candidate List Strategy with a Simple Diversification Device. In Proc. of CP, pages 423-437, Toronto, Canada, 2004.

[Verfaillie1996]

G. Verfaillie, M. Lemaître and T. Schiex. Russian Doll Search. In Proc. of AAAI-96, pages 181-187, Portland, OR, 1996.

[LL2009]

J. H. M. Lee and K. L. Leung. Towards Efficient Consistency Enforcement for Global Constraints in Weighted Constraint Satisfaction. In Proceedings of IJCAI’09, pages 559-565, 2009.

[LL2010]

J. H. M. Lee and K. L. Leung. A Stronger Consistency for Soft Global Constraints in Weighted Constraint Satisfaction. In Proceedings of AAAI’10, pages 121-127, 2010.

[LL2012asa]

J. H. M. Lee and K. L. Leung. Consistency Techniques for Global Cost Functions in Weighted Constraint Satisfaction. Journal of Artificial Intelligence Research, 43:257-292, 2012.

[Larrosa2002] (1,2)

J. Larrosa. On Arc and Node Consistency in weighted CSP. In Proc. AAAI’02, pages 48-53, Edmonton, Canada, 2002.

[Larrosa2003]

J. Larrosa and T. Schiex. In the quest of the best form of local consistency for Weighted CSP. In Proc. of the 18th IJCAI, pages 239-244, Acapulco, Mexico, August 2003.

[Schiex2000b]

T. Schiex. Arc consistency for soft constraints. In Principles and Practice of Constraint Programming - CP 2000, volume 1894 of LNCS, pages 411-424, Singapore, September 2000.

[CooperFCSP]

M.C. Cooper. Reduction operations in fuzzy or valued constraint satisfaction. Fuzzy Sets and Systems, 134(3):311-342, 2003.