Wrong nnf file produced with right model count on formulas #12
Yeah. Typically, when you generate, you just want to work with the aspects of the theory that make sense. If you want direct counts on the original theory vocabulary, then you need to be careful about how you count. This means accounting for variables that don't appear in the d-DNNF, or making sure that all variables appear. Can you try with the -smoothNNF flag?
Even with -smoothNNF, dsharp returns an nnf with 1007463893786228957007320280685556805128418235790001520882379093805335556916236051595072291229027640395917516949422080 models while it prints 1003423649010231476228717449425288172853483301587463560042403927001189267086150888397612820568126355100717618571509760 models. I know that if I have n extra variables I must take into account a factor of 2^n; the bug I report here is different. The reproducer, with 372 extra variables, makes dsharp output an nnf whose model count does not match the count it prints. The extra variables come from bitblasting, and removing them would incur yet another layer of variable translation I'd rather not have to write...
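To make the accounting above concrete, here is a minimal sketch (mine, not from the thread) of the 2^n correction for variables that never appear in the d-DNNF; the function and variable names are hypothetical.

```python
# Minimal sketch: adjust a d-DNNF model count for "free" variables, i.e.
# variables declared in the CNF header but absent from the nnf.
# Assumes raw_count was computed over only the variables occurring in the nnf.

def adjusted_count(raw_count: int, total_vars: int, vars_in_nnf: set) -> int:
    free_vars = total_vars - len(vars_in_nnf)   # variables never mentioned
    return raw_count * 2 ** free_vars           # each free variable doubles the count

# Illustration with the numbers from this issue: 400 variables, 372 of them free.
# (The concrete set of used variables here is made up for the example.)
print(adjusted_count(104310, 400, set(range(1, 29))))
```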
Hrmz... I very much suspect the variable ordering (gaps in the CNF) is the culprit. Two quick followups:

    for i in range(1, 401):
        print(f'{i} -{i} 0')

I've tried putting things through the d4 reasoner, but am getting unsat for the generated nnf (not sure what's going on there with that software).
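For context, the two-line snippet above prints a tautological clause `i -i 0` for every variable from 1 to 400, so that each variable occurs in the CNF even if the formula never otherwise mentions it. A minimal sketch of applying this to a DIMACS file follows; the file names and the assumption of a single `p cnf <vars> <clauses>` header line are mine, not from the thread.

```python
# Sketch: append tautological clauses to a DIMACS CNF so every variable appears.
# Assumes a well-formed file whose header line is "p cnf <num_vars> <num_clauses>".

def add_tautologies(in_path: str, out_path: str) -> None:
    with open(in_path) as f:
        lines = f.readlines()
    num_vars = 0
    with open(out_path, "w") as out:
        for line in lines:
            if line.startswith("p cnf"):
                _, _, nvars, nclauses = line.split()
                num_vars = int(nvars)
                # every variable gets one extra clause "v -v 0"
                out.write(f"p cnf {num_vars} {int(nclauses) + num_vars}\n")
            else:
                out.write(line)
        for v in range(1, num_vars + 1):
            out.write(f"{v} -{v} 0\n")

add_tautologies("formula.cnf", "formula_full.cnf")
```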
Ok, I can reproduce this bug without useless variables; it is only a question of the order of clauses:
If I reverse the order of the clauses, then the symptoms disappear.
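For anyone reproducing this, a minimal clause-reversal sketch (my own, not the script actually used here) could look like this; it assumes one clause per line after the header and uses hypothetical file names.

```python
# Sketch: reverse the order of clauses in a DIMACS CNF file.
# Assumes comment lines start with "c", the header starts with "p cnf",
# and every remaining non-empty line is a single clause terminated by 0.

def reverse_clauses(in_path: str, out_path: str) -> None:
    comments, header, clauses = [], None, []
    with open(in_path) as f:
        for line in f:
            if line.startswith("c"):
                comments.append(line)
            elif line.startswith("p cnf"):
                header = line
            elif line.strip():
                clauses.append(line)
    with open(out_path, "w") as out:
        out.writelines(comments)
        out.write(header)
        out.writelines(reversed(clauses))

reverse_clauses("formula.cnf", "formula_reversed.cnf")
```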
I fear this issue is pretty deep. Banged my head against it for a while, and the best I could narrow it down to is that it involves conflict analysis, and most likely the pollutants infrastructure (something set up by sharpSAT in order to cache things quicker). As a temporary solution, using the
Thanks for investigating this bug! I just tried
I realise this is an old issue, but it might relate to an issue that I am observing. Since my CNF is a lot larger, I took a look at #12 (comment), hoping it would solve my issue too. So far I found the following when running dsharp without any extra flags, just the nnf and cnf paths.
Observe that variable 13 is used in both, violating decomposability. Manually checking the CNF, 13 should not be present in 167. By conditioning on 2, -16, -6, we infer 15, which removes the clause connecting 13 to 14 and 15. During compilation, the AND node on line 168 corresponds to
That is as far as I got for now. I might revisit this later, but I could use some pointers as to where in the codebase we would best proceed with debugging.
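Since this kind of violation is spotted by hand above, a small automated decomposability check might help; below is a minimal sketch over a c2d/dsharp-style nnf file. The line-number convention and the file name are assumptions on my part, and error handling is omitted.

```python
# Sketch: check decomposability of every AND node in an "nnf" file.
# Assumed format (c2d/dsharp style):
#   header:  nnf <num_nodes> <num_edges> <num_vars>
#   leaf:    L <literal>
#   and:     A <num_children> <child indices...>
#   or:      O <decision var> <num_children> <child indices...>
# Child indices refer to earlier node lines (0-based, header excluded).

def check_decomposability(path: str) -> bool:
    node_vars = []       # node_vars[i] = set of variables occurring under node i
    ok = True
    with open(path) as f:
        assert f.readline().startswith("nnf")
        for idx, line in enumerate(f):
            tok = line.split()
            if not tok:
                continue
            if tok[0] == "L":
                node_vars.append({abs(int(tok[1]))})
            elif tok[0] == "A":
                children = [int(c) for c in tok[2:]]
                seen = set()
                for c in children:
                    shared = seen & node_vars[c]
                    if shared:
                        print(f"AND node {idx} shares variables {shared} between children")
                        ok = False
                    seen |= node_vars[c]
                node_vars.append(seen)
            elif tok[0] == "O":
                children = [int(c) for c in tok[3:]]
                node_vars.append(set().union(*(node_vars[c] for c in children)) if children else set())
    return ok

print(check_decomposability("out.nnf"))
```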
I think there are situations where dsharp outputs the correct model count but an incorrect nnf. A possible trigger is a large number of useless variables, in the sense that they don't appear in the formula at all.
Reproducer:
formula.cnf
Steps to reproduce:
Running dsharp on formula.cnf outputs
and c2d agrees:
However, the model count of the corresponding nnf is not the same. For the bug report I use query-dnnf from https://www.cril.univ-artois.fr/kc/d-DNNF-reasoner.html, but I confirmed this with my own tool computing the model count from the nnf.
but the file returned by c2d has the right model count:
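The model counter mentioned above ("my own tool") is not part of the issue; a minimal sketch of such a counter over the nnf format could look as follows, assuming a smooth, deterministic d-DNNF in the c2d/dsharp node-line syntax and a hypothetical file name.

```python
# Sketch: model counting over a c2d/dsharp-style nnf file.
# Assumes a smooth, deterministic d-DNNF; variables that never occur in the
# nnf are accounted for with a 2^missing factor at the end, as discussed above.

def model_count(path: str) -> int:
    counts, scopes = [], []                       # per-node model count and variable set
    with open(path) as f:
        _, _, _, num_vars = f.readline().split()  # header: "nnf <nodes> <edges> <vars>"
        for line in f:
            tok = line.split()
            if not tok:
                continue
            if tok[0] == "L":                     # literal: one model over its variable
                counts.append(1)
                scopes.append({abs(int(tok[1]))})
            elif tok[0] == "A":                   # decomposable AND: multiply child counts
                kids = [int(c) for c in tok[2:]]
                prod, scope = 1, set()
                for c in kids:
                    prod *= counts[c]
                    scope |= scopes[c]
                counts.append(prod)
                scopes.append(scope)
            elif tok[0] == "O":                   # deterministic OR: add child counts
                kids = [int(c) for c in tok[3:]]
                counts.append(sum(counts[c] for c in kids))
                scopes.append(set().union(*(scopes[c] for c in kids)) if kids else set())
    missing = int(num_vars) - len(scopes[-1])     # variables absent from the whole nnf
    return counts[-1] * 2 ** missing

print(model_count("out.nnf"))
```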
The bug is actually due to the fact that the file has 400 variables but 372 of them are just useless. If I replace the variables in the cnf by consecutive ones (1, 2, 3...) I get the right count:
script to do this:
reduce_cnf.py
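The attached reduce_cnf.py itself is not shown in the issue; a rough reconstruction of what such a renumbering script might look like (mine, assuming one clause per line) is:

```python
# Sketch: renumber the variables of a DIMACS CNF consecutively (1, 2, 3, ...),
# dropping the gaps left by variables that never occur in any clause.
# Assumes comments start with "c" and the header is "p cnf <vars> <clauses>".

def reduce_cnf(in_path: str, out_path: str) -> dict:
    clauses = []
    with open(in_path) as f:
        for line in f:
            if line.startswith(("c", "p")) or not line.strip():
                continue
            clauses.append([int(x) for x in line.split()[:-1]])   # drop the trailing 0
    used = sorted({abs(lit) for clause in clauses for lit in clause})
    newvar = {v: i + 1 for i, v in enumerate(used)}                # old var -> new var
    with open(out_path, "w") as out:
        out.write(f"p cnf {len(used)} {len(clauses)}\n")
        for clause in clauses:
            out.write(" ".join(str((1 if lit > 0 else -1) * newvar[abs(lit)])
                               for lit in clause) + " 0\n")
    return newvar                                                  # reused for the nnf sketch below

reduce_cnf("formula.cnf", "formula_reduced.cnf")
```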
and this is the right count because
log2(1003423649010231476228717449425288172853483301587463560042403927001189267086150888397612820568126355100717618571509760 / 104310) = 372
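That relation is easy to re-check with exact integer arithmetic instead of a floating-point log2:

```python
# Re-express the log2 relation above with exact integers; per the report this
# should print True (printed_count == reduced_count * 2**372).
printed_count = 1003423649010231476228717449425288172853483301587463560042403927001189267086150888397612820568126355100717618571509760
reduced_count = 104310
print(printed_count == reduced_count * 2 ** 372)
```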
However, if I do the same transformation to the nnf, I get a different result:
script
reduce_dnnf.py
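The attached reduce_dnnf.py is likewise not shown; a rough sketch of the same renumbering applied to an nnf file (my reconstruction, reusing the hypothetical newvar mapping from the CNF sketch above) could be:

```python
# Sketch: apply the variable renumbering to a c2d/dsharp-style nnf file.
# In this format only "L <lit>" lines and the decision variable of
# "O <var> <n> <children...>" lines mention variables; A/O children are node indices.
# The header's variable count is left untouched here, which itself matters for counting.

def reduce_nnf(in_path: str, out_path: str, newvar: dict) -> None:
    def rename(lit: int) -> int:
        if lit == 0:                       # 0 on an O line means "no decision variable"
            return 0
        return (1 if lit > 0 else -1) * newvar[abs(lit)]

    with open(in_path) as f, open(out_path, "w") as out:
        out.write(f.readline())            # keep the "nnf <nodes> <edges> <vars>" header
        for line in f:
            tok = line.split()
            if tok and tok[0] == "L":
                out.write(f"L {rename(int(tok[1]))}\n")
            elif tok and tok[0] == "O":
                out.write(f"O {rename(int(tok[1]))} {' '.join(tok[2:])}\n")
            else:
                out.write(line)
```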
which is wrong. This also supports the hypothesis that the model count originally printed by dsharp is correct, but the nnf it produces is not.
I tried to reduce this example as much as possible (with creduce), but at this point I think I have investigated everything I could think of. I can't use the validation scripts in src/extra because they use kr, which is a Python module I could not find.