I am using z3py. I am checking the satisfiability of a series of problems of different sizes in order to evaluate the scalability of the proposed method. To do that, I need to know the memory consumed by the solver for each problem. Is there a way to access this memory usage, or to make z3py print it in the statistics section? A minimal sketch of my setup is below. Thank you so much in advance.
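For context, this is roughly what I am running (the constraints here are a hypothetical toy example standing in for my real, much larger problems):

```python
from z3 import Solver, Int

# Toy constraints standing in for my real (much larger) problem.
s = Solver()
x, y = Int('x'), Int('y')
s.add(x + y == 10, x > 0, y > 0)

print(s.check())   # sat
st = s.statistics()
print(st)          # prints the statistics block, but I see no memory entry

# What I am hoping for is something like the following (guarded,
# since a 'memory' key may not exist on every Z3 build):
try:
    print('memory (MB):', st.get_key_value('memory'))
except Exception:
    print('no memory entry in the statistics on this build')
```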
Update (27/5/2015):
I tried the Python memory profiler, but the reported memory is very large. I suspect it is measuring the whole Python application rather than Z3 alone (constructing the Z3 model, checking satisfiability, and then generating a model). I have used formal model checking tools for many years, and I expect Z3 to be efficient and to scale well; in particular, I expect Z3 itself to consume much less memory than what the profiler reports for the Python process.

What I am thinking of doing instead is measuring the design size, or the scalability, using factors other than memory. The z3py statistics report many details that describe the design size and complexity; however, I cannot find any explanation of these parameters in the tutorial, on the web page, or in the Z3 papers. For example, can you help me understand the following parameters, generated in the statistics for one of my basic models? Also, is there a parameter (or set of parameters) that could replace memory as a good indication of the Z3 model's size/complexity? (A sketch of how I extract these values follows the list.)
- :added-eqs 152
- :assert-lower 2
- :assert-upper 2
- :binary-propagations 59
- :conflicts 6
- :datatype-accessor-ax 12
- :datatype-constructor-ax 14
- :datatype-occurs-check 19
- :datatype-splits 12
- :decisions 35
- :del-clause 2
- :eq-adapter 2
- :final-checks 1
- :mk-clause 9
- :offset-eqs 2
- :propagations 61
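For completeness, this is roughly how I pull these numbers out programmatically (a sketch using the z3py `Statistics` API; note that the key names appear to be the printed ones with the leading `:` dropped and `-` replaced by a space, e.g. `:added-eqs` becomes `'added eqs'`):

```python
# Collect all statistics into a plain dict so that metrics such as
# conflicts/decisions can be logged per problem size instead of memory.
st = s.statistics()
metrics = {k: st.get_key_value(k) for k in st.keys()}
print(metrics)

# Individual entries are also reachable as attributes,
# with spaces in the key replaced by underscores:
print(st.decisions, st.conflicts, st.added_eqs)
```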
Thanks again for your time.