Unexpected behavior with unique rand_list_t
I am trying to randomize a few rand_list_t, and I want the lists to be unique:
```python
import vsc

@vsc.randobj
class my_c(object):
    def __init__(self):
        self.a = vsc.rand_list_t(vsc.rand_uint16_t(1), 2)
        self.b = vsc.rand_list_t(vsc.rand_uint16_t(1), 2)
        self.c = vsc.rand_list_t(vsc.rand_uint16_t(1), 2)

    @vsc.constraint
    def index_1_constraint(self):
        self.a[1] < 2
        self.b[1] < 2
        self.c[1] < 2

    @vsc.constraint
    def unique_list_constraint(self):
        vsc.unique(self.a, self.b, self.c)

if __name__ == "__main__":
    it = my_c()
    it.randomize(solve_fail_debug=1)
```
It fails with the following error message:

```
vsc.model.solve_failure.SolveFailure: solve failure
```
It fails because `unique` appears to be applied element-wise, as if I had written `vsc.unique(self.a[0], self.b[0], self.c[0])` and `vsc.unique(self.a[1], self.b[1], self.c[1])`, but that's not what I want. I would like the lists themselves to be unique, not the individual indexes. Index 1 could hold the same value in all lists as long as the index-0 values differ.

Is there a way to achieve this?
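To make the desired semantics concrete, here is a plain-Python sketch of a check that treats each list as a single vector value (this is not PyVSC code; `vectors_unique` is a hypothetical helper name used only for illustration):

```python
def vectors_unique(*lists):
    """True if the argument lists, compared as whole vectors, are
    pairwise distinct; individual elements may still repeat."""
    vecs = [tuple(lst) for lst in lists]
    return len(vecs) == len(set(vecs))

# Index 1 is the same in every list, but the vectors differ at index 0:
print(vectors_unique([0, 1], [2, 1], [3, 1]))  # True
print(vectors_unique([0, 1], [0, 1], [3, 1]))  # False: first two vectors match
```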
Hi @felixdube,
The current behavior is aligned with SV behavior: all arguments to `unique` are flattened into a combined list of scalar variables, all of which must be distinct.
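To illustrate that flattening behavior, here is a plain-Python sketch of the check the solver effectively imposes (again not PyVSC code; `flattened_unique` is a hypothetical helper name):

```python
def flattened_unique(*lists):
    """True if every scalar across all argument lists is distinct,
    mirroring how SV/PyVSC 'unique' flattens its arguments."""
    flat = [x for lst in lists for x in lst]
    return len(flat) == len(set(flat))

# With a[1], b[1], c[1] all constrained below 2, only the values 0 and 1
# are available for three index-1 slots, so the flattened check must fail:
print(flattened_unique([7, 0], [8, 1], [9, 0]))  # False: 0 appears twice
```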
Unfortunately, I don't think there's an easy way to synthesize your desired behavior using existing constraints. This is because you effectively need to construct an 'OR' across all the uniquification terms between the arrays.
I think it's reasonable to support 'vector' uniquification within PyVSC. Is it safe to assume that, at least for your use case, the size of all vectors is the same?
Hi @mballance,
Thank you for your quick reply!
Yes, in my use case, it would be okay to assume the size of the vectors is the same.
Felix
Hi @felixdube, have a look at the new `unique_vec` constraint in the 0.9.0 release. You can find a test showing its use here:

> Lines 204 to 234 in 36aca2d
Awesome! Thank you for the help!