jlubcke/tri.table

value_to_query_string_value_string expects a special field "name" in models

Closed this issue · 17 comments

Hi,

I've encountered a bug when trying to filter a table using a foreign key with a lookup (i.e. a choice_queryset column). The problem can be reproduced in the provided example project by calling http://127.0.0.1:8000/kitchen_sink/?b=1&-=-&query= but only if Foo.name is replaced by something else (e.g. Foo.foo_name) wherever it appears (twice in models.py and once in views.py). The error obtained when filtering:

Exception Value:
'Foo' object has no attribute 'name'
Exception Location:
c:\portablesoftware\winpython3-5\python-3.5.3.amd64\lib\site-packages\tri\query\__init__.py in value_to_query_string_value_string, line 66 
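
For reference, the rename amounts to something like this (a hypothetical sketch; the field type and length are assumptions, not necessarily what the example project uses):

# models.py in the example project, with Foo.name renamed
from django.db import models


class Foo(models.Model):
    # previously "name"; tri.query's default lookup still expects a "name" attribute
    foo_name = models.CharField(max_length=255)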

By the way, I had to massage the example to make it work: the "patterns" import must be removed from urls.py, the view strings must be replaced by callables from views.py, xrange must be replaced by range, and templates must be added in settings. I guess this is due to the Django version I use (1.10).
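
For other Django 1.10 users, the urls.py part looks roughly like this (the view name below is a placeholder, not the example project's actual name):

# urls.py: django.conf.urls.patterns() is gone in Django 1.10,
# so list url() entries with view callables instead of strings
from django.conf.urls import url

from . import views

urlpatterns = [
    url(r'^kitchen_sink/$', views.kitchen_sink),  # placeholder view name
]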

boxed commented

Well, I don't think it's a bug so much as bad documentation and a bad error message (and it's strictly in tri.query, not tri.table, but you don't care of course :P). The problem is that to search for things we need to know how to map "I want to search for a Foo called superman" to the Django query language. The default in this case is Foo.objects.filter(name="superman"), but as you say, if the object doesn't have a name property that blows up. The solution is to specify value_to_q_lookup on the query definition. In the kitchen_sink example:

        b = Column.choice_queryset(
            show=False,
            choices=Foo.objects.all(),
            model=Foo,
            bulk__show=True,
            query__show=True,
            query__gui__show=True,
            query__value_to_q_lookup='foo_name',  # <---- change here!
        )

The name value_to_q_lookup is pretty bad too I realize :P I welcome suggestions, because that's the best I could come up with at the time :(

Thanks for the answer! I guess that makes sense for a generic lookup, but in the case of a choice_queryset isn't pk always the field we want?

On a closely related subject, is it possible to be more refined with the "simple" filter form? E.g. search by substrings (instead of exact matches) and search by partial dates (like 2012 for the 2012 records)?

boxed commented

but in the case of a choice_queryset isn't pk always the field we want ?

Well... kind of. But tri.query supports both the nice little GUI dropdown AND a full query language. The latter won't work in any useful way without this name lookup. And internally we always convert the GUI view to the full advanced query language before evaluating. There are advantages and disadvantages to this for sure, but the advanced query language feature is pretty cool, so that's why it's like that :P

E.g. search by substrings (instead of exact match)

Yes. There are multiple ways:

  • The advanced filter does that by using "foo:bar" instead of "foo=bar" (probably not what you want)
  • You can specify op_to_q_op to override the operator mapping. It's a bit icky but it gets the job done. So it'd be something like query__op_to_q_op=lambda op: 'icontains' if op == '=' else Q_OP_BY_OP[op] (import Q_OP_BY_OP from tri.query) if you want to make = behave like a case-insensitive contains; see the sketch after this list.
  • You can also use the freetext feature. So you'd do query__freetext=True (and maybe remove query__gui=True) on all the fields you want freetext searching on. Then you'll get one field in the GUI that searches multiple columns with a case-insensitive contains. Beware that this can be slow, of course.
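
A minimal sketch of that override, assuming a declarative table class in the style of the kitchen_sink snippet above (FooTable and foo_name are placeholder names):

from tri.query import Q_OP_BY_OP
from tri.table import Column, Table


class FooTable(Table):
    # "=" in the filter GUI/query language becomes a case-insensitive contains;
    # every other operator keeps its default Django lookup
    foo_name = Column(
        query__show=True,
        query__gui__show=True,
        query__op_to_q_op=lambda op: 'icontains' if op == '=' else Q_OP_BY_OP[op],
    )

    # alternative: one free-text box searching several columns with icontains
    # foo_name = Column(query__freetext=True)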

a) Ok, I get it now. Thanks for the explanation!
b) The op_to_q_op override works nicely, thanks (again)!

Up to now, this library looks pretty awesome. Are there any plans to write comprehensive documentation?

boxed commented

Up to now, this library looks pretty awesome.

I'm glad to hear it! We certainly think so :P

Are there any plans to write comprehensive documentation?

Well, we are trying, for sure. It's hard! https://tritable.readthedocs.io/en/latest/api.html has fairly comprehensive documentation on specific fields, but we're still not even sure how to structure reference documentation in a useful way :(

We've thought about writing more cookbook-style examples, but the problem seems to be that this library is so flexible and powerful (and a bit strange) that it's hard to write documentation that covers all of it.

One thing we've found is that every time we interact with a new user in a significant way and then look over the documentation, we find things we can improve, but it's hard to imagine being a beginner with the library.

boxed commented

Oh, and as for documentation, it's pretty significant to understand that tri.table is built on top of tri.declarative, tri.form and tri.query for certain parts, and that you can shunt commands through tri.table to these underlying libs in certain places. This is a style of API design and thinking that is hard to put across effectively.

Well, as a new user (and an intermediate-level Python user), I would not mind having more "how-tos" and tutorials, even if they don't cover the whole thing. In the "installation" section, one should add that jQuery is required for table filtering (that took me a few minutes to discover) and that tri.form and tri.query must also be included in the apps list (to reach the templates). In the API reference, it's not always clear to me what can be used as "meta" options, as declarative-style parameters, or as instantiation parameters. I guess the examples provided by similar libraries (e.g. django-tables2 and django-filter) could be covered to illustrate how tri.table shines at solving these issues, but, as you said, it's a lot of work.

boxed commented

Actually, now that I think about it the jquery requirement is just silly really. We can probably very easily remove it. It's more of a holdover from when we extracted this lib from the proprietary code base it comes from...

boxed commented

The jquery requirement will be gone in the next version of tri.query!

Nice, always a good thing to reduce dependencies

boxed commented

I wrote this text as a response to your feedback above and would love your feedback!

Architecture overview

tri.table is built on top of tri.form (a form library) and tri.query (a query/search/filter library). Both tri.form and tri.query are written in the same API style as tri.table, which enables great integration and cohesion. All three libraries are built on top of tri.declarative. This has some nice consequences that I'll try to explain here.

Declarative/programmatic hybrid API

The @declarative, @with_meta and @creation_ordered decorators from tri.declarative enable us to very easily write an API that can look both like a normal, simple Python API:

my_table = Table(columns=[Column(name='foo'), Column(name='bar')], sortable=False)

This code is hopefully pretty self-explanatory. But the cool thing is that we can do the exact same thing with a declarative style:

class MyTable(Table):
    foo = Column()
    bar = Column()
    
    class Meta:
        sortable = False
        
my_table = MyTable()

This style can be much more readable. There's a subtle difference between the first and second styles though: the second is really a way to declare defaults, not values. This means we can create instances of the class and set the values in the call to the constructor:

my_table = MyTable(
    column__foo__show=False,  # <- hides the column foo
    sortable=True,  # <- turns on sorting again
)

...without having to create a new class inheriting from MyTable. So the API keeps all the power of the simple style while also getting the nice syntax of a declarative API.

Namespace dispatching

I've already hinted at this above in the example where we do column__foo__show=False. This is an example of the powerful namespace dispatch mechanism from tri.declarative. It's inspired by the query syntax of Django, where you use __ to jump namespaces. (If you're not familiar with Django, here's the gist of it: you can do SomeTable.objects.filter(some_foreign_key_relation__value_of_another_table='foo') to filter.) We really like this style and have expanded on it. It enables a function to expose the full API of the functions it calls while still keeping the code simple. Here's a contrived example:

from tri.declarative import dispatch, EMPTY


@dispatch(
    b__x=1,  # these are default values. "b" here is implicitly defining a namespace with a member "x" set to 1
    c__y=2,
)
def a(foo, b, c):
    print('foo:', foo)
    some_function(**b)
    another_function(**c)


@dispatch(
    d=EMPTY,  # explicitly declare "d" as a namespace, with no defaults
)
def some_function(x, d):
    print('x:', x)
    another_function(**d)


def another_function(y=None, z=None):
    if y:
        print('y:', y)
    if z:
        print('z:', z)

# now to call a()!
a('q')
# output:
# foo: q
# x: 1
# y: 2


a('q', b__x=5)
# foo: q
# x: 5
# y: 2

a('q', b__d__z=5)
# foo: q
# x: 1
# z: 5
# y: 2

This is really useful in tri.table as it means we can expose the full feature set of the underlying tri.query and tri.form libraries by just dispatching keyword arguments downstream. It also enables us to bundle commonly used features in what we call "shortcuts", which are pre-packaged sets of defaults.
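
For example, building on the MyTable example above (the exact paths may vary between versions, but this is the idea):

# keyword arguments in the column__foo__query__gui__ namespace flow from
# tri.table through tri.query down to the tri.form field for the column "foo"
my_table = MyTable(
    column__foo__query__show=True,
    column__foo__query__gui__show=True,
)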

Hi,

I think it definitely gives valuable information to the user. A thing that remains confusing to me is how to know in advance which sequence of keys should be used in the high-level functions. In the namespace dispatching example, you use a sequence of function argument names (which is rather obvious, but only if one knows in advance which argument is given to which internal function...). But in the MyTable example, column__foo__show contains a mix of user-defined and library-defined names. Also, since the argument to Table is columns (with an "s"), I would have expected something with columns and not column.

The "explicit" namespace thing is also a bit unclear to me, but that's maybe only me being tired or stupid (or both).

boxed commented

Yea, right now the only good way to figure out the full path for something is to read the code. It's not ideal, but we do have some ideas about how to improve this.

column/columns

The parameter columns is for specifying a list of columns (i.e. specifying ALL the columns), but in this example you only want to tweak one specific column. That's why it's without the s: it means "the column with the name foo".
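
In code, the difference looks roughly like this (reusing the snippets from the architecture text above):

# columns (plural): define the complete list of columns when constructing a Table
my_table = Table(columns=[Column(name='foo'), Column(name='bar')])

# column (singular): reach into the already-declared column named "foo"
# on a declarative table class and tweak just that one column
my_table = MyTable(column__foo__show=False)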

The "explicit" namespace thing is also a bit unclear to me, but that's maybe only me being tired or stupid (or both).

I would have hoped the implicit case was the more confusing one, heh.

Just another small comment: in the "Namespace dispatching" section above, you may want to add a bit of context, like "The problem we want to solve is how to provide an elegant way to expose internal function arguments at the API front end, in particular when an internal function, like an object constructor, is called several times".

boxed commented

I've now added a nice little error message if you hit this issue. So instead of an AttributeError from deep inside tri.query, you'll now get an AttributeError with a message saying you need to specify "value_to_q_lookup" if you don't have a 'name' field.

This ticket needs to stay open though, because it has a lot of good points about documentation that we really need to address.

boxed commented

I believe all of this is fixed in iommi. If you have time I would love your feedback on this!