Question
I encounter this error when querying a PyTables table with the where method, passing a string with 50 conditions. These conditions are basically a translation of a SQL IN clause, i.e. I want to get records where a particular field is equal to any value in a given list.
C:\Python2764\lib\site-packages\numexpr\necompiler.pyc in evaluate(ex, local_dict, global_dict, out, order, casting, **kwargs)
744 kwargs = {'out': out, 'order': order, 'casting': casting,
745 'ex_uses_vml': ex_uses_vml}
--> 746 return compiled_ex(*arguments, **kwargs)
ValueError: too many inputs
What would be another way to perform the same query? My first thought is to run the query 50 times, once for each value, and merge the results, but I was hoping there is a more elegant solution.
Is there a known restriction on the number of conditions that can be passed to numexpr?
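For reference, a minimal sketch of the "query once per value and merge" idea mentioned above. It assumes an open PyTables Table handle table, a column named field, and a Python list values; all three names are placeholders.

import numpy as np

# One in-kernel query per value; each condition involves only the column
# and a single scalar variable, so the numexpr input limit is never hit.
parts = [table.read_where('field == v', condvars={'v': v}) for v in values]
result = np.concatenate(parts)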
Answer 1:
If you use pandas.DataFrame.query, you can try passing engine='python' (it is forwarded to eval through **kwargs).
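A short illustration of that suggestion, assuming a DataFrame df with a column named field and a list values (placeholder names); engine='python' evaluates the expression with plain Python instead of numexpr, so the input limit does not apply.

import pandas as pd

# Equivalent of a SQL "field IN (...)" clause, evaluated without numexpr.
result = df.query('field in @values', engine='python')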
Answer 2:
I asked the PyTables developers the same question on GitHub. (Coincidentally, I also had 50 conditions translated from a SQL IN clause.) They were really nice and quick to respond. It is a numpy issue:
This is a limitation of numexpr based on the NPY_MAXARGS value which is defined in numpy. Upgrading to numpy v1.8 may fix the issue. See numpy/numpy#226 and [1].
...
Oh, and by the way, the limit is NPY_MAXARGS=32.
What numpy version do you use? numpy 1.8 is OK.
EDIT: Actually it does not work on numpy 1.8 either; NPY_MAXARGS is still 32 there.
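Given that cap, one workaround (a sketch of my own, not taken from the thread) is to batch the IN values so that the column plus the per-value condition variables stay well below NPY_MAXARGS, then concatenate the partial results. As before, table, field and values are placeholder names.

import numpy as np

CHUNK = 16  # conservative: 1 column + 16 scalar variables stays well below 32
parts = []
for i in range(0, len(values), CHUNK):
    chunk = values[i:i + CHUNK]
    condvars = dict(('v%d' % j, v) for j, v in enumerate(chunk))
    cond = ' | '.join('(field == v%d)' % j for j in range(len(chunk)))
    parts.append(table.read_where(cond, condvars=condvars))
result = np.concatenate(parts)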
Source: https://stackoverflow.com/questions/18432277/pytables-numexpr-valueerror-too-many-inputs-when-querying-with-a-lot-of-conditi