
“Maximum number of parameters” error with filter .in_(list) using pyodbc

One of our queries that was working in Python 2 + mxODBC is not working in Python 3 + pyodbc; while connecting to SQL Server it raises an error like this: Maximum number of parameters in the sql query is 2100. Since the printed query has 3000 params in both environments, I thought it should fail in both, but clearly that doesn't seem to be the case here. In the Python 2 environment, both MSODBC 11 and MSODBC 17 work, so I immediately ruled out a driver-related issue.

So my question is:

  1. Is it correct for SQLAlchemy to send a list as multiple parameters, given that the parameter count will be proportional to the length of the list? It looks a bit strange to me; I would have preferred concatenating the list into a single string, because the DB doesn't understand a list datatype.
  2. Are there any hints as to why it works with mxODBC but not pyodbc? Does mxODBC optimize something that pyodbc does not? Please let me know if there are any pointers – I can try to paste more info here. (I am still new to debugging SQLAlchemy.)

Footnote: I have seen a lot of answers that suggest chunking the data, but because of 1 and 2, I wonder whether I am doing the correct thing in the first place.

(Since it seems to be related to pyodbc, I have raised an internal issue in the official repository.)



Answer

When you do a straightforward .in_(list_of_values), SQLAlchemy renders the following SQL …

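(The original SQL snippet was lost when this page was archived. As a sketch of the behavior described, assuming a hypothetical `users` table with an integer `id` column and SQLAlchemy 1.4+, where `render_postcompile` shows the fully expanded parameter list:)

```python
import sqlalchemy as sa

# Hypothetical table, purely for illustration
metadata = sa.MetaData()
users = sa.Table("users", metadata, sa.Column("id", sa.Integer))

stmt = sa.select(users.c.id).where(users.c.id.in_([1, 2, 3]))

# Each element of the list becomes its own named bind parameter,
# e.g. ... WHERE users.id IN (:id_1_1, :id_1_2, :id_1_3)
print(stmt.compile(compile_kwargs={"render_postcompile": True}))
```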

… where each value in the IN clause is specified as a separate parameter value. pyodbc sends this to SQL Server as …

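(The exact statement text was also lost. The shape, though, is an ordinary parameterized query with one `?` placeholder per list element, so a 3000-element list means 3000 parameters — a sketch, using the same hypothetical `users` table:)

```python
# One "?" placeholder per value, which is how pyodbc parameterizes the query
ids = list(range(3000))
sql = "SELECT id FROM users WHERE id IN ({})".format(
    ", ".join("?" for _ in ids)
)

# SQL Server caps a single parameterized statement at 2100 parameters,
# so this one is rejected before it ever runs
SQL_SERVER_PARAM_LIMIT = 2100
print(sql.count("?"))                            # 3000
print(sql.count("?") > SQL_SERVER_PARAM_LIMIT)   # True
```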

… so you hit the limit of 2100 parameters if your list is very long. Presumably, mxODBC inserted the parameter values inline before sending it to SQL Server, e.g.,

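(Again the original example is missing; an inline rendering of that kind would look something like this — the values become part of the SQL text itself, so zero bind parameters reach the server:)

```python
# Inline ("literal") rendering: no placeholders, no parameter limit to hit
ids = [1, 2, 3]
sql = "SELECT id FROM users WHERE id IN ({})".format(
    ", ".join(str(i) for i in ids)
)
print(sql)  # SELECT id FROM users WHERE id IN (1, 2, 3)
```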

You can get SQLAlchemy to do that for you with

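(The answer's code was lost in extraction. One way SQLAlchemy can render the values inline is the `literal_binds` compile option — a sketch, again assuming the hypothetical `users` table and SQLAlchemy 1.4+; the original answer may have shown a different mechanism:)

```python
import sqlalchemy as sa

metadata = sa.MetaData()
users = sa.Table("users", metadata, sa.Column("id", sa.Integer))

stmt = sa.select(users.c.id).where(users.c.id.in_([1, 2, 3]))

# literal_binds renders each value directly into the SQL string,
# so no bind parameters are sent and the 2100 cap no longer applies
compiled = stmt.compile(compile_kwargs={"literal_binds": True})
print(compiled)  # ... WHERE users.id IN (1, 2, 3)
```

Only do this with trusted values: inlining literals bypasses the protection that parameterization normally provides.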
User contributions licensed under: CC BY-SA