One of our queries that was working in Python 2 + mxODBC is not working in Python 3 + pyodbc; when connecting to SQL Server it raises an error like this:

Maximum number of parameters in the sql query is 2100.

Since both of the printed queries have 3000 params, I thought the query should fail in both environments, but clearly that isn't the case here. In the Python 2 environment both MSODBC 11 and MSODBC 17 work, so I immediately ruled out a driver-related issue.
So my questions are:

1. Why does SQLAlchemy pass the list as multiple params, given that the number of params will be proportional to the length of the list? It looks a bit strange to me; I would have preferred concatenating the list into a single string, because the DB doesn't understand the Python list anyway.
2. Why does the same query work in Python 2 + mxODBC but fail in Python 3 + pyodbc?
Footnote: I have seen a lot of answers that suggest chunking the data, but because of 1 and 2, I wonder whether I am doing the correct thing in the first place.
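For context, the chunking workaround those answers describe just splits the value list so that each query stays under the 2100-parameter cap. A minimal stdlib sketch of the splitting step (the helper name is mine, not from any of those answers):

```python
def chunked(values, size):
    """Split a list of values into sublists of at most `size` elements,
    so each IN (...) clause stays under SQL Server's parameter cap."""
    return [values[i:i + size] for i in range(0, len(values), size)]

# 3000 ids split at 2000 -> two queries instead of one failing one
chunks = chunked(list(range(3000)), 2000)
```

Each chunk would then be passed to a separate `.in_(chunk)` query and the results combined in Python.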
(Since it seems to be related to pyodbc, I have raised an issue in the official pyodbc repository.)
import sqlalchemy
import sqlalchemy.orm
from sqlalchemy import MetaData, Table
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm.session import Session

Base = declarative_base()

create_tables = """
CREATE TABLE products (
    idn NUMERIC(8) PRIMARY KEY
);
"""

check_tables = """
SELECT * FROM products;
"""

insert_values = """
INSERT INTO products (idn) VALUES (1), (2);
"""

delete_tables = """
DROP TABLE products;
"""

engine = sqlalchemy.create_engine('mssql+pyodbc://user:password@dsn')
connection = engine.connect()
cursor = engine.raw_connection().cursor()

Session = sqlalchemy.orm.sessionmaker(bind=connection)
session = Session()
session.execute(create_tables)

metadata = MetaData(connection)

class Products(Base):
    __table__ = Table('products', metadata, autoload=True)

try:
    session.execute(check_tables)
    session.execute(insert_values)
    session.commit()

    query = session.query(Products).filter(
        Products.idn.in_(list(range(0, 3000)))
    )
    query.all()

    with open("query.sql", "w") as f:
        f.write(str(query))
finally:
    session.execute(delete_tables)
    session.commit()
When you do a straightforward .in_(list_of_values), SQLAlchemy renders the following SQL …
SELECT team.prov AS team_prov, team.city AS team_city FROM team WHERE team.prov IN (?, ?)
… where each value in the IN clause is specified as a separate parameter value. pyodbc sends this to SQL Server as …
exec sp_prepexec @p1 output,N'@P1 nvarchar(4),@P2 nvarchar(4)',N'SELECT team.prov AS team_prov, team.city AS team_city FROM team WHERE team.prov IN (@P1, @P2)',N'AB',N'ON'
… so you hit the limit of 2100 parameters if your list is very long. Presumably, mxODBC inserted the parameter values inline before sending it to SQL Server, e.g.,
SELECT team.prov AS team_prov, team.city AS team_city FROM team WHERE team.prov IN ('AB', 'ON')
You can get SQLAlchemy to do that for you with
import sqlalchemy as sa

provinces = ["AB", "ON"]
stmt = (
    session.query(Team)
    .filter(
        Team.prov.in_(sa.bindparam("p1", expanding=True, literal_execute=True))
    )
    .statement
)
result = list(session.query(Team).params(p1=provinces).from_statement(stmt))
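As a sanity check on the numbers involved, you can see the arithmetic of the failure without a database: each element of the list becomes one placeholder, and SQL Server caps a single batch at 2100 parameters, so a 3000-element list can never be sent in parameterized form. A rough illustration (plain Python; the function is mine, not part of pyodbc):

```python
SQL_SERVER_PARAM_LIMIT = 2100

def parameterized_in(column, values):
    """Build an IN clause the way pyodbc receives it from SQLAlchemy:
    one '?' placeholder per value, with the values sent separately."""
    placeholders = ", ".join("?" for _ in values)
    return f"{column} IN ({placeholders})", list(values)

sql, params = parameterized_in("team.prov", range(3000))
print(len(params))                           # 3000
print(len(params) > SQL_SERVER_PARAM_LIMIT)  # True: the batch is rejected
```

With literal_execute=True the values are rendered into the SQL text instead, so no placeholders are created and the cap never comes into play.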