Table gets locked when calling a SQL Server stored procedure from Python with pyodbc
I have a table called Test that I made for testing; I just want to see if my code works. My Python code is very simple and only calls one stored procedure.
Here is my SQL Server stored procedure script:
ALTER PROCEDURE [dbo].[TestService]
AS
BEGIN
    SET NOCOUNT ON;

    declare @C1 int

    BEGIN TRANSACTION;
    insert into Test (a1) values ('Service')
    COMMIT TRANSACTION;

    SELECT @C1 = COUNT(*) FROM test (nolock)

    SELECT GETDATE(), @C1 as t
END
and my Python code is this:
import pyodbc
import pandas as pd

#df_results.drop(axis=0, inplace=True)

ConnectionString = "DRIVER={SQL Server};Server=Serv;Database=DB;User Id=user;Password=*****;"

conn = pyodbc.connect(ConnectionString)
df_results = pd.read_sql("EXEC TestService", conn)
print(df_results)
Before running the Python code, I ran this SELECT:
SELECT * FROM Test (NoLock)
and the output was 4 records.
I ran the Python script and got this output:
0   2021-12-19 00:09:36.887   5
which means the record was inserted and the total number of records is 5, but when I run
SELECT * FROM Test (NoLock)
I still get only 4 records, and when I try
SELECT * FROM Test
it times out.
How can I fix that?
Answer
read_sql won't commit the transaction. You need to explicitly commit it. pyodbc opens connections with autocommit off, so the EXEC runs inside an implicit transaction; the COMMIT TRANSACTION inside the procedure only ends the nested transaction it started, and the inserted row stays locked until you commit from the Python side.
conn = pyodbc.connect(ConnectionString)
df_results = pd.read_sql("EXEC TestService", conn)
print(df_results)
conn.commit()
conn.close()
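Alternatively, if you don't need client-side transaction control, you can open the connection with autocommit enabled so every statement is committed as soon as it executes. This is a minimal sketch assuming the same connection string and stored procedure as above:

import pyodbc
import pandas as pd

# Same DSN-less connection string as above (server name and credentials are placeholders)
ConnectionString = "DRIVER={SQL Server};Server=Serv;Database=DB;User Id=user;Password=*****;"

# autocommit=True tells pyodbc not to keep an implicit transaction open,
# so the INSERT performed by the stored procedure is committed immediately
conn = pyodbc.connect(ConnectionString, autocommit=True)
df_results = pd.read_sql("EXEC TestService", conn)
print(df_results)
conn.close()

With autocommit on there is nothing left for Python to commit, so a plain SELECT * FROM Test from another session should no longer block.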