I need to repeatedly query a MySQL DB from Python, as the data is rapidly changing. Each time the data is read, it is transferred into a list.
I had assumed that simply re-running the query inside a loop would return the updated values, but the results I get back never change.
I'd make a few changes. First, declare the cursor before the while loop. I would also use a buffered cursor. And finally, close the cursor and the connection once the loop is done. Hope this helps.
import mysql.connector
from mysql.connector import Error
from time import sleep

# Create an empty list to hold the values read from the database.
listSize = 100
myList = []
for i in range(listSize):
    myList.append([0, 0, 0])

# Connect to the MySQL server.
mydb = mysql.connector.connect(host='localhost',
                               database='db',
                               user='user',
                               password='pass')

# Create the cursor once, before the loop; buffered so each execute()
# fetches the full result set, and dictionary=True so rows come back as dicts.
mycursor = mydb.cursor(buffered=True, dictionary=True)

# Main loop
while True:
    # SQL query
    sql = "SELECT * FROM table"

    # Read the database; each row is returned as a dictionary.
    mycursor.execute(sql)
    myresult = mycursor.fetchall()

    # Transfer the data into the list.
    for row in myresult:
        myList[int(row["rowID"])] = (row["a"], row["b"], row["c"])
        print(myList[int(row["rowID"])])

    print("---")
    sleep(0.1)

# Clean up once the loop ends.
mycursor.close()
mydb.close()
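Because the while True loop never exits on its own, those last two close() calls are only reached if something breaks out of the loop. One way to guarantee the cleanup, shown here only as a sketch that reuses the names from the snippet above, is to wrap the loop in try/finally so the cursor and connection are closed even when the script is stopped with Ctrl-C:

try:
    while True:
        mycursor.execute("SELECT * FROM table")
        for row in mycursor.fetchall():
            myList[int(row["rowID"])] = (row["a"], row["b"], row["c"])
        sleep(0.1)
finally:
    # Runs on normal exit, on exceptions, and on KeyboardInterrupt alike.
    mycursor.close()
    mydb.close()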
You need to commit the connection after each query. This commits the current transaction and ensures that the next (implicit) transaction will pick up changes made while the previous transaction was active.
# Main loop
while True:
    # SQL query
    sql = "SELECT * FROM table"

    # Read the database, storing each row as a dictionary.
    mycursor = mydb.cursor(dictionary=True)
    mycursor.execute(sql)
    myresult = mycursor.fetchall()

    # Transfer the data into the list.
    for row in myresult:
        myList[int(row["rowID"])] = (row["a"], row["b"], row["c"])
        print(myList[int(row["rowID"])])

    # Commit! This ends the implicit transaction so the next
    # execute() sees a fresh snapshot of the data.
    mycursor.close()
    mydb.commit()

    print("---")
    sleep(0.1)
The concept here is isolation levels. From the MySQL docs:
REPEATABLE READ
This is the default isolation level for InnoDB. Consistent reads within the same transaction read the snapshot established by the first read.
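In other words, because the connection sits inside one long-running transaction, every repeated SELECT keeps reading the snapshot taken by the very first read, so the list never reflects new data. Committing after each query (as above) ends that transaction and starts a fresh one. Another option, sketched below using mysql-connector-python's autocommit property (this is my own illustration, not part of the quoted answer), is to enable autocommit so each SELECT runs in its own transaction and therefore sees a fresh snapshot:

import mysql.connector
from time import sleep

mydb = mysql.connector.connect(host='localhost',
                               database='db',
                               user='user',
                               password='pass')

# With autocommit on, every statement is its own transaction, so each
# SELECT establishes a new snapshot instead of reusing the first one.
mydb.autocommit = True

mycursor = mydb.cursor(dictionary=True)
while True:
    mycursor.execute("SELECT * FROM table")
    for row in mycursor.fetchall():
        print(row["rowID"], row["a"], row["b"], row["c"])
    sleep(0.1)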