MySQLdb is extremely slow with large result sets

失恋的感觉 · asked 2021-01-12 12:11

I executed the following query in both phpMyAdmin and MySQLdb (Python).

SELECT *, (SELECT CONCAT(`id`, '|', `name`, '|', `image_code`)
FROM `model_arti
2 Answers
  • 2021-01-12 13:03

    If you expect an SQL query to have a large result set which you then plan to iterate over record-by-record, then you may want to consider using the MySQLdb SSCursor instead of the default cursor. The default cursor stores the result set in the client, whereas the SSCursor stores the result set in the server. Unlike the default cursor, the SSCursor will not incur a large initial delay if all you need to do is iterate over the records one-by-one.

    You can find a bit of example code on how to use the SSCursor here.

    For example, try:

    import MySQLdb.cursors
    
    self.db = MySQLdb.connect(host="localhost", user="root", passwd="", db="ibeat",
                              cursorclass=MySQLdb.cursors.SSCursor)
    

    (The rest of the code can remain the same.)
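    With a server-side cursor, the consumption pattern matters as much as the connection setup. The sketch below shows one way to iterate such a result set; `stream_rows` and `batch_size` are illustrative names, not part of MySQLdb's API, and the code is written against the plain DB-API so it works with any cursor that supports `fetchmany()`:

    ```python
    def stream_rows(conn, query, batch_size=1000):
        """Yield rows one at a time from a large result set.

        Intended for a connection opened with
        cursorclass=MySQLdb.cursors.SSCursor, so unfetched rows stay on
        the server.  batch_size only controls how many rows cross the
        wire per round trip; callers still see a flat stream of rows.
        """
        cursor = conn.cursor()
        try:
            cursor.execute(query)
            while True:
                batch = cursor.fetchmany(batch_size)
                if not batch:
                    break
                for row in batch:
                    yield row
        finally:
            # Close the cursor once the result set is exhausted.
            cursor.close()
    ```

    With the connection opened as above, you could then write `for row in stream_rows(self.db, query): ...` and iteration starts almost immediately, since the full result set is never materialized on the client.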

  • 2021-01-12 13:09

    phpMyAdmin places a limit on all queries so its interface never renders a huge result set. If your query would normally return 1,000,000 rows and phpMyAdmin caps that at 1,000 (or whatever its default is), then you should expect much longer processing times when Python queries and fetches the entire result set.

    Try placing a LIMIT in the Python query that matches phpMyAdmin's limit, then compare the times.
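    One way to sketch that comparison, assuming the query has no LIMIT of its own; `with_limit` and `time_query` are hypothetical helper names, while `cursor.execute`/`fetchall` are standard DB-API calls:

    ```python
    import time

    def with_limit(query, limit):
        # Append a LIMIT clause so Python fetches the same number of rows
        # that phpMyAdmin's interface implicitly caps the query at.
        # Assumes the query does not already contain a LIMIT clause.
        return "%s LIMIT %d" % (query.rstrip().rstrip(";"), limit)

    def time_query(conn, query):
        # Execute the query, fetch everything, and report
        # (row_count, elapsed_seconds) for a direct comparison.
        start = time.perf_counter()
        cursor = conn.cursor()
        cursor.execute(query)
        rows = cursor.fetchall()
        cursor.close()
        return len(rows), time.perf_counter() - start
    ```

    Comparing `time_query(db, with_limit(sql, 1000))` against `time_query(db, sql)` shows whether the slowdown comes from the result-set size rather than from MySQLdb itself.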
