From a performance perspective, how efficient is it to use a MySQL temporary table for a highly used website feature?

眼角桃花 2021-02-14 03:48

I'm attempting to write a search functionality for a website, and I've decided upon an approach of using MySQL temporary tables to handle the data input, via the query below:

3 Answers
  • 2021-02-14 04:28

    What you stated is correct: the temporary table will only be visible to the current user/connection. Still, there is some overhead and a few other problems, such as:

    • For each of the thousands of searches you are going to create and fill that table (and drop it later) - not per user, but per search. Each search will most likely re-execute the script, and "per session" does not mean PHP session here - it means database session (an open connection). A rough sketch of that per-search work follows this list.
    • You will need the CREATE TEMPORARY TABLES privilege, which you might not have.
    • That table really should use the MEMORY engine, which eats more RAM than it looks like: even with VARCHAR columns, MEMORY tables use fixed-length row storage.
    • If your heuristics later need to refer to that table twice (like SELECT xyz FROM patternmatch AS pm1, patternmatch AS pm2 ...), you are out of luck: a TEMPORARY table cannot be opened more than once in the same query.
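
    To make that per-search overhead concrete, here is a rough sketch of what every single search would have to run (table layout and column sizes are assumptions, since the original query is not shown):

        -- runs once per search, on whichever connection handles it
        CREATE TEMPORARY TABLE patternmatch (
            term VARCHAR(50)
        ) ENGINE=MEMORY;   -- MEMORY rows are fixed length: every row reserves the full 50 characters

        INSERT INTO patternmatch (term) VALUES ('some'), ('search'), ('query');

        -- ... JOIN against images here ...

        DROP TEMPORARY TABLE patternmatch;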

    Next, it would be easier for you - and also for the database - to add the LIKE '%xyz%' directly to the WHERE clause of your query on the images table. It does the same thing without the overhead of creating a TEMP TABLE and joining against it.

    In any case - no matter which way you go - that WHERE will be horribly slow. Even if you add an index on images.name, you will most likely need LIKE '%xyz%' rather than LIKE 'xyz%', so that index will not get used.
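
    A minimal sketch of that direct approach, assuming the terms come straight from your PHP code and the column searched is images.name:

        -- same result as the temp-table JOIN, with nothing to create or drop
        SELECT *
        FROM images
        WHERE name LIKE '%some%'
           OR name LIKE '%search%'
           OR name LIKE '%query%';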

    "I'm asking whether a session-specific temporary table to handle the search input by the user (created on a search, dropped at the end of a session) is an appropriate way of handling a search functionality."

    No. :)

    Alternative options

    MySQL has a built-in full-text search (since 5.6 also for InnoDB) that can even give you that scoring: I highly recommend giving it a read and a try. You can be sure that the database knows better than you how to do that search efficiently.
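
    A minimal sketch of what that could look like (the column to index is an assumption - pick whatever your search actually targets):

        -- one-time setup (works on InnoDB since MySQL 5.6, on MyISAM before that)
        ALTER TABLE images ADD FULLTEXT INDEX ft_images_name (name);

        -- per search: MATCH ... AGAINST gives you a relevance score to sort by
        SELECT name,
               MATCH(name) AGAINST('some search query') AS score
        FROM images
        WHERE MATCH(name) AGAINST('some search query')
        ORDER BY score DESC;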

    If you are going to use MyISAM instead of InnoDB, be aware of the often-overlooked limitation that FULLTEXT searches only return anything if the matching rows are fewer than 50% of the table's total rows.

    Other things that you might want to look at are, for example, Solr (a nice introduction to the topic is the beginning of http://en.wikipedia.org/wiki/Apache_Solr). We are using it at our company and it does a great job, but it requires quite some learning.

    Summary

    The solution to your current problem itself (the search) is to use the FULLTEXT capabilities.

    "If I have hundreds of thousands of searches per second, what sort of performance issues might I encounter? Is there any better way of implementing a search functionality?"

    To give you a number: 10,000 calls per second is already not "trivial" - with hundreds of thousands of searches per second, the performance issues you will encounter are everywhere in your set-up. You are going to need a couple of servers, load balancing and tons of other supporting tech. And one piece of that will be, for example, Solr ;)

  • 2021-02-14 04:39
    1. Creating temporary tables on disk is relatively expensive. In your scenario it sounds like it'll be slower than it's worth.
    2. It's usually only worthwhile to create temporary tables in memory, but then you need to know you have enough memory available at all times (see the sketch after this list for how to check that). If you plan to support that many searches per second, this is not a good solution.
    3. MySQL has full-text searching built in. It's good for small systems and would likely perform far better than your temp table and JOIN. But if you want to support thousands of searches per second I would not recommend it: it could eat up too much of your overall database capacity. Also, before MySQL 5.6 it forces you to use MyISAM for storage, which might have its own issues in your scenario.
    4. For so many searches you'll want to offload the work to another system. Plenty of searching systems with scoring already exist. Take a look at ElasticSearch, Solr/Lucene, Redis, etc.
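
    If you do experiment with in-memory temporary tables (point 2), MySQL lets you check the relevant limits and counters, for example:

        -- how much RAM an in-memory temporary table may use
        SHOW VARIABLES LIKE 'tmp_table_size';
        SHOW VARIABLES LIKE 'max_heap_table_size';

        -- how many temporary tables have been created so far, and how many spilled to disk
        SHOW GLOBAL STATUS LIKE 'Created_tmp%tables';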
  • 2021-02-14 04:46

    From the code you give, I really don't think tmp tables are needed, nor is FULLTEXT searching. But ... about tmp table performance:

    The creation and cleanup of the temp table is not written to the transaction logs, so the I/O involved will be relatively cheap for the OS. If the temporary tables are small and short-lived, and you have plenty of buffer cache available to the OS, the disk realistically won't even be touched. If you think it will be anyway, get an SSD drive and more RAM.

    But if you are realistic that you are looking at hundreds of thousands of searches per second, then you have a big engineering project on your hands. Why not just do:

    SELECT images.* FROM images WHERE name IN ('some', 'search', 'query')

    ?
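
    If name is not indexed yet (an assumption - the schema isn't shown), a plain index makes that IN (...) lookup cheap; note that it will not help LIKE '%xyz%' searches:

        -- hypothetical index name; useful for exact matches, useless for leading-wildcard LIKE
        CREATE INDEX idx_images_name ON images (name);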
