Question
I have a discussion database and I need a large amount of test data, in samples of different sizes. Please see the existing SELECT, JOIN and CREATE queries; scroll down in the link.
How can I automatically generate test data for the database?
How can I generate test data in samples of different sizes?
Is there a ready-made tool?
Answer 1:
Here are a couple of suggestions for free tools that generate test data:
- Databene Benerator: supports many JDBC-capable database brands, uses an XML format compatible with DbUnit, GPL license.
- Super Smack: originally a load-testing tool for MySQL, it also supports PostgreSQL and includes a mock-data generator.
  - A current version of Super Smack appears to be available here
I asked a similar question here on StackOverflow in February, and the two choices above seemed like the best options.
Answer 2:
I know this question is super dated, but I was looking for the answer to this exact question today and I came across this:
http://wiki.postgresql.org/wiki/Sample_Databases
Out of the options listed (including built-in tools like pgbench), pgFoundry has several compelling options that work perfectly for the test cases I am working on.
I thought it might help someone like me, so there it is.
Answer 3:
I'm not sure how to automatically generate data and insert it into the database (I'm sure you could pull it off with a Python script or something), but if you're just looking for endless blabbering to stick into a db, this should be helpful.
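For what it's worth, a plain-SQL alternative to an external script is PostgreSQL's generate_series driving an INSERT ... SELECT, where the series bounds control the sample size. A minimal sketch, assuming a hypothetical posts(id, author_id, body, created_at) table (not the asker's actual schema):

    -- Hypothetical table; adjust the column list to your schema.
    -- The upper bound of generate_series controls how many rows you get.
    INSERT INTO posts (id, author_id, body, created_at)
    SELECT g,                                        -- sequential primary key
           (random() * 99)::int + 1,                 -- random author id between 1 and 100
           md5(random()::text),                      -- 32 characters of filler text
           now() - random() * interval '365 days'    -- timestamps spread over the past year
    FROM generate_series(1, 10000) AS g;

Lowering the upper bound to 100 or raising it to 1,000,000 gives differently sized samples from the same statement.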
Answer 4:
I'm not a Postgres person, but in many other databases I've used, a simple mechanism for generating large quantities of test data is a cross join: joining a few small seed tables with no join condition produces every combination of their rows, so the row counts multiply.
Here's a nice blog post on it (SQL Server specific though).
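Roughly the same idea in PostgreSQL terms; the users, topics and posts tables below are made-up seed and target tables, not from the question's schema:

    -- 3 seed users x 4 seed topics x 1000 sequence values = 12,000 rows,
    -- without writing 12,000 INSERT statements.
    INSERT INTO posts (user_id, topic_id, body)
    SELECT u.id,
           t.id,
           'generated post #' || s               -- simple generated text
    FROM users u
    CROSS JOIN topics t
    CROSS JOIN generate_series(1, 1000) AS s;

Adding or removing a seed table (or changing the generate_series bound) scales the output up or down, which also answers the "different sized samples" part of the question.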
Source: https://stackoverflow.com/questions/1295641/automatically-generated-test-data-to-a-db-from-a-schema