We use an Oracle (or PostgreSQL) database and an application server to execute integration tests. To isolate each test from the others, the database schema is dropped and re-created before each test.
As you can see, this is a time-consuming process. The application uses 100+ tables. We are thinking of writing custom SQL to delete unwanted data from each table. Is there a better way to save and restore database state?
(It appears DBUnit could do this; I have not tried it yet.)
A single test involves:
- Create the database schema.
- Start the app server.
- Start multiple client applications.
- Execute and verify.
We have 5000-odd tests, taking around 700 hours in total. (We run them on a grid environment, so they finish overnight.)
Most of the tests use small data sets, up to about 10 MB.
Oracle Flashback allows you to restore a table to a specified point in time with a single SQL statement; see the Oracle Flashback documentation for details.
I don't know whether PostgreSQL has a similar feature.
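For example, a minimal sketch of flashing back one table (the table name and interval are illustrative; the table must have row movement enabled beforehand):
ALTER TABLE orders ENABLE ROW MOVEMENT;   -- required once per table before FLASHBACK TABLE
FLASHBACK TABLE orders TO TIMESTAMP SYSTIMESTAMP - INTERVAL '10' MINUTE;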
What version of Oracle (Enterprise 10g+ or Standard)? Assuming you're using Enterprise Edition, you can use Flashback Database. Build out your baseline DB, then:
- Create a guaranteed restore point.
- Run your test.
- Capture results somewhere outside the database.
- Flash the database back to the restore point.
- Start over.
This should be enough to get you started. If you want additional detail, let me know.
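A rough sketch of one such cycle, assuming a privileged SQL*Plus session and an illustrative restore point name (the database must be mounted, not open, for the flashback itself):
CREATE RESTORE POINT before_test GUARANTEE FLASHBACK DATABASE;
-- ... run the test and capture results outside the database ...
SHUTDOWN IMMEDIATE
STARTUP MOUNT
FLASHBACK DATABASE TO RESTORE POINT before_test;
ALTER DATABASE OPEN RESETLOGS;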
For PostgreSQL, I think using a template database is quicker than re-creating all tables individually.
Just create a new database with a different name than the one you normally use (e.g. my_template_db), but containing all the tables you need. You can put test data in there as well.
When you run the test, drop the database you test against and re-create it from the template:
DROP DATABASE my_test_db;
CREATE DATABASE my_test_db WITH TEMPLATE my_template_db;
There were some optimizations in PostgreSQL 9.0 that speed this up, so this approach may well be faster than re-creating all the tables through SQL.
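One practical detail: both DROP DATABASE and CREATE DATABASE ... TEMPLATE fail while sessions are still connected to the databases involved, so a test harness usually kicks connections off first. A minimal sketch, assuming PostgreSQL 9.2+ (where pg_stat_activity exposes pid; older versions call it procpid):
SELECT pg_terminate_backend(pid)
  FROM pg_stat_activity
 WHERE datname = 'my_test_db' AND pid <> pg_backend_pid();
DROP DATABASE my_test_db;
CREATE DATABASE my_test_db WITH TEMPLATE my_template_db;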
For Oracle, you can use this PL/SQL package: snapshot.sql
We have 500 tables, 30 of which are restored after each test, and it takes ~500ms on average.
Usage is extremely simple:
EXECUTE SNAPSHOT.TAKE_SNAPSHOT('snapshot name');
EXECUTE SNAPSHOT.RESTORE_SCHEMA('snapshot name');
If each test fits in a single transaction, you can simply roll back. Is that an option?
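A sketch of that pattern (it only works if the test itself never commits and issues no DDL, since Oracle commits implicitly on DDL):
BEGIN;       -- PostgreSQL; Oracle starts a transaction implicitly on the first statement
-- ... run the test's inserts/updates and assertions here ...
ROLLBACK;    -- discards every change, leaving the database state untouched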
Questions
- What kind of database are we talking about?
- Is it multi-terabyte in size or just a few gigabytes?
- How much data is in it?
- How are constraints defined?
- How quickly should this be done?
- How long do your tests take? (a few days or a few weeks)
- How much storage is available?
- How many updates are made during a test?
- What database version do you have?
Suggestions
- If you have plenty of storage and not very many updates, try Flashback Database.
- If you are testing on copies of the prod database, use cloning (and, of course, data masking); this is also a good test of the prod backup.
- If you have a nice test database with valuable test data in it, use backup/restore.
- If you have an 11g database configured with a physical standby, you could try testing on a snapshot standby database.
Source: https://stackoverflow.com/questions/5040362/database-restore-to-particular-state-for-testing