We (the IT department at work) are looking to build an iPad app that will take numeric IDs and provide a simple lookup in a table. It is essentially a primary-key search on a single table, displaying one field after minor processing.
The Caveat:
There are 4.5 million rows in this table, and it needs a lookup time of at most 1 second. It will not have an internet connection, so the lookup has to happen on the device. We have a few ideas, but which makes the most sense?
SQLite: will it stand up to such abuse? Can it handle that many rows, and will it do so well?
Flat-file search: we can loop over the file ourselves, or split it up by the first few digits to do some slightly more intelligent indexing.
Offload to some third-party database app on the device that can handle it through an API.
Something else entirely which in our infinite wisdom we have missed.
I must take this chance to thank Apple for making this so easy to test ourselves: without a Mac or a developer licence, we don't want to commit over £2000 until we know we can get it done well.
SQLite is amazingly fast. A test table with 4.5 million records has this structure:
CREATE TABLE testtable (numericid INTEGER PRIMARY KEY, testtext TEXT);
It is filled with increasing values for numericid (0, 1, ...) and a corresponding string for testtext.
Doing all the inserts atomically (inside a single transaction) took 1 hour 42 minutes on a MacBook Pro (2009). The resulting SQLite file is 94 MB in size.
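For reference, a minimal sketch of how a bulk insert of this kind might be done with the SQLite C API; dbPath is a placeholder for the database file path, and error handling is trimmed:

sqlite3 *db = NULL;
sqlite3_open([dbPath UTF8String], &db); // dbPath: path to the .sqlite file

// Wrapping every insert in one transaction avoids a disk sync per row
sqlite3_exec(db, "BEGIN TRANSACTION;", NULL, NULL, NULL);

sqlite3_stmt *insertStmt = NULL;
sqlite3_prepare_v2(db, "INSERT INTO testtable (numericid, testtext) VALUES (?, ?);",
                   -1, &insertStmt, NULL);

for (int i = 0; i < 4500000; i++) {
    @autoreleasepool {
        sqlite3_bind_int(insertStmt, 1, i);
        NSString *text = [NSString stringWithFormat:@"text%d", i];
        sqlite3_bind_text(insertStmt, 2, [text UTF8String], -1, SQLITE_TRANSIENT);
        sqlite3_step(insertStmt);
        sqlite3_reset(insertStmt); // reuse the compiled statement for the next row
    }
}

sqlite3_finalize(insertStmt);
sqlite3_exec(db, "COMMIT;", NULL, NULL, NULL);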
Inside the iOS app, the database is opened in the viewDidLoad method. A simple button-triggered database query looks like this:
- (void)btnPressed:(UIButton *)sender {
    NSLog(@"btn pressed, start");
    sqlite3_stmt *statement = nil;
    NSString *querystring = @"SELECT * FROM testtable WHERE numericid = 2571312;";
    const char *sql = [querystring UTF8String];
    NSLog(@"sql is: %s", sql);
    if (sqlite3_prepare_v2(dbConnection, sql, -1, &statement, NULL) != SQLITE_OK) {
        NSLog(@"sql problem occurred with: %s", sql);
        NSLog(@"%s", sqlite3_errmsg(dbConnection));
    }
    else
    {
        while (sqlite3_step(statement) == SQLITE_ROW) {
            // column 0 is numericid, column 1 is testtext
            NSString *numericid = [NSString stringWithUTF8String:(char *)sqlite3_column_text(statement, 0)];
            NSString *testtext  = [NSString stringWithUTF8String:(char *)sqlite3_column_text(statement, 1)];
            NSLog(@"%@", [NSString stringWithFormat:@"%@ (%@)", numericid, testtext]);
        } // while
    }
    sqlite3_finalize(statement);
    NSLog(@"btn pressed, finished");
}
results in an output of:
2012-08-10 17:51:36.734 DBQueryTest[28462:707] Database Successfully Opened
2012-08-10 17:51:39.083 DBQueryTest[28462:707] btn pressed, start
2012-08-10 17:51:39.087 DBQueryTest[28462:707] sql is: SELECT * FROM testtable WHERE numericid = 2571312;
2012-08-10 17:51:39.099 DBQueryTest[28462:707] 2571312 (text2571312)
2012-08-10 17:51:39.102 DBQueryTest[28462:707] btn pressed, finished
So a query takes under 19 ms! This was reproducible for several values of numericid, though I didn't run a fully randomized test for statistical evaluation.
Conclusion: This test setup fulfills your requirements. SQLite is definitely a way to go.
UPDATE:
A quick random-access test with 100,000 key values confirms the first result. Leaving SQL statement string creation and the time-consuming NSLog output out of the measurement, the average database query time drops by an order of magnitude to:
average query time: 1.8 ms
mean deviation: 0.4 ms
maximum query time: 25.9 ms
minimum query time: 0.6 ms
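The measurement loop itself isn't shown above; a sketch of how it might be structured, assuming the statement is prepared once and a random key is bound on each pass (dbConnection as in the earlier snippet):

#import <QuartzCore/QuartzCore.h> // for CACurrentMediaTime()

sqlite3_stmt *stmt = NULL;
sqlite3_prepare_v2(dbConnection, "SELECT testtext FROM testtable WHERE numericid = ?;",
                   -1, &stmt, NULL);

const int runs = 100000;
double total = 0.0, worst = 0.0, best = 1e9;
for (int i = 0; i < runs; i++) {
    sqlite3_bind_int(stmt, 1, (int)arc4random_uniform(4500000)); // random key in range

    CFTimeInterval t0 = CACurrentMediaTime(); // time only the query itself
    sqlite3_step(stmt);
    CFTimeInterval dt = CACurrentMediaTime() - t0;

    sqlite3_reset(stmt); // the key is rebound on the next pass
    total += dt;
    if (dt > worst) worst = dt;
    if (dt < best)  best = dt;
}
sqlite3_finalize(stmt);
NSLog(@"average: %f ms, max: %f ms, min: %f ms",
      total / runs * 1000.0, worst * 1000.0, best * 1000.0);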
Although SQLite should work fine, it is probably overkill. A simple binary search should be all you need. If it is too slow on one big file, split it into 10 or 100 subfiles by the first one or two digits.
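A minimal sketch of that binary search over a single flat file, assuming fixed-width records sorted ascending by ID; the 64-byte record layout (a 4-byte ID followed by a 60-byte text field) is invented for illustration:

#define RECORD_SIZE 64 // assumed layout: 4-byte ID + 60-byte text field

- (NSData *)recordForID:(uint32_t)targetID
                 inFile:(NSFileHandle *)file
            recordCount:(long long)count {
    long long lo = 0, hi = count - 1;
    while (lo <= hi) {
        long long mid = lo + (hi - lo) / 2;
        [file seekToFileOffset:(unsigned long long)(mid * RECORD_SIZE)];
        NSData *record = [file readDataOfLength:RECORD_SIZE];
        uint32_t recordID = 0;
        [record getBytes:&recordID length:sizeof(recordID)];
        if (recordID == targetID) return record; // found after O(log n) seeks
        if (recordID < targetID) lo = mid + 1;
        else                     hi = mid - 1;
    }
    return nil; // not present
}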
Optionally, you could load the data into a CFArray, which supports binary search (see CFArrayBSearchValues). While this would have a performance penalty on the initial load, it may perform better on subsequent searches than a custom binary search over one or more files.
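A sketch of how CFArrayBSearchValues might be used, assuming the keys were loaded at startup into a sorted CFArray of CFNumbers (sortedIDs is illustrative):

// Comparator for CFNumber elements
static CFComparisonResult CompareIDs(const void *a, const void *b, void *context) {
    return CFNumberCompare((CFNumberRef)a, (CFNumberRef)b, NULL);
}

int target = 2571312;
CFNumberRef key = CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &target);
CFIndex count = CFArrayGetCount(sortedIDs);
CFIndex idx = CFArrayBSearchValues(sortedIDs, CFRangeMake(0, count), key,
                                   CompareIDs, NULL);
// CFArrayBSearchValues returns an insertion index, so confirm an exact match
if (idx < count &&
    CFNumberCompare((CFNumberRef)CFArrayGetValueAtIndex(sortedIDs, idx),
                    key, NULL) == kCFCompareEqualTo) {
    // found: read the display text from a parallel array at the same index
}
CFRelease(key);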
As I've already sort of said in a comment, this isn't hard to test: you can load a large database into a Core Data store and create a test app to see how fast the results are returned.
I say Core Data over SQLite, as it's optimised for the platform (even though it uses SQLite as a storage medium), and it's easier to write code that fetches values and displays them.
Edited to add
I created a sample project that loaded up two datasets, one of 11 records and one of 400,000 records, using Core Data on iOS 5.
I ran the test on my iPad 2 (yes, a 2).
Search times varied between 3 and 8 ms (0.003-0.008 s), with no appreciable difference between the smaller and larger datasets.
This is unoptimised code running in debug mode, written without any sort of performance work: the search predicate was created on every search rather than cached, for example, and there was no threading.
The size of the data store for 400,000 records is 17.2 MB, so even the full 4.5 million records would easily fit on an iPad.
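For reference, the kind of fetch involved looks roughly like this; the entity and attribute names are assumptions for illustration, not taken from the sample project:

// Assumes an entity "Record" with numericid and testtext attributes
NSFetchRequest *request = [NSFetchRequest fetchRequestWithEntityName:@"Record"];
// A tuned version would build this predicate once and cache it
request.predicate = [NSPredicate predicateWithFormat:@"numericid == %d", 2571312];
request.fetchLimit = 1; // primary-key style lookup: one row at most

NSError *error = nil;
NSArray *results = [managedObjectContext executeFetchRequest:request error:&error];
if (results.count > 0) {
    NSManagedObject *record = [results objectAtIndex:0];
    NSLog(@"%@", [record valueForKey:@"testtext"]);
}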
Loading this data into memory on iOS is a no-go.
You should use SQLite. That's what it's made for, and you will not do better hand-rolling file I/O code.
Source: https://stackoverflow.com/questions/11901931/ios-implementation-theory