in my view there is always a trade-off
as stated above, it depends on how you intend to collect and use the data you generate
databases already have, out of the box, several utilities and features that assist in manipulating the data, i.e. input, storage, analysis, sorting, comparison, integrity checks, reporting, retrieval, search, filters, security, multi-user edits etc
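as a rough sketch of what "out of the box" means here, the snippet below uses python's built-in sqlite3 module (table and column names are made up for illustration): storage, sorting, filtering and integrity checks all come from the database itself, with no custom code

```python
import sqlite3

# in-memory database for illustration; a real app would use a file path
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# integrity checks come for free via constraints
cur.execute("""
    CREATE TABLE users (
        id    INTEGER PRIMARY KEY,
        name  TEXT NOT NULL,
        email TEXT UNIQUE
    )
""")
cur.executemany("INSERT INTO users (name, email) VALUES (?, ?)",
                [("alice", "alice@example.com"), ("bob", "bob@example.com")])

# sorting, filtering and retrieval are one-liners
rows = cur.execute(
    "SELECT name FROM users WHERE name LIKE 'a%' ORDER BY name").fetchall()
print(rows)  # [('alice',)]

# the UNIQUE constraint rejects a duplicate email automatically
duplicate_rejected = False
try:
    cur.execute("INSERT INTO users (name, email) VALUES (?, ?)",
                ("eve", "alice@example.com"))
except sqlite3.IntegrityError:
    duplicate_rejected = True
print("duplicate rejected:", duplicate_rejected)
```

with a flat file, every one of those behaviours (the unique check, the sorted filtered lookup) would be code you write and maintain yourself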
you can however achieve all this with a flat file, but you must be prepared to provide the interface that does the things most databases do. by carefully structuring the file and using well-defined columns/fields per row data item you can even achieve relational behaviour. the difference is that whereas databases have these features as standard, you would have to create them yourself through script or code. all of these features can be created, including all types of backups.
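to illustrate the "relational behaviour with a flat file" idea, here is a minimal sketch (file contents and field names are hypothetical): two delimited files share a key column, and a hand-rolled join relates them, which is exactly the kind of interface code you take on yourself

```python
import csv
import io

# hypothetical flat files: one record per row, well-defined columns,
# with an id column acting as the key that relates the two files
customers_txt = "id,name\n1,alice\n2,bob\n"
orders_txt = "order_id,customer_id,item\n10,1,book\n11,2,pen\n12,1,lamp\n"

# index one file by its key column (a real script would open actual files)
customers = {row["id"]: row
             for row in csv.DictReader(io.StringIO(customers_txt))}

# a hand-rolled "join": what a database gives you as SELECT ... JOIN
joined = [
    (customers[row["customer_id"]]["name"], row["item"])
    for row in csv.DictReader(io.StringIO(orders_txt))
]
print(joined)  # [('alice', 'book'), ('bob', 'pen'), ('alice', 'lamp')]
```

note that even this toy version silently assumes every customer_id exists in the customers file; referential integrity, like everything else, becomes your responsibility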
you must however determine which gives you the optimal benefit, i.e. do you gain more by writing your own scripts to manipulate the data (as listed above) and having a much lighter and perhaps faster solution, or do you gain more by cutting down your own development time and instead deploying either a standard or customized database solution?
my own take remains that when someone says "user friendly", the scale is inversely proportional: in the ideal case, the more effort the programmer/developer spends behind the scenes handling all sorts of scenarios, the easier it is for the end user; the less effort by the programmer/developer, the more effort the end user will have to spend manipulating the data