Fast Search in Logs [closed]


The problem is that using XML will make your log file even bigger. I would suggest either splitting your log files up by date or by line count, or otherwise using a file-based database engine such as SQLite.
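As a rough illustration of the SQLite route, here is a minimal sketch that loads a plain-text log into a table so it can be queried with SQL instead of scanned linearly. The file name `app.log` and the one-column-per-line schema are assumptions for illustration, not a prescribed format:

```python
# Minimal sketch: load a plain-text log into SQLite and query it with SQL.
# "app.log" and the trivial schema are illustrative assumptions.
import sqlite3

conn = sqlite3.connect("logs.db")
conn.execute("CREATE TABLE IF NOT EXISTS log (line_no INTEGER, text TEXT)")

with open("app.log", encoding="utf-8", errors="replace") as f:
    conn.executemany(
        "INSERT INTO log VALUES (?, ?)",
        ((i, line.rstrip("\n")) for i, line in enumerate(f, start=1)),
    )
conn.commit()

# Example query: find every line mentioning a timeout.
for line_no, text in conn.execute(
        "SELECT line_no, text FROM log WHERE text LIKE '%timeout%'"):
    print(line_no, text)
```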

A gigabyte isn't that big, really. What kind of "correlation" are you trying to do with these log files? I've often found it's simpler to write a custom program (or script) to handle a log file in a particular way than it is to try to come up with a database schema to handle everything you'll ever want to do with it. Of course, if your log files are hard to parse for whatever reason, it may well be worth trying to fix that aspect.
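For example, a "correlation" task like counting errors per hour is often a few lines of script. The sketch below assumes, purely for illustration, lines that begin with an ISO timestamp followed by a level, e.g. `2019-12-05 17:39:50 ERROR something broke`; adjust the regex to your actual format:

```python
# Minimal sketch of a purpose-built log script: count ERROR lines per hour.
# The timestamp/level layout matched here is an assumed example format.
import collections
import re
import sys

pattern = re.compile(r"^(\d{4}-\d{2}-\d{2} \d{2}):\d{2}:\d{2} (\w+)")
errors_per_hour = collections.Counter()

with open(sys.argv[1], encoding="utf-8", errors="replace") as f:
    for line in f:
        m = pattern.match(line)
        if m and m.group(2) == "ERROR":
            errors_per_hour[m.group(1)] += 1

for hour, count in sorted(errors_per_hour.items()):
    print(f"{hour}:00  {count}")
```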

(I agree with kuoson, by the way - XML is almost certainly not the way to go.)

If you can check your logs on Windows, or under Wine, Microsoft's LogParser is a great tool for mining data out of logs. It practically lets you run SQL queries against any log, with no need to change any code or log formats, and it can even generate quick HTML or Excel reports.
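For instance, you could drive LogParser from a script. This sketch assumes Log Parser 2.2's `TEXTLINE` input format (one record per line, exposed as a `Text` field) and that `LogParser.exe` is on the PATH; treat both as assumptions to verify against your install:

```python
# Hedged sketch: shelling out to Microsoft Log Parser 2.2 from Python.
# Assumes LogParser.exe is on PATH and the TEXTLINE input format is available.
import subprocess

query = "SELECT Text FROM 'app.log' WHERE Text LIKE '%error%'"
subprocess.run(["LogParser.exe", query, "-i:TEXTLINE", "-o:CSV"], check=True)
```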

A few years ago, when XML was all the hype, I used XML logs with XSLT stylesheets to produce views. It was actually kind of nice, but it used way too much memory and would choke on large files, so you probably DON'T want to use XML.

The trouble with working on raw log files is that each one has to be scanned individually; you'll get much faster responses if you build an index of the log files and search/query that instead. Lucene would be my next port of call, then Solr.
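To make the idea concrete, here is a minimal sketch of the inverted-index concept that Lucene and Solr are built around: map each term to the line numbers containing it once, then answer queries without rescanning the file. This is not Lucene itself, and the file name `app.log` is an assumption:

```python
# Minimal sketch of an inverted index (term -> line numbers), the core idea
# behind Lucene/Solr. Built once, then queried without rescanning the file.
import collections
import re

index = collections.defaultdict(set)
lines = []

with open("app.log", encoding="utf-8", errors="replace") as f:
    for line_no, line in enumerate(f, start=1):
        lines.append(line.rstrip("\n"))
        for term in re.findall(r"\w+", line.lower()):
            index[term].add(line_no)

def search(*terms):
    """Return line numbers containing every given term (an AND query)."""
    sets = [index.get(t.lower(), set()) for t in terms]
    return sorted(set.intersection(*sets)) if sets else []

for n in search("connection", "refused"):
    print(n, lines[n - 1])
```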

Maybe you could load your log into Emacs (provided you have sufficient memory) and use its various features, such as incremental search and M-x occur.

Disclaimer: I haven't tried this on files > 100MB.
