Searching for a piece of information in log files that are continuously written to by a running process isn't a fun task! Fortunately, the grep
tool comes to the rescue. It's available on all major operating systems, including Windows (by installing Git Bash, for example).
Syntax
tail -f /path/to/file1 /path/to/file2 | grep --line-buffered "pattern"
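The --line-buffered flag matters here: when grep writes to a pipe rather than a terminal, it block-buffers its output by default, so matches from a live stream may not appear until the buffer fills. --line-buffered flushes after every line. A minimal sketch, using printf with made-up log lines in place of a live `tail -f` so the pipeline terminates:

```shell
# Hypothetical log lines; in real use these would stream from `tail -f`.
# --line-buffered makes grep emit each match immediately instead of
# holding it in an output buffer.
printf '%s\n' \
  'INFO startup complete' \
  'ERROR disk full' \
  'INFO heartbeat' \
  | grep --line-buffered 'ERROR'
```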
Examples
Search for lines containing "/min" (case-insensitive):
$ tail -f test3.log | grep --ignore-case --line-buffered "/min"
2017-05-23 10:40:20 [scrapy] INFO: Crawled 452 pages (at 60 pages/min), scraped 355 items (at 50 items/min)
2017-05-23 10:41:20 [scrapy] INFO:...
2017-05-23 10:42:20 [scrapy] INFO:...
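The long options above have short equivalents (-i for --ignore-case). You can also chain a second grep to narrow the results further; note that every grep feeding another pipe needs --line-buffered too, or its matches will sit in its output buffer. A sketch with fabricated sample lines standing in for the live stream:

```shell
# Fabricated sample lines in place of `tail -f test3.log`.
# -i is the short form of --ignore-case, so "/MIN" also matches.
printf '%s\n' \
  '2017-05-23 10:40:20 [scrapy] INFO: Crawled 452 pages (at 60 pages/MIN)' \
  '2017-05-23 10:40:21 [scrapy] DEBUG: retrying request' \
  | grep -i --line-buffered '/min'
```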
Search for lines that match the pattern 2017-05-23 (09|10|11)
(a regular expression) in all test*.log
files (e.g. test1.log, test2.log, test3.log...):
$ tail -f test*.log | grep --ignore-case --extended-regexp --line-buffered --regexp="2017-05-23 (09|10|11)"
2017-05-23 10:48:08 [scrapy] DEBUG:Scraped from <200 http://www.travelpod.com/travel-blog-entries/isy14/1/1401062400/tpod.html>
2017-05-23 10:48:08 [scrapy] DEBUG:...
2017-05-23 10:48:08 [travelpod] DEBUG:...
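grep also accepts multiple --regexp/-e options, and a line is printed if it matches any of them; this is handy when watching several files for more than one thing at once. A sketch under the same convention as above, with invented lines piped in instead of `tail -f test*.log` (-E is the short form of --extended-regexp):

```shell
# Invented sample lines in place of the live multi-file stream.
# A line is kept if it matches ANY of the -e patterns.
printf '%s\n' \
  '2017-05-23 10:48:08 [scrapy] DEBUG: parsing' \
  '2017-05-24 09:00:00 [scrapy] INFO: done' \
  '2017-05-23 11:15:30 [travelpod] ERROR: timeout' \
  | grep -E --line-buffered -e 'ERROR' -e '2017-05-23 (09|10)'
```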