Are you tired of opening files in a separate text editor while working on the command line, just to extract the one piece of information you need? Of writing long commands to slice out the required data, or regex incantations for formatting? Let's solve that problem with ag (angle-grinder). Yes, ag lets you slice, format and parse files right on the command line.
Angle-grinder can process millions of rows per second, analysing data from files at lightning speed. While a file is being processed, the results update live on the terminal, so you stay in control of your data and can stop the process whenever you want. Parsing a document and retrieving the needed information from JSON, log or other files becomes easy on the terminal. For the most part, you don't have to think about how the software works; you get to focus on your data and use the terminal the way you want. The tool was built to make the terminal more powerful, so you can use your time efficiently instead of spending it in non-terminal tools.
Features of Angle Grinder:
- Parse, Aggregate, sum & sort data in files
- Live updates on terminal
- Better data analytics - logs checking
- Perhaps the biggest advantage of ag over traditional text-processing tools for digging through logfiles is that it makes it easy to search by date (not covered in this tutorial)
So without wasting more time, let's get our hands dirty.
The agrind/ag/angle-grinder tool is supported on any version of Linux and also on macOS. One single command on either system and ag is ready for you to use.
$ curl -L https://github.com/rcoh/angle-grinder/releases/download/v0.7.2/angle_grinder-v0.7.2-x86_64-unknown-linux-musl.tar.gz \
    | tar Ozxf - \
    | sudo tee /usr/local/bin/agrind > /dev/null \
    && sudo chmod +x /usr/local/bin/agrind
Installation takes around 2-3 minutes, depending on your internet speed.
Once you install the ag tool, all you have to do is check that the "agrind" command works on your machine.
The agrind tool works with a series of operators chained with "|", passing the output of each stage on to filter, parse or aggregate the data. Typically the initial operation parses the data file or JSON, followed by aggregation, grouping or finding the total number of occurrences. The list is not limited to these; you can explore on your own and learn the other operators as well.
Using the ag tool
We are using the sample files available in the package for demonstration. For all the commands that follow, don't forget this syntax:
agrind ' | operator1 | operator2 | operator3 | ...'
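If you don't have the package's sample files handy, you can create minimal stand-ins to follow along. The file names match the ones used in this tutorial, but the contents below are invented for illustration:

```shell
# A small plain-text log for the filter example
cat > filter_test.log <<'EOF'
2021-04-01 12:00:01 INFO  starting service
2021-04-01 12:00:02 ERROR connection refused
2021-04-01 12:00:03 WARN  retrying in 5s
2021-04-01 12:00:04 ERROR timeout after 30s
EOF

# A JSON-lines log for the json/where examples
cat > test_json.log <<'EOF'
{"status_code": 200, "url": "/index", "response_ms": 12}
{"status_code": 404, "url": "/missing", "response_ms": 3}
{"status_code": 500, "url": "/api", "response_ms": 250}
EOF

# Sanity check: two ERROR lines in the plain-text log
grep -c ERROR filter_test.log   # -> 2
```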
So let's go through the different examples one by one.
The filter command lists the lines that contain the ERROR tag. Let's say you are working with a network file and all of a sudden you need to find the errors in its log; this simple command will work like a charm for you. All you need to do is pass the WORD inside '' (single quotes).
$ sudo cat filter_test.log | agrind '"ERROR"'
The json operator parses each line as JSON and extracts its fields:
$ cat test_json.log | agrind '* | json'
With the where operator, if the condition is satisfied the row is kept as-is, but rows that do not satisfy the condition are dropped. In the command below, the left side is the column name while the right side is the condition. Examples:
$ cat test_parse.log | agrind '* | json | where status_code >= 400'
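As a rough illustration of what `json | where` is doing, here is a crude awk stand-in. It assumes the simple one-object-per-line layout shown below; the sample file contents are invented:

```shell
cat > test_json.log <<'EOF'
{"status_code": 200, "url": "/index", "response_ms": 12}
{"status_code": 404, "url": "/missing", "response_ms": 3}
{"status_code": 500, "url": "/api", "response_ms": 250}
EOF

# Keep rows where status_code >= 400, drop the rest
# (prints the 404 and 500 rows)
awk -F'"status_code": ' '{split($2, a, ","); if (a[1]+0 >= 400) print}' test_json.log
```

agrind does this with a real JSON parser, so it keeps working when the layout changes; the awk version breaks as soon as the key order or whitespace does.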
And another example:
$ python -u gen_logs.py | agrind '* | json | p50(response_ms) by status_code, url'
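Here p50 is the 50th-percentile (median) aggregator. For a single ungrouped column, a rough classic-tools equivalent looks like this (the sample numbers are invented):

```shell
# Three invented response times, one per line
printf '12\n3\n250\n' > rt.txt

# Sort numerically, then pick the middle element
sort -n rt.txt | awk '{a[NR] = $1} END {print a[int((NR + 1) / 2)]}'   # -> 12
```

Doing this per (status_code, url) group is where the awk version gets painful and agrind's `by` clause earns its keep.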
Other non-aggregate operators supported by ag are:
- json, parses each line as JSON and extracts its fields.
- parse, parses text that matches the pattern into variables. Lines that don't match the pattern are dropped.
- fields, keeps or drops specific columns.
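What fields does, keeping a chosen subset of columns, looks like this with awk on a plain whitespace-separated log (the sample data is invented):

```shell
# Invented access log: user, status, path
printf 'alice 200 /index\nbob 404 /missing\n' > access.log

# Keep just the first two columns, drop the rest
awk '{print $1, $2}' access.log
# -> alice 200
#    bob 404
```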
Other aggregate operators that can be used instead:
- sum, totals a column; if a row's value is nonnumeric, it is not considered.
- average, averages the values in a column.
- count distinct, counts the unique values in a column.
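To see what sum and average compute, here is the same arithmetic with awk on an invented column of response times. Like agrind's sum, the nonnumeric row is skipped:

```shell
# One value per line, including one nonnumeric row
printf '12\n3\n250\nnot-a-number\n' > times.txt

# sum: $1+0 == $1 is only true for numeric fields, so the bad row is ignored
awk '$1+0 == $1 {s += $1} END {print s}' times.txt   # -> 265

# average over the same numeric rows
awk '$1+0 == $1 {s += $1; n++} END {print s / n}' times.txt
```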
Parsing with the parse operator
$ cat test_parse.log | agrind '* | parse "in *ms" as response_time'
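To see what the pattern "in *ms" captures, here is a sed stand-in on an invented test_parse.log; the * wildcard in agrind's pattern corresponds to the backreference in the sed expression:

```shell
# Invented log lines with a response time embedded in the text
printf 'GET /index in 12ms\nGET /api in 250ms\n' > test_parse.log

# Extract whatever sits between "in " and "ms", like parse "in *ms"
sed -n 's/.*in \(.*\)ms.*/\1/p' test_parse.log
# -> 12
#    250
```

The difference is that agrind binds the captured value to a named column (response_time) that later operators can aggregate on.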
We hope you loved this tutorial; let us know your thoughts in the comments.
Read Also :
- Squid Analyzer - A Parser for Squid proxy access.log File
- Setup Centralized Syslog Server in Red Hat Linux
- 7 Linux Tail Command Examples and How it Helps Monitor Logs