Setting up the log format
So you’re tired of reading Apache logs with column -t,
or you need to process them with external tools, maybe push them into Logstash? Say no more:
# Inside your virtual host definition

# Declaring your custom log format as a JSON structure
LogFormat '{"time":"%{%FT%T%z}t","response":{"status":"%>s","duration":"%D","length":"%B"},"request":{"method":"%m","host":"%V","port":"%p","url":"%U","query":"%q"},"client":{"ip":"%a","agent":"%{User-agent}i","referer":"%{Referer}i"}}' json_log

# Declaring an environment variable based on the type of file requested
SetEnvIf Request_URI "(\.gif|\.png|\.jpg|\.ico|\.css|\.js|\.eot|\.ttf|\.woff2?)$" request_static=1

# Declaring separate log files (one for static content, one for dynamic pages) with the new log format
CustomLog /path/to/log/access_static.log json_log env=request_static
CustomLog /path/to/log/access_dynamic.log json_log env=!request_static
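With that configuration, each request lands in the log as one JSON object per line. A hypothetical entry (all field values invented for illustration) would look roughly like this:

```json
{"time":"2024-05-01T12:00:00+0000","response":{"status":"200","duration":"1534","length":"5120"},"request":{"method":"GET","host":"example.com","port":"443","url":"/index.html","query":""},"client":{"ip":"203.0.113.7","agent":"Mozilla/5.0","referer":"-"}}
```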
Tool to read/analyse the logs (manually)
Meet jq, a small tool that reads each line of input, treats it as a JSON object, and outputs it pretty-printed (among many other tricks).
The package itself doesn’t have any dependencies and is readily available in Linux repos.
Minimal usage:
echo '{"a": 1, "b": 2, "c": 3, "d": [{"e": 4}]}' | jq .
{
  "a": 1,
  "b": 2,
  "c": 3,
  "d": [
    {
      "e": 4
    }
  ]
}
Object restructuring:
echo '{"a": 1, "b": 2, "c": 3, "d": [{"e": 4}]}' | jq '{"c": .a, "e": .d[0].e}'
{
  "c": 1,
  "e": 4
}
Parsing string content as JSON:
echo '{"a":1,"b":"[{\"c\":2,\"d\":\"3\"}, {\"c\":3,\"e\":\"5\"}]"}' | jq '.["b"]|fromjson'
[
  {
    "c": 2,
    "d": "3"
  },
  {
    "c": 3,
    "e": "5"
  }
]
Filtering:
echo '{"a":1,"b":"[{\"c\":2,\"d\":\"3\"}, {\"c\":3,\"e\":\"5\"}]"}' | jq '.["b"]|fromjson|.[]|select(.c == 2)'
{
  "c": 2,
  "d": "3"
}
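Putting it all together with the Apache logs from above: once every line is JSON, select() and group_by() turn the access log into something queryable. A sketch, assuming the log path from the earlier configuration:

```shell
# Pretty-print every request that returned a 404
jq 'select(.response.status == "404")' /path/to/log/access_dynamic.log

# Count requests per status code (-s slurps all lines into one array first)
jq -s 'group_by(.response.status) | map({status: .[0].response.status, count: length})' \
  /path/to/log/access_dynamic.log
```

Note that group_by() sorts its input by the grouping key, so the summary comes out ordered by status code.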