I want to parse data out of a log file that consists of JSON strings, and I wonder if there's a way to use a bash function to perform custom parsing instead of overloading the jq command.
Command:
tail errors.log --follow | jq --raw-output '. | [.server_name, .server_port, .request_file] | @tsv'
Outputs:
8.8.8.8 80 /var/www/domain.com/www/public
I want to process the 3rd column and cut the string so that it excludes the /var/www/domain.com part, where /var/www/domain.com is the document root and /var/www/domain.com/subdomain/public is the public HTML section of the site. In other words, I would like the output to be /subdomain/public (or, from the example above, /www/public).
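For reference, this can also be done inside jq itself with ltrimstr, which strips a literal prefix when it is present and leaves the string unchanged otherwise. A minimal sketch, assuming the document root really is the fixed string /var/www/domain.com:

tail --follow errors.log | jq --raw-output '[.server_name, .server_port, (.request_file | ltrimstr("/var/www/domain.com"))] | @tsv'

For the example line above this prints 8.8.8.8, 80 and /www/public, tab-separated.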
I wonder if I can somehow inject a bash function to process the .request_file column? Or how would I do that using jq?
I'm having trouble piping the output of this command (or any part of it) into anything that would let me do string manipulation.
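For what it's worth, a rough sketch of the bash-function approach could look like the following; strip_docroot is a made-up name, the document root is assumed to be the literal /var/www/domain.com, and jq's --unbuffered flag is added so each line reaches the function as soon as it is printed while following the log:

strip_docroot() {
  # read the TSV that jq emits and trim the document root from the 3rd field
  local docroot=/var/www/domain.com
  while IFS=$'\t' read -r name port file; do
    printf '%s\t%s\t%s\n' "$name" "$port" "${file#"$docroot"}"
  done
}

tail --follow errors.log \
  | jq --raw-output --unbuffered '[.server_name, .server_port, .request_file] | @tsv' \
  | strip_docroot

Because ${file#"$docroot"} is plain parameter expansion with the pattern quoted, the document root is matched literally rather than as a glob.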
Comments:
Do you want to remove /var/www/domain.com exactly, or do you want to trim the first three directories no matter what they are? Should the resulting 3rd column be www/public or /www/public? I'm making some assumptions in my answer, but a better question would provide an explicit example of the desired output, not just the current output. – Charles Duffy
/www/public would be the desired output. – HelpNeeder
Options (like --follow) should always come before arguments (like errors.log); GNU tools allow it to be the other way around, but the POSIX utility syntax guidelines are explicit that only the options-first ordering is guaranteed to be supported. – Charles Duffy
To be explicit, I would like to keep /subdomain/public, which is /www/domain.com if the source was www.example.com. – HelpNeeder
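If the document root is not always /var/www/domain.com, here is a hedged sketch of the "trim the first three directories no matter what they are" variant raised in the comments, using jq's regex sub (available in jq 1.5 and later):

tail --follow errors.log | jq --raw-output '[.server_name, .server_port, (.request_file | sub("^(/[^/]+){3}"; ""))] | @tsv'

The regex ^(/[^/]+){3} matches the first three path components (/var/www/domain.com in the example), so the 3rd column comes out as /www/public regardless of the domain name.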