Parsing JSON from the command line using `fx`

Get fx from https://github.com/antonmedv/fx.

After applying the lodash tweak mentioned in the README, we can do some neat little things from the command line.
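
For reference, the tweak amounts to exposing lodash/fp globally through a ~/.fxrc file (a sketch based on the fx docs; it assumes lodash is installed globally, e.g. via npm install -g lodash, and that NODE_PATH points at the global node_modules):

// ~/.fxrc
Object.assign(global, require('lodash/fp'))

This makes data-last helpers such as map, filter and find available directly in fx expressions, which is what the examples below rely on.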

Analyse Elasticsearch responses:

curl -XPOST -H 'Content-Type: application/json' "https://[es-host]:9243/data/_search" -d'
{
"query":{
[…]
},
"size”:10
}' | fx 'x => x.hits.hits' | fx 'map("_source.document.id")'

Lists the IDs of all documents of a certain type.
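
The same transformation can be tried locally against a minimal response of the same shape (assuming the lodash/fp map from the .fxrc tweak above):

$ echo '{"hits":{"hits":[{"_source":{"document":{"id":"a1"}}},{"_source":{"document":{"id":"b2"}}}]}}' | fx 'x => x.hits.hits' | fx 'map("_source.document.id")'
[
  "a1",
  "b2"
]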

Multiple fx calls like

| fx 'x => x.hits.hits' | fx 'map("_source.document.id")'

can be simplified to

| fx 'x => x.hits.hits' 'map("_source.document.id")'

Note: `===` does not seem to work properly; use `==` instead, e.g.

| fx 'filter(x => x.res.error.code == 99)'

Simple pretty print:

$ echo '{"a":1,"b":{}}' | fx .
{
"a": 1,
"b": {}
}

Return object keys

JavaScript core objects can be used too. Example from the docs:

cat package.json | fx .dependencies Object.keys
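
For example, with a package.json declaring two (hypothetical) dependencies, the keys of the .dependencies object are printed:

$ echo '{"dependencies":{"express":"^4.18.0","lodash":"^4.17.21"}}' | fx .dependencies Object.keys
[
  "express",
  "lodash"
]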

Read an object array from a file and output an item’s field as JSON

node -e 'console.log(JSON.stringify(require("./configs.js")))' | fx 'find({id: "foobar"})' '.metadataOverrides'
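
This assumes configs.js exports an array of objects that carry an id and a metadataOverrides field; a hypothetical file the command would work against:

// configs.js (hypothetical shape)
module.exports = [
  { id: "foobar", metadataOverrides: { region: "eu-west-1" } },
  { id: "other", metadataOverrides: {} },
];

The node -e wrapper is needed because the configs are a JavaScript module rather than JSON; serializing them first hands fx plain JSON to search with find.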

Interactive mode

Offers intuitive navigation of the data with the arrow keys (up, down, collapse/expand); more key bindings are listed here: https://github.com/antonmedv/fx/blob/master/DOCS.md#interactive-mode

Even allows regex searches :)
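
To enter interactive mode, run fx on a file (or pipe JSON into it) without passing any expressions:

$ fx package.json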

Copy JSON logs into a single file containing an array

Let’s assume the logs are placed in sub-directories of the current path and have the extension .log.

find . -type f -name "*.log" -exec cat {} + > all_logs

all_logs is not valid JSON, as it is a list of arrays separated by whitespace. We can use fx to read them into an array using --slurp and flatten the resulting two-level array into a single one using this.flat():

cat all_logs | fx --slurp 'this.flat()' > all_logs.json
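
The effect of slurping and flattening can be checked with two small inline arrays (assumes an fx version that supports --slurp):

$ printf '[{"a":1}]\n[{"b":2}]\n' | fx --slurp 'this.flat()'
[
  {
    "a": 1
  },
  {
    "b": 2
  }
]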