Powerful Command-Line Tools for DevOps: Nushell and Jc
Episode #39: Revolutionize Your DevOps Toolkit with Nushell and Jc for Maximum Efficiency.
In my 20 years of experience with Linux command-line tools, I have always been frustrated by having to learn so many different text-processing tools like awk, grep, and sed every time I wanted to parse the output of command-line tools like ls, ps, or df.
If only those commands used a structured format like JSON for their output, you would only have to learn a single processing tool like Jq.
I recently discovered Nushell, a modern alternative to the Bash terminal that rewrites some command line tools to export structured data instead of just plain text.
Nushell makes it extremely easy to work with structured data with its human-friendly programming language and modern approach to the terminal.
As you might expect, though, since Nushell is a relatively new project, it only rewrites some of the most common command line tools.
For those tools that are not covered, you either have to hope they expose a --json flag, or you can integrate with another command-line tool called Jc (not to be confused with Jq, the famous JSON processor) that converts the output of hundreds of command-line tools to JSON.
So after years of struggling, I finally got rid of awk, grep, sed, and jq and significantly improved my life.
I can use Jc to convert legacy Linux command output into structured data and then Nushell to process, filter, update or display that structured data with a human-friendly language.
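For example, a pipeline along these lines (a minimal sketch; dig is just one illustration, and the field names follow Jc's dig parser schema):

```
# convert dig's plain-text output to JSON with Jc,
# then parse and filter it with Nushell's structured pipeline
jc dig example.com | from json | get 0.answer | where type == "A"
```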
In this tutorial, I want to share my experience and the lessons I learned as a Platform Engineer.
Want to connect?
👉 Follow me on LinkedIn and Twitter.
If you need 1-1 mentoring sessions, please check my Mentorcruise profile.
Problem statement
As a Software Engineer, you interact daily with myriads of Linux command-line tools (e.g. ps, grep, dig, ls), each with its own output format and list of flags.
Mastering those tools can take years.
Most people, myself included, only master the basics and then spend hours consulting the documentation or StackOverflow whenever they encounter a new use case.
ChatGPT can really help here, especially when you want to create complex scripts. The only caveat is that you are left trusting a machine to do the work for you.
Furthermore, the more tools your script relies on, the harder it becomes to keep your code portable across other machines and the variety of operating systems available.
Sometimes, when I get frustrated by the limitations of Bash (and of my knowledge of the tool), I wish I had started directly from a Python script or a compiled language like Go when things get complicated.
However, with those languages come other complexities.
There should be an easier way to use those command line tools in a powerful and human-friendly way.
A little bit of personal history
I got my first Linux distribution almost 20 years ago, in my first year of university.
A terminal was a scary place to be coming from a Windows desktop.
So many tools to learn!!
You had to do everything just by typing commands on a black background.
Somehow, I managed to survive university and my first few jobs without being a ninja in text-processing tools like grep, awk, sed and so on.
I knew enough to get by but was never fast by any metric.
For the young people reading this article, this was long before ChatGPT was even a remote idea.
You had to invest a lot of upfront time to learn those text-processing tools, or spend countless hours on StackOverflow asking for help whenever needed.
So, almost 10 years ago, I decided to invest some time into learning those tools and picked up an excellent book called Data Science at the Command Line, 2e. You can now read this book online for free.
While reading this book, I got so frustrated that each command had a completely different output, and making them work together was so tedious. So, while I finished the book, I never retained big chunks of its content.
If only those commands had a standard interoperability format that would make it easier to filter, search, and pass along data from one command to another.
Why has nobody thought about it?
Fast forward to today.
ChatGPT can do a great job at crafting a complex pipeline for you. The only things you need to provide are the expected input and output.
Except that even ChatGPT makes mistakes, and you don't want to trust it completely.
If only there was a better way.
JC - Convert legacy command line output into JSON
Since JSON is the de-facto standard for machine-readable data, it would be nice if every Linux command had a --json flag to output its results in JSON.
Instead of rewriting tools written in the '70s, Jc parses the output of old and well-established Linux commands into a JSON format that can be easily processed by tools like Jq.
Examples
In the example below, we can see how to get the results of pinging an IP address in JSON format by using the jc -p ping command.
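Here is a sketch of what that looks like (output abridged; the field names follow Jc's ping parser schema, and the numbers are purely illustrative):

```
$ jc -p ping -c 3 8.8.8.8
{
  "destination_ip": "8.8.8.8",
  "packets_transmitted": 3,
  "packets_received": 3,
  "packet_loss_percent": 0.0,
  "responses": [
    {"type": "reply", "icmp_seq": 1, "ttl": 118, "time_ms": 12.1},
    ...
  ]
}
```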
Jc can also be used as a Python library, as seen in the example below.
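A minimal sketch of that usage (assuming jc was installed with pip, and reusing the ping example from above):

```
import subprocess

import jc

# capture the plain-text output of a legacy command
cmd_output = subprocess.run(
    ["ping", "-c", "3", "8.8.8.8"], capture_output=True, text=True
).stdout

# parse it into a regular Python data structure with jc's high-level API
data = jc.parse("ping", cmd_output)

# the keys follow jc's ping parser schema
print(data["packets_transmitted"], data["packet_loss_percent"])
```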
Documentation
More about JC at:
The author of Jc has written extensively about the philosophy behind Jc at:
Nushell: A modern shell with a friendly programming language
Now that we have most of the Linux commands output in JSON format, what about processing this data?
Suddenly, most data-processing tools like awk, sed, and grep are completely irrelevant, and you can just use jq.
But how easy is it to use jq for everything?
It is easy to start with, but it becomes pretty complex very quickly.
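For instance, even a modest question like "what is the average round-trip time of those pings?" already needs a jq expression that takes a minute to decode (a sketch, reusing the Jc ping output from above):

```
jc ping -c 3 8.8.8.8 | jq '[.responses[].time_ms] | add / length'
```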
Discover Nushell, a modern alternative to Bash conceived for the 21st century.
Rather than thinking of files and data as raw streams of text, Nu looks at each input as something with structure.
In Unix, it's common to pipe between commands to split up a sophisticated command over multiple steps. Nu takes this a step further and builds heavily on the idea of pipelines.
Examples
In the example below, you can see how to filter the ls command results by size greater than 10 megabytes and then sort the results by the modified timestamp.
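A sketch of that pipeline (10mb is one of Nushell's built-in file-size units):

```
ls | where size > 10mb | sort-by modified
```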
In this other example, we can see how the output of the ps command, which lists all the running processes on your machine, is filtered and then sorted in reverse order by CPU usage.
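A sketch of that pipeline (the 5% CPU threshold is just an illustration):

```
ps | where cpu > 5 | sort-by cpu --reverse
```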
As you can see from the examples above, Nushell is quite a powerful language.
Pros and cons
Among the benefits, we can list:
Nushell is a fully typed scripting language, with support similar to compiled programming languages: errors (like type inconsistencies) are detected before the script executes.
Built-in commands: an extensive list of built-in functions to work with SQLite, Excel, HTTP endpoints and much more (see the sketch after this list).
Cross-platform support: scripts written in Nushell can run without changes in all major operating systems: Windows, Linux, and Mac.
Autocomplete: for all commands (even your custom-made ones) and programming language functions.
Verbose error management.
Fast: since it's written in Rust.
Good support from the community: it integrates with lots of other tools.
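To give a flavour of those built-in commands, here is a small sketch (the GitHub URL and the mydata.db file are just illustrations):

```
# call an HTTP endpoint and navigate the returned JSON as structured data
http get https://api.github.com/repos/nushell/nushell | get stargazers_count

# query a SQLite file directly from the shell
open mydata.db | query db "select name from users limit 5"
```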
Negative bits:
Nushell is not POSIX-compliant. Bash scripts won't run unmodified in Nushell. Replaced commands (like ls or ps from above) don't have the same parameters: Nushell prefers to have commands with fewer parameters and uses pipes to combine multiple commands instead.
Documentation
You can find more information about Nushell at:
Conclusion
I plan to keep using Zsh as my favourite POSIX-compliant shell to collaborate with other developers at work, and to progressively move to Nushell for any automation or personal use.