

It’s even a tape archiving tool. Just pretty much nobody uses it in the original way any more.
Very much one of those “if it ain’t broke, don’t replace it” tools.


Yeah, there should be a clear separation between scripts, which should have a shebang, and interactive use.
If a script starts acting oddly after someone does a `chsh`, then that script is broken. Hopefully people don’t actually distribute broken script files that have some implicit dependency on an unspecified interpreter in this day and age.
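Concretely, just pinning the interpreter on the first line is enough; a minimal sketch:

```
#!/usr/bin/env bash
# The shebang pins the interpreter, so the script behaves the same
# regardless of what the invoking user's login shell happens to be.
echo "running under bash ${BASH_VERSION}"
```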
That’s interesting, I hadn’t thought about the JSON angle! Do you mean that you can actually use `jq` on regular command outputs like `ls -l`?
No, you need to be using a tool which has JSON output as an option. These are becoming more common, but I think still rare among the GNU coreutils. `ls` output especially is unparseable, as in, there are tons of resources telling people not to do it because it’s pretty much guaranteed to break.
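For example, some of the util-linux and iproute2 tools can emit JSON themselves (these two are just the ones I happen to reach for, not coreutils):

```
# Tools with a native JSON flag pipe straight into jq
lsblk --json | jq -r '.blockdevices[].name'
ip -json addr | jq -r '.[].ifname'
```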
I’ve been using fish (with starship for prompt) for like a year I think, after having had a self-built zsh setup for … I don’t know how long.
I’m capable of using awk but in a very simple way; I generally prefer being able to use jq. IMO both awk and perl are sort of remnants of the age before JSON became the standard text-based structured data format. We used to have to write a lot of dinky little regex-based parsers in Perl to extract data. These days we likely get JSON and can operate on actual data structures.
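Roughly, the difference looks like this (made-up file and field names, just to show the shape of it):

```
# Before: scrape a value out of semi-structured text with a regex
perl -ne 'print "$1\n" if /^version=(\S+)/' settings.conf

# Now: address the value by path in an actual data structure
jq -r '.version' settings.json
```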
I tried nu very briefly but I’m just too used to POSIX-ish shells to bother switching to another model. For scripting I’ll go with `set -euo pipefail`, but very quickly switch to Python if it looks like it’s going to have any sort of serious logic.
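For reference, the preamble I mean (bash-flavoured sketch):

```
#!/usr/bin/env bash
set -euo pipefail
# -e           exit as soon as any command fails
# -u           treat references to unset variables as errors
# -o pipefail  a pipeline fails if any stage fails, not just the last one
```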
My impression is that there are likely more of us who’d like a less wibbly-wobbly, better shell language for scripting purposes, but that efforts to design such a language very quickly go in the direction of nu and oil and whatnot.
One more puzzle piece here is that
`du` won’t report on files that have been marked for deletion but are still held on to by some process. There’s an `lsof` incantation to list those, but I can’t recall it off the top of my head.

It used to be part of sysadmin work to detect the processes that held on to large files if `df` reports that you’re running out of space, and restart them to make them let go of the file. But I haven’t done that in ages. And if you restarted the host OS that should have taken care of that.

I assume you also know how to prune container resources.
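Something along these lines might be the incantation, though double-check it before relying on it (`/var` is purely an example path):

```
# Open files with a link count below 1, i.e. deleted but still held open
lsof +L1

# The classic symptom: df says the filesystem is full, du can't find the space
df -h /var
du -sh /var

# On container hosts: stopped containers, dangling images and build cache
# are the other usual suspects
docker system prune
```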