100x this: every time I write a shell script, I regret it later. But you'll never take `make` away from me (I know folks have tried, but I haven't found a similarly elegant system; not for lack of trying, there are lots of terrible renditions of the old tool).
You mean you don't like Ant? :-)
Yeah, there is. Small nit: `docker container ps -a | cut -f1 -d\ | grep -v CONTAINER` does the same thing as `docker ps -a -q`.
Nnnnnnnnnnnoooooooooooooooooo! Drag ksh with a coroutine that runs sqlplus out of the grasp of my cold dead hands!
Well, maybe. Can I define ONE persistent database connection for an entire session, so I can combine anything I need at the OS level with anything I need inside the database, all over that single connection? I have a set of standard functions that I include in each script that make creating new administrative functionality easy. It's a very portable library.
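For anyone who hasn't seen this trick, here is a rough sketch of the ksh coprocess pattern being described: one sqlplus session started once and reused for the whole script. The credentials, the `__DONE__` sentinel, and the example query are placeholders, not the commenter's actual library.

```
#!/bin/ksh
# Sketch only: one persistent sqlplus session held open for the entire script.
sqlplus -s scott/tiger@mydb |&        # start sqlplus once, as a ksh coprocess

run_sql() {                           # send a statement to the open session
    print -p "$1"
    print -p "prompt __DONE__"        # sentinel so we know the output has ended
    while read -p line; do
        [[ "$line" == __DONE__ ]] && break
        print -r -- "$line"
    done
}

df -k /u01                                    # OS-level work...
run_sql "select count(*) from v\$session;"    # ...and DB work, same connection

print -p "exit"                       # tear the session down at the end
```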
:-)
Interesting point of view, thanks for sharing. :)
One thing that concerns me about Go is that we would almost certainly use libraries from “random” GitHub repositories. What happens if they go offline, stop being maintained, or get hacked and someone slips a backdoor into their code? This seems less likely with C programs, since most of them rely on standard or POSIX libraries. I don't know... it just “feels” like I can trust them more.
The same happens with shell scripts: awk, cron, sed, etc. look safe. It's like assuming that someone has looked at the code and that it's safe and well maintained.
In Go we import a library directly from someone's repo. Doesn't that feel weird? What's your opinion here?
P.S.: I know about the xz backdoor and similar cases, and I know no software is 100% protected from supply-chain attacks, but I'd like to hear more opinions from you and your readers here. :)
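For what it's worth, the Go toolchain does give you some leverage against the "upstream repo disappears or gets tampered with" scenario: go.sum records a cryptographic hash of every dependency, and vendoring keeps a full copy of their source in your own repository. A rough sketch of that workflow, assuming an ordinary Go modules project:

```
go mod tidy          # record exact versions in go.mod and content hashes in go.sum
go mod vendor        # copy every dependency's source into ./vendor (and commit it)
go build -mod=vendor # build from the vendored copy; no network, no upstream repo needed
```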
Thanks!
Here kid, try a little bit of this shell scripting. It'll make you feel productive. It'll make you feel powerful. Later, you can have more...
I used to be almost the shell script fairy, but now I mostly operate by calling a tool I wrote that just calls the OpenAI API to get the right incantation. It usually gets it right within a few iterations. I feel like an idiot, but I can't be bothered to memorize the arguments to ffmpeg, for example; life is too short.
Then, when I wanted to transcribe longer videos and clean up the transcription, parallelizing it via xargs and background jobs was much easier than writing it properly. Yes, it was a fragile pile of duct tape, but in the worst case I would just need to run the script again.
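Roughly the shape of the duct tape described above, for the curious; `transcribe-chunk` is a stand-in for whatever actually calls the transcription API, and the chunk layout is assumed:

```
# Fan the chunks out to four parallel workers; rerun the whole thing if it breaks.
ls chunks/*.mp3 | xargs -n 1 -P 4 transcribe-chunk
cat chunks/*.txt > transcript.txt   # assumes each chunk's transcript lands next to it
```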