orchestral timing...
#orchestral #bash #mallet #orchestra #conductor #smash #percussion #animatedgif
Took me some hours (days) to realise it, d'oh.
For those who use #dash and are faced with #bashisms:
bash: cat file | while read -d 'x' line ; do
dash: cat file | tr 'x' '\n' | while read line ; do
On the positive side, I can now drill down into horrible machine generated nested html like g-o-o-gle's translations of webpages - using dash. It all started with trying to parse RSS....
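To make the translation concrete, here is a minimal sketch of both forms, using invented comma-delimited data (the filename and delimiter are placeholders). Note that both versions want the data to end with the delimiter, or the last field gets dropped:

```shell
# Hypothetical delimited data: fields separated (and terminated) by commas.
printf 'one,two,three,' > /tmp/fields.txt

# bash-only version (read -d is a bashism):
# while read -d ',' field; do echo "field: $field"; done < /tmp/fields.txt

# portable dash-friendly version: turn the delimiter into newlines first.
tr ',' '\n' < /tmp/fields.txt | while read field; do
  echo "field: $field"
done
```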
#gnu #linux #bash
There is a way to get rid of the last, and only the last, newline in a file, if it exists. xxd, in case you don't use it often, reads stdin and shows you what was read, in hex.
printf "one\ntwo\nthree\n" > test.txt
cat test.txt | xxd
00000000: 6f6e 650a 7477 6f0a 7468 7265 650a one.two.three.
printf "%s" "$(cat test.txt)" | xxd
00000000: 6f6e 650a 7477 6f0a 7468 7265 65 one.two.three
This won't work with DOS-style newlines (\r\n). The last \r won't be eliminated. That would need a tr -d "\r", but that would eliminate all the \r's.
If you knew for certain that there was a newline at the end, you could measure the length of the file with wc -c and use head -c to eliminate it. But this trick is so simple, I think I'd use it anyway with BASH and DASH.
I don't know if this trick works with any shells other than BASH and DASH. I think printf is always a built-in command, so it would depend on the shell.
BTW, DASH (the Debian Almquist Shell) is not a variant of BASH but a lightweight POSIX shell that Debian and Debian derivatives use as /bin/sh. With Mint, commands you type at the command line use BASH by default, but scripts run with sh use DASH by default. This matters because echo works slightly differently in DASH and BASH. That's why some people use printf "%s\n" "whatever" instead of echo "whatever" in scripts.
#newline #newlines #bash #dash #shell #shell-script #trick #hack #programming #coding
In the vast realm of Linux, an open-source operating system, the grep command holds a significant place. An acronym for ‘Global Regular Expression Print’, the grep command is a widely-used Linux tool that gives users the power to search through text or output based on specific patterns. This command line utility is indispensable for efficient file and data management, serving as a cornerstone in Linux operations.
The grep command’s usefulness is in its versatility and power. Whether you’re debugging a problem by searching through logs, searching through code for a specific function, or even looking for files with a specific keyword, the grep command is an indispensable tool in a Linux user’s toolkit. Its ability to filter output makes it an ideal command for piping with other commands to refine output further.
This very week I used the grep command to search through over 200 desktop files, first to check that each one had a line starting with 'Exec=', and then to extract that line and print it into an output file I could examine (instead of opening each and every one of the 200+ files). As in my case, grep is often used in conjunction with the pipe symbol '|' to further process what it has extracted, e.g. to pull out a sub-string or perform some other function.
In fact, the bash shell itself can really do a lot, from prompting for inputs, or reading variables passed to it at runtime, do/while loops, if/then statements, and a lot more. If you want to automate something, you can frequently just use a bash script file in place of a more complex language. But you won’t know unless you just read up a bit on a few of the different commands. If you get stuck, ask Google Bard or similar AI for some help, and it will even spit out some sample code you can use.
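The desktop-file check described above can be sketched in two grep calls; here two throwaway files stand in for the real 200+ .desktop files, and all paths and contents are invented for the demo:

```shell
# Stand-in .desktop files for the demo.
mkdir -p /tmp/desktop_demo
printf '[Desktop Entry]\nExec=firefox %%u\n' > /tmp/desktop_demo/a.desktop
printf '[Desktop Entry]\nName=broken\n'      > /tmp/desktop_demo/b.desktop

grep -H '^Exec=' /tmp/desktop_demo/*.desktop > /tmp/exec_lines.txt  # one "file:Exec=..." line per hit
grep -L '^Exec=' /tmp/desktop_demo/*.desktop                        # lists files missing an Exec= line
```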
See https://www.linuxcapable.com/grep-command-in-linux-with-examples/
#Blog, #bash, #grep, #linux, #scripting, #technology
I just found ocrmypdf, a pretty cool command-line tool for #Linux. When you once again receive a #PDF made from an image and wonder why you can't search in it, this tool helps you add a text layer to the PDF. Very nice.
So my server ran full. It has years of snapshots I actually do not want to delete, but I spotted (with the amazing ncdu) some 500 GB of backups I do not need. So I want to delete those.
Apparently, to do this you need to set each snapshot to writable and delete the file/folder you want (see https://www.suse.com/support/kb/doc/?id=000019594).
They also have a script there that does this in a manageable way. However, it only works for files, not for folders, and I need to delete folders. Ideally I want to be able to provide the path to it so I do not delete some identically named folder elsewhere.
I tried to adapt the script, but I do not quite understand it.
original script
```
file=$1
while read a
do snapshot=$(echo ${a%%/$file})
   btrfs property set $snapshot ro false
   rm -f $a
   btrfs property set $snapshot ro true
done < <(find /.snapshots/ -name $file)
```
**question 1:**
what does this line exactly do?
`do snapshot=$(echo ${a%%/$file})`
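For what it's worth, a tiny illustration of that expansion with made-up paths: `${a%%/$file}` strips the suffix `/$file` from `$a` (the `%` family removes suffixes; with no wildcards in the pattern, `%` and `%%` behave the same, and the `echo` wrapper is redundant):

```shell
# Invented paths, just to show what the expansion produces.
file=report.txt
a=/.snapshots/42/snapshot/home/user/report.txt
snapshot=${a%%/$file}
echo "$snapshot"   # /.snapshots/42/snapshot/home/user
```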
**question 2:**
in what way/order does this `< <(...)` style of coding feed the paths to the loop?
because this (my test script, adapted for folders)
```
file=$1
while read a
do snapshot=$(echo ${a%%/$file})
   echo "$snapshot snapshot"
   # btrfs property set $snapshot ro false
   # rm -rf $a ## delete command adapted for folders
   echo "found $a"
   # btrfs property set $snapshot ro true
done < <(find /home/ -type d -path $file) ## find command adapted for folders and paths
```
does not give me any output, although the find command on its own totally does.
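Regarding question 2: `< <(cmd)` is bash process substitution (a bashism, so the script must run under bash, not sh). It feeds cmd's stdout to the loop's stdin, one line at a time, in exactly the order the command prints it; a minimal demo, with invented input:

```shell
# bash-only: the loop's stdin is the command's stdout, line by line,
# in emission order.
while read line; do
  echo "got: $line"
done < <(printf 'first\nsecond\n')
```

One likely culprit for the empty output, as a hedged observation: find's `-path` matches against the whole pathname, so a bare folder name usually needs a leading wildcard, e.g. `-path "*/$file"`.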
Also, why do we still not have tools that make sense for btrfs filesystems? Like:
- a GUI option in the file manager to delete a file/folder through all snapshots
- a GUI option in the file manager to easily compare/restore older versions of a file (like Windows)
This should be quite easy and would be really helpful...
Can anyone tell me why these `"` quotes don't prevent filenames with spaces from getting interpreted as separate arguments?
I think it is the second pair.
find /home/user/pdfs/ocr -type f \( -iname "*.pdf" -and -not -iname "*_ocr.pdf" \) | while read file ; do ocrmypdf -q -l deu+eng --rotate-pages --rotate-pages-threshold 8 -c -s "$file" /home/user/pdfs/ocr_fertig/$(basename "$file" ".pdf")_ocr.pdf && rm "$file" ; done
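As a small demonstration of the splitting behaviour in question (the filename here is invented): a command substitution that is left unquoted gets word-split on spaces after expansion, while a quoted one stays a single argument:

```shell
# Invented filename containing a space; only the quoting differs.
f='my scan.pdf'
printf '[%s]\n' $(basename "$f" .pdf)_ocr.pdf    # unquoted: splits into [my] [scan_ocr.pdf]
printf '[%s]\n' "$(basename "$f" .pdf)_ocr.pdf"  # quoted: one argument [my scan_ocr.pdf]
```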
Is it possible that find -exec still runs the command if no file was found?
I am trying this:
find /home/user/folder/pdfs/ocr -type f \( -iname "*.pdf" -and -not -iname "*_ocr.pdf" \) -exec bash -c 'ocrmypdf -q -l deu+eng --rotate-pages --rotate-pages-threshold 8 -c -s "{}" /home/user/folder/pdfs/ocr_fertig/$(basename "{}" ".pdf")_ocr.pdf' \; -exec ls "{}" \;
But if there is no file in the folder, then I get an error from #ghostscript that origin.pdf was not defined, so I wonder how I can get -exec to run the command only if at least one matching file was found?!
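As far as I know, with the `\;` form `-exec` runs once per matched file, so zero matches should mean zero invocations; a quick sanity check (the scratch directory is invented, and a fresh mktemp directory is guaranteed empty):

```shell
# A freshly created temp directory is guaranteed empty.
d=$(mktemp -d)
find "$d" -type f -name '*.pdf' -exec echo "would process {}" \;
# prints nothing: -exec never runs without a match
```

If ghostscript still complains, it may be worth checking whether something else in the pipeline runs unconditionally.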
Unleash the power of the Linux console thanks to GPT!
#doctorUbuntu #gpt #AI #codage #python #shell #bash #commandesbash #linux
https://youtu.be/1XJbhLBy4Vk
#AI #bash #linux #gpt
The GitHub link to V3 of DoctorUbuntu is in the description.
Today I produced the following script, that receives as input an email containing a DMARC XML report as an attachment, extracts the attachment, decompresses it, and queries the report to see if there are failures coming from any of the IPs that I use to send mail (ignoring those I don’t use/control), and sends an error message and the decompressed file to a room on my Matrix server for any failures it finds.
```
#!/bin/bash

TMP=$(mktemp -d)
SOURCE_IPS=$(host mail.koehn.com | grep address | awk '{print "\""$(NF)"\","}' | tr '\n' ' ' | sed 's/, $//')

cd "$TMP" || exit 1

function cleanup {
  rm -rf "$TMP"
}
trap cleanup EXIT

FILES=$(munpack -f 2>/dev/null | awk '{print $1}')

for file in $FILES ; do
  if 7z e -so "$file" | xidel --data - --xquery './/row[source_ip=('"$SOURCE_IPS"') and (policy_evaluated/dkim="fail" or policy_evaluated/spf="fail")]' 2> >(grep -v "Processing: stdin") | grep . ; then
    mc -m "🔴 Received DMARC report containing failures: $file"
    mc -f "$file"
  fi
done
```
I worked quite hard to solve this problem, and I’m happy with its (eventual) simplicity. When you have (for example) two Kubernetes containers in a pod (or two processes that can share a named pipe) and you need to run a process on one of them from the other one, I have just the tool for you. It’s basically ssh without all the pesky networking, using named pipes instead of TCP streams.
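The post doesn't show the tool itself, but the core idea can be sketched as a toy: a "server" process executes one command received over a request pipe and answers on a response pipe. This is NOT the author's implementation; every name here is invented:

```shell
# Two named pipes stand in for the shared volume in a pod.
req=$(mktemp -u)
resp=$(mktemp -u)
mkfifo "$req" "$resp"

# "server" side (in the real setup: the other container sharing the pipes):
( read -r cmd < "$req" && sh -c "$cmd" > "$resp" ) &

# "client" side:
echo 'echo hello from the other container' > "$req"
cat "$resp"
rm -f "$req" "$resp"
```

A real version would loop, multiplex streams, and propagate exit codes, which is presumably where the hard work went.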
Many remote (and even local) commands can be executed much quicker using Bash instead of graphical user interfaces, but for newer users, Bash can be an unfriendly and cumbersome environment.
In this video I show how, with the installation of just three applications and some alias commands, we can make working with Bash much faster, easier, and prettier!
See https://youtu.be/OR2G9OSlmVI
#technology #Linux #Bash #tips #productivity
#Blog, #bash, #linux, #productivity, #technology, #tips
I rebuilt a system for keeping electronic notes for other programs, moving it from a flat text file to a MySQL database table and fixing the script and the two programs that use that information.
The name, in English, is a bit like ‘rubbish’, which holds dear memories of my first job, where we had a similar system to keep pointers, last-used-dates and other unrelated stuff together.
This month:
* Command & Conquer
* How-To: Bash to Python, Migrating from VAX/VMS, and LaTeX
* Graphics: Inkscape
* Everyday Ubuntu: Diagramming with Dia
* Review: Xubuntu 22.04
* Review: Void Linux
* Ubuntu Games: Crystal Caves HD
plus: News, My Opinion, The Daily Waddle, Q&A, and more. Get it while it's hot!
#magazine #bash #crystalcaves #dia #diagram #inkscape #latex #python #qa #vax #vaxvms #vms #void #voidlinux #waddle #xubuntu #fullcirclemagazine #ubuntu #linux
I'm thinking of translating it.
#Linux #Shell #Bash #CommandLine
#LinuxCommandLibrary (4638 manual pages, 21 basic categories and a bunch of general terminal tips.)