#blog

danie10@squeet.me

Solved e-mail not working in luckyBackup app with smtp-cli app

luckyBackup options pane for e-mail settings showing the command and various parameters that have been set. It also lists some options that can be set as text, such as the sender address, SMTP server, and text body.
I love the free and open source luckyBackup app for doing rsync backups of my user data: rsync is fast and reliable, and you can easily verify the backed-up files yourself. I’ve been using luckyBackup for many years now, despite development having stopped around 12 years ago (little has changed with rsync, which it uses, and rsync itself still gets updates).

The only problem I, and many others, had was when Google tightened up SMTP access for sending mail: we started getting authentication errors for SMTP logins with app passwords. I’d seen many complaints, but no easy solution was ever mentioned for luckyBackup, which runs a command-line program to send its logs via e-mail.

This week it became critical, as I noticed my backups had stopped running a few days ago and, of course, I got no error e-mail. I looked at using Duplicati instead, which is really excellent and saves masses of space, but it is horribly slow, especially on the first run. For my 1.1 TB of data it has now been running nearly 24 hours and still has 230 GB to go!

So, a renewed effort on luckyBackup finally produced a breakthrough. I looked at alternative CLI mail apps and actually got ssmtp to work by adding the option AuthMethod=LOGIN. This made me realise that Gmail’s SMTP server (and some others) is not entirely standard. But ssmtp does not execute properly from within luckyBackup.

Then I came upon a comment in a post where someone said they had created an open-source command-line app called smtp-cli, which also does some diagnostics and takes optional parameters, including an --auth-login option. And it worked, including from inside luckyBackup!
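
As a rough sketch of what such a command could look like (flag names here are from my reading of the project’s help output, so verify them with ./smtp-cli --help; the addresses and app password are placeholders):
./smtp-cli --verbose --host smtp.gmail.com:587 --auth-login --user you@gmail.com --pass 'your-app-password' --from you@gmail.com --to you@gmail.com --subject "luckyBackup log" --body-plain "Backup run finished, log attached" --attach /path/to/luckybackup.log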

The app was last updated about 5 years ago, but I see various issues have been closed, and more than once the response was that no update was actually required, as users can use the numerous parameters to handle various situations. It also has a --verbose option that gives lots of feedback on its progress or problems.

This app is also pretty useful if you’re having issues e-mailing from cron jobs, as it handles that pretty well too.

I just wish I’d come across this app sooner!

See https://github.com/mludvig/smtp-cli
#Blog, #luckybackup, #opensource, #smtp, #technology

danie10@squeet.me

Magnets are switching up the keyboard game with an additional keystroke setting

Close-up view of a keyboard showing just a few white keys, with a cube-shaped key switch resting on top of one of the keys. The switch is made of plastic with a black base and a transparent top. Protruding from the top of the switch are a few millimetres of pink stalk that the keycap would be attached to.
These keyboards rely on magnets and springs and activate by sensing changes in the magnetic field. Popularized by Dutch keyboard startup Wooting, these switches rely on the Hall Effect and have actually been around since the 1960s.

You can change how far you need to press down to register the keystroke, as well as for the release point.

The one thing you can’t change, though, is the switch’s resistance. Despite all the talk of magnets, that’s still handled by the spring inside the switch, after all (for the moment, until the xyz is released).

Interestingly, this also means that with temperature differences you may have to “calibrate” your keyboard. At around US$110 to $150, the price of the Akko MOD007B PC Santorini keyboard is certainly no higher than that of many mechanical keyboards.

See https://techcrunch.com/2024/04/07/magnets-are-switching-up-the-keyboard-game/
#Blog, #keyboards, #technology

danie10@squeet.me

How to update the firmware on Raspberry Pi

Fingers holding a Raspberry Pi computer board. In the background is the glass door of a PC with some RGB lights shining.
Essentially, firmware is a form of low-level software that instructs hardware on how to operate and interact with other devices and components. For instance, firmware tells a computer to turn on when you press the power button, and it also tells a Raspberry Pi how to read micro-SD cards and USB drives.

Depending on what you do with your Raspberry Pi, you might never need to update its firmware. Aside from the occasional bug fix, you only have to update your Raspberry if you upgrade a project with new processors, memory chips, or printed circuit boards. You probably won’t need a firmware update if you only use the Raspberry Pi to power a mini RC rover, but if you feel like adding a bit more processing power to a device running Windows 11, you’d better install new firmware.

So, this may mostly not be required, but it is handy to know about if you plan to re-purpose your Pi with newer hardware.
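
For reference, on Raspberry Pi OS (Pi 4 and 5) the bootloader firmware can be checked and updated from the terminal with the bundled rpi-eeprom-update tool; a minimal sketch:
sudo apt update && sudo apt full-upgrade -y # also pulls in the latest firmware packages
sudo rpi-eeprom-update # compare current vs latest bootloader EEPROM
sudo rpi-eeprom-update -a # stage the update if one is pending
sudo reboot # the staged update is applied at boot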

See https://www.xda-developers.com/how-to-update-the-firmware-on-raspberry-pi/
#Blog, #raspberrypi, #technology

danie10@squeet.me

Starlink in Zimbabwe: Techies Find Ways to Disguise Kits, Evading Authorities

Split view showing on the left a Starlink satellite dish standing on a roof, while on the right is what appears to be just a solar panel. Bottom right is the word Techzim.
In a country choked by high internet costs and limited options, Zimbabweans are turning to remarkable ingenuity. Facing arrest and equipment seizure for using Starlink, tech-savvy individuals have devised a way to disguise the kits.

One such individual who communicated anonymously with us here at Techzim has said he’s helping people hide their Starlink terminals from the authorities.

They modify the terminal so that it looks like a solar panel, or just a light. They can also make it work without the indoor router, removing any evidence of a Starlink connection even if the authorities are suspicious or a neighbour snitches.

This now seems to have become a sort of business, helping people do this via kits that can be purchased. It is certainly also needed in countries where the Internet itself is censored (LOL, yes, including Australia).

It’s just really sad, too, where suppliers can get away without enough innovative competition to force their prices down in a competitive market. Governments, too, are not always putting their citizens first, as licensing seems to have other objectives. Ask citizens, and they’ll probably just say they want choices.

See https://www.techzim.co.zw/2024/04/starlink-in-zimbabwe-techies-find-ways-to-disguise-kits-evading-authorities/
#Blog, #africa, #technology, #zimbabwe

danie10@squeet.me

This Asus PC case monitors your dust filter so you don’t have to

Side view of a black computer case, with a dust filter mesh resting against the side of the case.
Traditionally, one would have to periodically check the status of the dust filtering on a PC case, but that’s not the case (pun intended!) with the Asus ProArt PA602. This chassis has a fancy infrared (IR) sensor behind the front-facing dust filter. Should this detect a set layer of dust covering the filter material, a small LED will illuminate on the side of the case. It’s tastefully done. No alert on an LCD screen, no obnoxious sound. With this activated, you will know to clean the filter (and give the inside a quick air blast) next time the system has been shut down.

Quite a thoughtful case: apart from the dust filter warning, it also has wheels so it can be moved more easily.

But it does show that even cases can innovate. I’d like to see more of this, and maybe sensors on the other dust filters too (my case has one underneath as well), as IR sensors themselves are not expensive to incorporate.

See https://www.xda-developers.com/this-asus-pc-case-monitors-your-dust-filter/
#Blog, #cases, #dust, #technology

danie10@squeet.me

German state of Schleswig-Holstein ditches Windows, Microsoft Office for Linux and LibreOffice

An old fashioned wooden sailing ship in the background on a canal. On both sides of the canal are tall wooden houses with Tudor style-woodwork. In the foreground is a group of penguins.
Schleswig-Holstein, Germany’s most northern state, is starting its switch from Microsoft Office to LibreOffice, and is planning to move from Windows to Linux on the 30,000 PCs it uses for local government functions.

The announcement (in German) was made yesterday by the state’s Minister-President Daniel Günther, who has served in that position since 2017. According to a translated version of the announcement, independence was a key motivation for switching to open source software.

This is unlike the reasons given by Munich and Lower Saxony, which were stated to be cost savings, after which Microsoft discounted their services. Back when LiMux started, it was mostly seen as a way to save money. Now the focus is far more on data protection, privacy and security. Consider that the European Data Protection Supervisor (EDPS) recently found that the European Commission’s use of Microsoft 365 breaches data protection law for EU institutions and bodies.

See https://www.theregister.com/2024/04/04/germanys_northernmost_state_ditches_windows/
#Blog, #datasovereignty, #germany, #opensource, #technology

danie10@squeet.me

Android’s Find My Device network settings FINALLY start going live for some users

Smartphone with screen showing a title Device finders, and Find my Device set to on.
Nearly a year ago at Google I/O, the new “Find My Device” network for trackers was announced for Android, but then delayed indefinitely for the benefit of iPhone users. Now, finally, the network is starting to roll out – sort of.

The Find My Device network piggybacks off of all Android phones with Google Play Services to help users find lost items including phones and smartwatches, as well as trackers, headphones, and more. Google first announced the network in May 2023 with the goal of rolling it out in the months to follow, but the company later announced a delay.

Yes, the hold-up was supposedly Apple refusing to implement its part of the deal: warning Apple users that an Android tracker was travelling with them. And yes, this despite Google having long ago implemented the equivalent warning for Android users. A whole lot of new trackers for Android were supposed to be based on this new standard.

So, I’ve been sitting with a set of AirTags that warn me daily that they are following me around, because I could not yet buy the new trackers for Android. Let’s just hope this now moves ahead with some speed. No-one wants to buy outdated trackers that only work with Samsung, or Tile, or whoever. The new standard will allow any Android device to find your lost tag, which is how Apple’s AirTags work for any iOS device passing by.

See https://9to5google.com/2024/04/03/android-find-my-device-network-live-early/
#Blog, #android, #technology, #trackers

danie10@squeet.me

UK govt office admits ability to negotiate billions in cloud spending curbed by vendor lock-in

Man sitting at a boardroom table examining a piece of paper. Around him are positioned various other executives.
It’s one of the points I’ve been making since the beginning of enterprise cloud services. It’s not to say that a cloud service is necessarily bad, but you have no control (and often no easy way out) if prices jump (remember Microsoft changing their SQL database licenses from per CPU to per core – I think it was that way around).

Quite often, too, cloud providers use their own proprietary formats inside their cloud, so everything works wonderfully, but what went out of the window were many governments’ requirements around open data standards. There was one very good reason for open data standards: easy portability to any other service, e.g. using the ODF document standard and then moving from LibreOffice to, or from, FreeOffice.

Combine both of these and you’re in a tightish spot. You may also have very few skilled IT staff left, because all your services now sit in someone else’s cloud.

So, you just want to factor all of this in very carefully when considering whether to go into a cloud or not. We won’t even mention the UK govt’s experiences recently with Oracle…

See https://www.theregister.com/2024/04/04/uk_cddo_admits_cloud_spending_lock_issues_exclusive/
#Blog, #cloud, #technology, #UK, #vendorlockin

danie10@squeet.me

Overview of Memories Advanced Photo Management Suite that installs inside Nextcloud

Memories open-source selfhosted photo management with an image showing a whole lot of polaroid style photos scattered on a table
Memories is a fast, modern and advanced photo management suite that installs quickly and easily inside Nextcloud. My video contrasts it with the Photos app that comes with Nextcloud, and highlights some reasons why you may want to use it instead of Photos. This app has face, object, landmark, place, and human action recognition capability through the Recognize app. It’s not that obvious, but albums can be shared, and photos commented on, with other registered Nextcloud users via the underlying Nextcloud file commenting system.

Memories is a great way to collaborate and share photos privately with friends and family, and even to share public links to some of your albums. It can even work on a Raspberry Pi hosted in the home.

It also has apps for iOS and Android, which can optionally auto-upload photos into Memories.

Because titles and descriptions are saved to, and read from, the photo’s EXIF headers, importing into or exporting out of Memories is much less of a chore when migrating between photo services.
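
Because that metadata lives in the file itself, you can verify it with any EXIF tool, for example the common exiftool utility (the file name here is just a placeholder):
exiftool -Title -Description holiday.jpg # show the embedded title and description
exiftool -Title="Sunset at the beach" holiday.jpg # write a new title into the file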

Watch https://youtu.be/2A6u0AluCnI
#Blog, #opensource, #photomanagement, #selfhosting, #technology

danie10@squeet.me

Android 15 may be taking privacy to a whole new level by hiding locations even from the network

Silhouette of a few people talking and glancing at their phones
In a surprise move that’s more iPhone than iPhone, it looks like Android may be taking privacy to the next level, with new devices able to hide their locations even from their networks.

Even more markedly, Google has also previewed a feature to protect devices from IMSI catchers and intercept platforms. These are the technologies used by law enforcement, and sometimes criminals, to capture phone IDs and trick those phones into connecting to a copycat network, allowing calls and messages to be intercepted.

As ever with this level of phone security and privacy, it’s good to have even if it won’t be a game-changer for the vast majority of users. But for journalists, politicians, celebrities, dissidents and protesters, this is a major advance in the improvement of personal privacy and data security and is as welcome as it is surprising.

I suppose, too, that with the global adoption of RCS, insecure text messaging can become a thing of the past soon(ish). I wonder if the bulk SMS companies have aligned with this yet.

See https://www.forbes.com/sites/zakdoffman/2024/03/29/google-upgrade-samsung-s24-s23-pixel-vs-apple-iphone-15-pro-max/
#Blog, #privacy, #technology

danie10@squeet.me

LocalSend and SnapDrop can be used for universal AirDrop instead of pushing to Apple

White screen with title 'click to send files or right-click to send a message' and three blue icons with names such as Purple Rodent and Orange Pidgeon, as options. At the bottom is a blue Wi-Fi type icon with the label, You are known as Purple Salamander.
Firstly, AirDrop only works between Apple devices, but there is a lot more to the world than just Apple devices! These open source apps (private and secure, despite what Apple claims) will do universal file transfer between Apple, Linux, Windows, Android, and other devices.

They work on the same LAN, so the traffic stays local. Snapdrop can even be self-hosted by you. PairDrop (also a .net address) is a fork of Snapdrop that offers a few extra bells and whistles, like permanently pairing your devices (though Syncthing is better for that) as well as a temporary public room.
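
For the self-hosted route, a minimal sketch using Docker (this assumes the LinuxServer.io snapdrop image; adjust the port and config path to your setup):
docker run -d --name=snapdrop -p 8080:80 -v /path/to/config:/config --restart unless-stopped lscr.io/linuxserver/snapdrop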

See https://arstechnica.com/gadgets/2024/03/the-two-apps-i-use-when-i-need-airdrop-on-non-apple-devices/
#Blog, #alternativeto, #filetransfer, #opensource, #technology

danie10@squeet.me

Proton Pass now supports passkeys on all devices and plans: Beating Bitwarden to mobile devices

Popup window with title Passkey, and itemises information underneath such as username, domain, key, and created date.
Passkeys are an easy and secure alternative to traditional passwords that can help prevent phishing attacks and make your online experience smoother and safer.

Unfortunately, Big Tech’s rollout of this technology prioritized using passkeys to lock people into their walled gardens over providing universal security for everyone (you have to use their platform, which often does not work across all your devices). And many password managers only support passkeys on specific platforms, or only provide them with paid plans, meaning you only get to reap passkeys’ security benefits if you can afford them.

Proton, by contrast, has reimagined passkeys, helping them reach their full potential as free, universal, and open-source tech, and making online privacy and security accessible to everyone, regardless of what device you use or your ability to pay.

I’m still a paying customer of Bitwarden, as Proton Pass until now did not do everything I needed, but this may make me re-evaluate Proton Pass, especially as I’m already a paying Proton customer. It certainly looks like Proton Pass is advancing at quite a pace, and Proton has already built up a good reputation for private e-mail and an excellent VPN client.

Proton is also the ONLY passkey provider that I’ve seen allowing you to store, share, and export passkeys just like you can with passwords!

See https://proton.me/blog/proton-pass-passkeys
#Blog, #opensource, #passkeys, #ProtonPass, #security, #technology

danie10@squeet.me

ActivityPub plugin for WordPress Adds Your WordPress Site to the Fediverse

Screen headed ActivityPub with a welcome message and a form for the author profile to be completed with a username and profile URL.
This plugin effectively turns your WordPress blog into a one-person Mastodon (or other microblog) instance on the Fediverse. You carry on blogging as normal, but anyone from right across the Fediverse can find and follow you, and reply to and like your posts.

What is really nifty is that replies from the Fediverse appear as comments on the WordPress blog post. So, you manage all the interactivity from inside WordPress.

This is ideal for those who are primarily focussed on their blog and would prefer to manage things there. Their Fediverse address will be that of their blog (with that domain name). In this way, you also own your Fediverse address and retain your blog’s branding.
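
Under the hood, Fediverse servers locate such an account via WebFinger, so a quick way to check that your blog’s new address resolves is a request like this (the domain and username here are placeholders):
curl 'https://yourblog.example/.well-known/webfinger?resource=acct:author@yourblog.example' # should return JSON pointing at the ActivityPub actor profile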

This is perfect for businesses who run WordPress news blogs already, and who do not want to establish, and manage, a separate Fediverse account. Retaining their own branding is also the cherry on the top for them.

If you already have a Fediverse address, then that stays separate from your blog’s address.

See https://lifehacker.com/tech/make-your-wordpress-site-part-of-the-fediverse
#Blog, #blogging, #fediverse, #technology, #wordpress

danie10@squeet.me

Google Just Revealed When Apple Will Officially Adopt RCS: Possible Northern Hemisphere Fall 2024

Woman smiling at, and holding a smartphone. Title text says "better messaging for all. Apple has announced it will be adopting RCS in the fall of 2024. Once that happens, it will mean a better messaging experience for everyone".
The Android developer just published an updated landing page for Google Messages, showing off key features ranging from customization, privacy and security, and, of course, AI.

On this landing page, there are different sections for each feature set, including one for RCS. As spotted by 9to5Google, if you expand this list of RCS features and scroll to the bottom, you see a section on “Coming soon on iOS: Better messaging for all.” That’s no surprise: We’ve known Apple was adopting RCS since November. However, it’s the next line that brings the news: “Apple has announced it will be adopting RCS in the fall of 2024.”

Of course, this does not say a lot, as “in the fall” covers a span of a couple of months, and Google has tried to embarrass Apple into making moves before. There is, though, the looming US court case against Apple, which is keeping the pressure on. If it were not for that case, I would have guessed Apple might have pulled out after the EU ruled that Apple was not a dominant player in the market (although the EU case was looking more at interoperability between Apple Messages and WhatsApp and others).

Of course, with Apple actually including RCS now, it can probably argue that there is interoperability via RCS between its platform and Android too. It must be remembered that in many countries, like mine, SMSes are paid for, so they are very expensive for any form of chatting, and the costs climb steeply when you text an international number.

I personally have quite a few issues with interoperability with Apple:

  • I still have AirTags from when I had an iPhone, and I get daily audio beeps warning me the AirTags are not connected (I use an Android phone and alternate between an iPad and an Android tablet)
  • I can’t wait to sell my AirTags and get the new ones Google was working on that will interoperate with Apple, but Apple has supposedly been delaying building that support into its devices (Google already built support for AirTags into Android in 2023)
  • Because I was on Apple Messages and my iPad still sometimes connects, I find messages on my iPad that arrived a week ago which I had not seen (I had Beeper, which was solving this problem)

Apple is not at all dominant outside the USA, but interacting with Apple users is still quite a pain, as Apple has gone out of its way to keep its users inside the walled garden.

See https://lifehacker.com/tech/google-just-revealed-when-apple-will-officially-adopt-rcs
#Blog, #apple, #interoperability, #RCS, #technology

danie10@squeet.me

The Ultrahuman Ring Air: The best smart ring for fitness junkies beats the Oura in 4 major ways

Fingers of a hand, holding a dark coloured ring. In the background is a blurry view of a city.
The Ultrahuman Ring Air is a subscription-free smart ring with AI health insights I have never seen before and data that rivals Oura’s.

It is really great to see masses of new entrants to the smart ring market in the last year. It shows there is growing demand in this segment of the market, and many people do want fitness devices that are more compact and comfortable to wear.

None of them, though, as far as I’m aware, does real-time heart rate for exercising. But they are great for body temperature differences, blood oxygen, sleep stats, step counting, and overall health and fitness.

Oura has become too expensive for many with their monthly subscription, so I think these once-off purchase entrants will be really welcome, and more competition in the market is usually good for consumers.

See https://www.zdnet.com/article/the-best-smart-ring-for-fitness-junkies-beats-the-oura-in-4-major-ways/
#Blog, #fitness, #smartring, #technology

danie10@squeet.me

This YouTuber shows off running Steam games on a Raspberry Pi 5

Raspberry Pi computer box in front of a TV
As you might expect, you’re not going to be getting Baldur’s Gate 3 running on max settings on a Raspberry Pi 5. However, the actual results were still pretty impressive. Simple 2D games like Brotato ran fine, and older 3D games like the original Portal were very playable. Strangely enough, Terraria gave the Pi 5 a hard time, despite not being a graphically intense game.

Yes, it’s not a gaming computer by any means, but it is pretty good bang for the buck, and also pretty versatile.

See https://www.xda-developers.com/youtuber-running-steam-games-raspberry-pi-5/
#Blog, #gaming, #raspberrypi, #technology

danie10@squeet.me

High Level Steps to Migrate my Docker Hosting to a Different Hosting Service

Hosting migration
The planning took a lot longer than the actual move, and the downtime for my main blog was really only around 10 minutes. But this has inspired me to document some of the steps I took, for my own future reference. Also, many of the videos and guides I watched dealt only with a single aspect of the migration, such as backing up and restoring Docker volumes.

I’m not going to make this a detailed step-by-step guide, as everyone’s situation is different, but hopefully this conceptual overview, with some detail and links, will help many others. I’m not sure yet if I’m going to do an explainer video, but maybe this post will help me make up my mind.

My hosting environment is a hosted VPS running Ubuntu Linux, with Docker and Portainer, which hosts various web services, each in its own Docker container with related Docker volumes for persistent storage. One of those containers is an Nginx Proxy Manager reverse proxy, which routes incoming requests to the correct web services. There is also an OpenVPN container that allows me to authenticate and drop into the LAN environment to do maintenance. The DNS (resolving the URL for each service) is handled through a free Cloudflare service, which points to the server’s main public IP address. The theory is that as soon as a service has been backed up, restored, and tested on the new server, I update the IP address in the DNS to point to the new server, and visitors get routed immediately to the new server without noticing the difference.

The high-level steps one needs to perform are:

Check some details on your existing server environment, such as the following (the command sketch after this list helps gather most of these):

  • what Linux users are being used apart from root – I had a user Soft and there was also a www-data user
  • what firewall ports are open
  • what cron jobs are running
  • what Docker networks are running e.g. I have mysql-net created to share a common database
  • which containers are using the shared database as I want to restore them together straight after the database is migrated (they need the database to function)
  • check what volumes are attached to which containers, and see what user file permissions are present (I did not need this because I used Docker-Backup)
  • Note down the container ID for each container. This is needed for the Docker-Backup app, and also serves as a checklist as to which have been restored.
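
A rough sketch of the commands for gathering the above on the existing server (assuming Ubuntu with ufw, as in my setup):
cut -d: -f1 /etc/passwd # list the Linux user accounts
sudo ufw status # which firewall ports are open
crontab -l # cron jobs for the current user
docker network ls # Docker networks, e.g. my mysql-net
docker ps -a # all containers with their IDs
docker inspect --format '{{json .Mounts}}' <containerID> # volumes attached to a container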

Perform a Portainer backup inside Portainer – this saves all the Stacks with their configs, to recreate the container images on the new server. It does not back up Docker networks or volumes.

Before I started the actual backups, I first got the new VPS set up with the following (see the command sketch after this list):

  • its OS updated
  • firewall set up, with ports 22 (SSH), 80 (HTTP), 443 (HTTPS), and 1143 (OpenVPN) open, plus ports 9000 (Portainer) and 81 (NginxPM) open temporarily
  • Fail2Ban installed
  • configured ssh passwordless login (public/private key), and then disabled password logins on the new server
  • I created a user Soft (I forgot to create www-data but things did work)
  • Docker installed
  • Portainer installed, and I restored Portainer (when you start it up you have that option at login), but I did nothing further with it
  • created that one custom Docker network that I was using to share data between my containers (can do this in Portainer, or from CLI)
  • installed Go (required to build Docker-Backup) and installed Docker-Backup on both the existing and the new server
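
A condensed sketch of that new-server preparation (assuming Ubuntu with ufw and OpenSSH; the port numbers and network name are from my setup above):
sudo apt update && sudo apt full-upgrade -y
sudo ufw allow 22/tcp && sudo ufw allow 80/tcp && sudo ufw allow 443/tcp
sudo ufw allow 1143 && sudo ufw allow 9000/tcp && sudo ufw allow 81/tcp # 9000 and 81 only temporarily
sudo ufw enable
sudo apt install -y fail2ban
ssh-copy-id root@<new-server-ip> # run from your desktop, then disable password logins in /etc/ssh/sshd_config
docker network create mysql-net # recreate the shared Docker network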

  • Stop the containers on the existing server before backing up, to ensure all data is written to disk and databases can be safely copied. I started with the stand-alone services that were not using the shared database, so I could first test them being copied and restored on the new server. My OpenVPN, NginxPM, Wordle, and Glances containers all worked without the shared database.

  • Run Docker-Backup on the existing server for the first one or two container IDs, from inside /root/docker-backup, with the command ./docker-backup backup --tar <containerID>. Running as root saved the compressed archives to /root/docker-backup as .tar files. This application includes all the volumes for a container, as well as user file permissions.

  • Log into a terminal on the NEW server and go to /root/docker-backup (because we want to pull the backed-up file across into the same location on the new server). Copy the backed-up file from this directory with the command scp root@191.101.59.143:/root/docker-backup/backedupfile.tar backedupfile.tar . This brings the backed-up file across to the new server, under the same name, in /root/docker-backup. Note that for a remote scp copy the existing server needs to allow password login, so you may need to re-enable that for this to work.

  • Run Docker-Backup with the restore option to recreate the container’s volumes, with all the persistent data and user file permissions, from the backed-up file (see the command sketch after this list). The shared database is also just a container volume being restored. You can browse /var/lib/docker/volumes to check that the persistent data has been restored. I noted two issues, though. One was that sometimes a volume name like glances would not be recreated with that name, but instead as a long string of digits (I have no idea why), so I’d verify from its contents what it should be and then rename it back to glances (or whatever). I also often got the error: Error response from daemon: No such image: kylemanna/openvpn:latest. I solved this in my case by just running docker pull <image-name> to pull the image before running the restore again, which then worked.

  • In theory you could now just spin up that container, and it should work by finding its persistent data in its volumes. For some containers this did not work, and what fixed it was going into Portainer, opening the Stack for that service, and forcing a Stack Update (under Stacks/Edit). That way I also ensured the container and volumes were all properly connected. Don’t worry: it won’t overwrite any of the volume data you’ve restored.

  • The above is why I also temporarily opened ports 9000 and 81, so that I could run Portainer and NginxPM from my web browser with just the public IP of the server (easy, and I locked them down as soon as I’d finished setting up).

  • That, in essence, was it. I did this one container at a time, testing each one. When it came to my blog and the shared database, I did that batch all together, with the database restored first so that the others could all come online and connect to it.

  • My last steps, after everything was working and tested, were to close ports 9000 and 81 on the new server, shut down the old server, and notify the provider to cancel the account.
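
The per-container restore loop on the new server looked roughly like this (a sketch; I’m assuming the restore subcommand mirrors the backup one shown earlier, so check ./docker-backup --help if in doubt):
cd /root/docker-backup
docker pull kylemanna/openvpn:latest # pre-pull the image if the restore complains it is missing
./docker-backup restore backedupfile.tar # recreate the container's volumes, data, and permissions
ls /var/lib/docker/volumes # verify the restored volume names and contents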

So, like I said, the actual WordPress blog was only down for around 10 minutes while I backed up and restored that batch with the database. I did my 11GB of Piwigo photos last, as that took the longest to copy across and restore. I ditched my Immich photos setup, as I realised I’d complicated things by having it connect directly into the Piwigo photos volume. That resulted in the Immich backup including the 11GB of photos and, worse, restoring them to an odd location no longer linked to Piwigo. But no harm done, as it was just a test install and all my photos were still safely in, and attached to, Piwigo.

The Docker-Backup install guide was pretty long, across more than one page, so I’ll summarise my steps here, based on its instructions at https://github.com/muesli/docker-backup and the Go install instructions at https://go.dev/doc/install/source using:
wget -c https://go.dev/dl/go1.22.1.linux-amd64.tar.gz
rm -rf /usr/local/go # In case it exists already
tar -C /usr/local -xzf go1.22.1.linux-amd64.tar.gz
export PATH=$PATH:/usr/local/go/bin
go version # To check it's installed
Installing Docker-Backup itself:
git clone https://github.com/muesli/docker-backup.git
cd docker-backup
go build
./docker-backup # Should show help info
In hindsight, maybe I would try just a straight backup of the volume data using the docker command docker run --rm --volumes-from dbstore -v $(pwd):/backup ubuntu tar cvf /backup/backup.tar /dbdata and its corresponding restore on the other side (sketched below).
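
For reference, the matching restore from the Docker documentation’s volume-backup pattern looks roughly like this (dbstore2 being the freshly created container on the new server):
docker run --rm --volumes-from dbstore2 -v $(pwd):/backup ubuntu bash -c "cd /dbdata && tar xvf /backup/backup.tar --strip 1"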

It all went through fine, although I also realised that my Nextcloud setup was quite out of date. I’ve been running it for many years, and some of the configuration has actually changed. I decided to wipe the container as well as all its volume data, as I’m the only user and the data is synced from my desktop PC anyway. The new install is way faster, and as soon as I connected the Nextcloud desktop sync again, all my documents and photos were resynced to the server.

All in all, it took about a full day of research and testing bits of this to see how I’d do it, and I used an afternoon the following day to do the actual migration. It was lots of fun, and now at least I know how to switch hosting providers again if I need, or want, to.
#Blog, #docker, #hosting, #migration, #technology

danie10@squeet.me

Can you safely revive a dead lithium-ion battery? Yes – here’s how

Four batteries standing vertically, placed next to each other. The top halves are yellow and the bottom halves are silver.
More and more devices now come kitted out with rechargeable lithium-ion batteries — you know, the ones that look like the old-style AA or C cell batteries, but are a slightly different size. The most common size is the now ubiquitous 18650, but there are loads of other sizes in use too, such as the 14500, 16340, and 26650.

These batteries are incredibly safe if treated properly, especially when you consider how much power they contain, and can last for many years and hundreds of discharge cycles before needing to be replaced.

Sometimes basic chargers can be “better” than clever chargers! The chief issue being solved here is when a lithium-ion battery has discharged below its cut-off voltage point, at which many clever chargers will refuse to charge it at all.

See https://www.zdnet.com/home-and-office/can-you-safely-revive-a-dead-lithium-ion-battery-yes-heres-how/
#Blog, #batteries, #technology

danie10@squeet.me

Some Downtime Due to Migration of Hosting

Hosting migration
Periodically I love a serious challenge…

My website at https://gadgeteer.co.za/category/uncategorized/feed as well as my photos site etc. may experience intermittent downtime from 20th to 21st March 2024 as I migrate my hosting to a new hosting provider. I’ll have double the RAM and a good 60GB of extra storage for around the same price, so hopefully it is worth it.

As a second phase, I’m also going to test removing Cloudflare DNS, to see if the site is still responsive without it.

It is a Docker container migration, so maybe there’ll also be a post about it showing a cheat-sheet-type list of the actions taken to get this to work.
#Site_Notice, #Blog, #siteupdate, #technology
