[Dxspider-support] G6NHU-2 has moved!
Keith, G6NHU
g6nhu at me.com
Wed Apr 16 10:29:39 BST 2025
Warning - Long post ahead!
In the lead-up to doing this, I reduced my number of node partners by over 50%. I'd never said no to a partner request, and that meant I just had far too many partners.
After a couple of weeks of trying different things and working out the best solution for me, on Monday evening I moved my node from a Raspberry Pi5 at home to a Digital Ocean Droplet.
I should start by saying that there's nothing wrong with a cluster node running on a Pi5, it works really well, especially when using an external SSD instead of an SD card. Before the Pi5 was released, mine ran on a Pi4 and that was good as well.
I first picked the lowest-spec Digital Ocean Droplet with 1 GB of memory at $7, but after some testing I discovered that it bogged down a bit doing historical sh/dx searches, as I have my search history set to one year. I tried with two cores and that was significantly better. I was given a referral link which gave me $200 of credit lasting two months, so I was able to try lots of different configurations at zero cost. If you want to have a play, please use my referral link, which will get you the same $200 credit: https://m.do.co/c/d94f86a3201c
At the weekend I changed the TTL on my main access URL to 60 seconds so that when I came to do the final migration, it would be with minimal downtime for my users.
The actual process was really straightforward. I picked Ubuntu 24.04 LTS as the operating system for the Droplet and used SV4FRI's install script to install DXSpider. I tested that for a couple of days, then cloned my existing /spider directory from the Pi to the Droplet, changed the callsign to -5 and again ran that for a few days.
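For anyone wondering where that callsign change happens: the node callsign is the $mycall variable in /spider/local/DXVars.pm. Here's a minimal sketch of the edit, with a temporary file standing in for the real one (the callsigns and the sed one-liner are just illustrations, not Keith's exact commands):

```shell
# A temporary file stands in for /spider/local/DXVars.pm
CONF=$(mktemp)
echo '$mycall = "G6NHU-2";' > "$CONF"

# Give the cloned test node a different SSID for the trial period
sed -i 's/G6NHU-2/G6NHU-5/' "$CONF"
cat "$CONF"
```

On the real file you'd make the same one-character change by hand or with sed, then restart the node.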
I was happy with that so on Monday evening, this was my process:
I stopped the spider service and renamed the /spider directory on the Droplet that I'd been testing.
I stopped the spider service on my Pi5 and started copying the /spider directory over to the Droplet.
While this was copying, I updated the A record on dxspider.co.uk and, for any historic users, I updated the A records on g6nhu.changeip.net and g6nhu.getmyip.com as well.
With the copy complete, I checked all the permissions on the Droplet /spider directory, enabled and started the spider service and rebooted the Droplet.
Within a couple of seconds, I had almost all my users back on again and then I remembered something. When I originally set up my node, it was using port 7373. When I built my first spider, I port forwarded 7300 and 7373 in my router to 7300 on the node so I quickly added port 7373 in the Droplet firewall, added another listener in /spider/local/Listeners.pm and restarted the node.
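For reference, that extra listener is a one-line addition to the @listen array in /spider/local/Listeners.pm. Something like the following (the bind addresses are the usual catch-all; check against the stock file shipped with your version, as the layout varies slightly):

```perl
# /spider/local/Listeners.pm -- each entry is [bind-address, port]
@listen = (
    ["0.0.0.0", 7300],   # original listener
    ["0.0.0.0", 7373],   # extra listener for users on the old port
);
```

Remember to open the same port in the Droplet firewall before restarting the node.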
And that was it. Total downtime was about five minutes, and because I'd set the TTL nice and low, everyone was straight back in.
The VPS I finally went with was a Basic Droplet with a Premium AMD CPU, 2 GiB RAM and 2 vCPUs. It comes with 60 GiB of storage and a total transfer allowance of 3 TB per month. This is far more storage and transfer than I'll ever need, but it's what came with the two cores I wanted. You can see from the attached screenshot how much bandwidth it's actually using; this was yesterday evening, so it'll be a bit higher at weekends but still nowhere near the limit. The cost for this Droplet is $21/month plus tax, so it works out at under £20/month. As I said above, if anyone wants to try this, please use my link for $200 credit: https://m.do.co/c/d94f86a3201c
My home internet is fibre to the premises (FTTP) running at 900 Mbps down and 110 Mbps up so the node barely used any traffic but the number of users I had clearly added some congestion to my network as I've noticed since moving it that the internet feels a lot faster. Previously, when going to web pages, there would be a couple of seconds delay between hitting enter and the page loading, as though it was slow doing a DNS lookup. That delay has now gone and everything is a lot snappier than it was before.
This post is just for info really, to describe the process I went through and to give information to anyone who might be considering something similar.
73 Keith.
-------------- next part --------------
A non-text attachment was scrubbed...
Name: bandwidth.jpeg
Type: image/jpeg
Size: 46578 bytes
Desc: not available
URL: <https://mailman.tobit.co.uk/pipermail/dxspider-support/attachments/20250416/3f29f434/attachment-0001.jpeg>