<html>
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8">
</head>
<body>
<p>I would advise you to look at the great work that Yiannis, SV5FRI,
has done:<br>
<br>
<b><a class="moz-txt-link-freetext" href="https://www.sv5fri.eu/?tag=dxspider">https://www.sv5fri.eu/?tag=dxspider</a></b><br>
<br>
It will install the necessary Perl modules for you.<br>
<br>
The other option is to follow what Dirk put in: <b>/spider/UPGRADE.mojo</b>:<br>
<br>
<br>
9th July 2020<br>
-------------<br>
<br>
These are the notes for upgrading to the mojo branch. PLEASE NOTE<br>
THERE HAVE BEEN CHANGES FOR ALL MOJO BRANCH USERS. See APPENDIX(i) at<br>
the end of this document.<br>
<br>
There is NO POINT in doing this at the moment unless you are
running a<br>
node with many (>50) users. It is the future, but at the moment
I am<br>
testing larger and larger installations to check that it a) still<br>
works as people imagine it should and b) it provides the
improvement<br>
in scaling that I am anticipating. There are no significant new<br>
features - yet.<br>
<br>
The BIG TICKET ITEM in this branch is that (potentially) "long
lived"<br>
commands such as sh/dx and commands that poll external internet<br>
resources now don't halt the flow of data through the node. I am
also<br>
using a modern, event driven, web socket "manager" called
Mojolicious<br>
which is considerably more efficient than what went before (but is
not<br>
necessary for small nodes). There are some 200-400 user nodes out<br>
there that will definitely see the difference in terms of both CPU<br>
usage and general responsiveness. Using Mojolicious also brings
the<br>
tantalising possibility of grafting on a web frontend, as it were,
to<br>
the "side" of a DXSpider node. But serious work on this won't
start<br>
until we have a stable base to work on. Apart from anything else
there<br>
will, almost certainly, need to be some internal data structure<br>
reorganisation before a decent web frontend could be constructed.<br>
<br>
*IMPORTANT* There is an action needed to go from mojo build 228 and<br>
below. See items marked *IMPORTANT* below.<br>
<br>
Upgrading is not for the faint of heart. There is no installation<br>
script (but there will be) so, for the time being, you need to do
some<br>
manual editing. Also, while there is a backward path, it will
involve<br>
moving various files from their new home (/spider/local_data),
back to<br>
where they came from (/spider/data).<br>
<br>
Prerequisites:<br>
<br>
A supply of good, strong tea - preferably in pint mugs. A
tin hat,<br>
stout boots, a rucksack with survival rations and a decent
miners'<br>
lamp might also prove comforting. I enclose this link:<br>
<a class="moz-txt-link-freetext" href="http://www.noswearing.com/dictionary">http://www.noswearing.com/dictionary</a> in case you run out
of swear<br>
words.<br>
<br>
An installed and known working git based installation.
Mojo is not<br>
supported under CVS or installation from a tarball.<br>
<br>
perl 5.10.1, preferably 5.14.1 or greater. This basically
means<br>
running ubuntu 12.04 or later (or one of the other linux
distros<br>
of similar age or later). The install instructions are for
debian<br>
based systems. IT WILL NOT WORK WITHOUT A "MODERN" PERL.
Yes, you<br>
can use bleadperl if you know how to use it and can get it to run<br>
the node under it as a daemon without resorting to the handy URL<br>
supplied above. Personally, I wouldn't bother. It's easier and<br>
quicker just to upgrade your linux distro. Apart from anything<br>
else, things like ssh and ntpd are broken on ALL older systems and<br>
will allow the ungodly in more easily than something modern.<br>
<br>
Install cpanminus:<br>
<br>
sudo apt-get install cpanminus<br>
or<br>
wget -O - <a class="moz-txt-link-freetext" href="https://cpanmin.us">https://cpanmin.us</a> | perl - --sudo App::cpanminus<br>
or<br>
sudo apt-get install curl<br>
curl -L <a class="moz-txt-link-freetext" href="https://cpanmin.us">https://cpanmin.us</a> | perl - --sudo App::cpanminus<br>
<br>
You will need the following CPAN packages:<br>
<br>
If you are on a Debian based system (Devuan, Ubuntu, Mint
etc)<br>
that is reasonably new (I use Ubuntu 18.04 and Debian 10)
then you<br>
can simply do:<br>
<br>
sudo apt-get install libev-perl libmojolicious-perl
libjson-perl libjson-xs-perl libdata-structure-util-perl
libmath-round-perl<br>
<br>
or on Redhat based systems you can install the very similarly (but<br>
not identically) named packages. I don't know the exact names but<br>
using anything less than Centos 7 is likely to cause a world of<br>
pain. Also I doubt that EV and Mojolicious are packaged for Centos<br>
at all.<br>
<br>
If in doubt, or if it is taking too long to find the packages, you<br>
should build from CPAN. Note: you may need to install the build<br>
essential packages to build some of these. At the very least you<br>
will need to install 'make' (sudo apt-get install make) or just<br>
get everything you are likely to need with:<br>
<br>
sudo apt-get install build-essential<br>
<br>
sudo cpanm EV Mojolicious JSON JSON::XS Data::Structure::Util Math::Round<br>
<br>
# just in case it's missing (top, that is)<br>
sudo apt-get install procps<br>
<br>
Please make sure, if you insist on using operating system<br>
packages, that your Mojolicious is at least version<br>
7.26. Mojo::IOLoop::ForkCall is NO LONGER IN USE! The current version<br>
at time of writing is 8.36.<br>
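<br>
One way to check which Mojolicious you actually have (an editor's suggestion, not from the notes): 'use Mojolicious 7.26' dies at compile time if the module is missing or older than 7.26, so the fallback message only appears when the minimum isn't met:<br>

```shell
# Prints the installed version if it meets the 7.26 minimum,
# otherwise falls through to the message after '||'
perl -e 'use Mojolicious 7.26; print "Mojolicious ", Mojolicious->VERSION, " OK\n"' \
    2>/dev/null || echo "Mojolicious missing or older than 7.26"
```

<br>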
<br>
Login as the sysop user.<br>
<br>
Edit your /spider/local/DXVars.pm so that the bottom of the file
is<br>
changed from something like:<br>
<br>
---- old ----<br>
<br>
# the port number of the cluster (just leave this, unless
it REALLY matters to you)<br>
$clusterport = 27754;<br>
<br>
# your favorite way to say 'Yes'<br>
$yes = 'Yes';<br>
<br>
# your favorite way to say 'No'<br>
$no = 'No';<br>
<br>
# the interval between unsolicited prompts if no traffic<br>
$user_interval = 11*60;<br>
<br>
# data files live in<br>
$data = "$root/data";<br>
<br>
# system files live in<br>
$system = "$root/sys";<br>
<br>
# command files live in<br>
$cmd = "$root/cmd";<br>
<br>
# local command files live in (and override $cmd)<br>
$localcmd = "$root/local_cmd";<br>
<br>
# where the user data lives<br>
$userfn = "$data/users";<br>
<br>
# the "message of the day" file<br>
$motd = "$data/motd";<br>
<br>
# are we debugging ?<br>
@debug = qw(chan state msg cron );<br>
<br>
---- to this: ----<br>
<br>
# the port number of the cluster (just leave this, unless
it REALLY matters to you)<br>
$clusterport = 27754;<br>
<br>
# your favorite way to say 'Yes'<br>
$yes = 'Yes';<br>
<br>
# your favorite way to say 'No'<br>
$no = 'No';<br>
<br>
# this is where the paths used to be which you have just
removed<br>
<br>
# are we debugging ?<br>
@debug = qw(chan state msg cron );<br>
<br>
---- new ------<br>
<br>
There may be other stuff after this in DXVars.pm; that doesn't<br>
matter. The point is to remove all the path definitions in<br>
DXVars.pm. If this isn't clear to you then it would be better if you<br>
asked on dxspider-support for help before attempting to go any<br>
further.<br>
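<br>
Before going any further it is worth checking that your edit left DXVars.pm compilable (an editor's suggestion; "syntax OK" is what you want to see):<br>

```shell
# perl -c compiles the file without running it; any typo introduced by
# the edit above shows up here rather than at node startup
perl -c /spider/local/DXVars.pm 2>&1 || echo "(DXVars.pm missing or broken on this machine)"
```

<br>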
<br>
One of the things that will happen is that several files currently
in<br>
/spider/data will be placed in /spider/local_data. These include
the<br>
user, qsl and usdb data files, the band and prefix files, and
various<br>
"bad" data files. I.e. everything that is modified from the base
git<br>
distribution.<br>
<br>
Now run the console program or telnet localhost and login as the
sysop<br>
user.<br>
<br>
export_users<br>
bye<br>
<br>
as the sysop user:<br>
<br>
sudo service dxspider stop<br>
or<br>
sudo systemctl stop dxspider<br>
<br>
having stopped the node:<br>
<br>
mkdir /spider/local_data<br>
git reset --hard<br>
git pull --all<br>
git checkout --track -b mojo origin/mojo<br>
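<br>
To confirm the checkout worked (an editor's addition; run from /spider), the current branch should now be 'mojo'. 'git branch --show-current' needs git 2.22 or later; on older gits use 'git rev-parse --abbrev-ref HEAD' instead:<br>

```shell
# Guarded so it only reports when /spider exists on this machine
if cd /spider 2>/dev/null; then
    git branch --show-current    # should print: mojo
    git log -1 --oneline         # the HEAD revision you are now running
else
    echo "(/spider not present here; example only)"
fi
```

<br>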
<br>
if you have not already done this:<br>
<br>
sudo ln -s /spider/perl/console.pl /usr/local/bin/dx<br>
sudo ln -s /spider/perl/*dbg /usr/local/bin<br>
<br>
Now in another window run:<br>
<br>
watchdbg<br>
<br>
and finally:<br>
<br>
sudo service dxspider start<br>
or<br>
sudo systemctl start dxspider<br>
<br>
You should be aware that this code base is now under active<br>
development and, if you do a 'git pull', what you get may be<br>
broken. But, if this does happen, the likelihood is that I am
actively<br>
working on the codebase and any brokenness may be fixed (maybe in<br>
minutes) with another 'git pull'.<br>
<br>
I try very hard not to leave it in a broken state...<br>
<br>
Dirk G1TLH<br>
<br>
APPENDIX(i)<br>
<br>
Before shutting down to do the update, do a 'sh/ver' and take note of<br>
the current git revision number (the hex string after "git: mojo/" and<br>
the "[r]"). Also do an 'export_users' (belt and braces).<br>
<br>
With this revision of the code, the users.v3 file will be replaced<br>
with users.v3j. On restarting the node, the users.v3j file will
be<br>
generated from the users.v3 file. The users.v3 file is not
changed.<br>
The process of generation will take up to 30 seconds depending on the<br>
number of users in your file, the speed of your disk(s) and the CPU<br>
speed (probably in that order). On my machine, it takes about 5<br>
seconds; on an RPi??? This is a reversible change. Simply check out the<br>
revision you noted down before ("git checkout &lt;revision&gt;") and email<br>
me should anything go wrong.<br>
<br>
Part of this process may clear out some old records or suggest that<br>
there might be errors. DO NOT BE ALARMED. This is completely normal.<br>
<br>
Not only should this change make the rebuilding of the users file<br>
(much) less likely, but tests suggest that access to the users file is<br>
about 2.5 times quicker. How much difference this makes in practice<br>
remains to be seen.<br>
<br>
When you have done this, in another shell, run<br>
/spider/perl/create_dxsql.pl. This will convert the DXQSL system to<br>
dxqsl.v1j (for the sh/dxqsl &lt;call&gt; command). When this is finished,<br>
run 'load/dxqsl' in a console (or restart the node, but it isn't<br>
necessary).<br>
<br>
This has been done to remove Storable - completely - from active use<br>
in DXSpider. I have started to get more reports of user file<br>
corruptions in the last year than I ever saw in the previous 10. One<br>
advantage of this change is that user file access is now 2.5 times<br>
faster, so things like 'export_users' should not stop the node for<br>
anything like as long as the old version did.<br>
<br>
On the subject of export_users: once you are happy with the stability<br>
of the new version, you can clean out all your user_asc.* files (I'd<br>
keep the 'user_asc' that you just created for emergencies). The modern<br>
equivalent of this file is now called 'user_json' and can be used in<br>
exactly the same way as 'user_asc' to restore the users.v3j file (stop<br>
the node; cd /spider/local_data; perl user_json; start the node).<br>
<br>
GL<br>
<br>
73 de Kin EA3CV<br>
<br>
<br>
</p>
<div class="moz-cite-prefix">El 24/11/2022 a las 9:51, Jan via
Dxspider-support escribió:<br>
</div>
<blockquote type="cite"
cite="mid:ae86c5d4-58dc-9946-1b41-eb4a00bab23f@pa4jj.nl"><font
size="4"><font face="Arial">Maybe a silly question but<br>
how / where do I update the mojo branch?</font></font></blockquote>
</body>
</html>