[Dxspider-support] DXSpider crontabs in general

James Snider jim at lexiann.com
Fri Jan 1 22:36:21 CET 2021


That got it.  I was logging in as root and not as sysop.  My bad.


James Snider
jim at lexiann.com



> On Jan 1, 2021, at 1:39 PM, Niels via Dxspider-support <dxspider-support at tobit.co.uk> wrote:
> 
> Hi Jim,
> 
> And try this one (as sysop):
> 
> cd /tmp && wget -qN ftp://ftp.w1nr.net/usdbraw.gz && /spider/perl/create_usdb.pl usdbraw.gz
> load/usdb
> 
> PD9Q de PI1LAP-1  1-Jan-2021 1836Z dxspider >
> load/usdb
> US Database loaded
> PD9Q de PI1LAP-1  1-Jan-2021 1836Z dxspider >
> 
> The crontab entries just look like this:
> 
> # Update USDB
> 0 2 * * 1,3,5 spawn("cd /tmp && wget -qN ftp://ftp.w1nr.net/usdbraw.gz && /spider/perl/create_usdb.pl usdbraw.gz")
> 5 2 * * 1,3,5 run_cmd("load/usdb")
> 10 2 * * 1,3,5 spawn("rm /tmp/usdbraw.gz")
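[Editor's aside for anyone copying these one-liners: in a shell, each side of a pipe (`|`) runs in its own subshell, so a command like `cd /tmp | wget ...` never actually changes wget's working directory, and a single `&` merely backgrounds the left-hand command instead of sequencing it. Chaining with `&&` is the form that reliably runs the steps in order. A small demonstration, independent of DXSpider:]

```shell
#!/bin/sh
cd /                       # start from a known directory
piped=$(cd /tmp | pwd)     # pipe: both sides run in subshells, the cd is lost
chained=$(cd /tmp && pwd)  # &&: run sequentially in one shell, the cd sticks
echo "piped:   $piped"     # prints: piped:   /
echo "chained: $chained"   # prints: chained: /tmp
```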
> 
> 
> 73 Niels PD9Q
> 
> 
> 
> 
> Op Fri, 1 Jan 2021 08:20:11 -0500
> James Snider via Dxspider-support <dxspider-support at tobit.co.uk>
> schreef:
> 
>> Not sure if usdb was working before.  I’m just running a standard
>> install on a Raspberry Pi.
>> 
>> I tried your commands but I get the same results.  When we do the
>> create_usdb.pl what is the name of the file that gets created and
>> where does it reside?  I want to make sure that it’s getting created.
>> 
>> Jim W8BS
>> 
>> James Snider
>> jim at lexiann.com
>> 
>> 
>> 
>>> On Dec 26, 2020, at 12:02 PM, Stephen Carroll
>>> <scarroll659 at gmail.com> wrote:
>>> 
>>> Jim,
>>> 
>>> I'm assuming you've had the USDB running previously and are just
>>> updating?
>>> 
>>> If you are manually executing commands in the terminal window,
>>> here's what I've been using for the past year:
>>> 
>>> cd /spider/data
>>> 
>>> wget -qN ftp://ftp.w1nr.net/usdbraw.gz
>>> 
>>> /spider/perl/create_usdb.pl usdbraw.gz
>>> 
>>> rm usdbraw.gz
>>> 
>>> Then login and type: load/usdb
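[Editor's aside: those manual steps can be collected into a small wrapper script so none of them is forgotten between updates. This is only a sketch: the URL and the create_usdb.pl path come from the steps above, while the script name and the fallback behaviour are made up here. When /spider/data is missing (e.g. on a test machine) it just prints the steps instead of running them:]

```shell
#!/bin/sh
# update_usdb.sh -- hypothetical wrapper for the manual USDB update steps.
URL=ftp://ftp.w1nr.net/usdbraw.gz

if [ -d /spider/data ]; then
    cd /spider/data &&
    wget -qN "$URL" &&
    /spider/perl/create_usdb.pl usdbraw.gz &&
    rm usdbraw.gz &&
    echo "done; now log in to the node and run: load/usdb"
else
    # Dry run on machines without a DXSpider install:
    echo "would run: cd /spider/data"
    echo "would run: wget -qN $URL"
    echo "would run: /spider/perl/create_usdb.pl usdbraw.gz"
    echo "would run: rm usdbraw.gz"
fi
```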
>>> 
>>> I'm somewhat new to linux and currently working on automating this
>>> process in crontab with the help of this listserver.
>>> 
>>> 73, Steve - AA4U
>>> 
>>> 
>>> 
>>> On 12/26/2020 6:27 AM, James Snider via Dxspider-support wrote:  
>>>> Was playing with some of these commands this morning and was
>>>> trying to update the usdb files so I did the following
>>>> 
>>>> 
>>>> cd /spider
>>>> wget ftp://ftp.w1nr.net/usdbraw.gz
>>>> 
>>>> Then I executed this
>>>> 
>>>> /spider/perl/create_usdb.pl usdbraw.gz
>>>> 
>>>> It came back with this
>>>> 
>>>> 1462465 records
>>>> 
>>>> 
>>>> But when I log in and do load/usdb I get this
>>>> 
>>>> 
>>>> W8BS de W8BS-2 26-Dec-2020 1220Z dxspider >
>>>> load/usdb
>>>> US Database not loaded
>>>> W8BS de W8BS-2 26-Dec-2020 1221Z dxspider >
>>>> 
>>>> 
>>>> What am I missing here??
>>>> 
>>>> Jim W8BS
>>>> 
>>>> 
>>>> James Snider
>>>> jim at lexiann.com
>>>> 
>>>> 
>>>> 
>>>>> On Dec 24, 2020, at 2:21 AM, David Spoelstra via Dxspider-support
>>>>> <dxspider-support at tobit.co.uk> wrote:
>>>>> 
>>>>> Since we're sharing, I thought I'd give everyone my entire
>>>>> DXSpider crontab. Maybe I can help others and maybe someone has a
>>>>> better way of doing something than I do. BTW, mine has been
>>>>> running flawlessly for years. -David, N9KT
>>>>> 
>>>>> # /spider/local_cmd/crontab
>>>>> # MAKE SURE /spider/cmd_import EXISTS!
>>>>> # min, hour, day-of-month, month, day-of-week (0=Sun)
>>>>> # TIME IS UTC!
>>>>> 
>>>>> # Connect to other nodes
>>>>> * * * * * start_connect('w9pa') unless connected('w9pa')
>>>>> * * * * * start_connect('wb3ffv') unless connected('wb3ffv')
>>>>> * * * * * start_connect('ky4xx') unless connected('ky4xx')
>>>>> 
>>>>> # Monday 1am local - Get latest FCC data (W1NR updates Sundays)
>>>>> 0  5 * * 1 spawn("cd /tmp && wget -qN ftp://ftp.w1nr.net/usdbraw.gz && /spider/perl/create_usdb.pl usdbraw.gz")
>>>>> 15 5 * * 1 run_cmd("load/usdb")
>>>>> 30 5 * * 1 spawn("rm /tmp/usdbraw.gz")
>>>>> 
>>>>> # Friday 1am local - Get latest Keps to be ready for any contest
>>>>> 0  5 * * 5 spawn("cd /tmp && wget -qN http://www.amsat.org/amsat/ftp/keps/current/nasabare.txt && /spider/perl/convkeps.pl -p nasabare.txt")
>>>>> 15 5 * * 5 run_cmd("load/keps")
>>>>> 30 5 * * 5 spawn("rm /tmp/nasabare.txt")
>>>>> 
>>>>> # Friday 2am local - Get latest cty.dat to be ready for any contest
>>>>> 0  6 * * 5 spawn("cd /spider/data && wget -qN http://www.country-files.com/cty/cty.dat")
>>>>> 15 6 * * 5 spawn("cd /spider/data && wget -qN http://www.country-files.com/cty/wpxloc.raw")
>>>>> 30 6 * * 5 spawn("cd /spider/data && /spider/perl/create_prefix.pl")
>>>>> 45 6 * * 5 run_cmd("load/prefixes")
>>>>> 
>>>>> # Friday 3am local - Update DXSpider to be ready for any contest
>>>>> 0  7 * * 5 spawn("cd /spider && git reset --hard && git pull")
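[Editor's aside on that last entry: `git reset --hard` throws away any local edits under /spider before pulling, which is what lets an unattended pull run for years without hitting a merge conflict. An offline demonstration of that behaviour in a throwaway temp repository (nothing here touches /spider):]

```shell
#!/bin/sh
set -e
repo=$(mktemp -d)          # throwaway repo, removed at the end
cd "$repo"
git init -q .
echo "local edit" > README
git add README
git -c user.email=demo@example.com -c user.name=demo \
    commit -q -m "add README"
echo "uncommitted change" >> README
git reset --hard -q        # discards the uncommitted change
content=$(cat README)
echo "$content"            # prints: local edit
cd /
rm -rf "$repo"
```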
>>>>> 
>>>>> _______________________________________________
>>>>> Dxspider-support mailing list
>>>>> Dxspider-support at tobit.co.uk
>>>>> https://mailman.tobit.co.uk/mailman/listinfo/dxspider-support
>>>> 
>>>> 
>>>> 
>> 
> 
> 

