[Dxspider-support] Automating The Process

Anthony (N2KI) n2ki at amsat.org
Thu Apr 5 17:04:49 CEST 2007


Hello Sysops,

 

A number of folks have inquired about updating Keplerian data, and other files such as Jim's (AD1C) country files and K1XX's usdbraw file, on a Windows setup.  Generally what I do is create something called a "batch" file.  This runs completely separately from the Spider software.  Within the batch file is where you tell your computer what you want it to do.  Combine this batch file with a program called WGET
(http://pages.interlog.com/~tcharron/wgetwin.html) and you will be able to automate the process of updating your Keplerian data, country files, or anything else for that matter.  The interface to DXSpider is provided by the "crontab" file located in the directory \spider\local_cmd.

 

The Batch File

 

Create your batch file by opening "Notepad.exe". In this example, the Keplerian data will be retrieved and converted.  Keep in mind that the paths you see are specific to my machine.  If you are running Spider using the default paths then these should work for you too, provided that AMSAT does not change the URL.  When you save the file, name it whatever you want and save it wherever you want.  I save mine to c:\.  Notepad will save it with a .txt extension (e.g. kepupdate.txt).  Right-click on the file, choose "Rename", and rename the file "kepupdate.bat".  If you do not change the file extension to BAT, the computer will not know this is a batch file.  Here is the code I use within the batch file.

 

cd c:\spider\perl
del nasa.all

cd c:\spider\perl
wget http://www.amsat.org/amsat/ftp/keps/current/nasa.all

cd c:\spider\perl
perl convkeps.pl -c nasa.all

 

The first two lines tell the computer to work in the c:\spider\perl directory and delete the file called "nasa.all".  Once that is done, it moves on to the next command line, which tells it to keep working in the same directory path.  Notice the WGET command that follows.  This is the WGET program (freeware), which must reside in the directory you are working in; in this case that is the perl folder, located via the path c:\spider\perl.  Using WGET, the computer goes to the URL "http://www.amsat.org/amsat/ftp/keps/current/nasa.all", fetches the file "nasa.all", and downloads it into the directory it is currently working in (c:\spider\perl).  Once this is complete, the computer executes the next command line in the batch file.  In this case it runs the Perl script that converts "nasa.all" into the file DXSpider reads for its Keplerian data.  In essence, what the batch file accomplishes is to delete a specific file, retrieve a fresh copy from the internet, and convert it into a file that DXSpider uses.  This all happens in about two seconds, of course.  If you wanted to tailor this batch file to fetch AD1C's country files instead, just change the URL to
http://www.country-files.com/cty/cty.dat.  The other commands would also change and look like this.

 

cd c:\spider\data
del cty.dat

cd c:\spider\data
wget http://www.country-files.com/cty/cty.dat

cd c:\spider\perl
perl create_prefix.pl
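
If you would rather have one batch file handle both updates in a single pass, a minimal sketch might look like the following.  It assumes the same default paths as above, a copy of wget.exe in each working folder, and a made-up file name of "update_all.bat"; adjust it to suit your own setup.

rem update_all.bat - refresh the keps and the country files in one pass (sketch)
rem --- Keplerian data ---
cd c:\spider\perl
del nasa.all
wget http://www.amsat.org/amsat/ftp/keps/current/nasa.all
perl convkeps.pl -c nasa.all
rem --- AD1C country files ---
cd c:\spider\data
del cty.dat
wget http://www.country-files.com/cty/cty.dat
cd c:\spider\perl
perl create_prefix.pl

Schedule that one file in the Task Scheduler and both crontab jobs described below can then pick up the fresh data.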

 

 

 

The Crontab File

 

In order to make this batch file execute automatically, I use the Windows Task Scheduler and assign a time that I want this to occur.  Tailor it to your own needs.  To complete the update process, you now have to visit the "crontab" file located in the directory \spider\local_cmd.  It is within this file that you can have DXSpider run specific jobs for itself.  On my system, the batch file is scheduled to run a few minutes before the crontab job.  The crontab job for "load/keps" is done this way.

 

1 21 * * 0,2,4 run_cmd('load/keps')

 

What this does is run the job "load/keps" at 1 minute after 2100 hrs on Sunday (0), Tuesday (2) and Thursday (4).  It is overkill, I know; I just haven't gotten around to changing it yet, but it works.  To create another job to load the new prefixes when AD1C issues an update, use the same format but change the time or day that you want it done, such as:

 

0 18 25 1,3,5,7,9,11 * run_cmd('load/prefix')

 

This will run the job at 1800 hrs on the 25th day of January, March, May, July, September, and November; in other words, every other month on the 25th.  If you are wondering about the "*" characters and what each field represents, here is a breakdown.

 

 


* * * * *

- 1st field: Minute (0-59)

- 2nd field: Hour (0-23)

- 3rd field: Day of month (1-31)

- 4th field: Month (1-12)

- 5th field: Day of week (0-6, where 0 is Sunday)

 

Ranges of numbers are allowed. Ranges are two numbers separated with a
hyphen. The specified range is inclusive. For example, 8-11 for an hours
entry specifies execution at hours 8, 9, 10 and 11. 

Lists are allowed. A list is a set of numbers (or ranges) separated by
commas. Examples: 1,2,5,9 or 0-3,5,8-12.
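
As a purely illustrative example (not a schedule I actually use), a crontab line combining a range and a list could look like this:

0 8-11 * * 1,3,5 run_cmd('load/keps')

That would run "load/keps" on the hour at 0800, 0900, 1000 and 1100 on Mondays, Wednesdays and Fridays.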

 

 

In closing, here are some important points to keep in mind.

 

- WGET needs to be in the folder where the batch file will be doing its work.  I do not know how else to do it, as I am not that computer savvy (putting wget.exe somewhere on your PATH should also work).

- Make certain that you have the correct URL for the file you want WGET to fetch.

- Synchronize the Windows Task Scheduler with the crontab so things are done in the proper sequence (see the schtasks sketch below).

- Pay close attention to how you set up the crontab job so that it is set correctly.

- Remember to change the extension from TXT to BAT or the batch file will not run.
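
For the Task Scheduler side, the same scheduling can also be set up from a command prompt with schtasks.exe instead of the graphical tool.  As a rough sketch only: the task name "KepUpdate" and the 2055 start time are just examples chosen to run a few minutes before the 2101 crontab job, and the exact time format accepted by /st differs between Windows versions.

schtasks /create /tn "KepUpdate" /tr "c:\kepupdate.bat" /sc weekly /d SUN,TUE,THU /st 20:55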

 

I hope this helps some SYSOPS to better automate some of the processes.  If
anyone cares to add comments to the procedures contained here to enable the
batch to run better or do multiple things, please do so.  I am always
willing to learn a better way.  Share the knowledge.

 

 

 

Anthony - N2KI

 

 

 
