Echolink node on Raspberry Pi using SVXLink (part 3 – Metar station data)
I thought the idea of being able to get up-to-date Metar info over RF was a neat SVXLink feature. Unfortunately, I was unable to get it working as described. I believe the issue is that the NWS website now uses SSL encryption (HTTPS) and SVXLink hasn’t been updated to handle this. My workaround is to install the Apache web server on the Pi, then write a script that downloads the Metar station data from NWS over HTTPS and copies it to the local web server on the Pi, which serves it unencrypted over HTTP. Then we can simply point ModuleMetarInfo.conf at the loopback address and serve the Metar data from the Pi!
Let’s start by updating apt and installing apache web server.
sudo apt update
sudo apt install apache2 -y
Open a web browser on another device on the same network. In the address bar, type the IP address of the Raspberry Pi (e.g. http://192.168.1.100) and the Apache default web page should load.
Now recreate the NWS folder structure on the Pi. You can use any folder structure you want and modify the scripts accordingly.
sudo mkdir -p /var/www/html/data/observations/metar/stations
Now is a good time to look up the Airport ICAO codes you’d like to use if you don’t know them. Next let’s edit the ModuleMetarInfo.conf.
sudo nano /etc/svxlink/svxlink.d/ModuleMetarInfo.conf
My ModuleMetarInfo.conf
[ModuleMetarInfo]
NAME=MetarInfo
ID=5
TIMEOUT=120
TYPE=TXT
SERVER=127.0.0.1
LINK=data/observations/metar/stations
STARTDEFAULT=TIST
LONGMESSAGE=1
REMARKS=1
AIRPORTS=TIST,KMIA,KJFK,KORD,KDEN,KLAX,EGLL,CYUL
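With SERVER and LINK set as above, the module should end up requesting plain-HTTP URLs of the form http://127.0.0.1/data/observations/metar/stations/STATION.TXT, which is exactly where the scripts below place the files. A quick sketch of how the two settings combine (the loop is just illustration, not part of any config):

```shell
# Combine the SERVER and LINK values from ModuleMetarInfo.conf into the
# URLs the module fetches for each configured station
SERVER=127.0.0.1
LINK=data/observations/metar/stations
for STATION in TIST KJFK; do
    echo "http://${SERVER}/${LINK}/${STATION}.TXT"
done
```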
Next we need to write a few scripts. I store mine in /home/pi/Documents/scripts. First, let’s write a script to update our chosen Metar station data on boot. Create a new file with nano and copy over the script lines; be sure to change the airport codes to ones of your liking.
sudo nano /home/pi/Documents/scripts/MetarUpdateBoot.sh
#!/bin/bash
#Script will download chosen Metar station data and copy it to the Pi web server
#Wait for the Pi to finish booting and the network to come up
sleep 30
cd /home/pi/Downloads
for STATION in TIST KJFK KMIA KORD KDEN KLAX EGLL CYUL; do
    sudo wget https://tgftp.nws.noaa.gov/data/observations/metar/stations/${STATION}.TXT
    sudo mv "${STATION}.TXT" /var/www/html/data/observations/metar/stations
done
Use “ctrl + x” | “y” | Enter to save the file changes and exit nano. Now let’s set the script as executable.
sudo chmod +x /home/pi/Documents/scripts/MetarUpdateBoot.sh
Finally, let’s set up a crontab entry to run this script at boot. *NOTE* I added sleep 30 to the beginning of the script to make sure the Pi has booted and the network is connected before the downloads start.
sudo crontab -e
Add a line at the bottom of crontab to run the script on boot.
@reboot /home/pi/Documents/scripts/MetarUpdateBoot.sh
Use “ctrl + x” | “y” | Enter to save the changes and exit nano.
Reboot the Pi. After boot you should be able to navigate to the IP address of the Pi and see that the Metar station data has been copied to the Apache web server (e.g. http://192.168.1.100/data/observations/metar/stations/ ).
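If the station data doesn’t show up, a variant of the same crontab line that redirects the script’s output to a log file (the log path here is just an example) makes troubleshooting much easier:

```shell
@reboot /home/pi/Documents/scripts/MetarUpdateBoot.sh >> /home/pi/metar-boot.log 2>&1
```

Check the log after a reboot to see what wget reported.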
Metar data is updated hourly at approximately 55 minutes past the hour. Let’s write a second script that we can run hourly to automatically update the Metar data.
sudo nano /home/pi/Documents/scripts/MetarUpdateHourly.sh
#!/bin/bash
#Download the latest Metar station data and copy it to the Pi web server
cd /home/pi/Downloads
for STATION in TIST KJFK KMIA KORD KDEN KLAX EGLL CYUL; do
    sudo wget https://tgftp.nws.noaa.gov/data/observations/metar/stations/${STATION}.TXT
    sudo mv "${STATION}.TXT" /var/www/html/data/observations/metar/stations
done
Use “ctrl + x” | “y” | Enter to save the file changes and exit nano. Now let’s set the script as executable.
sudo chmod +x /home/pi/Documents/scripts/MetarUpdateHourly.sh
Finally we’ll create a crontab entry to run the script every hour on the hour.
sudo crontab -e
Add a line at the bottom of crontab to run the script every hour on the hour.
0 0-23 * * * /home/pi/Documents/scripts/MetarUpdateHourly.sh
Use “ctrl + x” | “y” | Enter to save the changes and exit nano.
The Pi will now automatically download Metar station data every hour on the hour.
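A note on the schedule: 0 0-23 * * * fires at minute 0 of every hour (0 * * * * would do the same thing). Since the Metar data is published around 55 minutes past the hour, you could also shift the job a few minutes past the top of the hour to give slow-to-post stations some margin, for example:

```shell
5 * * * * /home/pi/Documents/scripts/MetarUpdateHourly.sh
```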
You may be thinking this is great, but I want all the Metar station data, not just a few select stations. No problem, you just need another script. *NOTE* Your mileage may vary; in my testing it took the Pi approximately 15 minutes to download the roughly 5MB of Metar station data. Create the new script file:
sudo nano /home/pi/Documents/scripts/MetarUpdateAll.sh
#!/bin/bash
#Script to update all Metar station info
#Run as a cron job every hour on the hour to update
#cd to the Downloads folder and download Metar station data from NWS
cd /home/pi/Downloads
sudo wget -r -l1 --no-parent https://tgftp.nws.noaa.gov/data/observations/metar/stations/
#cd to the apache folder and delete the existing Metar data
cd /var/www/html/data/observations/metar/stations
sudo rm *
#cd to the downloaded folder and move the Metar data to the web server
cd /home/pi/Downloads/tgftp.nws.noaa.gov/data/observations/metar/stations
sudo mv * /var/www/html/data/observations/metar/stations
#delete the NWS download folder after moving the Metar data to the web server
cd /home/pi/Downloads
sudo rm -rf tgftp.nws.noaa.gov
Use “ctrl + x” | “y” | Enter to save the file changes and exit nano. Now let’s set the script as executable.
sudo chmod +x /home/pi/Documents/scripts/MetarUpdateAll.sh
Finally we add another crontab to call this script every hour.
sudo crontab -e
Add a line at the bottom of crontab to run the script every hour on the hour.
0 0-23 * * * /home/pi/Documents/scripts/MetarUpdateAll.sh
Use “ctrl + x” | “y” | Enter to save the changes and exit nano.
The Pi will now automatically download ALL Metar station data every hour on the hour.
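One caveat with the script above: between sudo rm * and the mv, the stations directory is briefly empty, so a Metar request that lands at exactly the wrong moment would find nothing. A minimal sketch of a “stage then swap” alternative, using temporary stand-in directories (the paths here are illustrative; in the real script WEBROOT would be /var/www/html/data/observations/metar/stations and STAGING the freshly downloaded stations folder):

```shell
#!/bin/bash
# Demonstrate the stage-then-swap pattern with stand-in directories
set -e
BASE=$(mktemp -d)
WEBROOT="$BASE/stations"
STAGING="$BASE/stations.new"
mkdir -p "$WEBROOT" "$STAGING"
echo "old observation" > "$WEBROOT/TIST.TXT"
echo "new observation" > "$STAGING/TIST.TXT"   # in the real script, filled by wget
# Swap whole directories instead of deleting the served files in place
mv "$WEBROOT" "$WEBROOT.old"
mv "$STAGING" "$WEBROOT"
rm -rf "$WEBROOT.old"
cat "$WEBROOT/TIST.TXT"
```

The window without data shrinks to the instant between the two mv calls, and a failed download never wipes the served copy, because the swap only happens after the staging directory is fully populated.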
The last thing we need to do is enable the MetarInfo module in svxlink.conf and reboot the Pi. Edit the svxlink.conf file.
sudo nano /etc/svxlink/svxlink.conf
Under the [Simplex] section, add ModuleMetarInfo to the end of the MODULES= line.
MODULES=ModuleHelp,ModuleParrot,ModuleEchoLink,ModuleTclVoiceMail,ModuleMetarInfo
Use “ctrl + x” | “y” | Enter to save the changes and exit nano. Now reboot the Pi.
sudo reboot