Utah Radio

Thursday, October 10, 2013

I have a few scanners feeding online streams such as LiveATC.net. Some of these setups give me no way to listen to a scanner directly in my shack without pulling up its feed online. To solve this, I bought a Behringer MicroMix MX400 mixer, which has four inputs and one output. With audio splitters on the scanners, it was an inexpensive way to listen to the scanner audio directly without disconnecting the online feeds.
Wednesday, October 2, 2013
Linux Bash Script to get a TAF Forecast From the Console
Usage:
taf [ICAO STATION ID]
Example: $taf kslc
Save the following in a file called taf:
#!/bin/bash
# Fetch and display the current TAF for the station given on the command line.
station=$(echo "$1" | tr '[:lower:]' '[:upper:]')
wget -q -O /home/$USER/taftmp http://weather.noaa.gov/pub/data/forecasts/taf/stations/$station.TXT
echo " "
more /home/$USER/taftmp
echo " "
rm /home/$USER/taftmp
Python Script to Update Icecast Stream with METAR and TAF
Here are two scripts to retrieve airport METAR and TAF data from the National Weather Service website and then send this data to an Icecast server to update stream metadata.
You can get the scripts here (use wget from a Linux command line):
http://www.k7bbr.net/files/imetarkslcExample.py
http://www.k7bbr.net/files/itafkslcExample.py

Save them with whatever names you'd like (probably imetar[AirportName].py). In each script, enter the server/stream information and modify the script to download data for your desired station. Then run a cron job every hour (or whatever interval you choose) to update your stream metadata with the airport METAR and TAF data.
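As a rough illustration of the idea, here is a minimal, hypothetical sketch - not the actual imetar/itaf code. It downloads the current METAR text for a station and pushes it into the stream metadata through Icecast's admin interface. The station, server address, mountpoint, and credentials are placeholders to replace with your own settings, and the METAR URL is assumed by analogy with the TAF address used in the bash script above.

#!/usr/bin/env python
# Hypothetical sketch of the METAR-to-Icecast-metadata idea -- not the actual
# imetar/itaf scripts. Station, server, and credentials are placeholders.
import requests

STATION = "KSLC"
METAR_URL = ("http://weather.noaa.gov/pub/data/observations/metar/stations/"
             + STATION + ".TXT")                 # assumed NWS observations path
ICECAST = "http://192.168.1.100:8000"            # your Icecast server (and port)
MOUNT = "/mymountpoint"                          # your mountpoint, with leading '/'
AUTH = ("source", "hackme")                      # a user allowed to update metadata

# The NWS text product is a date line followed by the raw METAR text.
metar = requests.get(METAR_URL, timeout=10).text.splitlines()[-1].strip()

# Icecast accepts metadata updates through its admin interface.
r = requests.get(ICECAST + "/admin/metadata",
                 params={"mount": MOUNT, "mode": "updinfo", "song": metar},
                 auth=AUTH, timeout=10)
print("Icecast update:", "OK" if r.ok else r.status_code)

Run from cron every hour, something along these lines keeps the feed's "now playing" text current with the latest observation; the full scripts do the same for both the METAR and the TAF.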
Saturday, August 17, 2013
Arduino Maxtroller Case
Finally found the time to put together a box for the two Arduinos controlling my Motorola Maxtracs for monitoring the local 800 MHz radio system. Here are a couple of photos of the project:
Thursday, June 13, 2013
Using Python to Update Icecast Scanner Audio Feeds with a Raspberry Pi
The Python scripts below add the ability to update Icecast feed metadata from a Raspberry Pi. They are compatible with Uniden scanners including:
BCT8
BCT15
BCT15X
BCD996T
BCD996XT
BC346T
BC346XT
BCD396T
BCD396XT
Functionality could be added for other Uniden scanners such as the 780XLT, 785D, 796D, and 898T, but as of now I don't have access to any of these models for testing.
To implement the script, download it by right-clicking the appropriate link below and choosing "Save As". You can do this from the Raspberry Pi, or from another computer and then use tools such as ssh and scp to copy the file to your desired location on the Raspberry Pi.
Uniden MetaPy.py Script - For all models except the BCT8
Uniden MetaPy8.py Script - For the BCT8 model
Or you can use the wget command from your Raspberry Pi's console to download and save the file:

wget -O ~/metaPy.py http://www.k7bbr.net/files/metaPy.py

or for the BCT8:

wget -O ~/metaPy8.py http://www.k7bbr.net/files/metaPy8.py
(The -O option allows you to give the file location and filename for the downloaded file - in this case it's saved in your home directory under the name metaPy.py or metaPy8.py.)

Once downloaded, use a text editor such as nano or vi to change the configuration section at the top of the script to match your settings:

'''-----------------USER CONFIGURATION-----------------'''
port = "/dev/ttyUSB0"                        #enter scanner USB/serial port in quotes here
baudrate = 115200                            #enter scanner baudrate here
icecastUser = "username"                     #enter icecast username in quotes here (for RR feed use "source")
icecastPass = "hackme"                       #enter icecast password in quotes here
icecastServerAddress = "192.168.1.100:8000"  #enter icecast server IP Address (and port if necessary) here
icecastMountpoint = "mymountpoint"           #enter icecast mountpoint in quotes here - don't add leading '/'
'''-----------------END USER CONFIGURATION---------------'''
'''----------UNNECESSARY TO MODIFY SCRIPT BELOW----------'''

To connect your scanner you'll need a USB-to-serial adapter. If it's the first one connected, it should show up as /dev/ttyUSB0. You can check by listing the files in /dev and looking for ttyUSB0 or something similar. Also make sure that the baudrate in the script matches the baudrate set in your scanner.

The script uses two Python modules, requests and serial, that may or may not be installed on your Raspberry Pi. You can install these packages by using apt-get:
sudo apt-get install python-serial python-requests
To run the script, change directories to the location where the script is saved, then type python metaPy.py to begin the script. If it is running, you will see it print out talkgroup information, time information, and an update status:

Davis County Sim 11232 Syracuse PD 2 C
Thu Jun 13 21:07:03 2013
Icecast Update OK
Davis County Sim 10688 Davis Ops 2
Thu Jun 13 21:07:21 2013
Icecast Update OK
The script can be placed in the background by typing ctrl-z and then bg. It can be brought back to the foreground by typing fg. You can also start the script detached from the console by adding an & to the end of the command: python metaPy.py &. To stop the script, simply type ctrl-c, or find the process and kill it.

By using Darkice to feed the audio and this metaPy.py script to feed the alpha tags, you now have a 5W streaming media box to stream both scanner audio and text alpha tags.
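For anyone curious about what's going on under the hood, here is a minimal, hypothetical sketch of the same idea - it is not the actual metaPy.py code. It polls the scanner over the serial port and pushes whatever alpha-tag text comes back to the Icecast admin metadata endpoint. The GLG status query, the field handling, and the server settings are assumptions you would adapt to your own scanner model and setup.

#!/usr/bin/env python
# Hypothetical sketch of a scanner-to-Icecast metadata loop -- not the actual
# metaPy.py code. The GLG query and field handling are assumptions to adapt.
import time
import serial            # from the python-serial package
import requests          # from the python-requests package

PORT = "/dev/ttyUSB0"    # scanner USB/serial port
BAUD = 115200            # must match the baudrate set in the scanner
ICECAST = "http://192.168.1.100:8000"   # Icecast server address (and port)
MOUNT = "/mymountpoint"                 # mountpoint, with leading '/'
AUTH = ("source", "hackme")             # user allowed to update metadata

ser = serial.Serial(PORT, BAUD, timeout=2)
last_tag = ""

while True:
    ser.write(b"GLG\r")                 # ask the scanner for reception status
    reply = ser.read(200).decode("ascii", "ignore").strip()
    # The reply is comma-delimited; the field layout varies by model, so here
    # we simply keep the non-numeric text fields as the display tag.
    fields = reply.split(",")
    tag = " ".join(f for f in fields[1:] if f and not f.isdigit())
    if tag and tag != last_tag:
        # Same Icecast admin metadata update an MP3 source client would send.
        r = requests.get(ICECAST + "/admin/metadata",
                         params={"mount": MOUNT, "mode": "updinfo", "song": tag},
                         auth=AUTH, timeout=10)
        print(time.ctime(), "Icecast Update", "OK" if r.ok else r.status_code)
        last_tag = tag
    time.sleep(1)

The sketch is only meant to show how the pyserial and requests pieces fit together; use the downloadable scripts above for actual feeds.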
Tuesday, March 5, 2013
UniTrunker Automated File Merge
I've put together some tools to analyze UniTrunker data. The details can be found here: UniTrunker file merge.
These tools are great, but fairly labor-intensive. So in an effort to automate the process and spend more time viewing the files and less time processing them, I recompiled the merging program and wrote a Linux script to do the work.
This processing could be done on a Windows machine using scripts and tools there, but I decided that I'd rather work in Linux for file viewing and processing. This leaves my Windows machines to run UniTrunker and create the logfiles, and my Linux machine to copy those logfiles and do the processing.
There are a few manual tasks that I had to perform initially. These include:
1. Export the system.xml file from UniTrunker and run either msxsl.exe or xsltproc to extract group and user information into .txt files.
2. Set up the folders and file hierarchy in advance.
3. Mount the shared location on the Windows computer where UniTrunker writes its logfiles.
Once these tasks were done, I created a script for each system I was going to export and a cron job to run them each night shortly after midnight, once UniTrunker had created and zipped the daily logfiles.
A link to the Linux version of UTC to merge logfiles with user and group data can be found here: UTC.sh file. (Note: this is not actually a .sh script, but an executable with no extension. I added the extension so that it downloads as a file instead of being shown in the browser. Once it's downloaded, change the name to remove the .sh extension.)
If you're interested in compiling the UTC program yourself, email me and I'll get you the source code.

An example of the script I created (utcopyexample.sh):
#!/bin/bash
datestring=$(date +%Y%m%d)
yesterday=$(date --date="yesterday" +%Y%m%d)
yearmonth=$(date --date="yesterday" +%Y%m)
processfile=~/your/path/here/$yearmonth/SystemName/UTM$yesterday.txt
processpath=/your/path/here/$yearmonth/SystemName/
filestring=UniTrunker-$yesterday.log.Z #Name of logfile created by UniTrunker
filestring2=UT$yesterday.log.Z #Shorten filename for ease
#Copy logfile from Windows machine to Linux machine for unzipping:
cp /mnt/nameofwindows/computerRunning/unitrunker/S00000031/$filestring ~/Documents/UTSort/raw/$filestring2
gunzip /home/parallels/Documents/UTSort/raw/UT$yesterday.log.Z #unzip logfile
#Move unzipped file to log storage location:
mv /home/parallels/Documents/UTSort/raw/UT$yesterday.log /home/parallels/Documents/UTSort/$yearmonth/Davis/UT$yesterday.log
#Run UTC sort/merging program to add user names and group names to logfile in addition to UID and GID numbers
#Note: Prior to running this script, it is necessary to export the System.xml file in UT and then run MSXSL.exe in Windows
#or xsltproc in Linux to create users.txt and a groups.txt files to be used by the UTC sort/merging program
UTC /your/path/here/$yearmonth/SystemName/UT$yesterday.log /your/path/here/users.txt /your/path/here/groups.txt $processfile
#Below are examples of searches/sorts that I do on the logfiles to sort via talkgroups or users or find unknowns.
#Place your talkgroup IDs or search terms in the field after grep and create folders and change filenames to be relevant
#to your local groups and users:
more $processfile | grep ,X | grep -v XP >${processpath}/WXPD/WXPD$yesterday.txt #WX PD
more $processfile | grep 10816 >${processpath}/Ops5/Ops5$yesterday.txt #Davis County Ops 5
more $processfile | grep 10848 >${processpath}/Ops6/Ops6$yesterday.txt #Davis County Ops 6
more $processfile | grep 8544 >${processpath}/BountPD1/BountPD1$yesterday.txt #Bountiful PD 1
more $processfile | grep '9440\|9408' >${processpath}/DavisLaw/DavisLaw$yesterday.txt #Davis Law 1 and 2
more $processfile | grep 9952 >${processpath}/DavisService/DavisService$yesterday.txt #Davis Service
more $processfile | grep 17600 >${processpath}/NRegional/NRegional$yesterday.txt #Northern Regional
more $processfile | grep 18464 >${processpath}/Event2/Event2$yesterday.txt #Event 2
more $processfile | grep ', ,G' >>${processpath}/Unknown/DavisUnknown$yearmonth.txt #unknown users
more $processfile | grep 'G,.*, ,'>>${processpath}/Unknown/DavisUnknownTGID$yearmonth.txt #unknown talkgroups
sort -u -t, -k8,8 ${processpath}/Unknown/DavisUnknownTGID$yearmonth.txt | cut -d, -f8 | sort -n >${processpath}/Unknown/DavisUnknownTGIDSorted$yearmonth.txt #List unknown TGID numbers
sort -u -t, -k5,5 ${processpath}/Unknown/DavisUnknown$yearmonth.txt | cut -d, -f5 | sort -n >${processpath}/Unknown/DavisUnknownSorted$yearmonth.txt #List unknown UID numbers
Tuesday, January 22, 2013
Virtual Radar
I got my start in radio monitoring as the son of a volunteer fireman, but my interest quickly turned to aviation when I bought my first scanner at around 10 years old. My love of airplanes has led to a career in flying. Strange as it may seem, I'm still interested in listening to the chatter of airplanes passing overhead even though I hear ATC all day at work.
As an addition to my ATC voice monitoring, I was interested in getting a virtual radar receiver that can plot aircraft location and/or altitude using Mode-S and ADS-B data. For a year I tried an AirNav RadarBox receiver. I enjoyed its ease of use out of the box and the ability to combine outside FAA data in one program. However, the RadarBox is limited in its ability to pass raw data to third-party programs such as Planeplotter.
Last month I traded in my RadarBox for a Kinetic Avionics SBS-3 Virtual Radar Receiver (http://www.kinetic.co.uk/sbs-3.php). In addition to Mode-S reception, the SBS-3 has a couple of other features that are really great! The box has two SDR tuners with 8 MHz of bandwidth that can be divided into six receivers. Additionally, there is built-in ACARS decoding and, for those near the coast, a built-in AIS decoder as well.
The box can be connected to a PC via USB, or used stand-alone via ethernet. Kinetic provides their Basestation software to interface with the SBS-3. I've found it very robust. The main difference here between the SBS-3 and RadarBox is the lack of online data updates built into Basestation. When an aircraft is received, Basestation displays all the data that it receives, but unlike RadarBox, it doesn't go out on the web and download additional aircraft information.
The good news is that there is add-on software that interfaces with Basestation to download aircraft information and populate your database. I use ActiveDisplay (http://www.gatwickaviationsociety.org.uk/AD_home.asp) and decided to purchase a subscription for one year to access the Gatwick Aviation Society aircraft database. There is also a free lite version.
My next project is to set up Planeplotter and interface the SBS-3 with it. One feature I'm interested in is Multilateration. This gives the ability to approximate the location of non-ADS-B aircraft by using data from other users in the area that can see the aircraft.
Seeing Mode-S and ADS-B data is a fascinating way to enhance aircraft monitoring and add to the awareness of who's flying around.