As per my previous post, I am feeding ADS-B Exchange and Flightradar24 from a Raspberry Pi Zero and a USB DVB-T tuner.
This post is broken into three independent sections:
1. Logging all flights to a .csv file.
2. Deciding .csv was not ideal and moving the logging to an SQLite database (incl. setup).
3. A solution to show each day's flights on a webpage.
Logging to .csv file:
I wanted to locally log the flights that flew overhead each day but didn't have the knowledge to put it all together until /u/gl0ckner/ on Reddit posted his work on logging flights to a .csv file. That didn't work right out of the box for me, so I made some tweaks, and my slightly modified version can be found on my GitHub. Simply download the file and run it manually:
python3 /home/pi/flightlogger/flight_logger_csv.py
Or make executable and add to crontab to execute every minute as so:
chmod +x /home/pi/flightlogger/flight_logger_csv.py
crontab -e
* * * * * /usr/bin/python3 /home/pi/flightlogger/flight_logger_csv.py
A new .csv file is created for each day, and it works well. After a few days, though, I decided the data would be more useful in a database where it could be queried.
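For anyone curious what such a logger boils down to, here is a minimal sketch. This is illustrative only, not the script from my GitHub: it assumes dump1090 publishes its aircraft list as JSON at /run/dump1090-fa/aircraft.json (the path varies between dump1090 builds), and the function names and reduced field list are my own.

```python
#!/usr/bin/python3
# Illustrative sketch of a per-day .csv flight logger.
# Assumption: dump1090 exposes the current aircraft list as JSON at
# AIRCRAFT_JSON (this path differs between dump1090 builds).
import csv
import json
import os
from datetime import date, datetime

AIRCRAFT_JSON = '/run/dump1090-fa/aircraft.json'
LOG_DIR = '/home/pi/flightlogger'

# A reduced set of the fields used elsewhere in this post
FIELDS = ['date', 'time', 'hex', 'flight', 'alt_baro',
          'gs', 'track', 'squawk', 'lat', 'lon']

def log_aircraft(aircraft_list, log_dir=LOG_DIR):
    """Append one row per aircraft to today's .csv file."""
    today = date.today().isoformat()
    now = datetime.now().strftime('%H:%M:%S')
    path = os.path.join(log_dir, 'flights_%s.csv' % today)
    new_file = not os.path.exists(path)
    with open(path, 'a', newline='') as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()  # write the header once per day
        for ac in aircraft_list:
            row = {k: ac.get(k, '') for k in FIELDS}
            row['date'], row['time'] = today, now
            writer.writerow(row)
    return path

if __name__ == '__main__':
    # Requires a running dump1090 publishing the JSON file
    if os.path.exists(AIRCRAFT_JSON):
        with open(AIRCRAFT_JSON) as f:
            log_aircraft(json.load(f).get('aircraft', []))
```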
Note: a new entry is currently made every minute, regardless of whether that particular aircraft has already been logged the same day. I wanted to change this so an aircraft is only logged again if it hasn't been seen in the last hour; that behaviour is implemented in the database option below.
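The "only log if not seen in the last hour" rule comes down to one query before each insert. Below is a minimal sketch of that check using Python's standard-library sqlite3 module against the flight_data table from the next section; the function names here are my own, not from the linked script.

```python
#!/usr/bin/python3
# Sketch: skip an aircraft if its hex code was already logged within
# the last hour. Uses the flight_data table from the SQLite section.
import sqlite3

def seen_in_last_hour(conn, hex_code):
    """Return True if this airframe was logged within the last hour."""
    cur = conn.execute(
        "SELECT 1 FROM flight_data "
        "WHERE hex = ? "
        "AND date_time > datetime('now', 'localtime', '-1 hour') "
        "LIMIT 1",
        (hex_code,))
    return cur.fetchone() is not None

def log_flight(conn, hex_code, flight):
    """Insert a row only if the aircraft is new this hour."""
    if seen_in_last_hour(conn, hex_code):
        return False
    conn.execute(
        "INSERT INTO flight_data (date_time, hex, flight) "
        "VALUES (datetime('now', 'localtime'), ?, ?)",
        (hex_code, flight))
    conn.commit()
    return True
```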
Logging to a database (SQLite):
After a comment on the same Reddit post by /u/Uncle_BBQ, who submitted his own work on this, I thought I would give it a go. This script logs each overhead flight into the database once. It was my first time ever using a database and, as usual, it didn't work right out of the box for me, so I had to make a few tweaks. Below is how to get it running.
# First install the dependencies. They did not install properly
# from the script for me, so I did it manually.
sudo apt-get update
sudo apt-get upgrade
sudo apt-get install sqlite3
pip3 install pandas
pip3 install numpy
pip3 install cython
pip3 install sqlalchemy
pip3 install psycopg2
Now we need to set up the database:
CREATE TABLE flight_data (
    date_time NUMERIC, date NUMERIC, time NUMERIC,
    hex TEXT, flight TEXT,
    alt_baro NUMERIC, alt_geom NUMERIC, gs NUMERIC, track NUMERIC,
    geom_rate NUMERIC, squawk NUMERIC, emergency TEXT, category TEXT,
    nav_qnh NUMERIC, nav_altitude_mcp NUMERIC, lat NUMERIC, lon NUMERIC
);
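The statement above can be run from the sqlite3 shell. If you'd rather create the table from Python, here is a minimal sketch using the standard-library sqlite3 module; the database file name is taken from the query commands later in this post.

```python
#!/usr/bin/python3
# Create the flight_data table without leaving Python.
# The database file name matches the one used by the queries below.
import sqlite3

SCHEMA = """CREATE TABLE IF NOT EXISTS flight_data (
    date_time NUMERIC, date NUMERIC, time NUMERIC,
    hex TEXT, flight TEXT,
    alt_baro NUMERIC, alt_geom NUMERIC, gs NUMERIC, track NUMERIC,
    geom_rate NUMERIC, squawk NUMERIC, emergency TEXT, category TEXT,
    nav_qnh NUMERIC, nav_altitude_mcp NUMERIC, lat NUMERIC, lon NUMERIC)"""

def create_table(db_path='/home/pi/flightlogger/flightdata_1h.db'):
    """Create the table if it does not already exist."""
    conn = sqlite3.connect(db_path)
    conn.execute(SCHEMA)
    conn.commit()
    conn.close()

if __name__ == '__main__':
    create_table()
```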
Finally, copy the file from my GitHub and run it:
python3 /home/pi/flightlogger/flight_logger_sql.py
To see what’s in the database we can query it by:
SELECT date, time, hex, flight FROM flight_data;
Your output will look like this:
Finally, to run the script every minute, make it executable and add it to crontab:
chmod +x /home/pi/flightlogger/flight_logger_sql.py
crontab -e
* * * * * /usr/bin/python3 /home/pi/flightlogger/flight_logger_sql.py
You can query the database directly if needed, for example:
sqlite3 flightdata_1h.db
SELECT date_time, date, time, hex FROM flight_data ORDER BY DATE(date) DESC LIMIT 100;
Updating the webpage with flights that went overhead today:
Note: the ideal solution would be to query the database when the webpage is requested and deliver the results directly. Since I'm running SQLite and all the tutorials I found were for MySQL (and MariaDB wouldn't run on the RPi Zero), I went about it a different way: an hourly cron job queries the database and pushes the results to a .csv file, and that .csv file is rendered as a table whenever the webpage is requested.
To populate the .csv file I use:
#!/usr/bin/python3
# Import dependencies
from datetime import date, datetime
import pandas as pd
from sqlalchemy import create_engine

# Connect to the database
db = 'sqlite:////home/pi/flightlogger/flightdata_1h.db'
db_table = 'flight_data'
engine = create_engine(db)

# Get today's date and a current time stamp
today = date.today()
now = datetime.now().strftime('%H:%M:%S')
date_time = datetime.now().strftime('%Y-%m-%d %H:%M:%S')

# Try to read the last hour of flights from the database
try:
    df2 = pd.read_sql("SELECT * FROM flight_data WHERE flight_data.date_time > datetime('now', 'localtime', '-3600 seconds')", engine)  # SQLite syntax
    dbConnected = True
except Exception:
    # If the database does not exist or we can't connect, say so
    print('Unable to connect to database.')
    dbConnected = False

# Append the results to the .csv file served by the webpage
if dbConnected:
    df2.to_csv("/var/www/html/data.csv", mode='a', header=False)
To clear the .csv file each night shortly after midnight, I use:
#!/usr/bin/python3
# Overwrite data.csv, leaving only the header row
file = open("/var/www/html/data.csv", "w")
file.write("index,date_time,hex,flight,alt_baro,alt_geom,gs,track,geom_rate,squawk,emergency,category,nav_qnh,nav_altitude_mcp,lat,lon,date,time\n")
file.close()
Cron jobs to run the above:
0 * * * * /usr/bin/python3 /home/pi/flightlogger/db_flight_to_csv.py
10 0 * * * /usr/bin/python3 /home/pi/flightlogger/csv_clear.py
The webpage located at /var/www/html/index.html is:
Give the ‘pi’ user permission to edit the .csv file:
sudo chown -R pi /var/www/html/data.csv
Other references I used: