55 Commits

Author SHA1 Message Date
Ian Renton
583735c99f Remove start/end dates #82 2025-11-26 07:40:46 +00:00
Ian Renton
6c9f3136b8 First pass at TOTA support #82 2025-11-24 21:57:29 +00:00
Ian Renton
4e427f26c3 About page updates 2025-11-23 11:23:13 +00:00
ian
714151a6b4 Update views/webpage_about.tpl 2025-11-23 10:58:26 +00:00
Ian Renton
0ccc2bd15d Minor tweaks 2025-11-17 17:58:52 +00:00
Ian Renton
5724c4c7ea Minor tweaks 2025-11-17 17:50:29 +00:00
Ian Renton
94c0cad769 Improve SIG regexes to specify numbers of digits 2025-11-17 17:41:01 +00:00
Ian Renton
452e4beb29 Fix imports 2025-11-17 17:22:12 +00:00
Ian Renton
b132fe8a39 Fix a bug where SIG API spots could be re-tagged as another SIG e.g. WAB if that was named in the comment. 2025-11-17 17:19:43 +00:00
Ian Renton
e525aaed92 Fix a bug where spothole was too keen on extracting secondary references for xOTA programmes from comments, and was not checking that the "references" it found were surrounded by whitespace. 2025-11-16 17:46:40 +00:00
Ian Renton
92b7110356 Merge remote-tracking branch 'origin/main' 2025-11-16 17:46:05 +00:00
Ian Renton
114eacb9dc Fix a bug where spothole was too keen on extracting secondary references for xOTA programmes from comments, and was not checking that the "references" it found were surrounded by whitespace. 2025-11-16 17:45:58 +00:00
Ian Renton
2a90b17b6b Fix URLs for WOTA outlying fells 2025-11-14 14:37:36 +00:00
Ian Renton
ae075f3ac7 Version number bump 2025-11-13 21:52:13 +00:00
Ian Renton
efa9806c64 Look up K0SWE's dxcc.json rather than using our own tables. Closes #80 2025-11-13 21:51:20 +00:00
Ian Renton
03829831c0 Fix debug code commit 2025-11-13 21:47:05 +00:00
Ian Renton
4f83468309 Add config for "Number of Spots" and "Spot Age" values used in the web UI. Closes #79 2025-11-13 21:18:27 +00:00
Ian Renton
2165ebc103 DXCC 999 2025-11-13 20:10:53 +00:00
Ian Renton
cf46017917 Fix WOTA parsing bug 2025-11-12 17:40:24 +00:00
Ian Renton
c30e1616d3 Image-based flags 2025-11-11 06:30:17 +00:00
Ian Renton
422c917073 Docs tweak 2025-11-10 19:30:40 +00:00
Ian Renton
cad1f5cfdf Defensive coding fix 2025-11-10 19:03:12 +00:00
Ian Renton
78f8cd26f0 Possible emoji flag fix for Windows/Chrome 2025-11-10 19:01:25 +00:00
Ian Renton
d6cc2673dd Search input should have search type 2025-11-08 18:44:37 +00:00
Ian Renton
8f553a59f8 Doc tweaks 2025-11-08 18:23:11 +00:00
Ian Renton
f1841ca59e v1.0 release 2025-11-08 11:44:11 +00:00
Ian Renton
85e0a7354c Reject "AA00aa" grids and 0/0 latlons from online lookup 2025-11-03 20:14:41 +00:00
Ian Renton
2ccfa28119 Get "qth" friendly name from QRZ/clublog and return in the callsign lookup. Closes #77 2025-11-02 20:51:16 +00:00
Ian Renton
b313735e28 Add missing break statements 2025-11-02 20:38:30 +00:00
Ian Renton
bbaa3597f6 Implement WWFF reference lookup. Closes #76 2025-11-02 20:37:30 +00:00
Ian Renton
e61d7bedb4 Exception handling #74 2025-11-02 18:00:24 +00:00
Ian Renton
ebf07f352f Exception handling #74 2025-11-02 17:59:37 +00:00
Ian Renton
e83ddead62 Tidy up some issues with alerts #74 2025-11-02 17:54:34 +00:00
Ian Renton
b8e1506846 Use HamQTH as a lookup provider. Closes #73 2025-11-02 17:45:54 +00:00
Ian Renton
d80c4cfbeb Provide an externally usable callsign lookup feature. #73 2025-11-02 16:52:27 +00:00
Ian Renton
92af0761aa Move checks for multiple references in comments out of POTA and DX Cluster classes into the main infer_missing() function for spots. #54 2025-11-02 16:18:33 +00:00
Ian Renton
286ff66721 Refactor looking up SIG reference details into a common location, taking it out of the individual spot providers. This means we can now look up references properly from Cluster spot comments, etc. Closes #74 as there is no longer any duplication of these lookups. Works towards #54 as sig_refs now specify their sig internally. 2025-11-02 15:45:19 +00:00
Ian Renton
28010a68ae Single common URL cache for semi-static lookups #74 2025-11-02 14:22:15 +00:00
Ian Renton
0e8c7873d8 Lookup for sig_ref data #73 2025-11-02 14:13:03 +00:00
Ian Renton
649b57a570 Fix a bug where touch scrolling on the map's filters popup would still be passed through to the map. Closes #72 2025-11-02 12:07:32 +00:00
Ian Renton
fa92657d9c Fix old alerts not getting deleted 2025-11-01 17:25:20 +00:00
Ian Renton
30fc333c8b Fix scrolling map filters panel on mobile 2025-11-01 17:05:47 +00:00
Ian Renton
0570b39e09 Add Spot page to allow sig and sig_ref entries. Closes #71 2025-11-01 12:38:57 +00:00
Ian Renton
1ed543872a Add Spot page to take mode options from API #71 2025-11-01 12:03:11 +00:00
Ian Renton
812d031a2c Fix link 2025-11-01 11:45:21 +00:00
Ian Renton
471c487132 Allow adding the DX grid when spotting #71 2025-11-01 11:11:33 +00:00
Ian Renton
57d950c1ca Fix search box appearance on mobile 2025-11-01 10:35:47 +00:00
Ian Renton
a3ec923c56 Improve add spot page warning and server-side validation. #71 2025-11-01 10:29:18 +00:00
Ian Renton
69821f817b Extract "add spot" into its own page 2025-11-01 08:52:46 +00:00
ian
0c79436399 Update views/webpage_spots.tpl 2025-10-31 19:21:01 +00:00
Ian Renton
3964134db9 Add dx_call_includes filter input on web UI 2025-10-31 17:52:29 +00:00
Ian Renton
04435e770a Add dx_call_includes filter 2025-10-31 17:33:27 +00:00
Ian Renton
a4645171e4 Thanks 2025-10-31 14:24:04 +00:00
Ian Renton
65d546ef7e Support BOTA alerts. Closes #58 2025-10-31 14:06:22 +00:00
Ian Renton
193838b9d3 Fix colours of table rows and JS exception on sig_refs being null. 2025-10-31 10:50:49 +00:00
455 changed files with 1581 additions and 2043 deletions

View File

@@ -10,7 +10,7 @@ The API is deliberately well-defined with an OpenAPI specification and auto-gene
Spothole itself is also open source, Public Domain licenced code that anyone can take and modify.
Supported data sources include DX Clusters, the Reverse Beacon Network (RBN), the APRS Internet Service (APRS-IS), POTA, SOTA, WWFF, GMA, WWBOTA, HEMA, Parks 'n' Peaks, ZLOTA, WOTA, the UK Packet Repeater Network, and NG3K.
Supported data sources include DX Clusters, the Reverse Beacon Network (RBN), the APRS Internet Service (APRS-IS), POTA, SOTA, WWFF, GMA, WWBOTA, HEMA, Parks 'n' Peaks, ZLOTA, WOTA, BOTA, the UK Packet Repeater Network, NG3K, and any site based on the xOTA software by nischu.
![Screenshot](/images/screenshot2.png)
@@ -196,8 +196,12 @@ Finally, simply add the appropriate config to the `providers` section of `config
As well as my own work, I have gratefully received feature patches from Steven, M1SDH.
The project contains a self-hosted copy of Font Awesome's free library, in the `/webasset/fa/` directory. This is subject to Font Awesome's licence and is not covered by the overall licence declared in the `LICENSE` file. This approach was taken in preference to using their hosted kits due to the popularity of this project exceeding the page view limit for their free hosted offering.
The project contains a self-hosted copy of Font Awesome's free library, in the `/webassets/fa/` directory. This is subject to Font Awesome's licence and is not covered by the overall licence declared in the `LICENSE` file. This approach was taken in preference to using their hosted kits due to the popularity of this project exceeding the page view limit for their free hosted offering.
The software uses a number of Python libraries as listed in `requirements.txt`, and a number of JavaScript libraries such as jQuery and moment.js. This project would not have been possible without these libraries, so many thanks to their developers.
The project contains a set of flag icons generated using the "Noto Color Emoji" font on a Debian system, in the `/webassets/img/flags/` directory.
The software uses a number of Python libraries as listed in `requirements.txt`, and a number of JavaScript libraries such as jQuery, Leaflet and Bootstrap. This project would not have been possible without these libraries, so many thanks to their developers.
Particular thanks go to country-files.com for providing country lookup data for amateur radio, to K0SWE for [this JSON-formatted DXCC data](https://github.com/k0swe/dxcc-json/), and to the developers of `pyhamtools` for making it easy to use country-files.com data as well as QRZ.com and Clublog lookup.
The project's name was suggested by Harm, DK4HAA. Thanks!

View File

@@ -1,9 +1,8 @@
from datetime import datetime, timedelta
from datetime import datetime
import pytz
from core.config import SERVER_OWNER_CALLSIGN, MAX_ALERT_AGE
from core.constants import SOFTWARE_NAME, SOFTWARE_VERSION
from core.config import MAX_ALERT_AGE
# Generic alert provider class. Subclasses of this query the individual APIs for alerts.

alertproviders/bota.py Normal file (46 lines added)
View File

@@ -0,0 +1,46 @@
from datetime import datetime, timedelta
import pytz
from bs4 import BeautifulSoup
from alertproviders.http_alert_provider import HTTPAlertProvider
from data.alert import Alert
from data.sig_ref import SIGRef
# Alert provider for Beaches on the Air
class BOTA(HTTPAlertProvider):
POLL_INTERVAL_SEC = 3600
ALERTS_URL = "https://www.beachesontheair.com/"
def __init__(self, provider_config):
super().__init__(provider_config, self.ALERTS_URL, self.POLL_INTERVAL_SEC)
def http_response_to_alerts(self, http_response):
new_alerts = []
# Find the table of upcoming alerts
bs = BeautifulSoup(http_response.content.decode(), features="lxml")
tbody = bs.body.find('div', attrs={'class': 'view-activations-public'}).find('table', attrs={'class': 'views-table'}).find('tbody')
for row in tbody.find_all('tr'):
cells = row.find_all('td')
first_cell_text = str(cells[0].find('a').contents[0]).strip()
ref_name = first_cell_text.split(" by ")[0]
dx_call = str(cells[1].find('a').contents[0]).strip().upper()
# Get the date, dealing with the fact we get no year so have to figure out if it's last year or next year
date_text = str(cells[2].find('span').contents[0]).strip()
date_time = datetime.strptime(date_text,"%d %b - %H:%M UTC").replace(tzinfo=pytz.UTC)
date_time = date_time.replace(year=datetime.now(pytz.UTC).year)
# If this was more than a day ago, activation is actually next year
if date_time < datetime.now(pytz.UTC) - timedelta(days=1):
date_time = date_time.replace(year=datetime.now(pytz.UTC).year + 1)
# Convert to our alert format
alert = Alert(source=self.name,
dx_calls=[dx_call],
sig_refs=[SIGRef(id=ref_name, sig="BOTA")],
start_time=date_time.timestamp(),
is_dxpedition=False)
new_alerts.append(alert)
return new_alerts
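
The only non-obvious part of this scraper is inferring the year, since the BOTA site publishes dates without one. A minimal sketch of just that step, using the same format string as above; the function name and example date are illustrative, not part of the provider class:

from datetime import datetime, timedelta
import pytz

# Sketch of the year-inference step: parse a year-less "02 Jan - 10:00 UTC" style
# date, assume the current year, then bump to next year if that would place the
# activation more than a day in the past (e.g. "02 Jan" scraped in late December).
def infer_alert_datetime(date_text):
    date_time = datetime.strptime(date_text, "%d %b - %H:%M UTC").replace(tzinfo=pytz.UTC)
    date_time = date_time.replace(year=datetime.now(pytz.UTC).year)
    if date_time < datetime.now(pytz.UTC) - timedelta(days=1):
        date_time = date_time.replace(year=datetime.now(pytz.UTC).year + 1)
    return date_time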

View File

@@ -4,7 +4,6 @@ from datetime import datetime
import pytz
from alertproviders.http_alert_provider import HTTPAlertProvider
from core.sig_utils import get_icon_for_sig
from data.alert import Alert
from data.sig_ref import SIGRef
@@ -22,6 +21,7 @@ class ParksNPeaks(HTTPAlertProvider):
# Iterate through source data
for source_alert in http_response.json():
# Calculate some things
sig = source_alert["Class"]
if " - " in source_alert["Location"]:
split = source_alert["Location"].split(" - ")
sig_ref = split[0]
@@ -38,19 +38,17 @@ class ParksNPeaks(HTTPAlertProvider):
dx_calls=[source_alert["CallSign"].upper()],
freqs_modes=source_alert["Freq"] + " " + source_alert["MODE"],
comment=source_alert["Comments"],
sig=source_alert["Class"],
sig_refs=[SIGRef(id=sig_ref, name=sig_ref_name)],
icon=get_icon_for_sig(source_alert["Class"]),
sig_refs=[SIGRef(id=sig_ref, sig=sig, name=sig_ref_name)],
start_time=start_time,
is_dxpedition=False)
# Log a warning for the developer if PnP gives us an unknown programme we've never seen before
if alert.sig not in ["POTA", "SOTA", "WWFF", "SiOTA", "ZLOTA", "KRMNPA"]:
logging.warn("PNP alert found with sig " + alert.sig + ", developer needs to add support for this!")
if sig and sig not in ["POTA", "SOTA", "WWFF", "SiOTA", "ZLOTA", "KRMNPA"]:
logging.warn("PNP alert found with sig " + sig + ", developer needs to add support for this!")
# If this is POTA, SOTA or WWFF data we already have it through other means, so ignore. Otherwise, add to
# the alert list. Note that while ZLOTA has its own spots API, it doesn't have its own alerts API. So that
# means the PnP *spot* provider rejects ZLOTA spots here, but the PnP *alerts* provider here allows ZLOTA.
if alert.sig not in ["POTA", "SOTA", "WWFF"]:
if sig not in ["POTA", "SOTA", "WWFF"]:
new_alerts.append(alert)
return new_alerts

View File

@@ -3,7 +3,6 @@ from datetime import datetime
import pytz
from alertproviders.http_alert_provider import HTTPAlertProvider
from core.sig_utils import get_icon_for_sig
from data.alert import Alert
from data.sig_ref import SIGRef
@@ -26,9 +25,7 @@ class POTA(HTTPAlertProvider):
dx_calls=[source_alert["activator"].upper()],
freqs_modes=source_alert["frequencies"],
comment=source_alert["comments"],
sig="POTA",
sig_refs=[SIGRef(id=source_alert["reference"], name=source_alert["name"], url="https://pota.app/#/park/" + source_alert["reference"])],
icon=get_icon_for_sig("POTA"),
sig_refs=[SIGRef(id=source_alert["reference"], sig="POTA", name=source_alert["name"], url="https://pota.app/#/park/" + source_alert["reference"])],
start_time=datetime.strptime(source_alert["startDate"] + source_alert["startTime"],
"%Y-%m-%d%H:%M").replace(tzinfo=pytz.UTC).timestamp(),
end_time=datetime.strptime(source_alert["endDate"] + source_alert["endTime"],

View File

@@ -3,7 +3,6 @@ from datetime import datetime
import pytz
from alertproviders.http_alert_provider import HTTPAlertProvider
from core.sig_utils import get_icon_for_sig
from data.alert import Alert
from data.sig_ref import SIGRef
@@ -27,9 +26,7 @@ class SOTA(HTTPAlertProvider):
dx_names=[source_alert["activatorName"].upper()],
freqs_modes=source_alert["frequency"],
comment=source_alert["comments"],
sig="SOTA",
sig_refs=[SIGRef(id=source_alert["associationCode"] + "/" + source_alert["summitCode"], name=source_alert["summitDetails"], url="https://www.sotadata.org.uk/en/summit/" + source_alert["summitCode"])],
icon=get_icon_for_sig("SOTA"),
sig_refs=[SIGRef(id=source_alert["associationCode"] + "/" + source_alert["summitCode"], sig="SOTA", name=source_alert["summitDetails"])],
start_time=datetime.strptime(source_alert["dateActivated"],
"%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=pytz.UTC).timestamp(),
is_dxpedition=False)

View File

@@ -4,7 +4,6 @@ import pytz
from rss_parser import RSSParser
from alertproviders.http_alert_provider import HTTPAlertProvider
from core.sig_utils import get_icon_for_sig
from data.alert import Alert
from data.sig_ref import SIGRef
@@ -54,9 +53,7 @@ class WOTA(HTTPAlertProvider):
dx_calls=[dx_call],
freqs_modes=freqs_modes,
comment=comment,
sig="WOTA",
sig_refs=[SIGRef(id=ref, name=ref_name, url="https://www.wota.org.uk/MM_" + ref)] if ref else [],
icon=get_icon_for_sig("WOTA"),
sig_refs=[SIGRef(id=ref, sig="WOTA", name=ref_name)] if ref else [],
start_time=time.timestamp())
# Add to our list.

View File

@@ -3,7 +3,6 @@ from datetime import datetime
import pytz
from alertproviders.http_alert_provider import HTTPAlertProvider
from core.sig_utils import get_icon_for_sig
from data.alert import Alert
from data.sig_ref import SIGRef
@@ -26,9 +25,7 @@ class WWFF(HTTPAlertProvider):
dx_calls=[source_alert["activator_call"].upper()],
freqs_modes=source_alert["band"] + " " + source_alert["mode"],
comment=source_alert["remarks"],
sig="WWFF",
sig_refs=[SIGRef(id=source_alert["reference"], url="https://wwff.co/directory/?showRef=" + source_alert["reference"])],
icon=get_icon_for_sig("WWFF"),
sig_refs=[SIGRef(id=source_alert["reference"], sig="WWFF")],
start_time=datetime.strptime(source_alert["utc_start"],
"%Y-%m-%d %H:%M:%S").replace(tzinfo=pytz.UTC).timestamp(),
end_time=datetime.strptime(source_alert["utc_end"],

View File

@@ -81,6 +81,18 @@ spot-providers:
class: "UKPacketNet"
name: "UK Packet Radio Net"
enabled: false
-
class: "XOTA"
name: "39C3 TOTA"
enabled: false
url: "https://39c3.c3nav.de/"
# Fixed SIG/latitude/longitude for all spots from a provider is currently only a feature for the "XOTA" provider,
# the software found at https://github.com/nischu/xOTA/. This is because this is a generic backend for xOTA
# programmes and so different URLs provide different programmes.
sig: "TOTA"
latitude: 53.5622678
longitude: 9.9855205
# Alert providers to use. Same setup as the spot providers list above.
alert-providers:
@@ -104,6 +116,10 @@ alert-providers:
class: "WOTA"
name: "WOTA"
enabled: true
-
class: "BOTA"
name: "BOTA"
enabled: true
-
class: "NG3K"
name: "NG3K"
@@ -119,12 +135,25 @@ max-alert-age-sec: 604800
# Login for QRZ.com to look up information. Optional. You will need an "XML Subscriber" (paid) package to retrieve all
# the data for a callsign via their system.
qrz-username: "N0CALL"
qrz-username: ""
qrz-password: ""
# Login for HamQTH to look up information. Optional.
hamqth-username: ""
hamqth-password: ""
# API key for Clublog to look up information. Optional. You will need to request one via their helpdesk portal if you
# want to use callsign lookups from Clublog.
clublog-api-key: ""
# Allow submitting spots to the Spothole API?
allow-spotting: true
# Options for the web UI.
web-ui-options:
spot-count: [10, 25, 50, 100]
spot-count-default: 50
max-spot-age: [5, 10, 30, 60]
max-spot-age-default: 30
alert-count: [25, 50, 100, 200, 500]
alert-count-default: 100

core/cache_utils.py Normal file (10 lines added)
View File

@@ -0,0 +1,10 @@
from datetime import timedelta
from requests_cache import CachedSession
# Cache for "semi-static" data such as the locations of parks, CSVs of reference lists, etc.
# This has an expiry time of 30 days, so will re-request from the source after that amount
# of time has passed. This is used throughout Spothole to cache data that does not change
# rapidly.
SEMI_STATIC_URL_DATA_CACHE = CachedSession("cache/semi_static_url_data_cache",
expire_after=timedelta(days=30))
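
This shared session replaces the per-module CachedSession instances removed elsewhere in this changeset. A minimal usage sketch; the URL and headers here are illustrative only, the real callers pass HTTP_HEADERS from core.constants:

from core.cache_utils import SEMI_STATIC_URL_DATA_CACHE

# Responses are persisted to cache/semi_static_url_data_cache and re-used for up
# to 30 days before the source is contacted again.
response = SEMI_STATIC_URL_DATA_CACHE.get("https://example.com/reference-list.csv",
                                          headers={"User-Agent": "spothole"})
print(response.from_cache, len(response.text))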

View File

@@ -1,5 +1,5 @@
import logging
from datetime import datetime, timedelta
from datetime import datetime
from threading import Timer
from time import sleep
@@ -38,7 +38,7 @@ class CleanupTimer:
for id in list(self.alerts.iterkeys()):
alert = self.alerts[id]
if alert.expired():
self.alerts.evict(id)
self.alerts.delete(id)
self.status = "OK"
self.last_cleanup_time = datetime.now(pytz.UTC)

View File

@@ -17,3 +17,4 @@ MAX_ALERT_AGE = config["max-alert-age-sec"]
SERVER_OWNER_CALLSIGN = config["server-owner-callsign"]
WEB_SERVER_PORT = config["web-server-port"]
ALLOW_SPOTTING = config["allow-spotting"]
WEB_UI_OPTIONS = config["web-ui-options"]

File diff suppressed because it is too large

View File

@@ -1,7 +1,11 @@
import gzip
import json
import logging
import re
import urllib.parse
from datetime import timedelta
import xmltodict
from diskcache import Cache
from pyhamtools import LookupLib, Callinfo, callinfo
from pyhamtools.exceptions import APIKeyMissingError
@@ -9,9 +13,11 @@ from pyhamtools.frequency import freq_to_band
from pyhamtools.locator import latlong_to_locator
from requests_cache import CachedSession
from core.cache_utils import SEMI_STATIC_URL_DATA_CACHE
from core.config import config
from core.constants import BANDS, UNKNOWN_BAND, CW_MODES, PHONE_MODES, DATA_MODES, ALL_MODES, \
QRZCQ_CALLSIGN_LOOKUP_DATA, HTTP_HEADERS
HTTP_HEADERS, HAMQTH_PRG
# Singleton class that provides lookup functionality.
class LookupHelper:
@@ -31,18 +37,26 @@ class LookupHelper:
self.QRZ_CALLSIGN_DATA_CACHE = None
self.LOOKUP_LIB_QRZ = None
self.QRZ_AVAILABLE = None
self.HAMQTH_AVAILABLE = None
self.HAMQTH_CALLSIGN_DATA_CACHE = None
self.HAMQTH_BASE_URL = "https://www.hamqth.com/xml.php"
# HamQTH session keys expire after an hour. Rather than working out how much time has passed manually, we cheat
# and cache the HTTP response for 55 minutes, so when the login URL is queried within 55 minutes of the previous
# time, you just get the cached response.
self.HAMQTH_SESSION_LOOKUP_CACHE = CachedSession("cache/hamqth_session_cache",
expire_after=timedelta(minutes=55))
self.CALL_INFO_BASIC = None
self.LOOKUP_LIB_BASIC = None
self.COUNTRY_FILES_CTY_PLIST_DOWNLOAD_LOCATION = None
self.COUNTRY_FILES_CTY_PLIST_CACHE = None
self.DXCC_JSON_DOWNLOAD_LOCATION = None
self.DXCC_DATA = None
def start(self):
# Lookup helpers from pyhamtools. We use four (!) of these. The simplest is country-files.com, which downloads the data
# once on startup, and requires no login/key, but does not have the best coverage.
# If the user provides login details/API keys, we also set up helpers for QRZ.com, Clublog (live API request), and
# Clublog (XML download). The lookup functions iterate through these in a sensible order, looking for suitable data.
self.COUNTRY_FILES_CTY_PLIST_CACHE = CachedSession("cache/country_files_city_plist_cache",
expire_after=timedelta(days=10))
# Lookup helpers from pyhamtools. We use five (!) of these. The simplest is country-files.com, which downloads
# the data once on startup, and requires no login/key, but does not have the best coverage.
# If the user provides login details/API keys, we also set up helpers for QRZ.com, HamQTH, Clublog (live API
# request), and Clublog (XML download). The lookup functions iterate through these in a sensible order, looking
# for suitable data.
self.COUNTRY_FILES_CTY_PLIST_DOWNLOAD_LOCATION = "cache/cty.plist"
success = self.download_country_files_cty_plist()
if success:
@@ -52,12 +66,15 @@ class LookupHelper:
self.LOOKUP_LIB_BASIC = LookupLib(lookuptype="countryfile")
self.CALL_INFO_BASIC = Callinfo(self.LOOKUP_LIB_BASIC)
self.QRZ_AVAILABLE = config["qrz-password"] != ""
self.QRZ_AVAILABLE = config["qrz-username"] != "" and config["qrz-password"] != ""
if self.QRZ_AVAILABLE:
self.LOOKUP_LIB_QRZ = LookupLib(lookuptype="qrz", username=config["qrz-username"],
pwd=config["qrz-password"])
self.QRZ_CALLSIGN_DATA_CACHE = Cache('cache/qrz_callsign_lookup_cache')
self.HAMQTH_AVAILABLE = config["hamqth-username"] != "" and config["hamqth-password"] != ""
self.HAMQTH_CALLSIGN_DATA_CACHE = Cache('cache/hamqth_callsign_lookup_cache')
self.CLUBLOG_API_KEY = config["clublog-api-key"]
self.CLUBLOG_CTY_XML_CACHE = CachedSession("cache/clublog_cty_xml_cache", expire_after=timedelta(days=10))
self.CLUBLOG_API_AVAILABLE = self.CLUBLOG_API_KEY != ""
@@ -71,6 +88,19 @@ class LookupHelper:
filename=self.CLUBLOG_XML_DOWNLOAD_LOCATION)
self.CLUBLOG_CALLSIGN_DATA_CACHE = Cache('cache/clublog_callsign_lookup_cache')
# We also get a lookup of DXCC data from K0SWE to use for additional lookups of e.g. flags.
self.DXCC_JSON_DOWNLOAD_LOCATION = "cache/dxcc.json"
success = self.download_dxcc_json()
if success:
with open(self.DXCC_JSON_DOWNLOAD_LOCATION) as f:
tmp_dxcc_data = json.load(f)["dxcc"]
# Reformat as a map for faster lookup
self.DXCC_DATA = {}
for dxcc in tmp_dxcc_data:
self.DXCC_DATA[dxcc["entityCode"]] = dxcc
else:
logging.error("Could not download DXCC data, flags and similar data may be missing!")
# Download the cty.plist file from country-files.com on first startup. The pyhamtools lib can actually download and use
# this itself, but it's occasionally offline which causes it to throw an error. By downloading it separately, we can
# catch errors and handle them, falling back to a previous copy of the file in the cache, and we can use the
@@ -78,7 +108,7 @@ class LookupHelper:
def download_country_files_cty_plist(self):
try:
logging.info("Downloading Country-files.com cty.plist...")
response = self.COUNTRY_FILES_CTY_PLIST_CACHE.get("https://www.country-files.com/cty/cty.plist",
response = SEMI_STATIC_URL_DATA_CACHE.get("https://www.country-files.com/cty/cty.plist",
headers=HTTP_HEADERS).text
with open(self.COUNTRY_FILES_CTY_PLIST_DOWNLOAD_LOCATION, "w") as f:
@@ -90,6 +120,22 @@ class LookupHelper:
logging.error("Exception when downloading Clublog cty.xml", e)
return False
# Download the dxcc.json file on first startup.
def download_dxcc_json(self):
try:
logging.info("Downloading dxcc.json...")
response = SEMI_STATIC_URL_DATA_CACHE.get("https://raw.githubusercontent.com/k0swe/dxcc-json/refs/heads/main/dxcc.json",
headers=HTTP_HEADERS).text
with open(self.DXCC_JSON_DOWNLOAD_LOCATION, "w") as f:
f.write(response)
f.flush()
return True
except Exception as e:
logging.error("Exception when downloading dxcc.json", e)
return False
# Download the cty.xml (gzipped) file from Clublog on first startup, so we can use it in preference to querying the
# database live if possible.
def download_clublog_ctyxml(self):
@@ -148,7 +194,12 @@ class LookupHelper:
qrz_data = self.get_qrz_data_for_callsign(call)
if qrz_data and "country" in qrz_data:
country = qrz_data["country"]
# Couldn't get anything from QRZ.com database, try Clublog data
# Couldn't get anything from QRZ.com database, try HamQTH
if not country:
hamqth_data = self.get_hamqth_data_for_callsign(call)
if hamqth_data and "country" in hamqth_data:
country = hamqth_data["country"]
# Couldn't get anything from HamQTH database, try Clublog data
if not country:
clublog_data = self.get_clublog_xml_data_for_callsign(call)
if clublog_data and "Name" in clublog_data:
@@ -157,16 +208,15 @@ class LookupHelper:
clublog_data = self.get_clublog_api_data_for_callsign(call)
if clublog_data and "Name" in clublog_data:
country = clublog_data["Name"]
# Couldn't get anything from Clublog database, try QRZCQ data
# Couldn't get anything from Clublog database, try DXCC data
if not country:
qrzcq_data = self.get_qrzcq_data_for_callsign(call)
if qrzcq_data and "country" in qrzcq_data:
country = qrzcq_data["country"]
dxcc_data = self.get_dxcc_data_for_callsign(call)
if dxcc_data and "name" in dxcc_data:
country = dxcc_data["name"]
return country
# Infer a DXCC ID from a callsign
def infer_dxcc_id_from_callsign(self, call):
self.get_clublog_xml_data_for_callsign("M0TRT")
try:
# Start with the basic country-files.com-based decoder.
dxcc = self.CALL_INFO_BASIC.get_adif_id(call)
@@ -177,7 +227,12 @@ class LookupHelper:
qrz_data = self.get_qrz_data_for_callsign(call)
if qrz_data and "adif" in qrz_data:
dxcc = qrz_data["adif"]
# Couldn't get anything from QRZ.com database, try Clublog data
# Couldn't get anything from QRZ.com database, try HamQTH
if not dxcc:
hamqth_data = self.get_hamqth_data_for_callsign(call)
if hamqth_data and "adif" in hamqth_data:
dxcc = hamqth_data["adif"]
# Couldn't get anything from HamQTH database, try Clublog data
if not dxcc:
clublog_data = self.get_clublog_xml_data_for_callsign(call)
if clublog_data and "DXCC" in clublog_data:
@@ -186,11 +241,11 @@ class LookupHelper:
clublog_data = self.get_clublog_api_data_for_callsign(call)
if clublog_data and "DXCC" in clublog_data:
dxcc = clublog_data["DXCC"]
# Couldn't get anything from Clublog database, try QRZCQ data
# Couldn't get anything from Clublog database, try DXCC data
if not dxcc:
qrzcq_data = self.get_qrzcq_data_for_callsign(call)
if qrzcq_data and "dxcc" in qrzcq_data:
dxcc = qrzcq_data["dxcc"]
dxcc_data = self.get_dxcc_data_for_callsign(call)
if dxcc_data and "entityCode" in dxcc_data:
dxcc = dxcc_data["entityCode"]
return dxcc
# Infer a continent shortcode from a callsign
@@ -200,7 +255,12 @@ class LookupHelper:
continent = self.CALL_INFO_BASIC.get_continent(call)
except (KeyError, ValueError) as e:
continent = None
# Couldn't get anything from basic call info database, try Clublog data
# Couldn't get anything from basic call info database, try HamQTH
if not continent:
hamqth_data = self.get_hamqth_data_for_callsign(call)
if hamqth_data and "continent" in hamqth_data:
continent = hamqth_data["continent"]
# Couldn't get anything from HamQTH database, try Clublog data
if not continent:
clublog_data = self.get_clublog_xml_data_for_callsign(call)
if clublog_data and "Continent" in clublog_data:
@@ -209,11 +269,12 @@ class LookupHelper:
clublog_data = self.get_clublog_api_data_for_callsign(call)
if clublog_data and "Continent" in clublog_data:
continent = clublog_data["Continent"]
# Couldn't get anything from Clublog database, try QRZCQ data
# Couldn't get anything from Clublog database, try DXCC data
if not continent:
qrzcq_data = self.get_qrzcq_data_for_callsign(call)
if qrzcq_data and "continent" in qrzcq_data:
continent = qrzcq_data["continent"]
dxcc_data = self.get_dxcc_data_for_callsign(call)
# Some DXCCs are in two continents, if so don't use the continent data as we can't be sure
if dxcc_data and "continent" in dxcc_data and len(dxcc_data["continent"]) == 1:
continent = dxcc_data["continent"][0]
return continent
# Infer a CQ zone from a callsign
@@ -228,7 +289,12 @@ class LookupHelper:
qrz_data = self.get_qrz_data_for_callsign(call)
if qrz_data and "cqz" in qrz_data:
cqz = qrz_data["cqz"]
# Couldn't get anything from QRZ.com database, try Clublog data
# Couldn't get anything from QRZ.com database, try HamQTH
if not cqz:
hamqth_data = self.get_hamqth_data_for_callsign(call)
if hamqth_data and "cq" in hamqth_data:
cqz = hamqth_data["cq"]
# Couldn't get anything from HamQTH database, try Clublog data
if not cqz:
clublog_data = self.get_clublog_xml_data_for_callsign(call)
if clublog_data and "CQZ" in clublog_data:
@@ -237,11 +303,12 @@ class LookupHelper:
clublog_data = self.get_clublog_api_data_for_callsign(call)
if clublog_data and "CQZ" in clublog_data:
cqz = clublog_data["CQZ"]
# Couldn't get anything from Clublog database, try QRZCQ data
# Couldn't get anything from Clublog database, try DXCC data
if not cqz:
qrzcq_data = self.get_qrzcq_data_for_callsign(call)
if qrzcq_data and "cqz" in qrzcq_data:
cqz = qrzcq_data["cqz"]
dxcc_data = self.get_dxcc_data_for_callsign(call)
# Some DXCCs are in multiple zones, if so don't use the zone data as we can't be sure
if dxcc_data and "cq" in dxcc_data and len(dxcc_data["cq"]) == 1:
cqz = dxcc_data["cq"][0]
return cqz
# Infer an ITU zone from a callsign
@@ -256,13 +323,105 @@ class LookupHelper:
qrz_data = self.get_qrz_data_for_callsign(call)
if qrz_data and "ituz" in qrz_data:
ituz = qrz_data["ituz"]
# Couldn't get anything from QRZ.com database, Clublog doesn't provide this, so try QRZCQ data
# Couldn't get anything from QRZ.com database, try HamQTH
if not ituz:
qrzcq_data = self.get_qrzcq_data_for_callsign(call)
if qrzcq_data and "ituz" in qrzcq_data:
ituz = qrzcq_data["ituz"]
hamqth_data = self.get_hamqth_data_for_callsign(call)
if hamqth_data and "itu" in hamqth_data:
ituz = hamqth_data["itu"]
# Couldn't get anything from HamQTH database, Clublog doesn't provide this, so try DXCC data
if not ituz:
dxcc_data = self.get_dxcc_data_for_callsign(call)
# Some DXCCs are in multiple zones, if so don't use the zone data as we can't be sure
if dxcc_data and "itu" in dxcc_data and len(dxcc_data["itu"]) == 1:
ituz = dxcc_data["itu"][0]
return ituz
# Get an emoji flag for a given DXCC entity ID
def get_flag_for_dxcc(self, dxcc):
return self.DXCC_DATA[dxcc]["flag"] if dxcc in self.DXCC_DATA else None
# Infer an operator name from a callsign (requires QRZ.com/HamQTH)
def infer_name_from_callsign_online_lookup(self, call):
data = self.get_qrz_data_for_callsign(call)
if data and "fname" in data:
name = data["fname"]
if "name" in data:
name = name + " " + data["name"]
return name
data = self.get_hamqth_data_for_callsign(call)
if data and "nick" in data:
return data["nick"]
else:
return None
# Infer a latitude and longitude from a callsign (requires QRZ.com/HamQTH)
# Coordinates that look default are rejected (apologies if your position really is 0,0, enjoy your voyage)
def infer_latlon_from_callsign_online_lookup(self, call):
data = self.get_qrz_data_for_callsign(call)
if data and "latitude" in data and "longitude" in data and (data["latitude"] != 0 or data["longitude"] != 0):
return [data["latitude"], data["longitude"]]
data = self.get_hamqth_data_for_callsign(call)
if data and "latitude" in data and "longitude" in data and (data["latitude"] != 0 or data["longitude"] != 0):
return [data["latitude"], data["longitude"]]
else:
return None
# Infer a grid locator from a callsign (requires QRZ.com/HamQTH).
# Grids that look default are rejected (apologies if your grid really is AA00aa, enjoy your research)
def infer_grid_from_callsign_online_lookup(self, call):
data = self.get_qrz_data_for_callsign(call)
if data and "locator" in data and data["locator"].upper() != "AA00" and data["locator"].upper() != "AA00AA" and data["locator"].upper() != "AA00AA00":
return data["locator"]
data = self.get_hamqth_data_for_callsign(call)
if data and "grid" in data and data["grid"].upper() != "AA00" and data["grid"].upper() != "AA00AA" and data["grid"].upper() != "AA00AA00":
return data["grid"]
else:
return None
# Infer a textual QTH from a callsign (requires QRZ.com/HamQTH)
def infer_qth_from_callsign_online_lookup(self, call):
data = self.get_qrz_data_for_callsign(call)
if data and "addr2" in data:
return data["addr2"]
data = self.get_hamqth_data_for_callsign(call)
if data and "qth" in data:
return data["qth"]
else:
return None
# Infer a latitude and longitude from a callsign (using DXCC, probably very inaccurate)
def infer_latlon_from_callsign_dxcc(self, call):
try:
data = self.CALL_INFO_BASIC.get_lat_long(call)
if data and "latitude" in data and "longitude" in data:
loc = [data["latitude"], data["longitude"]]
else:
loc = None
except KeyError:
loc = None
# Couldn't get anything from basic call info database, try Clublog data
if not loc:
data = self.get_clublog_xml_data_for_callsign(call)
if data and "Lat" in data and "Lon" in data:
loc = [data["Lat"], data["Lon"]]
if not loc:
data = self.get_clublog_api_data_for_callsign(call)
if data and "Lat" in data and "Lon" in data:
loc = [data["Lat"], data["Lon"]]
return loc
# Infer a grid locator from a callsign (using DXCC, probably very inaccurate)
def infer_grid_from_callsign_dxcc(self, call):
latlon = self.infer_latlon_from_callsign_dxcc(call)
return latlong_to_locator(latlon[0], latlon[1], 8)
# Infer a mode from the frequency (in Hz) according to the band plan. Just a guess really.
def infer_mode_from_frequency(self, freq):
try:
return freq_to_band(freq / 1000.0)["mode"]
except KeyError:
return None
# Utility method to get QRZ.com data from cache if possible, if not get it from the API and cache it
def get_qrz_data_for_callsign(self, call):
# Fetch from cache if we can, otherwise fetch from the API and cache it
@@ -286,6 +445,49 @@ class LookupHelper:
else:
return None
# Utility method to get HamQTH data from cache if possible, if not get it from the API and cache it
def get_hamqth_data_for_callsign(self, call):
# Fetch from cache if we can, otherwise fetch from the API and cache it
if call in self.HAMQTH_CALLSIGN_DATA_CACHE:
return self.HAMQTH_CALLSIGN_DATA_CACHE.get(call)
elif self.HAMQTH_AVAILABLE:
try:
# First we need to log in and get a session token.
session_data = self.HAMQTH_SESSION_LOOKUP_CACHE.get(
self.HAMQTH_BASE_URL + "?u=" + urllib.parse.quote_plus(config["hamqth-username"]) +
"&p=" + urllib.parse.quote_plus(config["hamqth-password"]), headers=HTTP_HEADERS).content
dict_data = xmltodict.parse(session_data)
if "session_id" in dict_data["HamQTH"]["session"]:
session_id = dict_data["HamQTH"]["session"]["session_id"]
# Now look up the actual data.
try:
lookup_data = SEMI_STATIC_URL_DATA_CACHE.get(
self.HAMQTH_BASE_URL + "?id=" + session_id + "&callsign=" + urllib.parse.quote_plus(
call) + "&prg=" + HAMQTH_PRG, headers=HTTP_HEADERS).content
data = xmltodict.parse(lookup_data)["HamQTH"]["search"]
self.HAMQTH_CALLSIGN_DATA_CACHE.add(call, data, expire=604800) # 1 week in seconds
return data
except (KeyError, ValueError):
# HamQTH had no info for the call, but maybe it had prefixes or suffixes. Try again with the base call.
try:
lookup_data = SEMI_STATIC_URL_DATA_CACHE.get(
self.HAMQTH_BASE_URL + "?id=" + session_id + "&callsign=" + urllib.parse.quote_plus(
callinfo.Callinfo.get_homecall(call)) + "&prg=" + HAMQTH_PRG, headers=HTTP_HEADERS).content
data = xmltodict.parse(lookup_data)["HamQTH"]["search"]
self.HAMQTH_CALLSIGN_DATA_CACHE.add(call, data, expire=604800) # 1 week in seconds
return data
except (KeyError, ValueError):
# HamQTH had no info for the call, that's OK. Cache a None so we don't try to look this up again
self.HAMQTH_CALLSIGN_DATA_CACHE.add(call, None, expire=604800) # 1 week in seconds
return None
else:
logging.warn("HamQTH login details incorrect, failed to look up with HamQTH.")
except:
logging.error("Exception when looking up HamQTH data")
return None
# Utility method to get Clublog API data from cache if possible, if not get it from the API and cache it
def get_clublog_api_data_for_callsign(self, call):
# Fetch from cache if we can, otherwise fetch from the API and cache it
@@ -326,78 +528,18 @@ class LookupHelper:
else:
return None
# Utility method to get QRZCQ data from our constants table, if we can find it
def get_qrzcq_data_for_callsign(self, call):
# Iterate in reverse order - see comments on the data structure itself
for entry in reversed(QRZCQ_CALLSIGN_LOOKUP_DATA):
if call.startswith(entry["prefix"]):
# Utility method to get generic DXCC data from our lookup table, if we can find it
def get_dxcc_data_for_callsign(self, call):
for entry in self.DXCC_DATA.values():
if re.match(entry["prefixRegex"], call):
return entry
return None
# Infer an operator name from a callsign (requires QRZ.com)
def infer_name_from_callsign(self, call):
data = self.get_qrz_data_for_callsign(call)
if data and "fname" in data:
name = data["fname"]
if "name" in data:
name = name + " " + data["name"]
return name
else:
return None
# Infer a latitude and longitude from a callsign (requires QRZ.com)
def infer_latlon_from_callsign_qrz(self, call):
data = self.get_qrz_data_for_callsign(call)
if data and "latitude" in data and "longitude" in data:
return [data["latitude"], data["longitude"]]
else:
return None
# Infer a grid locator from a callsign (requires QRZ.com)
def infer_grid_from_callsign_qrz(self, call):
data = self.get_qrz_data_for_callsign(call)
if data and "locator" in data:
return data["locator"]
else:
return None
# Infer a latitude and longitude from a callsign (using DXCC, probably very inaccurate)
def infer_latlon_from_callsign_dxcc(self, call):
try:
data = self.CALL_INFO_BASIC.get_lat_long(call)
if data and "latitude" in data and "longitude" in data:
loc = [data["latitude"], data["longitude"]]
else:
loc = None
except KeyError:
loc = None
# Couldn't get anything from basic call info database, try Clublog data
if not loc:
data = self.get_clublog_xml_data_for_callsign(call)
if data and "Lat" in data and "Lon" in data:
loc = [data["Lat"], data["Lon"]]
if not loc:
data = self.get_clublog_api_data_for_callsign(call)
if data and "Lat" in data and "Lon" in data:
loc = [data["Lat"], data["Lon"]]
return loc
# Infer a grid locator from a callsign (using DXCC, probably very inaccurate)
def infer_grid_from_callsign_dxcc(self, call):
latlon = self.infer_latlon_from_callsign_dxcc(call)
return latlong_to_locator(latlon[0], latlon[1], 8)
# Infer a mode from the frequency (in Hz) according to the band plan. Just a guess really.
def infer_mode_from_frequency(self, freq):
try:
return freq_to_band(freq / 1000.0)["mode"]
except KeyError:
return None
# Shutdown method to close down any caches neatly.
def stop(self):
self.QRZ_CALLSIGN_DATA_CACHE.close()
self.CLUBLOG_CALLSIGN_DATA_CACHE.close()
# Singleton object
lookup_helper = LookupHelper()
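
Each infer_* method above walks the same ordered chain of providers (country-files.com, then QRZ.com, then HamQTH, then Clublog XML/API, then the K0SWE DXCC table), only consulting the next source when the previous one returned nothing. A condensed sketch of that pattern; the helper name and the (function, key) pairs are illustrative, not part of the real class:

# Generic fallback-chain sketch: try each (lookup_function, field_key) pair in
# order and stop at the first one that yields a value.
def lookup_with_fallbacks(call, providers):
    value = None
    for lookup, key in providers:
        if value:
            break
        data = lookup(call)
        if data and key in data:
            value = data[key]
    return value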

View File

@@ -1,4 +1,13 @@
from core.constants import SIGS
import csv
import logging
from pyhamtools.locator import latlong_to_locator
from core.cache_utils import SEMI_STATIC_URL_DATA_CACHE
from core.constants import SIGS, HTTP_HEADERS
from core.geo_utils import wab_wai_square_to_lat_lon
from data.sig_ref import SIGRef
# Utility function to get the icon for a named SIG. If no match is found, the "circle-question" icon will be returned.
def get_icon_for_sig(sig):
@@ -7,13 +16,125 @@ def get_icon_for_sig(sig):
return s.icon
return "circle-question"
# Utility function to get the regex string for a SIG reference for a named SIG. If no match is found, None will be returned.
def get_ref_regex_for_sig(sig):
for s in SIGS:
if s.name == sig:
if s.name.upper() == sig.upper():
return s.ref_regex
return None
# Look up details of a SIG reference (e.g. POTA park) such as name, lat/lon, and grid.
# Note there is currently no support for KRMNPA location lookup, see issue #61.
def get_sig_ref_info(sig, sig_ref_id):
sig_ref = SIGRef(id=sig_ref_id, sig=sig)
try:
if sig.upper() == "POTA":
data = SEMI_STATIC_URL_DATA_CACHE.get("https://api.pota.app/park/" + sig_ref_id, headers=HTTP_HEADERS).json()
if data:
fullname = data["name"] if "name" in data else None
if fullname and "parktypeDesc" in data and data["parktypeDesc"] != "":
fullname = fullname + " " + data["parktypeDesc"]
sig_ref.name = fullname
sig_ref.url = "https://pota.app/#/park/" + sig_ref_id
sig_ref.grid = data["grid6"] if "grid6" in data else None
sig_ref.latitude = data["latitude"] if "latitude" in data else None
sig_ref.longitude = data["longitude"] if "longitude" in data else None
elif sig.upper() == "SOTA":
data = SEMI_STATIC_URL_DATA_CACHE.get("https://api-db2.sota.org.uk/api/summits/" + sig_ref_id,
headers=HTTP_HEADERS).json()
if data:
sig_ref.name = data["name"] if "name" in data else None
sig_ref.url = "https://www.sotadata.org.uk/en/summit/" + sig_ref_id
sig_ref.grid = data["locator"] if "locator" in data else None
sig_ref.latitude = data["latitude"] if "latitude" in data else None
sig_ref.longitude = data["longitude"] if "longitude" in data else None
elif sig.upper() == "WWBOTA":
data = SEMI_STATIC_URL_DATA_CACHE.get("https://api.wwbota.org/bunkers/" + sig_ref_id,
headers=HTTP_HEADERS).json()
if data:
sig_ref.name = data["name"] if "name" in data else None
sig_ref.url = "https://bunkerwiki.org/?s=" + sig_ref_id if sig_ref_id.startswith("B/G") else None
sig_ref.grid = data["locator"] if "locator" in data else None
sig_ref.latitude = data["lat"] if "lat" in data else None
sig_ref.longitude = data["long"] if "long" in data else None
elif sig.upper() == "GMA" or sig.upper() == "ARLHS" or sig.upper() == "ILLW" or sig.upper() == "WCA" or sig.upper() == "MOTA" or sig.upper() == "IOTA":
data = SEMI_STATIC_URL_DATA_CACHE.get("https://www.cqgma.org/api/ref/?" + sig_ref_id,
headers=HTTP_HEADERS).json()
if data:
sig_ref.name = data["name"] if "name" in data else None
sig_ref.url = "https://www.cqgma.org/zinfo.php?ref=" + sig_ref_id
sig_ref.grid = data["locator"] if "locator" in data else None
sig_ref.latitude = data["latitude"] if "latitude" in data else None
sig_ref.longitude = data["longitude"] if "longitude" in data else None
elif sig.upper() == "WWFF":
wwff_csv_data = SEMI_STATIC_URL_DATA_CACHE.get("https://wwff.co/wwff-data/wwff_directory.csv",
headers=HTTP_HEADERS)
wwff_dr = csv.DictReader(wwff_csv_data.content.decode().splitlines())
for row in wwff_dr:
if row["reference"] == sig_ref_id:
sig_ref.name = row["name"] if "name" in row else None
sig_ref.url = "https://wwff.co/directory/?showRef=" + sig_ref_id
sig_ref.grid = row["iaruLocator"] if "iaruLocator" in row else None
sig_ref.latitude = float(row["latitude"]) if "latitude" in row else None
sig_ref.longitude = float(row["longitude"]) if "longitude" in row else None
break
elif sig.upper() == "SIOTA":
siota_csv_data = SEMI_STATIC_URL_DATA_CACHE.get("https://www.silosontheair.com/data/silos.csv",
headers=HTTP_HEADERS)
siota_dr = csv.DictReader(siota_csv_data.content.decode().splitlines())
for row in siota_dr:
if row["SILO_CODE"] == sig_ref_id:
sig_ref.name = row["NAME"] if "NAME" in row else None
sig_ref.grid = row["LOCATOR"] if "LOCATOR" in row else None
sig_ref.latitude = float(row["LAT"]) if "LAT" in row else None
sig_ref.longitude = float(row["LNG"]) if "LNG" in row else None
break
elif sig.upper() == "WOTA":
data = SEMI_STATIC_URL_DATA_CACHE.get("https://www.wota.org.uk/mapping/data/summits.json",
headers=HTTP_HEADERS).json()
if data:
for feature in data["features"]:
if feature["properties"]["wotaId"] == sig_ref_id:
sig_ref.name = feature["properties"]["title"]
# Fudge WOTA URLs. Outlying fell (LDO) URLs don't match their ID numbers but require 214 to be
# added to them
sig_ref.url = "https://www.wota.org.uk/MM_" + sig_ref_id
if sig_ref_id.upper().startswith("LDO-"):
number = int(sig_ref_id.upper().replace("LDO-", ""))
sig_ref.url = "https://www.wota.org.uk/MM_LDO-" + str(number + 214)
sig_ref.grid = feature["properties"]["qthLocator"]
sig_ref.latitude = feature["geometry"]["coordinates"][1]
sig_ref.longitude = feature["geometry"]["coordinates"][0]
break
elif sig.upper() == "ZLOTA":
data = SEMI_STATIC_URL_DATA_CACHE.get("https://ontheair.nz/assets/assets.json", headers=HTTP_HEADERS).json()
if data:
for asset in data:
if asset["code"] == sig_ref_id:
sig_ref.name = asset["name"]
sig_ref.url = "https://ontheair.nz/assets/ZLI_OT-030" + sig_ref_id.replace("/", "_")
sig_ref.grid = latlong_to_locator(asset["y"], asset["x"], 6)
sig_ref.latitude = asset["y"]
sig_ref.longitude = asset["x"]
break
elif sig.upper() == "BOTA":
if not sig_ref.name:
sig_ref.name = sig_ref.id
sig_ref.url = "https://www.beachesontheair.com/beaches/" + sig_ref.name.lower().replace(" ", "-")
elif sig.upper() == "WAB" or sig.upper() == "WAI":
ll = wab_wai_square_to_lat_lon(sig_ref_id)
if ll:
sig_ref.name = sig_ref_id
sig_ref.grid = latlong_to_locator(ll[0], ll[1], 6)
sig_ref.latitude = ll[0]
sig_ref.longitude = ll[1]
except:
logging.warn("Failed to look up sig_ref info for " + sig + " ref " + sig_ref_id + ".")
return sig_ref
# Regex matching any SIG
ANY_SIG_REGEX = r"(" + r"|".join(list(map(lambda p: p.name, SIGS))) + r")"

View File

@@ -1,15 +1,13 @@
import copy
import hashlib
import json
import re
from dataclasses import dataclass
from datetime import datetime, timedelta
import pytz
from core.constants import DXCC_FLAGS
from core.lookup_helper import lookup_helper
from core.sig_utils import get_icon_for_sig
from core.sig_utils import get_icon_for_sig, get_sig_ref_info
# Data class that defines an alert.
@@ -96,8 +94,23 @@ class Alert:
self.dx_itu_zone = lookup_helper.infer_itu_zone_from_callsign(self.dx_calls[0])
if self.dx_calls and self.dx_calls[0] and not self.dx_dxcc_id:
self.dx_dxcc_id = lookup_helper.infer_dxcc_id_from_callsign(self.dx_calls[0])
if self.dx_dxcc_id and self.dx_dxcc_id in DXCC_FLAGS and not self.dx_flag:
self.dx_flag = DXCC_FLAGS[self.dx_dxcc_id]
if self.dx_dxcc_id and not self.dx_flag:
self.dx_flag = lookup_helper.get_flag_for_dxcc(self.dx_dxcc_id)
# Fetch SIG data. In case a particular API doesn't provide a full set of name, lat, lon & grid for a reference
# in its initial call, we use this code to populate the rest of the data. This includes working out grid refs
# from WAB and WAI, which count as a SIG even though there's no real lookup, just maths
if self.sig_refs and len(self.sig_refs) > 0:
for sig_ref in self.sig_refs:
lookup_data = get_sig_ref_info(sig_ref.sig, sig_ref.id)
if lookup_data:
# Update the sig_ref data from the lookup
sig_ref.__dict__.update(lookup_data.__dict__)
# If the spot itself doesn't have a SIG yet, but we have at least one SIG reference, take that reference's SIG
# and apply it to the whole spot.
if self.sig_refs and len(self.sig_refs) > 0 and not self.sig:
self.sig = self.sig_refs[0].sig
# Icon from SIG
if self.sig and not self.icon:
@@ -107,7 +120,7 @@ class Alert:
# the actual alerting service, e.g. we don't want to accidentally use a user's QRZ.com home lat/lon instead of
# the one from the park reference they're at.
if self.dx_calls and not self.dx_names:
self.dx_names = list(map(lambda c: lookup_helper.infer_name_from_callsign(c), self.dx_calls))
self.dx_names = list(map(lambda c: lookup_helper.infer_name_from_callsign_online_lookup(c), self.dx_calls))
# Always create an ID based on a hash of every parameter *except* received_time. This is used as the index
# to a map, which as a byproduct avoids us having multiple duplicate copies of the object that are identical

View File

@@ -11,4 +11,4 @@ class SIG:
# and Field Spotter. Does not include the "fa-" prefix.
icon: str
# Regex matcher for references, e.g. for POTA r"[A-Z]{2}\-\d+".
ref_regex: str
ref_regex: str = None

View File

@@ -6,7 +6,15 @@ from dataclasses import dataclass
class SIGRef:
# Reference ID, e.g. "GB-0001".
id: str
# SIG that this reference is in, e.g. "POTA".
sig: str
# Name of the reference, e.g. "Null Country Park", if known.
name: str = None
# URL to look up more information about the reference, if known.
url: str = None
# Latitude of the reference, if known.
latitude: float = None
# Longitude of the reference, if known.
longitude: float = None
# Maidenhead grid reference of the reference, if known.
grid: str = None

View File

@@ -9,10 +9,9 @@ from datetime import datetime
import pytz
from pyhamtools.locator import locator_to_latlong, latlong_to_locator
from core.constants import DXCC_FLAGS
from core.geo_utils import wab_wai_square_to_lat_lon
from core.lookup_helper import lookup_helper
from core.sig_utils import get_icon_for_sig
from core.sig_utils import get_icon_for_sig, get_sig_ref_info, ANY_SIG_REGEX, get_ref_regex_for_sig
from data.sig_ref import SIGRef
# Data class that defines a spot.
@@ -27,6 +26,9 @@ class Spot:
dx_call: str = None
# Name of the operator that has been spotted
dx_name: str = None
# QTH of the operator that has been spotted. This could be from any SIG refs or could be from online lookup of their
# home QTH.
dx_qth: str = None
# Country of the DX operator
dx_country: str = None
# Country flag of the DX operator
@@ -35,8 +37,6 @@ class Spot:
dx_continent: str = None
# DXCC ID of the DX operator
dx_dxcc_id: int = None
# DXCC ID of the spotter
de_dxcc_id: int = None
# CQ zone of the DX operator
dx_cq_zone: int = None
# ITU zone of the DX operator
@@ -50,11 +50,12 @@ class Spot:
# lookup
dx_latitude: float = None
dx_longitude: float = None
# DX Location source. Indicates how accurate the location might be. Values: "SPOT", "WAB/WAI GRID", "QRZ", "DXCC", "NONE"
# DX Location source. Indicates how accurate the location might be. Values: "SPOT", "SIG REF LOOKUP",
# "WAB/WAI GRID", "HOME QTH", "DXCC", "NONE"
dx_location_source: str = "NONE"
# DX Location good. Indicates that the software thinks the location data is good enough to plot on a map. This is
# true if the location source is "SPOT" or "WAB/WAI GRID", or if the location source is "QRZ" and the DX callsign
# doesn't have a suffix like /P.
# true if the location source is "SPOT", "SIG REF LOOKUP" or "WAB/WAI GRID", or if the location source is
# "HOME QTH" and the DX callsign doesn't have a suffix like /P.
dx_location_good: bool = False
# DE (Spotter) info
@@ -67,6 +68,8 @@ class Spot:
de_flag: str = None
# Continent of the spotter
de_continent: str = None
# DXCC ID of the spotter
de_dxcc_id: int = None
# If this is an APRS/Packet/etc spot, what SSID was the spotter/receiver using?
de_ssid: str = None
# Maidenhead grid locator for the spotter. This is not going to be from a xOTA reference so it will likely just be
@@ -170,8 +173,8 @@ class Spot:
self.dx_itu_zone = lookup_helper.infer_itu_zone_from_callsign(self.dx_call)
if self.dx_call and not self.dx_dxcc_id:
self.dx_dxcc_id = lookup_helper.infer_dxcc_id_from_callsign(self.dx_call)
if self.dx_dxcc_id and self.dx_dxcc_id in DXCC_FLAGS and not self.dx_flag:
self.dx_flag = DXCC_FLAGS[self.dx_dxcc_id]
if self.dx_dxcc_id and not self.dx_flag:
self.dx_flag = lookup_helper.get_flag_for_dxcc(self.dx_dxcc_id)
# Clean up spotter call if it has an SSID or -# from RBN
if self.de_call and "-" in self.de_call:
@@ -196,15 +199,15 @@ class Spot:
# Spotter country, continent, zones etc. from callsign.
# DE call with no digits, or APRS servers starting "T2" are not things we can look up location for
if any(char.isdigit() for char in self.de_call) and not (self.de_call.startswith("T2") and self.source == "APRS-IS"):
if self.de_call and not self.de_country:
if self.de_call and any(char.isdigit() for char in self.de_call) and not (self.de_call.startswith("T2") and self.source == "APRS-IS"):
if not self.de_country:
self.de_country = lookup_helper.infer_country_from_callsign(self.de_call)
if self.de_call and not self.de_continent:
if not self.de_continent:
self.de_continent = lookup_helper.infer_continent_from_callsign(self.de_call)
if self.de_call and not self.de_dxcc_id:
if not self.de_dxcc_id:
self.de_dxcc_id = lookup_helper.infer_dxcc_id_from_callsign(self.de_call)
if self.de_dxcc_id and self.de_dxcc_id in DXCC_FLAGS and not self.de_flag:
self.de_flag = DXCC_FLAGS[self.de_dxcc_id]
if self.de_dxcc_id and not self.de_flag:
self.de_flag = lookup_helper.get_flag_for_dxcc(self.de_dxcc_id)
# Band from frequency
if self.freq and not self.band:
@@ -232,11 +235,72 @@ class Spot:
if self.mode and not self.mode_type:
self.mode_type = lookup_helper.infer_mode_type_from_mode(self.mode)
# Icon from SIG
if self.sig and not self.icon:
# If we have a latitude at this point, it can only have been provided by the spot itself
if self.dx_latitude:
self.dx_location_source = "SPOT"
# Set the top-level "SIG" if it is missing but we have at least one SIG ref.
if not self.sig and self.sig_refs and len(self.sig_refs) > 0:
self.sig = self.sig_refs[0].sig.upper()
# See if we already have a SIG reference, but the comment looks like it contains more for the same SIG. This
# should catch e.g. POTA comments like "2-fer: GB-0001 GB-0002".
if self.comment and self.sig_refs and len(self.sig_refs) > 0:
sig = self.sig_refs[0].sig.upper()
all_comment_ref_matches = re.finditer(r"(^|\W)(" + get_ref_regex_for_sig(sig) + r")($|\W)", self.comment, re.IGNORECASE)
for ref_match in all_comment_ref_matches:
self.append_sig_ref_if_missing(SIGRef(id=ref_match.group(2).upper(), sig=sig))
# See if the comment looks like it contains any SIGs (and optionally SIG references) that we can
# add to the spot. This should catch cluster spot comments like "POTA GB-0001 WWFF GFF-0001" and e.g. POTA
# comments like "also WWFF GFF-0001".
if self.comment:
sig_matches = re.finditer(r"(^|\W)" + ANY_SIG_REGEX + r"($|\W)", self.comment, re.IGNORECASE)
for sig_match in sig_matches:
# First of all, if we haven't got a SIG for this spot set yet, now we have. This covers things like cluster
# spots where the comment is just "POTA".
found_sig = sig_match.group(2).upper()
if not self.sig:
self.sig = found_sig
# Now look to see if that SIG name was followed by something that looks like a reference ID for that SIG.
# If so, add that to the sig_refs list for this spot.
ref_regex = get_ref_regex_for_sig(found_sig)
if ref_regex:
ref_matches = re.finditer(r"(^|\W)" + found_sig + r"($|\W)(" + ref_regex + r")($|\W)", self.comment, re.IGNORECASE)
for ref_match in ref_matches:
self.append_sig_ref_if_missing(SIGRef(id=ref_match.group(3).upper(), sig=found_sig))
# Fetch SIG data. In case a particular API doesn't provide a full set of name, lat, lon & grid for a reference
# in its initial call, we use this code to populate the rest of the data. This includes working out grid refs
# from WAB and WAI, which count as a SIG even though there's no real lookup, just maths
if self.sig_refs and len(self.sig_refs) > 0:
for sig_ref in self.sig_refs:
lookup_data = get_sig_ref_info(sig_ref.sig, sig_ref.id)
if lookup_data:
# Update the sig_ref data from the lookup
sig_ref.__dict__.update(lookup_data.__dict__)
# If the spot itself doesn't have location yet, but the SIG ref does, extract it
if lookup_data.grid and not self.dx_grid:
self.dx_grid = lookup_data.grid
if lookup_data.latitude and not self.dx_latitude:
self.dx_latitude = lookup_data.latitude
self.dx_longitude = lookup_data.longitude
if self.sig == "WAB" or self.sig == "WAI":
self.dx_location_source = "WAB/WAI GRID"
else:
self.dx_location_source = "SIG REF LOOKUP"
# If the spot itself doesn't have a SIG yet, but we have at least one SIG reference, take that reference's SIG
# and apply it to the whole spot.
if self.sig_refs and len(self.sig_refs) > 0 and not self.sig:
self.sig = self.sig_refs[0].sig
# Icon from SIG if we have one
if self.sig:
self.icon = get_icon_for_sig(self.sig)
# DX Grid to lat/lon and vice versa
# DX Grid to lat/lon and vice versa in case one is missing
if self.dx_grid and not self.dx_latitude:
ll = locator_to_latlong(self.dx_grid)
self.dx_latitude = ll[0]
@@ -246,21 +310,6 @@ class Spot:
self.dx_grid = latlong_to_locator(self.dx_latitude, self.dx_longitude, 8)
except:
logging.debug("Invalid lat/lon received for spot")
if self.dx_latitude:
self.dx_location_source = "SPOT"
# WAB/WAI grid to lat/lon
if not self.dx_latitude and self.sig and self.sig_refs and len(self.sig_refs) > 0 and (
self.sig == "WAB" or self.sig == "WAI"):
ll = wab_wai_square_to_lat_lon(self.sig_refs[0])
if ll:
self.dx_latitude = ll[0]
self.dx_longitude = ll[1]
try:
self.dx_grid = latlong_to_locator(self.dx_latitude, self.dx_longitude, 8)
except:
logging.debug("Invalid lat/lon received from WAB/WAI grid")
self.dx_location_source = "WAB/WAI GRID"
# QRT comment detection
if self.comment and not self.qrt:
@@ -270,14 +319,23 @@ class Spot:
# the actual spotting service, e.g. we don't want to accidentally use a user's QRZ.com home lat/lon instead of
# the one from the park reference they're at.
if self.dx_call and not self.dx_name:
self.dx_name = lookup_helper.infer_name_from_callsign(self.dx_call)
self.dx_name = lookup_helper.infer_name_from_callsign_online_lookup(self.dx_call)
if self.dx_call and not self.dx_latitude:
latlon = lookup_helper.infer_latlon_from_callsign_qrz(self.dx_call)
latlon = lookup_helper.infer_latlon_from_callsign_online_lookup(self.dx_call)
if latlon:
self.dx_latitude = latlon[0]
self.dx_longitude = latlon[1]
self.dx_grid = lookup_helper.infer_grid_from_callsign_qrz(self.dx_call)
self.dx_location_source = "QRZ"
self.dx_grid = lookup_helper.infer_grid_from_callsign_online_lookup(self.dx_call)
self.dx_location_source = "HOME QTH"
# Determine a "QTH" string. If we have a SIG ref, pick the first one and turn it into a suitable stirng,
# otherwise see what they have set on an online lookup service.
if self.sig_refs and len(self.sig_refs) > 0:
self.dx_qth = self.sig_refs[0].id
if self.sig_refs[0].name:
self.dx_qth = self.dx_qth + " " + self.sig_refs[0].name
else:
self.dx_qth = lookup_helper.infer_qth_from_callsign_online_lookup(self.dx_call)
# Last resort for getting a DX position, use the DXCC entity.
if self.dx_call and not self.dx_latitude:
@@ -290,21 +348,22 @@ class Spot:
# DX Location is "good" if it is from a spot, or from QRZ if the callsign doesn't contain a slash, so the operator
# is likely at home.
self.dx_location_good = self.dx_location_source == "SPOT" or self.dx_location_source == "WAB/WAI GRID" or (
self.dx_location_source == "QRZ" and not "/" in self.dx_call)
self.dx_location_good = (self.dx_location_source == "SPOT" or self.dx_location_source == "SIG REF LOOKUP"
or self.dx_location_source == "WAB/WAI GRID"
or (self.dx_location_source == "HOME QTH" and not "/" in self.dx_call))
# DE with no digits and APRS servers starting "T2" are not things we can look up location for
if any(char.isdigit() for char in self.de_call) and not (self.de_call.startswith("T2") and self.source == "APRS-IS"):
if self.de_call and any(char.isdigit() for char in self.de_call) and not (self.de_call.startswith("T2") and self.source == "APRS-IS"):
# DE operator position lookup, using an online lookup service.
if self.de_call and not self.de_latitude:
latlon = lookup_helper.infer_latlon_from_callsign_qrz(self.de_call)
if not self.de_latitude:
latlon = lookup_helper.infer_latlon_from_callsign_online_lookup(self.de_call)
if latlon:
self.de_latitude = latlon[0]
self.de_longitude = latlon[1]
self.de_grid = lookup_helper.infer_grid_from_callsign_qrz(self.de_call)
self.de_grid = lookup_helper.infer_grid_from_callsign_online_lookup(self.de_call)
# Last resort for getting a DE position, use the DXCC entity.
if self.de_call and not self.de_latitude:
if not self.de_latitude:
latlon = lookup_helper.infer_latlon_from_callsign_dxcc(self.de_call)
if latlon:
self.de_latitude = latlon[0]
@@ -324,3 +383,16 @@ class Spot:
# JSON serialise
def to_json(self):
return json.dumps(self, default=lambda o: o.__dict__, sort_keys=True)
# Append a sig_ref to the list, so long as it's not already there.
def append_sig_ref_if_missing(self, new_sig_ref):
if not self.sig_refs:
self.sig_refs = []
new_sig_ref.id = new_sig_ref.id.strip().upper()
new_sig_ref.sig = new_sig_ref.sig.strip().upper()
if new_sig_ref.id == "":
return
for sig_ref in self.sig_refs:
if sig_ref.id == new_sig_ref.id and sig_ref.sig == new_sig_ref.sig:
return
self.sig_refs.append(new_sig_ref)
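# For example (illustrative values), spot.append_sig_ref_if_missing(SIGRef(id="gb-0001 ", sig="pota")) normalises
# the ID and SIG to "GB-0001"/"POTA" and only appends if no identical reference is already in the list.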

View File

@@ -12,3 +12,4 @@ requests-sse~=0.5.2
rss-parser~=2.1.1
pyproj~=3.7.2
prometheus_client~=0.23.1
beautifulsoup4~=4.14.2

View File

@@ -1,16 +1,19 @@
import json
import logging
import re
from datetime import datetime, timedelta
from threading import Thread
import bottle
import pytz
from bottle import run, request, response, template
from prometheus_client import CONTENT_TYPE_LATEST, generate_latest
from core.config import MAX_SPOT_AGE, ALLOW_SPOTTING
from core.constants import BANDS, ALL_MODES, MODE_TYPES, SIGS, CONTINENTS, SOFTWARE_VERSION
from core.prometheus_metrics_handler import page_requests_counter, registry, get_metrics, api_requests_counter
from core.config import MAX_SPOT_AGE, ALLOW_SPOTTING, WEB_UI_OPTIONS
from core.constants import BANDS, ALL_MODES, MODE_TYPES, SIGS, CONTINENTS, SOFTWARE_VERSION, UNKNOWN_BAND
from core.lookup_helper import lookup_helper
from core.prometheus_metrics_handler import page_requests_counter, get_metrics, api_requests_counter
from core.sig_utils import get_ref_regex_for_sig, get_sig_ref_info
from data.sig_ref import SIGRef
from data.spot import Spot
@@ -33,18 +36,22 @@ class WebServer:
# Base template data
bottle.BaseTemplate.defaults['software_version'] = SOFTWARE_VERSION
bottle.BaseTemplate.defaults['allow_spotting'] = ALLOW_SPOTTING
# Routes for API calls
bottle.get("/api/v1/spots")(lambda: self.serve_spots_api())
bottle.get("/api/v1/alerts")(lambda: self.serve_alerts_api())
bottle.get("/api/v1/options")(lambda: self.serve_api(self.get_options()))
bottle.get("/api/v1/status")(lambda: self.serve_api(self.status_data))
bottle.get("/api/v1/lookup/call")(lambda: self.serve_call_lookup_api())
bottle.get("/api/v1/lookup/sigref")(lambda: self.serve_sig_ref_lookup_api())
bottle.post("/api/v1/spot")(lambda: self.accept_spot())
# Routes for templated pages
bottle.get("/")(lambda: self.serve_template('webpage_spots'))
bottle.get("/map")(lambda: self.serve_template('webpage_map'))
bottle.get("/bands")(lambda: self.serve_template('webpage_bands'))
bottle.get("/alerts")(lambda: self.serve_template('webpage_alerts'))
bottle.get("/add-spot")(lambda: self.serve_template('webpage_add_spot'))
bottle.get("/status")(lambda: self.serve_template('webpage_status'))
bottle.get("/about")(lambda: self.serve_template('webpage_about'))
bottle.get("/apidocs")(lambda: self.serve_template('webpage_apidocs'))
@@ -95,6 +102,83 @@ class WebServer:
response.status = 500
return json.dumps("Error - " + str(e), default=serialize_everything)
# Look up data for a callsign
def serve_call_lookup_api(self):
try:
# Reject if no callsign
query = bottle.request.query
if not "call" in query.keys():
response.content_type = 'application/json'
response.status = 422
return json.dumps("Error - call must be provided", default=serialize_everything)
call = query.get("call").upper()
# Reject badly formatted callsigns
if not re.match(r"^[A-Za-z0-9/\-]*$", call):
response.content_type = 'application/json'
response.status = 422
return json.dumps("Error - '" + call + "' does not look like a valid callsign.",
default=serialize_everything)
# Take the callsign, make a "fake spot" so we can run infer_missing() on it, then repack the resulting data
# in the correct way for the API response.
fake_spot = Spot(dx_call=call)
fake_spot.infer_missing()
return self.serve_api({
"call": call,
"name": fake_spot.dx_name,
"qth": fake_spot.dx_qth,
"country": fake_spot.dx_country,
"flag": fake_spot.dx_flag,
"continent": fake_spot.dx_continent,
"dxcc_id": fake_spot.dx_dxcc_id,
"cq_zone": fake_spot.dx_cq_zone,
"itu_zone": fake_spot.dx_itu_zone,
"grid": fake_spot.dx_grid,
"latitude": fake_spot.dx_latitude,
"longitude": fake_spot.dx_longitude,
"location_source": fake_spot.dx_location_source
})
except Exception as e:
logging.error(e)
response.content_type = 'application/json'
response.status = 500
return json.dumps("Error - " + str(e), default=serialize_everything)
# Look up data for a SIG reference
def serve_sig_ref_lookup_api(self):
try:
# Reject if no sig or sig_ref
query = bottle.request.query
if not "sig" in query.keys() or not "id" in query.keys():
response.content_type = 'application/json'
response.status = 422
return json.dumps("Error - sig and id must be provided", default=serialize_everything)
sig = query.get("sig").upper()
id = query.get("id").upper()
# Reject if sig unknown
if not sig in list(map(lambda p: p.name, SIGS)):
response.content_type = 'application/json'
response.status = 422
return json.dumps("Error - sig '" + sig + "' is not known.", default=serialize_everything)
# Reject if sig_ref format incorrect for sig
if get_ref_regex_for_sig(sig) and not re.match(get_ref_regex_for_sig(sig), id):
response.content_type = 'application/json'
response.status = 422
return json.dumps("Error - '" + id + "' does not look like a valid reference ID for " + sig + ".", default=serialize_everything)
data = get_sig_ref_info(sig, id)
return self.serve_api(data)
except Exception as e:
logging.error(e)
response.content_type = 'application/json'
response.status = 500
return json.dumps("Error - " + str(e), default=serialize_everything)
# Serve a JSON API endpoint
def serve_api(self, data):
self.last_api_access_time = datetime.now(pytz.UTC)
@@ -137,15 +221,54 @@ class WebServer:
json_spot = json.loads(post_data)
spot = Spot(**json_spot)
# Reject if no timestamp or dx_call
if not spot.time or not spot.dx_call:
# Converting to a spot object this way won't have coped with sig_ref objects, so fix that. (Would be nice to
# redo this in a functional style)
if spot.sig_refs:
real_sig_refs = []
for dict_obj in spot.sig_refs:
real_sig_refs.append(json.loads(json.dumps(dict_obj), object_hook=lambda d: SIGRef(**d)))
spot.sig_refs = real_sig_refs
# Reject if no timestamp, frequency, dx_call or de_call
if not spot.time or not spot.dx_call or not spot.freq or not spot.de_call:
response.content_type = 'application/json'
response.status = 422
return json.dumps("Error - 'time' and 'dx_call' must be provided as a minimum.",
return json.dumps("Error - 'time', 'dx_call', 'freq' and 'de_call' must be provided as a minimum.",
default=serialize_everything)
# Reject invalid-looking callsigns
if not re.match(r"^[A-Za-z0-9/\-]*$", spot.dx_call):
response.content_type = 'application/json'
response.status = 422
return json.dumps("Error - '" + spot.dx_call + "' does not look like a valid callsign.",
default=serialize_everything)
if not re.match(r"^[A-Za-z0-9/\-]*$", spot.de_call):
response.content_type = 'application/json'
response.status = 422
return json.dumps("Error - '" + spot.de_call + "' does not look like a valid callsign.",
default=serialize_everything)
# Reject if frequency not in a known band
if lookup_helper.infer_band_from_freq(spot.freq) == UNKNOWN_BAND:
response.content_type = 'application/json'
response.status = 422
return json.dumps("Error - Frequency of " + str(spot.freq / 1000.0) + "kHz is not in a known band.", default=serialize_everything)
# Reject if grid formatting incorrect
if spot.dx_grid and not re.match(r"^([A-R]{2}[0-9]{2}[A-X]{2}[0-9]{2}[A-X]{2}|[A-R]{2}[0-9]{2}[A-X]{2}[0-9]{2}|[A-R]{2}[0-9]{2}[A-X]{2}|[A-R]{2}[0-9]{2})$", spot.dx_grid.upper()):
response.content_type = 'application/json'
response.status = 422
return json.dumps("Error - '" + spot.dx_grid + "' does not look like a valid Maidenhead grid.", default=serialize_everything)
# Reject if sig_ref format incorrect for sig
if spot.sig and spot.sig_refs and len(spot.sig_refs) > 0 and spot.sig_refs[0].id and get_ref_regex_for_sig(spot.sig) and not re.match(get_ref_regex_for_sig(spot.sig), spot.sig_refs[0].id):
response.content_type = 'application/json'
response.status = 422
return json.dumps("Error - '" + spot.sig_refs[0].id + "' does not look like a valid reference for " + spot.sig + ".", default=serialize_everything)
# infer missing data, and add it to our database.
spot.source = "API"
if not spot.sig:
spot.icon = "desktop"
spot.infer_missing()
self.spots.add(spot.id, spot, expire=MAX_SPOT_AGE)
@@ -246,6 +369,9 @@ class WebServer:
case "comment_includes":
comment_includes = query.get(k).strip()
spots = [s for s in spots if s.comment and comment_includes.upper() in s.comment.upper()]
case "dx_call_includes":
dx_call_includes = query.get(k).strip()
spots = [s for s in spots if s.dx_call and dx_call_includes.upper() in s.dx_call.upper()]
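# e.g. (illustrative) /api/v1/spots?dx_call_includes=G4 returns only spots whose DX callsign contains "G4".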
case "allow_qrt":
# If false, spots that are flagged as QRT are not returned.
prevent_qrt = query.get(k).upper() == "FALSE"
@@ -320,6 +446,9 @@ class WebServer:
case "dx_continent":
dxconts = query.get(k).split(",")
alerts = [a for a in alerts if a.dx_continent and a.dx_continent in dxconts]
case "dx_call_includes":
dx_call_includes = query.get(k).strip()
alerts = [a for a in alerts if a.dx_call and dx_call_includes.upper() in a.dx_call.upper()]
# If we have a "limit" parameter, we apply that last, regardless of where it appeared in the list of keys.
if "limit" in query.keys():
alerts = alerts[:int(query.get("limit"))]
@@ -340,7 +469,8 @@ class WebServer:
map(lambda p: p["name"], filter(lambda p: p["enabled"], self.status_data["alert_providers"]))),
"continents": CONTINENTS,
"max_spot_age": MAX_SPOT_AGE,
"spot_allowed": ALLOW_SPOTTING}
"spot_allowed": ALLOW_SPOTTING,
"web-ui-options": WEB_UI_OPTIONS}
# If spotting to this server is enabled, "API" is another valid spot source even though it does not come from
# one of our providers.
if ALLOW_SPOTTING:

View File

@@ -1,5 +1,5 @@
import logging
from datetime import datetime, timezone
from datetime import datetime
from threading import Thread
import aprslib
@@ -58,5 +58,5 @@ class APRSIS(SpotProvider):
self.submit(spot)
self.status = "OK"
self.last_update_time = datetime.now(timezone.utc)
self.last_update_time = datetime.now(pytz.UTC)
logging.debug("Data received from APRS-IS.")

View File

@@ -1,17 +1,14 @@
import logging
import re
from datetime import datetime, timezone
from datetime import datetime
from threading import Thread
from time import sleep
import pytz
import telnetlib3
from core.constants import SIGS
from core.sig_utils import ANY_SIG_REGEX, ANY_XOTA_SIG_REF_REGEX, get_icon_for_sig, get_ref_regex_for_sig
from data.sig_ref import SIGRef
from data.spot import Spot
from core.config import SERVER_OWNER_CALLSIGN
from data.spot import Spot
from spotproviders.spot_provider import SpotProvider
@@ -78,26 +75,11 @@ class DXCluster(SpotProvider):
icon="desktop",
time=spot_datetime.timestamp())
# See if the comment looks like it contains a SIG (and optionally SIG reference). Currently,
# only one sig ref is supported. Note that this code is specifically in the DX Cluster class and
# not in the general "spot" infer_missing() method. Because we only support one SIG per spot
# at the moment (see issue #54), we don't want to risk e.g. a POTA spot with comment "WWFF GFF-0001"
# being converted into a WWFF spot.
sig_match = re.search(r"(^|\W)" + ANY_SIG_REGEX + r"($|\W)", spot.comment, re.IGNORECASE)
if sig_match:
spot.sig = sig_match.group(2).upper()
spot.icon = get_icon_for_sig(spot.sig)
ref_regex = get_ref_regex_for_sig(spot.sig)
if ref_regex:
sig_ref_match = re.search(r"(^|\W)" + spot.sig + r"($|\W)(" + ref_regex + r")($|\W)", spot.comment, re.IGNORECASE)
if sig_ref_match:
spot.sig_refs = [SIGRef(id=sig_ref_match.group(3).upper())]
# Add to our list
self.submit(spot)
self.status = "OK"
self.last_update_time = datetime.now(timezone.utc)
self.last_update_time = datetime.now(pytz.UTC)
logging.debug("Data received from DX Cluster " + self.hostname + ".")
except Exception as e:

View File

@@ -1,11 +1,10 @@
import logging
from datetime import datetime, timedelta
from datetime import datetime
import pytz
from requests_cache import CachedSession
from core.cache_utils import SEMI_STATIC_URL_DATA_CACHE
from core.constants import HTTP_HEADERS
from core.sig_utils import get_icon_for_sig
from data.sig_ref import SIGRef
from data.spot import Spot
from spotproviders.http_spot_provider import HTTPSpotProvider
@@ -17,8 +16,6 @@ class GMA(HTTPSpotProvider):
SPOTS_URL = "https://www.cqgma.org/api/spots/25/"
# GMA spots don't contain the details of the programme they are for, we need a separate lookup for that
REF_INFO_URL_ROOT = "https://www.cqgma.org/api/ref/?"
REF_INFO_CACHE_TIME_DAYS = 30
REF_INFO_CACHE = CachedSession("cache/gma_ref_info_cache", expire_after=timedelta(days=REF_INFO_CACHE_TIME_DAYS))
def __init__(self, provider_config):
super().__init__(provider_config, self.SPOTS_URL, self.POLL_INTERVAL_SEC)
@@ -36,7 +33,7 @@ class GMA(HTTPSpotProvider):
mode=source_spot["MODE"].upper() if "<>" not in source_spot["MODE"] else None,
# Filter out some weird mode strings
comment=source_spot["TEXT"],
sig_refs=[SIGRef(id=source_spot["REF"], name=source_spot["NAME"], url="https://www.cqgma.org/zinfo.php?ref=" + source_spot["REF"])],
sig_refs=[SIGRef(id=source_spot["REF"], sig="", name=source_spot["NAME"])],
time=datetime.strptime(source_spot["DATE"] + source_spot["TIME"], "%Y%m%d%H%M").replace(
tzinfo=pytz.UTC).timestamp(),
dx_latitude=float(source_spot["LAT"]) if (source_spot["LAT"] and source_spot["LAT"] != "") else None,
@@ -44,7 +41,7 @@ class GMA(HTTPSpotProvider):
dx_longitude=float(source_spot["LON"]) if (source_spot["LON"] and source_spot["LON"] != "") else None)
# GMA doesn't give what programme (SIG) the reference is for until we separately look it up.
ref_response = self.REF_INFO_CACHE.get(self.REF_INFO_URL_ROOT + source_spot["REF"],
ref_response = SEMI_STATIC_URL_DATA_CACHE.get(self.REF_INFO_URL_ROOT + source_spot["REF"],
headers=HTTP_HEADERS)
# Sometimes this is blank, so handle that
if ref_response.text is not None and ref_response.text != "":
@@ -53,25 +50,31 @@ class GMA(HTTPSpotProvider):
# spots come through with reftype=POTA or reftype=WWFF. SOTA is harder to figure out because both SOTA
# and GMA summits come through with reftype=Summit, so we must check for the presence of a "sota" entry
# to determine if it's a SOTA summit.
if ref_info["reftype"] not in ["POTA", "WWFF"] and (ref_info["reftype"] != "Summit" or ref_info["sota"] == ""):
if "reftype" in ref_info and ref_info["reftype"] not in ["POTA", "WWFF"] and (ref_info["reftype"] != "Summit" or ref_info["sota"] == ""):
match ref_info["reftype"]:
case "Summit":
spot.sig_refs[0].sig = "GMA"
spot.sig = "GMA"
case "IOTA Island":
spot.sig_refs[0].sig = "IOTA"
spot.sig = "IOTA"
case "Lighthouse (ILLW)":
spot.sig_refs[0].sig = "ILLW"
spot.sig = "ILLW"
case "Lighthouse (ARLHS)":
spot.sig_refs[0].sig = "ARLHS"
spot.sig = "ARLHS"
case "Castle":
spot.sig_refs[0].sig = "WCA"
spot.sig = "WCA"
case "Mill":
spot.sig_refs[0].sig = "MOTA"
spot.sig = "MOTA"
case _:
logging.warn("GMA spot found with ref type " + ref_info[
"reftype"] + ", developer needs to add support for this!")
spot.sig_refs[0].sig = ref_info["reftype"]
spot.sig = ref_info["reftype"]
spot.icon = get_icon_for_sig(spot.sig)
# Add to our list. Don't worry about de-duping, removing old spots etc. at this point; other code will do
# that for us.

View File

@@ -5,7 +5,6 @@ import pytz
import requests
from core.constants import HTTP_HEADERS
from core.sig_utils import get_icon_for_sig
from data.sig_ref import SIGRef
from data.spot import Spot
from spotproviders.http_spot_provider import HTTPSpotProvider
@@ -54,8 +53,7 @@ class HEMA(HTTPSpotProvider):
mode=freq_mode_match.group(2).upper(),
comment=spotter_comment_match.group(2),
sig="HEMA",
sig_refs=[SIGRef(id=spot_items[3].upper(), name=spot_items[4])],
icon=get_icon_for_sig("HEMA"),
sig_refs=[SIGRef(id=spot_items[3].upper(), sig="HEMA", name=spot_items[4])],
time=datetime.strptime(spot_items[0], "%d/%m/%Y %H:%M").replace(tzinfo=pytz.UTC).timestamp(),
dx_latitude=float(spot_items[7]),
dx_longitude=float(spot_items[8]))

View File

@@ -1,13 +1,9 @@
import csv
import logging
import re
from datetime import datetime, timedelta
from datetime import datetime
import pytz
from requests_cache import CachedSession
from core.constants import HTTP_HEADERS
from core.sig_utils import get_icon_for_sig
from data.sig_ref import SIGRef
from data.spot import Spot
from spotproviders.http_spot_provider import HTTPSpotProvider
@@ -18,8 +14,6 @@ class ParksNPeaks(HTTPSpotProvider):
POLL_INTERVAL_SEC = 120
SPOTS_URL = "https://www.parksnpeaks.org/api/ALL"
SIOTA_LIST_URL = "https://www.silosontheair.com/data/silos.csv"
SIOTA_LIST_CACHE_TIME_DAYS = 30
SIOTA_LIST_CACHE = CachedSession("cache/siota_data_cache", expire_after=timedelta(days=SIOTA_LIST_CACHE_TIME_DAYS))
def __init__(self, provider_config):
super().__init__(provider_config, self.SPOTS_URL, self.POLL_INTERVAL_SEC)
@@ -38,9 +32,8 @@ class ParksNPeaks(HTTPSpotProvider):
# Seen PNP spots with empty frequency, and with comma-separated thousands digits
mode=source_spot["actMode"].upper(),
comment=source_spot["actComments"],
sig=source_spot["actClass"],
sig_refs=[SIGRef(id=source_spot["actSiteID"])],
icon=get_icon_for_sig(source_spot["actClass"]),
sig=source_spot["actClass"].upper(),
sig_refs=[SIGRef(id=source_spot["actSiteID"], sig=source_spot["actClass"].upper())],
time=datetime.strptime(source_spot["actTime"], "%Y-%m-%d %H:%M:%S").replace(
tzinfo=pytz.UTC).timestamp())
@@ -54,24 +47,11 @@ class ParksNPeaks(HTTPSpotProvider):
spot.de_call = m.group(1)
# Log a warning for the developer if PnP gives us an unknown programme we've never seen before
if spot.sig not in ["POTA", "SOTA", "WWFF", "SiOTA", "ZLOTA", "KRMNPA"]:
if spot.sig_refs[0].sig not in ["POTA", "SOTA", "WWFF", "SIOTA", "ZLOTA", "KRMNPA"]:
logging.warn("PNP spot found with sig " + spot.sig + ", developer needs to add support for this!")
# SiOTA lat/lon/grid lookup
if spot.sig == "SiOTA":
siota_csv_data = self.SIOTA_LIST_CACHE.get(self.SIOTA_LIST_URL, headers=HTTP_HEADERS)
siota_dr = csv.DictReader(siota_csv_data.content.decode().splitlines())
for row in siota_dr:
if row["SILO_CODE"] == spot.sig_refs[0]:
spot.dx_latitude = float(row["LAT"])
spot.dx_longitude = float(row["LNG"])
spot.dx_grid = row["LOCATOR"]
break
# Note there is currently no support for KRMNPA location lookup, see issue #61.
# If this is POTA, SOTA, WWFF or ZLOTA data we already have it through other means, so ignore. Otherwise,
# add to the spot list.
if spot.sig not in ["POTA", "SOTA", "WWFF", "ZLOTA"]:
if spot.sig_refs[0].sig not in ["POTA", "SOTA", "WWFF", "ZLOTA"]:
new_spots.append(spot)
return new_spots

View File

@@ -1,11 +1,7 @@
import re
from datetime import datetime, timedelta
from datetime import datetime
import pytz
from requests_cache import CachedSession
from core.constants import HTTP_HEADERS
from core.sig_utils import get_icon_for_sig, get_ref_regex_for_sig
from data.sig_ref import SIGRef
from data.spot import Spot
from spotproviders.http_spot_provider import HTTPSpotProvider
@@ -17,9 +13,6 @@ class POTA(HTTPSpotProvider):
SPOTS_URL = "https://api.pota.app/spot/activator"
# Might need to look up extra park data
PARK_URL_ROOT = "https://api.pota.app/park/"
PARK_DATA_CACHE_TIME_DAYS = 30
PARK_DATA_CACHE = CachedSession("cache/pota_park_data_cache",
expire_after=timedelta(days=PARK_DATA_CACHE_TIME_DAYS))
def __init__(self, provider_config):
super().__init__(provider_config, self.SPOTS_URL, self.POLL_INTERVAL_SEC)
@@ -37,29 +30,13 @@ class POTA(HTTPSpotProvider):
mode=source_spot["mode"].upper(),
comment=source_spot["comments"],
sig="POTA",
sig_refs=[SIGRef(id=source_spot["reference"], name=source_spot["name"], url="https://pota.app/#/park/" + source_spot["reference"])],
icon=get_icon_for_sig("POTA"),
sig_refs=[SIGRef(id=source_spot["reference"], sig="POTA", name=source_spot["name"])],
time=datetime.strptime(source_spot["spotTime"], "%Y-%m-%dT%H:%M:%S").replace(
tzinfo=pytz.UTC).timestamp(),
dx_grid=source_spot["grid6"],
dx_latitude=source_spot["latitude"],
dx_longitude=source_spot["longitude"])
# Sometimes we can get other refs in the comments for n-fer activations, extract them
all_comment_refs = re.findall(get_ref_regex_for_sig("POTA"), spot.comment)
for r in all_comment_refs:
if r not in list(map(lambda ref: ref.id, spot.sig_refs)):
ref = SIGRef(id=r.upper(), url="https://pota.app/#/park/" + r.upper())
# Now we need to look up the name of that reference from the API, because the comment won't have it
park_response = self.PARK_DATA_CACHE.get(self.PARK_URL_ROOT + r.upper(), headers=HTTP_HEADERS)
park_data = park_response.json()
if park_data and "name" in park_data:
ref.name = park_data["name"]
# Finally append our new reference to the spot's reference list
spot.sig_refs.append(ref)
# Add to our list. Don't worry about de-duping, removing old spots etc. at this point; other code will do
# that for us.
new_spots.append(spot)

View File

@@ -1,14 +1,14 @@
import logging
import re
from datetime import datetime, timezone
from datetime import datetime
from threading import Thread
from time import sleep
import pytz
import telnetlib3
from data.spot import Spot
from core.config import SERVER_OWNER_CALLSIGN
from data.spot import Spot
from spotproviders.spot_provider import SpotProvider
@@ -77,7 +77,7 @@ class RBN(SpotProvider):
self.submit(spot)
self.status = "OK"
self.last_update_time = datetime.now(timezone.utc)
self.last_update_time = datetime.now(pytz.UTC)
logging.debug("Data received from RBN on port " + str(self.port) + ".")
except Exception as e:

View File

@@ -1,11 +1,8 @@
import logging
from datetime import datetime, timedelta
from datetime import datetime
import requests
from requests_cache import CachedSession
from core.constants import HTTP_HEADERS
from core.sig_utils import get_icon_for_sig
from data.sig_ref import SIGRef
from data.spot import Spot
from spotproviders.http_spot_provider import HTTPSpotProvider
@@ -21,8 +18,6 @@ class SOTA(HTTPSpotProvider):
SPOTS_URL = "https://api-db2.sota.org.uk/api/spots/60/all/all"
# SOTA spots don't contain lat/lon, we need a separate lookup for that
SUMMIT_URL_ROOT = "https://api-db2.sota.org.uk/api/summits/"
SUMMIT_DATA_CACHE_TIME_DAYS = 30
SUMMIT_DATA_CACHE = CachedSession("cache/sota_summit_data_cache", expire_after=timedelta(days=SUMMIT_DATA_CACHE_TIME_DAYS))
def __init__(self, provider_config):
super().__init__(provider_config, self.EPOCH_URL, self.POLL_INTERVAL_SEC)
@@ -50,21 +45,10 @@ class SOTA(HTTPSpotProvider):
mode=source_spot["mode"].upper(),
comment=source_spot["comments"],
sig="SOTA",
sig_refs=[SIGRef(id=source_spot["summitCode"], name=source_spot["summitName"], url="https://www.sotadata.org.uk/en/summit/" + source_spot["summitCode"])],
icon=get_icon_for_sig("SOTA"),
sig_refs=[SIGRef(id=source_spot["summitCode"], sig="SOTA", name=source_spot["summitName"])],
time=datetime.fromisoformat(source_spot["timeStamp"]).timestamp(),
activation_score=source_spot["points"])
# SOTA doesn't give summit lat/lon/grid in the main call, so we need another separate call for this
try:
summit_response = self.SUMMIT_DATA_CACHE.get(self.SUMMIT_URL_ROOT + source_spot["summitCode"], headers=HTTP_HEADERS)
summit_data = summit_response.json()
spot.dx_grid = summit_data["locator"]
spot.dx_latitude = summit_data["latitude"]
spot.dx_longitude = summit_data["longitude"]
except Exception:
logging.warn("Looking up summit " + source_spot["summitCode"] + " from the SOTA API failed. No summit data was available.")
# Add to our list. Don't worry about de-duping, removing old spots etc. at this point; other code will do
# that for us.
new_spots.append(spot)

View File

@@ -2,8 +2,7 @@ from datetime import datetime
import pytz
from core.constants import SOFTWARE_NAME, SOFTWARE_VERSION
from core.config import SERVER_OWNER_CALLSIGN, MAX_SPOT_AGE
from core.config import MAX_SPOT_AGE
# Generic spot provider class. Subclasses of this query the individual APIs for data.

View File

@@ -9,6 +9,7 @@ from requests_sse import EventSource
from core.constants import HTTP_HEADERS
from spotproviders.spot_provider import SpotProvider
# Spot provider using Server-Sent Events.
class SSESpotProvider(SpotProvider):

View File

@@ -1,11 +1,8 @@
import re
from datetime import datetime, timedelta
from datetime import datetime
import pytz
from requests_cache import CachedSession
from core.constants import HTTP_HEADERS
from core.sig_utils import get_icon_for_sig, get_ref_regex_for_sig
from data.spot import Spot
from spotproviders.http_spot_provider import HTTPSpotProvider

View File

@@ -1,11 +1,10 @@
from datetime import timedelta, datetime
import logging
import re
from datetime import datetime
import pytz
from requests_cache import CachedSession
from rss_parser import RSSParser
from core.constants import HTTP_HEADERS
from core.sig_utils import get_icon_for_sig
from data.sig_ref import SIGRef
from data.spot import Spot
from spotproviders.http_spot_provider import HTTPSpotProvider
@@ -16,8 +15,6 @@ class WOTA(HTTPSpotProvider):
POLL_INTERVAL_SEC = 120
SPOTS_URL = "https://www.wota.org.uk/spots_rss.php"
LIST_URL = "https://www.wota.org.uk/mapping/data/summits.json"
LIST_CACHE_TIME_DAYS = 30
LIST_CACHE = CachedSession("cache/wota_data_cache", expire_after=timedelta(days=LIST_CACHE_TIME_DAYS))
RSS_DATE_TIME_FORMAT = "%a, %d %b %Y %H:%M:%S %z"
def __init__(self, provider_config):
@@ -29,6 +26,7 @@ class WOTA(HTTPSpotProvider):
# Iterate through source data
for source_spot in rss.channel.items:
try:
# Reject GUID missing or zero
if not source_spot.guid or not source_spot.guid.content or source_spot.guid.content == "http://www.wota.org.uk/spots/0":
continue
@@ -47,16 +45,16 @@ class WOTA(HTTPSpotProvider):
# Pick apart the description
desc_split = source_spot.description.split(". ")
freq_mode = desc_split[0].replace("Frequencies/modes:", "").strip()
freq_mode_split = freq_mode.split("-")
freq_mode_split = re.split(r'[\-\s]+', freq_mode)
freq_hz = float(freq_mode_split[0]) * 1000000
mode = freq_mode_split[1]
mode = freq_mode_split[1].upper()
comment = None
if len(desc_split) > 1:
comment = desc_split[1].strip()
spotter = None
if len(desc_split) > 2:
spotter = desc_split[2].replace("Spotted by ", "").replace(".", "").strip()
spotter = desc_split[2].replace("Spotted by ", "").replace(".", "").upper().strip()
time = datetime.strptime(source_spot.pub_date.content, self.RSS_DATE_TIME_FORMAT).astimezone(pytz.UTC)
@@ -69,19 +67,10 @@ class WOTA(HTTPSpotProvider):
mode=mode,
comment=comment,
sig="WOTA",
sig_refs=[SIGRef(id=ref, name=ref_name, url="https://www.wota.org.uk/MM_" + ref)] if ref else [],
icon=get_icon_for_sig("WOTA"),
sig_refs=[SIGRef(id=ref, sig="WOTA", name=ref_name)] if ref else [],
time=time.timestamp())
# WOTA name/grid/lat/lon lookup
wota_data = self.LIST_CACHE.get(self.LIST_URL, headers=HTTP_HEADERS).json()
for feature in wota_data["features"]:
if feature["properties"]["wotaId"] == spot.sig_refs[0]:
spot.sig_refs[0].name = feature["properties"]["title"]
spot.dx_latitude = feature["geometry"]["coordinates"][1]
spot.dx_longitude = feature["geometry"]["coordinates"][0]
spot.dx_grid = feature["properties"]["qthLocator"]
break
new_spots.append(spot)
except Exception as e:
logging.error("Exception parsing WOTA spot", e)
return new_spots

View File

@@ -1,7 +1,6 @@
import json
from datetime import datetime
from core.sig_utils import get_icon_for_sig
from data.sig_ref import SIGRef
from data.spot import Spot
from spotproviders.sse_spot_provider import SSESpotProvider
@@ -20,10 +19,7 @@ class WWBOTA(SSESpotProvider):
# n-fer activations.
refs = []
for ref in source_spot["references"]:
sigref = SIGRef(id=ref["reference"], name=ref["name"])
# Bunkerbase URLs only work for UK bunkers, so only add a URL if we have a B/G prefix.
if ref["reference"].startswith("B/G"):
sigref.url="https://bunkerwiki.org/?s=" + ref["reference"]
sigref = SIGRef(id=ref["reference"], sig="WWBOTA", name=ref["name"])
refs.append(sigref)
spot = Spot(source=self.name,
@@ -34,7 +30,6 @@ class WWBOTA(SSESpotProvider):
comment=source_spot["comment"],
sig="WWBOTA",
sig_refs=refs,
icon=get_icon_for_sig("WWBOTA"),
time=datetime.fromisoformat(source_spot["time"]).timestamp(),
# WWBOTA spots can contain multiple references for bunkers being activated simultaneously. For
# now, we will just pick the first one to use as our grid, latitude and longitude.

View File

@@ -2,7 +2,6 @@ from datetime import datetime
import pytz
from core.sig_utils import get_icon_for_sig
from data.sig_ref import SIGRef
from data.spot import Spot
from spotproviders.http_spot_provider import HTTPSpotProvider
@@ -29,8 +28,7 @@ class WWFF(HTTPSpotProvider):
mode=source_spot["mode"].upper(),
comment=source_spot["remarks"],
sig="WWFF",
sig_refs=[SIGRef(id=source_spot["reference"], name=source_spot["reference_name"], url="https://wwff.co/directory/?showRef=" + source_spot["reference"])],
icon=get_icon_for_sig("WWFF"),
sig_refs=[SIGRef(id=source_spot["reference"], sig="WWFF", name=source_spot["reference_name"])],
time=datetime.fromtimestamp(source_spot["spot_time"], tz=pytz.UTC).timestamp(),
dx_latitude=source_spot["latitude"],
dx_longitude=source_spot["longitude"])

spotproviders/xota.py (new file, 43 lines)
View File

@@ -0,0 +1,43 @@
from datetime import datetime
from data.sig_ref import SIGRef
from data.spot import Spot
from spotproviders.http_spot_provider import HTTPSpotProvider
# Spot provider for servers based on the "xOTA" software at https://github.com/nischu/xOTA/
# The provider typically doesn't give us a lat/lon or SIG explicitly, so our own config provides this information. This
# functionality is implemented for TOTA events.
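# An illustrative provider config for this class (key names match those read in __init__ below; the URL and
# values are made up): {"url": "https://tota.example.org", "latitude": 51.5, "longitude": -0.1, "sig": "TOTA"},
# plus whatever generic settings HTTPSpotProvider expects.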
class XOTA(HTTPSpotProvider):
POLL_INTERVAL_SEC = 300
FIXED_LATITUDE = None
FIXED_LONGITUDE = None
SIG = None
def __init__(self, provider_config):
super().__init__(provider_config, provider_config["url"] + "/api/spot/all", self.POLL_INTERVAL_SEC)
self.FIXED_LATITUDE = provider_config["latitude"] if "latitude" in provider_config else None
self.FIXED_LONGITUDE = provider_config["longitude"] if "longitude" in provider_config else None
self.SIG = provider_config["sig"] if "sig" in provider_config else None
def http_response_to_spots(self, http_response):
new_spots = []
# Iterate through source data
for source_spot in http_response.json():
# Convert to our spot format
spot = Spot(source=self.name,
source_id=source_spot["id"],
dx_call=source_spot["stationCallSign"].upper(),
freq=float(source_spot["freq"]) * 1000,
mode=source_spot["mode"].upper(),
sig=self.SIG,
sig_refs=[SIGRef(id=source_spot["reference"]["title"], sig=self.SIG, url=source_spot["reference"]["website"])],
time=datetime.fromisoformat(source_spot["modificationDate"]).timestamp(),
dx_latitude=self.FIXED_LATITUDE,
dx_longitude=self.FIXED_LONGITUDE,
qrt=source_spot["state"] != "active")
# Add to our list. Don't worry about de-duping, removing old spots etc. at this point; other code will do
# that for us.
new_spots.append(spot)
return new_spots

View File

@@ -1,13 +1,7 @@
import csv
import logging
import re
from datetime import datetime, timedelta
from datetime import datetime
import pytz
from requests_cache import CachedSession
from core.constants import HTTP_HEADERS
from core.sig_utils import get_icon_for_sig
from data.sig_ref import SIGRef
from data.spot import Spot
from spotproviders.http_spot_provider import HTTPSpotProvider
@@ -18,8 +12,6 @@ class ZLOTA(HTTPSpotProvider):
POLL_INTERVAL_SEC = 120
SPOTS_URL = "https://ontheair.nz/api/spots?zlota_only=true"
LIST_URL = "https://ontheair.nz/assets/assets.json"
LIST_CACHE_TIME_DAYS = 30
LIST_CACHE = CachedSession("cache/zlota_data_cache", expire_after=timedelta(days=LIST_CACHE_TIME_DAYS))
def __init__(self, provider_config):
super().__init__(provider_config, self.SPOTS_URL, self.POLL_INTERVAL_SEC)
@@ -42,17 +34,8 @@ class ZLOTA(HTTPSpotProvider):
mode=source_spot["mode"].upper().strip(),
comment=source_spot["comments"],
sig="ZLOTA",
sig_refs=[SIGRef(id=source_spot["reference"], name=source_spot["name"])],
icon=get_icon_for_sig("ZLOTA"),
sig_refs=[SIGRef(id=source_spot["reference"], sig="ZLOTA", name=source_spot["name"])],
time=datetime.fromisoformat(source_spot["referenced_time"]).astimezone(pytz.UTC).timestamp())
# ZLOTA lat/lon lookup
zlota_data = self.LIST_CACHE.get(self.LIST_URL, headers=HTTP_HEADERS).json()
for asset in zlota_data:
if asset["code"] == spot.sig_refs[0]:
spot.dx_latitude = asset["y"]
spot.dx_longitude = asset["x"]
break
new_spots.append(spot)
return new_spots

View File

@@ -14,16 +14,18 @@
<p>Spothole is an "aggregator" for those spots, so it checks lots of different services for data, and brings it all together in one place. So no matter what kinds of interesting spots you are looking for, you can find them here.</p>
<p>As well as spots, it also provides a similar feed of "alerts". This is where amateur radio users who are going to interesting places soon will announce their intentions.</p>
<h4 class="mt-4">What are "DX", "DE" and modes?</h4>
<p>In amateur radio terminology, the "DX" contact is the "interesting" one that is using the frequency shown. They might be on a remote island or just in a local park, but either way it's interesting enough that someone has "spotted" them. The callsign listed under "DE" is the person who spotted the "DX" operator. "Modes" are the type of communication they are using. You might see "CW" which is Morse Code, or voice "modes" like SSB or FM, or more exotic "data" modes which are used for computer-to-computer communication.</p>
<p>In amateur radio terminology, the "DX" contact is the "interesting" one that is using the frequency shown and looking for callers. They might be on a remote island or just in a local park, but either way it's interesting enough that someone has "spotted" them. The callsign listed under "DE" is the person who entered the spot of the "DX" operator. "Modes" are the type of communication they are using. For example you might see "CW" which is Morse Code, or voice "modes" like SSB or FM, or more exotic "data" modes which are used for computer-to-computer communication.</p>
<h4 class="mt-4">What data sources are supported?</h4>
<p>Spothole can retrieve spots from: Telnet-based DX clusters, the Reverse Beacon Network (RBN), the APRS Internet Service (APRS-IS), POTA, SOTA, WWFF, GMA, WWBOTA, HEMA, Parks 'n' Peaks, ZLOTA, WOTA, and the UK Packet Repeater Network.</p>
<p>Spothole can retrieve alerts from: NG3K, POTA, SOTA, WWFF, Parks 'n' Peaks, and WOTA.</p>
<p>Between the various data sources, the following Special Interest Groups (SIGs) are supported: POTA, SOTA, WWFF, GMA, WWBOTA, HEMA, IOTA, MOTS, ARLHS, ILLW, SIOTA, WCA, ZLOTA, KRMNPA, WOTA, WAB & WAI.</p>
<p>Spothole can retrieve spots from: Telnet-based DX clusters, the Reverse Beacon Network (RBN), the APRS Internet Service (APRS-IS), POTA, SOTA, WWFF, GMA, WWBOTA, HEMA, Parks 'n' Peaks, ZLOTA, WOTA, the UK Packet Repeater Network, and any site based on the xOTA software by nischu.</p>
<p>Spothole can retrieve alerts from: NG3K, POTA, SOTA, WWFF, Parks 'n' Peaks, WOTA, and BOTA.</p>
<p>Note that the server owner has not necessarily enabled all these data sources. In particular, it is common to disable RBN, to avoid the server being swamped with FT8 traffic, and to disable APRS-IS and UK Packet Net so that the server only displays stations where there is likely to be an operator physically present for a QSO.</p>
<p>Between the various data sources, the following Special Interest Groups (SIGs) are supported: Parks on the Air (POTA), Summits on the Air (SOTA), Worldwide Flora & Fauna (WWFF), Global Mountain Activity (GMA), Worldwide Bunkers on the Air (WWBOTA), HuMPs Excluding Marilyns Award (HEMA), Islands on the Air (IOTA), Mills on the Air (MOTA), the Amateur Radio Lighthouse Society (ARLHS), International Lighthouse Lightship Weekend (ILLW), Silos on the Air (SIOTA), World Castles Award (WCA), New Zealand on the Air (ZLOTA), Keith Roget Memorial National Parks Award (KRMNPA), Wainwrights on the Air (WOTA), Beaches on the Air (BOTA), Worked All Britain (WAB), Worked All Ireland (WAI), and Toilets on the Air (TOTA).</p>
<h4 class="mt-4">How is this better than DXheat, DXsummit, POTA's own website, etc?</h4>
<p>It's probably not? But it's nice to have choice.</p>
<p>I think it's got two key advantages over those sites:</p>
<p>I think it's got three key advantages over those sites:</p>
<ol><li>It provides a public, <a href="/apidocs">well-documented API</a> with an <a href="/apidocs/openapi.yml">OpenAPI specification</a>. Other sites don't have official APIs or don't bother documenting them publicly, because they want people to use their web page. I like Spothole's web page, but you don't have to use it&mdash;if you're a programmer, you can build your own software on Spothole's API. Spothole does the hard work of taking all the various data sources and providing a consistent, well-documented data set. You can then do the fun bit of writing your own application.</li>
<li>It grabs data from a lot more sources, and it's easy to add more. Since it's open source, anyone can contribute a new data source and share it with the community.</li></ol>
<li>It grabs data from a lot more sources. I've seen other sites that pull in DX Cluster and POTA spots together, but nothing on the scale of what Spothole supports.</li>
<li>Spothole is open source, so anyone can contribute the code to support a new data source or add new features, and share them with the community.</li></ol>
<h4 class="mt-4">Why does this website ask me if I want to install it?</h4>
<p>Spothole is a Progressive Web App, which means you can install it on an Android or iOS device by opening the site in Chrome or Safari respectively, and clicking "Install" on the pop-up panel. It'll only prompt you once, so if you dismiss the prompt and change your mind, you'll find an Install / Add to Home Screen option on your browser's menu.</p>
<p>Installing Spothole on your phone is completely optional; the website works exactly the same way as the "app" does.</p>
@@ -31,11 +33,17 @@
<p>To avoid putting too much load on the various servers that Spothole connects to, the Spothole server only polls them once every two minutes for spots, and once every hour for alerts. (Some sources, such as DX clusters, RBN, APRS-IS and WWBOTA use a non-polling mechanism, and their updates will therefore arrive more quickly.) Then if you are using the web interface, that has its own rate at which it reloads the data from Spothole, which is once a minute for spots or 30 minutes for alerts. So you could be waiting around three minutes to see a newly added spot, or 90 minutes to see a newly added alert.</p>
<h4 class="mt-4">What licence does Spothole use?</h4>
<p>Spothole's source code is licenced under the Public Domain. You can write a Spothole client, run your own server, modify it however you like, you can claim you wrote it and charge people £1000 for a copy, I don't really mind. (Please don't do the last one. But if you're using my code for something cool, it would be nice to hear from you!)</p>
<h2 class="mt-4">Data Accuracy</h2>
<p>Please note that the data coming out of Spothole is only as good as the data going in. People mis-hear and make typos when spotting callsigns all the time. There are also plenty of cases where Spothole's data, particularly location data, may be inaccurate. For example, there are POTA parks that span multiple US states, countries that span multiple CQ zones, portable operators with no requirement to sign /P, etc. If you are doing something where accuracy is important, such as contesting, you should not rely on Spothole's data to fill in any gaps in your log.</p>
<h2 id="privacy" class="mt-4">Privacy</h2>
<p>Spothole collects no data about you, and there is no way to enter personally identifying information into the site apart from by spotting and alerting through Spothole or the various services it connects to. All spots and alerts are "timed out" and deleted from the system after a set interval, which by default is one hour for spots and one week for alerts.</p>
<p>Settings you select from Spothole's menus are sent to the server, in order to provide the data with the requested filters. They are also stored in your browser's local storage, so that your preferences are remembered between sessions.</p>
<p>There are no trackers, no ads, and no cookies.</p>
<p>Spothole is open source, so you can audit <a href="https://git.ianrenton.com/ian/spothole">the code</a> if you like.</p>
<h2 class="mt-4">Thanks</h2>
<p>This project would not have been possible without those volunteers who have taken it upon themselves to run DX clusters, xOTA programmes, DXpedition lists, callsign lookup databases, and other online tools on which Spothole's data is based.</p>
<p>Spothole is also dependent on a number of Python libraries, in particular pyhamtools, and many JavaScript libraries, as well as the Font Awesome icon set and flag icons from the Noto Color Emoji set.</p>
<p>This software is dedicated to the memory of Tom G1PJB, SK, a friend and colleague who sadly passed away around the time I started writing it in Autumn 2025. I was looking forward to showing it to you when it was done.</p>
</div>
<script>$(document).ready(function() { $("#nav-link-about").addClass("active"); }); <!-- highlight active page in nav --></script>

View File

@@ -0,0 +1,73 @@
% rebase('webpage_base.tpl')
<div id="add-spot-intro-box" class="permanently-dismissible-box mt-3">
<div class="alert alert-primary alert-dismissible fade show" role="alert">
<i class="fa-solid fa-circle-info"></i> <strong>Adding spots to Spothole</strong><br/>This page is implemented as a proof of concept for adding spots to the Spothole system. Currently, spots added in this way are only visible within Spothole and are not sent "upstream" to DX clusters or xOTA spotting sites. The functionality might be extended to include this in future if there is demand for it. If you'd like this to be added, please give a thumbs-up on <a href="https://git.ianrenton.com/ian/spothole/issues/39" target="_new" class="alert-link">issue #39</a> or get in touch via email.
<button type="button" id="add-spot-intro-box-dismiss" class="btn-close" data-bs-dismiss="alert" aria-label="Close"></button>
</div>
</div>
<div class="mt-3">
<div id="add-spot-area" class="card mb-3">
<div class="card-header text-white bg-primary">
<div class="row">
<div class="col-auto me-auto">
Add a Spot
</div>
</div>
</div>
<div class="card-body">
<form class="row g-3">
<div class="col-auto">
<label for="dx-call" class="form-label">DX Call *</label>
<input type="text" class="form-control" id="dx-call" placeholder="N0CALL" style="max-width: 8em;">
</div>
<div class="col-auto">
<label for="freq" class="form-label">Frequency (kHz) *</label>
<input type="text" class="form-control" id="freq" placeholder="e.g. 14100" style="max-width: 8em;">
</div>
<div class="col-auto">
<label for="mode" class="form-label">Mode</label>
<select id="mode" class="form-select">
<option value="" selected></option>
</select>
</div>
<div class="col-auto">
<label for="sig" class="form-label">SIG</label>
<select id="sig" class="form-select">
<option value="" selected></option>
</select>
</div>
<div class="col-auto">
<label for="sig-ref" class="form-label">SIG Reference</label>
<input type="text" class="form-control" id="sig-ref" placeholder="e.g. GB-0001" style="max-width: 8em;">
</div>
<div class="col-auto">
<label for="dx-grid" class="form-label">DX Grid</label>
<input type="text" class="form-control" id="dx-grid" placeholder="e.g. AA00aa" style="max-width: 8em;">
</div>
<div class="col-auto">
<label for="comment" class="form-label">Comment</label>
<input type="text" class="form-control" id="comment" placeholder="e.g. 59 TNX QSO 73" style="max-width: 12em;">
</div>
<div class="col-auto">
<label for="de-call" class="form-label">Your Call *</label>
<input type="text" class="form-control storeable-text" id="de-call" placeholder="N0CALL" style="max-width: 8em;">
</div>
<div class="col-auto">
<button type="button" class="btn btn-primary" style="margin-top: 2em;" onclick="addSpot();">Spot</button>
</div>
</form>
<div id="result-good"></div>
<div id="result-bad"></div>
<p class="small mt-4 mb-1">* Required field</p>
</div>
</div>
</div>
<script src="/js/common.js"></script>
<script src="/js/add-spot.js"></script>
<script>$(document).ready(function() { $("#nav-link-add-spot").addClass("active"); }); <!-- highlight active page in nav --></script>

View File

@@ -101,11 +101,6 @@
<h5 class="card-title">Number of Alerts</h5>
<p class="card-text spothole-card-text">Show up to
<select id="alerts-to-fetch" class="storeable-select form-select ms-2" oninput="filtersUpdated();" style="width: 5em;display: inline-block;">
<option value="25">25</option>
<option value="50">50</option>
<option value="100" selected>100</option>
<option value="200">200</option>
<option value="500">500</option>
</select>
alerts
</p>

View File

@@ -93,10 +93,6 @@
<h5 class="card-title">Spot Age</h5>
<p class="card-text spothole-card-text">Last
<select id="max-spot-age" class="storeable-select form-select ms-2 me-2 d-inline-block" oninput="filtersUpdated();" style="width: 5em; display: inline-block;">
<option value="300">5</option>
<option value="600">10</option>
<option value="1800" selected>30</option>
<option value="3600">60</option>
</select>
minutes
</p>

View File

@@ -62,6 +62,9 @@
<li class="nav-item ms-4"><a href="/map" class="nav-link" id="nav-link-map"><i class="fa-solid fa-map"></i> Map</a></li>
<li class="nav-item ms-4"><a href="/bands" class="nav-link" id="nav-link-bands"><i class="fa-solid fa-ruler-vertical"></i> Bands</a></li>
<li class="nav-item ms-4"><a href="/alerts" class="nav-link" id="nav-link-alerts"><i class="fa-solid fa-bell"></i> Alerts</a></li>
% if allow_spotting:
<li class="nav-item ms-4"><a href="/add-spot" class="nav-link" id="nav-link-add-spot"><i class="fa-solid fa-comment"></i> Add Spot</a></li>
% end
<li class="nav-item ms-4"><a href="/status" class="nav-link" id="nav-link-status"><i class="fa-solid fa-chart-simple"></i> Status</a></li>
<li class="nav-item ms-4"><a href="/about" class="nav-link" id="nav-link-about"><i class="fa-solid fa-circle-info"></i> About</a></li>
<li class="nav-item ms-4"><a href="/apidocs" class="nav-link" id="nav-link-api"><i class="fa-solid fa-gear"></i> API</a></li>

View File

@@ -1,7 +1,7 @@
% rebase('webpage_base.tpl')
<div id="map">
<div class="mt-3 px-3" style="z-index: 1002; position: relative;">
<div id="maptools" class="mt-3 px-3" style="z-index: 1002; position: relative;">
<div class="row">
<div class="col-auto me-auto pt-3"></div>
<div class="col-auto">
@@ -92,10 +92,6 @@
<h5 class="card-title">Spot Age</h5>
<p class="card-text spothole-card-text">Last
<select id="max-spot-age" class="storeable-select form-select ms-2 me-2 d-inline-block" oninput="filtersUpdated();" style="width: 5em; display: inline-block;">
<option value="300">5</option>
<option value="600">10</option>
<option value="1800" selected>30</option>
<option value="3600">60</option>
</select>
minutes
</p>

View File

@@ -1,6 +1,6 @@
% rebase('webpage_base.tpl')
<div id="intro-box" class="mt-3">
<div id="intro-box" class="permanently-dismissible-box mt-3">
<div class="alert alert-primary alert-dismissible fade show" role="alert">
<i class="fa-solid fa-circle-info"></i> <strong>What is Spothole?</strong><br/>Spothole is an aggregator of amateur radio spots from DX clusters and outdoor activity programmes. It's free for anyone to use and includes an API that developers can build other applications on. For more information, check out the <a href="/about" class="alert-link">"About" page</a>. If that sounds like nonsense to you, you can visit <a href="/about#faq" class="alert-link">the FAQ section</a> to learn more.
<button type="button" id="intro-box-dismiss" class="btn-close" data-bs-dismiss="alert" aria-label="Close"></button>
@@ -14,7 +14,10 @@
</div>
<div class="col-auto">
<p class="d-inline-flex gap-1">
<button id="add-spot-button" type="button" class="btn btn-outline-primary" data-bs-toggle="button" onclick="toggleAddSpotPanel();"><i class="fa-solid fa-comment"></i> Add Spot</button>
<span style="position: relative;">
<i class="fa-solid fa-magnifying-glass" style="position: absolute; left: 0px; top: 2px; padding: 10px; pointer-events: none;"></i>
<input id="filter-dx-call" type="search" class="form-control" oninput="filtersUpdated();" placeholder="Callsign">
</span>
<button id="filters-button" type="button" class="btn btn-outline-primary" data-bs-toggle="button" onclick="toggleFiltersPanel();"><i class="fa-solid fa-filter"></i> Filters</button>
<button id="display-button" type="button" class="btn btn-outline-primary" data-bs-toggle="button" onclick="toggleDisplayPanel();"><i class="fa-solid fa-desktop"></i> Display</button>
</p>
@@ -114,10 +117,6 @@
<h5 class="card-title">Number of Spots</h5>
<p class="card-text spothole-card-text">Show up to
<select id="spots-to-fetch" class="storeable-select form-select ms-2 me-2 d-inline-block" oninput="filtersUpdated();" style="width: 5em; display: inline-block;">
<option value="10">10</option>
<option value="25">25</option>
<option value="50" selected>50</option>
<option value="100">100</option>
</select>
spots
</p>
@@ -184,55 +183,6 @@
</div>
</div>
<div id="add-spot-area" class="appearing-panel card mb-3">
<div class="card-header text-white bg-primary">
<div class="row">
<div class="col-auto me-auto">
Add a Spot
</div>
<div class="col-auto d-inline-flex">
<button id="close-add-spot-button" type="button" class="btn-close btn-close-white" aria-label="Close" onclick="closeAddSpotPanel();"></button>
</div>
</div>
</div>
<div class="card-body">
<form class="row g-2">
<div class="col-auto">
<label for="add-spot-dx-call" class="form-label">DX Call</label>
<input type="text" class="form-control" id="add-spot-dx-call" placeholder="N0CALL" style="max-width: 8em;">
</div>
<div class="col-auto">
<label for="add-spot-freq" class="form-label">Frequency (kHz)</label>
<input type="text" class="form-control" id="add-spot-freq" placeholder="14100" style="max-width: 8em;">
</div>
<div class="col-auto">
<label for="add-spot-mode" class="form-label">Mode</label>
<input type="text" class="form-control" id="add-spot-mode" placeholder="SSB" style="max-width: 6em;">
</div>
<div class="col-auto">
<label for="add-spot-comment" class="form-label">Comment</label>
<input type="text" class="form-control" id="add-spot-comment" placeholder="59 TNX QSO 73" style="max-width: 12em;">
</div>
<div class="col-auto">
<label for="add-spot-de-call" class="form-label">Your Call</label>
<input type="text" class="form-control" id="add-spot-de-call" placeholder="N0CALL" style="max-width: 8em;">
</div>
<div class="col-auto">
<button type="button" class="btn btn-primary" style="margin-top: 2em;" onclick="addSpot();">Spot</button>
<span id="post-spot-result-good"></span>
</div>
</form>
<div id="post-spot-result-bad"></div>
<div class="alert alert-warning alert-dismissible fade show mb-0 mt-4" role="alert">
Please note that spots added to Spothole are not currently sent "upstream" to DX clusters or xOTA spotting sites.
<button type="button" class="btn-close" data-bs-dismiss="alert" aria-label="Close"></button>
</div>
</div>
</div>
<div id="table-container"></div>
</div>

View File

@@ -5,6 +5,10 @@ info:
Spothole is a utility to aggregate "spots" from amateur radio DX clusters and xOTA spotting sites, and provide an open JSON API as well as a website to browse the data.
While there are other web-based interfaces to DX clusters, and sites that aggregate spots from various outdoor activity programmes for amateur radio, Spothole differentiates itself by supporting a large number of data sources, and by being "API first" rather than just providing a web front-end. This allows other software to be built on top of it. Spothole itself is also open source, Public Domain licenced code that anyone can take and modify.
The API calls described below allow third-party software to access data from Spothole, and receive data on spots and alerts in a consistent format regardless of the data sources used by Spothole itself. Utility calls are also provided for general data lookups.
Please note that the data coming out of Spothole is only as good as the data going in. People mis-hear and make typos when spotting callsigns all the time, and there are plenty of areas where Spothole's location data may be inaccurate. If you are doing something where accuracy is important, such as contesting, you should not rely on Spothole's data to fill in any gaps in your log.
contact:
email: ian@ianrenton.com
license:
@@ -17,7 +21,7 @@ paths:
/spots:
get:
tags:
- spots
- Spots
summary: Get spots
description: The main API call that retrieves spots from the system. Supply this with no query parameters to retrieve all spots known to the system. Supply query parameters to filter what is retrieved.
operationId: spots
@@ -48,7 +52,7 @@ paths:
type: number
- name: source
in: query
description: "Limit the spots to only ones from one or more sources. To select more than one source, supply a comma-separated list."
description: "Limit the spots to only ones from one or more sources. To select more than one source, supply a comma-separated list. The allowed options will vary based on how the sources are named within the server's config. See the /options call for how to retrieve a list of these."
required: false
schema:
type: string
@@ -62,10 +66,12 @@ paths:
- ParksNPeaks
- ZLOTA
- WOTA
- BOTA
- Cluster
- RBN
- APRS-IS
- UKPacketNet
- TOTA
- name: sig
in: query
description: "Limit the spots to only ones from one or more Special Interest Groups provided as an argument. To select more than one SIG, supply a comma-separated list."
@@ -85,10 +91,13 @@ paths:
- ARLHS
- ILLW
- ZLOTA
- KRMNPA
- IOTA
- WOTA
- BOTA
- WAB
- WAI
- TOTA
- name: needs_sig
in: query
description: "Limit the spots to only ones with a Special Interest Group such as POTA. Because supplying all known SIGs as a `sigs` parameter is unwieldy, and leaving `sigs` blank will also return spots with *no* SIG, this parameter can be set true to return only spots with a SIG, regardless of what it is, so long as it's not blank. This is what Field Spotter uses to exclude generic cluster spots and only retrieve xOTA things."
@@ -205,6 +214,12 @@ paths:
schema:
type: boolean
default: false
- name: dx_call_includes
in: query
description: "Limit the alerts to only ones where the DX callsign includes the supplied string (case-insensitive). Generally a complete callsign, but you can supply a shorter string for partial matches."
required: false
schema:
type: string
- name: comment_includes
in: query
description: "Return only spots where the comment includes the provided string (case-insensitive)."
@@ -239,7 +254,7 @@ paths:
/alerts:
get:
tags:
- alerts
- Alerts
summary: Get alerts
description: Retrieves alerts (indications of upcoming activations) from the system. Supply this with no query parameters to retrieve all alerts known to the system. Supply query parameters to filter what is retrieved.
operationId: alerts
@@ -270,7 +285,7 @@ paths:
type: boolean
- name: source
in: query
description: "Limit the alerts to only ones from one or more sources. To select more than one source, supply a comma-separated list."
description: "Limit the alerts to only ones from one or more sources. To select more than one source, supply a comma-separated list. The options will vary based on how the sources are named within the server's config. See the /options call for how to retrieve a list of these."
required: false
schema:
type: string
@@ -284,10 +299,12 @@ paths:
- ParksNPeaks
- ZLOTA
- WOTA
- BOTA
- Cluster
- RBN
- APRS-IS
- UKPacketNet
- TOTA
- name: sig
in: query
description: "Limit the alerts to only ones from one or more Special Interest Groups. To select more than one SIG, supply a comma-separated list."
@@ -307,13 +324,16 @@ paths:
- ARLHS
- ILLW
- ZLOTA
- KRMNPA
- IOTA
- WOTA
- BOTA
- WAB
- WAI
- TOTA
- name: dx_continent
in: query
description: "Limit the alerts to only ones where the DX (the operator being spotted) is on the given continent(s). To select more than one continent, supply a comma-separated list."
description: "Limit the alerts to only ones where the DX operator is on the given continent(s). To select more than one continent, supply a comma-separated list."
required: false
schema:
type: string
@@ -325,6 +345,12 @@ paths:
- AF
- OC
- AN
- name: dx_call_includes
in: query
description: "Limit the alerts to only ones where the DX callsign includes the supplied string (case-insensitive). Generally a complete callsign, but you can supply a shorter string for partial matches."
required: false
schema:
type: string
responses:
'200':
description: Success
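The same pattern applies to alerts. A brief sketch combining the continent and callsign filters (again, base URL and response shape are assumptions, not part of the spec):

```python
# Sketch: fetch alerts for EU DX whose callsign contains "M0" via GET /alerts.
import json
import urllib.parse
import urllib.request

query = urllib.parse.urlencode({"dx_continent": "EU", "dx_call_includes": "M0"})
with urllib.request.urlopen("https://spothole.app/api/v1/alerts?" + query, timeout=10) as resp:
    alerts = json.load(resp)  # assumed: a JSON array of Alert objects
print(len(alerts), "matching alerts")
```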
@@ -339,7 +365,7 @@ paths:
/status:
get:
tags:
- general
- General
summary: Get server status
description: Query information about the server for use in a diagnostics display.
operationId: status
@@ -416,9 +442,9 @@ paths:
/options:
get:
tags:
- general
- General
summary: Get enumeration options
description: Retrieves the list of options for various enumerated types, which can be found in the spots and also provided back to the API as query parameters. While these enumerated options are defined in this spec anyway, providing them in an API call allows us to define extra parameters, like the colours associated with bands, and also allows clients to set up their filters and features without having to have internal knowledge about, for example, what bands the server knows about.
description: Retrieves the list of options for various enumerated types, which can be found in the spots and also provided back to the API as query parameters. While these enumerated options are defined in this spec anyway, providing them in an API call allows us to define extra parameters, like the colours associated with bands, and also allows clients to set up their filters and features without needing internal knowledge of, for example, what bands the server knows about. The call also returns a variety of other parameters that may be useful to a web UI, including the contents of the "web-ui-options" config section, which gives web UI implementations (such as the built-in one) guidance on sensible configuration values, for example the number of spots/alerts to retrieve or the maximum age of spots to retrieve.
operationId: options
responses:
'200':
@@ -470,14 +496,198 @@ paths:
type: boolean
description: Whether the POST /spot call, to add spots to the server directly via its API, is permitted on this server.
example: true
web-ui-options:
type: object
properties:
spot-count:
type: array
description: An array of suggested "spot counts" that the web UI can retrieve from the API
items:
type: integer
example: 50
spot-count-default:
type: integer
example: 50
description: The suggested default "spot count" that the web UI should retrieve from the API
max-spot-age:
type: array
description: An array of suggested "maximum spot ages" that the web UI can retrieve from the API
items:
type: integer
example: 30
max-spot-age-default:
type: integer
example: 30
description: The suggested default "maximum spot age" that the web UI should retrieve from the API
alert-count:
type: array
description: An array of suggested "alert counts" that the web UI can retrieve from the API
items:
type: integer
example: 100
alert-count-default:
type: integer
example: 100
description: The suggested default "alert count" that the web UI should retrieve from the API
/lookup/call:
get:
tags:
- Utilities
summary: Look up callsign details
description: Perform a lookup of data about a certain callsign, using any of the lookup services available to the Spothole server.
operationId: call
parameters:
- name: call
in: query
description: A callsign
required: true
schema:
type: string
example: M0TRT
responses:
'200':
description: Success
content:
application/json:
schema:
type: object
properties:
call:
type: string
description: Callsign, as provided to the API
example: M0TRT
name:
type: string
description: Name of the operator
example: Ian
qth:
type: string
description: QTH of the operator. This could be from any SIG refs or could be from online lookup of their home QTH.
example: Dorset
country:
type: string
description: Country of the operator. Note that this is named "country" for commonality with other amateur radio tools, but in reality this is more of a "DXCC Name", as it includes many options which are not countries, just territories that DXCC uniquely identifies.
example: England
flag:
type: string
description: Country flag of the operator. This is limited to the range of emoji flags. For some DXCCs there may not be an official emoji flag, e.g. Northern Ireland, so the appearance may vary depending on your browser and operating system. Some small islands may also have no flag. Many DXCCs may also share a flag, e.g. mainland Spain, Balearic Islands, etc.
example: ""
continent:
type: string
description: Continent of the operator
enum:
- EU
- NA
- SA
- AS
- AF
- OC
- AN
example: EU
dxcc_id:
type: integer
description: DXCC ID of the operator
example: 235
cq_zone:
type: integer
description: CQ zone of the operator
example: 27
itu_zone:
type: integer
description: ITU zone of the operator
example: 14
grid:
type: string
description: Maidenhead grid locator for the operator's QTH. This could be from an online lookup service, or just based on the DXCC.
example: IO91aa
latitude:
type: number
description: Latitude of the operator's QTH, in degrees. This could be from an online lookup service, or just based on the DXCC.
example: 51.2345
longitude:
type: number
description: Longitude of the operator's QTH, in degrees. This could be from an online lookup service, or just based on the DXCC.
example: -1.2345
location_source:
type: string
description: Where we got the location (grid/latitude/longitude) from. Unlike a spot where we might have a summit position or WAB square, here the only options are an online QTH lookup, or a location based purely on DXCC, or nothing.
enum:
- "HOME QTH"
- DXCC
- NONE
example: "HOME QTH"
'422':
description: Validation error e.g. callsign missing or format incorrect
content:
application/json:
schema:
type: string
example: "Failed"
/lookup/sigref:
get:
tags:
- Utilities
summary: Look up SIG ref details
description: Perform a lookup of data about a certain reference, providing the SIG and the ID of the reference. A SIGRef structure will be returned containing the SIG and ID, plus any other information Spothole could find about it.
operationId: sigref
parameters:
- name: sig
in: query
description: Special Interest Group (SIG), e.g. outdoor activity programme such as POTA
required: true
schema:
type: string
enum:
- POTA
- SOTA
- WWFF
- WWBOTA
- GMA
- HEMA
- WCA
- MOTA
- SIOTA
- ARLHS
- ILLW
- ZLOTA
- KRMNPA
- IOTA
- WOTA
- BOTA
- WAB
- WAI
- TOTA
example: POTA
- name: id
in: query
description: ID of a reference in that SIG
required: true
schema:
type: string
example: GB-0001
responses:
'200':
description: Success
content:
application/json:
schema:
$ref: '#/components/schemas/SIGRef'
'422':
description: Validation error e.g. SIG not supported or reference format incorrect
content:
application/json:
schema:
type: string
example: "Failed"
/spot:
post:
tags:
- spots
- Spots
summary: Add a spot
description: "Supply a new spot object, which will be added to the system. Currently, this will not be reported up the chain to a cluster, POTA, SOTA etc. This will be introduced in a future version. cURL example: `curl --request POST --header \"Content-Type: application/json\" --data '{\"dx_call\":\"M0TRT\",\"time\":1760019539, \"freq\":14200000, \"comment\":\"Test spot please ignore\", \"de_call\":\"M0TRT\"}' https://spothole.app/api/v1/spot`"
description: "Supply a new spot object, which will be added to the system. Currently, this will not be reported up the chain to a cluster, POTA, SOTA etc. This may be introduced in a future version. cURL example: `curl --request POST --header \"Content-Type: application/json\" --data '{\"dx_call\":\"M0TRT\",\"time\":1760019539, \"freq\":14200000, \"comment\":\"Test spot please ignore\", \"de_call\":\"M0TRT\"}' https://spothole.app/api/v1/spot`"
operationId: spot
requestBody:
description: The JSON spot object
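For completeness, the cURL example in the description above translates to roughly the following Python sketch. The fields and endpoint are taken directly from that example; whether the server accepts the post depends on its post-spot-enabled setting.

```python
# Sketch: POST a new spot, mirroring the cURL example in the description above.
import json
import urllib.request

spot = {
    "dx_call": "M0TRT",
    "time": 1760019539,
    "freq": 14200000,
    "comment": "Test spot please ignore",
    "de_call": "M0TRT",
}
req = urllib.request.Request(
    "https://spothole.app/api/v1/spot",
    data=json.dumps(spot).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
with urllib.request.urlopen(req, timeout=10) as resp:
    print(resp.status)
```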
@@ -526,6 +736,30 @@ components:
type: string
description: SIG reference ID.
example: GB-0001
sig:
type: string
description: SIG that this reference is in.
enum:
- POTA
- SOTA
- WWFF
- WWBOTA
- GMA
- HEMA
- WCA
- MOTA
- SIOTA
- ARLHS
- ILLW
- ZLOTA
- KRMNPA
- IOTA
- WOTA
- BOTA
- WAB
- WAI
- TOTA
example: POTA
name:
type: string
description: SIG reference name
@@ -534,6 +768,18 @@ components:
type: string
description: SIG reference URL, which the user can look up for more information
example: "https://pota.app/#/park/GB-0001"
grid:
type: string
description: Maidenhead grid locator for the reference, if known.
example: IO91aa
latitude:
type: number
description: Latitude of the reference, in degrees, if known.
example: 51.2345
longitude:
type: number
description: Longitude of the reference, in degrees, if known.
example: -1.2345
Spot:
type: object
@@ -550,13 +796,17 @@ components:
type: string
description: Name of the operator that has been spotted
example: Ian
dx_qth:
type: string
description: QTH of the operator that has been spotted. This could be from any SIG refs or could be from online lookup of their home QTH.
example: Dorset
dx_country:
type: string
description: Country of the DX operator
example: United Kingdom
description: Country of the DX operator. Note that this is named "country" for commonality with other amateur radio tools, but in reality this is more of a "DXCC Name", as it includes many options which are not countries, just territories that DXCC uniquely identifies.
example: England
dx_flag:
type: string
description: Country flag of the DX operator
description: Country flag of the DX operator. This is limited to the range of emoji flags. For some DXCCs there may not be an official emoji flag, e.g. Northern Ireland, so the appearance may vary depending on your browser and operating system. Some small islands may also have no flag. Many DXCCs may also share a flag, e.g. mainland Spain, Balearic Islands, etc.
example: ""
dx_continent:
type: string
@@ -600,17 +850,18 @@ components:
example: -1.2345
dx_location_source:
type: string
description: Where we got the DX location (grid/latitude/longitude) from. If this was from the spot itself, it's likely quite accurate, but if we had to fall back to QRZ lookup, or even a location based on the DXCC itself, it will be a lot less accurate.
description: Where we got the DX location (grid/latitude/longitude) from. If this was from the spot itself, or from a lookup of the SIG ref (e.g. park) it's likely quite accurate, but if we had to fall back to QRZ lookup, or even a location based on the DXCC itself, it will be a lot less accurate.
enum:
- SPOT
- "SIG REF LOOKUP"
- "WAB/WAI GRID"
- QRZ
- "HOME QTH"
- DXCC
- NONE
example: SPOT
dx_location_good:
type: boolean
description: Does the software think the location is good enough to put a marker on a map? This is true if the source is "SPOT" or "WAB/WAI GRID", or alternatively if the source is "QRZ" and the callsign doesn't have a slash in it (i.e. operator likely at home).
description: Does the software think the location is good enough to put a marker on a map? This is true if the source is "SPOT", "SIG REF LOOKUP" or "WAB/WAI GRID", or alternatively if the source is "HOME QTH" and the callsign doesn't have a slash in it (i.e. operator likely at home).
example: true
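The rule described above amounts to a small predicate. A sketch of it, as described in this spec rather than as the server's actual code:

```python
# Sketch of the dx_location_good rule as described above (not the server's actual code).
def location_good(source: str, dx_call: str) -> bool:
    if source in ("SPOT", "SIG REF LOOKUP", "WAB/WAI GRID"):
        return True
    # A HOME QTH location is only trusted when the callsign has no slash,
    # i.e. the operator is probably at their home location.
    return source == "HOME QTH" and "/" not in dx_call
```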
de_call:
type: string
@@ -618,11 +869,11 @@ components:
example: M0TEST
de_country:
type: string
description: Country of the spotter
example: United Kingdom
description: Country of the spotter. Note that this is named "country" for commonality with other amateur radio tools, but in reality this is more of a "DXCC Name", as it includes many options which are not countries, just territories that DXCC uniquely identifies.
example: England
de_flag:
type: string
description: Country flag of the spotter
description: Country flag of the spotter. This is limited to the range of emoji flags. For some DXCCs there may not be an official emoji flag, e.g. Northern Ireland, so the appearance may vary depending on your browser and operating system. Some small islands may also have no flag. Many DXCCs may also share a flag, e.g. mainland Spain, Balearic Islands, etc.
example: ""
de_continent:
type: string
@@ -654,7 +905,7 @@ components:
example: 51.2345
de_longitude:
type: number
description: Longitude of the DX spotspotter, in degrees. This is not going to be from a xOTA reference so it will likely just be a QRZ or DXCC lookup. If the spotter is also portable, this is probably wrong, but it's good enough for some simple mapping.
description: Longitude of the spotter, in degrees. This is not going to be from a xOTA reference so it will likely just be a QRZ or DXCC lookup. If the spotter is also portable, this is probably wrong, but it's good enough for some simple mapping.
example: -1.2345
mode:
type: string
@@ -766,10 +1017,13 @@ components:
- ARLHS
- ILLW
- ZLOTA
- KRMNPA
- IOTA
- WOTA
- BOTA
- WAB
- WAI
- TOTA
example: POTA
sig_refs:
type: array
@@ -798,7 +1052,7 @@ components:
example: false
source:
type: string
description: Where we got the spot from.
description: Where we got the spot from. The options will vary based on how the sources are named within the server's config. See the /options call for how to retrieve a list of these.
enum:
- POTA
- SOTA
@@ -813,6 +1067,7 @@ components:
- RBN
- APRS-IS
- UKPacketNet
- TOTA
example: POTA
source_id:
type: string
@@ -841,11 +1096,11 @@ components:
example: Ian
dx_country:
type: string
description: Country of the DX operator. This, and the subsequent fields, assume that all activators will be in the same country!
example: United Kingdom
description: Country of the DX operator. Note that this is named "country" for commonality with other amateur radio tools, but in reality this is more of a "DXCC Name", as it includes many options which are not countries, just territories that DXCC uniquely identifies. This, and the subsequent fields, assume that all activators will be in the same country!
example: England
dx_flag:
type: string
description: Country flag of the DX operator
description: Country flag of the DX operator. This is limited to the range of emoji flags. For some DXCCs there may not be an official emoji flag, e.g. Northern Ireland, so the appearance may vary depending on your browser and operating system. Some small islands may also have no flag. Many DXCCs may also share a flag, e.g. mainland Spain, Balearic Islands, etc.
example: ""
dx_continent:
type: string
@@ -921,6 +1176,7 @@ components:
- ZLOTA
- IOTA
- WOTA
- BOTA
- WAB
- WAI
example: POTA
@@ -950,6 +1206,7 @@ components:
- ParksNPeaks
- ZLOTA
- WOTA
- BOTA
- Cluster
- RBN
- APRS-IS

View File

@@ -7,7 +7,7 @@
/* INTRO/WARNING BOXES */
#intro-box {
.permanently-dismissible-box {
display: none;
}
@@ -43,6 +43,12 @@ div.container {
/* SPOTS/ALERTS PAGES, SETTINGS/STATUS AREAS */
input#filter-dx-call {
max-width: 12em;
margin-right: 1rem;
padding-left: 2em;
}
div.appearing-panel {
display: none;
}
@@ -59,25 +65,22 @@ button#add-spot-button {
/* SPOTS/ALERTS PAGES, MAIN TABLE */
/* Custom version of Bootstrap table colouring to colour 2 in every 4 rows, because of our second row per spot that
appears on mobile */
.table-striped-custom > tbody > tr:nth-of-type(4n+3) > *,
.table-striped-custom > tbody > tr:nth-of-type(4n+4) > * {
--bs-table-color-type: var(--bs-table-striped-color);
--bs-table-bg-type: var(--bs-table-striped-bg);
}
td.nowrap, span.nowrap {
text-wrap: nowrap;
}
span.flag-wrapper {
display: inline-block;
width: 1.7em;
width: 1.8em;
text-align: center;
cursor: default;
}
img.flag {
position: relative;
top: -2px;
}
span.band-bullet {
display: inline-block;
cursor: default;
@@ -248,13 +251,28 @@ div.band-spot:hover span.band-spot-info {
/* GENERAL MOBILE SUPPORT */
@media (max-width: 991.99px) {
/* General "hide this on mobile" class */
.hideonmobile {
display: none !important;
}
/* Make map stretch to horizontal screen edges */
div#map, div#table-container, div#bands-container {
margin-left: -1em;
margin-right: -1em;
}
/* Avoid map page filters panel being larger than the map itself */
#maptools .appearing-panel {
max-height: 30em;
}
#maptools .appearing-panel .card-body {
max-height: 26em;
overflow: scroll;
}
/* Filter/search DX Call field should be smaller on mobile */
input#filter-dx-call {
max-width: 9em;
margin-right: 0;
}
}
@media (min-width: 992px) {

BIN
webassets/img/flags/*.png — new binary flag image files (1.png, 10.png, 100.png, 101.png, … 148.png, among others). Binary files not shown; sizes range from 348 B to 14 KiB.

Some files were not shown because too many files have changed in this diff.