Mirror of https://git.ianrenton.com/ian/spothole.git, synced 2026-04-30 10:45:57 +00:00

Compare commits: web_ui_opt...11dd8fa77f (49 commits)
Commits in this comparison (author and date columns were not captured):

11dd8fa77f, a44b4f5eb6, edbbb13087, c58c22d9a9, 11cec58f75, 9814b656b2, 936e675d56, 14c4e6f221, 041216c5bb, 8257ec492d, 02f564b515, 7de3cdc49c, 6f0101a861, 4fe8dfc36a, 44f38b8114, 5de5a7ffdf, ed1f9e5b06, 11d71629ce, ee47d736eb, a55179d944, 8127122c11, 91276067b9, 126ebcb8b2, 2a5e0db5bc, 1173af6a9d, ce99bbc6cf, 4861e42798, b0a7e4ea81, b6407b4f66, 30c6222fa0, 07b7ce49da, 3792e9f4d9, 6982354364, 6b18ec6f88, 068c732796, e6c9bb1853, 6e7ffd626e, 4c22861666, 76f289d66e, 29afcce504, 3cd1352ff3, 9241a26a47, 3be63a8dd6, 1e3cec1599, 7b409bcb67, 47b4ddb5c8, 94094974d0, 5230fa535f, 2be1c5b3d3
.idea/spothole.iml (2 changes, generated file)
@@ -4,7 +4,7 @@
     <content url="file://$MODULE_DIR$">
       <excludeFolder url="file://$MODULE_DIR$/.venv" />
     </content>
-    <orderEntry type="jdk" jdkName="Python 3.13 virtualenv at ~/code/spothole/.venv" jdkType="Python SDK" />
+    <orderEntry type="jdk" jdkName="Python 3.13 (spothole)" jdkType="Python SDK" />
     <orderEntry type="sourceFolder" forTests="false" />
   </component>
 </module>
README.md (14 changes)
@@ -69,6 +69,8 @@ Various approaches exist to writing your own client, but in general:
 
 If you want to run a copy of Spothole with different configuration settings than the main instance, you can download it and run it on your own local machine or server.
 
+You will require Python version 3.8 or later. If you encounter an error about `gdal-config` during the following process, you will also need `libgdal-dev` installed.
+
 To download and set up Spothole on a Debian server, run the following commands. Other operating systems will likely be similar.
 
 ```bash
@@ -254,10 +256,11 @@ server {
     }
 
     location / {
-        add_header Access-Control-Allow-Origin $xssorigin;
         proxy_http_version 1.1;
         proxy_set_header Connection "";
         proxy_pass http://127.0.0.1:8080;
+        proxy_hide_header Access-Control-Allow-Origin;
+        add_header Access-Control-Allow-Origin $xssorigin;
     }
 }
 ```
@@ -266,9 +269,9 @@ One further change you might want to make to the file above is the `add_header A
 my own Spothole server to make sure that other third-party web-based software can get the data from my instance, and applies to any endpoint underneath `/api`. If you want
 *your* Spothole instance to be set up the same way, so that others can write software in JavaScript that can access it,
 leave this intact. But if you want your Spothole instance to only be usable by scripts running on the web server you write,
-you can remove this block. (Note that this doesn't stop other people writing *non-web-based* software that accesses your
+you can remove this line. (Note that this doesn't stop other people writing *non-web-based* software that accesses your
 Spothole API—the enforcement of cross-origin headers only happens within the user's browser. If you need to lock your
-instance down so that no-one else can access it with *any* software, that's an aspect of nginx config that you will need
+instance down so that no-one else can access it with *any* software, that's an aspect of nginx or firewall config that you will need
 to find help with elsewhere.)
 
 Now, make a symbolic link to enable the site:
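The point made above — that CORS enforcement happens only in the user's browser — can be sketched in a few lines. This is an illustrative example, not Spothole code: `browser_would_allow` is a hypothetical helper that mimics the decision a browser makes from the `Access-Control-Allow-Origin` response header; non-browser clients such as `curl` or Python's `requests` never perform this check at all.

```python
def browser_would_allow(response_headers, origin):
    # Mimic the browser-side CORS read check: the response must carry an
    # Access-Control-Allow-Origin header matching "*" or the page's origin.
    allow = response_headers.get("Access-Control-Allow-Origin")
    return allow == "*" or allow == origin

# With the add_header line in place, cross-origin JavaScript may read the response:
print(browser_would_allow({"Access-Control-Allow-Origin": "*"}, "https://example.com"))  # True
# With it removed, browsers refuse the read; non-browser clients are unaffected:
print(browser_would_allow({}, "https://example.com"))  # False
```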
@@ -300,6 +303,7 @@ To navigate your way around the source code, this list may help.
 * `/data` - Data storage classes
 * `/spotproviders` - Classes providing spots by accessing the APIs of other services
 * `/alertproviders` - Classes providing alerts by accessing the APIs of other services
+* `/solarconditionsproviders` - Classes providing solar and propagation data by accessing the APIs of other services
 * `/server` - Classes for running Spothole's own web server
 
 *Templates*
@@ -344,6 +348,8 @@ The same approach as above is also used for alert providers.
 
 As well as being my work, I have also gratefully received feature patches from Steven, M1SDH.
 
+The project contains GeoJSON files for CQ and ITU zones, in the `/datafiles/` directory. These are MIT-licenced and, to my knowledge, created by HA8TKS for his CQ and ITU zone layers for Leaflet.
+
 The project contains a self-hosted copy of Font Awesome's free library, in the `/webassets/fa/` directory. This is subject to Font Awesome's licence and is not covered by the overall licence declared in the `LICENSE` file. This approach was taken in preference to using their hosted kits due to the popularity of this project exceeding the page view limit for their free hosted offering.
 
 The project contains a set of flag icons generated using the "Noto Color Emoji" font on a Debian system, in the `/webassets/img/flags/` directory.
@@ -352,4 +358,6 @@ The software uses a number of Python libraries as listed in `requirements.txt`,
 
 Particular thanks go to country-files.com for providing country lookup data for amateur radio, to K0SWE for [this JSON-formatted DXCC data](https://github.com/k0swe/dxcc-json/), and to the developers of `pyhamtools` for making it easy to use country-files.com data as well as QRZ.com and Clublog lookup.
 
+Amateur radio clusters, outdoor programmes, propagation data providers etc. are almost all volunteer-run services that make little or no profit, and are done for the love of amateur radio. Services like Spothole, which build on top of them, are truly standing on the shoulders of giants. None of this would have been possible without the hard work and dedication of many other people within the amateur radio community.
+
 The project's name was suggested by Harm, DK4HAA. Thanks!
@@ -5,46 +5,51 @@ import pytz
 from core.config import MAX_ALERT_AGE
 
 
-# Generic alert provider class. Subclasses of this query the individual APIs for alerts.
 class AlertProvider:
+    """Generic alert provider class. Subclasses of this query the individual APIs for alerts."""
 
-    # Constructor
     def __init__(self, provider_config):
+        """Constructor"""
         self.name = provider_config["name"]
         self.enabled = provider_config["enabled"]
         self.last_update_time = datetime.min.replace(tzinfo=pytz.UTC)
         self.status = "Not Started" if self.enabled else "Disabled"
-        self.alerts = None
-        self.web_server = None
+        self._alerts = None
+        self._web_server = None
 
-    # Set up the provider, e.g. giving it the alert list to work from
     def setup(self, alerts, web_server):
-        self.alerts = alerts
-        self.web_server = web_server
+        """Set up the provider, e.g. giving it the alert list to work from"""
+        self._alerts = alerts
+        self._web_server = web_server
 
-    # Start the provider. This should return immediately after spawning threads to access the remote resources
     def start(self):
+        """Start the provider. This should return immediately after spawning threads to access the remote resources"""
         raise NotImplementedError("Subclasses must implement this method")
 
-    # Submit a batch of alerts retrieved from the provider. There is no timestamp checking like there is for spots,
-    # because alerts could be created at any point for any time in the future. Rely on hashcode-based id matching
-    # to deal with duplicates.
-    def submit_batch(self, alerts):
+    def _submit_batch(self, alerts):
+        """Submit a batch of alerts retrieved from the provider. There is no timestamp checking like there is for spots,
+        because alerts could be created at any point for any time in the future. Rely on hashcode-based id matching
+        to deal with duplicates."""
         # Sort the batch so that earliest ones go in first. This helps keep the ordering correct when alerts are fired
         # off to SSE listeners.
-        alerts = sorted(alerts, key=lambda alert: (alert.start_time if alert and alert.start_time else 0))
+        alerts = sorted(alerts, key=lambda a: (a.start_time if a and a.start_time else 0))
         for alert in alerts:
             # Fill in any blanks and add to the list
             alert.infer_missing()
-            self.add_alert(alert)
+            self._add_alert(alert)
 
-    def add_alert(self, alert):
+    def _add_alert(self, alert):
         if not alert.expired():
-            self.alerts.add(alert.id, alert, expire=MAX_ALERT_AGE)
+            self._alerts.add(alert.id, alert, expire=MAX_ALERT_AGE)
             # Ping the web server in case we have any SSE connections that need to see this immediately
-            if self.web_server:
-                self.web_server.notify_new_alert(alert)
+            if self._web_server:
+                self._web_server.notify_new_alert(alert)
 
-    # Stop any threads and prepare for application shutdown
     def stop(self):
+        """Stop any threads and prepare for application shutdown"""
         raise NotImplementedError("Subclasses must implement this method")
@@ -8,15 +8,16 @@ from data.alert import Alert
 from data.sig_ref import SIGRef
 
 
-# Alert provider for Beaches on the Air
 class BOTA(HTTPAlertProvider):
+    """Alert provider for Beaches on the Air"""
 
     POLL_INTERVAL_SEC = 1800
     ALERTS_URL = "https://www.beachesontheair.com/"
 
     def __init__(self, provider_config):
         super().__init__(provider_config, self.ALERTS_URL, self.POLL_INTERVAL_SEC)
 
-    def http_response_to_alerts(self, http_response):
+    def _http_response_to_alerts(self, http_response):
         new_alerts = []
         # Find the table of upcoming alerts
         bs = BeautifulSoup(http_response.content.decode(), features="lxml")
@@ -33,7 +34,7 @@ class BOTA(HTTPAlertProvider):
 
         # Get the date, dealing with the fact we get no year so have to figure out if it's last year or next year
         date_text = str(cells[2].find('span').contents[0]).strip()
-        date_time = datetime.strptime(date_text,"%d %b - %H:%M UTC").replace(tzinfo=pytz.UTC)
+        date_time = datetime.strptime(date_text, "%d %b - %H:%M UTC").replace(tzinfo=pytz.UTC)
         date_time = date_time.replace(year=datetime.now(pytz.UTC).year)
         # If this was more than a day ago, activation is actually next year
         if date_time < datetime.now(pytz.UTC) - timedelta(days=1):
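The year-inference trick in the hunk above is worth isolating, since it is a common pattern for feeds that omit the year from their timestamps. A stdlib-only sketch follows (the real code uses `pytz`, so `timezone.utc` stands in here, and `infer_year` is a name invented for illustration): assume the current year, then bump to next year if that would put the activation more than a day in the past.

```python
from datetime import datetime, timedelta, timezone

def infer_year(date_text, now):
    # Parse e.g. "12 Mar - 14:00 UTC" (no year given) and assume the current year...
    dt = datetime.strptime(date_text, "%d %b - %H:%M UTC").replace(tzinfo=timezone.utc)
    dt = dt.replace(year=now.year)
    # ...then, if that lands more than a day in the past, the activation
    # must actually refer to next year.
    if dt < now - timedelta(days=1):
        dt = dt.replace(year=now.year + 1)
    return dt

now = datetime(2026, 12, 30, tzinfo=timezone.utc)
print(infer_year("2 Jan - 10:00 UTC", now).year)   # 2027: that date already passed this year
print(infer_year("31 Dec - 10:00 UTC", now).year)  # 2026: still upcoming
```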
@@ -1,7 +1,6 @@
 import logging
 from datetime import datetime
-from threading import Timer, Thread
-from time import sleep
+from threading import Thread, Event
 
 import pytz
 import requests
@@ -10,54 +9,57 @@ from alertproviders.alert_provider import AlertProvider
 from core.constants import HTTP_HEADERS
 
 
-# Generic alert provider class for providers that request data via HTTP(S). Just for convenience to avoid code
-# duplication. Subclasses of this query the individual APIs for data.
 class HTTPAlertProvider(AlertProvider):
+    """Generic alert provider class for providers that request data via HTTP(S). Just for convenience to avoid code
+    duplication. Subclasses of this query the individual APIs for data."""
 
     def __init__(self, provider_config, url, poll_interval):
         super().__init__(provider_config)
-        self.url = url
-        self.poll_interval = poll_interval
-        self.poll_timer = None
+        self._url = url
+        self._poll_interval = poll_interval
+        self._thread = None
+        self._stop_event = Event()
 
     def start(self):
-        # Fire off a one-shot thread to run poll() for the first time, just to ensure start() returns immediately and
-        # the application can continue starting. The thread itself will then die, and the timer will kick in on its own
-        # thread.
-        logging.info("Set up query of " + self.name + " alert API every " + str(self.poll_interval) + " seconds.")
-        thread = Thread(target=self.poll)
-        thread.daemon = True
-        thread.start()
+        # Fire off the polling thread. It will poll immediately on startup, then sleep for poll_interval between
+        # subsequent polls, so start() returns immediately and the application can continue starting.
+        logging.info("Set up query of " + self.name + " alert API every " + str(self._poll_interval) + " seconds.")
+        self._thread = Thread(target=self._run, daemon=True)
+        self._thread.start()
 
     def stop(self):
-        if self.poll_timer:
-            self.poll_timer.cancel()
+        self._stop_event.set()
 
-    def poll(self):
+    def _run(self):
+        while True:
+            self._poll()
+            if self._stop_event.wait(timeout=self._poll_interval):
+                break
+
+    def _poll(self):
         try:
             # Request data from API
             logging.debug("Polling " + self.name + " alert API...")
-            http_response = requests.get(self.url, headers=HTTP_HEADERS)
+            http_response = requests.get(self._url, headers=HTTP_HEADERS)
             # Pass off to the subclass for processing
-            new_alerts = self.http_response_to_alerts(http_response)
+            new_alerts = self._http_response_to_alerts(http_response)
             # Submit the new alerts for processing. There might not be any alerts for the less popular programs.
             if new_alerts:
-                self.submit_batch(new_alerts)
+                self._submit_batch(new_alerts)
 
             self.status = "OK"
             self.last_update_time = datetime.now(pytz.UTC)
             logging.debug("Received data from " + self.name + " alert API.")
 
-        except Exception as e:
+        except Exception:
             self.status = "Error"
             logging.exception("Exception in HTTP JSON Alert Provider (" + self.name + ")")
-            sleep(1)
+            # Brief pause on error before the next poll, but still respond promptly to stop()
+            self._stop_event.wait(timeout=1)
 
-        self.poll_timer = Timer(self.poll_interval, self.poll)
-        self.poll_timer.start()
-
-    # Convert an HTTP response returned by the API into alert data. The whole response is provided here so the subclass
-    # implementations can check for HTTP status codes if necessary, and handle the response as JSON, XML, text, whatever
-    # the API actually provides.
-    def http_response_to_alerts(self, http_response):
+    def _http_response_to_alerts(self, http_response):
+        """Convert an HTTP response returned by the API into alert data. The whole response is provided here so the subclass
+        implementations can check for HTTP status codes if necessary, and handle the response as JSON, XML, text, whatever
+        the API actually provides."""
         raise NotImplementedError("Subclasses must implement this method")
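The switch above from `Timer` to a single thread plus `Event` is the core of this refactor: `Event.wait(timeout=...)` doubles as an interruptible sleep, so `stop()` takes effect immediately rather than leaving a pending timer behind. A self-contained sketch of the pattern (the `Poller` class and its names are illustrative, not Spothole's):

```python
from threading import Thread, Event

class Poller:
    def __init__(self, poll_interval, action):
        self._poll_interval = poll_interval
        self._action = action
        self._stop_event = Event()
        self._thread = None

    def start(self):
        # Daemon thread: polls once immediately, then every poll_interval seconds.
        self._thread = Thread(target=self._run, daemon=True)
        self._thread.start()

    def stop(self):
        # Wakes the wait() below at once; there is no timer left to cancel.
        self._stop_event.set()
        self._thread.join()

    def _run(self):
        while True:
            self._action()
            # wait() returns True as soon as stop() sets the event,
            # otherwise False after poll_interval elapses.
            if self._stop_event.wait(timeout=self._poll_interval):
                break

polls = []
p = Poller(3600, lambda: polls.append(1))  # hour-long interval...
p.start()
p.stop()                                   # ...yet stop() returns almost instantly
print(len(polls))  # 1: exactly one poll ran before shutdown
```

Note the loop polls before it waits, which preserves the original behaviour of an immediate first poll on startup.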
@@ -2,14 +2,15 @@ import re
 from datetime import datetime
 
 import pytz
-from rss_parser import RSSParser
+from rss_parser import Parser
 
 from alertproviders.http_alert_provider import HTTPAlertProvider
 from data.alert import Alert
 
 
-# Alert provider NG3K DXpedition list
 class NG3K(HTTPAlertProvider):
+    """Alert provider NG3K DXpedition list"""
 
     POLL_INTERVAL_SEC = 1800
     ALERTS_URL = "https://www.ng3k.com/adxo.xml"
     AS_CALL_PATTERN = re.compile("as ([a-z0-9/]+)", re.IGNORECASE)
@@ -17,9 +18,9 @@ class NG3K(HTTPAlertProvider):
     def __init__(self, provider_config):
         super().__init__(provider_config, self.ALERTS_URL, self.POLL_INTERVAL_SEC)
 
-    def http_response_to_alerts(self, http_response):
+    def _http_response_to_alerts(self, http_response):
         new_alerts = []
-        rss = RSSParser.parse(http_response.content.decode())
+        rss = Parser.parse(http_response.content.decode())
         # Iterate through source data
         for source_alert in rss.channel.items:
             # Deal with "the format"...
@@ -48,7 +49,8 @@ class NG3K(HTTPAlertProvider):
 
         start_timestamp = datetime.strptime(start_year + " " + start_mon + " " + start_day, "%Y %b %d").replace(
             tzinfo=pytz.UTC).timestamp()
-        end_timestamp = datetime.strptime(end_year + " " + end_mon + " " + end_day + " 23:59", "%Y %b %d %H:%M").replace(
+        end_timestamp = datetime.strptime(end_year + " " + end_mon + " " + end_day + " 23:59",
+                                          "%Y %b %d %H:%M").replace(
             tzinfo=pytz.UTC).timestamp()
 
         # Sometimes the DX callsign is "real", sometimes you just get a prefix with the real working callsigns being
@@ -62,7 +64,7 @@ class NG3K(HTTPAlertProvider):
         dx_calls = [parts[2].upper()]
 
         # "Calls" of TBA, TBC or TBD are not real attempts at Turkish callsigns
-        dx_calls = list(filter(lambda a: a != "TBA" and a != "TBC" and a != "TBD" , dx_calls))
+        dx_calls = list(filter(lambda a: a != "TBA" and a != "TBC" and a != "TBD", dx_calls))
 
         dx_country = parts[1]
         qsl_info = parts[3]
@@ -8,15 +8,16 @@ from data.alert import Alert
 from data.sig_ref import SIGRef
 
 
-# Alert provider for Parks n Peaks
 class ParksNPeaks(HTTPAlertProvider):
+    """Alert provider for Parks n Peaks"""
 
     POLL_INTERVAL_SEC = 1800
     ALERTS_URL = "http://parksnpeaks.org/api/ALERTS/"
 
     def __init__(self, provider_config):
         super().__init__(provider_config, self.ALERTS_URL, self.POLL_INTERVAL_SEC)
 
-    def http_response_to_alerts(self, http_response):
+    def _http_response_to_alerts(self, http_response):
         new_alerts = []
         # Iterate through source data
         for source_alert in http_response.json():
@@ -44,7 +45,7 @@ class ParksNPeaks(HTTPAlertProvider):
 
         # Log a warning for the developer if PnP gives us an unknown programme we've never seen before
         if sig and sig not in ["POTA", "SOTA", "WWFF", "SiOTA", "ZLOTA", "KRMNPA"]:
-            logging.warn("PNP alert found with sig " + sig + ", developer needs to add support for this!")
+            logging.warning("PNP alert found with sig " + sig + ", developer needs to add support for this!")
 
         # If this is POTA, SOTA or WWFF data we already have it through other means, so ignore. Otherwise, add to
         # the alert list. Note that while ZLOTA has its own spots API, it doesn't have its own alerts API. So that
@@ -7,15 +7,16 @@ from data.alert import Alert
 from data.sig_ref import SIGRef
 
 
-# Alert provider for Parks on the Air
 class POTA(HTTPAlertProvider):
+    """Alert provider for Parks on the Air"""
 
     POLL_INTERVAL_SEC = 1800
     ALERTS_URL = "https://api.pota.app/activation"
 
     def __init__(self, provider_config):
         super().__init__(provider_config, self.ALERTS_URL, self.POLL_INTERVAL_SEC)
 
-    def http_response_to_alerts(self, http_response):
+    def _http_response_to_alerts(self, http_response):
         new_alerts = []
         # Iterate through source data
         for source_alert in http_response.json():
@@ -25,7 +26,8 @@ class POTA(HTTPAlertProvider):
                 dx_calls=[source_alert["activator"].upper()],
                 freqs_modes=source_alert["frequencies"],
                 comment=source_alert["comments"],
-                sig_refs=[SIGRef(id=source_alert["reference"], sig="POTA", name=source_alert["name"], url="https://pota.app/#/park/" + source_alert["reference"])],
+                sig_refs=[SIGRef(id=source_alert["reference"], sig="POTA", name=source_alert["name"],
+                                 url="https://pota.app/#/park/" + source_alert["reference"])],
                 start_time=datetime.strptime(source_alert["startDate"] + source_alert["startTime"],
                                              "%Y-%m-%d%H:%M").replace(tzinfo=pytz.UTC).timestamp(),
                 end_time=datetime.strptime(source_alert["endDate"] + source_alert["endTime"],
@@ -7,15 +7,16 @@ from data.alert import Alert
 from data.sig_ref import SIGRef
 
 
-# Alert provider for Summits on the Air
 class SOTA(HTTPAlertProvider):
+    """Alert provider for Summits on the Air"""
 
     POLL_INTERVAL_SEC = 1800
     ALERTS_URL = "https://api-db2.sota.org.uk/api/alerts/365/all/all"
 
     def __init__(self, provider_config):
         super().__init__(provider_config, self.ALERTS_URL, self.POLL_INTERVAL_SEC)
 
-    def http_response_to_alerts(self, http_response):
+    def _http_response_to_alerts(self, http_response):
         new_alerts = []
         # Iterate through source data
         for source_alert in http_response.json():
@@ -31,7 +32,9 @@ class SOTA(HTTPAlertProvider):
                 dx_names=[source_alert["activatorName"].upper()],
                 freqs_modes=source_alert["frequency"],
                 comment=source_alert["comments"],
-                sig_refs=[SIGRef(id=source_alert["associationCode"] + "/" + source_alert["summitCode"], sig="SOTA", name=summit_name, activation_score=summit_points)],
+                sig_refs=[
+                    SIGRef(id=source_alert["associationCode"] + "/" + source_alert["summitCode"], sig="SOTA",
+                           name=summit_name, activation_score=summit_points)],
                 start_time=datetime.strptime(source_alert["dateActivated"],
                                              "%Y-%m-%dT%H:%M:%SZ").replace(tzinfo=pytz.UTC).timestamp(),
                 is_dxpedition=False)
@@ -1,15 +1,16 @@
|
|||||||
from datetime import datetime
|
from datetime import datetime
|
||||||
|
|
||||||
import pytz
|
import pytz
|
||||||
from rss_parser import RSSParser
|
from rss_parser import Parser as RSSParser
|
||||||
|
|
||||||
from alertproviders.http_alert_provider import HTTPAlertProvider
|
from alertproviders.http_alert_provider import HTTPAlertProvider
|
||||||
from data.alert import Alert
|
from data.alert import Alert
|
||||||
from data.sig_ref import SIGRef
|
from data.sig_ref import SIGRef
|
||||||
|
|
||||||
|
|
||||||
# Alert provider for Wainwrights on the Air
|
|
||||||
class WOTA(HTTPAlertProvider):
|
class WOTA(HTTPAlertProvider):
|
||||||
|
"""Alert provider for Wainwrights on the Air"""
|
||||||
|
|
||||||
POLL_INTERVAL_SEC = 1800
|
POLL_INTERVAL_SEC = 1800
|
||||||
ALERTS_URL = "https://www.wota.org.uk/alerts_rss.php"
|
ALERTS_URL = "https://www.wota.org.uk/alerts_rss.php"
|
||||||
RSS_DATE_TIME_FORMAT = "%a, %d %b %Y %H:%M:%S %z"
|
RSS_DATE_TIME_FORMAT = "%a, %d %b %Y %H:%M:%S %z"
|
||||||
@@ -17,7 +18,7 @@ class WOTA(HTTPAlertProvider):
|
|||||||
def __init__(self, provider_config):
|
def __init__(self, provider_config):
|
||||||
super().__init__(provider_config, self.ALERTS_URL, self.POLL_INTERVAL_SEC)
|
super().__init__(provider_config, self.ALERTS_URL, self.POLL_INTERVAL_SEC)
|
||||||
|
|
||||||
def http_response_to_alerts(self, http_response):
|
def _http_response_to_alerts(self, http_response):
|
||||||
new_alerts = []
|
new_alerts = []
|
||||||
rss = RSSParser.parse(http_response.content.decode())
|
rss = RSSParser.parse(http_response.content.decode())
|
||||||
# Iterate through source data
|
# Iterate through source data
|
||||||
|
@@ -7,15 +7,16 @@ from data.alert import Alert
 from data.sig_ref import SIGRef
 
 
-# Alert provider for Worldwide Flora and Fauna
 class WWFF(HTTPAlertProvider):
+    """Alert provider for Worldwide Flora and Fauna"""
 
     POLL_INTERVAL_SEC = 1800
     ALERTS_URL = "https://spots.wwff.co/static/agendas.json"
 
     def __init__(self, provider_config):
         super().__init__(provider_config, self.ALERTS_URL, self.POLL_INTERVAL_SEC)
 
-    def http_response_to_alerts(self, http_response):
+    def _http_response_to_alerts(self, http_response):
         new_alerts = []
         # Iterate through source data
         for source_alert in http_response.json():
@@ -6,6 +6,9 @@
 # this as "N0CALL" and it shouldn't do any harm, as we're not sending anything to the various networks, only receiving.
 server-owner-callsign: "N0CALL"
 
+# The base URL at which the software runs.
+base-url: "http://localhost:8080"
+
 # Spot providers to use. This is an example set, tailor it to your liking by commenting and uncommenting.
 # RBN and APRS-IS are supported but have such a high data rate, you probably don't want them enabled.
 # Each provider needs a class, a name, and an enabled/disabled state. Some require more config such as hostnames/IP
@@ -125,7 +128,6 @@ spot-providers:
     sig: "TOTA"
     locations-csv: "datafiles/39c3-tota.csv"
 
-
 # Alert providers to use. Same setup as the spot providers list above.
 alert-providers:
   -
@@ -157,6 +159,15 @@ alert-providers:
     name: "NG3K"
     enabled: true
 
+
+# Solar condition providers to use. These poll external APIs for solar propagation data (SFI, A/K indices, band
+# conditions, etc.) and make it available via the /api/v1/solar endpoint.
+solar-condition-providers:
+  -
+    class: "HamQSL"
+    name: "HamQSL"
+    enabled: true
+
 # Port to open the local web server on
 web-server-port: 8080
 
@@ -189,3 +200,14 @@ web-ui-options:
   max-spot-age-default: 30
   alert-count: [25, 50, 100, 200, 500]
   alert-count-default: 100
+  # Default UI colour scheme. Supported values are "light", "dark" and "auto" (i.e. use the browser/OS colour scheme).
+  # Users can still override this in the UI to their own preference.
+  color-scheme-default: "auto"
+  # Default band colour scheme. Supported values are the full names of any band colour scheme shown in the UI.
+  # Users can still override this in the UI to their own preference.
+  band-color-scheme-default: "PSK Reporter (Adjusted)"
+  # Custom HTML insert. This can be any arbitrary HTML. It will be inserted next to the start/stop buttons on the spots
+  # (home) page, although being arbitrary HTML you can also use a div with absolute, relative, float placement etc. This
+  # is designed for a "donate/support the server" type button, though you are free to do whatever you want with it.
+  # As the server owner you are responsible for the safe usage of this option!
+  support-button-html: ""
@@ -7,4 +7,4 @@ from requests_cache import CachedSession
 # of time has passed. This is used throughout Spothole to cache data that does not change
 # rapidly.
 SEMI_STATIC_URL_DATA_CACHE = CachedSession("cache/semi_static_url_data_cache",
                                            expire_after=timedelta(days=30))
@@ -1,67 +1,73 @@
 import logging
 from datetime import datetime
-from threading import Timer
-from time import sleep
+from threading import Event, Thread
 
 import pytz
 
 
-# Provides a timed cleanup of the spot list.
 class CleanupTimer:
+    """Provides a timed cleanup of the spot list."""
 
-    # Constructor
     def __init__(self, spots, alerts, web_server, cleanup_interval):
-        self.spots = spots
-        self.alerts = alerts
-        self.web_server = web_server
-        self.cleanup_interval = cleanup_interval
-        self.cleanup_timer = None
+        """Constructor"""
+
+        self._spots = spots
+        self._alerts = alerts
+        self._web_server = web_server
+        self._cleanup_interval = cleanup_interval
         self.last_cleanup_time = datetime.min.replace(tzinfo=pytz.UTC)
         self.status = "Starting"
+        self._thread = None
+        self._stop_event = Event()
 
-    # Start the cleanup timer
     def start(self):
-        self.cleanup()
+        """Start the cleanup timer"""
+
+        self._thread = Thread(target=self._run, daemon=True)
+        self._thread.start()
 
-    # Stop any threads and prepare for application shutdown
     def stop(self):
-        self.cleanup_timer.cancel()
+        """Stop any threads and prepare for application shutdown"""
+
+        self._stop_event.set()
+
+    def _run(self):
+        while not self._stop_event.wait(timeout=self._cleanup_interval):
+            self._cleanup()
 
-    # Perform cleanup and reschedule next timer
-    def cleanup(self):
+    def _cleanup(self):
+        """Perform cleanup and reschedule next timer"""
+
         try:
             # Perform cleanup via letting the data expire
-            self.spots.expire()
-            self.alerts.expire()
+            self._spots.expire()
+            self._alerts.expire()
 
             # Explicitly clean up any spots and alerts that have expired
-            for id in list(self.spots.iterkeys()):
+            for i in list(self._spots.iterkeys()):
                 try:
-                    spot = self.spots[id]
+                    spot = self._spots[i]
                     if spot.expired():
-                        self.spots.delete(id)
+                        self._spots.delete(i)
                 except KeyError:
                     # Must have already been deleted, OK with that
                     pass
-            for id in list(self.alerts.iterkeys()):
+            for i in list(self._alerts.iterkeys()):
                 try:
-                    alert = self.alerts[id]
+                    alert = self._alerts[i]
                     if alert.expired():
-                        self.alerts.delete(id)
+                        self._alerts.delete(i)
                 except KeyError:
                     # Must have already been deleted, OK with that
                     pass
 
             # Clean up web server SSE spot/alert queues
-            self.web_server.clean_up_sse_queues()
+            self._web_server.clean_up_sse_queues()
 
             self.status = "OK"
             self.last_cleanup_time = datetime.now(pytz.UTC)
 
-        except Exception as e:
+        except Exception:
             self.status = "Error"
             logging.exception("Exception in Cleanup thread")
-            sleep(1)
+            self._stop_event.wait(timeout=1)
 
-        self.cleanup_timer = Timer(self.cleanup_interval, self.cleanup)
-        self.cleanup_timer.start()
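The refactor above replaces the self-rescheduling `threading.Timer` chain with one worker thread blocking on `Event.wait`, so `stop()` wakes the sleep immediately instead of waiting out the current interval. The pattern in isolation looks like this (a minimal sketch with made-up names, not Spothole's class):

```python
import threading
import time

class PeriodicWorker:
    """Run task() every `interval` seconds until stop() is called."""

    def __init__(self, task, interval):
        self._task = task
        self._interval = interval
        self._stop_event = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def start(self):
        self._thread.start()

    def stop(self):
        # set() both cancels the in-progress wait and ends the loop
        self._stop_event.set()
        self._thread.join()

    def _run(self):
        # Event.wait() returns False when the timeout elapses (run the task
        # again) and True once set() has been called (exit immediately).
        while not self._stop_event.wait(timeout=self._interval):
            self._task()

# Usage: let it tick a few times, then stop without waiting a full interval.
ticks = []
worker = PeriodicWorker(lambda: ticks.append(1), interval=0.01)
worker.start()
time.sleep(0.05)
worker.stop()
```

Unlike a `Timer` chain, there is exactly one thread for the object's lifetime and no window where a cancelled timer has already fired and rescheduled itself.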
@@ -10,9 +10,11 @@ if not os.path.isfile("config.yml"):
     exit()
 
 # Load config
-config = yaml.safe_load(open("config.yml"))
+with open("config.yml") as f:
+    config = yaml.safe_load(f)
 logging.info("Loaded config.")
 
+BASE_URL = config["base-url"]
 MAX_SPOT_AGE = config["max-spot-age-sec"]
 MAX_ALERT_AGE = config["max-alert-age-sec"]
 SERVER_OWNER_CALLSIGN = config["server-owner-callsign"]
@@ -23,4 +25,8 @@ WEB_UI_OPTIONS = config["web-ui-options"]
 # For ease of config, each spot provider owns its own config about whether it should be enabled by default in the web UI
 # but for consistency we provide this to the front-end in web-ui-options because it has no impact outside of the web UI.
 WEB_UI_OPTIONS["spot-providers-enabled-by-default"] = [p["name"] for p in config["spot-providers"] if p["enabled"] and (
-    "enabled-by-default-in-web-ui" not in p or p["enabled-by-default-in-web-ui"] == True)]
+    "enabled-by-default-in-web-ui" not in p or p["enabled-by-default-in-web-ui"])]
+# If spotting to this server is enabled, "API" is another valid spot source even though it does not come from
+# one of our providers. We set that to also be enabled by default.
+if ALLOW_SPOTTING:
+    WEB_UI_OPTIONS["spot-providers-enabled-by-default"].append("API")
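The change above also drops the `== True` comparison in favour of plain truthiness. With illustrative sample data (provider names here are made up, not from the real config) the filter behaves like this:

```python
# Sketch of the "enabled by default in web UI" filter with sample data.
config = {
    "spot-providers": [
        {"name": "A", "enabled": True},                                         # no key -> on by default
        {"name": "B", "enabled": True, "enabled-by-default-in-web-ui": False},  # opted out of web UI default
        {"name": "C", "enabled": False},                                        # disabled entirely
    ]
}

enabled_by_default = [
    p["name"] for p in config["spot-providers"]
    if p["enabled"] and ("enabled-by-default-in-web-ui" not in p
                         or p["enabled-by-default-in-web-ui"])
]
print(enabled_by_default)  # ['A']
```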
@@ -12,27 +12,27 @@ HAMQTH_PRG = (SOFTWARE_NAME + " v" + SOFTWARE_VERSION + " operated by " + SERVER
 
 # Special Interest Groups
 SIGS = [
     SIG(name="POTA", description="Parks on the Air", ref_regex=r"[A-Z]{2}\-\d{4,5}"),
     SIG(name="SOTA", description="Summits on the Air", ref_regex=r"[A-Z0-9]{1,3}\/[A-Z]{2}\-\d{3}"),
     SIG(name="WWFF", description="World Wide Flora & Fauna", ref_regex=r"[A-Z0-9]{1,3}FF\-\d{4}"),
     SIG(name="GMA", description="Global Mountain Activity", ref_regex=r"[A-Z0-9]{1,3}\/[A-Z]{2}\-\d{3}"),
     SIG(name="WWBOTA", description="Worldwide Bunkers on the Air", ref_regex=r"B\/[A-Z0-9]{1,3}\-\d{3,4}"),
     SIG(name="HEMA", description="HuMPs Excluding Marilyns Award", ref_regex=r"[A-Z0-9]{1,3}\/[A-Z]{3}\-\d{3}"),
     SIG(name="IOTA", description="Islands on the Air", ref_regex=r"[A-Z]{2}\-\d{3}"),
-    SIG(name="MOTA", description="Mills on the Air", ref_regex=r"X\d{4-6}"),
+    SIG(name="MOTA", description="Mills on the Air", ref_regex=r"X\d{4,6}"),
     SIG(name="ARLHS", description="Amateur Radio Lighthouse Society", ref_regex=r"[A-Z]{3}\-\d{3,4}"),
     SIG(name="ILLW", description="International Lighthouse & Lightship Weekend", ref_regex=r"[A-Z]{2}\d{4}"),
     SIG(name="SIOTA", description="Silos on the Air", ref_regex=r"[A-Z]{2}\-[A-Z]{3}\d"),
     SIG(name="WCA", description="World Castles Award", ref_regex=r"[A-Z0-9]{1,3}\-\d{5}"),
     SIG(name="ZLOTA", description="New Zealand on the Air", ref_regex=r"ZL[A-Z]/[A-Z]{2}\-\d{3,4}"),
     SIG(name="WOTA", description="Wainwrights on the Air", ref_regex=r"[A-Z]{3}-[0-9]{2}"),
     SIG(name="BOTA", description="Beaches on the Air"),
     SIG(name="KRMNPA", description="Keith Roget Memorial National Parks Award"),
     SIG(name="LLOTA", description="Lagos y Lagunas on the Air", ref_regex=r"[A-Z]{2}\-\d{4}"),
     SIG(name="WWTOTA", description="Towers on the Air", ref_regex=r"[A-Z]{2}R\-\d{4}"),
     SIG(name="WAB", description="Worked All Britain", ref_regex=r"[A-Z]{1,2}[0-9]{2}"),
     SIG(name="WAI", description="Worked All Ireland", ref_regex=r"[A-Z][0-9]{2}"),
     SIG(name="TOTA", description="Toilets on the Air", ref_regex=r"T\-[0-9]{2}")
 ]
 
 # Modes. Note "DIGI" and "DIGITAL" are also supported but are normalised into "DATA".
@@ -1,16 +1,176 @@
+import json
 import logging
 import re
 from math import floor
 
+import geopandas
 from pyproj import Transformer
+from shapely import prepare
+from shapely.geometry import Point, Polygon
 
 TRANSFORMER_OS_GRID_TO_WGS84 = Transformer.from_crs("EPSG:27700", "EPSG:4326")
 TRANSFORMER_IRISH_GRID_TO_WGS84 = Transformer.from_crs("EPSG:29903", "EPSG:4326")
 TRANSFORMER_CI_UTM_GRID_TO_WGS84 = Transformer.from_crs("+proj=utm +zone=30 +ellps=WGS84", "EPSG:4326")
 
+with open("datafiles/cqzones.geojson") as f:
+    cq_zone_data = geopandas.GeoDataFrame.from_features(json.load(f)["features"])
+with open("datafiles/ituzones.geojson") as f:
+    itu_zone_data = geopandas.GeoDataFrame.from_features(json.load(f)["features"])
+for idx in cq_zone_data.index:
+    prepare(cq_zone_data.at[idx, 'geometry'])
+for idx in itu_zone_data.index:
+    prepare(itu_zone_data.at[idx, 'geometry'])
+
+
+def lat_lon_to_cq_zone(lat, lon):
+    """Finds out which CQ zone a lat/lon point is in."""
+
+    lon = ((lon + 180) % 360) - 180
+    for index, row in cq_zone_data.iterrows():
+        polygon = Polygon(row["geometry"])
+        test_point = Point(lon, lat)
+        if polygon.contains(test_point):
+            return int(row["name"])
+
+        # Might have problems around the antemeridian, so if we didn't find a match, try offsetting the point by + or -
+        # 360 degrees longitude to try the other side of the Earth
+        if lon < 0:
+            test_point = Point(lon + 360, lat)
+        else:
+            test_point = Point(lon - 360, lat)
+        if polygon.contains(test_point):
+            return int(row["name"])
+    return None
+
+
+def lat_lon_to_itu_zone(lat, lon):
+    """Finds out which ITU zone a lat/lon point is in."""
+
+    lon = ((lon + 180) % 360) - 180
+    for index, row in itu_zone_data.iterrows():
+        polygon = Polygon(row["geometry"])
+        test_point = Point(lon, lat)
+        if polygon.contains(test_point):
+            return int(row["name"])
+
+        # Might have problems around the antemeridian, so if we didn't find a match, try offsetting the point by + or -
+        # 360 degrees longitude to try the other side of the Earth
+        if lon < 0:
+            test_point = Point(lon + 360, lat)
+        else:
+            test_point = Point(lon - 360, lat)
+        if polygon.contains(test_point):
+            return int(row["name"])
+    return None
+
+
+def lat_lon_for_grid_centre(grid):
+    """Convert a Maidenhead grid reference of arbitrary precision to the lat/long of the centre point of the square.
+    Returns None if the grid format is invalid."""
+
+    lat, lon, lat_cell_size, lon_cell_size = lat_lon_for_grid_sw_corner_plus_size(grid)
+    if lat is not None and lon is not None and lat_cell_size is not None and lon_cell_size is not None:
+        return [lat + lat_cell_size / 2.0, lon + lon_cell_size / 2.0]
+    else:
+        return None
+
+
+def lat_lon_for_grid_sw_corner(grid):
+    """Convert a Maidenhead grid reference of arbitrary precision to the lat/long of the southwest corner of the square.
+    Returns None if the grid format is invalid."""
+
+    lat, lon, lat_cell_size, lon_cell_size = lat_lon_for_grid_sw_corner_plus_size(grid)
+    if lat is not None and lon is not None:
+        return [lat, lon]
+    else:
+        return None
+
+
+def lat_lon_for_grid_ne_corner(grid):
+    """Convert a Maidenhead grid reference of arbitrary precision to the lat/long of the northeast corner of the square.
+    Returns None if the grid format is invalid."""
+
+    lat, lon, lat_cell_size, lon_cell_size = lat_lon_for_grid_sw_corner_plus_size(grid)
+    if lat is not None and lon is not None and lat_cell_size is not None and lon_cell_size is not None:
+        return [lat + lat_cell_size, lon + lon_cell_size]
+    else:
+        return None
+
+
+def lat_lon_for_grid_sw_corner_plus_size(grid):
+    """Convert a Maidenhead grid reference of arbitrary precision to lat/long, including in the result the size of the
+    lowest grid square. This is a utility method used by the main methods that return the centre, southwest, and
+    northeast coordinates of a grid square.
+    The return type is always a tuple of size 4. The elements in it are None if the grid format is invalid."""
+
+    # Make sure we are in upper case so our maths works. Case is arbitrary for Maidenhead references
+    grid = grid.upper()
+
+    # Return None if our Maidenhead string is invalid or too short
+    length = len(grid)
+    if length <= 0 or (length % 2) != 0:
+        return None, None, None, None
+
+    lat = 0.0  # aggregated latitude
+    lon = 0.0  # aggregated longitude
+    lat_cell_size = 10.0  # Size in degrees latitude of the current cell. Starts at 10 and gets smaller as the calculation progresses
+    lon_cell_size = 20.0  # Size in degrees longitude of the current cell. Starts at 20 and gets smaller as the calculation progresses
+
+    # Iterate through blocks (two-character sections)
+    block = 0
+    while block * 2 < length:
+        if block % 2 == 0:
+            # Letters in this block
+            lon_cell_no = ord(grid[block * 2]) - ord('A')
+            lat_cell_no = ord(grid[block * 2 + 1]) - ord('A')
+            # Bail if the values aren't in range. Allowed values are A-R (0-17) for the first letter block, or
+            # A-X (0-23) thereafter.
+            max_cell_no = 17 if block == 0 else 23
+            if lat_cell_no < 0 or lat_cell_no > max_cell_no or lon_cell_no < 0 or lon_cell_no > max_cell_no:
+                return None, None, None, None
+        else:
+            # Numbers in this block
+            try:
+                lon_cell_no = int(grid[block * 2])
+                lat_cell_no = int(grid[block * 2 + 1])
+            except ValueError:
+                return None, None, None, None
+            # Bail if the values aren't in range 0-9
+            if lat_cell_no < 0 or lat_cell_no > 9 or lon_cell_no < 0 or lon_cell_no > 9:
+                return None, None, None, None
+
+        # Aggregate the angles
+        lat += lat_cell_no * lat_cell_size
+        lon += lon_cell_no * lon_cell_size
+
+        # Reduce the cell size for the next block, unless we are on the last cell.
+        if block * 2 < length - 2:
+            # Still have more work to do, so reduce the cell size
+            if block % 2 == 0:
+                # Just dealt with letters, next block will be numbers so cells will be 1/10 the current size
+                lat_cell_size = lat_cell_size / 10.0
+                lon_cell_size = lon_cell_size / 10.0
+            else:
+                # Just dealt with numbers, next block will be letters so cells will be 1/24 the current size
+                lat_cell_size = lat_cell_size / 24.0
+                lon_cell_size = lon_cell_size / 24.0
+
+        block += 1
+
+    # Offset back to (-180, -90) where the grid starts
+    lon -= 180.0
+    lat -= 90.0
+
+    # Return None values on maths errors
+    if any(x != x for x in [lat, lon, lat_cell_size, lon_cell_size]):  # NaN check
+        return None, None, None, None
+
+    return lat, lon, lat_cell_size, lon_cell_size
+
+
-# Convert a Worked All Britain or Worked All Ireland reference to a lat/lon point.
 def wab_wai_square_to_lat_lon(ref):
+    """Convert a Worked All Britain or Worked All Ireland reference to a lat/lon point."""
+
     # First check we have a valid grid square, and based on what it looks like, use either the Ordnance Survey, Irish,
     # or UTM grid systems to perform the conversion.
     if re.match(r"^[HNOST][ABCDEFGHJKLMNOPQRSTUVWXYZ][0-9]{2}$", ref):
@@ -20,12 +180,13 @@ def wab_wai_square_to_lat_lon(ref):
     elif re.match(r"^W[AV][0-9]{2}$", ref):
         return utm_grid_square_to_lat_lon(ref)
     else:
-        logging.warn("Invalid WAB/WAI square: " + ref)
+        logging.warning("Invalid WAB/WAI square: " + ref)
         return None
 
 
-# Get a lat/lon point for the centre of an Ordnance Survey grid square
 def os_grid_square_to_lat_lon(ref):
+    """Get a lat/lon point for the centre of an Ordnance Survey grid square"""
+
     # Convert the letters into multipliers for the 500km squares and 100km squares
     offset_500km_multiplier = ord(ref[0]) - 65
     offset_100km_multiplier = ord(ref[1]) - 65
@@ -54,8 +215,9 @@ def os_grid_square_to_lat_lon(ref):
     return lat, lon
 
 
-# Get a lat/lon point for the centre of an Irish Grid square.
 def irish_grid_square_to_lat_lon(ref):
+    """Get a lat/lon point for the centre of an Irish Grid square."""
+
     # Convert the letters into multipliers for the 100km squares
     offset_100km_multiplier = ord(ref[0]) - 65
 
@@ -81,8 +243,9 @@ def irish_grid_square_to_lat_lon(ref):
     return lat, lon
 
 
-# Get a lat/lon point for the centre of a UTM grid square (supports only squares WA & WV for the Channel Islands, nothing else implemented)
 def utm_grid_square_to_lat_lon(ref):
+    """Get a lat/lon point for the centre of a UTM grid square (supports only squares WA & WV for the Channel Islands, nothing else implemented)"""
+
     # Take the numeric parts of the grid square and multiply by 10000 to get metres from the corner of the letter-based grid square
     easting = int(ref[2]) * 10000
     northing = int(ref[3]) * 10000
|||||||
@@ -19,37 +19,38 @@ from core.constants import BANDS, UNKNOWN_BAND, CW_MODES, PHONE_MODES, DATA_MODE
|
|||||||
HTTP_HEADERS, HAMQTH_PRG, MODE_ALIASES
|
HTTP_HEADERS, HAMQTH_PRG, MODE_ALIASES
|
||||||
|
|
||||||
|
|
||||||
# Singleton class that provides lookup functionality.
|
|
||||||
class LookupHelper:
|
class LookupHelper:
|
||||||
|
"""Singleton class that provides lookup functionality."""
|
||||||
|
|
||||||
# Create the lookup helper. Note that nothing actually happens until the start() method is called, and that all
|
|
||||||
# lookup methods will fail if start() has not yet been called. This therefore needs starting before any spot or
|
|
||||||
# alert handlers are created.
|
|
||||||
def __init__(self):
|
def __init__(self):
|
||||||
self.CLUBLOG_CALLSIGN_DATA_CACHE = None
|
"""Create the lookup helper. Note that nothing actually happens until the start() method is called, and that all
|
||||||
self.LOOKUP_LIB_CLUBLOG_XML = None
|
lookup methods will fail if start() has not yet been called. This therefore needs starting before any spot or
|
||||||
self.CLUBLOG_XML_AVAILABLE = None
|
alert handlers are created."""
|
||||||
self.LOOKUP_LIB_CLUBLOG_API = None
|
|
||||||
self.CLUBLOG_XML_DOWNLOAD_LOCATION = None
|
self._clublog_callsign_data_cache = None
|
||||||
self.CLUBLOG_API_AVAILABLE = None
|
self._lookup_lib_clublog_xml = None
|
||||||
self.CLUBLOG_CTY_XML_CACHE = None
|
self._clublog_xml_available = None
|
||||||
self.CLUBLOG_API_KEY = None
|
self._lookup_lib_clublog_api = None
|
||||||
self.QRZ_CALLSIGN_DATA_CACHE = None
|
self._clublog_xml_download_location = None
|
||||||
self.LOOKUP_LIB_QRZ = None
|
self._clublog_api_available = None
|
||||||
self.QRZ_AVAILABLE = None
|
self._clublog_cty_xml_cache = None
|
||||||
self.HAMQTH_AVAILABLE = None
|
self._clublog_api_key = None
|
||||||
self.HAMQTH_CALLSIGN_DATA_CACHE = None
|
self._qrz_callsign_data_cache = None
|
||||||
self.HAMQTH_BASE_URL = "https://www.hamqth.com/xml.php"
|
self._lookup_lib_qrz = None
|
||||||
|
self._qrz_available = None
|
||||||
|
self._hamqth_available = None
|
||||||
|
self._hamqth_callsign_data_cache = None
|
||||||
|
self._hamqth_base_url = "https://www.hamqth.com/xml.php"
|
||||||
# HamQTH session keys expire after an hour. Rather than working out how much time has passed manually, we cheat
|
# HamQTH session keys expire after an hour. Rather than working out how much time has passed manually, we cheat
|
||||||
# and cache the HTTP response for 55 minutes, so when the login URL is queried within 55 minutes of the previous
|
# and cache the HTTP response for 55 minutes, so when the login URL is queried within 55 minutes of the previous
|
||||||
# time, you just get the cached response.
|
# time, you just get the cached response.
|
||||||
self.HAMQTH_SESSION_LOOKUP_CACHE = CachedSession("cache/hamqth_session_cache",
|
self._hamqth_session_lookup_cache = CachedSession("cache/hamqth_session_cache",
|
||||||
expire_after=timedelta(minutes=55))
|
expire_after=timedelta(minutes=55))
|
||||||
self.CALL_INFO_BASIC = None
|
self._call_info_basic = None
|
||||||
self.LOOKUP_LIB_BASIC = None
|
self._lookup_lib_basic = None
|
||||||
self.COUNTRY_FILES_CTY_PLIST_DOWNLOAD_LOCATION = None
|
self._country_files_cty_plist_download_location = None
|
||||||
self.DXCC_JSON_DOWNLOAD_LOCATION = None
|
self._dxcc_json_download_location = None
|
||||||
self.DXCC_DATA = None
|
self._dxcc_data = None
|
||||||
|
|
||||||
def start(self):
|
def start(self):
|
||||||
# Lookup helpers from pyhamtools. We use five (!) of these. The simplest is country-files.com, which downloads
|
# Lookup helpers from pyhamtools. We use five (!) of these. The simplest is country-files.com, which downloads
|
||||||
@@ -57,61 +58,66 @@ class LookupHelper:
         # If the user provides login details/API keys, we also set up helpers for QRZ.com, HamQTH, Clublog (live API
         # request), and Clublog (XML download). The lookup functions iterate through these in a sensible order, looking
         # for suitable data.
-        self.COUNTRY_FILES_CTY_PLIST_DOWNLOAD_LOCATION = "cache/cty.plist"
-        success = self.download_country_files_cty_plist()
+        self._country_files_cty_plist_download_location = "cache/cty.plist"
+        success = self._download_country_files_cty_plist()
         if success:
-            self.LOOKUP_LIB_BASIC = LookupLib(lookuptype="countryfile",
-                    filename=self.COUNTRY_FILES_CTY_PLIST_DOWNLOAD_LOCATION)
+            self._lookup_lib_basic = LookupLib(lookuptype="countryfile",
+                    filename=self._country_files_cty_plist_download_location)
         else:
-            self.LOOKUP_LIB_BASIC = LookupLib(lookuptype="countryfile")
-        self.CALL_INFO_BASIC = Callinfo(self.LOOKUP_LIB_BASIC)
+            self._lookup_lib_basic = LookupLib(lookuptype="countryfile")
+        self._call_info_basic = Callinfo(self._lookup_lib_basic)

-        self.QRZ_AVAILABLE = config["qrz-username"] != "" and config["qrz-password"] != ""
-        if self.QRZ_AVAILABLE:
-            self.LOOKUP_LIB_QRZ = LookupLib(lookuptype="qrz", username=config["qrz-username"],
+        self._qrz_available = config["qrz-username"] != "" and config["qrz-password"] != ""
+        if self._qrz_available:
+            self._lookup_lib_qrz = LookupLib(lookuptype="qrz", username=config["qrz-username"],
                     pwd=config["qrz-password"])
-            self.QRZ_CALLSIGN_DATA_CACHE = Cache('cache/qrz_callsign_lookup_cache')
+            self._qrz_callsign_data_cache = Cache('cache/qrz_callsign_lookup_cache')

-        self.HAMQTH_AVAILABLE = config["hamqth-username"] != "" and config["hamqth-password"] != ""
-        self.HAMQTH_CALLSIGN_DATA_CACHE = Cache('cache/hamqth_callsign_lookup_cache')
+        self._hamqth_available = config["hamqth-username"] != "" and config["hamqth-password"] != ""
+        self._hamqth_callsign_data_cache = Cache('cache/hamqth_callsign_lookup_cache')

-        self.CLUBLOG_API_KEY = config["clublog-api-key"]
-        self.CLUBLOG_CTY_XML_CACHE = CachedSession("cache/clublog_cty_xml_cache", expire_after=timedelta(days=10))
-        self.CLUBLOG_API_AVAILABLE = self.CLUBLOG_API_KEY != ""
-        self.CLUBLOG_XML_DOWNLOAD_LOCATION = "cache/cty.xml"
-        if self.CLUBLOG_API_AVAILABLE:
-            self.LOOKUP_LIB_CLUBLOG_API = LookupLib(lookuptype="clublogapi", apikey=self.CLUBLOG_API_KEY)
-            success = self.download_clublog_ctyxml()
-            self.CLUBLOG_XML_AVAILABLE = success
+        self._clublog_api_key = config["clublog-api-key"]
+        self._clublog_cty_xml_cache = CachedSession("cache/clublog_cty_xml_cache", expire_after=timedelta(days=10))
+        self._clublog_api_available = self._clublog_api_key != ""
+        self._clublog_xml_download_location = "cache/cty.xml"
+        if self._clublog_api_available:
+            self._lookup_lib_clublog_api = LookupLib(lookuptype="clublogapi", apikey=self._clublog_api_key)
+            success = self._download_clublog_ctyxml()
+            self._clublog_xml_available = success
             if success:
-                self.LOOKUP_LIB_CLUBLOG_XML = LookupLib(lookuptype="clublogxml",
-                        filename=self.CLUBLOG_XML_DOWNLOAD_LOCATION)
-                self.CLUBLOG_CALLSIGN_DATA_CACHE = Cache('cache/clublog_callsign_lookup_cache')
+                self._lookup_lib_clublog_xml = LookupLib(lookuptype="clublogxml",
+                        filename=self._clublog_xml_download_location)
+                self._clublog_callsign_data_cache = Cache('cache/clublog_callsign_lookup_cache')

         # We also get a lookup of DXCC data from K0SWE to use for additional lookups of e.g. flags.
-        self.DXCC_JSON_DOWNLOAD_LOCATION = "cache/dxcc.json"
-        success = self.download_dxcc_json()
+        self._dxcc_json_download_location = "cache/dxcc.json"
+        success = self._download_dxcc_json()
         if success:
-            with open(self.DXCC_JSON_DOWNLOAD_LOCATION) as f:
+            with open(self._dxcc_json_download_location) as f:
                 tmp_dxcc_data = json.load(f)["dxcc"]
             # Reformat as a map for faster lookup
-            self.DXCC_DATA = {}
+            self._dxcc_data = {}
             for dxcc in tmp_dxcc_data:
-                self.DXCC_DATA[dxcc["entityCode"]] = dxcc
+                self._dxcc_data[dxcc["entityCode"]] = dxcc
         else:
             logging.error("Could not download DXCC data, flags and similar data may be missing!")

-    # Download the cty.plist file from country-files.com on first startup. The pyhamtools lib can actually download and use
-    # this itself, but it's occasionally offline which causes it to throw an error. By downloading it separately, we can
-    # catch errors and handle them, falling back to a previous copy of the file in the cache, and we can use the
-    # requests_cache library to prevent re-downloading too quickly if the software keeps restarting.
-    def download_country_files_cty_plist(self):
+        # Precompile regex matches for DXCCs to improve efficiency when iterating through them
+        for dxcc in (self._dxcc_data.values() if self._dxcc_data else []):
+            dxcc["_prefixRegexCompiled"] = re.compile(dxcc["prefixRegex"])
+
+    def _download_country_files_cty_plist(self):
+        """Download the cty.plist file from country-files.com on first startup. The pyhamtools lib can actually download and use
+        this itself, but it's occasionally offline which causes it to throw an error. By downloading it separately, we can
+        catch errors and handle them, falling back to a previous copy of the file in the cache, and we can use the
+        requests_cache library to prevent re-downloading too quickly if the software keeps restarting."""

         try:
             logging.info("Downloading Country-files.com cty.plist...")
             response = SEMI_STATIC_URL_DATA_CACHE.get("https://www.country-files.com/cty/cty.plist",
                     headers=HTTP_HEADERS).text

-            with open(self.COUNTRY_FILES_CTY_PLIST_DOWNLOAD_LOCATION, "w") as f:
+            with open(self._country_files_cty_plist_download_location, "w") as f:
                 f.write(response)
                 f.flush()
             return True
@@ -120,14 +126,16 @@ class LookupHelper:
             logging.error("Exception when downloading Clublog cty.xml", e)
             return False

-    # Download the dxcc.json file on first startup.
-    def download_dxcc_json(self):
+    def _download_dxcc_json(self):
+        """Download the dxcc.json file on first startup."""
         try:
             logging.info("Downloading dxcc.json...")
-            response = SEMI_STATIC_URL_DATA_CACHE.get("https://raw.githubusercontent.com/k0swe/dxcc-json/refs/heads/main/dxcc.json",
-                    headers=HTTP_HEADERS).text
+            response = SEMI_STATIC_URL_DATA_CACHE.get(
+                "https://raw.githubusercontent.com/k0swe/dxcc-json/refs/heads/main/dxcc.json",
+                headers=HTTP_HEADERS).text

-            with open(self.DXCC_JSON_DOWNLOAD_LOCATION, "w") as f:
+            with open(self._dxcc_json_download_location, "w") as f:
                 f.write(response)
                 f.flush()
             return True
@@ -136,19 +144,20 @@ class LookupHelper:
             logging.error("Exception when downloading dxcc.json", e)
             return False

-    # Download the cty.xml (gzipped) file from Clublog on first startup, so we can use it in preference to querying the
-    # database live if possible.
-    def download_clublog_ctyxml(self):
+    def _download_clublog_ctyxml(self):
+        """Download the cty.xml (gzipped) file from Clublog on first startup, so we can use it in preference to querying the
+        database live if possible."""
         try:
             logging.info("Downloading Clublog cty.xml.gz...")
-            response = self.CLUBLOG_CTY_XML_CACHE.get("https://cdn.clublog.org/cty.php?api=" + self.CLUBLOG_API_KEY,
+            response = self._clublog_cty_xml_cache.get("https://cdn.clublog.org/cty.php?api=" + self._clublog_api_key,
                     headers=HTTP_HEADERS)
             logging.info("Caching Clublog cty.xml.gz...")
-            open(self.CLUBLOG_XML_DOWNLOAD_LOCATION + ".gz", 'wb').write(response.content)
-            with gzip.open(self.CLUBLOG_XML_DOWNLOAD_LOCATION + ".gz", "rb") as uncompressed:
+            open(self._clublog_xml_download_location + ".gz", 'wb').write(response.content)
+            with gzip.open(self._clublog_xml_download_location + ".gz", "rb") as uncompressed:
                 file_content = uncompressed.read()
             logging.info("Caching Clublog cty.xml...")
-            with open(self.CLUBLOG_XML_DOWNLOAD_LOCATION, "wb") as f:
+            with open(self._clublog_xml_download_location, "wb") as f:
                 f.write(file_content)
                 f.flush()
             return True
@@ -157,266 +166,254 @@ class LookupHelper:
             logging.error("Exception when downloading Clublog cty.xml", e)
             return False

-    # Infer a mode from the comment
-    def infer_mode_from_comment(self, comment):
-        for mode in ALL_MODES:
-            if mode in comment.upper():
-                return mode
-        for mode in MODE_ALIASES.keys():
-            if mode in comment.upper():
-                return MODE_ALIASES[mode]
-        return None
-
-    # Infer a "mode family" from a mode.
-    def infer_mode_type_from_mode(self, mode):
-        if mode.upper() in CW_MODES:
-            return "CW"
-        elif mode.upper() in PHONE_MODES:
-            return "PHONE"
-        elif mode.upper() in DATA_MODES:
-            return "DATA"
-        else:
-            if mode.upper() != "OTHER":
-                logging.warn("Found an unrecognised mode: " + mode + ". Developer should categorise this.")
-            return None
-
-    # Infer a band from a frequency in Hz
-    def infer_band_from_freq(self, freq):
-        for b in BANDS:
-            if b.start_freq <= freq <= b.end_freq:
-                return b
-        return UNKNOWN_BAND
-
-    # Infer a country name from a callsign
     def infer_country_from_callsign(self, call):
+        """Infer a country name from a callsign"""
         try:
             # Start with the basic country-files.com-based decoder.
-            country = self.CALL_INFO_BASIC.get_country_name(call)
-        except (KeyError, ValueError) as e:
+            country = self._call_info_basic.get_country_name(call)
+        except (KeyError, ValueError):
             country = None
         # Couldn't get anything from basic call info database, try QRZ.com
         if not country:
-            qrz_data = self.get_qrz_data_for_callsign(call)
+            qrz_data = self._get_qrz_data_for_callsign(call)
             if qrz_data and "country" in qrz_data:
                 country = qrz_data["country"]
         # Couldn't get anything from QRZ.com database, try HamQTH
         if not country:
-            hamqth_data = self.get_hamqth_data_for_callsign(call)
+            hamqth_data = self._get_hamqth_data_for_callsign(call)
             if hamqth_data and "country" in hamqth_data:
                 country = hamqth_data["country"]
         # Couldn't get anything from HamQTH database, try Clublog data
         if not country:
-            clublog_data = self.get_clublog_xml_data_for_callsign(call)
+            clublog_data = self._get_clublog_xml_data_for_callsign(call)
             if clublog_data and "Name" in clublog_data:
                 country = clublog_data["Name"]
         if not country:
-            clublog_data = self.get_clublog_api_data_for_callsign(call)
+            clublog_data = self._get_clublog_api_data_for_callsign(call)
             if clublog_data and "Name" in clublog_data:
                 country = clublog_data["Name"]
         # Couldn't get anything from Clublog database, try DXCC data
         if not country:
-            dxcc_data = self.get_dxcc_data_for_callsign(call)
+            dxcc_data = self._get_dxcc_data_for_callsign(call)
             if dxcc_data and "name" in dxcc_data:
                 country = dxcc_data["name"]
         return country

-    # Infer a DXCC ID from a callsign
     def infer_dxcc_id_from_callsign(self, call):
+        """Infer a DXCC ID from a callsign"""
         try:
             # Start with the basic country-files.com-based decoder.
-            dxcc = self.CALL_INFO_BASIC.get_adif_id(call)
-        except (KeyError, ValueError) as e:
+            dxcc = self._call_info_basic.get_adif_id(call)
+        except (KeyError, ValueError):
             dxcc = None
         # Couldn't get anything from basic call info database, try QRZ.com
         if not dxcc:
-            qrz_data = self.get_qrz_data_for_callsign(call)
+            qrz_data = self._get_qrz_data_for_callsign(call)
             if qrz_data and "adif" in qrz_data:
                 dxcc = qrz_data["adif"]
         # Couldn't get anything from QRZ.com database, try HamQTH
         if not dxcc:
-            hamqth_data = self.get_hamqth_data_for_callsign(call)
+            hamqth_data = self._get_hamqth_data_for_callsign(call)
             if hamqth_data and "adif" in hamqth_data:
                 dxcc = hamqth_data["adif"]
         # Couldn't get anything from HamQTH database, try Clublog data
         if not dxcc:
-            clublog_data = self.get_clublog_xml_data_for_callsign(call)
+            clublog_data = self._get_clublog_xml_data_for_callsign(call)
             if clublog_data and "DXCC" in clublog_data:
                 dxcc = clublog_data["DXCC"]
         if not dxcc:
-            clublog_data = self.get_clublog_api_data_for_callsign(call)
+            clublog_data = self._get_clublog_api_data_for_callsign(call)
             if clublog_data and "DXCC" in clublog_data:
                 dxcc = clublog_data["DXCC"]
         # Couldn't get anything from Clublog database, try DXCC data
         if not dxcc:
-            dxcc_data = self.get_dxcc_data_for_callsign(call)
+            dxcc_data = self._get_dxcc_data_for_callsign(call)
             if dxcc_data and "entityCode" in dxcc_data:
                 dxcc = dxcc_data["entityCode"]
         return dxcc

-    # Infer a continent shortcode from a callsign
     def infer_continent_from_callsign(self, call):
+        """Infer a continent shortcode from a callsign"""
         try:
             # Start with the basic country-files.com-based decoder.
-            continent = self.CALL_INFO_BASIC.get_continent(call)
-        except (KeyError, ValueError) as e:
+            continent = self._call_info_basic.get_continent(call)
+        except (KeyError, ValueError):
             continent = None
         # Couldn't get anything from basic call info database, try HamQTH
         if not continent:
-            hamqth_data = self.get_hamqth_data_for_callsign(call)
+            hamqth_data = self._get_hamqth_data_for_callsign(call)
             if hamqth_data and "continent" in hamqth_data:
-                country = hamqth_data["continent"]
+                continent = hamqth_data["continent"]
         # Couldn't get anything from HamQTH database, try Clublog data
         if not continent:
-            clublog_data = self.get_clublog_xml_data_for_callsign(call)
+            clublog_data = self._get_clublog_xml_data_for_callsign(call)
             if clublog_data and "Continent" in clublog_data:
                 continent = clublog_data["Continent"]
         if not continent:
-            clublog_data = self.get_clublog_api_data_for_callsign(call)
+            clublog_data = self._get_clublog_api_data_for_callsign(call)
             if clublog_data and "Continent" in clublog_data:
                 continent = clublog_data["Continent"]
         # Couldn't get anything from Clublog database, try DXCC data
         if not continent:
-            dxcc_data = self.get_dxcc_data_for_callsign(call)
+            dxcc_data = self._get_dxcc_data_for_callsign(call)
             # Some DXCCs are in two continents, if so don't use the continent data as we can't be sure
             if dxcc_data and "continent" in dxcc_data and len(dxcc_data["continent"]) == 1:
                 continent = dxcc_data["continent"][0]
         return continent

-    # Infer a CQ zone from a callsign
     def infer_cq_zone_from_callsign(self, call):
+        """Infer a CQ zone from a callsign"""
         try:
             # Start with the basic country-files.com-based decoder.
-            cqz = self.CALL_INFO_BASIC.get_cqz(call)
-        except (KeyError, ValueError) as e:
+            cqz = self._call_info_basic.get_cqz(call)
+        except (KeyError, ValueError):
             cqz = None
         # Couldn't get anything from basic call info database, try QRZ.com
         if not cqz:
-            qrz_data = self.get_qrz_data_for_callsign(call)
+            qrz_data = self._get_qrz_data_for_callsign(call)
             if qrz_data and "cqz" in qrz_data:
                 cqz = qrz_data["cqz"]
         # Couldn't get anything from QRZ.com database, try HamQTH
         if not cqz:
-            hamqth_data = self.get_hamqth_data_for_callsign(call)
+            hamqth_data = self._get_hamqth_data_for_callsign(call)
             if hamqth_data and "cq" in hamqth_data:
                 cqz = hamqth_data["cq"]
         # Couldn't get anything from HamQTH database, try Clublog data
         if not cqz:
-            clublog_data = self.get_clublog_xml_data_for_callsign(call)
+            clublog_data = self._get_clublog_xml_data_for_callsign(call)
             if clublog_data and "CQZ" in clublog_data:
                 cqz = clublog_data["CQZ"]
         if not cqz:
-            clublog_data = self.get_clublog_api_data_for_callsign(call)
+            clublog_data = self._get_clublog_api_data_for_callsign(call)
             if clublog_data and "CQZ" in clublog_data:
                 cqz = clublog_data["CQZ"]
         # Couldn't get anything from Clublog database, try DXCC data
         if not cqz:
-            dxcc_data = self.get_dxcc_data_for_callsign(call)
+            dxcc_data = self._get_dxcc_data_for_callsign(call)
             # Some DXCCs are in multiple zones, if so don't use the zone data as we can't be sure
             if dxcc_data and "cq" in dxcc_data and len(dxcc_data["cq"]) == 1:
                 cqz = dxcc_data["cq"][0]
         return cqz

# Infer a ITU zone from a callsign
|
|
||||||
def infer_itu_zone_from_callsign(self, call):
|
def infer_itu_zone_from_callsign(self, call):
|
||||||
|
"""Infer a ITU zone from a callsign"""
|
||||||
|
|
||||||
try:
|
try:
|
||||||
# Start with the basic country-files.com-based decoder.
|
# Start with the basic country-files.com-based decoder.
|
||||||
ituz = self.CALL_INFO_BASIC.get_ituz(call)
|
ituz = self._call_info_basic.get_ituz(call)
|
||||||
except (KeyError, ValueError) as e:
|
except (KeyError, ValueError):
|
||||||
ituz = None
|
ituz = None
|
||||||
# Couldn't get anything from basic call info database, try QRZ.com
|
# Couldn't get anything from basic call info database, try QRZ.com
|
||||||
if not ituz:
|
if not ituz:
|
||||||
qrz_data = self.get_qrz_data_for_callsign(call)
|
qrz_data = self._get_qrz_data_for_callsign(call)
|
||||||
if qrz_data and "ituz" in qrz_data:
|
if qrz_data and "ituz" in qrz_data:
|
||||||
ituz = qrz_data["ituz"]
|
ituz = qrz_data["ituz"]
|
||||||
# Couldn't get anything from QRZ.com database, try HamQTH
|
# Couldn't get anything from QRZ.com database, try HamQTH
|
||||||
if not ituz:
|
if not ituz:
|
||||||
hamqth_data = self.get_hamqth_data_for_callsign(call)
|
hamqth_data = self._get_hamqth_data_for_callsign(call)
|
||||||
if hamqth_data and "itu" in hamqth_data:
|
if hamqth_data and "itu" in hamqth_data:
|
||||||
ituz = hamqth_data["itu"]
|
ituz = hamqth_data["itu"]
|
||||||
# Couldn't get anything from HamQTH database, Clublog doesn't provide this, so try DXCC data
|
# Couldn't get anything from HamQTH database, Clublog doesn't provide this, so try DXCC data
|
||||||
if not ituz:
|
if not ituz:
|
||||||
dxcc_data = self.get_dxcc_data_for_callsign(call)
|
dxcc_data = self._get_dxcc_data_for_callsign(call)
|
||||||
# Some DXCCs are in multiple zones, if so don't use the zone data as we can't be sure
|
# Some DXCCs are in multiple zones, if so don't use the zone data as we can't be sure
|
||||||
if dxcc_data and "itu" in dxcc_data and len(dxcc_data["itu"]) == 1:
|
if dxcc_data and "itu" in dxcc_data and len(dxcc_data["itu"]) == 1:
|
||||||
ituz = dxcc_data["itu"]
|
ituz = dxcc_data["itu"]
|
||||||
return ituz
|
return ituz
|
||||||
|
|
||||||
# Get an emoji flag for a given DXCC entity ID
|
|
||||||
def get_flag_for_dxcc(self, dxcc):
|
def get_flag_for_dxcc(self, dxcc):
|
||||||
return self.DXCC_DATA[dxcc]["flag"] if dxcc in self.DXCC_DATA else None
|
"""Get an emoji flag for a given DXCC entity ID"""
|
||||||
|
|
||||||
|
return self._dxcc_data[dxcc]["flag"] if dxcc in self._dxcc_data else None
|
||||||
|
|
||||||
# Infer an operator name from a callsign (requires QRZ.com/HamQTH)
|
|
||||||
def infer_name_from_callsign_online_lookup(self, call):
|
def infer_name_from_callsign_online_lookup(self, call):
|
||||||
data = self.get_qrz_data_for_callsign(call)
|
"""Infer an operator name from a callsign (requires QRZ.com/HamQTH)"""
|
||||||
|
|
||||||
|
data = self._get_qrz_data_for_callsign(call)
|
||||||
if data and "fname" in data:
|
if data and "fname" in data:
|
||||||
name = data["fname"]
|
name = data["fname"]
|
||||||
if "name" in data:
|
if "name" in data:
|
||||||
name = name + " " + data["name"]
|
name = name + " " + data["name"]
|
||||||
return name
|
return name
|
||||||
data = self.get_hamqth_data_for_callsign(call)
|
data = self._get_hamqth_data_for_callsign(call)
|
||||||
if data and "nick" in data:
|
if data and "nick" in data:
|
||||||
return data["nick"]
|
return data["nick"]
|
||||||
else:
|
else:
|
||||||
return None
|
return None
|
||||||
|
|
||||||
# Infer a latitude and longitude from a callsign (requires QRZ.com/HamQTH)
|
|
||||||
# Coordinates that look default are rejected (apologies if your position really is 0,0, enjoy your voyage)
|
|
||||||
def infer_latlon_from_callsign_online_lookup(self, call):
|
def infer_latlon_from_callsign_online_lookup(self, call):
|
||||||
data = self.get_qrz_data_for_callsign(call)
|
"""Infer a latitude and longitude from a callsign (requires QRZ.com/HamQTH)
|
||||||
if data and "latitude" in data and "longitude" in data and (float(data["latitude"]) != 0 or float(data["longitude"]) != 0) and -89.9 < float(data["latitude"]) < 89.9:
|
Coordinates that look default are rejected (apologies if your position really is 0,0, enjoy your voyage)"""
|
||||||
return [data["latitude"], data["longitude"]]
|
|
||||||
data = self.get_hamqth_data_for_callsign(call)
|
data = self._get_qrz_data_for_callsign(call)
|
||||||
if data and "latitude" in data and "longitude" in data and (float(data["latitude"]) != 0 or float(data["longitude"]) != 0) and -89.9 < float(data["latitude"]) < 89.9:
|
if data and "latitude" in data and "longitude" in data and (
|
||||||
return [data["latitude"], data["longitude"]]
|
float(data["latitude"]) != 0 or float(data["longitude"]) != 0) and -89.9 < float(
|
||||||
|
data["latitude"]) < 89.9:
|
||||||
|
return [float(data["latitude"]), float(data["longitude"])]
|
||||||
|
data = self._get_hamqth_data_for_callsign(call)
|
||||||
|
if data and "latitude" in data and "longitude" in data and (
|
||||||
|
float(data["latitude"]) != 0 or float(data["longitude"]) != 0) and -89.9 < float(
|
||||||
|
data["latitude"]) < 89.9:
|
||||||
|
return [float(data["latitude"]), float(data["longitude"])]
|
||||||
else:
|
else:
|
||||||
return None
|
return None
|
||||||
|
|
||||||
# Infer a grid locator from a callsign (requires QRZ.com/HamQTH).
|
|
||||||
# Grids that look default are rejected (apologies if your grid really is AA00aa, enjoy your research)
|
|
||||||
def infer_grid_from_callsign_online_lookup(self, call):
|
def infer_grid_from_callsign_online_lookup(self, call):
|
||||||
data = self.get_qrz_data_for_callsign(call)
|
"""Infer a grid locator from a callsign (requires QRZ.com/HamQTH).
|
||||||
if data and "locator" in data and data["locator"].upper() != "AA00" and data["locator"].upper() != "AA00AA" and data["locator"].upper() != "AA00AA00":
|
Grids that look default are rejected (apologies if your grid really is AA00aa, enjoy your research)"""
|
||||||
|
|
||||||
|
data = self._get_qrz_data_for_callsign(call)
|
||||||
|
if data and "locator" in data and data["locator"].upper() != "AA00" and data["locator"].upper() != "AA00AA" and \
|
||||||
|
data["locator"].upper() != "AA00AA00":
|
||||||
return data["locator"]
|
return data["locator"]
|
||||||
data = self.get_hamqth_data_for_callsign(call)
|
data = self._get_hamqth_data_for_callsign(call)
|
||||||
if data and "grid" in data and data["grid"].upper() != "AA00" and data["grid"].upper() != "AA00AA" and data["grid"].upper() != "AA00AA00":
|
if data and "grid" in data and data["grid"].upper() != "AA00" and data["grid"].upper() != "AA00AA" and data[
|
||||||
|
"grid"].upper() != "AA00AA00":
|
||||||
return data["grid"]
|
return data["grid"]
|
||||||
else:
|
else:
|
||||||
return None
|
return None
|
||||||
|
|
||||||
# Infer a textual QTH from a callsign (requires QRZ.com/HamQTH)
|
|
||||||
def infer_qth_from_callsign_online_lookup(self, call):
|
def infer_qth_from_callsign_online_lookup(self, call):
|
||||||
data = self.get_qrz_data_for_callsign(call)
|
"""Infer a textual QTH from a callsign (requires QRZ.com/HamQTH)"""
|
||||||
|
|
||||||
|
data = self._get_qrz_data_for_callsign(call)
|
||||||
if data and "addr2" in data:
|
if data and "addr2" in data:
|
||||||
return data["addr2"]
|
return data["addr2"]
|
||||||
data = self.get_hamqth_data_for_callsign(call)
|
data = self._get_hamqth_data_for_callsign(call)
|
||||||
if data and "qth" in data:
|
if data and "qth" in data:
|
||||||
return data["qth"]
|
return data["qth"]
|
||||||
else:
|
else:
|
||||||
return None
|
return None
|
 
-    # Infer a latitude and longitude from a callsign (using DXCC, probably very inaccurate)
     def infer_latlon_from_callsign_dxcc(self, call):
+        """Infer a latitude and longitude from a callsign (using DXCC, probably very inaccurate)"""
         try:
-            data = self.CALL_INFO_BASIC.get_lat_long(call)
+            data = self._call_info_basic.get_lat_long(call)
             if data and "latitude" in data and "longitude" in data:
-                loc = [data["latitude"], data["longitude"]]
+                loc = [float(data["latitude"]), float(data["longitude"])]
             else:
                 loc = None
         except KeyError:
             loc = None
         # Couldn't get anything from basic call info database, try Clublog data
         if not loc:
-            data = self.get_clublog_xml_data_for_callsign(call)
+            data = self._get_clublog_xml_data_for_callsign(call)
             if data and "Lat" in data and "Lon" in data:
-                loc = [data["Lat"], data["Lon"]]
+                loc = [float(data["Lat"]), float(data["Lon"])]
         if not loc:
-            data = self.get_clublog_api_data_for_callsign(call)
+            data = self._get_clublog_api_data_for_callsign(call)
             if data and "Lat" in data and "Lon" in data:
-                loc = [data["Lat"], data["Lon"]]
+                loc = [float(data["Lat"]), float(data["Lon"])]
         return loc
 
-    # Infer a grid locator from a callsign (using DXCC, probably very inaccurate)
     def infer_grid_from_callsign_dxcc(self, call):
+        """Infer a grid locator from a callsign (using DXCC, probably very inaccurate)"""
         latlon = self.infer_latlon_from_callsign_dxcc(call)
         grid = None
         try:
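The grid inference above hands the DXCC-derived lat/lon to a conversion routine that is outside this hunk. As a point of reference, a minimal sketch of a lat/lon to Maidenhead locator conversion (a hypothetical helper, not the project's actual implementation) could look like this:

```python
def latlon_to_grid(lat, lon):
    """Convert latitude/longitude in degrees to a 4-character Maidenhead grid locator."""
    lon += 180.0  # shift so both axes start at zero
    lat += 90.0
    # Fields: 20 degrees of longitude, 10 degrees of latitude per letter
    field = chr(ord("A") + int(lon // 20)) + chr(ord("A") + int(lat // 10))
    # Squares: 2 degrees of longitude, 1 degree of latitude per digit
    square = str(int((lon % 20) // 2)) + str(int(lat % 10))
    return field + square

print(latlon_to_grid(51.5, -0.1))  # London -> "IO91"
```

The real code presumably also guards against invalid coordinates, as suggested by the "Invalid lat/lon received for DXCC" log message in the next hunk.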
@@ -425,47 +422,28 @@ class LookupHelper:
             logging.debug("Invalid lat/lon received for DXCC")
         return grid
 
-    # Infer a mode from the frequency (in Hz) according to the band plan. Just a guess really.
-    def infer_mode_from_frequency(self, freq):
-        try:
-            khz = freq / 1000.0
-            mode = freq_to_band(khz)["mode"]
-            # Some additional common digimode ranges in addition to what the 3rd-party freq_to_band function returns.
-            # This is mostly here just because freq_to_band is very specific about things like FT8 frequencies, and e.g.
-            # a spot at 7074.5 kHz will be indicated as LSB, even though it's clearly in the FT8 range. Future updates
-            # might include other common digimode centres of activity here, but this achieves the main goal of keeping
-            # large numbers of clearly-FT* spots off the list of people filtering out digimodes.
-            if (7074 <= khz < 7077) or (10136 <= khz < 10139) or (14074 <= khz < 14077) or (18100 <= khz < 18103) or (
-                    21074 <= khz < 21077) or (24915 <= khz < 24918) or (28074 <= khz < 28077):
-                mode = "FT8"
-            if (7047.5 <= khz < 7050.5) or (10140 <= khz < 10143) or (14080 <= khz < 14083) or (
-                    18104 <= khz < 18107) or (21140 <= khz < 21143) or (24919 <= khz < 24922) or (28180 <= khz < 28183):
-                mode = "FT4"
-            return mode
-        except KeyError:
-            return None
-
-    # Utility method to get QRZ.com data from cache if possible, if not get it from the API and cache it
-    def get_qrz_data_for_callsign(self, call):
+    def _get_qrz_data_for_callsign(self, call):
+        """Utility method to get QRZ.com data from cache if possible, if not get it from the API and cache it"""
         # Fetch from cache if we can, otherwise fetch from the API and cache it
-        if call in self.QRZ_CALLSIGN_DATA_CACHE:
-            return self.QRZ_CALLSIGN_DATA_CACHE.get(call)
-        elif self.QRZ_AVAILABLE:
+        if call in self._qrz_callsign_data_cache:
+            return self._qrz_callsign_data_cache.get(call)
+        elif self._qrz_available:
             try:
-                data = self.LOOKUP_LIB_QRZ.lookup_callsign(callsign=call)
-                self.QRZ_CALLSIGN_DATA_CACHE.add(call, data, expire=604800)  # 1 week in seconds
+                data = self._lookup_lib_qrz.lookup_callsign(callsign=call)
+                self._qrz_callsign_data_cache.add(call, data, expire=604800)  # 1 week in seconds
                 return data
             except (KeyError, ValueError):
                 # QRZ had no info for the call, but maybe it had prefixes or suffixes. Try again with the base call.
                 try:
-                    data = self.LOOKUP_LIB_QRZ.lookup_callsign(callsign=callinfo.Callinfo.get_homecall(call))
-                    self.QRZ_CALLSIGN_DATA_CACHE.add(call, data, expire=604800)  # 1 week in seconds
+                    data = self._lookup_lib_qrz.lookup_callsign(callsign=callinfo.Callinfo.get_homecall(call))
+                    self._qrz_callsign_data_cache.add(call, data, expire=604800)  # 1 week in seconds
                     return data
                 except (KeyError, ValueError):
                     # QRZ had no info for the call, that's OK. Cache a None so we don't try to look this up again
-                    self.QRZ_CALLSIGN_DATA_CACHE.add(call, None, expire=604800)  # 1 week in seconds
+                    self._qrz_callsign_data_cache.add(call, None, expire=604800)  # 1 week in seconds
                     return None
-            except (Exception):
+            except Exception:
                 # General exception like a timeout when communicating with QRZ. Return None this time, but don't cache
                 # that, so we can try again next time.
                 logging.error("Exception when looking up QRZ data")
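The QRZ lookup above follows a pattern repeated across this changeset: check the cache, try the full callsign, fall back to the base ("home") call, and cache a `None` on a definite miss so the miss itself is remembered for a week. Sketched in isolation with a plain dict standing in for the on-disk cache (names here are illustrative, not the project's):

```python
def cached_lookup(call, cache, api_lookup, get_homecall):
    """Look up `call`, falling back to its base call; negative-cache definite misses."""
    if call in cache:
        return cache[call]
    for candidate in (call, get_homecall(call)):
        try:
            data = api_lookup(candidate)
            cache[call] = data  # cache under the original call, not the candidate
            return data
        except (KeyError, ValueError):
            continue  # definite miss for this candidate, try the next one
    cache[call] = None  # remember the miss so we don't ask the API again
    return None

cache = {}
lookup = lambda c: {"M0ABC": {"name": "Test"}}[c]  # raises KeyError on a miss
print(cached_lookup("M0ABC/P", cache, lookup, lambda c: c.split("/")[0]))
```

Transport errors (timeouts and the like) deliberately fall outside this sketch, matching the diff: those are caught separately and not cached, so the next lookup retries.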
@@ -473,16 +451,17 @@ class LookupHelper:
         else:
             return None
 
-    # Utility method to get HamQTH data from cache if possible, if not get it from the API and cache it
-    def get_hamqth_data_for_callsign(self, call):
+    def _get_hamqth_data_for_callsign(self, call):
+        """Utility method to get HamQTH data from cache if possible, if not get it from the API and cache it"""
         # Fetch from cache if we can, otherwise fetch from the API and cache it
-        if call in self.HAMQTH_CALLSIGN_DATA_CACHE:
-            return self.HAMQTH_CALLSIGN_DATA_CACHE.get(call)
-        elif self.HAMQTH_AVAILABLE:
+        if call in self._hamqth_callsign_data_cache:
+            return self._hamqth_callsign_data_cache.get(call)
+        elif self._hamqth_available:
             try:
                 # First we need to log in and get a session token.
-                session_data = self.HAMQTH_SESSION_LOOKUP_CACHE.get(
-                    self.HAMQTH_BASE_URL + "?u=" + urllib.parse.quote_plus(config["hamqth-username"]) +
+                session_data = self._hamqth_session_lookup_cache.get(
+                    self._hamqth_base_url + "?u=" + urllib.parse.quote_plus(config["hamqth-username"]) +
                     "&p=" + urllib.parse.quote_plus(config["hamqth-password"]), headers=HTTP_HEADERS).content
                 dict_data = xmltodict.parse(session_data)
                 if "session_id" in dict_data["HamQTH"]["session"]:
|
|||||||
# Now look up the actual data.
|
# Now look up the actual data.
|
||||||
try:
|
try:
|
||||||
lookup_data = SEMI_STATIC_URL_DATA_CACHE.get(
|
lookup_data = SEMI_STATIC_URL_DATA_CACHE.get(
|
||||||
self.HAMQTH_BASE_URL + "?id=" + session_id + "&callsign=" + urllib.parse.quote_plus(
|
self._hamqth_base_url + "?id=" + session_id + "&callsign=" + urllib.parse.quote_plus(
|
||||||
call) + "&prg=" + HAMQTH_PRG, headers=HTTP_HEADERS).content
|
call) + "&prg=" + HAMQTH_PRG, headers=HTTP_HEADERS).content
|
||||||
data = xmltodict.parse(lookup_data)["HamQTH"]["search"]
|
data = xmltodict.parse(lookup_data)["HamQTH"]["search"]
|
||||||
self.HAMQTH_CALLSIGN_DATA_CACHE.add(call, data, expire=604800) # 1 week in seconds
|
self._hamqth_callsign_data_cache.add(call, data, expire=604800) # 1 week in seconds
|
||||||
return data
|
return data
|
||||||
except (KeyError, ValueError):
|
except (KeyError, ValueError):
|
||||||
# HamQTH had no info for the call, but maybe it had prefixes or suffixes. Try again with the base call.
|
# HamQTH had no info for the call, but maybe it had prefixes or suffixes. Try again with the base call.
|
||||||
try:
|
try:
|
||||||
lookup_data = SEMI_STATIC_URL_DATA_CACHE.get(
|
lookup_data = SEMI_STATIC_URL_DATA_CACHE.get(
|
||||||
self.HAMQTH_BASE_URL + "?id=" + session_id + "&callsign=" + urllib.parse.quote_plus(
|
self._hamqth_base_url + "?id=" + session_id + "&callsign=" + urllib.parse.quote_plus(
|
||||||
callinfo.Callinfo.get_homecall(call)) + "&prg=" + HAMQTH_PRG, headers=HTTP_HEADERS).content
|
callinfo.Callinfo.get_homecall(call)) + "&prg=" + HAMQTH_PRG,
|
||||||
|
headers=HTTP_HEADERS).content
|
||||||
data = xmltodict.parse(lookup_data)["HamQTH"]["search"]
|
data = xmltodict.parse(lookup_data)["HamQTH"]["search"]
|
||||||
self.HAMQTH_CALLSIGN_DATA_CACHE.add(call, data, expire=604800) # 1 week in seconds
|
self._hamqth_callsign_data_cache.add(call, data, expire=604800) # 1 week in seconds
|
||||||
return data
|
return data
|
||||||
except (KeyError, ValueError):
|
except (KeyError, ValueError):
|
||||||
# HamQTH had no info for the call, that's OK. Cache a None so we don't try to look this up again
|
# HamQTH had no info for the call, that's OK. Cache a None so we don't try to look this up again
|
||||||
self.HAMQTH_CALLSIGN_DATA_CACHE.add(call, None, expire=604800) # 1 week in seconds
|
self._hamqth_callsign_data_cache.add(call, None, expire=604800) # 1 week in seconds
|
||||||
return None
|
return None
|
||||||
|
|
||||||
else:
|
else:
|
||||||
logging.warn("HamQTH login details incorrect, failed to look up with HamQTH.")
|
logging.warning("HamQTH login details incorrect, failed to look up with HamQTH.")
|
||||||
except:
|
except:
|
||||||
logging.error("Exception when looking up HamQTH data")
|
logging.error("Exception when looking up HamQTH data")
|
||||||
return None
|
return None
|
||||||
|
return None
|
||||||
|
|
||||||
|
def _get_clublog_api_data_for_callsign(self, call):
|
||||||
|
"""Utility method to get Clublog API data from cache if possible, if not get it from the API and cache it"""
|
||||||
|
|
||||||
# Utility method to get Clublog API data from cache if possible, if not get it from the API and cache it
|
|
||||||
def get_clublog_api_data_for_callsign(self, call):
|
|
||||||
# Fetch from cache if we can, otherwise fetch from the API and cache it
|
# Fetch from cache if we can, otherwise fetch from the API and cache it
|
||||||
if call in self.CLUBLOG_CALLSIGN_DATA_CACHE:
|
if call in self._clublog_callsign_data_cache:
|
||||||
return self.CLUBLOG_CALLSIGN_DATA_CACHE.get(call)
|
return self._clublog_callsign_data_cache.get(call)
|
||||||
elif self.CLUBLOG_API_AVAILABLE:
|
elif self._clublog_api_available:
|
||||||
try:
|
try:
|
||||||
data = self.LOOKUP_LIB_CLUBLOG_API.lookup_callsign(callsign=call)
|
data = self._lookup_lib_clublog_api.lookup_callsign(callsign=call)
|
||||||
self.CLUBLOG_CALLSIGN_DATA_CACHE.add(call, data, expire=604800) # 1 week in seconds
|
self._clublog_callsign_data_cache.add(call, data, expire=604800) # 1 week in seconds
|
||||||
return data
|
return data
|
||||||
except (KeyError, ValueError):
|
except (KeyError, ValueError):
|
||||||
# Clublog had no info for the call, but maybe it had prefixes or suffixes. Try again with the base call.
|
# Clublog had no info for the call, but maybe it had prefixes or suffixes. Try again with the base call.
|
||||||
try:
|
try:
|
||||||
data = self.LOOKUP_LIB_CLUBLOG_API.lookup_callsign(callsign=callinfo.Callinfo.get_homecall(call))
|
data = self._lookup_lib_clublog_api.lookup_callsign(callsign=callinfo.Callinfo.get_homecall(call))
|
||||||
self.CLUBLOG_CALLSIGN_DATA_CACHE.add(call, data, expire=604800) # 1 week in seconds
|
self._clublog_callsign_data_cache.add(call, data, expire=604800) # 1 week in seconds
|
||||||
return data
|
return data
|
||||||
except (KeyError, ValueError):
|
except (KeyError, ValueError):
|
||||||
# Clublog had no info for the call, that's OK. Cache a None so we don't try to look this up again
|
# Clublog had no info for the call, that's OK. Cache a None so we don't try to look this up again
|
||||||
self.CLUBLOG_CALLSIGN_DATA_CACHE.add(call, None, expire=604800) # 1 week in seconds
|
self._clublog_callsign_data_cache.add(call, None, expire=604800) # 1 week in seconds
|
||||||
return None
|
return None
|
||||||
except APIKeyMissingError:
|
except APIKeyMissingError:
|
||||||
# User API key was wrong, warn
|
# User API key was wrong, warn
|
||||||
logging.error("Could not look up via Clublog API, key " + self.CLUBLOG_API_KEY + " was rejected.")
|
logging.error("Could not look up via Clublog API, key " + self._clublog_api_key + " was rejected.")
|
||||||
return None
|
return None
|
||||||
else:
|
else:
|
||||||
return None
|
return None
|
||||||
|
|
||||||
# Utility method to get Clublog XML data from file
|
def _get_clublog_xml_data_for_callsign(self, call):
|
||||||
def get_clublog_xml_data_for_callsign(self, call):
|
"""Utility method to get Clublog XML data from file"""
|
||||||
if self.CLUBLOG_XML_AVAILABLE:
|
|
||||||
|
if self._clublog_xml_available:
|
||||||
try:
|
try:
|
||||||
data = self.LOOKUP_LIB_CLUBLOG_XML.lookup_callsign(callsign=call)
|
data = self._lookup_lib_clublog_xml.lookup_callsign(callsign=call)
|
||||||
return data
|
return data
|
||||||
except (KeyError, ValueError):
|
except (KeyError, ValueError):
|
||||||
# Clublog had no info for the call, that's OK. Cache a None so we don't try to look this up again
|
# Clublog had no info for the call, that's OK. Cache a None so we don't try to look this up again
|
||||||
self.CLUBLOG_CALLSIGN_DATA_CACHE.add(call, None, expire=604800) # 1 week in seconds
|
self._clublog_callsign_data_cache.add(call, None, expire=604800) # 1 week in seconds
|
||||||
return None
|
return None
|
||||||
else:
|
else:
|
||||||
return None
|
return None
|
||||||
|
|
||||||
# Utility method to get generic DXCC data from our lookup table, if we can find it
|
def _get_dxcc_data_for_callsign(self, call):
|
||||||
def get_dxcc_data_for_callsign(self, call):
|
"""Utility method to get generic DXCC data from our lookup table, if we can find it"""
|
||||||
for entry in self.DXCC_DATA.values():
|
|
||||||
if re.match(entry["prefixRegex"], call):
|
for entry in self._dxcc_data.values():
|
||||||
|
if entry["_prefixRegexCompiled"].match(call):
|
||||||
return entry
|
return entry
|
||||||
return None
|
return None
|
||||||
|
|
||||||
# Shutdown method to close down any caches neatly.
|
|
||||||
def stop(self):
|
def stop(self):
|
||||||
self.QRZ_CALLSIGN_DATA_CACHE.close()
|
"""Shutdown method to close down any caches neatly."""
|
||||||
self.CLUBLOG_CALLSIGN_DATA_CACHE.close()
|
|
||||||
|
self._qrz_callsign_data_cache.close()
|
||||||
|
self._clublog_callsign_data_cache.close()
|
||||||
|
|
||||||
|
|
||||||
# Singleton object
|
# Singleton object
|
||||||
lookup_helper = LookupHelper()
|
lookup_helper = LookupHelper()
|
||||||
|
|
||||||
|
def infer_mode_from_comment(comment):
|
||||||
|
"""Infer a mode from the comment"""
|
||||||
|
|
||||||
|
for mode in ALL_MODES:
|
||||||
|
if mode in comment.upper():
|
||||||
|
return mode
|
||||||
|
for mode in MODE_ALIASES.keys():
|
||||||
|
if mode in comment.upper():
|
||||||
|
return MODE_ALIASES[mode]
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
def infer_mode_type_from_mode(mode):
|
||||||
|
"""Infer a "mode family" from a mode."""
|
||||||
|
|
||||||
|
if mode.upper() in CW_MODES:
|
||||||
|
return "CW"
|
||||||
|
elif mode.upper() in PHONE_MODES:
|
||||||
|
return "PHONE"
|
||||||
|
elif mode.upper() in DATA_MODES:
|
||||||
|
return "DATA"
|
||||||
|
else:
|
||||||
|
if mode.upper() != "OTHER":
|
||||||
|
logging.warning("Found an unrecognised mode: " + mode + ". Developer should categorise this.")
|
||||||
|
return None
|
||||||
|
|
||||||
|
|
||||||
|
def infer_band_from_freq(freq):
|
||||||
|
"""Infer a band from a frequency in Hz"""
|
||||||
|
|
||||||
|
for b in BANDS:
|
||||||
|
if b.start_freq <= freq <= b.end_freq:
|
||||||
|
return b
|
||||||
|
return UNKNOWN_BAND
|
||||||
|
|
||||||
|
|
||||||
|
def infer_mode_from_frequency(freq):
|
||||||
|
"""Infer a mode from the frequency (in Hz) according to the band plan. Just a guess really."""
|
||||||
|
|
||||||
|
try:
|
||||||
|
khz = freq / 1000.0
|
||||||
|
mode = freq_to_band(khz)["mode"]
|
||||||
|
# Some additional common digimode ranges in addition to what the 3rd-party freq_to_band function returns.
|
||||||
|
# This is mostly here just because freq_to_band is very specific about things like FT8 frequencies, and e.g.
|
||||||
|
# a spot at 7074.5 kHz will be indicated as LSB, even though it's clearly in the FT8 range. Future updates
|
||||||
|
# might include other common digimode centres of activity here, but this achieves the main goal of keeping
|
||||||
|
# large numbers of clearly-FT* spots off the list of people filtering out digimodes.
|
||||||
|
if (7074 <= khz < 7077) or (10136 <= khz < 10139) or (14074 <= khz < 14077) or (18100 <= khz < 18103) or (
|
||||||
|
21074 <= khz < 21077) or (24915 <= khz < 24918) or (28074 <= khz < 28077):
|
||||||
|
mode = "FT8"
|
||||||
|
if (7047.5 <= khz < 7050.5) or (10140 <= khz < 10143) or (14080 <= khz < 14083) or (
|
||||||
|
18104 <= khz < 18107) or (21140 <= khz < 21143) or (24919 <= khz < 24922) or (28180 <= khz < 28183):
|
||||||
|
mode = "FT4"
|
||||||
|
return mode
|
||||||
|
except KeyError:
|
||||||
|
return None
|
||||||
|
|||||||
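The digimode override moved into the module-level `infer_mode_from_frequency` can be exercised on its own. This standalone sketch reproduces just the FT8/FT4 range checks from the hunk above, without the third-party `freq_to_band` dependency:

```python
def digimode_override(freq_hz):
    """Return "FT8" or "FT4" if the frequency (Hz) falls in a common FT* window, else None."""
    khz = freq_hz / 1000.0
    # Windows in kHz, matching the ranges in infer_mode_from_frequency
    ft8 = [(7074, 7077), (10136, 10139), (14074, 14077), (18100, 18103),
           (21074, 21077), (24915, 24918), (28074, 28077)]
    ft4 = [(7047.5, 7050.5), (10140, 10143), (14080, 14083),
           (18104, 18107), (21140, 21143), (24919, 24922), (28180, 28183)]
    if any(lo <= khz < hi for lo, hi in ft8):
        return "FT8"
    if any(lo <= khz < hi for lo, hi in ft4):
        return "FT4"
    return None

print(digimode_override(7074500))  # 7074.5 kHz -> "FT8"
```

This captures the intent stated in the code comments: a spot at 7074.5 kHz is clearly FT8 even though a strict band-plan lookup would label it LSB.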
@@ -31,6 +31,7 @@ memory_use_gauge = Gauge(
 )
 
 
-# Get a Prometheus metrics response for the web server
 def get_metrics():
+    """Get a Prometheus metrics response for the web server"""
+
     return generate_latest(registry)
@@ -8,18 +8,20 @@ from core.constants import SIGS, HTTP_HEADERS
 from core.geo_utils import wab_wai_square_to_lat_lon
 
 
-# Utility function to get the regex string for a SIG reference for a named SIG. If no match is found, None will be returned.
 def get_ref_regex_for_sig(sig):
+    """Utility function to get the regex string for a SIG reference for a named SIG. If no match is found, None will be returned."""
+
     for s in SIGS:
         if s.name.upper() == sig.upper():
             return s.ref_regex
     return None
 
 
-# Look up details of a SIG reference (e.g. POTA park) such as name, lat/lon, and grid. Takes in a sig_ref object which
-# must at minimum have a "sig" and an "id". The rest of the object will be populated and returned.
-# Note there is currently no support for KRMNPA location lookup, see issue #61.
 def populate_sig_ref_info(sig_ref):
+    """Look up details of a SIG reference (e.g. POTA park) such as name, lat/lon, and grid. Takes in a sig_ref object which
+    must at minimum have a "sig" and an "id". The rest of the object will be populated and returned.
+    Note there is currently no support for KRMNPA location lookup, see issue #61."""
+
     if sig_ref.sig is None or sig_ref.id is None:
         logging.warning("Failed to look up sig_ref info, sig or id were not set.")
 
@@ -67,7 +69,7 @@ def populate_sig_ref_info(sig_ref):
             sig_ref.longitude = data["longitude"] if "longitude" in data else None
         elif sig.upper() == "WWFF":
             wwff_csv_data = SEMI_STATIC_URL_DATA_CACHE.get("https://wwff.co/wwff-data/wwff_directory.csv",
                                                            headers=HTTP_HEADERS)
             wwff_dr = csv.DictReader(wwff_csv_data.content.decode().splitlines())
             for row in wwff_dr:
                 if row["reference"] == ref_id:
@@ -75,7 +77,8 @@ def populate_sig_ref_info(sig_ref):
                     sig_ref.url = "https://wwff.co/directory/?showRef=" + ref_id
                     sig_ref.grid = row["iaruLocator"] if "iaruLocator" in row and row["iaruLocator"] != "-" else None
                     sig_ref.latitude = float(row["latitude"]) if "latitude" in row and row["latitude"] != "-" else None
-                    sig_ref.longitude = float(row["longitude"]) if "longitude" in row and row["longitude"] != "-" else None
+                    sig_ref.longitude = float(row["longitude"]) if "longitude" in row and row[
+                        "longitude"] != "-" else None
                     break
         elif sig.upper() == "SIOTA":
             siota_csv_data = SEMI_STATIC_URL_DATA_CACHE.get("https://www.silosontheair.com/data/silos.csv",
@@ -124,7 +127,8 @@ def populate_sig_ref_info(sig_ref):
             sig_ref.name = sig_ref.id
             sig_ref.url = "https://www.beachesontheair.com/beaches/" + sig_ref.name.lower().replace(" ", "-")
         elif sig.upper() == "LLOTA":
-            data = SEMI_STATIC_URL_DATA_CACHE.get("https://llota.app/api/public/references", headers=HTTP_HEADERS).json()
+            data = SEMI_STATIC_URL_DATA_CACHE.get("https://llota.app/api/public/references",
+                                                  headers=HTTP_HEADERS).json()
             if data:
                 for ref in data:
                     if ref["reference_code"] == ref_id:
@@ -1,6 +1,6 @@
 import os
 from datetime import datetime
-from threading import Timer
+from threading import Thread, Event
 
 import psutil
 import pytz
@@ -10,70 +10,88 @@ from core.constants import SOFTWARE_VERSION
 from core.prometheus_metrics_handler import memory_use_gauge, spots_gauge, alerts_gauge
 
 
-# Provides a timed update of the application's status data.
 class StatusReporter:
+    """Provides a timed update of the application's status data."""
 
-    # Constructor
     def __init__(self, status_data, run_interval, web_server, cleanup_timer, spots, spot_providers, alerts,
-                 alert_providers):
-        self.status_data = status_data
-        self.run_interval = run_interval
-        self.web_server = web_server
-        self.cleanup_timer = cleanup_timer
-        self.spots = spots
-        self.spot_providers = spot_providers
-        self.alerts = alerts
-        self.alert_providers = alert_providers
-        self.run_timer = None
-        self.startup_time = datetime.now(pytz.UTC)
+                 alert_providers, solar_condition_providers):
+        """Constructor"""
 
-        self.status_data["software-version"] = SOFTWARE_VERSION
-        self.status_data["server-owner-callsign"] = SERVER_OWNER_CALLSIGN
+        self._status_data = status_data
+        self._run_interval = run_interval
+        self._web_server = web_server
+        self._cleanup_timer = cleanup_timer
+        self._spots = spots
+        self._spot_providers = spot_providers
+        self._alerts = alerts
+        self._alert_providers = alert_providers
+        self._solar_condition_providers = solar_condition_providers
+        self._thread = None
+        self._stop_event = Event()
+        self._startup_time = datetime.now(pytz.UTC)
 
-    # Start the cleanup timer
+        self._status_data["software-version"] = SOFTWARE_VERSION
+        self._status_data["server-owner-callsign"] = SERVER_OWNER_CALLSIGN
+
     def start(self):
-        self.run()
+        """Start the reporter thread"""
+
+        self._thread = Thread(target=self._run, daemon=True)
+        self._thread.start()
 
-    # Stop any threads and prepare for application shutdown
     def stop(self):
-        self.run_timer.cancel()
+        """Stop any threads and prepare for application shutdown"""
+
+        self._stop_event.set()
 
-    # Write status information and reschedule next timer
-    def run(self):
-        self.status_data["uptime"] = (datetime.now(pytz.UTC) - self.startup_time).total_seconds()
-        self.status_data["mem_use_mb"] = round(psutil.Process(os.getpid()).memory_info().rss / (1024 * 1024), 3)
-        self.status_data["num_spots"] = len(self.spots)
-        self.status_data["num_alerts"] = len(self.alerts)
-        self.status_data["spot_providers"] = list(
+    def _run(self):
+        """Thread entry point: report immediately on startup, then on each interval until stopped"""
+
+        while True:
+            self._report()
+            if self._stop_event.wait(timeout=self._run_interval):
+                break
+
+    def _report(self):
+        """Write status information"""
+
+        self._status_data["uptime"] = (datetime.now(pytz.UTC) - self._startup_time).total_seconds()
+        self._status_data["mem_use_mb"] = round(psutil.Process(os.getpid()).memory_info().rss / (1024 * 1024), 3)
+        self._status_data["num_spots"] = len(self._spots)
+        self._status_data["num_alerts"] = len(self._alerts)
+        self._status_data["spot_providers"] = list(
             map(lambda p: {"name": p.name, "enabled": p.enabled, "status": p.status,
                            "last_updated": p.last_update_time.replace(
                                tzinfo=pytz.UTC).timestamp() if p.last_update_time.year > 2000 else 0,
                            "last_spot": p.last_spot_time.replace(
-                               tzinfo=pytz.UTC).timestamp() if p.last_spot_time.year > 2000 else 0}, self.spot_providers))
-        self.status_data["alert_providers"] = list(
+                               tzinfo=pytz.UTC).timestamp() if p.last_spot_time.year > 2000 else 0},
+                self._spot_providers))
+        self._status_data["alert_providers"] = list(
             map(lambda p: {"name": p.name, "enabled": p.enabled, "status": p.status,
                            "last_updated": p.last_update_time.replace(
                                tzinfo=pytz.UTC).timestamp() if p.last_update_time.year > 2000 else 0},
-                self.alert_providers))
-        self.status_data["cleanup"] = {"status": self.cleanup_timer.status,
-                                       "last_ran": self.cleanup_timer.last_cleanup_time.replace(
-                                           tzinfo=pytz.UTC).timestamp() if self.cleanup_timer.last_cleanup_time else 0}
-        self.status_data["webserver"] = {"status": self.web_server.web_server_metrics["status"],
-                                         "last_api_access": self.web_server.web_server_metrics[
-                                             "last_api_access_time"].replace(
-                                             tzinfo=pytz.UTC).timestamp() if self.web_server.web_server_metrics[
-                                             "last_api_access_time"] else 0,
-                                         "api_access_count": self.web_server.web_server_metrics["api_access_counter"],
-                                         "last_page_access": self.web_server.web_server_metrics[
-                                             "last_page_access_time"].replace(
-                                             tzinfo=pytz.UTC).timestamp() if self.web_server.web_server_metrics[
-                                             "last_page_access_time"] else 0,
-                                         "page_access_count": self.web_server.web_server_metrics["page_access_counter"]}
+                self._alert_providers))
+        self._status_data["solar_condition_providers"] = list(
+            map(lambda p: {"name": p.name, "enabled": p.enabled, "status": p.status,
+                           "last_updated": p.last_update_time.replace(
+                               tzinfo=pytz.UTC).timestamp() if p.last_update_time.year > 2000 else 0},
+                self._solar_condition_providers))
+        self._status_data["cleanup"] = {"status": self._cleanup_timer.status,
+                                        "last_ran": self._cleanup_timer.last_cleanup_time.replace(
+                                            tzinfo=pytz.UTC).timestamp() if self._cleanup_timer.last_cleanup_time else 0}
+        self._status_data["webserver"] = {"status": self._web_server.web_server_metrics["status"],
+                                          "last_api_access": self._web_server.web_server_metrics[
+                                              "last_api_access_time"].replace(
+                                              tzinfo=pytz.UTC).timestamp() if self._web_server.web_server_metrics[
+                                              "last_api_access_time"] else 0,
+                                          "api_access_count": self._web_server.web_server_metrics["api_access_counter"],
+                                          "last_page_access": self._web_server.web_server_metrics[
+                                              "last_page_access_time"].replace(
+                                              tzinfo=pytz.UTC).timestamp() if self._web_server.web_server_metrics[
+                                              "last_page_access_time"] else 0,
+                                          "page_access_count": self._web_server.web_server_metrics["page_access_counter"]}
 
         # Update Prometheus metrics
-        memory_use_gauge.set(psutil.Process(os.getpid()).memory_info().rss * 1024)
-        spots_gauge.set(len(self.spots))
-        alerts_gauge.set(len(self.alerts))
-
-        self.run_timer = Timer(self.run_interval, self.run)
-        self.run_timer.start()
+        memory_use_gauge.set(psutil.Process(os.getpid()).memory_info().rss)
+        spots_gauge.set(len(self._spots))
+        alerts_gauge.set(len(self._alerts))
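The StatusReporter change above swaps a self-rescheduling `threading.Timer` for a daemon thread that loops on `Event.wait()`. The advantage is that `stop()` wakes the thread immediately instead of leaving a pending timer to cancel, and one report always runs straight away on startup. The pattern in isolation (illustrative names, not the project's):

```python
from threading import Thread, Event

class PeriodicTask:
    """Run `fn` immediately and then every `interval` seconds until stopped."""

    def __init__(self, fn, interval):
        self._fn = fn
        self._interval = interval
        self._stop_event = Event()
        self._thread = Thread(target=self._run, daemon=True)

    def start(self):
        self._thread.start()

    def stop(self):
        self._stop_event.set()  # wakes the wait() below without delay
        self._thread.join()

    def _run(self):
        while True:
            self._fn()
            # wait() returns True as soon as stop() sets the event,
            # or False when the timeout elapses (time for the next run)
            if self._stop_event.wait(timeout=self._interval):
                break

counts = []
task = PeriodicTask(lambda: counts.append(1), interval=60)
task.start()
task.stop()  # returns promptly despite the 60 second interval
print(len(counts) >= 1)  # True: one report ran immediately on start
```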
@@ -1,5 +1,15 @@
-# Convert objects to serialisable things. Used by JSON serialiser as a default when it encounters unserializable things.
-# Just converts objects to dict. Try to avoid doing anything clever here when serialising spots, because we also need
-# to receive spots without complex handling.
 def serialize_everything(obj):
+    """Convert objects to serialisable things. Used by JSON serialiser as a default when it encounters unserializable things.
+    Just converts objects to dict. Try to avoid doing anything clever here when serialising spots, because we also need
+    to receive spots without complex handling."""
     return obj.__dict__
+
+
+def empty_queue(q):
+    """Empty a queue"""
+
+    while not q.empty():
+        try:
+            q.get_nowait()
+        except:
+            break
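The new `empty_queue` helper drains a queue without ever blocking: `get_nowait()` raises instead of waiting if another consumer emptied the queue between the `empty()` check and the get. Usage with a standard `queue.Queue` (this sketch catches `queue.Empty` explicitly rather than the hunk's bare `except`):

```python
import queue

def empty_queue(q):
    """Drain all items from a queue without blocking."""
    while not q.empty():
        try:
            q.get_nowait()
        except queue.Empty:
            # Another consumer got there first; nothing left to drain
            break

q = queue.Queue()
for i in range(5):
    q.put(i)
empty_queue(q)
print(q.empty())  # True
```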
@@ -10,9 +10,10 @@ from core.lookup_helper import lookup_helper
 from core.sig_utils import populate_sig_ref_info
 
 
-# Data class that defines an alert.
 @dataclass
 class Alert:
+    """Data class that defines an alert."""
+
     # Unique identifier for the alert
     id: str = None
     # Callsigns of the operators that has been alerted
@@ -60,8 +61,9 @@ class Alert:
     # The ID the source gave it, if any.
     source_id: str = None
 
-    # Infer missing parameters where possible
     def infer_missing(self):
+        """Infer missing parameters where possible"""
+
         # If we somehow don't have a start time, set it to zero so it sorts off the bottom of any list but
         # clients can still reliably parse it as a number.
         if not self.start_time:
@@ -79,7 +81,8 @@ class Alert:
         if self.received_time and not self.received_time_iso:
             self.received_time_iso = datetime.fromtimestamp(self.received_time, pytz.UTC).isoformat()
 
-        # DX country, continent, zones etc. from callsign
+        # DX country, continent, zones etc. from callsign. CQ/ITU zone are better looked up with a location but we don't
+        # have a real location for alerts.
         if self.dx_calls and self.dx_calls[0] and not self.dx_country:
             self.dx_country = lookup_helper.infer_country_from_callsign(self.dx_calls[0])
         if self.dx_calls and self.dx_calls[0] and not self.dx_continent:
@@ -102,7 +105,7 @@ class Alert:
 
         # If the spot itself doesn't have a SIG yet, but we have at least one SIG reference, take that reference's SIG
         # and apply it to the whole spot.
-        if self.sig_refs and len(self.sig_refs) > 0 and not self.sig:
+        if self.sig_refs and len(self.sig_refs) > 0 and self.sig_refs[0] and not self.sig:
             self.sig = self.sig_refs[0].sig
 
         # DX operator details lookup, using QRZ.com. This should be the last resort compared to taking the data from
@@ -121,14 +124,16 @@ class Alert:
         self_copy.received_time_iso = ""
         self.id = hashlib.sha256(str(self_copy).encode("utf-8")).hexdigest()
 
-    # JSON serialise
     def to_json(self):
+        """JSON serialise"""
+
         return json.dumps(self, default=lambda o: o.__dict__, sort_keys=True)
 
-    # Decide if this alert has expired (in which case it should not be added to the system in the first place, and not
-    # returned by the web server if later requested, and removed by the cleanup functions). "Expired" is defined as
-    # either having an end_time in the past, or if it only has a start_time, then that start time was more than 3 hours
-    # ago. If it somehow doesn't have a start_time either, it is considered to be expired.
     def expired(self):
+        """Decide if this alert has expired (in which case it should not be added to the system in the first place, and not
+        returned by the web server if later requested, and removed by the cleanup functions). "Expired" is defined as
+        either having an end_time in the past, or if it only has a start_time, then that start time was more than 3 hours
+        ago. If it somehow doesn't have a start_time either, it is considered to be expired."""
+
         return not self.start_time or (self.end_time and self.end_time < datetime.now(pytz.UTC).timestamp()) or (
                 not self.end_time and self.start_time < (datetime.now(pytz.UTC) - timedelta(hours=3)).timestamp())
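The expiry rule that this commit moves from comments into the `expired()` docstring can be exercised standalone. This is an illustration only, not Spothole code: the function name is hypothetical, the stdlib `timezone.utc` stands in for pytz, and times are UNIX timestamps as in the dataclass.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical standalone version of the Alert.expired() rule from the diff:
# expired if there is no start_time, or end_time is in the past, or (with no
# end_time) start_time was more than 3 hours ago. Times are UNIX timestamps.
def alert_expired(start_time, end_time, now=None):
    now = now if now is not None else datetime.now(timezone.utc).timestamp()
    if not start_time:
        return True
    if end_time:
        return end_time < now
    return start_time < now - timedelta(hours=3).total_seconds()

t = datetime(2025, 1, 1, 12, 0, tzinfo=timezone.utc).timestamp()
print(alert_expired(None, None, t))            # no start time -> True
print(alert_expired(t - 600, t + 3600, t))     # still running -> False
print(alert_expired(t - 4 * 3600, None, t))    # started 4h ago, no end -> True
```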
@@ -1,8 +1,10 @@
 from dataclasses import dataclass
 
-# Data class that defines a band.
+
 @dataclass
 class Band:
+    """Data class that defines a band."""
+
     # Band name
     name: str
     # Start frequency, in Hz
@@ -1,8 +1,10 @@
 from dataclasses import dataclass
 
-# Data class that defines a Special Interest Group.
+
 @dataclass
 class SIG:
+    """Data class that defines a Special Interest Group."""
+
     # SIG name, e.g. "POTA"
     name: str
     # Description, e.g. "Parks on the Air"
@@ -1,9 +1,11 @@
 from dataclasses import dataclass
 
-# Data class that defines a Special Interest Group "info" or reference. As well as the basic reference ID we include a
-# name and a lookup URL.
+
 @dataclass
 class SIGRef:
+    """Data class that defines a Special Interest Group "info" or reference. As well as the basic reference ID we include a
+    name and a lookup URL."""
+
     # Reference ID, e.g. "GB-0001".
     id: str
     # SIG that this reference is in, e.g. "POTA".
data/solar_conditions.py (new file, 168 lines)
@@ -0,0 +1,168 @@
+import json
+from dataclasses import dataclass
+
+# Lookup tables for derived text descriptions.
+# Each threshold-based table is a list of (min_value, description) pairs in descending order;
+# the first entry whose threshold the value meets or exceeds is used.
+
+BLACKOUT_DESCRIPTIONS = {
+    "X": "Wide area HF radio blackout across sunlit side",
+    "M": "Occasional loss of HF communications on sunlit side",
+    "C": "Low absorption of HF signals on sunlit side",
+    "B": "No significant radio blackout",
+    "A": "No impact",
+}
+
+PROTON_FLUX_DESCRIPTIONS = [
+    (1000000, "Complete HF blackout in polar regions"),
+    (100000, "Partial HF blackout in polar regions"),
+    (10000, "Degraded HF propagation in polar regions"),
+    (1000, "Small effect on HF propagation in polar regions"),
+    (100, "Minor effect on HF propagation in polar regions"),
+    (10, "Very minor effect on HF propagation in polar regions"),
+    (0, "No impact"),
+]
+
+SOLAR_STORM_SCALES = [
+    (100000, 5),
+    (10000, 4),
+    (1000, 3),
+    (100, 2),
+    (10, 1),
+    (0, 0),
+]
+
+GEOMAG_STORM_DESCRIPTIONS = [
+    (9, "Complete HF blackout"),
+    (8, "HF sporadic only"),
+    (7, "HF intermittent"),
+    (6, "HF fading at higher latitudes"),
+    (5, "HF fading at higher latitudes"),
+    (4, "Minor HF fading at higher latitudes"),
+    (3, "Minor HF fading at higher latitudes"),
+    (2, "No impact"),
+    (1, "No impact"),
+    (0, "No impact"),
+]
+
+GEOMAG_STORM_SCALES = [
+    (9, 5),
+    (8, 4),
+    (7, 3),
+    (6, 2),
+    (5, 1),
+    (0, 0),
+]
+
+BAND_CONDITIONS_DESCRIPTIONS = [
+    (200, "Reliable conditions on all bands including 6m"),
+    (150, "Excellent conditions on all bands up to 10m, occasional 6m openings"),
+    (120, "Fair to good conditions on all bands up to 10m"),
+    (90, "Fair conditions on bands up to 15m"),
+    (70, "Poor to fair conditions on bands up to 20m"),
+    (0, "Bands above 40m unusable"),
+]
+
+ELECTRON_FLUX_DESCRIPTIONS = [
+    (1000, "Partial to complete HF blackout in polar regions"),
+    (100, "Degraded HF propagation in polar regions"),
+    (10, "Minor impact on HF in polar regions"),
+    (0, "No impact"),
+]
+
+
+def _lookup_by_threshold(value, table, default=None):
+    """Return the description from a threshold table for the given numeric value.
+    The table is a list of (min_value, description) pairs in descending order."""
+
+    if value is None:
+        return default
+    for threshold, description in table:
+        if value >= threshold:
+            return description
+    return default
+
+
+@dataclass
+class HFBandCondition:
+    """Data class representing HF propagation conditions for certain bands and time of day."""
+
+    # Band name, e.g. "80m-40m", "20m-17m", "10m-6m"
+    band: str = None
+    # Time of day: "day" or "night"
+    time: str = None
+    # Propagation condition: "Good", "Fair", or "Poor"
+    condition: str = None
+
+
+@dataclass
+class SolarConditions:
+    """Data class representing current solar and propagation conditions."""
+
+    # Time the data was last updated at the source, UTC seconds since UNIX epoch
+    updated: float = None
+    # Solar Flux Index (SFI)
+    sfi: int = None
+    # A-index (daily geomagnetic activity)
+    a_index: int = None
+    # K-index (3-hour geomagnetic activity)
+    k_index: int = None
+    # X-ray flux class, e.g. "B2.3", "C1.0"
+    x_ray: str = None
+    # Proton flux
+    proton_flux: int = None
+    # Electron flux
+    electron_flux: int = None
+    # Aurora activity level
+    aurora: int = None
+    # Latitude in degrees of the aurora boundary
+    aurora_latitude: float = None
+    # Sunspot count
+    sunspots: int = None
+    # Solar wind speed in km/s
+    solar_wind: float = None
+    # Interplanetary magnetic field strength in nT
+    magnetic_field: float = None
+    # Geomagnetic field condition, e.g. "Quiet", "Unsettled", "Active", "Storm"
+    geomag_field: str = None
+    # Geomagnetic background noise level, e.g. "S0", "S1", "S2"
+    geomag_noise: str = None
+    # HF band propagation conditions, keyed by "{band}-{time}" e.g. "80m-40m-day"
+    hf_conditions: dict = None
+    # VHF propagation conditions, keyed by condition name
+    vhf_conditions: dict = None
+
+    # Derived values (populated by infer_descriptions())
+    # HF radio blackout risk description, derived from x_ray
+    blackout_desc: str = None
+    # Solar radiation storm level description, derived from proton_flux
+    proton_flux_desc: str = None
+    # Solar radiation storm scale number (S0-S5), derived from proton_flux
+    solar_storm_scale: int = None
+    # Geomagnetic storm level description, derived from k_index
+    geomag_storm_desc: str = None
+    # Geomagnetic storm scale number (G0-G5), derived from k_index
+    geomag_storm_scale: int = None
+    # Overall HF band conditions summary, derived from sfi
+    band_conditions_desc: str = None
+    # Electron flux description, derived from electron_flux
+    electron_flux_desc: str = None
+
+    def infer_descriptions(self):
+        """Populate derived text description fields from the current numeric/raw field values."""
+
+        # blackout_desc: use the X-ray flux class letter (first character of x_ray)
+        if self.x_ray and len(self.x_ray) > 0:
+            self.blackout_desc = BLACKOUT_DESCRIPTIONS.get(self.x_ray[0].upper())
+
+        self.proton_flux_desc = _lookup_by_threshold(self.proton_flux, PROTON_FLUX_DESCRIPTIONS)
+        self.solar_storm_scale = _lookup_by_threshold(self.proton_flux, SOLAR_STORM_SCALES)
+        self.geomag_storm_desc = _lookup_by_threshold(self.k_index, GEOMAG_STORM_DESCRIPTIONS)
+        self.geomag_storm_scale = _lookup_by_threshold(self.k_index, GEOMAG_STORM_SCALES)
+        self.band_conditions_desc = _lookup_by_threshold(self.sfi, BAND_CONDITIONS_DESCRIPTIONS)
+        self.electron_flux_desc = _lookup_by_threshold(self.electron_flux, ELECTRON_FLUX_DESCRIPTIONS)
+
+    def to_json(self):
+        """JSON serialise"""
+
+        return json.dumps(self, default=lambda o: o.__dict__, sort_keys=True)
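All the descending `(min_value, description)` tables in the new file feed the same `_lookup_by_threshold()` helper: the first entry whose threshold the value meets or exceeds wins. A self-contained sketch of that pattern, with the `GEOMAG_STORM_SCALES` values copied from the diff (the standalone function name is an illustration):

```python
# Threshold-table lookup as used by data/solar_conditions.py: the table must be
# sorted in descending threshold order, and a None value yields the default.
GEOMAG_STORM_SCALES = [
    (9, 5),
    (8, 4),
    (7, 3),
    (6, 2),
    (5, 1),
    (0, 0),
]

def lookup_by_threshold(value, table, default=None):
    if value is None:
        return default
    for threshold, description in table:
        if value >= threshold:
            return description
    return default

print(lookup_by_threshold(7, GEOMAG_STORM_SCALES))     # K-index 7 -> G3
print(lookup_by_threshold(2, GEOMAG_STORM_SCALES))     # K-index 2 -> G0
print(lookup_by_threshold(None, GEOMAG_STORM_SCALES))  # missing data -> None
```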
data/spot.py
@@ -11,14 +11,17 @@ from pyhamtools.locator import locator_to_latlong, latlong_to_locator
 
 from core.config import MAX_SPOT_AGE
 from core.constants import MODE_ALIASES
-from core.lookup_helper import lookup_helper
+from core.geo_utils import lat_lon_to_cq_zone, lat_lon_to_itu_zone
+from core.lookup_helper import lookup_helper, infer_band_from_freq, infer_mode_from_comment, infer_mode_from_frequency, \
+    infer_mode_type_from_mode
 from core.sig_utils import populate_sig_ref_info, ANY_SIG_REGEX, get_ref_regex_for_sig
 from data.sig_ref import SIGRef
 
 
-# Data class that defines a spot.
 @dataclass
 class Spot:
+    """Data class that defines a spot."""
+
     # Unique identifier for the spot
     id: str = None
 
@@ -128,8 +131,9 @@ class Spot:
     # The ID the source gave it, if any.
     source_id: str = None
 
-    # Infer missing parameters where possible
    def infer_missing(self):
+        """Infer missing parameters where possible"""
+
         # If we somehow don't have a spot time, set it to zero so it sorts off the bottom of any list but
         # clients can still reliably parse it as a number.
         if not self.time:
@@ -152,15 +156,11 @@ class Spot:
             if len(split) > 1 and split[1] != "#":
                 self.dx_ssid = split[1]
 
-        # DX country, continent, zones etc. from callsign
+        # DX country, continent etc. from callsign
         if self.dx_call and not self.dx_country:
             self.dx_country = lookup_helper.infer_country_from_callsign(self.dx_call)
         if self.dx_call and not self.dx_continent:
             self.dx_continent = lookup_helper.infer_continent_from_callsign(self.dx_call)
-        if self.dx_call and not self.dx_cq_zone:
-            self.dx_cq_zone = lookup_helper.infer_cq_zone_from_callsign(self.dx_call)
-        if self.dx_call and not self.dx_itu_zone:
-            self.dx_itu_zone = lookup_helper.infer_itu_zone_from_callsign(self.dx_call)
         if self.dx_call and not self.dx_dxcc_id:
             self.dx_dxcc_id = lookup_helper.infer_dxcc_id_from_callsign(self.dx_call)
         if self.dx_dxcc_id and not self.dx_flag:
@@ -189,7 +189,8 @@ class Spot:
 
         # Spotter country, continent, zones etc. from callsign.
         # DE call with no digits, or APRS servers starting "T2" are not things we can look up location for
-        if self.de_call and any(char.isdigit() for char in self.de_call) and not (self.de_call.startswith("T2") and self.source == "APRS-IS"):
+        if self.de_call and any(char.isdigit() for char in self.de_call) and not (
+                self.de_call.startswith("T2") and self.source == "APRS-IS"):
             if not self.de_country:
                 self.de_country = lookup_helper.infer_country_from_callsign(self.de_call)
             if not self.de_continent:
@@ -199,19 +200,23 @@ class Spot:
         if self.de_dxcc_id and not self.de_flag:
             self.de_flag = lookup_helper.get_flag_for_dxcc(self.de_dxcc_id)
 
+        # Remove NaNs in frequency
+        if self.freq and self.freq == float("nan"):
+            self.freq = None
+
         # Band from frequency
         if self.freq and not self.band:
-            band = lookup_helper.infer_band_from_freq(self.freq)
+            band = infer_band_from_freq(self.freq)
             self.band = band.name
 
         # Mode from comments or bandplan
         if self.mode:
             self.mode_source = "SPOT"
         if self.comment and not self.mode:
-            self.mode = lookup_helper.infer_mode_from_comment(self.comment)
+            self.mode = infer_mode_from_comment(self.comment)
             self.mode_source = "COMMENT"
         if self.freq and not self.mode:
-            self.mode = lookup_helper.infer_mode_from_frequency(self.freq)
+            self.mode = infer_mode_from_frequency(self.freq)
             self.mode_source = "BANDPLAN"
 
         # Normalise mode if necessary.
@@ -220,7 +225,7 @@ class Spot:
 
         # Mode type from mode
         if self.mode and not self.mode_type:
-            self.mode_type = lookup_helper.infer_mode_type_from_mode(self.mode)
+            self.mode_type = infer_mode_type_from_mode(self.mode)
 
         # If we have a latitude or grid at this point, it can only have been provided by the spot itself
         if self.dx_latitude or self.dx_grid:
@@ -238,7 +243,7 @@ class Spot:
                 if regex:
                     all_comment_ref_matches = re.finditer(r"(^|\W)(" + regex + r")(^|\W)", self.comment, re.IGNORECASE)
                     for ref_match in all_comment_ref_matches:
-                        self.append_sig_ref_if_missing(SIGRef(id=ref_match.group(2).upper(), sig=sig))
+                        self._append_sig_ref_if_missing(SIGRef(id=ref_match.group(2).upper(), sig=sig))
 
         # See if the comment looks like it contains any SIGs (and optionally SIG references) that we can
         # add to the spot. This should catch cluster spot comments like "POTA GB-0001 WWFF GFF-0001" and e.g. POTA
@@ -256,9 +261,10 @@ class Spot:
                 # If so, add that to the sig_refs list for this spot.
                 ref_regex = get_ref_regex_for_sig(found_sig)
                 if ref_regex:
-                    ref_matches = re.finditer(r"(^|\W)" + found_sig + r"($|\W)(" + ref_regex + r")($|\W)", self.comment, re.IGNORECASE)
+                    ref_matches = re.finditer(r"(^|\W)" + found_sig + r"($|\W)(" + ref_regex + r")($|\W)", self.comment,
+                                              re.IGNORECASE)
                     for ref_match in ref_matches:
-                        self.append_sig_ref_if_missing(SIGRef(id=ref_match.group(3).upper(), sig=found_sig))
+                        self._append_sig_ref_if_missing(SIGRef(id=ref_match.group(3).upper(), sig=found_sig))
 
         # Fetch SIG data. In case a particular API doesn't provide a full set of name, lat, lon & grid for a reference
         # in its initial call, we use this code to populate the rest of the data. This includes working out grid refs
@@ -285,7 +291,6 @@ class Spot:
         # DX Grid to lat/lon and vice versa in case one is missing
         if self.dx_grid and not self.dx_latitude:
             try:
-                print(json.dumps(self))
                 ll = locator_to_latlong(self.dx_grid)
                 self.dx_latitude = ll[0]
                 self.dx_longitude = ll[1]
@@ -332,15 +337,34 @@ class Spot:
                 self.dx_grid = lookup_helper.infer_grid_from_callsign_dxcc(self.dx_call)
                 self.dx_location_source = "DXCC"
 
+        # It looks like we can sometimes get a string into lat/lon, so reject that before we try looking anything up
+        if isinstance(self.dx_latitude, str) or isinstance(self.dx_longitude, str):
+            logging.warning("Received strings in lat/lon (" + str(self.dx_latitude) + ", " + str(self.dx_longitude) + ") for call " + self.dx_call + ", rejecting it")
+            self.dx_latitude = None
+            self.dx_longitude = None
+
+        # CQ and ITU zone lookup, preferably from location but failing that, from callsign
+        if not self.dx_cq_zone:
+            if self.dx_latitude:
+                self.dx_cq_zone = lat_lon_to_cq_zone(self.dx_latitude, self.dx_longitude)
+            elif self.dx_call:
+                self.dx_cq_zone = lookup_helper.infer_cq_zone_from_callsign(self.dx_call)
+        if not self.dx_itu_zone:
+            if self.dx_latitude:
+                self.dx_itu_zone = lat_lon_to_itu_zone(self.dx_latitude, self.dx_longitude)
+            elif self.dx_call:
+                self.dx_itu_zone = lookup_helper.infer_itu_zone_from_callsign(self.dx_call)
+
         # DX Location is "good" if it is from a spot, or from QRZ if the callsign doesn't contain a slash, so the operator
         # is likely at home.
         self.dx_location_good = self.dx_latitude and self.dx_longitude and (
                 self.dx_location_source == "SPOT" or self.dx_location_source == "SIG REF LOOKUP"
                 or self.dx_location_source == "WAB/WAI GRID"
                 or (self.dx_location_source == "HOME QTH" and not "/" in self.dx_call))
 
         # DE with no digits and APRS servers starting "T2" are not things we can look up location for
-        if self.de_call and any(char.isdigit() for char in self.de_call) and not (self.de_call.startswith("T2") and self.source == "APRS-IS"):
+        if self.de_call and any(char.isdigit() for char in self.de_call) and not (
+                self.de_call.startswith("T2") and self.source == "APRS-IS"):
             # DE operator position lookup, using QRZ.com.
             if not self.de_latitude:
                 latlon = lookup_helper.infer_latlon_from_callsign_online_lookup(self.de_call)
@@ -367,12 +391,14 @@ class Spot:
         self_copy.received_time_iso = ""
         self.id = hashlib.sha256(str(self_copy).encode("utf-8")).hexdigest()
 
-    # JSON sspoterialise
     def to_json(self):
+        """JSON serialise"""
+
         return json.dumps(self, default=lambda o: o.__dict__, sort_keys=True)
 
-    # Append a sig_ref to the list, so long as it's not already there.
-    def append_sig_ref_if_missing(self, new_sig_ref):
+    def _append_sig_ref_if_missing(self, new_sig_ref):
+        """Append a sig_ref to the list, so long as it's not already there."""
+
         if not self.sig_refs:
             self.sig_refs = []
         new_sig_ref.id = new_sig_ref.id.strip().upper()
@@ -384,9 +410,10 @@ class Spot:
             return
         self.sig_refs.append(new_sig_ref)
 
-    # Decide if this spot has expired (in which case it should not be added to the system in the first place, and not
-    # returned by the web server if later requested, and removed by the cleanup functions). "Expired" is defined as
-    # either having a time further ago than the server's MAX_SPOT_AGE. If it somehow doesn't have a time either, it is
-    # considered to be expired.
     def expired(self):
+        """Decide if this spot has expired (in which case it should not be added to the system in the first place, and not
+        returned by the web server if later requested, and removed by the cleanup functions). "Expired" is defined as
+        either having a time further ago than the server's MAX_SPOT_AGE. If it somehow doesn't have a time either, it is
+        considered to be expired."""
+
        return not self.time or self.time < (datetime.now(pytz.UTC) - timedelta(seconds=MAX_SPOT_AGE)).timestamp()
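Both `Alert` and `Spot` derive their `id` the same way: blank the received-time fields on a copy, then SHA-256 the string form, so the same spot hashes identically no matter when a server received it. A hypothetical miniature of that scheme (the `MiniSpot` class and its values are invented for illustration):

```python
import hashlib
from dataclasses import dataclass, replace

# Hypothetical miniature of the spot ID scheme in the diff: copy the spot,
# blank out the field that varies between servers (received time), then hash
# the string form so the same spot gets the same ID everywhere.
@dataclass
class MiniSpot:
    dx_call: str
    freq: float
    received_time: float = None

def spot_id(spot):
    stable = replace(spot, received_time=None)  # copy with the varying field blanked
    return hashlib.sha256(str(stable).encode("utf-8")).hexdigest()

a = MiniSpot("N0CALL", 14285000.0, received_time=1700000000.0)
b = MiniSpot("N0CALL", 14285000.0, received_time=1700000123.0)
print(spot_id(a) == spot_id(b))  # True: received time doesn't change the ID
```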
datafiles/cqzones.geojson (new file, 134817 lines): file diff suppressed because it is too large
datafiles/ituzones.geojson (new file, 73598 lines): file diff suppressed because it is too large
(binary image file not shown; before: 189 KiB, after: 194 KiB)
@@ -3,15 +3,17 @@ requests-cache~=1.2.1
 pyhamtools~=0.12.0
 telnetlib3~=2.0.8
 pytz~=2025.2
-requests~=2.32.5
+requests~=2.32.4
 aprslib~=0.7.2
 diskcache~=5.6.3
 psutil~=7.1.0
 requests-sse~=0.5.2
-rss-parser~=2.1.1
-pyproj~=3.7.2
-prometheus_client~=0.23.1
+rss-parser~=1.1.1
+pyproj~=3.5.0;python_version<="3.8"
+pyproj~=3.7.2;python_version>"3.8"
+prometheus_client~=0.21.1
 beautifulsoup4~=4.14.2
-websocket-client~=1.9.0
-tornado~=6.5.4
+websocket-client~=1.8.0
+tornado~=6.4.2
 tornado_eventsource~=3.0.0
+geopandas~=0.13.2
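The split `pyproj` lines above use PEP 508 environment markers, so pip installs at most one of them depending on the interpreter version. A rough sketch of how the `python_version` marker selects a line; note that real marker evaluation compares version strings by version ordering, approximated here with a tuple comparison, and the helper function is hypothetical:

```python
# Hypothetical illustration of how pip picks between the two pyproj requirement
# lines: the PEP 508 marker python_version<="3.8" matches old interpreters,
# python_version>"3.8" matches new ones, so exactly one line applies.
def select_requirement(python_version: str) -> str:
    # python_version is e.g. "3.8" or "3.13" (major.minor, per PEP 508)
    major, minor = (int(x) for x in python_version.split("."))
    if (major, minor) <= (3, 8):
        return 'pyproj~=3.5.0'
    return 'pyproj~=3.7.2'

print(select_requirement("3.8"))   # -> pyproj~=3.5.0
print(select_requirement("3.13"))  # -> pyproj~=3.7.2
```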
@@ -8,7 +8,7 @@ import tornado
 
 from core.config import ALLOW_SPOTTING, MAX_SPOT_AGE
 from core.constants import UNKNOWN_BAND
-from core.lookup_helper import lookup_helper
+from core.lookup_helper import infer_band_from_freq
 from core.prometheus_metrics_handler import api_requests_counter
 from core.sig_utils import get_ref_regex_for_sig
 from core.utils import serialize_everything
@@ -16,33 +16,36 @@ from data.sig_ref import SIGRef
 from data.spot import Spot
 
 
-# API request handler for /api/v1/spot (POST)
 class APISpotHandler(tornado.web.RequestHandler):
+    """API request handler for /api/v1/spot (POST)"""
+
     def initialize(self, spots, web_server_metrics):
-        self.spots = spots
-        self.web_server_metrics = web_server_metrics
+        self._spots = spots
+        self._web_server_metrics = web_server_metrics
 
     def post(self):
         try:
             # Metrics
-            self.web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
-            self.web_server_metrics["api_access_counter"] += 1
-            self.web_server_metrics["status"] = "OK"
+            self._web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
+            self._web_server_metrics["api_access_counter"] += 1
+            self._web_server_metrics["status"] = "OK"
             api_requests_counter.inc()
 
             # Reject if not allowed
             if not ALLOW_SPOTTING:
                 self.set_status(401)
                 self.write(json.dumps("Error - this server does not allow new spots to be added via the API.",
                                       default=serialize_everything))
                 self.set_header("Cache-Control", "no-store")
                 self.set_header("Content-Type", "application/json")
                 return
 
             # Reject if format not json
-            if 'Content-Type' not in self.request.headers or self.request.headers.get('Content-Type') != "application/json":
+            if 'Content-Type' not in self.request.headers or self.request.headers.get(
+                    'Content-Type') != "application/json":
                 self.set_status(415)
-                self.write(json.dumps("Error - request Content-Type must be application/json", default=serialize_everything))
+                self.write(
+                    json.dumps("Error - request Content-Type must be application/json", default=serialize_everything))
                 self.set_header("Cache-Control", "no-store")
                 self.set_header("Content-Type", "application/json")
                 return
@@ -72,7 +75,7 @@ class APISpotHandler(tornado.web.RequestHandler):
             if not spot.time or not spot.dx_call or not spot.freq or not spot.de_call:
                 self.set_status(422)
                 self.write(json.dumps("Error - 'time', 'dx_call', 'freq' and 'de_call' must be provided as a minimum.",
                                       default=serialize_everything))
                 self.set_header("Cache-Control", "no-store")
                 self.set_header("Content-Type", "application/json")
                 return
@@ -81,23 +84,23 @@ class APISpotHandler(tornado.web.RequestHandler):
             if not re.match(r"^[A-Za-z0-9/\-]*$", spot.dx_call):
                 self.set_status(422)
                 self.write(json.dumps("Error - '" + spot.dx_call + "' does not look like a valid callsign.",
                                       default=serialize_everything))
                 self.set_header("Cache-Control", "no-store")
                 self.set_header("Content-Type", "application/json")
                 return
             if not re.match(r"^[A-Za-z0-9/\-]*$", spot.de_call):
|
if not re.match(r"^[A-Za-z0-9/\-]*$", spot.de_call):
|
||||||
self.set_status(422)
|
self.set_status(422)
|
||||||
self.write(json.dumps("Error - '" + spot.de_call + "' does not look like a valid callsign.",
|
self.write(json.dumps("Error - '" + spot.de_call + "' does not look like a valid callsign.",
|
||||||
default=serialize_everything))
|
default=serialize_everything))
|
||||||
self.set_header("Cache-Control", "no-store")
|
self.set_header("Cache-Control", "no-store")
|
||||||
self.set_header("Content-Type", "application/json")
|
self.set_header("Content-Type", "application/json")
|
||||||
return
|
return
|
||||||
|
|
||||||
# Reject if frequency not in a known band
|
# Reject if frequency not in a known band
|
||||||
if lookup_helper.infer_band_from_freq(spot.freq) == UNKNOWN_BAND:
|
if infer_band_from_freq(spot.freq) == UNKNOWN_BAND:
|
||||||
self.set_status(422)
|
self.set_status(422)
|
||||||
self.write(json.dumps("Error - Frequency of " + str(spot.freq / 1000.0) + "kHz is not in a known band.",
|
self.write(json.dumps("Error - Frequency of " + str(spot.freq / 1000.0) + "kHz is not in a known band.",
|
||||||
default=serialize_everything))
|
default=serialize_everything))
|
||||||
self.set_header("Cache-Control", "no-store")
|
self.set_header("Cache-Control", "no-store")
|
||||||
self.set_header("Content-Type", "application/json")
|
self.set_header("Content-Type", "application/json")
|
||||||
return
|
return
|
||||||
@@ -108,7 +111,7 @@ class APISpotHandler(tornado.web.RequestHandler):
|
|||||||
spot.dx_grid.upper()):
|
spot.dx_grid.upper()):
|
||||||
self.set_status(422)
|
self.set_status(422)
|
||||||
self.write(json.dumps("Error - '" + spot.dx_grid + "' does not look like a valid Maidenhead grid.",
|
self.write(json.dumps("Error - '" + spot.dx_grid + "' does not look like a valid Maidenhead grid.",
|
||||||
default=serialize_everything))
|
default=serialize_everything))
|
||||||
self.set_header("Cache-Control", "no-store")
|
self.set_header("Cache-Control", "no-store")
|
||||||
self.set_header("Content-Type", "application/json")
|
self.set_header("Content-Type", "application/json")
|
||||||
return
|
return
|
||||||
@@ -127,7 +130,7 @@ class APISpotHandler(tornado.web.RequestHandler):
|
|||||||
# infer missing data, and add it to our database.
|
# infer missing data, and add it to our database.
|
||||||
spot.source = "API"
|
spot.source = "API"
|
||||||
spot.infer_missing()
|
spot.infer_missing()
|
||||||
self.spots.add(spot.id, spot, expire=MAX_SPOT_AGE)
|
self._spots.add(spot.id, spot, expire=MAX_SPOT_AGE)
|
||||||
|
|
||||||
self.write(json.dumps("OK", default=serialize_everything))
|
self.write(json.dumps("OK", default=serialize_everything))
|
||||||
self.set_status(201)
|
self.set_status(201)
|
||||||
|
|||||||
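The handler above validates callsigns with the regex `^[A-Za-z0-9/\-]*$`. A quick illustration of what that pattern accepts and rejects (the sample callsigns are illustrative, not from the repo); note the `*` quantifier also matches the empty string, which is why the earlier `not spot.dx_call` check is still needed:

```python
import re

CALLSIGN_RE = re.compile(r"^[A-Za-z0-9/\-]*$")

assert CALLSIGN_RE.match("M0ABC")        # plain callsign
assert CALLSIGN_RE.match("EA8/G4ABC/P")  # prefix and suffix separated by slashes
assert not CALLSIGN_RE.match("M0 ABC")   # whitespace rejected
assert not CALLSIGN_RE.match("M0ABC!")   # punctuation rejected
assert CALLSIGN_RE.match("")             # '*' accepts empty string; caught earlier by `not spot.dx_call`
```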
@@ -8,24 +8,25 @@ import tornado
 import tornado_eventsource.handler

 from core.prometheus_metrics_handler import api_requests_counter
-from core.utils import serialize_everything
+from core.utils import serialize_everything, empty_queue

 SSE_HANDLER_MAX_QUEUE_SIZE = 100
 SSE_HANDLER_QUEUE_CHECK_INTERVAL = 5000


-# API request handler for /api/v1/alerts
 class APIAlertsHandler(tornado.web.RequestHandler):
+    """API request handler for /api/v1/alerts"""

     def initialize(self, alerts, web_server_metrics):
-        self.alerts = alerts
-        self.web_server_metrics = web_server_metrics
+        self._alerts = alerts
+        self._web_server_metrics = web_server_metrics

     def get(self):
         try:
             # Metrics
-            self.web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
-            self.web_server_metrics["api_access_counter"] += 1
-            self.web_server_metrics["status"] = "OK"
+            self._web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
+            self._web_server_metrics["api_access_counter"] += 1
+            self._web_server_metrics["status"] = "OK"
             api_requests_counter.inc()

             # request.arguments contains lists for each param key because technically the client can supply multiple,
@@ -33,7 +34,7 @@ class APIAlertsHandler(tornado.web.RequestHandler):
             query_params = {k: v[0].decode("utf-8") for k, v in self.request.arguments.items()}

             # Fetch all alerts matching the query
-            data = get_alert_list_with_filters(self.alerts, query_params)
+            data = get_alert_list_with_filters(self._alerts, query_params)
             self.write(json.dumps(data, default=serialize_everything))
             self.set_status(200)
         except ValueError as e:
@@ -47,74 +48,82 @@ class APIAlertsHandler(tornado.web.RequestHandler):
         self.set_header("Cache-Control", "no-store")
         self.set_header("Content-Type", "application/json")

-# API request handler for /api/v1/alerts/stream
-class APIAlertsStreamHandler(tornado_eventsource.handler.EventSourceHandler):
-    def initialize(self, sse_alert_queues, web_server_metrics):
-        self.sse_alert_queues = sse_alert_queues
-        self.web_server_metrics = web_server_metrics

-    # Custom headers to avoid e.g. nginx reverse proxy from buffering SSE data
+class APIAlertsStreamHandler(tornado_eventsource.handler.EventSourceHandler):
+    """API request handler for /api/v1/alerts/stream"""
+
+    def initialize(self, sse_alert_queues, web_server_metrics):
+        self._sse_alert_queues = sse_alert_queues
+        self._web_server_metrics = web_server_metrics
+
     def custom_headers(self):
+        """Custom headers to avoid e.g. nginx reverse proxy from buffering SSE data"""
+
         return {"Cache-Control": "no-store",
                 "X-Accel-Buffering": "no"}

     def open(self):
         try:
             # Metrics
-            self.web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
-            self.web_server_metrics["api_access_counter"] += 1
-            self.web_server_metrics["status"] = "OK"
+            self._web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
+            self._web_server_metrics["api_access_counter"] += 1
+            self._web_server_metrics["status"] = "OK"
             api_requests_counter.inc()

             # request.arguments contains lists for each param key because technically the client can supply multiple,
             # reduce that to just the first entry, and convert bytes to string
-            self.query_params = {k: v[0].decode("utf-8") for k, v in self.request.arguments.items()}
+            self._query_params = {k: v[0].decode("utf-8") for k, v in self.request.arguments.items()}

             # Create a alert queue and add it to the web server's list. The web server will fill this when alerts arrive
-            self.alert_queue = Queue(maxsize=SSE_HANDLER_MAX_QUEUE_SIZE)
-            self.sse_alert_queues.append(self.alert_queue)
+            self._alert_queue = Queue(maxsize=SSE_HANDLER_MAX_QUEUE_SIZE)
+            self._sse_alert_queues.append(self._alert_queue)

             # Set up a timed callback to check if anything is in the queue
-            self.heartbeat = tornado.ioloop.PeriodicCallback(self._callback, SSE_HANDLER_QUEUE_CHECK_INTERVAL)
-            self.heartbeat.start()
+            self._heartbeat = tornado.ioloop.PeriodicCallback(self._callback, SSE_HANDLER_QUEUE_CHECK_INTERVAL)
+            self._heartbeat.start()

         except Exception as e:
-            logging.warn("Exception when serving SSE socket", e)
+            logging.warning("Exception when serving SSE socket", e)

-    # When the user closes the socket, empty our queue and remove it from the list so the server no longer fills it
     def close(self):
+        """When the user closes the socket, empty our queue and remove it from the list so the server no longer fills it"""
+
         try:
-            if self.alert_queue in self.sse_alert_queues:
-                self.sse_alert_queues.remove(self.alert_queue)
-            self.alert_queue.empty()
+            if self._alert_queue in self._sse_alert_queues:
+                self._sse_alert_queues.remove(self._alert_queue)
+            empty_queue(self._alert_queue)
         except:
             pass
-        self.alert_queue = None
+        try:
+            self._heartbeat.stop()
+        except:
+            pass
+        self._alert_queue = None
         super().close()

-    # Callback to check if anything has arrived in the queue, and if so send it to the client
     def _callback(self):
+        """Callback to check if anything has arrived in the queue, and if so send it to the client"""
+
         try:
-            if self.alert_queue:
-                while not self.alert_queue.empty():
-                    alert = self.alert_queue.get()
+            if self._alert_queue:
+                while not self._alert_queue.empty():
+                    alert = self._alert_queue.get()
                     # If the new alert matches our param filters, send it to the client. If not, ignore it.
-                    if alert_allowed_by_query(alert, self.query_params):
+                    if alert_allowed_by_query(alert, self._query_params):
                         self.write_message(msg=json.dumps(alert, default=serialize_everything))

-            if self.alert_queue not in self.sse_alert_queues:
+            if self._alert_queue not in self._sse_alert_queues:
                 logging.error("Web server cleared up a queue of an active connection!")
                 self.close()
         except:
-            logging.warn("Exception in SSE callback, connection will be closed.")
+            logging.warning("Exception in SSE callback, connection will be closed.")
             self.close()


-# Utility method to apply filters to the overall alert list and return only a subset. Enables query parameters in
-# the main "alerts" GET call.
 def get_alert_list_with_filters(all_alerts, query):
+    """Utility method to apply filters to the overall alert list and return only a subset. Enables query parameters in
+    the main "alerts" GET call."""

     # Create a shallow copy of the alert list ordered by start time, then filter the list to reduce it only to alerts
     # that match the filter parameters in the query string. Finally, apply a limit to the number of alerts returned.
     # The list of query string filters is defined in the API docs.
@@ -130,9 +139,11 @@ def get_alert_list_with_filters(all_alerts, query):
         alerts = alerts[:int(query.get("limit"))]
     return alerts

-# Given URL query params and an alert, figure out if the alert "passes" the requested filters or is rejected. The list
-# of query parameters and their function is defined in the API docs.
 def alert_allowed_by_query(alert, query):
+    """Given URL query params and an alert, figure out if the alert "passes" the requested filters or is rejected. The list
+    of query parameters and their function is defined in the API docs."""

     for k in query.keys():
         match k:
             case "received_since":
@@ -144,8 +155,8 @@ def alert_allowed_by_query(alert, query):
                 # Check the duration if end_time is provided. If end_time is not provided, assume the activation is
                 # "short", i.e. it always passes this check. If dxpeditions_skip_max_duration_check is true and
                 # the alert is a dxpedition, it also always passes the check.
-                if alert.is_dxpedition and (bool(query.get(
-                        "dxpeditions_skip_max_duration_check")) if "dxpeditions_skip_max_duration_check" in query.keys() else False):
+                if alert.is_dxpedition and (query.get(
+                        "dxpeditions_skip_max_duration_check").upper() == "TRUE" if "dxpeditions_skip_max_duration_check" in query.keys() else False):
                     continue
                 if alert.end_time and alert.start_time and alert.end_time - alert.start_time > max_duration:
                     return False
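The last hunk above fixes a classic query-string pitfall: URL parameter values are strings, and `bool()` on any non-empty string is `True`, so the old code treated `?dxpeditions_skip_max_duration_check=false` as true. A minimal demonstration of the bug and the corrected comparison (the `query_flag` helper name is illustrative):

```python
# bool() on a non-empty string is always True -- this is why the old check was wrong
assert bool("false") is True
assert bool("") is False

def query_flag(value):
    """Treat only a literal 'true' (case-insensitive) as true, as the fixed code does."""
    return value.upper() == "TRUE"

assert query_flag("true") and query_flag("True") and query_flag("TRUE")
assert not query_flag("false") and not query_flag("0") and not query_flag("yes")
```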
server/handlers/api/dxstats.py (new file, 49 lines)
@@ -0,0 +1,49 @@
+import json
+from collections import Counter
+from datetime import datetime, timedelta
+
+import pytz
+import tornado
+
+from core.prometheus_metrics_handler import api_requests_counter
+
+CONTINENTS = ["EU", "NA", "SA", "AS", "AF", "OC", "AN"]
+BANDS = ["160m", "80m", "60m", "40m", "30m", "20m", "17m", "15m", "12m", "10m", "6m"]
+CONTINENTS_SET = frozenset(CONTINENTS)
+BANDS_SET = frozenset(BANDS)
+
+
+class APIDxStatsHandler(tornado.web.RequestHandler):
+    """API request handler for /api/v1/dxstats"""
+
+    def initialize(self, spots, web_server_metrics):
+        self._spots = spots
+        self._web_server_metrics = web_server_metrics
+
+    def get(self):
+        self._web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
+        self._web_server_metrics["api_access_counter"] += 1
+        self._web_server_metrics["status"] = "OK"
+        api_requests_counter.inc()
+
+        one_hour_ago = (datetime.now(pytz.UTC) - timedelta(hours=1)).timestamp()
+        counts = Counter()
+
+        for key in self._spots.iterkeys():
+            spot = self._spots.get(key)
+            if spot is None:
+                continue
+            if not spot.time or spot.time < one_hour_ago:
+                continue
+            if spot.de_continent in CONTINENTS_SET and spot.dx_continent in CONTINENTS_SET and spot.band in BANDS_SET:
+                counts[spot.de_continent, spot.dx_continent, spot.band] += 1
+
+        result = {
+            de: {dx: {band: counts[de, dx, band] for band in BANDS} for dx in CONTINENTS}
+            for de in CONTINENTS
+        }
+
+self.write(json.dumps(result))
+        self.set_status(200)
+        self.set_header("Cache-Control", "no-store")
+        self.set_header("Content-Type", "application/json")
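The new dxstats handler leans on `Counter`'s default-to-zero behaviour: it counts `(de_continent, dx_continent, band)` triples, then a nested comprehension expands them into a fully populated structure with zeros for unseen combinations. A small self-contained sketch of the same pattern (lists trimmed and spot data invented for illustration):

```python
from collections import Counter

CONTINENTS = ["EU", "NA"]   # trimmed for illustration
BANDS = ["40m", "20m"]

# Count (de_continent, dx_continent, band) triples, as the handler does
counts = Counter()
spots = [("EU", "NA", "20m"), ("EU", "NA", "20m"), ("NA", "EU", "40m")]
for de, dx, band in spots:
    counts[de, dx, band] += 1

# Missing keys read as 0 from a Counter, so every cell of the nested
# dict is populated even when no spot matched that combination
result = {de: {dx: {band: counts[de, dx, band] for band in BANDS} for dx in CONTINENTS}
          for de in CONTINENTS}

assert result["EU"]["NA"]["20m"] == 2
assert result["NA"]["EU"]["40m"] == 1
assert result["EU"]["EU"]["20m"] == 0
```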
@@ -7,6 +7,7 @@ import pytz
 import tornado

 from core.constants import SIGS
+from core.geo_utils import lat_lon_for_grid_sw_corner_plus_size, lat_lon_to_cq_zone, lat_lon_to_itu_zone
 from core.prometheus_metrics_handler import api_requests_counter
 from core.sig_utils import get_ref_regex_for_sig, populate_sig_ref_info
 from core.utils import serialize_everything
@@ -14,17 +15,18 @@ from data.sig_ref import SIGRef
 from data.spot import Spot


-# API request handler for /api/v1/lookup/call
 class APILookupCallHandler(tornado.web.RequestHandler):
+    """API request handler for /api/v1/lookup/call"""

     def initialize(self, web_server_metrics):
-        self.web_server_metrics = web_server_metrics
+        self._web_server_metrics = web_server_metrics

     def get(self):
         try:
             # Metrics
-            self.web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
-            self.web_server_metrics["api_access_counter"] += 1
-            self.web_server_metrics["status"] = "OK"
+            self._web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
+            self._web_server_metrics["api_access_counter"] += 1
+            self._web_server_metrics["status"] = "OK"
             api_requests_counter.inc()

             # request.arguments contains lists for each param key because technically the client can supply multiple,
@@ -73,17 +75,18 @@ class APILookupCallHandler(tornado.web.RequestHandler):
             self.set_header("Content-Type", "application/json")


-# API request handler for /api/v1/lookup/sigref
 class APILookupSIGRefHandler(tornado.web.RequestHandler):
+    """API request handler for /api/v1/lookup/sigref"""

     def initialize(self, web_server_metrics):
-        self.web_server_metrics = web_server_metrics
+        self._web_server_metrics = web_server_metrics

     def get(self):
         try:
             # Metrics
-            self.web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
-            self.web_server_metrics["api_access_counter"] += 1
-            self.web_server_metrics["status"] = "OK"
+            self._web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
+            self._web_server_metrics["api_access_counter"] += 1
+            self._web_server_metrics["status"] = "OK"
             api_requests_counter.inc()

             # request.arguments contains lists for each param key because technically the client can supply multiple,
@@ -94,15 +97,15 @@ class APILookupSIGRefHandler(tornado.web.RequestHandler):
             # the provided id must match it.
             if "sig" in query_params.keys() and "id" in query_params.keys():
                 sig = query_params.get("sig").upper()
-                id = query_params.get("id").upper()
+                ref_id = query_params.get("id").upper()
                 if sig in list(map(lambda p: p.name, SIGS)):
-                    if not get_ref_regex_for_sig(sig) or re.match(get_ref_regex_for_sig(sig), id):
-                        data = populate_sig_ref_info(SIGRef(id=id, sig=sig))
+                    if not get_ref_regex_for_sig(sig) or re.match(get_ref_regex_for_sig(sig), ref_id):
+                        data = populate_sig_ref_info(SIGRef(id=ref_id, sig=sig))
                         self.write(json.dumps(data, default=serialize_everything))

                     else:
                         self.write(
-                            json.dumps("Error - '" + id + "' does not look like a valid reference ID for " + sig + ".",
+                            json.dumps("Error - '" + ref_id + "' does not look like a valid reference ID for " + sig + ".",
                                        default=serialize_everything))
                         self.set_status(422)
                 else:
@@ -119,3 +122,61 @@ class APILookupSIGRefHandler(tornado.web.RequestHandler):

         self.set_header("Cache-Control", "no-store")
         self.set_header("Content-Type", "application/json")
+
+
+class APILookupGridHandler(tornado.web.RequestHandler):
+    """API request handler for /api/v1/lookup/grid"""
+
+    def initialize(self, web_server_metrics):
+        self._web_server_metrics = web_server_metrics
+
+    def get(self):
+        try:
+            # Metrics
+            self._web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
+            self._web_server_metrics["api_access_counter"] += 1
+            self._web_server_metrics["status"] = "OK"
+            api_requests_counter.inc()
+
+            # request.arguments contains lists for each param key because technically the client can supply multiple,
+            # reduce that to just the first entry, and convert bytes to string
+            query_params = {k: v[0].decode("utf-8") for k, v in self.request.arguments.items()}
+
+            # "grid" query param must exist.
+            if "grid" in query_params.keys():
+                grid = query_params.get("grid").upper()
+                lat, lon, lat_cell_size, lon_cell_size = lat_lon_for_grid_sw_corner_plus_size(grid)
+                if lat is not None and lon is not None and lat_cell_size is not None and lon_cell_size is not None:
+                    center_lat = lat + lat_cell_size / 2.0
+                    center_lon = lon + lon_cell_size / 2.0
+                    center_cq_zone = lat_lon_to_cq_zone(center_lat, center_lon)
+                    center_itu_zone = lat_lon_to_itu_zone(center_lat, center_lon)
+
+                    response = {
+                        "center": {
+                            "latitude": center_lat,
+                            "longitude": center_lon,
+                            "cq_zone": center_cq_zone,
+                            "itu_zone": center_itu_zone
+                        },
+                        "southwest": {
+                            "latitude": lat,
+                            "longitude": lon,
+                        },
+                        "northeast": {
+                            "latitude": lat + lat_cell_size,
+                            "longitude": lon + lon_cell_size,
+                        }}
+                    self.write(json.dumps(response, default=serialize_everything))
+
+            else:
+                self.write(json.dumps("Error - grid must be provided", default=serialize_everything))
+                self.set_status(422)
+
+        except Exception as e:
+            logging.error(e)
+            self.write(json.dumps("Error - " + str(e), default=serialize_everything))
+            self.set_status(500)
+
+        self.set_header("Cache-Control", "no-store")
+        self.set_header("Content-Type", "application/json")
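The new grid handler builds its response from a south-west corner plus cell size, the usual way Maidenhead locators are decoded. The internals of `lat_lon_for_grid_sw_corner_plus_size` are not shown in the diff; as a sketch, here is what a decode of a 4-character grid looks like under those assumptions (the real helper in `core.geo_utils` also handles longer grids):

```python
def sw_corner_plus_size(grid):
    """Sketch: decode a 4-character Maidenhead grid to its SW corner
    plus cell size in degrees. Illustrative stand-in, not the repo's helper."""
    grid = grid.upper()
    # Field: 20 degrees of longitude, 10 of latitude; square: 2 and 1 degrees
    lon = (ord(grid[0]) - ord('A')) * 20 - 180 + int(grid[2]) * 2
    lat = (ord(grid[1]) - ord('A')) * 10 - 90 + int(grid[3])
    return lat, lon, 1.0, 2.0  # SW corner, then lat/lon cell sizes

lat, lon, lat_size, lon_size = sw_corner_plus_size("IO90")
assert (lat, lon) == (50, -2)
# Centre of the square, computed exactly as the handler does:
assert (lat + lat_size / 2.0, lon + lon_size / 2.0) == (50.5, -1.0)
```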
@@ -4,23 +4,24 @@ from datetime import datetime
 import pytz
 import tornado

-from core.config import MAX_SPOT_AGE, ALLOW_SPOTTING, WEB_UI_OPTIONS
+from core.config import MAX_SPOT_AGE, ALLOW_SPOTTING
 from core.constants import BANDS, ALL_MODES, MODE_TYPES, SIGS, CONTINENTS
 from core.prometheus_metrics_handler import api_requests_counter
 from core.utils import serialize_everything


-# API request handler for /api/v1/options
 class APIOptionsHandler(tornado.web.RequestHandler):
+    """API request handler for /api/v1/options"""

     def initialize(self, status_data, web_server_metrics):
-        self.status_data = status_data
-        self.web_server_metrics = web_server_metrics
+        self._status_data = status_data
+        self._web_server_metrics = web_server_metrics

     def get(self):
         # Metrics
-        self.web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
-        self.web_server_metrics["api_access_counter"] += 1
-        self.web_server_metrics["status"] = "OK"
+        self._web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
+        self._web_server_metrics["api_access_counter"] += 1
+        self._web_server_metrics["status"] = "OK"
         api_requests_counter.inc()

         options = {"bands": BANDS,
@@ -29,18 +30,16 @@ class APIOptionsHandler(tornado.web.RequestHandler):
                    "sigs": SIGS,
                    # Spot/alert sources are filtered for only ones that are enabled in config, no point letting the user toggle things that aren't even available.
                    "spot_sources": list(
-                       map(lambda p: p["name"], filter(lambda p: p["enabled"], self.status_data["spot_providers"]))),
+                       map(lambda p: p["name"], filter(lambda p: p["enabled"], self._status_data["spot_providers"]))),
                    "alert_sources": list(
-                       map(lambda p: p["name"], filter(lambda p: p["enabled"], self.status_data["alert_providers"]))),
+                       map(lambda p: p["name"], filter(lambda p: p["enabled"], self._status_data["alert_providers"]))),
                    "continents": CONTINENTS,
                    "max_spot_age": MAX_SPOT_AGE,
-                   "spot_allowed": ALLOW_SPOTTING,
-                   "web-ui-options": WEB_UI_OPTIONS}
+                   "spot_allowed": ALLOW_SPOTTING}

         # If spotting to this server is enabled, "API" is another valid spot source even though it does not come from
         # one of our proviers.
         if ALLOW_SPOTTING:
             options["spot_sources"].append("API")
-            options["web-ui-options"]["spot-providers-enabled-by-default"].append("API")

         self.write(json.dumps(options, default=serialize_everything))
         self.set_status(200)
server/handlers/api/solar_conditions.py (new file, 28 lines)
@@ -0,0 +1,28 @@
+import json
+from datetime import datetime
+
+import pytz
+import tornado
+
+from core.prometheus_metrics_handler import api_requests_counter
+from core.utils import serialize_everything
+
+
+class APISolarConditionsHandler(tornado.web.RequestHandler):
+    """API request handler for /api/v1/solar"""
+
+    def initialize(self, solar_conditions, web_server_metrics):
+        self._solar_conditions = solar_conditions
+        self._web_server_metrics = web_server_metrics
+
+    def get(self):
+        # Metrics
+        self._web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
+        self._web_server_metrics["api_access_counter"] += 1
+        self._web_server_metrics["status"] = "OK"
+        api_requests_counter.inc()
+
+        self.write(self._solar_conditions.to_json())
+        self.set_status(200)
+        self.set_header("Cache-Control", "no-store")
+        self.set_header("Content-Type", "application/json")
@@ -8,24 +8,25 @@ import tornado
 import tornado_eventsource.handler
 
 from core.prometheus_metrics_handler import api_requests_counter
-from core.utils import serialize_everything
+from core.utils import serialize_everything, empty_queue
 
 SSE_HANDLER_MAX_QUEUE_SIZE = 1000
 SSE_HANDLER_QUEUE_CHECK_INTERVAL = 5000
 
 
-# API request handler for /api/v1/spots
 class APISpotsHandler(tornado.web.RequestHandler):
+    """API request handler for /api/v1/spots"""
+
     def initialize(self, spots, web_server_metrics):
-        self.spots = spots
-        self.web_server_metrics = web_server_metrics
+        self._spots = spots
+        self._web_server_metrics = web_server_metrics
 
     def get(self):
         try:
             # Metrics
-            self.web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
-            self.web_server_metrics["api_access_counter"] += 1
-            self.web_server_metrics["status"] = "OK"
+            self._web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
+            self._web_server_metrics["api_access_counter"] += 1
+            self._web_server_metrics["status"] = "OK"
             api_requests_counter.inc()
 
             # request.arguments contains lists for each param key because technically the client can supply multiple,
@@ -33,7 +34,7 @@ class APISpotsHandler(tornado.web.RequestHandler):
             query_params = {k: v[0].decode("utf-8") for k, v in self.request.arguments.items()}
 
             # Fetch all spots matching the query
-            data = get_spot_list_with_filters(self.spots, query_params)
+            data = get_spot_list_with_filters(self._spots, query_params)
             self.write(json.dumps(data, default=serialize_everything))
             self.set_status(200)
         except ValueError as e:
@@ -48,74 +49,83 @@ class APISpotsHandler(tornado.web.RequestHandler):
         self.set_header("Content-Type", "application/json")
 
 
-# API request handler for /api/v1/spots/stream
 class APISpotsStreamHandler(tornado_eventsource.handler.EventSourceHandler):
-    def initialize(self, sse_spot_queues, web_server_metrics):
-        self.sse_spot_queues = sse_spot_queues
-        self.web_server_metrics = web_server_metrics
+    """API request handler for /api/v1/spots/stream"""
+
+    def initialize(self, sse_spot_queues, web_server_metrics):
+        self._sse_spot_queues = sse_spot_queues
+        self._web_server_metrics = web_server_metrics
 
-    # Custom headers to avoid e.g. nginx reverse proxy from buffering SSE data
     def custom_headers(self):
+        """Custom headers to avoid e.g. nginx reverse proxy from buffering SSE data"""
+
         return {"Cache-Control": "no-store",
                 "X-Accel-Buffering": "no"}
 
-    # Called once on the client opening a connection, set things up
     def open(self):
+        """Called once on the client opening a connection, set things up"""
+
        try:
             # Metrics
-            self.web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
-            self.web_server_metrics["api_access_counter"] += 1
-            self.web_server_metrics["status"] = "OK"
+            self._web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
+            self._web_server_metrics["api_access_counter"] += 1
+            self._web_server_metrics["status"] = "OK"
             api_requests_counter.inc()
 
             # request.arguments contains lists for each param key because technically the client can supply multiple,
             # reduce that to just the first entry, and convert bytes to string
-            self.query_params = {k: v[0].decode("utf-8") for k, v in self.request.arguments.items()}
+            self._query_params = {k: v[0].decode("utf-8") for k, v in self.request.arguments.items()}
 
             # Create a spot queue and add it to the web server's list. The web server will fill this when spots arrive
-            self.spot_queue = Queue(maxsize=SSE_HANDLER_MAX_QUEUE_SIZE)
-            self.sse_spot_queues.append(self.spot_queue)
+            self._spot_queue = Queue(maxsize=SSE_HANDLER_MAX_QUEUE_SIZE)
+            self._sse_spot_queues.append(self._spot_queue)
 
             # Set up a timed callback to check if anything is in the queue
-            self.heartbeat = tornado.ioloop.PeriodicCallback(self._callback, SSE_HANDLER_QUEUE_CHECK_INTERVAL)
-            self.heartbeat.start()
+            self._heartbeat = tornado.ioloop.PeriodicCallback(self._callback, SSE_HANDLER_QUEUE_CHECK_INTERVAL)
+            self._heartbeat.start()
 
         except Exception as e:
-            logging.warn("Exception when serving SSE socket", e)
+            logging.warning("Exception when serving SSE socket", e)
 
-    # When the user closes the socket, empty our queue and remove it from the list so the server no longer fills it
     def close(self):
+        """When the user closes the socket, empty our queue and remove it from the list so the server no longer fills it"""
+
         try:
-            if self.spot_queue in self.sse_spot_queues:
-                self.sse_spot_queues.remove(self.spot_queue)
-            self.spot_queue.empty()
+            if self._spot_queue in self._sse_spot_queues:
+                self._sse_spot_queues.remove(self._spot_queue)
+            empty_queue(self._spot_queue)
         except:
             pass
-        self.spot_queue = None
+        try:
+            self._heartbeat.stop()
+        except:
+            pass
+        self._spot_queue = None
         super().close()
 
-    # Callback to check if anything has arrived in the queue, and if so send it to the client
     def _callback(self):
+        """Callback to check if anything has arrived in the queue, and if so send it to the client"""
+
         try:
-            if self.spot_queue:
-                while not self.spot_queue.empty():
-                    spot = self.spot_queue.get()
+            if self._spot_queue:
+                while not self._spot_queue.empty():
+                    spot = self._spot_queue.get()
                     # If the new spot matches our param filters, send it to the client. If not, ignore it.
-                    if spot_allowed_by_query(spot, self.query_params):
+                    if spot_allowed_by_query(spot, self._query_params):
                         self.write_message(msg=json.dumps(spot, default=serialize_everything))
 
-            if self.spot_queue not in self.sse_spot_queues:
+            if self._spot_queue not in self._sse_spot_queues:
                 logging.error("Web server cleared up a queue of an active connection!")
                 self.close()
         except:
-            logging.warn("Exception in SSE callback, connection will be closed.")
+            logging.warning("Exception in SSE callback, connection will be closed.")
             self.close()
 
 
 
-# Utility method to apply filters to the overall spot list and return only a subset. Enables query parameters in
-# the main "spots" GET call.
 def get_spot_list_with_filters(all_spots, query):
+    """Utility method to apply filters to the overall spot list and return only a subset. Enables query parameters in
+    the main "spots" GET call."""
+
     # Create a shallow copy of the spot list, ordered by spot time, then filter the list to reduce it only to spots
     # that match the filter parameters in the query string. Finally, apply a limit to the number of spots returned.
     # The list of query string filters is defined in the API docs.
@@ -138,22 +148,24 @@ def get_spot_list_with_filters(all_spots, query):
     # duplicates are fine in the main spot list (e.g. different cluster spots of the same DX) this doesn't
     # work well for the other views.
     if "dedupe" in query.keys():
        dedupe = query.get("dedupe").upper() == "TRUE"
        if dedupe:
            spots_temp = []
            already_seen = []
            for s in spots:
                call_plus_ssid = s.dx_call + (s.dx_ssid if s.dx_ssid else "")
                if call_plus_ssid not in already_seen:
                    spots_temp.append(s)
                    already_seen.append(call_plus_ssid)
            spots = spots_temp
 
     return spots
 
-# Given URL query params and a spot, figure out if the spot "passes" the requested filters or is rejected. The list
-# of query parameters and their function is defined in the API docs.
+
 def spot_allowed_by_query(spot, query):
+    """Given URL query params and a spot, figure out if the spot "passes" the requested filters or is rejected. The list
+    of query parameters and their function is defined in the API docs."""
+
     for k in query.keys():
         match k:
             case "since":
@@ -229,7 +241,7 @@ def spot_allowed_by_query(spot, query):
             case "allow_qrt":
                 # If false, spots that are flagged as QRT are not returned.
                 prevent_qrt = query.get(k).upper() == "FALSE"
-                if prevent_qrt and spot.qrt and spot.qrt == True:
+                if prevent_qrt and spot.qrt:
                     return False
             case "needs_good_location":
                 # If true, spots require a "good" location to be returned
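The `dedupe` branch in `get_spot_list_with_filters` keeps only the first spot seen for each callsign+SSID combination, preserving the (time-sorted) order of the list. The same logic in isolation, with a minimal stand-in `Spot` class rather than the project's real spot model:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Spot:
    # Minimal stand-in for Spothole's spot model; only the fields used here.
    dx_call: str
    dx_ssid: Optional[str] = None


def dedupe_spots(spots):
    # Keep only the first occurrence of each callsign+SSID combination,
    # preserving the original order of the input list.
    spots_temp = []
    already_seen = set()
    for s in spots:
        call_plus_ssid = s.dx_call + (s.dx_ssid if s.dx_ssid else "")
        if call_plus_ssid not in already_seen:
            spots_temp.append(s)
            already_seen.add(call_plus_ssid)
    return spots_temp


spots = [Spot("M0TRT"), Spot("M0TRT"), Spot("M0TRT", "-9"), Spot("G4ABC")]
print([s.dx_call + (s.dx_ssid or "") for s in dedupe_spots(spots)])
# → ['M0TRT', 'M0TRT-9', 'G4ABC']
```

The only liberty taken here is using a `set` for `already_seen` instead of the list in the diff; membership tests then stay O(1) as the spot list grows, without changing the result.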
@@ -8,20 +8,21 @@ from core.prometheus_metrics_handler import api_requests_counter
 from core.utils import serialize_everything
 
 
-# API request handler for /api/v1/status
 class APIStatusHandler(tornado.web.RequestHandler):
+    """API request handler for /api/v1/status"""
+
     def initialize(self, status_data, web_server_metrics):
-        self.status_data = status_data
-        self.web_server_metrics = web_server_metrics
+        self._status_data = status_data
+        self._web_server_metrics = web_server_metrics
 
     def get(self):
         # Metrics
-        self.web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
-        self.web_server_metrics["api_access_counter"] += 1
-        self.web_server_metrics["status"] = "OK"
+        self._web_server_metrics["last_api_access_time"] = datetime.now(pytz.UTC)
+        self._web_server_metrics["api_access_counter"] += 1
+        self._web_server_metrics["status"] = "OK"
         api_requests_counter.inc()
 
-        self.write(json.dumps(self.status_data, default=serialize_everything))
+        self.write(json.dumps(self._status_data, default=serialize_everything))
         self.set_status(200)
         self.set_header("Cache-Control", "no-store")
         self.set_header("Content-Type", "application/json")
@@ -4,8 +4,9 @@ from prometheus_client import CONTENT_TYPE_LATEST
 from core.prometheus_metrics_handler import get_metrics
 
 
-# Handler for Prometheus metrics endpoint
 class PrometheusMetricsHandler(tornado.web.RequestHandler):
+    """Handler for Prometheus metrics endpoint"""
+
     def get(self):
         self.write(get_metrics())
         self.set_status(200)
@@ -3,24 +3,25 @@ from datetime import datetime
 import pytz
 import tornado
 
-from core.config import ALLOW_SPOTTING
+from core.config import ALLOW_SPOTTING, WEB_UI_OPTIONS, BASE_URL
 from core.constants import SOFTWARE_VERSION
 from core.prometheus_metrics_handler import page_requests_counter
 
 
-# Handler for all HTML pages generated from templates
 class PageTemplateHandler(tornado.web.RequestHandler):
+    """Handler for all HTML pages generated from templates"""
+
     def initialize(self, template_name, web_server_metrics):
-        self.template_name = template_name
-        self.web_server_metrics = web_server_metrics
+        self._template_name = template_name
+        self._web_server_metrics = web_server_metrics
 
     def get(self):
         # Metrics
-        self.web_server_metrics["last_page_access_time"] = datetime.now(pytz.UTC)
-        self.web_server_metrics["page_access_counter"] += 1
-        self.web_server_metrics["status"] = "OK"
+        self._web_server_metrics["last_page_access_time"] = datetime.now(pytz.UTC)
+        self._web_server_metrics["page_access_counter"] += 1
+        self._web_server_metrics["status"] = "OK"
         page_requests_counter.inc()
 
         # Load named template, and provide variables used in templates
-        self.render(self.template_name + ".html", software_version=SOFTWARE_VERSION, allow_spotting=ALLOW_SPOTTING)
+        self.render(self._template_name + ".html", software_version=SOFTWARE_VERSION, allow_spotting=ALLOW_SPOTTING,
+                    web_ui_options=WEB_UI_OPTIONS, baseurl = BASE_URL, current_path=self.request.path)
@@ -5,27 +5,33 @@ import os
 import tornado
 from tornado.web import StaticFileHandler
 
+from core.utils import empty_queue
 from server.handlers.api.addspot import APISpotHandler
+from server.handlers.api.dxstats import APIDxStatsHandler
 from server.handlers.api.alerts import APIAlertsHandler, APIAlertsStreamHandler
-from server.handlers.api.lookups import APILookupCallHandler, APILookupSIGRefHandler
+from server.handlers.api.lookups import APILookupCallHandler, APILookupSIGRefHandler, APILookupGridHandler
 from server.handlers.api.options import APIOptionsHandler
+from server.handlers.api.solar_conditions import APISolarConditionsHandler
 from server.handlers.api.spots import APISpotsHandler, APISpotsStreamHandler
 from server.handlers.api.status import APIStatusHandler
 from server.handlers.metrics import PrometheusMetricsHandler
 from server.handlers.pagetemplate import PageTemplateHandler
 
 
-# Provides the public-facing web server.
 class WebServer:
-    # Constructor
-    def __init__(self, spots, alerts, status_data, port):
-        self.spots = spots
-        self.alerts = alerts
-        self.sse_spot_queues = []
-        self.sse_alert_queues = []
-        self.status_data = status_data
-        self.port = port
-        self.shutdown_event = asyncio.Event()
+    """Provides the public-facing web server."""
+
+    def __init__(self, spots, alerts, solar_conditions, status_data, port):
+        """Constructor"""
+
+        self._spots = spots
+        self._alerts = alerts
+        self._solar_conditions = solar_conditions
+        self._sse_spot_queues = []
+        self._sse_alert_queues = []
+        self._status_data = status_data
+        self._port = port
+        self._shutdown_event = asyncio.Event()
         self.web_server_metrics = {
             "last_page_access_time": None,
             "last_api_access_time": None,
@@ -34,36 +40,54 @@ class WebServer:
             "status": "Starting"
         }
 
-    # Start the web server
     def start(self):
-        asyncio.run(self.start_inner())
+        """Start the web server"""
+
+        asyncio.run(self._start_inner())
 
-    # Stop the web server
     def stop(self):
-        self.shutdown_event.set()
+        """Stop the web server"""
+
+        self._shutdown_event.set()
 
-    # Start method (async). Sets up the Tornado application.
-    async def start_inner(self):
+    async def _start_inner(self):
+        """Start method (async). Sets up the Tornado application."""
+
         app = tornado.web.Application([
             # Routes for API calls
-            (r"/api/v1/spots", APISpotsHandler, {"spots": self.spots, "web_server_metrics": self.web_server_metrics}),
-            (r"/api/v1/alerts", APIAlertsHandler, {"alerts": self.alerts, "web_server_metrics": self.web_server_metrics}),
-            (r"/api/v1/spots/stream", APISpotsStreamHandler, {"sse_spot_queues": self.sse_spot_queues, "web_server_metrics": self.web_server_metrics}),
-            (r"/api/v1/alerts/stream", APIAlertsStreamHandler, {"sse_alert_queues": self.sse_alert_queues, "web_server_metrics": self.web_server_metrics}),
-            (r"/api/v1/options", APIOptionsHandler, {"status_data": self.status_data, "web_server_metrics": self.web_server_metrics}),
-            (r"/api/v1/status", APIStatusHandler, {"status_data": self.status_data, "web_server_metrics": self.web_server_metrics}),
+            (r"/api/v1/spots", APISpotsHandler, {"spots": self._spots, "web_server_metrics": self.web_server_metrics}),
+            (r"/api/v1/alerts", APIAlertsHandler,
+             {"alerts": self._alerts, "web_server_metrics": self.web_server_metrics}),
+            (r"/api/v1/spots/stream", APISpotsStreamHandler,
+             {"sse_spot_queues": self._sse_spot_queues, "web_server_metrics": self.web_server_metrics}),
+            (r"/api/v1/alerts/stream", APIAlertsStreamHandler,
+             {"sse_alert_queues": self._sse_alert_queues, "web_server_metrics": self.web_server_metrics}),
+            (r"/api/v1/solar", APISolarConditionsHandler,
+             {"solar_conditions": self._solar_conditions, "web_server_metrics": self.web_server_metrics}),
+            (r"/api/v1/dxstats", APIDxStatsHandler, {"spots": self._spots, "web_server_metrics": self.web_server_metrics}),
+            (r"/api/v1/options", APIOptionsHandler,
+             {"status_data": self._status_data, "web_server_metrics": self.web_server_metrics}),
+            (r"/api/v1/status", APIStatusHandler,
+             {"status_data": self._status_data, "web_server_metrics": self.web_server_metrics}),
             (r"/api/v1/lookup/call", APILookupCallHandler, {"web_server_metrics": self.web_server_metrics}),
             (r"/api/v1/lookup/sigref", APILookupSIGRefHandler, {"web_server_metrics": self.web_server_metrics}),
-            (r"/api/v1/spot", APISpotHandler, {"spots": self.spots, "web_server_metrics": self.web_server_metrics}),
+            (r"/api/v1/lookup/grid", APILookupGridHandler, {"web_server_metrics": self.web_server_metrics}),
+            (r"/api/v1/spot", APISpotHandler, {"spots": self._spots, "web_server_metrics": self.web_server_metrics}),
             # Routes for templated pages
             (r"/", PageTemplateHandler, {"template_name": "spots", "web_server_metrics": self.web_server_metrics}),
             (r"/map", PageTemplateHandler, {"template_name": "map", "web_server_metrics": self.web_server_metrics}),
             (r"/bands", PageTemplateHandler, {"template_name": "bands", "web_server_metrics": self.web_server_metrics}),
-            (r"/alerts", PageTemplateHandler, {"template_name": "alerts", "web_server_metrics": self.web_server_metrics}),
-            (r"/add-spot", PageTemplateHandler, {"template_name": "add_spot", "web_server_metrics": self.web_server_metrics}),
-            (r"/status", PageTemplateHandler, {"template_name": "status", "web_server_metrics": self.web_server_metrics}),
+            (r"/alerts", PageTemplateHandler,
+             {"template_name": "alerts", "web_server_metrics": self.web_server_metrics}),
+            (r"/add-spot", PageTemplateHandler,
+             {"template_name": "add_spot", "web_server_metrics": self.web_server_metrics}),
+            (r"/conditions", PageTemplateHandler,
+             {"template_name": "conditions", "web_server_metrics": self.web_server_metrics}),
+            (r"/status", PageTemplateHandler,
+             {"template_name": "status", "web_server_metrics": self.web_server_metrics}),
             (r"/about", PageTemplateHandler, {"template_name": "about", "web_server_metrics": self.web_server_metrics}),
-            (r"/apidocs", PageTemplateHandler, {"template_name": "apidocs", "web_server_metrics": self.web_server_metrics}),
+            (r"/apidocs", PageTemplateHandler,
+             {"template_name": "apidocs", "web_server_metrics": self.web_server_metrics}),
             # Route for Prometheus metrics
             (r"/metrics", PrometheusMetricsHandler),
             # Default route to serve from "webassets"
@@ -71,13 +95,14 @@ class WebServer:
         ],
             template_path=os.path.join(os.path.dirname(__file__), "../templates"),
             debug=False)
-        app.listen(self.port)
-        await self.shutdown_event.wait()
+        app.listen(self._port)
+        await self._shutdown_event.wait()
 
-    # Internal method called when a new spot is added to the system. This is used to ping any SSE clients that are
-    # awaiting a server-sent message with new spots.
     def notify_new_spot(self, spot):
-        for queue in self.sse_spot_queues:
+        """Internal method called when a new spot is added to the system. This is used to ping any SSE clients that are
+        awaiting a server-sent message with new spots."""
+
+        for queue in self._sse_spot_queues:
             try:
                 queue.put(spot)
             except:
@@ -85,10 +110,11 @@ class WebServer:
                 pass
             pass
 
-    # Internal method called when a new alert is added to the system. This is used to ping any SSE clients that are
-    # awaiting a server-sent message with new spots.
     def notify_new_alert(self, alert):
-        for queue in self.sse_alert_queues:
+        """Internal method called when a new alert is added to the system. This is used to ping any SSE clients that are
+        awaiting a server-sent message with new spots."""
+
+        for queue in self._sse_alert_queues:
             try:
                 queue.put(alert)
             except:
@@ -96,24 +122,27 @@ class WebServer:
                 pass
             pass
 
-    # Clean up any SSE queues that are growing too large; probably their client disconnected and we didn't catch it
-    # properly for some reason.
     def clean_up_sse_queues(self):
-        for q in self.sse_spot_queues:
+        """Clean up any SSE queues that are growing too large; probably their client disconnected and we didn't catch it
+        properly for some reason."""
+
+        for q in self._sse_spot_queues:
             try:
                 if q.full():
-                    logging.warn("A full SSE spot queue was found, presumably because the client disconnected strangely. It has been removed.")
-                    self.sse_spot_queues.remove(q)
-                    q.empty()
+                    logging.warning(
+                        "A full SSE spot queue was found, presumably because the client disconnected strangely. It has been removed.")
+                    self._sse_spot_queues.remove(q)
+                    empty_queue(q)
             except:
                 # Probably got deleted already on another thread
                 pass
-        for q in self.sse_alert_queues:
+        for q in self._sse_alert_queues:
             try:
                 if q.full():
-                    logging.warn("A full SSE alert queue was found, presumably because the client disconnected strangely. It has been removed.")
-                    self.sse_alert_queues.remove(q)
-                    q.empty()
+                    logging.warning(
+                        "A full SSE alert queue was found, presumably because the client disconnected strangely. It has been removed.")
+                    self._sse_alert_queues.remove(q)
+                    empty_queue(q)
             except:
                 # Probably got deleted already on another thread
                 pass
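A recurring fix in these commits is replacing `q.empty()` with `empty_queue(q)`: on a `queue.Queue`, `empty()` only *reports* whether the queue is empty, it does not drain it, so the old code silently leaked queued spots. The implementation of `core.utils.empty_queue` is not shown in this view; a plausible drain loop (a sketch, not necessarily the project's actual code) is:

```python
import queue


def empty_queue(q):
    # Drain a queue.Queue without blocking. Note that Queue.empty() only
    # reports whether the queue is empty; it removes nothing, which is the
    # bug these commits fix.
    try:
        while True:
            q.get_nowait()
    except queue.Empty:
        pass


q = queue.Queue()
for i in range(5):
    q.put(i)
empty_queue(q)
print(q.qsize())  # → 0
```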
|
|||||||
113
solarconditionsproviders/hamqsl.py
Normal file
113
solarconditionsproviders/hamqsl.py
Normal file
@@ -0,0 +1,113 @@
|
|||||||
|
import logging
|
||||||
|
from xml.etree import ElementTree
|
||||||
|
|
||||||
|
import pytz
|
||||||
|
from dateutil import parser as dateutil_parser, tz as dateutil_tz
|
||||||
|
|
||||||
|
|
||||||
|
from solarconditionsproviders.http_solar_conditions_provider import HTTPSolarConditionsProvider
|
||||||
|
|
||||||
|
POLL_INTERVAL = 3600 # 1 hour
|
||||||
|
URL = "https://www.hamqsl.com/solarxml.php"
|
||||||
|
|
||||||
|
|
||||||
|
class HamQSL(HTTPSolarConditionsProvider):
|
||||||
|
"""Solar conditions provider using the HamQSL.com XML API (https://www.hamqsl.com/solarxml.php).
|
||||||
|
Provides solar flux index, geomagnetic indices, and HF/VHF propagation condition summaries."""
|
||||||
|
|
||||||
|
def __init__(self, provider_config):
|
||||||
|
super().__init__(provider_config, URL, POLL_INTERVAL)
|
||||||
|
|
||||||
|
def _http_response_to_solar_conditions(self, http_response):
|
||||||
|
if http_response.status_code != 200:
|
||||||
|
logging.warning("HamQSL solar conditions API returned HTTP " + str(http_response.status_code))
|
||||||
|
return None
|
||||||
|
|
||||||
|
root = ElementTree.fromstring(http_response.text)
|
||||||
|
sd = root.find("solardata")
|
||||||
|
if sd is None:
|
||||||
|
logging.warning("HamQSL solar conditions API returned unexpected XML structure")
|
||||||
|
return None
|
||||||
|
|
||||||
|
# Some error checking functions in case the data is janky.
|
||||||
|
|
||||||
|
def text(tag, default=None):
|
||||||
|
el = sd.find(tag)
|
||||||
|
return el.text.strip() if el is not None and el.text else default
|
||||||
|
|
||||||
|
def float_val(tag, default=None):
|
||||||
|
try:
|
||||||
|
return float(text(tag))
|
||||||
|
        except (ValueError, TypeError):
            return default

    def int_val(tag, default=None):
        try:
            return int(text(tag))
        except (ValueError, TypeError):
            return default

    # Process HF band conditions
    hf_conditions = {}
    calc = sd.find("calculatedconditions")
    if calc is not None:
        for band_el in calc.findall("band"):
            name = band_el.get("name")
            time = band_el.get("time")
            condition = band_el.text.strip() if band_el.text else None
            if name and time and condition:
                hf_conditions[f"{name}-{time}"] = condition

    # Process VHF propagation conditions
    vhf_map = {}
    vhf = sd.find("calculatedvhfconditions")
    if vhf is not None:
        for ph_el in vhf.findall("phenomenon"):
            key = (ph_el.get("name"), ph_el.get("location"))
            vhf_map[key] = ph_el.text.strip() if ph_el.text else None

    # Parse the "updated" timestamp string (format: "28 Mar 2026 0949 GMT") to UTC epoch seconds.
    updated = None
    updated_str = text("updated")
    if updated_str:
        try:
            tz_abbr = updated_str.split()[-1]
            timezone = dateutil_tz.gettz(tz_abbr)
            if timezone is None:
                raise ValueError("Unknown timezone abbreviation: " + tz_abbr)
            dt = dateutil_parser.parse(updated_str, tzinfos={tz_abbr: timezone})
            updated = dt.astimezone(pytz.UTC).timestamp()
        except (ValueError, IndexError):
            logging.warning("HamQSL solar conditions API returned unrecognised timestamp format: " + updated_str)

    # Return the data ready to be put into the solar conditions object.
    return {
        "updated": updated,
        "sfi": int_val("solarflux"),
        "a_index": int_val("aindex"),
        "k_index": int_val("kindex"),
        "x_ray": text("xray"),
        "sunspots": int_val("sunspots"),
        "proton_flux": int_val("protonflux"),
        "electron_flux": int_val("electonflux"),  # (sic) the HamQSL XML tag really is spelled "electonflux"
        "aurora": int_val("aurora"),
        "aurora_latitude": float_val("latdegree"),
        "solar_wind": float_val("solarwind"),
        "magnetic_field": float_val("magneticfield"),
        "geomag_field": text("geomagfield").title()
        .replace("Vr Quiet", "Very Quiet")
        .replace("Unsettld", "Unsettled")
        .replace("Min Strm", "Minor Storm")
        .replace("Maj Strm", "Major Storm")
        .replace("Sev Strm", "Severe Storm")
        .replace("Ext Strm", "Extreme Storm"),
        "geomag_noise": text("signalnoise"),
        "hf_conditions": hf_conditions,
        "vhf_conditions": {
            # Guard against a missing key or empty text before calling .title()
            "vhf_aurora_northern_hemi": (vhf_map.get(("vhf-aurora", "northern_hemi")) or "").title().replace(
                "Lat Aur", "Latitude") or None,
            "es_2m_europe": vhf_map.get(("E-Skip", "europe")),
            "es_4m_europe": vhf_map.get(("E-Skip", "europe_4m")),
            "es_6m_europe": vhf_map.get(("E-Skip", "europe_6m")),
            "es_2m_na": vhf_map.get(("E-Skip", "north_america")),
        },
    }
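The "updated" parsing above leans on dateutil to resolve the timezone abbreviation before converting to a UTC epoch. A minimal stdlib-only sketch of the same idea, assuming the feed's usual "GMT" suffix (the helper name and behaviour on other suffixes are illustrative, not from the source):

```python
from datetime import datetime, timezone


def parse_hamqsl_updated(updated_str):
    """Parse a timestamp like '28 Mar 2026 0949 GMT' to UTC epoch seconds.
    Only the GMT suffix is handled; anything unrecognised yields None,
    mirroring the provider's fallback when parsing fails."""
    parts = updated_str.rsplit(" ", 1)
    if len(parts) != 2 or parts[1] != "GMT":
        return None
    try:
        dt = datetime.strptime(parts[0], "%d %b %Y %H%M")
    except ValueError:
        return None
    return dt.replace(tzinfo=timezone.utc).timestamp()


print(parse_hamqsl_updated("28 Mar 2026 0949 GMT"))  # 1774691340.0
print(parse_hamqsl_updated("garbage"))               # None
```

The real provider's dateutil approach additionally copes with other abbreviations via `gettz()`, at the cost of an extra dependency.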
solarconditionsproviders/http_solar_conditions_provider.py (new file, 59 lines)
@@ -0,0 +1,59 @@
import logging
from datetime import datetime
from threading import Thread, Event

import pytz
import requests

from core.constants import HTTP_HEADERS
from solarconditionsproviders.solar_conditions_provider import SolarConditionsProvider


class HTTPSolarConditionsProvider(SolarConditionsProvider):
    """Generic solar conditions provider for providers that request data via HTTP(S). Subclasses implement
    _http_response_to_solar_conditions() to parse the specific API response format."""

    def __init__(self, provider_config, url, poll_interval):
        super().__init__(provider_config)
        self._url = url
        self._poll_interval = poll_interval
        self._thread = None
        self._stop_event = Event()

    def start(self):
        logging.info(
            "Set up query of " + self.name + " solar conditions API every " + str(self._poll_interval) + " seconds.")
        self._thread = Thread(target=self._run, daemon=True)
        self._thread.start()

    def stop(self):
        self._stop_event.set()

    def _run(self):
        while True:
            self._poll()
            if self._stop_event.wait(timeout=self._poll_interval):
                break

    def _poll(self):
        try:
            logging.debug("Polling " + self.name + " solar conditions API...")
            http_response = requests.get(self._url, headers=HTTP_HEADERS)
            new_data = self._http_response_to_solar_conditions(http_response)
            self.update_data(new_data)

            self.status = "OK"
            self.last_update_time = datetime.now(pytz.UTC)
            logging.debug("Received data from " + self.name + " solar conditions API.")

        except Exception:
            self.status = "Error"
            logging.exception("Exception in HTTP Solar Conditions Provider (" + self.name + ")")
            self._stop_event.wait(timeout=1)

    def _http_response_to_solar_conditions(self, http_response):
        """Convert an HTTP response into solar conditions data. Returns a dict mapping SolarConditions field
        names to their new values, or None if the response could not be parsed. Only the fields returned will
        be updated on the shared SolarConditions object; any fields not included will be left unchanged."""
        raise NotImplementedError("Subclasses must implement this method")
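The docstring above describes the subclass contract: return a dict of only the fields the response actually carried, or None on a parse failure. A standalone sketch of that convention for a hypothetical JSON API (the response shape and field names here are invented for illustration; `sfi` and `k_index` are borrowed from the HamQSL mapping earlier in the diff):

```python
import json


def response_json_to_solar_conditions(body):
    """Sketch of a _http_response_to_solar_conditions() implementation for an
    imagined JSON feed: report only the fields present, or None if unparseable,
    so unreported fields on the shared object are left untouched."""
    try:
        data = json.loads(body)
    except json.JSONDecodeError:
        return None
    result = {}
    if "sfi" in data:
        result["sfi"] = int(data["sfi"])
    if "k_index" in data:
        result["k_index"] = int(data["k_index"])
    return result or None


print(response_json_to_solar_conditions('{"sfi": 155, "k_index": 2}'))
# {'sfi': 155, 'k_index': 2}
```

Returning a partial dict rather than a whole object is what lets several providers each contribute the subset of fields their API knows about.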
solarconditionsproviders/solar_conditions_provider.py (new file, 41 lines)
@@ -0,0 +1,41 @@
from datetime import datetime

import pytz


class SolarConditionsProvider:
    """Generic solar conditions provider class. Subclasses of this query individual APIs for space weather and
    propagation data."""

    def __init__(self, provider_config):
        """Constructor"""

        self.name = provider_config["name"]
        self.enabled = provider_config["enabled"]
        self.last_update_time = datetime.min.replace(tzinfo=pytz.UTC)
        self.status = "Not Started" if self.enabled else "Disabled"
        self._solar_conditions = None

    def setup(self, solar_conditions):
        """Set up the provider, giving it the solar conditions dict to update"""

        self._solar_conditions = solar_conditions

    def start(self):
        """Start the provider. This should return immediately after spawning threads to access the remote resources"""

        raise NotImplementedError("Subclasses must implement this method")

    def stop(self):
        """Stop any threads and prepare for application shutdown"""

        raise NotImplementedError("Subclasses must implement this method")

    def update_data(self, new_data):
        """Update the solar conditions object with new data"""

        if new_data:
            for key, value in new_data.items():
                if hasattr(self._solar_conditions, key):
                    setattr(self._solar_conditions, key, value)
            self._solar_conditions.infer_descriptions()
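`update_data()` copies over only keys that name existing attributes, which makes it safe for a provider to return extra keys without polluting the shared object. A self-contained sketch of that pattern (the `FakeConditions` stand-in and its two fields are invented for the demo):

```python
class FakeConditions:
    """Stand-in for SolarConditions: only these attributes exist."""

    def __init__(self):
        self.sfi = None
        self.k_index = None


def update_data(target, new_data):
    # Copy over only keys that match existing attributes, as in
    # SolarConditionsProvider.update_data(); unknown keys are ignored.
    if new_data:
        for key, value in new_data.items():
            if hasattr(target, key):
                setattr(target, key, value)


sc = FakeConditions()
update_data(sc, {"sfi": 140, "k_index": 3, "bogus_field": 99})
print(sc.sfi, sc.k_index, hasattr(sc, "bogus_field"))  # 140 3 False
```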
spothole.py (48 changed lines)
@@ -8,6 +8,7 @@ import sys
 from diskcache import Cache
 
 from core.cleanup import CleanupTimer
+from data.solar_conditions import SolarConditions
 from core.config import config, WEB_SERVER_PORT, SERVER_OWNER_CALLSIGN
 from core.constants import SOFTWARE_NAME, SOFTWARE_VERSION
 from core.lookup_helper import lookup_helper
@@ -17,26 +18,32 @@ from server.webserver import WebServer
 # Globals
 spots = Cache('cache/spots_cache')
 alerts = Cache('cache/alerts_cache')
+solar_conditions = SolarConditions()
 web_server = None
 status_data = {}
 spot_providers = []
 alert_providers = []
+solar_condition_providers = []
 cleanup_timer = None
 run = True
 
 
-# Shutdown function
 def shutdown(sig, frame):
+    """Shutdown function"""
+
     global run
 
     logging.info("Stopping program...")
     web_server.stop()
-    for p in spot_providers:
-        if p.enabled:
-            p.stop()
-    for p in alert_providers:
-        if p.enabled:
-            p.stop()
+    for sp in spot_providers:
+        if sp.enabled:
+            sp.stop()
+    for ap in alert_providers:
+        if ap.enabled:
+            ap.stop()
+    for scp in solar_condition_providers:
+        if scp.enabled:
+            scp.stop()
     cleanup_timer.stop()
     lookup_helper.stop()
     spots.close()
@@ -44,20 +51,30 @@ def shutdown(sig, frame):
     os._exit(0)
 
 
-# Utility method to get a spot provider based on the class specified in its config entry.
 def get_spot_provider_from_config(config_providers_entry):
+    """Utility method to get a spot provider based on the class specified in its config entry."""
+
     module = importlib.import_module('spotproviders.' + config_providers_entry["class"].lower())
     provider_class = getattr(module, config_providers_entry["class"])
     return provider_class(config_providers_entry)
 
 
-# Utility method to get an alert provider based on the class specified in its config entry.
 def get_alert_provider_from_config(config_providers_entry):
+    """Utility method to get an alert provider based on the class specified in its config entry."""
+
     module = importlib.import_module('alertproviders.' + config_providers_entry["class"].lower())
     provider_class = getattr(module, config_providers_entry["class"])
     return provider_class(config_providers_entry)
 
 
+def get_solar_conditions_provider_from_config(config_providers_entry):
+    """Utility method to get a solar conditions provider based on the class specified in its config entry."""
+
+    module = importlib.import_module('solarconditionsproviders.' + config_providers_entry["class"].lower())
+    provider_class = getattr(module, config_providers_entry["class"])
+    return provider_class(config_providers_entry)
+
+
 # Main function
 if __name__ == '__main__':
     # Set up logging
@@ -80,7 +97,7 @@ if __name__ == '__main__':
     lookup_helper.start()
 
     # Set up web server
-    web_server = WebServer(spots=spots, alerts=alerts, status_data=status_data, port=WEB_SERVER_PORT)
+    web_server = WebServer(spots=spots, alerts=alerts, solar_conditions=solar_conditions, status_data=status_data, port=WEB_SERVER_PORT)
 
     # Fetch, set up and start spot providers
     for entry in config["spot-providers"]:
@@ -98,6 +115,14 @@ if __name__ == '__main__':
         if p.enabled:
             p.start()
 
+    # Fetch, set up and start solar conditions providers
+    for entry in config.get("solar-condition-providers", []):
+        solar_condition_providers.append(get_solar_conditions_provider_from_config(entry))
+    for p in solar_condition_providers:
+        p.setup(solar_conditions=solar_conditions)
+        if p.enabled:
+            p.start()
+
     # Set up timer to clear spot list of old data
     cleanup_timer = CleanupTimer(spots=spots, alerts=alerts, web_server=web_server, cleanup_interval=60)
     cleanup_timer.start()
@@ -105,7 +130,8 @@ if __name__ == '__main__':
     # Set up status reporter
     status_reporter = StatusReporter(status_data=status_data, spots=spots, alerts=alerts, web_server=web_server,
                                      cleanup_timer=cleanup_timer, spot_providers=spot_providers,
-                                     alert_providers=alert_providers, run_interval=5)
+                                     alert_providers=alert_providers,
+                                     solar_condition_providers=solar_condition_providers, run_interval=5)
     status_reporter.start()
 
     logging.info("Startup complete.")
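The three `get_*_provider_from_config()` helpers all follow the same convention: the config names a class, and the module is the lowercased class name under a fixed package root. A standalone sketch of that lookup, demonstrated against the standard library (`email` / `Parser`) so it runs without Spothole's own packages:

```python
import importlib


def get_class_from_config(module_root, class_name):
    # Mirror of the get_*_provider_from_config() helpers: the module path is
    # the package root plus the lowercased class name, and the class is then
    # fetched from that module by its original name.
    module = importlib.import_module(module_root + "." + class_name.lower())
    return getattr(module, class_name)


# "Parser" lives in email.parser, matching the lowercase-module convention.
Parser = get_class_from_config("email", "Parser")
print(Parser.__module__)  # email.parser
```

This keeps `spothole.py` free of per-provider imports: adding a provider only requires a new module plus a config entry.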
spotproviders/aprsis.py
@@ -10,32 +10,32 @@ from data.spot import Spot
 from spotproviders.spot_provider import SpotProvider
 
 
-# Spot provider for the APRS-IS.
 class APRSIS(SpotProvider):
+    """Spot provider for the APRS-IS."""
 
     def __init__(self, provider_config):
         super().__init__(provider_config)
-        self.thread = Thread(target=self.connect)
-        self.thread.daemon = True
-        self.aprsis = None
+        self._thread = Thread(target=self._connect)
+        self._thread.daemon = True
+        self._aprsis = None
 
     def start(self):
-        self.thread.start()
+        self._thread.start()
 
-    def connect(self):
-        self.aprsis = aprslib.IS(SERVER_OWNER_CALLSIGN)
+    def _connect(self):
+        self._aprsis = aprslib.IS(SERVER_OWNER_CALLSIGN)
         self.status = "Connecting"
         logging.info("APRS-IS connecting...")
-        self.aprsis.connect()
-        self.aprsis.consumer(self.handle)
+        self._aprsis.connect()
+        self._aprsis.consumer(self._handle)
         logging.info("APRS-IS connected.")
 
     def stop(self):
        self.status = "Shutting down"
-        self.aprsis.close()
-        self.thread.join()
+        self._aprsis.close()
+        self._thread.join()
 
-    def handle(self, data):
+    def _handle(self, data):
         # Split SSID in "from" call and store separately
         from_parts = data["from"].split("-").upper()
         dx_call = from_parts[0]
@@ -51,10 +51,11 @@ class APRSIS(SpotProvider):
                     comment=data["comment"] if "comment" in data else None,
                     dx_latitude=data["latitude"] if "latitude" in data else None,
                     dx_longitude=data["longitude"] if "longitude" in data else None,
-                    time=datetime.now(pytz.UTC).timestamp())  # APRS-IS spots are live so we can assume spot time is "now"
+                    time=datetime.now(
+                        pytz.UTC).timestamp())  # APRS-IS spots are live so we can assume spot time is "now"
 
         # Add to our list
-        self.submit(spot)
+        self._submit(spot)
 
         self.status = "OK"
         self.last_update_time = datetime.now(pytz.UTC)
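One context line in the APRSIS diff, `data["from"].split("-").upper()`, calls `.upper()` on the list returned by `split()`, which would raise `AttributeError` at runtime; the call order presumably wants to be reversed. A sketch of the intended SSID split (helper name and sample callsigns are illustrative):

```python
def split_ssid(from_call):
    # Uppercase first, then split: "m0abc-9" -> base call "M0ABC", SSID "9".
    parts = from_call.upper().split("-")
    dx_call = parts[0]
    ssid = parts[1] if len(parts) > 1 else None
    return dx_call, ssid


print(split_ssid("m0abc-9"))  # ('M0ABC', '9')
print(split_ssid("G4XYZ"))    # ('G4XYZ', None)
```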
spotproviders/dxcluster.py
@@ -12,66 +12,67 @@ from data.spot import Spot
 from spotproviders.spot_provider import SpotProvider
 
 
-# Spot provider for a DX Cluster. Hostname, port, login_prompt, login_callsign and allow_rbn_spots are provided in config.
-# See config-example.yml for examples.
 class DXCluster(SpotProvider):
-    CALLSIGN_PATTERN = "([a-z|0-9|/]+)"
-    FREQUENCY_PATTERN = "([0-9|.]+)"
-    LINE_PATTERN_EXCLUDE_RBN = re.compile(
-        "^DX de " + CALLSIGN_PATTERN + ":\\s+" + FREQUENCY_PATTERN + "\\s+" + CALLSIGN_PATTERN + "\\s+(.*)\\s+(\\d{4}Z)",
+    """Spot provider for a DX Cluster. Hostname, port, login_prompt, login_callsign and allow_rbn_spots are provided in config.
+    See config-example.yml for examples."""
+
+    _LINE_PATTERN_EXCLUDE_RBN = re.compile(
+        r"^DX de ([a-z0-9/]+):\s+([0-9.]+)\s+([a-z0-9/]+)\s+(.*)\s+(\d{4}Z)",
         re.IGNORECASE)
-    LINE_PATTERN_ALLOW_RBN = re.compile(
-        "^DX de " + CALLSIGN_PATTERN + "-?#?:\\s+" + FREQUENCY_PATTERN + "\\s+" + CALLSIGN_PATTERN + "\\s+(.*)\\s+(\\d{4}Z)",
+    _LINE_PATTERN_ALLOW_RBN = re.compile(
+        r"^DX de ([a-z0-9/]+)-?#?:\s+([0-9.]+)\s+([a-z0-9/]+)\s+(.*)\s+(\d{4}Z)",
         re.IGNORECASE)
 
-    # Constructor requires hostname and port
     def __init__(self, provider_config):
+        """Constructor requires hostname and port"""
+
         super().__init__(provider_config)
-        self.hostname = provider_config["host"]
-        self.port = provider_config["port"]
-        self.login_prompt = provider_config["login_prompt"] if "login_prompt" in provider_config else "login:"
-        self.login_callsign = provider_config["login_callsign"] if "login_callsign" in provider_config else SERVER_OWNER_CALLSIGN
-        self.allow_rbn_spots = provider_config["allow_rbn_spots"] if "allow_rbn_spots" in provider_config else False
-        self.spot_line_pattern = self.LINE_PATTERN_ALLOW_RBN if self.allow_rbn_spots else self.LINE_PATTERN_EXCLUDE_RBN
-        self.telnet = None
-        self.thread = Thread(target=self.handle)
-        self.thread.daemon = True
-        self.run = True
+        self._hostname = provider_config["host"]
+        self._port = provider_config["port"]
+        self._login_prompt = provider_config["login_prompt"] if "login_prompt" in provider_config else "login:"
+        self._login_callsign = provider_config[
+            "login_callsign"] if "login_callsign" in provider_config else SERVER_OWNER_CALLSIGN
+        self._allow_rbn_spots = provider_config["allow_rbn_spots"] if "allow_rbn_spots" in provider_config else False
+        self._spot_line_pattern = self._LINE_PATTERN_ALLOW_RBN if self._allow_rbn_spots else self._LINE_PATTERN_EXCLUDE_RBN
+        self._telnet = None
+        self._thread = Thread(target=self._handle)
+        self._thread.daemon = True
+        self._running = True
 
     def start(self):
-        self.thread.start()
+        self._thread.start()
 
     def stop(self):
-        self.run = False
-        self.telnet.close()
-        self.thread.join()
+        self._running = False
+        self._telnet.close()
+        self._thread.join()
 
-    def handle(self):
-        while self.run:
+    def _handle(self):
+        while self._running:
             connected = False
-            while not connected and self.run:
+            while not connected and self._running:
                 try:
                     self.status = "Connecting"
-                    logging.info("DX Cluster " + self.hostname + " connecting...")
-                    self.telnet = telnetlib3.Telnet(self.hostname, self.port)
-                    self.telnet.read_until(self.login_prompt.encode("latin-1"))
-                    self.telnet.write((self.login_callsign + "\n").encode("latin-1"))
+                    logging.info("DX Cluster " + self._hostname + " connecting...")
+                    self._telnet = telnetlib3.Telnet(self._hostname, self._port)
+                    self._telnet.read_until(self._login_prompt.encode("latin-1"))
+                    self._telnet.write((self._login_callsign + "\n").encode("latin-1"))
                     connected = True
-                    logging.info("DX Cluster " + self.hostname + " connected.")
-                except Exception as e:
+                    logging.info("DX Cluster " + self._hostname + " connected.")
+                except Exception:
                     self.status = "Error"
-                    logging.exception("Exception while connecting to DX Cluster Provider (" + self.hostname + ").")
+                    logging.exception("Exception while connecting to DX Cluster Provider (" + self._hostname + ").")
                     sleep(5)
 
             self.status = "Waiting for Data"
-            while connected and self.run:
+            while connected and self._running:
                 try:
                     # Check new telnet info against regular expression
-                    telnet_output = self.telnet.read_until("\n".encode("latin-1"))
-                    match = self.spot_line_pattern.match(telnet_output.decode("latin-1"))
+                    telnet_output = self._telnet.read_until("\n".encode("latin-1"))
+                    match = self._spot_line_pattern.match(telnet_output.decode("latin-1"))
                     if match:
                         spot_time = datetime.strptime(match.group(5), "%H%MZ")
-                        spot_datetime = datetime.combine(datetime.today(), spot_time.time()).replace(tzinfo=pytz.UTC)
+                        spot_datetime = datetime.combine(datetime.now(pytz.UTC).date(), spot_time.time(), tzinfo=pytz.UTC)
                         spot = Spot(source=self.name,
                                     dx_call=match.group(3),
                                     de_call=match.group(1),
@@ -80,20 +81,20 @@ class DXCluster(SpotProvider):
                                     time=spot_datetime.timestamp())
 
                         # Add to our list
-                        self.submit(spot)
+                        self._submit(spot)
 
                         self.status = "OK"
                         self.last_update_time = datetime.now(pytz.UTC)
-                        logging.debug("Data received from DX Cluster " + self.hostname + ".")
+                        logging.debug("Data received from DX Cluster " + self._hostname + ".")
 
-                except Exception as e:
+                except Exception:
                     connected = False
-                    if self.run:
+                    if self._running:
                         self.status = "Error"
-                        logging.exception("Exception in DX Cluster Provider (" + self.hostname + ")")
+                        logging.exception("Exception in DX Cluster Provider (" + self._hostname + ")")
                         sleep(5)
                     else:
-                        logging.info("DX Cluster " + self.hostname + " shutting down...")
+                        logging.info("DX Cluster " + self._hostname + " shutting down...")
                         self.status = "Shutting down"
 
             self.status = "Disconnected"
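The DXCluster diff replaces string-concatenated patterns with raw-string regexes. The non-RBN pattern can be exercised standalone against a typical cluster line (the callsigns, frequency, and comment here are invented for illustration):

```python
import re

# The new _LINE_PATTERN_EXCLUDE_RBN regex from the diff above.
LINE_PATTERN_EXCLUDE_RBN = re.compile(
    r"^DX de ([a-z0-9/]+):\s+([0-9.]+)\s+([a-z0-9/]+)\s+(.*)\s+(\d{4}Z)",
    re.IGNORECASE)

line = "DX de G4ABC:     14074.0  K1XYZ        FT8 tnx QSO                   1234Z"
match = LINE_PATTERN_EXCLUDE_RBN.match(line)
# Groups: spotter, frequency (kHz), spotted call, comment, UTC time.
print(match.group(1), match.group(2), match.group(3), match.group(5))
# G4ABC 14074.0 K1XYZ 1234Z
```

Note the character classes no longer contain a literal `|`: inside `[...]` the old `[a-z|0-9|/]` matched the pipe character itself, which the rewrite drops.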
spotproviders/gma.py
@@ -10,8 +10,9 @@ from data.spot import Spot
 from spotproviders.http_spot_provider import HTTPSpotProvider
 
 
-# Spot provider for General Mountain Activity
 class GMA(HTTPSpotProvider):
+    """Spot provider for General Mountain Activity"""
+
     POLL_INTERVAL_SEC = 120
     SPOTS_URL = "https://www.cqgma.org/api/spots/25/"
     # GMA spots don't contain the details of the programme they are for, we need a separate lookup for that
@@ -20,7 +21,7 @@ class GMA(HTTPSpotProvider):
     def __init__(self, provider_config):
         super().__init__(provider_config, self.SPOTS_URL, self.POLL_INTERVAL_SEC)
 
-    def http_response_to_spots(self, http_response):
+    def _http_response_to_spots(self, http_response):
         new_spots = []
         # Iterate through source data
         for source_spot in http_response.json()["RCD"]:
@@ -36,9 +37,11 @@ class GMA(HTTPSpotProvider):
                         sig_refs=[SIGRef(id=source_spot["REF"], sig="", name=source_spot["NAME"])],
                         time=datetime.strptime(source_spot["DATE"] + source_spot["TIME"], "%Y%m%d%H%M").replace(
                             tzinfo=pytz.UTC).timestamp(),
-                        dx_latitude=float(source_spot["LAT"]) if (source_spot["LAT"] and source_spot["LAT"] != "") else None,
+                        dx_latitude=float(source_spot["LAT"]) if (
+                                source_spot["LAT"] and source_spot["LAT"] != "") else None,
                         # Seen GMA spots with no (or empty) lat/lon
-                        dx_longitude=float(source_spot["LON"]) if (source_spot["LON"] and source_spot["LON"] != "") else None)
+                        dx_longitude=float(source_spot["LON"]) if (
+                                source_spot["LON"] and source_spot["LON"] != "") else None)
 
             # GMA doesn't give what programme (SIG) the reference is for until we separately look it up.
             if "REF" in source_spot:
@@ -74,7 +77,7 @@ class GMA(HTTPSpotProvider):
                         spot.sig_refs[0].sig = "MOTA"
                         spot.sig = "MOTA"
                     case _:
-                        logging.warn("GMA spot found with ref type " + ref_info[
+                        logging.warning("GMA spot found with ref type " + ref_info[
                             "reftype"] + ", developer needs to add support for this!")
                         spot.sig_refs[0].sig = ref_info["reftype"]
                         spot.sig = ref_info["reftype"]
@@ -83,5 +86,6 @@ class GMA(HTTPSpotProvider):
             # that for us.
             new_spots.append(spot)
         except:
-            logging.warn("Exception when looking up " + self.REF_INFO_URL_ROOT + source_spot["REF"] + ", ignoring this spot for now")
+            logging.warning("Exception when looking up " + self.REF_INFO_URL_ROOT + source_spot[
+                "REF"] + ", ignoring this spot for now")
         return new_spots
spotproviders/hema.py
@@ -10,8 +10,9 @@ from data.spot import Spot
 from spotproviders.http_spot_provider import HTTPSpotProvider
 
 
-# Spot provider for HuMPs Excluding Marilyns Award
 class HEMA(HTTPSpotProvider):
+    """Spot provider for HuMPs Excluding Marilyns Award"""
+
     POLL_INTERVAL_SEC = 300
     # HEMA wants us to check for a "spot seed" from the API and see if it's actually changed before querying the main
     # data API. So it's actually the SPOT_SEED_URL that we pass into the constructor and get the superclass to call on a
@@ -23,13 +24,13 @@ class HEMA(HTTPSpotProvider):
 
     def __init__(self, provider_config):
         super().__init__(provider_config, self.SPOT_SEED_URL, self.POLL_INTERVAL_SEC)
-        self.spot_seed = ""
+        self._spot_seed = ""
 
-    def http_response_to_spots(self, http_response):
+    def _http_response_to_spots(self, http_response):
         # OK, source data is actually just the spot seed at this point. We'll then go on to fetch real data if we know
         # this has changed.
-        spot_seed_changed = http_response.text != self.spot_seed
-        self.spot_seed = http_response.text
+        spot_seed_changed = http_response.text != self._spot_seed
+        self._spot_seed = http_response.text
 
         new_spots = []
         # OK, if the spot seed actually changed, now we make the real request for data.
@@ -54,7 +55,8 @@ class HEMA(HTTPSpotProvider):
                             comment=spotter_comment_match.group(2),
                             sig="HEMA",
                             sig_refs=[SIGRef(id=spot_items[3].upper(), sig="HEMA", name=spot_items[4])],
-                            time=datetime.strptime(spot_items[0], "%d/%m/%Y %H:%M").replace(tzinfo=pytz.UTC).timestamp(),
+                            time=datetime.strptime(spot_items[0], "%d/%m/%Y %H:%M").replace(
+                                tzinfo=pytz.UTC).timestamp(),
                             dx_latitude=float(spot_items[7]),
                             dx_longitude=float(spot_items[8]))
 
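The HEMA provider's comments describe its two-step polling: fetch a lightweight "spot seed" every cycle, and only hit the full data API when the seed differs from the one seen last time. The change-detection half of that can be sketched standalone (the seed values are invented; in the real provider the seed is the HTTP response body):

```python
def should_fetch(new_seed, last_seed):
    # HEMA's cheap change-detection: only hit the full data API when the
    # lightweight "spot seed" differs from the one seen on the previous poll.
    return new_seed != last_seed


fetches = 0
seed = ""
for incoming in ["abc123", "abc123", "def456"]:
    if should_fetch(incoming, seed):
        fetches += 1  # the real provider would request the full spot list here
    seed = incoming
print(fetches)  # 2: on the first poll, and again when the seed changes
```

Starting with an empty seed means the first poll always triggers a full fetch, which matches the `self._spot_seed = ""` initialisation above.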
spotproviders/http_spot_provider.py
@@ -1,7 +1,6 @@
 import logging
 from datetime import datetime
-from threading import Timer, Thread
-from time import sleep
+from threading import Thread, Event
 
 import pytz
 import requests
@@ -10,54 +9,56 @@ from core.constants import HTTP_HEADERS
 from spotproviders.spot_provider import SpotProvider
 
 
-# Generic spot provider class for providers that request data via HTTP(S). Just for convenience to avoid code
-# duplication. Subclasses of this query the individual APIs for data.
 class HTTPSpotProvider(SpotProvider):
+    """Generic spot provider class for providers that request data via HTTP(S). Just for convenience to avoid code
+    duplication. Subclasses of this query the individual APIs for data."""
+
     def __init__(self, provider_config, url, poll_interval):
         super().__init__(provider_config)
-        self.url = url
-        self.poll_interval = poll_interval
-        self.poll_timer = None
+        self._url = url
+        self._poll_interval = poll_interval
+        self._thread = None
+        self._stop_event = Event()
 
     def start(self):
-        # Fire off a one-shot thread to run poll() for the first time, just to ensure start() returns immediately and
-        # the application can continue starting. The thread itself will then die, and the timer will kick in on its own
-        # thread.
-        logging.info("Set up query of " + self.name + " spot API every " + str(self.poll_interval) + " seconds.")
-        thread = Thread(target=self.poll)
-        thread.daemon = True
-        thread.start()
+        # Fire off the polling thread. It will poll immediately on startup, then sleep for poll_interval between
+        # subsequent polls, so start() returns immediately and the application can continue starting.
+        logging.info("Set up query of " + self.name + " spot API every " + str(self._poll_interval) + " seconds.")
+        self._thread = Thread(target=self._run, daemon=True)
+        self._thread.start()
 
     def stop(self):
-        if self.poll_timer:
-            self.poll_timer.cancel()
+        self._stop_event.set()
 
-    def poll(self):
+    def _run(self):
+        while True:
+            self._poll()
+            if self._stop_event.wait(timeout=self._poll_interval):
+                break
+
+    def _poll(self):
         try:
             # Request data from API
             logging.debug("Polling " + self.name + " spot API...")
-            http_response = requests.get(self.url, headers=HTTP_HEADERS)
+            http_response = requests.get(self._url, headers=HTTP_HEADERS)
             # Pass off to the subclass for processing
-            new_spots = self.http_response_to_spots(http_response)
+            new_spots = self._http_response_to_spots(http_response)
             # Submit the new spots for processing. There might not be any spots for the less popular programs.
             if new_spots:
-                self.submit_batch(new_spots)
+                self._submit_batch(new_spots)
 
             self.status = "OK"
             self.last_update_time = datetime.now(pytz.UTC)
             logging.debug("Received data from " + self.name + " spot API.")
 
-        except Exception as e:
+        except Exception:
             self.status = "Error"
             logging.exception("Exception in HTTP JSON Spot Provider (" + self.name + ")")
-            sleep(1)
+            self._stop_event.wait(timeout=1)
 
-        self.poll_timer = Timer(self.poll_interval, self.poll)
|
def _http_response_to_spots(self, http_response):
|
||||||
self.poll_timer.start()
|
"""Convert an HTTP response returned by the API into spot data. The whole response is provided here so the subclass
|
||||||
|
implementations can check for HTTP status codes if necessary, and handle the response as JSON, XML, text, whatever
|
||||||
|
the API actually provides."""
|
||||||
|
|
||||||
# Convert an HTTP response returned by the API into spot data. The whole response is provided here so the subclass
|
|
||||||
# implementations can check for HTTP status codes if necessary, and handle the response as JSON, XML, text, whatever
|
|
||||||
# the API actually provides.
|
|
||||||
def http_response_to_spots(self, http_response):
|
|
||||||
raise NotImplementedError("Subclasses must implement this method")
|
raise NotImplementedError("Subclasses must implement this method")
|
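The hunk above replaces a re-arming `threading.Timer` with one long-lived thread that sleeps on an `Event`, so shutdown interrupts the wait immediately instead of waiting for the timer to fire. A minimal standalone sketch of that pattern (the `Poller` class and its names are illustrative, not the project's actual class):

```python
import threading

class Poller:
    """Poll once immediately, then wait poll_interval between polls.
    Event.wait() returns True as soon as stop() sets the event, so the
    loop exits promptly instead of sleeping out the full interval."""

    def __init__(self, poll_interval, on_poll):
        self._poll_interval = poll_interval
        self._on_poll = on_poll
        self._stop_event = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def start(self):
        self._thread.start()

    def stop(self):
        self._stop_event.set()
        self._thread.join()

    def _run(self):
        while True:
            self._on_poll()
            if self._stop_event.wait(timeout=self._poll_interval):
                break

counts = []
p = Poller(0.01, lambda: counts.append(1))
p.start()
p.stop()  # the loop body always runs at least once before the event is checked
```

Using `self._stop_event.wait(timeout=1)` in the error path (as the diff does) gives the same prompt-shutdown behaviour as the main sleep.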
@@ -5,15 +5,16 @@ from data.spot import Spot
 from spotproviders.http_spot_provider import HTTPSpotProvider
 
 
-# Spot provider for Lagos y Lagunas On the Air
 class LLOTA(HTTPSpotProvider):
+    """Spot provider for Lagos y Lagunas On the Air"""
 
     POLL_INTERVAL_SEC = 120
     SPOTS_URL = "https://llota.app/api/public/spots"
 
     def __init__(self, provider_config):
         super().__init__(provider_config, self.SPOTS_URL, self.POLL_INTERVAL_SEC)
 
-    def http_response_to_spots(self, http_response):
+    def _http_response_to_spots(self, http_response):
         new_spots = []
         # Iterate through source data
         for source_spot in http_response.json():
@@ -9,8 +9,9 @@ from data.spot import Spot
 from spotproviders.http_spot_provider import HTTPSpotProvider
 
 
-# Spot provider for Parks n Peaks
 class ParksNPeaks(HTTPSpotProvider):
+    """Spot provider for Parks n Peaks"""
 
     POLL_INTERVAL_SEC = 120
     SPOTS_URL = "https://www.parksnpeaks.org/api/ALL"
     SIOTA_LIST_URL = "https://www.silosontheair.com/data/silos.csv"
@@ -18,40 +19,46 @@ class ParksNPeaks(HTTPSpotProvider):
     def __init__(self, provider_config):
         super().__init__(provider_config, self.SPOTS_URL, self.POLL_INTERVAL_SEC)
 
-    def http_response_to_spots(self, http_response):
+    def _http_response_to_spots(self, http_response):
         new_spots = []
         # Iterate through source data
-        for source_spot in http_response.json():
-            # Convert to our spot format
-            spot = Spot(source=self.name,
-                        source_id=source_spot["actID"],
-                        dx_call=source_spot["actCallsign"].upper(),
-                        de_call=source_spot["actSpoter"].upper() if source_spot["actSpoter"] != "" else None,  # typo exists in API
-                        freq=float(source_spot["actFreq"].replace(",", "")) * 1000000 if (
-                                source_spot["actFreq"] != "") else None,
-                        # Seen PNP spots with empty frequency, and with comma-separated thousands digits
-                        mode=source_spot["actMode"].upper(),
-                        comment=source_spot["actComments"],
-                        sig=source_spot["actClass"].upper(),
-                        sig_refs=[SIGRef(id=source_spot["actSiteID"], sig=source_spot["actClass"].upper())],
-                        time=datetime.strptime(source_spot["actTime"], "%Y-%m-%d %H:%M:%S").replace(
-                            tzinfo=pytz.UTC).timestamp())
-
-            # Free text location is not present in all spots, so only add it if it's set
-            if "actLocation" in source_spot and source_spot["actLocation"] != "":
-                spot.sig_refs[0].name = source_spot["actLocation"]
-
-            # Extract a de_call if it's in the comment but not in the "actSpoter" field
-            m = re.search(r"\(de ([A-Za-z0-9]*)\)", spot.comment)
-            if not spot.de_call and m:
-                spot.de_call = m.group(1)
-
-            # Log a warning for the developer if PnP gives us an unknown programme we've never seen before
-            if spot.sig_refs[0].sig not in ["POTA", "SOTA", "WWFF", "SIOTA", "ZLOTA", "KRMNPA"]:
-                logging.warn("PNP spot found with sig " + spot.sig + ", developer needs to add support for this!")
-
-            # If this is POTA, SOTA, WWFF or ZLOTA data we already have it through other means, so ignore. Otherwise,
-            # add to the spot list.
-            if spot.sig_refs[0].sig not in ["POTA", "SOTA", "WWFF", "ZLOTA"]:
-                new_spots.append(spot)
+        if http_response and http_response != "":
+            for source_spot in http_response.json():
+                # Convert to our spot format
+                spot = Spot(source=self.name,
+                            source_id=source_spot["actID"],
+                            dx_call=source_spot["actCallsign"].upper(),
+                            de_call=source_spot["actSpoter"].upper() if source_spot["actSpoter"] != "" else None,
+                            # typo exists in API
+                            freq=float(source_spot["actFreq"].replace(",", "")) * 1000000 if (
+                                    source_spot["actFreq"] != "") else None,
+                            # Seen PNP spots with empty frequency, and with comma-separated thousands digits
+                            mode=source_spot["actMode"].upper(),
+                            comment=source_spot["actComments"],
+                            time=datetime.strptime(source_spot["actTime"], "%Y-%m-%d %H:%M:%S").replace(
+                                tzinfo=pytz.UTC).timestamp())
+
+                # Extract a de_call if it's in the comment but not in the "actSpoter" field
+                m = re.search(r"\(de ([A-Za-z0-9]*)\)", spot.comment)
+                if not spot.de_call and m:
+                    spot.de_call = m.group(1)
+
+                # Record SIG information. Sometimes we get a "SIG" of "QRP", which we ignore as it's not a programme with a
+                # defined set of references
+                sig = source_spot["actClass"].upper()
+                sig_ref = source_spot["actSiteID"]
+                if sig and sig != "" and sig != "QRP" and sig_ref and sig_ref != "":
+                    spot.sig = sig
+                    spot.sig_refs = [SIGRef(id=source_spot["actSiteID"], sig=source_spot["actClass"].upper())]
+
+                # Free text location is not present in all spots, so only add it if it's set
+                if "actLocation" in source_spot and source_spot["actLocation"] != "":
+                    spot.sig_refs[0].name = source_spot["actLocation"]
+
+                # Log a warning for the developer if PnP gives us an unknown programme we've never seen before
+                if sig not in ["POTA", "SOTA", "WWFF", "SIOTA", "ZLOTA", "KRMNPA"]:
+                    logging.warning("PNP spot found with sig " + sig + ", developer needs to add support for this!")
+
+                # Add new spot to the list
+                new_spots.append(spot)
         return new_spots
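The `(de CALL)` comment scrape kept by the hunk above can be exercised on its own. A standalone sketch with the same regex; the helper name and sample comments are illustrative:

```python
import re

# Same pattern the provider uses to find a spotter callsign inside a comment
DE_PATTERN = re.compile(r"\(de ([A-Za-z0-9]*)\)")

def extract_de_call(comment):
    """Pull a spotter callsign out of a comment like 'QSY to 40m (de M0TRT)'."""
    m = DE_PATTERN.search(comment)
    return m.group(1) if m else None

print(extract_de_call("QSY to 40m (de M0TRT)"))  # M0TRT
print(extract_de_call("no spotter info here"))   # None
```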
@@ -7,15 +7,16 @@ from data.spot import Spot
 from spotproviders.http_spot_provider import HTTPSpotProvider
 
 
-# Spot provider for Parks on the Air
 class POTA(HTTPSpotProvider):
+    """Spot provider for Parks on the Air"""
 
     POLL_INTERVAL_SEC = 120
     SPOTS_URL = "https://api.pota.app/spot/activator"
 
     def __init__(self, provider_config):
         super().__init__(provider_config, self.SPOTS_URL, self.POLL_INTERVAL_SEC)
 
-    def http_response_to_spots(self, http_response):
+    def _http_response_to_spots(self, http_response):
         new_spots = []
         # Iterate through source data
         for source_spot in http_response.json():
@@ -12,59 +12,58 @@ from data.spot import Spot
 from spotproviders.spot_provider import SpotProvider
 
 
-# Spot provider for the Reverse Beacon Network. Connects to a single port, if you want both CW/RTTY (port 7000) and FT8
-# (port 7001) you need to instantiate two copies of this. The port is provided as an argument to the constructor.
 class RBN(SpotProvider):
-    CALLSIGN_PATTERN = "([a-z|0-9|/]+)"
-    FREQUENCY_PATTERM = "([0-9|.]+)"
-    LINE_PATTERN = re.compile(
-        "^DX de " + CALLSIGN_PATTERN + "-.*:\\s+" + FREQUENCY_PATTERM + "\\s+" + CALLSIGN_PATTERN + "\\s+(.*)\\s+(\\d{4}Z)",
+    """Spot provider for the Reverse Beacon Network. Connects to a single port, if you want both CW/RTTY (port 7000) and FT8
+    (port 7001) you need to instantiate two copies of this. The port is provided as an argument to the constructor."""
+
+    _LINE_PATTERN = re.compile(
+        r"^DX de ([a-z0-9/]+)-.*:\s+([0-9.]+)\s+([a-z0-9/]+)\s+(.*)\s+(\d{4}Z)",
         re.IGNORECASE)
 
-    # Constructor requires port number.
     def __init__(self, provider_config):
-        super().__init__(provider_config)
-        self.port = provider_config["port"]
-        self.telnet = None
-        self.thread = Thread(target=self.handle)
-        self.thread.daemon = True
-        self.run = True
+        """Constructor requires port number."""
+        super().__init__(provider_config)
+        self._port = provider_config["port"]
+        self._telnet = None
+        self._thread = Thread(target=self._handle)
+        self._thread.daemon = True
+        self._running = True
 
     def start(self):
-        self.thread.start()
+        self._thread.start()
 
     def stop(self):
-        self.run = False
-        self.telnet.close()
-        self.thread.join()
+        self._running = False
+        self._telnet.close()
+        self._thread.join()
 
-    def handle(self):
-        while self.run:
+    def _handle(self):
+        while self._running:
             connected = False
-            while not connected and self.run:
+            while not connected and self._running:
                 try:
                     self.status = "Connecting"
-                    logging.info("RBN port " + str(self.port) + " connecting...")
-                    self.telnet = telnetlib3.Telnet("telnet.reversebeacon.net", self.port)
-                    telnet_output = self.telnet.read_until("Please enter your call: ".encode("latin-1"))
-                    self.telnet.write((SERVER_OWNER_CALLSIGN + "\n").encode("latin-1"))
+                    logging.info("RBN port " + str(self._port) + " connecting...")
+                    self._telnet = telnetlib3.Telnet("telnet.reversebeacon.net", self._port)
+                    telnet_output = self._telnet.read_until("Please enter your call: ".encode("latin-1"))
+                    self._telnet.write((SERVER_OWNER_CALLSIGN + "\n").encode("latin-1"))
                     connected = True
-                    logging.info("RBN port " + str(self.port) + " connected.")
-                except Exception as e:
+                    logging.info("RBN port " + str(self._port) + " connected.")
+                except Exception:
                     self.status = "Error"
-                    logging.exception("Exception while connecting to RBN (port " + str(self.port) + ").")
+                    logging.exception("Exception while connecting to RBN (port " + str(self._port) + ").")
                     sleep(5)
 
             self.status = "Waiting for Data"
-            while connected and self.run:
+            while connected and self._running:
                 try:
                     # Check new telnet info against regular expression
-                    telnet_output = self.telnet.read_until("\n".encode("latin-1"))
-                    match = self.LINE_PATTERN.match(telnet_output.decode("latin-1"))
+                    telnet_output = self._telnet.read_until("\n".encode("latin-1"))
+                    match = self._LINE_PATTERN.match(telnet_output.decode("latin-1"))
                     if match:
                         spot_time = datetime.strptime(match.group(5), "%H%MZ")
-                        spot_datetime = datetime.combine(datetime.today(), spot_time.time()).replace(tzinfo=pytz.UTC)
+                        spot_datetime = datetime.combine(datetime.now(pytz.UTC).date(), spot_time.time(), tzinfo=pytz.UTC)
                         spot = Spot(source=self.name,
                                     dx_call=match.group(3),
                                     de_call=match.group(1),
@@ -73,20 +72,20 @@ class RBN(SpotProvider):
                                     time=spot_datetime.timestamp())
 
                         # Add to our list
-                        self.submit(spot)
+                        self._submit(spot)
 
                         self.status = "OK"
                         self.last_update_time = datetime.now(pytz.UTC)
-                        logging.debug("Data received from RBN on port " + str(self.port) + ".")
+                        logging.debug("Data received from RBN on port " + str(self._port) + ".")
 
-                except Exception as e:
+                except Exception:
                     connected = False
-                    if self.run:
+                    if self._running:
                         self.status = "Error"
-                        logging.exception("Exception in RBN provider (port " + str(self.port) + ")")
+                        logging.exception("Exception in RBN provider (port " + str(self._port) + ")")
                         sleep(5)
                     else:
-                        logging.info("RBN provider (port " + str(self.port) + ") shutting down...")
+                        logging.info("RBN provider (port " + str(self._port) + ") shutting down...")
                         self.status = "Shutting down"
 
         self.status = "Disconnected"
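The rewritten `_LINE_PATTERN` (a raw string replacing the old concatenated, double-escaped fragments) can be checked against a representative DX-cluster line. The sample line below is illustrative, not captured from the live feed:

```python
import re

# Same pattern as the diff's _LINE_PATTERN: spotter, frequency, DX call,
# free text, and a HHMMZ timestamp at the end of the line
LINE_PATTERN = re.compile(
    r"^DX de ([a-z0-9/]+)-.*:\s+([0-9.]+)\s+([a-z0-9/]+)\s+(.*)\s+(\d{4}Z)",
    re.IGNORECASE)

line = "DX de W3LPL-#:   14025.0  G4ABC          CW    24 dB  21 WPM  CQ      1234Z"
m = LINE_PATTERN.match(line)
print(m.group(1), m.group(2), m.group(3), m.group(5))  # W3LPL 14025.0 G4ABC 1234Z
```

Note the old character classes `[a-z|0-9|/]` also matched a literal `|`; inside a class, `|` is not alternation, which is one of the things the rewrite fixes.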
@@ -8,8 +8,9 @@ from data.spot import Spot
 from spotproviders.http_spot_provider import HTTPSpotProvider
 
 
-# Spot provider for Summits on the Air
 class SOTA(HTTPSpotProvider):
+    """Spot provider for Summits on the Air"""
 
     POLL_INTERVAL_SEC = 120
     # SOTA wants us to check for an "epoch" from the API and see if it's actually changed before querying the main data
     # APIs. So it's actually the EPOCH_URL that we pass into the constructor and get the superclass to call on a timer.
@@ -21,13 +22,13 @@ class SOTA(HTTPSpotProvider):
 
     def __init__(self, provider_config):
         super().__init__(provider_config, self.EPOCH_URL, self.POLL_INTERVAL_SEC)
-        self.api_epoch = ""
+        self._api_epoch = ""
 
-    def http_response_to_spots(self, http_response):
+    def _http_response_to_spots(self, http_response):
         # OK, source data is actually just the epoch at this point. We'll then go on to fetch real data if we know this
         # has changed.
-        epoch_changed = http_response.text != self.api_epoch
-        self.api_epoch = http_response.text
+        epoch_changed = http_response.text != self._api_epoch
+        self._api_epoch = http_response.text
 
         new_spots = []
         # OK, if the epoch actually changed, now we make the real request for data.
@@ -41,11 +42,14 @@ class SOTA(HTTPSpotProvider):
                         dx_call=source_spot["activatorCallsign"].upper(),
                         dx_name=source_spot["activatorName"],
                         de_call=source_spot["callsign"].upper(),
-                        freq=(float(source_spot["frequency"]) * 1000000) if (source_spot["frequency"] is not None) else None,  # Seen SOTA spots with no frequency!
+                        freq=(float(source_spot["frequency"]) * 1000000) if (
+                                source_spot["frequency"] is not None) else None,
+                        # Seen SOTA spots with no frequency!
                         mode=source_spot["mode"].upper(),
                         comment=source_spot["comments"],
                         sig="SOTA",
-                        sig_refs=[SIGRef(id=source_spot["summitCode"], sig="SOTA", name=source_spot["summitName"], activation_score=source_spot["points"])],
+                        sig_refs=[SIGRef(id=source_spot["summitCode"], sig="SOTA", name=source_spot["summitName"],
+                                         activation_score=source_spot["points"])],
                         time=datetime.fromisoformat(source_spot["timeStamp"].replace("Z", "+00:00")).timestamp())
 
                 # Add to our list. Don't worry about de-duping, removing old spots etc. at this point; other code will do
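The epoch handling in `_http_response_to_spots` above is a simple change-detection cache: remember the last string seen and only do the expensive fetch when it differs. Sketched standalone (the `EpochTracker` name is illustrative, not from the project):

```python
class EpochTracker:
    """Remember the last-seen epoch string and report whether it changed."""

    def __init__(self):
        self._epoch = ""

    def changed(self, new_epoch):
        # Compare before storing, so the first real value always counts as a change
        result = new_epoch != self._epoch
        self._epoch = new_epoch
        return result

t = EpochTracker()
print(t.changed("2024-01-01T00:00:00"))  # True - differs from the initial ""
print(t.changed("2024-01-01T00:00:00"))  # False - unchanged
print(t.changed("2024-01-01T00:05:00"))  # True
```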
|
|||||||
@@ -5,59 +5,66 @@ import pytz
|
|||||||
from core.config import MAX_SPOT_AGE
|
from core.config import MAX_SPOT_AGE
|
||||||
|
|
||||||
|
|
||||||
# Generic spot provider class. Subclasses of this query the individual APIs for data.
|
|
||||||
class SpotProvider:
|
class SpotProvider:
|
||||||
|
"""Generic spot provider class. Subclasses of this query the individual APIs for data."""
|
||||||
|
|
||||||
# Constructor
|
|
||||||
def __init__(self, provider_config):
|
def __init__(self, provider_config):
|
||||||
|
"""Constructor"""
|
||||||
|
|
||||||
self.name = provider_config["name"]
|
self.name = provider_config["name"]
|
||||||
self.enabled = provider_config["enabled"]
|
self.enabled = provider_config["enabled"]
|
||||||
self.last_update_time = datetime.min.replace(tzinfo=pytz.UTC)
|
self.last_update_time = datetime.min.replace(tzinfo=pytz.UTC)
|
||||||
self.last_spot_time = datetime.min.replace(tzinfo=pytz.UTC)
|
self.last_spot_time = datetime.min.replace(tzinfo=pytz.UTC)
|
||||||
self.status = "Not Started" if self.enabled else "Disabled"
|
self.status = "Not Started" if self.enabled else "Disabled"
|
||||||
self.spots = None
|
self._spots = None
|
||||||
self.web_server = None
|
self._web_server = None
|
||||||
|
|
||||||
# Set up the provider, e.g. giving it the spot list to work from
|
|
||||||
def setup(self, spots, web_server):
|
def setup(self, spots, web_server):
|
||||||
self.spots = spots
|
"""Set up the provider, e.g. giving it the spot list to work from"""
|
||||||
self.web_server = web_server
|
|
||||||
|
self._spots = spots
|
||||||
|
self._web_server = web_server
|
||||||
|
|
||||||
# Start the provider. This should return immediately after spawning threads to access the remote resources
|
|
||||||
def start(self):
|
def start(self):
|
||||||
|
"""Start the provider. This should return immediately after spawning threads to access the remote resources"""
|
||||||
|
|
||||||
raise NotImplementedError("Subclasses must implement this method")
|
raise NotImplementedError("Subclasses must implement this method")
|
||||||
|
|
||||||
# Submit a batch of spots retrieved from the provider. Only spots that are newer than the last spot retrieved
|
def _submit_batch(self, spots):
|
||||||
# by this provider will be added to the spot list, to prevent duplications. Spots passing the check will also have
|
"""Submit a batch of spots retrieved from the provider. Only spots that are newer than the last spot retrieved
|
||||||
# their infer_missing() method called to complete their data set. This is called by the API-querying
|
by this provider will be added to the spot list, to prevent duplications. Spots passing the check will also have
|
||||||
# subclasses on receiving spots.
|
their infer_missing() method called to complete their data set. This is called by the API-querying
|
||||||
def submit_batch(self, spots):
|
subclasses on receiving spots."""
|
||||||
|
|
||||||
# Sort the batch so that earliest ones go in first. This helps keep the ordering correct when spots are fired
|
# Sort the batch so that earliest ones go in first. This helps keep the ordering correct when spots are fired
|
||||||
# off to SSE listeners.
|
# off to SSE listeners.
|
||||||
spots = sorted(spots, key=lambda spot: (spot.time if spot and spot.time else 0))
|
spots = sorted(spots, key=lambda s: (s.time if s and s.time else 0))
|
||||||
for spot in spots:
|
for spot in spots:
|
||||||
if datetime.fromtimestamp(spot.time, pytz.UTC) > self.last_spot_time:
|
if datetime.fromtimestamp(spot.time, pytz.UTC) > self.last_spot_time:
|
||||||
# Fill in any blanks and add to the list
|
# Fill in any blanks and add to the list
|
||||||
spot.infer_missing()
|
spot.infer_missing()
|
||||||
self.add_spot(spot)
|
self._add_spot(spot)
|
||||||
self.last_spot_time = datetime.fromtimestamp(max(map(lambda s: s.time, spots)), pytz.UTC)
|
if spots:
|
||||||
|
self.last_spot_time = datetime.fromtimestamp(max(map(lambda s: s.time, spots)), pytz.UTC)
|
||||||
|
|
||||||
|
def _submit(self, spot):
|
||||||
|
"""Submit a single spot retrieved from the provider. This will be added to the list regardless of its age. Spots
|
||||||
|
passing the check will also have their infer_missing() method called to complete their data set. This is called by
|
||||||
|
the data streaming subclasses, which can be relied upon not to re-provide old spots."""
|
||||||
|
|
||||||
# Submit a single spot retrieved from the provider. This will be added to the list regardless of its age. Spots
|
|
||||||
# passing the check will also have their infer_missing() method called to complete their data set. This is called by
|
|
||||||
# the data streaming subclasses, which can be relied upon not to re-provide old spots.
|
|
||||||
def submit(self, spot):
|
|
||||||
# Fill in any blanks and add to the list
|
# Fill in any blanks and add to the list
|
||||||
spot.infer_missing()
|
spot.infer_missing()
|
||||||
self.add_spot(spot)
|
self._add_spot(spot)
|
||||||
self.last_spot_time = datetime.fromtimestamp(spot.time, pytz.UTC)
|
self.last_spot_time = datetime.fromtimestamp(spot.time, pytz.UTC)
|
||||||
|
|
||||||
def add_spot(self, spot):
|
def _add_spot(self, spot):
|
||||||
if not spot.expired():
|
if not spot.expired():
|
||||||
self.spots.add(spot.id, spot, expire=MAX_SPOT_AGE)
|
self._spots.add(spot.id, spot, expire=MAX_SPOT_AGE)
|
||||||
# Ping the web server in case we have any SSE connections that need to see this immediately
|
# Ping the web server in case we have any SSE connections that need to see this immediately
|
||||||
if self.web_server:
|
if self._web_server:
|
||||||
self.web_server.notify_new_spot(spot)
|
self._web_server.notify_new_spot(spot)
|
||||||
|
|
||||||
# Stop any threads and prepare for application shutdown
|
|
||||||
def stop(self):
|
def stop(self):
|
||||||
|
"""Stop any threads and prepare for application shutdown"""
|
||||||
|
|
||||||
raise NotImplementedError("Subclasses must implement this method")
|
raise NotImplementedError("Subclasses must implement this method")
|
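The `if spots:` guard added to `_submit_batch` above matters because `max()` over an empty sequence raises `ValueError`, so an empty batch would previously have crashed the timestamp update. A standalone sketch of the guarded update (the `latest_spot_time` helper is hypothetical, for illustration only):

```python
from datetime import datetime, timezone

def latest_spot_time(spot_times, previous):
    """Return the newest spot time in the batch as an aware datetime, or the
    previous value unchanged if the batch is empty -- max() over an empty
    sequence would raise ValueError."""
    if not spot_times:
        return previous
    return max(datetime.fromtimestamp(t, timezone.utc) for t in spot_times)

prev = datetime.min.replace(tzinfo=timezone.utc)
print(latest_spot_time([], prev) == prev)                  # True - empty batch leaves it alone
print(latest_spot_time([100.0, 200.0], prev).timestamp())  # 200.0
```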
@@ -10,30 +10,30 @@ from core.constants import HTTP_HEADERS
 from spotproviders.spot_provider import SpotProvider
 
 
-# Spot provider using Server-Sent Events.
 class SSESpotProvider(SpotProvider):
+    """Spot provider using Server-Sent Events."""
 
     def __init__(self, provider_config, url):
         super().__init__(provider_config)
-        self.url = url
-        self.event_source = None
-        self.thread = None
-        self.stopped = False
-        self.last_event_id = None
+        self._url = url
+        self._event_source = None
+        self._thread = None
+        self._stopped = False
+        self._last_event_id = None
 
     def start(self):
         logging.info("Set up SSE connection to " + self.name + " spot API.")
-        self.stopped = False
-        self.thread = Thread(target=self.run)
-        self.thread.daemon = True
-        self.thread.start()
+        self._stopped = False
+        self._thread = Thread(target=self._run)
+        self._thread.daemon = True
+        self._thread.start()
 
     def stop(self):
-        self.stopped = True
-        if self.event_source:
-            self.event_source.close()
-        if self.thread:
-            self.thread.join()
+        self._stopped = True
+        if self._event_source:
+            self._event_source.close()
+        if self._thread:
+            self._thread.join()
 
     def _on_open(self):
         self.status = "Waiting for Data"
@@ -41,37 +41,39 @@ class SSESpotProvider(SpotProvider):
     def _on_error(self):
         self.status = "Connecting"
 
-    def run(self):
-        while not self.stopped:
+    def _run(self):
+        while not self._stopped:
             try:
                 logging.debug("Connecting to " + self.name + " spot API...")
                 self.status = "Connecting"
-                with EventSource(self.url, headers=HTTP_HEADERS, latest_event_id=self.last_event_id, timeout=30,
+                with EventSource(self._url, headers=HTTP_HEADERS, latest_event_id=self._last_event_id, timeout=30,
                                  on_open=self._on_open, on_error=self._on_error) as event_source:
-                    self.event_source = event_source
-                    for event in self.event_source:
+                    self._event_source = event_source
+                    for event in self._event_source:
                         if event.type == 'message':
                             try:
-                                self.last_event_id = event.last_event_id
-                                new_spot = self.sse_message_to_spot(event.data)
+                                self._last_event_id = event.last_event_id
+                                new_spot = self._sse_message_to_spot(event.data)
                                 if new_spot:
-                                    self.submit(new_spot)
+                                    self._submit(new_spot)
 
                                 self.status = "OK"
                                 self.last_update_time = datetime.now(pytz.UTC)
                                 logging.debug("Received data from " + self.name + " spot API.")
 
-                            except Exception as e:
-                                logging.exception("Exception processing message from SSE Spot Provider (" + self.name + ")")
+                            except Exception:
+                                logging.exception(
+                                    "Exception processing message from SSE Spot Provider (" + self.name + ")")
 
-            except Exception as e:
+            except Exception:
                 self.status = "Error"
                 logging.exception("Exception in SSE Spot Provider (" + self.name + ")")
             else:
                 self.status = "Disconnected"
             sleep(5)  # Wait before trying to reconnect
 
-    # Convert an SSE message received from the API into a spot. The whole message data is provided here so the subclass
-    # implementations can handle the message as JSON, XML, text, whatever the API actually provides.
-    def sse_message_to_spot(self, message_data):
+    def _sse_message_to_spot(self, message_data):
+        """Convert an SSE message received from the API into a spot. The whole message data is provided here so the subclass
+        implementations can handle the message as JSON, XML, text, whatever the API actually provides."""
         raise NotImplementedError("Subclasses must implement this method")
||||||
@@ -7,15 +7,16 @@ from data.spot import Spot
 from spotproviders.http_spot_provider import HTTPSpotProvider


-# Spot provider for UK Packet Radio network API
 class UKPacketNet(HTTPSpotProvider):
+    """Spot provider for UK Packet Radio network API"""
+
     POLL_INTERVAL_SEC = 600
     SPOTS_URL = "https://nodes.ukpacketradio.network/api/nodedata"

     def __init__(self, provider_config):
         super().__init__(provider_config, self.SPOTS_URL, self.POLL_INTERVAL_SEC)

-    def http_response_to_spots(self, http_response):
+    def _http_response_to_spots(self, http_response):
         new_spots = []
         # Iterate through source data
         nodes = http_response.json()["nodes"]
@@ -35,20 +36,26 @@ class UKPacketNet(HTTPSpotProvider):
                 # First build a "full" comment combining some of the extra info
                 comment = listed_port["comment"] if "comment" in listed_port else ""
                 comment = (comment + " " + listed_port["mode"]) if "mode" in listed_port else comment
-                comment = (comment + " " + listed_port["modulation"]) if "modulation" in listed_port else comment
-                comment = (comment + " " + str(listed_port["baud"]) + " baud") if "baud" in listed_port and listed_port["baud"] > 0 else comment
+                comment = (comment + " " + listed_port[
+                    "modulation"]) if "modulation" in listed_port else comment
+                comment = (comment + " " + str(
+                    listed_port["baud"]) + " baud") if "baud" in listed_port and listed_port[
+                    "baud"] > 0 else comment

                 # Get frequency from the comment if it's not set properly in the data structure. This is
                 # very hacky but a lot of node comments contain their frequency as the first or second
                 # word of their comment, but not in the proper data structure field.
-                freq = listed_port["freq"] if "freq" in listed_port and listed_port["freq"] > 0 else None
+                freq = listed_port["freq"] if "freq" in listed_port and listed_port[
+                    "freq"] > 0 else None
                 if not freq and comment:
                     possible_freq = comment.split(" ")[0].upper().replace("MHZ", "")
-                    if re.match(r"^[0-9.]+$", possible_freq) and possible_freq != "1200" and possible_freq != "9600":
+                    if re.match(r"^[0-9.]+$",
+                                possible_freq) and possible_freq != "1200" and possible_freq != "9600":
                         freq = float(possible_freq) * 1000000
                 if not freq and len(comment.split(" ")) > 1:
                     possible_freq = comment.split(" ")[1].upper().replace("MHZ", "")
-                    if re.match(r"^[0-9.]+$", possible_freq) and possible_freq != "1200" and possible_freq != "9600":
+                    if re.match(r"^[0-9.]+$",
+                                possible_freq) and possible_freq != "1200" and possible_freq != "9600":
                         freq = float(possible_freq) * 1000000
                 # Check for a found frequency likely having been in kHz, sorry to all GHz packet folks
                 if freq and freq > 1000000000:
@@ -61,8 +68,10 @@ class UKPacketNet(HTTPSpotProvider):
                     freq=freq,
                     mode="PKT",
                     comment=comment,
-                    time=datetime.strptime(heard["lastHeard"], "%Y-%m-%d %H:%M:%S").replace(tzinfo=pytz.UTC).timestamp(),
-                    de_grid=node["location"]["locator"] if "locator" in node["location"] else None,
+                    time=datetime.strptime(heard["lastHeard"], "%Y-%m-%d %H:%M:%S").replace(
+                        tzinfo=pytz.UTC).timestamp(),
+                    de_grid=node["location"]["locator"] if "locator" in node[
+                        "location"] else None,
                     de_latitude=node["location"]["coords"]["lat"],
                     de_longitude=node["location"]["coords"]["lon"])

@@ -77,7 +86,8 @@ class UKPacketNet(HTTPSpotProvider):
         # data, and we can use that to look these up.
         for spot in new_spots:
             if spot.dx_call in nodes:
-                spot.dx_grid = nodes[spot.dx_call]["location"]["locator"] if "locator" in nodes[spot.dx_call]["location"] else None
+                spot.dx_grid = nodes[spot.dx_call]["location"]["locator"] if "locator" in nodes[spot.dx_call][
+                    "location"] else None
                 spot.dx_latitude = nodes[spot.dx_call]["location"]["coords"]["lat"]
                 spot.dx_longitude = nodes[spot.dx_call]["location"]["coords"]["lon"]

@@ -10,30 +10,30 @@ from core.constants import HTTP_HEADERS
 from spotproviders.spot_provider import SpotProvider


-# Spot provider using websockets.
 class WebsocketSpotProvider(SpotProvider):
+    """Spot provider using websockets."""

     def __init__(self, provider_config, url):
         super().__init__(provider_config)
-        self.url = url
-        self.ws = None
-        self.thread = None
-        self.stopped = False
-        self.last_event_id = None
+        self._url = url
+        self._ws = None
+        self._thread = None
+        self._stopped = False
+        self._last_event_id = None

     def start(self):
         logging.info("Set up websocket connection to " + self.name + " spot API.")
-        self.stopped = False
-        self.thread = Thread(target=self.run)
-        self.thread.daemon = True
-        self.thread.start()
+        self._stopped = False
+        self._thread = Thread(target=self._run)
+        self._thread.daemon = True
+        self._thread.start()

     def stop(self):
-        self.stopped = True
-        if self.ws:
-            self.ws.close()
-        if self.thread:
-            self.thread.join()
+        self._stopped = True
+        if self._ws:
+            self._ws.close()
+        if self._thread:
+            self._thread.join()

     def _on_open(self):
         self.status = "Waiting for Data"
@@ -41,26 +41,27 @@ class WebsocketSpotProvider(SpotProvider):
     def _on_error(self):
         self.status = "Connecting"

-    def run(self):
-        while not self.stopped:
+    def _run(self):
+        while not self._stopped:
             try:
                 logging.debug("Connecting to " + self.name + " spot API...")
                 self.status = "Connecting"
-                self.ws = create_connection(self.url, header=HTTP_HEADERS)
+                self._ws = create_connection(self._url, header=HTTP_HEADERS)
                 self.status = "Connected"
-                data = self.ws.recv()
+                data = self._ws.recv()
                 if data:
                     try:
-                        new_spot = self.ws_message_to_spot(data)
+                        new_spot = self._ws_message_to_spot(data)
                         if new_spot:
-                            self.submit(new_spot)
+                            self._submit(new_spot)

                         self.status = "OK"
                         self.last_update_time = datetime.now(pytz.UTC)
                         logging.debug("Received data from " + self.name + " spot API.")

-                    except Exception as e:
-                        logging.exception("Exception processing message from Websocket Spot Provider (" + self.name + ")")
+                    except Exception:
+                        logging.exception(
+                            "Exception processing message from Websocket Spot Provider (" + self.name + ")")

             except Exception as e:
                 self.status = "Error"
@@ -69,7 +70,8 @@ class WebsocketSpotProvider(SpotProvider):
                 self.status = "Disconnected"
             sleep(5)  # Wait before trying to reconnect

-    # Convert a WS message received from the API into a spot. The exact message data (in bytes) is provided here so the
-    # subclass implementations can handle the message as string, JSON, XML, whatever the API actually provides.
-    def ws_message_to_spot(self, bytes):
+    def _ws_message_to_spot(self, b):
+        """Convert a WS message received from the API into a spot. The exact message data (in bytes) is provided here so the
+        subclass implementations can handle the message as string, JSON, XML, whatever the API actually provides."""
         raise NotImplementedError("Subclasses must implement this method")
@@ -3,15 +3,16 @@ import re
 from datetime import datetime

 import pytz
-from rss_parser import RSSParser
+from rss_parser import Parser

 from data.sig_ref import SIGRef
 from data.spot import Spot
 from spotproviders.http_spot_provider import HTTPSpotProvider


-# Spot provider for Wainwrights on the Air
 class WOTA(HTTPSpotProvider):
+    """Spot provider for Wainwrights on the Air"""
+
     POLL_INTERVAL_SEC = 120
     SPOTS_URL = "https://www.wota.org.uk/spots_rss.php"
     LIST_URL = "https://www.wota.org.uk/mapping/data/summits.json"
@@ -20,9 +21,9 @@ class WOTA(HTTPSpotProvider):
     def __init__(self, provider_config):
         super().__init__(provider_config, self.SPOTS_URL, self.POLL_INTERVAL_SEC)

-    def http_response_to_spots(self, http_response):
+    def _http_response_to_spots(self, http_response):
         new_spots = []
-        rss = RSSParser.parse(http_response.content.decode())
+        rss = Parser.parse(http_response.content.decode())
         # Iterate through source data
         for source_spot in rss.channel.items:

@@ -47,6 +48,7 @@ class WOTA(HTTPSpotProvider):
                 freq_mode = desc_split[0].replace("Frequencies/modes:", "").strip()
                 freq_mode_split = re.split(r'[\-\s]+', freq_mode)
                 freq_hz = float(freq_mode_split[0]) * 1000000
+                mode = None
                 if len(freq_mode_split) > 1:
                     mode = freq_mode_split[1].upper()

@@ -6,14 +6,15 @@ from data.spot import Spot
 from spotproviders.sse_spot_provider import SSESpotProvider


-# Spot provider for Worldwide Bunkers on the Air
 class WWBOTA(SSESpotProvider):
+    """Spot provider for Worldwide Bunkers on the Air"""
+
     SPOTS_URL = "https://api.wwbota.net/spots/"

     def __init__(self, provider_config):
         super().__init__(provider_config, self.SPOTS_URL)

-    def sse_message_to_spot(self, message):
+    def _sse_message_to_spot(self, message):
         source_spot = json.loads(message)
         # Convert to our spot format. First we unpack references, because WWBOTA spots can have more than one for
         # n-fer activations.
@@ -7,15 +7,16 @@ from data.spot import Spot
 from spotproviders.http_spot_provider import HTTPSpotProvider


-# Spot provider for Worldwide Flora & Fauna
 class WWFF(HTTPSpotProvider):
+    """Spot provider for Worldwide Flora & Fauna"""
+
     POLL_INTERVAL_SEC = 120
     SPOTS_URL = "https://spots.wwff.co/static/spots.json"

     def __init__(self, provider_config):
         super().__init__(provider_config, self.SPOTS_URL, self.POLL_INTERVAL_SEC)

-    def http_response_to_spots(self, http_response):
+    def _http_response_to_spots(self, http_response):
         new_spots = []
         # Iterate through source data
         for source_spot in http_response.json():
@@ -1,22 +1,22 @@
 from datetime import datetime

 import json
-import pytz

 from data.sig_ref import SIGRef
 from data.spot import Spot
 from spotproviders.http_spot_provider import HTTPSpotProvider


-# Spot provider for Towers on the Air
 class WWTOTA(HTTPSpotProvider):
+    """Spot provider for Towers on the Air"""
+
     POLL_INTERVAL_SEC = 120
     SPOTS_URL = "https://wwtota.com/api/cluster_live.php"

     def __init__(self, provider_config):
         super().__init__(provider_config, self.SPOTS_URL, self.POLL_INTERVAL_SEC)

-    def http_response_to_spots(self, http_response):
+    def _http_response_to_spots(self, http_response):
         new_spots = []
         response_fixed = http_response.text.replace("\\/", "/")
         response_json = json.loads(response_fixed)
@@ -33,7 +33,8 @@ class WWTOTA(HTTPSpotProvider):
                 comment=source_spot["comment"],
                 sig="WWTOTA",
                 sig_refs=[SIGRef(id=source_spot["ref"], sig="WWTOTA")],
-                time=datetime.strptime(response_json["updated"][:10] + source_spot["time"], "%Y-%m-%d%H:%M").timestamp())
+                time=datetime.strptime(response_json["updated"][:10] + source_spot["time"],
+                                       "%Y-%m-%d%H:%M").timestamp())

             # Add to our list. Don't worry about de-duping, removing old spots etc. at this point; other code will do
             # that for us.

@@ -10,12 +10,13 @@ from data.spot import Spot
 from spotproviders.websocket_spot_provider import WebsocketSpotProvider


-# Spot provider for servers based on the "xOTA" software at https://github.com/nischu/xOTA/
-# The provider typically doesn't give us a lat/lon or SIG explicitly, so our own config provides a SIG and a reference
-# to a local CSV file with location information. This functionality is implemented for TOTA events, of which there are
-# several - so a plain lookup of a "TOTA reference" doesn't make sense, it depends on which TOTA and hence which server
-# supplied the data, which is why the CSV location lookup is here and not in sig_utils.
 class XOTA(WebsocketSpotProvider):
+    """Spot provider for servers based on the "xOTA" software at https://github.com/nischu/xOTA/
+
+    The provider typically doesn't give us a lat/lon or SIG explicitly, so our own config provides a SIG and a reference
+    to a local CSV file with location information. This functionality is implemented for TOTA events, of which there are
+    several - so a plain lookup of a "TOTA reference" doesn't make sense, it depends on which TOTA and hence which server
+    supplied the data, which is why the CSV location lookup is here and not in sig_utils."""
+
     LOCATION_DATA = {}
     SIG = None

@@ -35,8 +36,8 @@ class XOTA(WebsocketSpotProvider):
         except:
             logging.exception("Could not look up location data for XOTA source.")

-    def ws_message_to_spot(self, bytes):
-        string = bytes.decode("utf-8")
+    def _ws_message_to_spot(self, b):
+        string = b.decode("utf-8")
         source_spot = json.loads(string)
         ref_id = source_spot["reference"]["title"]
         lat = float(self.LOCATION_DATA[ref_id]["lat"]) if ref_id in self.LOCATION_DATA else None
@@ -47,7 +48,8 @@ class XOTA(WebsocketSpotProvider):
             freq=float(source_spot["freq"]) * 1000,
             mode=source_spot["mode"].upper(),
             sig=self.SIG,
-            sig_refs=[SIGRef(id=ref_id, sig=self.SIG, url=source_spot["reference"]["website"], latitude=lat, longitude=lon)],
+            sig_refs=[SIGRef(id=ref_id, sig=self.SIG, url=source_spot["reference"]["website"], latitude=lat,
+                             longitude=lon)],
             time=datetime.now(pytz.UTC).timestamp(),
             dx_latitude=lat,
             dx_longitude=lon,
@@ -7,8 +7,9 @@ from data.spot import Spot
 from spotproviders.http_spot_provider import HTTPSpotProvider


-# Spot provider for ZLOTA
 class ZLOTA(HTTPSpotProvider):
+    """Spot provider for ZLOTA"""
+
     POLL_INTERVAL_SEC = 120
     SPOTS_URL = "https://ontheair.nz/api/spots?zlota_only=true"
     LIST_URL = "https://ontheair.nz/assets/assets.json"
@@ -16,7 +17,7 @@ class ZLOTA(HTTPSpotProvider):
     def __init__(self, provider_config):
         super().__init__(provider_config, self.SPOTS_URL, self.POLL_INTERVAL_SEC)

-    def http_response_to_spots(self, http_response):
+    def _http_response_to_spots(self, http_response):
         new_spots = []
         # Iterate through source data
         for source_spot in http_response.json():
@@ -35,7 +36,8 @@ class ZLOTA(HTTPSpotProvider):
                 comment=source_spot["comments"],
                 sig="ZLOTA",
                 sig_refs=[SIGRef(id=source_spot["reference"], sig="ZLOTA", name=source_spot["name"])],
-                time=datetime.fromisoformat(source_spot["referenced_time"].replace("Z", "+00:00")).astimezone(pytz.UTC).timestamp())
+                time=datetime.fromisoformat(source_spot["referenced_time"].replace("Z", "+00:00")).astimezone(
+                    pytz.UTC).timestamp())

             new_spots.append(spot)
         return new_spots
@@ -27,6 +27,7 @@
         <h4 class="mt-4">What data sources are supported?</h4>
         <p>Spothole can retrieve spots from: <a href="https://www.dxcluster.info/telnet/">Telnet-based DX clusters</a>, the <a href="https://www.reversebeacon.net/">Reverse Beacon Network (RBN)</a>, the <a href="https://www.aprs-is.net/">APRS Internet Service (APRS-IS)</a>, <a href="https://pota.app">POTA</a>, <a href="https://www.sota.org.uk/">SOTA</a>, <a href="https://wwff.co/">WWFF</a>, <a href="https://www.cqgma.org/">GMA</a>, <a href="https://wwbota.net/">WWBOTA</a>, <a href="http://www.hema.org.uk/">HEMA</a>, <a href="https://www.parksnpeaks.org/">Parks 'n' Peaks</a>, <a href="https://ontheair.nz">ZLOTA</a>, <a href="https://www.wota.org.uk/">WOTA</a>, <a href="https://llota.app">LLOTA</a>, <a href="https://wwtota.com">WWTOTA</a>, the <a href="https://ukpacketradio.network/">UK Packet Repeater Network</a>, and any site based on the <a href="https://github.com/nischu/xOTA">xOTA software by nischu</a>.</p>
         <p>Spothole can retrieve alerts from: <a href="https://www.ng3k.com/">NG3K</a>, <a href="https://pota.app">POTA</a>, <a href="https://www.sota.org.uk/">SOTA</a>, <a href="https://wwff.co/">WWFF</a>, <a href="https://www.parksnpeaks.org/">Parks 'n' Peaks</a>, <a href="https://www.wota.org.uk/">WOTA</a> and <a href="https://www.beachesontheair.com/">BOTA</a>.</p>
+        <p>Spothole can retrieve solar and propagation condition data from <a href="https://www.hamqsl.com">HamQSL</a>.</p>
         <p>Note that the server owner has not necessarily enabled all these data sources. In particular it is common to disable RBN, to avoid the server being swamped with FT8 traffic, and to disable APRS-IS and UK Packet Net so that the server only displays stations where there is likely to be an operator physically present for a QSO.</p>
         <p>Between the various data sources, the following Special Interest Groups (SIGs) are supported: Parks on the Air (POTA), Summits on the Air (SOTA), Worldwide Flora & Fauna (WWFF), Global Mountain Activity (GMA), Worldwide Bunkers on the Air (WWBOTA), HuMPs Excluding Marilyns Award (HEMA), Islands on the Air (IOTA), Mills on the Air (MOTA), the Amateur Radio Lighthouse Socirty (ARLHS), International Lighthouse Lightship Weekend (ILLW), Silos on the Air (SIOTA), World Castles Award (WCA), New Zealand on the Air (ZLOTA), Keith Roget Memorial National Parks Award (KRMNPA), Wainwrights on the Air (WOTA), Beaches on the Air (BOTA), Lagos y Lagunas On the Air (LLOTA), Towers on the Air (WWTOTA), Worked All Britain (WAB), Worked All Ireland (WAI), and Toilets on the Air (TOTA).</p>
         <p>As of the time of writing in November 2025, I think Spothole captures essentially all outdoor radio programmes that have a defined reference list, and almost certainly those that have a spotting/alerting API. If you know of one I've missed, please let me know!</p>
@@ -56,14 +57,17 @@
         <p>Spothole collects no data about you, and there is no way to enter personally identifying information into the site apart from by spotting and alerting through Spothole or the various services it connects to. All spots and alerts are "timed out" and deleted from the system after a set interval, which by default is one hour for spots and one week for alerts.</p>
         <p>Settings you select from Spothole's menus are sent to the server, in order to provide the data with the requested filters. They are also stored in your browser's local storage, so that your preferences are remembered between sessions.</p>
         <p>There are no trackers, no ads, and no cookies.</p>
+        {% if len(web_ui_options["support-button-html"]) > 0 %}
+        <p><strong>Caveat: </strong> The owner of this server has chosen to inject their own content into the "spots" page. This is designed for a "donate" or "support this server" button. The functionality of this injected content is the responsibility of the server owner, rather than the Spothole software.</p>
+        {% end %}
         <p>Spothole is open source, so you can audit <a href="https://git.ianrenton.com/ian/spothole">the code</a> if you like.</p>
         <h2 class="mt-4">Thanks</h2>
-        <p>This project would not have been possible without those volunteers who have taken it upon themselves to run DX clusters, xOTA programmes, DXpedition lists, callsign lookup databases, and other online tools on which Spothole's data is based.</p>
-        <p>Spothole is also dependent on a number of Python libraries, in particular pyhamtools, and many JavaScript libraries, as well as the Font Awesome icon set and flag icons from the Noto Color Emoji set.</p>
+        <p>This project would not have been possible without those volunteers who have taken it upon themselves to run DX clusters, xOTA programmes, DXpedition lists, callsign lookup databases, solar conditions and propagation modelling software, and other online tools on which Spothole's data is based. The vast majority of these are not profit-seeking and are made purely for the love of the hobby and to help others in the community. Spothole is standing on the shoulders of giants, who deserve a huge amount of thanks for all the work they put in.</p>
+        <p>Spothole is also dependent on a number of Python libraries, in particular pyhamtools, and many JavaScript libraries, as well as the Font Awesome icon set and flag icons from the Noto Color Emoji set, and MIT-licenced GeoJSON files for CQ and ITU zones from HA8TKS.</p>
         <p>This software is dedicated to the memory of Tom G1PJB, SK, a friend and colleague who sadly passed away around the time I started writing it in Autumn 2025. I was looking forward to showing it to you when it was done.</p>
     </div>

-    <script src="/js/common.js?v=6"></script>
+    <script src="/js/common.js?v=1775203458"></script>
     <script>$(document).ready(function() { $("#nav-link-about").addClass("active"); }); <!-- highlight active page in nav --></script>

 {% end %}
@@ -10,7 +10,7 @@

     <div class="mt-3">
         <div id="add-spot-area" class="card mb-3">
-            <div class="card-header text-white bg-primary">
+            <div class="card-header">
                 <div class="row">
                     <div class="col-auto me-auto">
                         Add a Spot
@@ -69,8 +69,8 @@

     </div>

-    <script src="/js/common.js?v=6"></script>
-    <script src="/js/add-spot.js?v=6"></script>
+    <script src="/js/common.js?v=1775203458"></script>
+    <script src="/js/add-spot.js?v=1775203458"></script>
     <script>$(document).ready(function() { $("#nav-link-add-spot").addClass("active"); }); <!-- highlight active page in nav --></script>

 {% end %}
@@ -2,176 +2,62 @@
 {% block content %}

 <div class="mt-3">
-<div id="settingsButtonRow" class="row">
+<div id="settingsButtonRow" class="row mb-3">
 <div class="col-auto me-auto pt-3">
-<p id="timing-container">Loading...</p>
+{% module Template("widgets/refresh-timer.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col-auto">
-<p class="d-inline-flex gap-1">
+<div class="d-inline-flex gap-1">
-<button id="filters-button" type="button" class="btn btn-outline-primary" data-bs-toggle="button" onclick="toggleFiltersPanel();"><i class="fa-solid fa-filter"></i> Filters</button>
+{% module Template("widgets/filters-display-buttons.html", web_ui_options=web_ui_options) %}
-<button id="display-button" type="button" class="btn btn-outline-primary" data-bs-toggle="button" onclick="toggleDisplayPanel();"><i class="fa-solid fa-desktop"></i> Display</button>
+</div>
-</p>
 </div>
 </div>

 <div id="filters-area" class="appearing-panel card mb-3">
-<div class="card-header text-white bg-primary">
+{% module Template("widgets/filters-area-header.html", web_ui_options=web_ui_options) %}
-<div class="row">
-<div class="col-auto me-auto">
-Filters
-</div>
-<div class="col-auto d-inline-flex">
-<button id="close-filters-button" type="button" class="btn-close btn-close-white" aria-label="Close" onclick="closeFiltersPanel();"></button>
-</div>
-</div>
-
-</div>
 <div class="card-body">
 <div class="row row-cols-1 row-cols-md-3 g-4">
 <div class="col">
-<div class="card">
+{% module Template("cards/dx-continent.html", web_ui_options=web_ui_options) %}
-<div class="card-body">
-<h5 class="card-title">DX Continent</h5>
-<p id="dx-continent-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
 </div>
 <div class="col">
-<div class="card">
+{% module Template("cards/sources.html", web_ui_options=web_ui_options) %}
-<div class="card-body">
-<h5 class="card-title">Sources</h5>
-<p id="source-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
 </div>
 <div class="col">
-<div class="card">
+{% module Template("cards/duration-limit-alerts.html", web_ui_options=web_ui_options) %}
-<div class="card-body">
-<h5 class="card-title">Duration Limit <i class='fa-solid fa-circle-question' title='Some users create long-duration alerts for the period they will be generally in and around xOTA references, when they are not indending to be on the air most of the time. Use this control to restrict the maximum duration of spots that the software will display, and exclude any with a long duration, to avoid these filling up the list. By default, we allow DXpeditions to be displayed even if they are longer than this limit, because on a DXpedition the operators typically ARE on the air most of the time.'></i></h5>
-<p class="card-text spothole-card-text">
-Hide any alerts lasting more than:<br/>
-<select id="max-duration" class="storeable-select form-select" onclick="filtersUpdated();" style="width: 8em; display: inline-block;">
-<option value="10800">3 hours</option>
-<option value="43200">12 hours</option>
-<option value="86400" selected>24 hours</option>
-<option value="604800">1 week</option>
-<option value="2419200">4 weeks</option>
-<option value="9999999999">No limit</option>
-</select>
-</p>
-<p class='card-text spothole-card-text' style='line-height: 1.5em !important;'>
-<input class="form-check-input storeable-checkbox" type="checkbox" value="" onclick="filtersUpdated();" id="dxpeditions_skip_max_duration_check" checked><label class="form-check-label ms-2" for="dxpeditions_skip_max_duration_check">Allow DXpeditions that are longer</label>
-</p>
-</div>
-</div>
 </div>
 </div>
 </div>
 </div>

 <div id="display-area" class="appearing-panel card mb-3">
-<div class="card-header text-white bg-primary">
+{% module Template("widgets/display-area-header.html", web_ui_options=web_ui_options) %}
-<div class="row">
-<div class="col-auto me-auto">
-Display
-</div>
-<div class="col-auto d-inline-flex">
-<button id="close-display-button" type="button" class="btn-close btn-close-white" aria-label="Close" onclick="closeDisplayPanel();"></button>
-</div>
-</div>
-
-</div>
 <div class="card-body">
 <div id="display-container" class="row row-cols-1 row-cols-md-3 g-4">
 <div class="col">
-<div class="card">
+{% module Template("cards/time-zone.html", web_ui_options=web_ui_options) %}
-<div class="card-body">
-<h5 class="card-title">Time Zone</h5>
-<p class="card-text spothole-card-text"> Use
-<select id="timeZone" class="storeable-select form-select ms-2 me-2 d-inline-block" oninput="timeZoneUpdated();" style="width: 8em; display: inline-block;">
-<option value="UTC" selected>UTC</option>
-<option value="local">Local time</option>
-</select>
-</p>
-</div>
-</div>
 </div>
 <div class="col">
-<div class="card">
+{% module Template("cards/number-of-alerts.html", web_ui_options=web_ui_options) %}
-<div class="card-body">
-<h5 class="card-title">Number of Alerts</h5>
-<p class="card-text spothole-card-text">Show up to
-<select id="alerts-to-fetch" class="storeable-select form-select ms-2" oninput="filtersUpdated();" style="width: 5em;display: inline-block;">
-</select>
-alerts
-</p>
-</div>
-</div>
 </div>
 <div class="col">
-<div class="card">
+{% module Template("cards/color-scheme.html", web_ui_options=web_ui_options) %}
-<div class="card-body">
-<h5 class="card-title">Theme</h5>
-<p class="card-text spothole-card-text">
-<label class="form-check-label" for="color-scheme">UI color scheme</label>
-<select id="color-scheme" class="storeable-select form-select d-inline-block" oninput="setColorSchemeFromUI();" style="display: inline-block;">
-<option value="auto" selected>Automatic</option>
-<option value="light">Light</option>
-<option value="dark">Dark</option>
-</select>
-</p>
-</div>
-</div>
 </div>
 <div class="col">
-<div class="card">
+{% module Template("cards/table-columns-alerts.html", web_ui_options=web_ui_options) %}
-<div class="card-body">
-<h5 class="card-title">Table Data</h5>
-<div class="form-group">
-<div class="form-check form-check-inline">
-<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowStartTime" value="tableShowStartTime" oninput="columnsUpdated();" checked>
-<label class="form-check-label" for="tableShowStartTime">Start Time</label>
-</div>
-<div class="form-check form-check-inline">
-<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowEndTime" value="tableShowEndTime" oninput="columnsUpdated();" checked>
-<label class="form-check-label" for="tableShowEndTime">End Time</label>
-</div>
-<div class="form-check form-check-inline">
-<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowDX" value="tableShowDX" oninput="columnsUpdated();" checked>
-<label class="form-check-label" for="tableShowDX">DX</label>
-</div>
-<div class="form-check form-check-inline">
-<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowFreqsModes" value="tableShowFreqsModes" oninput="columnsUpdated();" checked>
-<label class="form-check-label" for="tableShowFreqsModes">Frequencies & Modes</label>
-</div>
-<div class="form-check form-check-inline">
-<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowComment" value="tableShowComment" oninput="columnsUpdated();" checked>
-<label class="form-check-label" for="tableShowComment">Comment</label>
-</div>
-<div class="form-check form-check-inline">
-<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowSource" value="tableShowSource" oninput="columnsUpdated();" checked>
-<label class="form-check-label" for="tableShowSource">Source</label>
-</div>
-<div class="form-check form-check-inline">
-<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowRef" value="tableShowRef" oninput="columnsUpdated();" checked>
-<label class="form-check-label" for="tableShowRef">Ref.</label>
-</div>
-</div>
-</div>
-</div>
 </div>
 </div>
 </div>
 </div>

 <div id="table-container">
-<table id="table" class="table"><thead><tr class="table-primary"></tr></thead><tbody></tbody></table>
+<table id="table" class="table"><thead><tr></tr></thead><tbody></tbody></table>
 </div>

 </div>

-<script src="/js/common.js?v=6"></script>
+<script src="/js/common.js?v=1775203458"></script>
-<script src="/js/alerts.js?v=6"></script>
+<script src="/js/alerts.js?v=1775203458"></script>
 <script>$(document).ready(function() { $("#nav-link-alerts").addClass("active"); }); <!-- highlight active page in nav --></script>

 {% end %}
@@ -2,131 +2,54 @@
 {% block content %}

 <div class="mt-3">
-<div id="settingsButtonRow" class="row">
+<div id="settingsButtonRow" class="row mb-3">
 <div class="col-auto me-auto pt-3">
-<p id="timing-container">Loading...</p>
+{% module Template("widgets/refresh-timer.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col-auto">
-<p class="d-inline-flex gap-1">
+<div class="d-inline-flex gap-1">
-<button id="filters-button" type="button" class="btn btn-outline-primary" data-bs-toggle="button" onclick="toggleFiltersPanel();"><i class="fa-solid fa-filter"></i> Filters</button>
+{% module Template("widgets/filters-display-buttons.html", web_ui_options=web_ui_options) %}
-<button id="display-button" type="button" class="btn btn-outline-primary" data-bs-toggle="button" onclick="toggleDisplayPanel();"><i class="fa-solid fa-desktop"></i> Display</button>
+</div>
-</p>
 </div>
 </div>

 <div id="filters-area" class="appearing-panel card mb-3">
-<div class="card-header text-white bg-primary">
+{% module Template("widgets/filters-area-header.html", web_ui_options=web_ui_options) %}
-<div class="row">
-<div class="col-auto me-auto">
-Filters
-</div>
-<div class="col-auto d-inline-flex">
-<button id="close-filters-button" type="button" class="btn-close btn-close-white" aria-label="Close" onclick="closeFiltersPanel();"></button>
-</div>
-</div>
-
-</div>
 <div class="card-body">
 <div class="row row-cols-1 g-4 mb-4 row-cols-md-3">
 <div class="col">
-<div class="card">
+{% module Template("cards/bands.html", web_ui_options=web_ui_options) %}
-<div class="card-body">
-<h5 class="card-title">Bands</h5>
-<p id="band-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
 </div>
 <div class="col">
-<div class="card">
+{% module Template("cards/sigs.html", web_ui_options=web_ui_options) %}
-<div class="card-body">
-<h5 class="card-title">SIGs</h5>
-<p id="sig-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
 </div>
 <div class="col">
-<div class="card">
+{% module Template("cards/sources.html", web_ui_options=web_ui_options) %}
-<div class="card-body">
-<h5 class="card-title">Sources</h5>
-<p id="source-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
 </div>
 </div>
 <div class="row row-cols-1 row-cols-md-3 g-4">
 <div class="col">
-<div class="card">
+{% module Template("cards/dx-continent.html", web_ui_options=web_ui_options) %}
-<div class="card-body">
-<h5 class="card-title">DX Continent</h5>
-<p id="dx-continent-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
 </div>
 <div class="col">
-<div class="card">
+{% module Template("cards/de-continent.html", web_ui_options=web_ui_options) %}
-<div class="card-body">
-<h5 class="card-title">DE Continent</h5>
-<p id="de-continent-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
 </div>
 <div class="col">
-<div class="card">
+{% module Template("cards/modes.html", web_ui_options=web_ui_options) %}
-<div class="card-body">
-<h5 class="card-title">Modes</h5>
-<p id="mode-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
 </div>
 </div>
 </div>
 </div>

 <div id="display-area" class="appearing-panel card mb-3">
-<div class="card-header text-white bg-primary">
+{% module Template("widgets/display-area-header.html", web_ui_options=web_ui_options) %}
-<div class="row">
-<div class="col-auto me-auto">
-Display
-</div>
-<div class="col-auto d-inline-flex">
-<button id="close-display-button" type="button" class="btn-close btn-close-white" aria-label="Close" onclick="closeDisplayPanel();"></button>
-</div>
-</div>
-
-</div>
 <div class="card-body">
 <div id="display-container" class="row row-cols-1 row-cols-md-4 g-4">
 <div class="col">
-<div class="card">
+{% module Template("cards/spot-age.html", web_ui_options=web_ui_options) %}
-<div class="card-body">
-<h5 class="card-title">Spot Age</h5>
-<p class="card-text spothole-card-text">Last
-<select id="max-spot-age" class="storeable-select form-select ms-2 me-2 d-inline-block" oninput="filtersUpdated();" style="width: 5em; display: inline-block;">
-</select>
-minutes
-</p>
-</div>
-</div>
 </div>
 <div class="col">
-<div class="card">
+{% module Template("cards/color-scheme-and-band-color-scheme.html", web_ui_options=web_ui_options) %}
-<div class="card-body">
-<h5 class="card-title">Theme</h5>
-<p class="card-text spothole-card-text">
-<label class="form-check-label" for="color-scheme">UI color scheme</label>
-<select id="color-scheme" class="storeable-select form-select d-inline-block" oninput="setColorSchemeFromUI();" style="display: inline-block;">
-<option value="auto" selected>Automatic</option>
-<option value="light">Light</option>
-<option value="dark">Dark</option>
-</select>
-</p>
-<p class="card-text spothole-card-text">
-<label class="form-check-label" for="band-color-scheme">Band color scheme</label><br/>
-<select id="band-color-scheme" class="storeable-select form-select d-inline-block" oninput="setBandColorSchemeFromUI();" style="display: inline-block;">
-</select>
-</p>
-</div>
-</div>
 </div>
 </div>
 </div>
@@ -136,9 +59,12 @@

 </div>

-<script src="/js/common.js?v=6"></script>
+<script>
-<script src="/js/spotsbandsandmap.js?v=6"></script>
+let spotProvidersEnabledByDefault = {% raw json_encode(web_ui_options["spot-providers-enabled-by-default"]) %};
-<script src="/js/bands.js?v=6"></script>
+</script>
+<script src="/js/common.js?v=1775203458"></script>
+<script src="/js/spotsbandsandmap.js?v=1775203458"></script>
+<script src="/js/bands.js?v=1775203458"></script>
 <script>$(document).ready(function() { $("#nav-link-bands").addClass("active"); }); <!-- highlight active page in nav --></script>

 {% end %}
@@ -46,10 +46,10 @@
 crossorigin="anonymous"></script>
 <script src="https://cdn.jsdelivr.net/npm/tinycolor2@1.6.0/cjs/tinycolor.min.js"></script>

-<script src="https://misc.ianrenton.com/jsutils/utils.js?v=6"></script>
+<script src="https://misc.ianrenton.com/jsutils/utils.js?v=1775203458"></script>
-<script src="https://misc.ianrenton.com/jsutils/storage.js?v=6"></script>
+<script src="https://misc.ianrenton.com/jsutils/storage.js?v=1775203458"></script>
-<script src="https://misc.ianrenton.com/jsutils/ui-ham.js?v=6"></script>
+<script src="https://misc.ianrenton.com/jsutils/ui-ham.js?v=1775203458"></script>
-<script src="https://misc.ianrenton.com/jsutils/geo.js?v=6"></script>
+<script src="https://misc.ianrenton.com/jsutils/geo.js?v=1775203458"></script>

 </head>
 <body>
@@ -67,10 +67,11 @@
 <li class="nav-item ms-4"><a href="/" class="nav-link" id="nav-link-spots"><i class="fa-solid fa-tower-cell"></i> Spots</a></li>
 <li class="nav-item ms-4"><a href="/map" class="nav-link" id="nav-link-map"><i class="fa-solid fa-map"></i> Map</a></li>
 <li class="nav-item ms-4"><a href="/bands" class="nav-link" id="nav-link-bands"><i class="fa-solid fa-ruler-vertical"></i> Bands</a></li>
-<li class="nav-item ms-4"><a href="/alerts" class="nav-link" id="nav-link-alerts"><i class="fa-solid fa-bell"></i> Alerts</a></li>
+<li class="nav-item ms-4"><a href="/alerts" class="nav-link" id="nav-link-alerts"><i class="fa-solid fa-clock"></i> Upcoming</a></li>
 {% if allow_spotting %}
 <li class="nav-item ms-4"><a href="/add-spot" class="nav-link" id="nav-link-add-spot"><i class="fa-solid fa-comment"></i> Add Spot</a></li>
 {% end %}
+<li class="nav-item ms-4"><a href="/conditions" class="nav-link" id="nav-link-conditions"><i class="fa-solid fa-sun"></i> Conditions</a></li>
 <li class="nav-item ms-4"><a href="/status" class="nav-link" id="nav-link-status"><i class="fa-solid fa-chart-simple"></i> Status</a></li>
 <li class="nav-item ms-4"><a href="/about" class="nav-link" id="nav-link-about"><i class="fa-solid fa-circle-info"></i> About</a></li>
 <li class="nav-item ms-4"><a href="/apidocs" class="nav-link" id="nav-link-api"><i class="fa-solid fa-gear"></i> API</a></li>
6
templates/cards/bands.html
Normal file
6
templates/cards/bands.html
Normal file
@@ -0,0 +1,6 @@
|
|||||||
|
<div class="card">
|
||||||
|
<div class="card-body">
|
||||||
|
<h5 class="card-title">Bands</h5>
|
||||||
|
<p id="band-options" class="card-text spothole-card-text"></p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
11
templates/cards/color-scheme-and-band-color-scheme.html
Normal file
11
templates/cards/color-scheme-and-band-color-scheme.html
Normal file
@@ -0,0 +1,11 @@
|
|||||||
|
<div class="card">
|
||||||
|
<div class="card-body">
|
||||||
|
<h5 class="card-title">Theme</h5>
|
||||||
|
<p class="card-text spothole-card-text">
|
||||||
|
{% module Template("widgets/color-scheme.html", web_ui_options=web_ui_options) %}
|
||||||
|
</p>
|
||||||
|
<p class="card-text spothole-card-text">
|
||||||
|
{% module Template("widgets/band-color-scheme.html", web_ui_options=web_ui_options) %}
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
8
templates/cards/color-scheme.html
Normal file
8
templates/cards/color-scheme.html
Normal file
@@ -0,0 +1,8 @@
|
|||||||
|
<div class="card">
|
||||||
|
<div class="card-body">
|
||||||
|
<h5 class="card-title">Theme</h5>
|
||||||
|
<p class="card-text spothole-card-text">
|
||||||
|
{% module Template("widgets/color-scheme.html", web_ui_options=web_ui_options) %}
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
6
templates/cards/de-continent.html
Normal file
6
templates/cards/de-continent.html
Normal file
@@ -0,0 +1,6 @@
|
|||||||
|
<div class="card">
|
||||||
|
<div class="card-body">
|
||||||
|
<h5 class="card-title">DE Continent</h5>
|
||||||
|
<p id="de-continent-options" class="card-text spothole-card-text"></p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
19
templates/cards/duration-limit-alerts.html
Normal file
19
templates/cards/duration-limit-alerts.html
Normal file
@@ -0,0 +1,19 @@
|
|||||||
|
<div class="card">
|
||||||
|
<div class="card-body">
|
||||||
|
<h5 class="card-title">Duration Limit <i class='fa-solid fa-circle-question' title='Some users create long-duration alerts for the period they will be generally in and around xOTA references, when they are not indending to be on the air most of the time. Use this control to restrict the maximum duration of spots that the software will display, and exclude any with a long duration, to avoid these filling up the list. By default, we allow DXpeditions to be displayed even if they are longer than this limit, because on a DXpedition the operators typically ARE on the air most of the time.'></i></h5>
|
||||||
|
<p class="card-text spothole-card-text">
|
||||||
|
Hide any alerts lasting more than:<br/>
|
||||||
|
<select id="max-duration" class="storeable-select form-select" onclick="filtersUpdated();" style="width: 8em; display: inline-block;">
|
||||||
|
<option value="10800">3 hours</option>
|
||||||
|
<option value="43200">12 hours</option>
|
||||||
|
<option value="86400" selected>24 hours</option>
|
||||||
|
<option value="604800">1 week</option>
|
||||||
|
<option value="2419200">4 weeks</option>
|
||||||
|
<option value="9999999999">No limit</option>
|
||||||
|
</select>
|
||||||
|
</p>
|
||||||
|
<p class='card-text spothole-card-text' style='line-height: 1.5em !important;'>
|
||||||
|
<input class="form-check-input storeable-checkbox" type="checkbox" value="" onclick="filtersUpdated();" id="dxpeditions_skip_max_duration_check" checked><label class="form-check-label ms-2" for="dxpeditions_skip_max_duration_check">Allow DXpeditions that are longer</label>
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
6
templates/cards/dx-continent.html
Normal file
6
templates/cards/dx-continent.html
Normal file
@@ -0,0 +1,6 @@
|
|||||||
|
<div class="card">
|
||||||
|
<div class="card-body">
|
||||||
|
<h5 class="card-title">DX Continent</h5>
|
||||||
|
<p id="dx-continent-options" class="card-text spothole-card-text"></p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
9
templates/cards/location.html
Normal file
9
templates/cards/location.html
Normal file
@@ -0,0 +1,9 @@
|
|||||||
|
<div class="card">
|
||||||
|
<div class="card-body">
|
||||||
|
<h5 class="card-title">Location</h5>
|
||||||
|
<div class="form-group spothole-card-text">
|
||||||
|
<label for="userGrid">Your grid:</label>
|
||||||
|
<input type="text" class="storeable-text form-control" id="userGrid" placeholder="AA00aa" oninput="userGridUpdated();" style="width: 10em; display: inline-block;">
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
11
templates/cards/map-features.html
Normal file
11
templates/cards/map-features.html
Normal file
@@ -0,0 +1,11 @@
|
|||||||
|
<div class="card">
|
||||||
|
<div class="card-body">
|
||||||
|
<h5 class="card-title">Map Features</h5>
|
||||||
|
<div class="form-group">
|
||||||
|
<div class="form-check form-check-inline">
|
||||||
|
<input class="form-check-input storeable-checkbox" type="checkbox" id="mapShowGeodesics" value="mapShowGeodesics" oninput="displayUpdated();">
|
||||||
|
<label class="form-check-label" for="mapShowGeodesics">Geodesic Lines</label>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
6
templates/cards/modes.html
Normal file
6
templates/cards/modes.html
Normal file
@@ -0,0 +1,6 @@
|
|||||||
|
<div class="card">
|
||||||
|
<div class="card-body">
|
||||||
|
<h5 class="card-title">Modes</h5>
|
||||||
|
<p id="mode-options" class="card-text spothole-card-text"></p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
13
templates/cards/number-of-alerts.html
Normal file
13
templates/cards/number-of-alerts.html
Normal file
@@ -0,0 +1,13 @@
|
|||||||
|
<div class="card">
|
||||||
|
<div class="card-body">
|
||||||
|
<h5 class="card-title">Number of Alerts</h5>
|
||||||
|
<p class="card-text spothole-card-text">Show up to
|
||||||
|
<select id="alerts-to-fetch" class="storeable-select form-select ms-2" oninput="filtersUpdated();" style="width: 5em;display: inline-block;">
|
||||||
|
{% for c in web_ui_options["alert-count"] %}
|
||||||
|
<option value="{{c}}" {% if web_ui_options["alert-count-default"] == c %}selected{% end %}>{{c}}</option>
|
||||||
|
{% end %}
|
||||||
|
</select>
|
||||||
|
alerts
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
13
templates/cards/number-of-spots.html
Normal file
13
templates/cards/number-of-spots.html
Normal file
@@ -0,0 +1,13 @@
|
|||||||
|
<div class="card">
|
||||||
|
<div class="card-body">
|
||||||
|
<h5 class="card-title">Number of Spots</h5>
|
||||||
|
<p class="card-text spothole-card-text">Show up to
|
||||||
|
<select id="spots-to-fetch" class="storeable-select form-select ms-2 me-2 d-inline-block" oninput="filtersUpdated();" style="width: 5em; display: inline-block;">
|
||||||
|
{% for c in web_ui_options["spot-count"] %}
|
||||||
|
<option value="{{c}}" {% if web_ui_options["spot-count-default"] == c %}selected{% end %}>{{c}}</option>
|
||||||
|
{% end %}
|
||||||
|
</select>
|
||||||
|
spots
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
6
templates/cards/sigs.html
Normal file
6
templates/cards/sigs.html
Normal file
@@ -0,0 +1,6 @@
|
|||||||
|
<div class="card">
|
||||||
|
<div class="card-body">
|
||||||
|
<h5 class="card-title">SIGs</h5>
|
||||||
|
<p id="sig-options" class="card-text spothole-card-text"></p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
6
templates/cards/sources.html
Normal file
6
templates/cards/sources.html
Normal file
@@ -0,0 +1,6 @@
|
|||||||
|
<div class="card">
|
||||||
|
<div class="card-body">
|
||||||
|
<h5 class="card-title">Sources</h5>
|
||||||
|
<p id="source-options" class="card-text spothole-card-text"></p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
13
templates/cards/spot-age.html
Normal file
13
templates/cards/spot-age.html
Normal file
@@ -0,0 +1,13 @@
|
|||||||
|
<div class="card">
|
||||||
|
<div class="card-body">
|
||||||
|
<h5 class="card-title">Spot Age</h5>
|
||||||
|
<p class="card-text spothole-card-text">Last
|
||||||
|
<select id="max-spot-age" class="storeable-select form-select ms-2 me-2 d-inline-block" oninput="filtersUpdated();" style="width: 5em; display: inline-block;">
|
||||||
|
{% for a in web_ui_options["max-spot-age"] %}
|
||||||
|
<option value="{{a*60}}" {% if web_ui_options["max-spot-age-default"] == a*60 %}selected{% end %}>{{a}}</option>
|
||||||
|
{% end %}
|
||||||
|
</select>
|
||||||
|
minutes
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
35
templates/cards/table-columns-alerts.html
Normal file
@@ -0,0 +1,35 @@
<div class="card">
<div class="card-body">
<h5 class="card-title">Table Columns</h5>
<div class="form-group">
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowStartTime" value="tableShowStartTime" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowStartTime">Start Time</label>
</div>
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowEndTime" value="tableShowEndTime" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowEndTime">End Time</label>
</div>
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowDX" value="tableShowDX" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowDX">DX</label>
</div>
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowFreqsModes" value="tableShowFreqsModes" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowFreqsModes">Frequencies & Modes</label>
</div>
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowComment" value="tableShowComment" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowComment">Comment</label>
</div>
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowSource" value="tableShowSource" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowSource">Source</label>
</div>
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowRef" value="tableShowRef" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowRef">Ref.</label>
</div>
</div>
</div>
</div>
47
templates/cards/table-columns-spots.html
Normal file
@@ -0,0 +1,47 @@
<div class="card">
<div class="card-body">
<h5 class="card-title">Table Columns</h5>
<div class="form-group">
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowTime" value="tableShowTime" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowTime">Time</label>
</div>
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowDX" value="tableShowDX" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowDX">DX</label>
</div>
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowFreq" value="tableShowFreq" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowFreq">Frequency</label>
</div>
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowMode" value="tableShowMode" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowMode">Mode</label>
</div>
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowComment" value="tableShowComment" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowComment">Comment</label>
</div>
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowBearing" value="tableShowBearing" oninput="columnsUpdated();">
<label class="form-check-label" for="tableShowBearing">Bearing</label>
</div>
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowType" value="tableShowType" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowType">Type</label>
</div>
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowRef" value="tableShowRef" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowRef">Ref.</label>
</div>
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowDE" value="tableShowDE" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowDE">DE</label>
</div>
<div class="form-check form-check-inline">
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowWorkedCheckbox" value="tableShowWorkedCheckbox" oninput="columnsUpdated();" checked>
<label class="form-check-label" for="tableShowWorkedCheckbox">Worked?</label>
</div>
</div>
</div>
</div>
||||||
11
templates/cards/time-zone.html
Normal file
11
templates/cards/time-zone.html
Normal file
@@ -0,0 +1,11 @@
|
|||||||
|
<div class="card">
|
||||||
|
<div class="card-body">
|
||||||
|
<h5 class="card-title">Time Zone</h5>
|
||||||
|
<p class="card-text spothole-card-text"> Use
|
||||||
|
<select id="timeZone" class="storeable-select form-select ms-2 me-2 d-inline-block" oninput="timeZoneUpdated();" style="width: 8em; display: inline-block;">
|
||||||
|
<option value="UTC" selected>UTC</option>
|
||||||
|
<option value="local">Local time</option>
|
||||||
|
</select>
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
</div>
|
||||||
6
templates/cards/worked-calls.html
Normal file
@@ -0,0 +1,6 @@
<div class="card">
<div class="card-body">
<h5 class="card-title">Worked Calls</h5>
<button type="button" class="btn btn-secondary" onClick="clearWorked();">Clear worked calls</button>
</div>
</div>
196
templates/conditions.html
Normal file
@@ -0,0 +1,196 @@
{% extends "base.html" %}
{% block content %}

<div class="card mt-5">
<div class="card-header">
Propagation Conditions
</div>
<div class="card-body">
<div class="row row-cols-1 row-cols-md-2 g-3">
<div class="col">
<div class="card h-100">
<div class="card-body">
<h5 class="card-title">HF</h5>
<table class="table table-sm mt-2">
<thead>
<tr>
<th>Band</th>
<th>Day</th>
<th>Night</th>
</tr>
</thead>
<tbody>
<tr>
<td>80-40m</td>
<td id="hf-conditions-80m-40m-day"></td>
<td id="hf-conditions-80m-40m-night"></td>
</tr>
<tr>
<td>30-20m</td>
<td id="hf-conditions-30m-20m-day"></td>
<td id="hf-conditions-30m-20m-night"></td>
</tr>
<tr>
<td>17-15m</td>
<td id="hf-conditions-17m-15m-day"></td>
<td id="hf-conditions-17m-15m-night"></td>
</tr>
<tr>
<td>12-10m</td>
<td id="hf-conditions-12m-10m-day"></td>
<td id="hf-conditions-12m-10m-night"></td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
<div class="col">
<div class="card h-100">
<div class="card-body">
<h5 class="card-title">VHF</h5>
<table class="table table-sm mt-2">
<thead>
<tr>
<th>Propagation Mode</th>
<th>Condition</th>
</tr>
</thead>
<tbody>
<tr>
<td>Sporadic-E 6m (Europe)</td>
<td id="vhf-conditions-es_6m_europe"></td>
</tr>
<tr>
<td>Sporadic-E 4m (Europe)</td>
<td id="vhf-conditions-es_4m_europe"></td>
</tr>
<tr>
<td>Sporadic-E 2m (Europe)</td>
<td id="vhf-conditions-es_2m_europe"></td>
</tr>
<tr>
<td>Sporadic-E 2m (North America)</td>
<td id="vhf-conditions-es_2m_na"></td>
</tr>
<tr>
<td>Aurora (Northern Hemisphere)</td>
<td id="vhf-conditions-vhf_aurora_northern_hemi"></td>
</tr>
<tr>
<td>Aurora Minimum Latitude</td>
<td id="vhf-conditions-aurora-lat"></td>
</tr>
</tbody>
</table>
</div>
</div>
</div>
</div>
<div class="form-text mt-3">Data from <a href="https://hamqsl.com">HamQSL.com</a>.</div>
</div>
</div>

<div class="card mt-5">
<div class="card-header">
Solar Weather
</div>
<div class="card-body">
<div class="row border-bottom align-items-start me-0">
<div class="col-12 col-md-2 py-2 fw-bold">Solar Flux</div>
<div id="sw-solar-flux-vals" class="col-12 col-md-3 py-2">
<span class="me-3">SFI: <strong id="sw-sfi"></strong></span>
<span>Sunspots: <strong id="sw-sunspots"></strong></span>
</div>
<div id="sw-solar-flux-desc" class="col-12 col-md-7 py-2"></div>
</div>
<div class="row border-bottom align-items-start me-0">
<div class="col-12 col-md-2 py-2 fw-bold">Geomagnetic</div>
<div id="sw-geomag-vals" class="col-12 col-md-3 py-2">
<span class="me-3">K: <strong id="sw-k-index"></strong></span>
<span class="me-3">A: <strong id="sw-a-index"></strong></span>
<span class="me-3"><strong>G</strong><strong id="sw-geomag-storm-scale"></strong></span>
<span>Noise: <strong id="sw-geomag-noise"></strong></span>
</div>
<div id="sw-geomag-desc" class="col-12 col-md-7 py-2">
<span id="sw-geomag-field"></span>. <span id="sw-geomag-storm-desc"></span>
</div>
</div>
<div class="row border-bottom align-items-start me-0">
<div class="col-12 col-md-2 py-2 fw-bold">X-ray Flux</div>
<div id="sw-xray-vals" class="col-12 col-md-3 py-2"><strong id="sw-x-ray"></strong></div>
<div id="sw-xray-desc" class="col-12 col-md-7 py-2"></div>
</div>
<div class="row border-bottom align-items-start me-0">
<div class="col-12 col-md-2 py-2 fw-bold">Proton Flux</div>
<div id="sw-proton-vals" class="col-12 col-md-3 py-2">
<span class="me-3"><strong id="sw-proton-flux"></strong> pfu</span>
<span class="me-3"><strong>S</strong><strong id="sw-solar-storm-scale"></strong></span>
</div>
<div id="sw-proton-desc" class="col-12 col-md-7 py-2"></div>
</div>
<div class="row border-bottom align-items-start me-0">
<div class="col-12 col-md-2 fw-bold py-2">Electron Flux</div>
<div id="sw-electron-vals" class="col-12 col-md-3 py-2"><strong id="sw-electron-flux"></strong> efu</div>
<div id="sw-electron-desc" class="col-12 col-md-7 py-2"></div>
</div>
<div class="form-text mt-3">Data from <a href="https://hamqsl.com">HamQSL.com</a>.</div>
</div>
</div>

<div class="card mt-5">
<div class="card-header">
DX Opportunities
</div>
<div class="card-body">
<div class="mb-3">
<label for="dxstats-de-continent" class="form-label">Your continent:</label>
<select id="dxstats-de-continent" class="form-select storeable-select d-inline-block ms-2" style="width: auto;" oninput="dxStatsContientChanged();">
<option value="EU">Europe</option>
<option value="NA">North America</option>
<option value="SA">South America</option>
<option value="AS">Asia</option>
<option value="AF">Africa</option>
<option value="OC">Oceania</option>
<option value="AN">Antarctica</option>
</select>
</div>
<div class="table-responsive">
<table class="table table-sm table-bordered mb-0">
<thead>
<tr>
<th></th>
<th>160m</th>
<th>80m</th>
<th>60m</th>
<th>40m</th>
<th>30m</th>
<th>20m</th>
<th>17m</th>
<th>15m</th>
<th>12m</th>
<th>10m</th>
<th>6m</th>
</tr>
</thead>
<tbody>
{% for continent in ["EU", "NA", "SA", "AS", "AF", "OC", "AN"] %}
<tr>
<td class="fw-bold">{{ continent }}</td>
{% for band in ["160m", "80m", "60m", "40m", "30m", "20m", "17m", "15m", "12m", "10m", "6m"] %}
<td id="dxstats-{{ continent }}-{{ band }}"></td>
{% end %}
</tr>
{% end %}
</tbody>
</table>
</div>
<div class="form-text mt-2">This table shows the number of spots in the past hour received in your continent, where the DX continent and band are as shown in the table. Bands with high numbers of spots are likely to be the best ones for making contact with the continent you want right now. Bear in mind that some bands and some continents are inherently much rarer than others.</div>
</div>
</div>

<script src="/js/common.js?v=1775203458"></script>
<script src="/js/conditions.js?v=1775203458"></script>
<script>$(document).ready(function() { $("#nav-link-conditions").addClass("active"); }); <!-- highlight active page in nav --></script>

{% end %}
@@ -3,142 +3,55 @@
 <div id="map">
 <div id="settingsButtonRowMap" class="mt-3 px-3" style="z-index: 1002; position: relative;">
-<div class="row">
+<div class="row mb-3">
 <div class="col-auto me-auto pt-3"></div>
 <div class="col-auto">
-<p class="d-inline-flex gap-1">
-<button id="filters-button" type="button" class="btn btn-outline-primary" data-bs-toggle="button" onclick="toggleFiltersPanel();"><i class="fa-solid fa-filter"></i> Filters</button>
-<button id="display-button" type="button" class="btn btn-outline-primary" data-bs-toggle="button" onclick="toggleDisplayPanel();"><i class="fa-solid fa-desktop"></i> Display</button>
-</p>
+<div class="d-inline-flex gap-1">
+{% module Template("widgets/filters-display-buttons.html", web_ui_options=web_ui_options) %}
+</div>
 </div>
 </div>
 
 <div id="filters-area" class="appearing-panel card mb-3">
-<div class="card-header text-white bg-primary">
-<div class="row">
-<div class="col-auto me-auto">
-Filters
-</div>
-<div class="col-auto d-inline-flex">
-<button id="close-filters-button" type="button" class="btn-close btn-close-white" aria-label="Close" onclick="closeFiltersPanel();"></button>
-</div>
-</div>
-
-</div>
+{% module Template("widgets/filters-area-header.html", web_ui_options=web_ui_options) %}
 <div class="card-body">
 <div class="row row-cols-1 g-4 mb-4 row-cols-md-3">
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">Bands</h5>
-<p id="band-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
+{% module Template("cards/bands.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">SIGs</h5>
-<p id="sig-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
+{% module Template("cards/sigs.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">Sources</h5>
-<p id="source-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
+{% module Template("cards/sources.html", web_ui_options=web_ui_options) %}
 </div>
 </div>
 <div class="row row-cols-1 row-cols-md-3 g-4">
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">DX Continent</h5>
-<p id="dx-continent-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
+{% module Template("cards/dx-continent.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">DE Continent</h5>
-<p id="de-continent-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
+{% module Template("cards/de-continent.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">Modes</h5>
-<p id="mode-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
+{% module Template("cards/modes.html", web_ui_options=web_ui_options) %}
 </div>
 </div>
 </div>
 </div>
 
 <div id="display-area" class="appearing-panel card mb-3">
-<div class="card-header text-white bg-primary">
-<div class="row">
-<div class="col-auto me-auto">
-Display
-</div>
-<div class="col-auto d-inline-flex">
-<button id="close-display-button" type="button" class="btn-close btn-close-white" aria-label="Close" onclick="closeDisplayPanel();"></button>
-</div>
-</div>
-
-</div>
+{% module Template("widgets/display-area-header.html", web_ui_options=web_ui_options) %}
 <div class="card-body">
 <div id="display-container" class="row row-cols-1 row-cols-md-4 g-4">
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">Spot Age</h5>
-<p class="card-text spothole-card-text">Last
-<select id="max-spot-age" class="storeable-select form-select ms-2 me-2 d-inline-block" oninput="filtersUpdated();" style="width: 5em; display: inline-block;">
-</select>
-minutes
-</p>
-</div>
-</div>
+{% module Template("cards/spot-age.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">Map Features</h5>
-<div class="form-group">
-<div class="form-check form-check-inline">
-<input class="form-check-input storeable-checkbox" type="checkbox" id="mapShowGeodesics" value="mapShowGeodesics" oninput="displayUpdated();">
-<label class="form-check-label" for="mapShowGeodesics">Geodesic Lines</label>
-</div>
-</div>
-</div>
-</div>
+{% module Template("cards/map-features.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">Theme</h5>
-<p class="card-text spothole-card-text">
-<label class="form-check-label" for="color-scheme">UI color scheme</label>
-<select id="color-scheme" class="storeable-select form-select d-inline-block" oninput="setColorSchemeFromUI();" style="display: inline-block;">
-<option value="auto" selected>Automatic</option>
-<option value="light">Light</option>
-<option value="dark">Dark</option>
-</select>
-</p>
-<p class="card-text spothole-card-text">
-<label class="form-check-label" for="band-color-scheme">Band color scheme</label><br/>
-<select id="band-color-scheme" class="storeable-select form-select d-inline-block" oninput="setBandColorSchemeFromUI();" style="display: inline-block;">
-</select>
-</p>
-</div>
-</div>
+{% module Template("cards/color-scheme-and-band-color-scheme.html", web_ui_options=web_ui_options) %}
 </div>
 </div>
 </div>
 </div>
@@ -154,9 +67,12 @@
 <script src="https://cdn.jsdelivr.net/npm/leaflet.geodesic"></script>
 <script src="https://cdn.jsdelivr.net/npm/@joergdietrich/leaflet.terminator@1.1.0/L.Terminator.min.js"></script>
 
-<script src="/js/common.js?v=6"></script>
-<script src="/js/spotsbandsandmap.js?v=6"></script>
-<script src="/js/map.js?v=6"></script>
+<script>
+let spotProvidersEnabledByDefault = {% raw json_encode(web_ui_options["spot-providers-enabled-by-default"]) %};
+</script>
+<script src="/js/common.js?v=1775203458"></script>
+<script src="/js/spotsbandsandmap.js?v=1775203458"></script>
+<script src="/js/map.js?v=1775203458"></script>
 <script>$(document).ready(function() { $("#nav-link-map").addClass("active"); }); <!-- highlight active page in nav --></script>
 
 {% end %}
@@ -9,226 +9,87 @@
 </div>
 
 <div class="mt-3">
-<div id="settingsButtonRow" class="row">
-<div class="col-4">
-<p class="d-inline-flex gap-1">
-<span class="btn-group" role="group">
-<input type="radio" class="btn-check" name="runPause" id="runButton" autocomplete="off" checked>
-<label class="btn btn-outline-primary" for="runButton"><i class="fa-solid fa-play"></i><span class="hideonmobile"> Run</span></label>
-
-<input type="radio" class="btn-check" name="runPause" id="pauseButton" autocomplete="off">
-<label class="btn btn-outline-primary" for="pauseButton"><i class="fa-solid fa-pause"></i><span class="hideonmobile"> Pause</span></label>
-</span>
-</p>
+<div id="settingsButtonRow" class="row mb-3">
+<div class="col-md-4 mb-3 mb-md-0">
+<div class="d-inline-flex gap-3">
+{% module Template("widgets/run-pause.html", web_ui_options=web_ui_options) %}
+<div class="d-inline-flex">{% raw web_ui_options["support-button-html"] %}</div>
+</div>
 </div>
-<div class="col-8 text-end">
-<p class="d-inline-flex gap-1">
-<span style="position: relative;">
-<i id="searchicon" class="fa-solid fa-magnifying-glass"></i>
-<input id="search" type="search" class="form-control" oninput="filtersUpdated();" placeholder="Search">
-</span>
-<button id="filters-button" type="button" class="btn btn-outline-primary" data-bs-toggle="button" onclick="toggleFiltersPanel();"><i class="fa-solid fa-filter"></i><span class="hideonmobile"> Filters</span></button>
-<button id="display-button" type="button" class="btn btn-outline-primary" data-bs-toggle="button" onclick="toggleDisplayPanel();"><i class="fa-solid fa-desktop"></i><span class="hideonmobile"> Display</span></button>
-</p>
+<div class="col-md-8 text-end">
+<div class="d-inline-flex gap-3">
+{% module Template("widgets/search.html", web_ui_options=web_ui_options) %}
+{% module Template("widgets/filters-display-buttons.html", web_ui_options=web_ui_options) %}
+</div>
 </div>
 </div>
 
 <div id="filters-area" class="appearing-panel card mb-3">
-<div class="card-header text-white bg-primary">
-<div class="row">
-<div class="col-auto me-auto">
-Filters
-</div>
-<div class="col-auto d-inline-flex">
-<button id="close-filters-button" type="button" class="btn-close btn-close-white" aria-label="Close" onclick="closeFiltersPanel();"></button>
-</div>
-</div>
-
-</div>
+{% module Template("widgets/filters-area-header.html", web_ui_options=web_ui_options) %}
 <div class="card-body">
 <div class="row row-cols-1 g-4 mb-4 row-cols-md-3">
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">Bands</h5>
-<p id="band-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
+{% module Template("cards/bands.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">SIGs</h5>
-<p id="sig-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
+{% module Template("cards/sigs.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">Sources</h5>
-<p id="source-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
+{% module Template("cards/sources.html", web_ui_options=web_ui_options) %}
 </div>
 </div>
 <div class="row row-cols-1 row-cols-md-3 g-4">
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">DX Continent</h5>
-<p id="dx-continent-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
+{% module Template("cards/dx-continent.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">DE Continent</h5>
-<p id="de-continent-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
+{% module Template("cards/de-continent.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">Modes</h5>
-<p id="mode-options" class="card-text spothole-card-text"></p>
-</div>
-</div>
+{% module Template("cards/modes.html", web_ui_options=web_ui_options) %}
 </div>
 </div>
 </div>
 </div>
 
 <div id="display-area" class="appearing-panel card mb-3">
-<div class="card-header text-white bg-primary">
-<div class="row">
-<div class="col-auto me-auto">
-Display
-</div>
-<div class="col-auto d-inline-flex">
-<button id="close-display-button" type="button" class="btn-close btn-close-white" aria-label="Close" onclick="closeDisplayPanel();"></button>
-</div>
-</div>
-
-</div>
+{% module Template("widgets/display-area-header.html", web_ui_options=web_ui_options) %}
 <div class="card-body">
 <div id="display-container" class="row row-cols-1 row-cols-md-4 g-4">
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">Time Zone</h5>
-<p class="card-text spothole-card-text"> Use
-<select id="timeZone" class="storeable-select form-select ms-2 me-2 d-inline-block" oninput="timeZoneUpdated();" style="width: 8em; display: inline-block;">
-<option value="UTC" selected>UTC</option>
-<option value="local">Local time</option>
-</select>
-</p>
-</div>
-</div>
+{% module Template("cards/time-zone.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">Number of Spots</h5>
-<p class="card-text spothole-card-text">Show up to
-<select id="spots-to-fetch" class="storeable-select form-select ms-2 me-2 d-inline-block" oninput="filtersUpdated();" style="width: 5em; display: inline-block;">
-</select>
-spots
-</p>
-</div>
-</div>
+{% module Template("cards/number-of-spots.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">Location</h5>
-<div class="form-group spothole-card-text">
-<label for="userGrid">Your grid:</label>
-<input type="text" class="storeable-text form-control" id="userGrid" placeholder="AA00aa" oninput="userGridUpdated();" style="width: 10em; display: inline-block;">
-</div>
-</div>
-</div>
+{% module Template("cards/location.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">Theme</h5>
-<p class="card-text spothole-card-text">
-<label class="form-check-label" for="color-scheme">UI color scheme</label>
-<select id="color-scheme" class="storeable-select form-select d-inline-block" oninput="setColorSchemeFromUI();" style="display: inline-block;">
-<option value="auto" selected>Automatic</option>
-<option value="light">Light</option>
-<option value="dark">Dark</option>
-</select>
-</p>
-<p class="card-text spothole-card-text">
-<label class="form-check-label" for="band-color-scheme">Band color scheme</label><br/>
-<select id="band-color-scheme" class="storeable-select form-select d-inline-block" oninput="setBandColorSchemeFromUI();" style="display: inline-block;">
-</select>
-</p>
-</div>
-</div>
+{% module Template("cards/worked-calls.html", web_ui_options=web_ui_options) %}
 </div>
 <div class="col">
-<div class="card">
-<div class="card-body">
-<h5 class="card-title">Table Columns</h5>
-<div class="form-group">
-<div class="form-check form-check-inline">
-<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowTime" value="tableShowTime" oninput="columnsUpdated();" checked>
-<label class="form-check-label" for="tableShowTime">Time</label>
-</div>
-<div class="form-check form-check-inline">
-<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowDX" value="tableShowDX" oninput="columnsUpdated();" checked>
-<label class="form-check-label" for="tableShowDX">DX</label>
+{% module Template("cards/color-scheme-and-band-color-scheme.html", web_ui_options=web_ui_options) %}
+</div>
+<div class="col">
+{% module Template("cards/table-columns-spots.html", web_ui_options=web_ui_options) %}
|
|
||||||
</div>
|
|
||||||
<div class="form-check form-check-inline">
|
|
||||||
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowFreq" value="tableShowFreq" oninput="columnsUpdated();" checked>
|
|
||||||
<label class="form-check-label" for="tableShowFreq">Frequency</label>
|
|
||||||
</div>
|
|
||||||
<div class="form-check form-check-inline">
|
|
||||||
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowMode" value="tableShowMode" oninput="columnsUpdated();" checked>
|
|
||||||
<label class="form-check-label" for="tableShowMode">Mode</label>
|
|
||||||
</div>
|
|
||||||
<div class="form-check form-check-inline">
|
|
||||||
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowComment" value="tableShowComment" oninput="columnsUpdated();" checked>
|
|
||||||
<label class="form-check-label" for="tableShowComment">Comment</label>
|
|
||||||
</div>
|
|
||||||
<div class="form-check form-check-inline">
|
|
||||||
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowBearing" value="tableShowBearing" oninput="columnsUpdated();">
|
|
||||||
<label class="form-check-label" for="tableShowBearing">Bearing</label>
|
|
||||||
</div>
|
|
||||||
<div class="form-check form-check-inline">
|
|
||||||
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowType" value="tableShowType" oninput="columnsUpdated();" checked>
|
|
||||||
<label class="form-check-label" for="tableShowType">Type</label>
|
|
||||||
</div>
|
|
||||||
<div class="form-check form-check-inline">
|
|
||||||
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowRef" value="tableShowRef" oninput="columnsUpdated();" checked>
|
|
||||||
<label class="form-check-label" for="tableShowRef">Ref.</label>
|
|
||||||
</div>
|
|
||||||
<div class="form-check form-check-inline">
|
|
||||||
<input class="form-check-input storeable-checkbox" type="checkbox" id="tableShowDE" value="tableShowDE" oninput="columnsUpdated();" checked>
|
|
||||||
<label class="form-check-label" for="tableShowDE">DE</label>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
</div>
|
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
<div id="table-container">
|
<div id="table-container">
|
||||||
<table id="table" class="table"><thead><tr class="table-primary"></tr></thead><tbody></tbody></table>
|
<table id="table" class="table"><thead><tr></tr></thead><tbody></tbody></table>
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
</div>
|
</div>
|
||||||
|
|
||||||
<script src="/js/common.js?v=6"></script>
|
<script>
|
||||||
<script src="/js/spotsbandsandmap.js?v=6"></script>
|
let spotProvidersEnabledByDefault = {% raw json_encode(web_ui_options["spot-providers-enabled-by-default"]) %};
|
||||||
<script src="/js/spots.js?v=6"></script>
|
</script>
|
||||||
|
<script src="/js/common.js?v=1775203458"></script>
|
||||||
|
<script src="/js/spotsbandsandmap.js?v=1775203458"></script>
|
||||||
|
<script src="/js/spots.js?v=1775203458"></script>
|
||||||
<script>$(document).ready(function() { $("#nav-link-spots").addClass("active"); }); <!-- highlight active page in nav --></script>
|
<script>$(document).ready(function() { $("#nav-link-spots").addClass("active"); }); <!-- highlight active page in nav --></script>
|
||||||
|
|
||||||
{% end %}
|
{% end %}
|
||||||
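The diff above repeatedly replaces a hard-coded Bootstrap card with a one-line `{% module Template("cards/…", web_ui_options=web_ui_options) %}` include, i.e. each settings card moves into its own parameterised sub-template. As a stdlib-only sketch of that extraction idea (Tornado's `Template` UI module does the real work; the names below are illustrative, not Spothole's):

```python
from string import Template

# Shared sub-template for one settings card, standing in for a file
# like templates/cards/time-zone.html (names here are hypothetical).
CARD = Template(
    '<div class="card"><div class="card-body">'
    '<h5 class="card-title">$title</h5>$body'
    '</div></div>'
)

def render_card(title: str, body: str) -> str:
    """Render one card from the shared sub-template."""
    return CARD.substitute(title=title, body=body)

# The page template now just includes the card per column instead of
# repeating the full markup inline.
page = "".join(
    '<div class="col">' + render_card(t, b) + '</div>'
    for t, b in [
        ("Time Zone", '<select id="timeZone"></select>'),
        ("Location", '<input id="userGrid">'),
    ]
)
```

The payoff, visible in this diff, is that the same card markup can then be reused by several pages and edited in one place.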
@@ -1,10 +1,68 @@
 {% extends "base.html" %}
 {% block content %}
 
-<div id="status-container" class="row row-cols-1 row-cols-md-4 g-4 mt-4"></div>
+<div class="card mt-5">
+    <div class="card-header">
+        Spothole
+    </div>
+    <div class="card-body">
+        <div class="row row-cols-1 row-cols-md-4 g-4 mb-2">
+            <div class="col"><strong>Metadata</strong></div>
+            <div class="col">Software Version: <span id="software-version"></span></div>
+            <div class="col">Owner Callsign: <span id="server-owner-callsign"></span></div>
+            <div class="col">Up since: <span id="up-since"></span></div>
+        </div>
+        <div class="row row-cols-1 row-cols-md-4 g-4 mb-2">
+            <div class="col"><strong>Performance</strong></div>
+            <div class="col">Memory Use: <span id="memory-use"></span></div>
+            <div class="col">Total Spots: <span id="total-spots"></span></div>
+            <div class="col">Total Alerts: <span id="total-alerts"></span></div>
+        </div>
+        <div class="row row-cols-1 row-cols-md-4 g-4 mb-2">
+            <div class="col"><strong>Web Server</strong></div>
+            <div class="col">Status: <span id="web-server-status"></span></div>
+            <div class="col">Last API call: <span id="web-server-last-api"></span></div>
+            <div class="col">Last page req: <span id="web-server-last-page"></span></div>
+        </div>
+        <div class="row row-cols-1 row-cols-md-4 g-4 mb-2">
+            <div class="col"><strong>Cleanup Service</strong></div>
+            <div class="col">Status: <span id="cleanup-status"></span></div>
+            <div class="col">Last ran: <span id="cleanup-last-ran"></span></div>
+        </div>
+    </div>
+</div>
 
-<script src="/js/common.js?v=6"></script>
-<script src="/js/status.js?v=6"></script>
-<script>$(document).ready(function() { $("#nav-link-status").addClass("active"); }); <!-- highlight active page in nav --></script>
+<div class="card mt-3">
+    <div class="card-header">
+        Spot Providers
+    </div>
+    <div class="card-body" id="spot-providers-status-container">
+
+    </div>
+</div>
+
+<div class="card mt-3">
+    <div class="card-header">
+        Alert Providers
+    </div>
+    <div class="card-body" id="alert-providers-status-container">
+
+    </div>
+</div>
+
+<div class="card mt-3">
+    <div class="card-header">
+        Solar/Band Conditions Providers
+    </div>
+    <div class="card-body" id="condition-providers-status-container">
+
+    </div>
+</div>
+
+<script src="/js/common.js?v=1775203458"></script>
+<script src="/js/status.js?v=1775203458"></script>
+<script>
+    $(document).ready(function() { $("#nav-link-status").addClass("active"); }); <!-- highlight active page in nav -->
+</script>
 
 {% end %}
templates/widgets/band-color-scheme.html (13 lines, new file)
@@ -0,0 +1,13 @@
+<label class="form-check-label" for="band-color-scheme">Band color scheme</label><br/>
+<select id="band-color-scheme" class="storeable-select form-select d-inline-block" oninput="setBandColorSchemeFromUI();" style="display: inline-block;">
+    <option value="PSK Reporter" {% if web_ui_options["band-color-scheme-default"] == "PSK Reporter" %}selected{% end %}>PSK Reporter</option>
+    <option value="PSK Reporter (Adjusted)" {% if web_ui_options["band-color-scheme-default"] == "PSK Reporter (Adjusted)" %}selected{% end %}>PSK Reporter (Adjusted)</option>
+    <option value="RBN" {% if web_ui_options["band-color-scheme-default"] == "RBN" %}selected{% end %}>RBN</option>
+    <option value="Ham Rainbow" {% if web_ui_options["band-color-scheme-default"] == "Ham Rainbow" %}selected{% end %}>Ham Rainbow</option>
+    <option value="Ham Rainbow (Reverse)" {% if web_ui_options["band-color-scheme-default"] == "Ham Rainbow (Reverse)" %}selected{% end %}>Ham Rainbow (Reverse)</option>
+    <option value="Kate Morley" {% if web_ui_options["band-color-scheme-default"] == "Kate Morley" %}selected{% end %}>Kate Morley</option>
+    <option value="ColorBrewer" {% if web_ui_options["band-color-scheme-default"] == "ColorBrewer" %}selected{% end %}>ColorBrewer</option>
+    <option value="IWantHue" {% if web_ui_options["band-color-scheme-default"] == "IWantHue" %}selected{% end %}>IWantHue</option>
+    <option value="IWantHue (Color Blind)" {% if web_ui_options["band-color-scheme-default"] == "IWantHue (Color Blind)" %}selected{% end %}>IWantHue (Color Blind)</option>
+    <option value="Mokole" {% if web_ui_options["band-color-scheme-default"] == "Mokole" %}selected{% end %}>Mokole</option>
+</select>
templates/widgets/color-scheme.html (6 lines, new file)
@@ -0,0 +1,6 @@
+<label class="form-check-label" for="color-scheme">UI color scheme</label>
+<select id="color-scheme" class="storeable-select form-select d-inline-block" oninput="setColorSchemeFromUI();" style="display: inline-block;">
+    <option value="auto" {% if web_ui_options["color-scheme-default"] == "auto" %}selected{% end %}>Automatic</option>
+    <option value="light" {% if web_ui_options["color-scheme-default"] == "light" %}selected{% end %}>Light</option>
+    <option value="dark" {% if web_ui_options["color-scheme-default"] == "dark" %}selected{% end %}>Dark</option>
+</select>
templates/widgets/display-area-header.html (10 lines, new file)
@@ -0,0 +1,10 @@
+<div class="card-header">
+    <div class="row">
+        <div class="col-auto me-auto">
+            Display
+        </div>
+        <div class="col-auto d-inline-flex">
+            <button id="close-display-button" type="button" class="btn-close btn-close" aria-label="Close" onclick="closeDisplayPanel();"></button>
+        </div>
+    </div>
+</div>
templates/widgets/filters-area-header.html (10 lines, new file)
@@ -0,0 +1,10 @@
+<div class="card-header">
+    <div class="row">
+        <div class="col-auto me-auto">
+            Filters
+        </div>
+        <div class="col-auto d-inline-flex">
+            <button id="close-filters-button" type="button" class="btn-close btn-close" aria-label="Close" onclick="closeFiltersPanel();"></button>
+        </div>
+    </div>
+</div>
templates/widgets/filters-display-buttons.html (4 lines, new file)
@@ -0,0 +1,4 @@
+<div class="d-inline-flex gap-1">
+    <button id="filters-button" type="button" class="btn btn-outline-secondary" data-bs-toggle="button" onclick="toggleFiltersPanel();"><i class="fa-solid fa-filter"></i> Filters</button>
+    <button id="display-button" type="button" class="btn btn-outline-secondary" data-bs-toggle="button" onclick="toggleDisplayPanel();"><i class="fa-solid fa-desktop"></i> Display</button>
+</div>
templates/widgets/refresh-timer.html (1 line, new file)
@@ -0,0 +1 @@
+<div id="timing-container">Loading...</div>
Some files were not shown because too many files have changed in this diff.
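Across these templates the hand-maintained `?v=6` suffixes on static JS URLs become `?v=1775203458`, which looks like a Unix timestamp (it falls in the repository's 2026 timeframe) captured once per build or server start, so browsers refetch the scripts after every deploy without anyone bumping a counter. A stdlib-only sketch of that cache-busting pattern, with illustrative names rather than Spothole's actual code:

```python
import time

# Captured once at startup; every asset URL rendered by this process
# shares the same version token, and it changes on the next restart.
ASSET_VERSION = str(int(time.time()))

def versioned(path: str) -> str:
    """Append a cache-busting query parameter to a static asset URL."""
    return f"{path}?v={ASSET_VERSION}"

# e.g. versioned("/js/common.js") -> "/js/common.js?v=<unix timestamp>"
```

Because the token only changes when the server restarts, cached copies stay valid between deploys but are invalidated exactly when new code ships.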