Release 20240210 (#334)
* Provide numeric value for dates for easier analysis later (#332)

* Feature/ratelimiter update (#330)

* updated request_is_limited algorithm

* Strip out numbers from paths

  This is so we can better aggregate by path in cloudwatch
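  The stripping itself isn't shown in this commit; a minimal sketch of the idea, with a hypothetical `normalize_path` helper and `:id` placeholder chosen for illustration, looks like:

  ```python
  import re

  def normalize_path(path: str) -> str:
      """Replace numeric path segments with a placeholder so that
      /v3/locations/2178 and /v3/locations/42 aggregate under one key."""
      return re.sub(r"/\d+", "/:id", path)

  print(normalize_path("/v3/locations/2178/sensors"))  # → /v3/locations/:id/sensors
  ```

  Collapsing IDs this way keeps the CloudWatch metric cardinality bounded by the number of routes rather than the number of resources.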

* Provide numeric value for dates for easier analysis later

---------

Co-authored-by: Russ Biggs <[email protected]>

* Sensors by location (#297)

* Added ability to increase the working memory for a given endpoint (#282) resolves #281

* Added ability to increase the working memory for a given endpoint  resolves #281

* update staging deploy to deploy on release branches (#284)

* Added timer (#280)

* included order_by in v1 and v2 measurements (#263)

Co-authored-by: Gabriel Fosse <[email protected]>

* manufacturers resource (#273)

* manufacturers resource resolves #252
---------

Co-authored-by: Gabriel Fosse <[email protected]>

* Adding `v3/instruments` resource (#271)

* instruments resource resolves #270 

---------

Co-authored-by: Gabriel Fosse <[email protected]>

* Added timer

* Added owner router

---------

Co-authored-by: Gabriel Fosse <[email protected]>
Co-authored-by: Gabriel Fosse <[email protected]>
Co-authored-by: Russ Biggs <[email protected]>

* Add sort and order by query params to v3 endpoints  (#285)

* order by classes

* refactor query builder to handle sort and order by statements

* add sort and order by to v3 endpoints

resolves #163
---------

Co-authored-by: Gabriel Fosse <[email protected]>

* Feature/working memory query 281 (#287)



* Moved the tr.start out of if block

  Fixes bug #286

---------

Co-authored-by: Christian Parker <[email protected]>

* fix missing start variable and return empty string for order (#288)

* Remove count fields (#289)

* removed count fields
---------

Co-authored-by: Gabriel Fosse <[email protected]>

* fix failing unit test (#292)

* rate limiter logging fix (#293)

* location sensors query

* location sensors path query

* Cleaned up the sensors endpoints

  Rewrote the queries to speed them up a little and added the
  measurements method

* Removed old method that I had renamed

---------

Co-authored-by: Christian Parker <[email protected]>
Co-authored-by: Russ Biggs <[email protected]>
Co-authored-by: Gabriel Fosse <[email protected]>

* add deprecation warning (#333)

* upgrade to actions/checkout@v4

---------

Co-authored-by: Christian Parker <[email protected]>
Co-authored-by: Gabriel Fosse <[email protected]>
Co-authored-by: Gabriel Fosse <[email protected]>
4 people authored Feb 10, 2024
1 parent abffc15 commit 15a7667
Showing 9 changed files with 118 additions and 92 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/deploy-prod.yml
@@ -48,7 +48,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v3
uses: actions/checkout@v4

- name: Configure aws credentials
uses: aws-actions/configure-aws-credentials@master
2 changes: 1 addition & 1 deletion .github/workflows/deploy-staging.yml
@@ -46,7 +46,7 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout repo
uses: actions/checkout@v3
uses: actions/checkout@v4

- name: Configure aws credentials
uses: aws-actions/configure-aws-credentials@master
2 changes: 2 additions & 0 deletions openaq_api/openaq_api/db.py
@@ -21,6 +21,7 @@
allowed_config_params = ["work_mem"]



DEFAULT_CONNECTION_TIMEOUT = 6
MAX_CONNECTION_TIMEOUT = 15

@@ -32,6 +33,7 @@ def default(obj):
# config is required as a placeholder here because this
# function is used in the `cached` decorator and without it
# we will get a number of arguments error

def dbkey(m, f, query, args, timeout=None, config=None):
j = orjson.dumps(
args, option=orjson.OPT_OMIT_MICROSECONDS, default=default
1 change: 1 addition & 0 deletions openaq_api/openaq_api/main.py
@@ -1,3 +1,4 @@
from contextlib import asynccontextmanager
import datetime
import logging
import time
13 changes: 12 additions & 1 deletion openaq_api/openaq_api/models/logging.py
@@ -4,6 +4,7 @@
from humps import camelize
from pydantic import BaseModel, ConfigDict, Field, computed_field
import re
from dateutil.parser import parse

class LogType(StrEnum):
SUCCESS = "SUCCESS"
@@ -120,7 +121,17 @@ def params(self) -> str:
@property
def params_obj(self) -> dict:
"""dict: returns URL query params as key values from request"""
return dict(x.split("=", 1) for x in self.params.split("&") if "=" in x)
params = dict(x.split("=", 1) for x in self.params.split("&") if "=" in x)
try:
# if bad strings make it past our validation then this will protect the log
if 'date_from' in params.keys():
params['date_from_epoch'] = parse(params['date_from']).timestamp()
if 'date_to' in params.keys():
params['date_to_epoch'] = parse(params['date_to']).timestamp()
except Exception:
pass

return params

@computed_field(return_type=list)
@property
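The epoch conversion added above can be exercised in isolation; `dateutil.parser.parse` handles both date-only and full timestamp strings, and an aware datetime gives a timezone-independent `timestamp()`:

```python
from dateutil.parser import parse

# Sample query params as the logging model would see them (values illustrative).
params = {"date_from": "2024-02-10T00:00:00+00:00", "limit": "100"}
if "date_from" in params:
    params["date_from_epoch"] = parse(params["date_from"]).timestamp()

print(params["date_from_epoch"])  # → 1707523200.0
```

Storing the epoch alongside the raw string is what makes the "numeric value for dates" analysis mentioned in the commit message possible downstream.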
2 changes: 1 addition & 1 deletion openaq_api/openaq_api/v3/models/responses.py
@@ -73,6 +73,7 @@ class Summary(JsonBase):
q75: float | None = None
q98: float | None = None
max: float | None = None
avg: float | None = None
sd: float | None = None


@@ -180,7 +181,6 @@ class Sensor(SensorBase):
datetime_first: DatetimeObject
datetime_last: DatetimeObject
coverage: Coverage
# period: Period
latest: Latest
summary: Summary

2 changes: 1 addition & 1 deletion openaq_api/openaq_api/v3/routers/locations.py
@@ -128,7 +128,7 @@ async def fetch_locations(query, db):
, bbox(geom) as bounds
, datetime_first
, datetime_last
{query_builder.fields() or ''}
{query_builder.fields() or ''}
{query_builder.total()}
FROM locations_view_cached
{query_builder.where()}
5 changes: 2 additions & 3 deletions openaq_api/openaq_api/v3/routers/measurements.py
@@ -45,14 +45,13 @@ class LocationMeasurementsQueries(
DateToQuery,
MeasurementsParametersQuery,
PeriodNameQuery,
):
...
): ...


@router.get(
"/locations/{locations_id}/measurements",
response_model=MeasurementsResponse,
summary="Get measurements by location",
summary="Get measurements by location (DEPRECATING - will be removed in future releases)",
description="Provides a list of measurements by location ID",
)
async def measurements_get(
181 changes: 97 additions & 84 deletions openaq_api/openaq_api/v3/routers/sensors.py
@@ -4,8 +4,19 @@
from fastapi import APIRouter, Depends, Path

from openaq_api.db import DB
from openaq_api.v3.models.queries import QueryBaseModel, QueryBuilder
from openaq_api.v3.models.responses import SensorsResponse
from openaq_api.v3.models.queries import (
DateFromQuery,
DateToQuery,
Paging,
PeriodNameQuery,
QueryBaseModel,
QueryBuilder,
)

from openaq_api.v3.models.responses import (
SensorsResponse,
MeasurementsResponse,
)
from openaq_api.v3.routers.measurements import fetch_measurements

logger = logging.getLogger("sensors")
@@ -23,25 +34,51 @@ class SensorQuery(QueryBaseModel):
)

def where(self):
return "m.sensors_id = :sensors_id"
return "s.sensors_id = :sensors_id"

class LocationSensorQuery(QueryBaseModel):
locations_id: int = Path(
..., description="Limit the results to a specific sensors id", ge=1
)

def where(self):
return "n.sensor_nodes_id = :locations_id"

class SensorMeasurementsQueries(
Paging,
SensorQuery,
DateFromQuery,
DateToQuery,
PeriodNameQuery,
):
...


# class SensorsQueries(Paging, CountryQuery):
# ...
@router.get(
"/sensors/{sensors_id}/measurements",
response_model=MeasurementsResponse,
summary="Get measurements by sensor ID",
description="Provides a list of measurements by sensor ID",
)
async def sensor_measurements_get(
sensors: Annotated[SensorMeasurementsQueries, Depends(SensorMeasurementsQueries.depends())],
db: DB = Depends(),
):
response = await fetch_measurements(sensors, db)
return response


# @router.get(
# "/sensors/{id}/measurements",
# response_model=MeasurementsResponse,
# summary="Get measurements by sensor ID",
# description="Provides a list of measurements by sensor ID",
# )
# async def sensor_measurements_get(
# sensor: SensorsQueries = Depends(SensorsQueries.depends()),
# db: DB = Depends(),
# ):
# response = await fetch_measurements(sensor, db)
# return response
@router.get(
"/locations/{locations_id}/sensors",
response_model=SensorsResponse,
summary="Get sensors by location ID",
description="Provides a list of sensors by location ID",
)
async def sensors_get(
location_sensors: Annotated[LocationSensorQuery, Depends(LocationSensorQuery.depends())],
db: DB = Depends(),
):
return await fetch_sensors(location_sensors, db)


@router.get(
@@ -54,76 +91,52 @@ async def sensor_get(
sensors: Annotated[SensorQuery, Depends(SensorQuery.depends())],
db: DB = Depends(),
):
response = await fetch_sensors(sensors, db)
return response
return await fetch_sensors(sensors, db)


async def fetch_sensors(q, db):
query = QueryBuilder(q)

logger.debug(query.params())
sql = f"""
WITH sensor AS (
SELECT
m.sensors_id
, MIN(datetime - '1sec'::interval) as datetime_first
, MAX(datetime - '1sec'::interval) as datetime_last
, COUNT(1) as value_count
, AVG(value_avg) as value_avg
, STDDEV(value_avg) as value_sd
, MIN(value_avg) as value_min
, MAX(value_avg) as value_max
, PERCENTILE_CONT(0.02) WITHIN GROUP(ORDER BY value_avg) as value_p02
, PERCENTILE_CONT(0.25) WITHIN GROUP(ORDER BY value_avg) as value_p25
, PERCENTILE_CONT(0.5) WITHIN GROUP(ORDER BY value_avg) as value_p50
, PERCENTILE_CONT(0.75) WITHIN GROUP(ORDER BY value_avg) as value_p75
, PERCENTILE_CONT(0.98) WITHIN GROUP(ORDER BY value_avg) as value_p98
, current_timestamp as calculated_on
FROM hourly_data m
{query.where()}
GROUP BY 1)
SELECT c.sensors_id as id
, 'sensor' as name
, c.value_avg as value
, get_datetime_object(c.datetime_first, ts.tzid) as datetime_first
, get_datetime_object(c.datetime_last, ts.tzid) as datetime_last
, json_build_object(
'datetime', get_datetime_object(r.datetime_last, ts.tzid)
, 'value', r.value_latest
, 'coordinates', jsonb_build_object(
'lat', st_y(r.geom_latest)
, 'lon', st_x(r.geom_latest)
)
) as latest
, json_build_object(
'id', s.measurands_id
, 'units', m.units
, 'name', m.measurand
, 'display_name', m.display
) as parameter
, json_build_object(
'sd', c.value_sd
, 'min', c.value_min
, 'q02', c.value_p02
, 'q25', c.value_p25
, 'median', c.value_p50
, 'q75', c.value_p75
, 'q98', c.value_p98
, 'max', c.value_max
) as summary
, calculate_coverage(
c.value_count::int
, s.data_averaging_period_seconds
, s.data_logging_period_seconds
, EXTRACT(EPOCH FROM c.datetime_last - c.datetime_first)
) as coverage
FROM sensors s
JOIN sensor_systems sy ON (s.sensor_systems_id = sy.sensor_systems_id)
JOIN sensor_nodes sn ON (sy.sensor_nodes_id = sn.sensor_nodes_id)
JOIN timezones ts ON (sn.timezones_id = ts.gid)
JOIN measurands m ON (s.measurands_id = m.measurands_id)
LEFT JOIN sensors_rollup r ON (s.sensors_id = r.sensors_id)
LEFT JOIN sensor c ON (c.sensors_id = s.sensors_id)
WHERE s.sensors_id = :sensors_id;
"""
response = await db.fetchPage(sql, query.params())
return response
SELECT s.sensors_id as id
, m.measurand||' '||m.units as name
, json_build_object(
'id', m.measurands_id
, 'name', m.measurand
, 'units', m.units
, 'display_name', m.display
) as parameter
, s.sensors_id
, json_build_object(
'min', r.value_min
, 'max', r.value_max
, 'avg', r.value_avg
, 'sd', r.value_sd
) as summary
, calculate_coverage(
r.value_count
, s.data_averaging_period_seconds
, s.data_logging_period_seconds
, r.datetime_first
, r.datetime_last
) as coverage
, get_datetime_object(r.datetime_first, t.tzid) as datetime_first
, get_datetime_object(r.datetime_last, t.tzid) as datetime_last
, json_build_object(
'datetime', get_datetime_object(r.datetime_last, t.tzid)
, 'value', r.value_latest
, 'coordinates', json_build_object(
'latitude', st_y(COALESCE(r.geom_latest, n.geom))
,'longitude', st_x(COALESCE(r.geom_latest, n.geom))
)) as latest
FROM sensors s
JOIN sensor_systems sy ON (s.sensor_systems_id = sy.sensor_systems_id)
JOIN sensor_nodes n ON (sy.sensor_nodes_id = n.sensor_nodes_id)
JOIN timezones t ON (n.timezones_id = t.gid)
JOIN measurands m ON (s.measurands_id = m.measurands_id)
LEFT JOIN sensors_rollup r ON (s.sensors_id = r.sensors_id)
{query.where()}
{query.pagination()}
"""
return await db.fetchPage(sql, query.params())
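`calculate_coverage` in the SQL above is a Postgres function whose body is not part of this diff. Going only by its arguments (observed count, averaging/logging periods, first/last timestamps), one plausible Python reading of what such a function computes is:

```python
from datetime import datetime

def calculate_coverage(
    value_count: int,
    averaging_seconds: int,
    logging_seconds: int,
    datetime_first: datetime,
    datetime_last: datetime,
) -> dict:
    """Sketch (assumed semantics): compare the observed measurement count
    against the count the logging period implies for the covered span."""
    period = (datetime_last - datetime_first).total_seconds()
    expected = int(period / logging_seconds) if logging_seconds else 0
    percent = round(100 * value_count / expected, 2) if expected else None
    return {
        "expected_count": expected,
        "observed_count": value_count,
        "percent_complete": percent,
    }
```

For an hourly-logging sensor with a full day of data this yields 24 expected, 24 observed, 100% complete; the real SQL function may weight by the averaging period as well.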
