initial commit

This commit is contained in:
mjames-upc 2018-09-05 15:52:38 -06:00
commit c132d638fe
500 changed files with 36095 additions and 0 deletions

3
.gitignore vendored Normal file

@@ -0,0 +1,3 @@
.ipynb_checkpoints
docs/build/
*.pyc

50
.travis.yml Normal file

@@ -0,0 +1,50 @@
# After changing this file, check it on:
# http://lint.travis-ci.org/
language: python
sudo: false
python:
- 3.6
- 2.7
env:
global:
- secure: "tHxj3+T83TBr3gQJINw8koSUbNBikKzIiV2AaPnLBE2F+xiweRhOSt0OhyP3HB1CluKfd9Ov0gKTzUsz0+ec08SCWpflT+013A7WG0jT5wtc2umCb7HGS2tpqOifjhXA91hm/88e4NRQ1wTLklNUTFC2JC3Lgjwd912gRlgmv7nU9cMO3w2qf2AtD83HSyIcUkYroQxtQijvSHlLZJUdhZzrvOgVMCtOq7fa5dggkmOtebAVE/GQcO10DfLMmx36rdzq79zRYhXc14Te+B7NR5aQb1CfyzO3FYohkBsaYXjXlgKMMcNC2chyzjja5ajDhbrMTDaeBZORs8SXue92jpFOJCyqTbfYhgiCfiX1NfZMqPyAwn+chMKy/jhK9lXV1xKASbhbnRXv9D5TETNNGgsEM/dj71RWgUKXItRKhznrSNl4gJhehPHzs8LvQdbk3sqepV3E6vl9FtWMsrMpm/8h/U1gxznIgpPYmdqMnjW26iHoDEw0Gq+yTcyCMD9x4s9JaqqYjdkIk1bJke2dr4l4uL36LFjUgx0Kr2Vg3Su2sNHj6sw9MujTeIAhnj+lg+rNZHKvS3Y3ulNY0v0v22+tZXo6zldOpvn+D06rPE/+IJMt94TyrLFksINFetNmRZpoCKVgeImES98vF2LAl3nmRct1G+GxgwvkYcrTyxE="
- WHEELHOUSE="https://unidata-python.s3.amazonaws.com/wheelhouse/index.html"
- WHEELDIR="wheelhouse/"
- EXTRA_INSTALLS="test,cdm"
- MPLLOCALFREETYPE="1"
matrix:
include:
- python: 2.7
env:
- VERSIONS="numpy==1.9.1"
- python: 3.4
env:
- python: 3.5
env:
- python: "3.6-dev"
env: PRE="--pre"
- python: nightly
env: PRE="--pre"
allow_failures:
- python: "3.6-dev"
- python: nightly
before_install:
# Shapely dependency needed to keep from using Shapely's manylinux wheels,
# which use a different geos than the one we build cartopy with on Travis
- pip install --upgrade pip;
- mkdir $WHEELDIR;
- pip download -d $WHEELDIR ".[$EXTRA_INSTALLS]" $EXTRA_PACKAGES -f $WHEELHOUSE $PRE $VERSIONS;
- touch $WHEELDIR/download_marker && ls -lrt $WHEELDIR;
- travis_wait pip wheel -w $WHEELDIR $EXTRA_PACKAGES -f $WHEELHOUSE $PRE $VERSIONS;
- pip install $EXTRA_PACKAGES --upgrade --no-index -f file://$PWD/$WHEELDIR $VERSIONS;
- travis_wait pip wheel -w $WHEELDIR ".[$EXTRA_INSTALLS]" $EXTRA_PACKAGES -f $WHEELHOUSE $PRE $VERSIONS;
- rm -f $WHEELDIR/python-awips*.whl;
install:
- pip install ".[$EXTRA_INSTALLS]" --upgrade --no-index $PRE -f file://$PWD/$WHEELDIR $VERSIONS;
script: true

30
LICENSE Normal file

@@ -0,0 +1,30 @@
Copyright (c) 2017, Unidata Python AWIPS Developers.
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are
met:
* Redistributions of source code must retain the above copyright
notice, this list of conditions and the following disclaimer.
* Redistributions in binary form must reproduce the above
copyright notice, this list of conditions and the following
disclaimer in the documentation and/or other materials provided
with the distribution.
* Neither the name of the MetPy Developers nor the names of any
contributors may be used to endorse or promote products derived
from this software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
"AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

71
README.rst Normal file

@@ -0,0 +1,71 @@
AWIPS Python Data Access Framework
==================================
|License| |PyPI| |LatestDocs|
|Travis| |Codacy|
.. |License| image:: https://img.shields.io/pypi/l/python-awips.svg
:target: https://pypi.python.org/pypi/python-awips/
:alt: License
.. |PyPI| image:: https://img.shields.io/pypi/v/python-awips.svg
:target: https://pypi.python.org/pypi/python-awips/
:alt: PyPI Package
.. |PyPIDownloads| image:: https://img.shields.io/pypi/dm/python-awips.svg
:target: https://pypi.python.org/pypi/python-awips/
:alt: PyPI Downloads
.. |LatestDocs| image:: https://readthedocs.org/projects/pip/badge/?version=latest
:target: http://python-awips.readthedocs.org/en/latest/
:alt: Latest Doc Build Status
.. |Travis| image:: https://travis-ci.org/Unidata/python-awips.svg?branch=master
:target: https://travis-ci.org/Unidata/python-awips
:alt: Travis Build Status
.. |Codacy| image:: https://api.codacy.com/project/badge/Grade/560b27db294449ed9484da1aadeaee91
:target: https://www.codacy.com/app/mjames/python-awips
:alt: Codacy issues
Install
-------
- pip install python-awips
Conda Environment
-----------------
- git clone https://github.com/Unidata/python-awips.git
- cd python-awips
- conda env create -f environment.yml
- source activate python-awips
- jupyter notebook examples
Requirements
------------
- Python 2.7+
- pip install numpy shapely six
- pip install metpy enum34 (to run the Jupyter Notebook examples)
Documentation
-------------
* http://python-awips.readthedocs.org/en/latest/
* http://nbviewer.jupyter.org/github/Unidata/python-awips/tree/master/examples/notebooks
Install from Github
-------------------
- git clone https://github.com/Unidata/python-awips.git
- cd python-awips
- python setup.py install
License
-------
Unidata AWIPS source code and binaries (RPMs) are considered to be in the public domain, meaning there are no restrictions on any download, modification, or distribution in any form (original or modified). The Python AWIPS package contains no proprietary content and is therefore not subject to export controls as stated in the Master Rights licensing file and source code headers.

52
awips/AlertVizHandler.py Normal file

@@ -0,0 +1,52 @@
##
##
#
# Pure python logging mechanism for logging to AlertViz from
# pure python (ie not JEP). DO NOT USE IN PYTHON CALLED
# FROM JAVA.
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 08/18/10 njensen Initial Creation.
#
#
#
import logging
import NotificationMessage
class AlertVizHandler(logging.Handler):
def __init__(self, host='localhost', port=61999, category='LOCAL', source='ANNOUNCER', level=logging.NOTSET):
logging.Handler.__init__(self, level)
self._category = category
self._host = host
self._port = port
self._source = source
def emit(self, record):
"Implements logging.Handler's interface. Record argument is a logging.LogRecord."
priority = None
if record.levelno >= 50:
priority = 'CRITICAL'
elif record.levelno >= 40:
priority = 'SIGNIFICANT'
elif record.levelno >= 30:
priority = 'PROBLEM'
elif record.levelno >= 20:
priority = 'EVENTA'
elif record.levelno >= 10:
priority = 'EVENTB'
else:
priority = 'VERBOSE'
msg = self.format(record)
notify = NotificationMessage.NotificationMessage(self._host, self._port, msg, priority, self._category, self._source)
notify.send()
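The threshold cascade in emit() above reduces to a lookup table. A minimal, self-contained sketch of that mapping (the function name is illustrative, and no AlertViz connection is involved):

```python
import logging

# Standalone sketch of the level-to-priority mapping in emit() above;
# thresholds are the standard logging levels, names are AlertViz priorities.
def levelno_to_priority(levelno):
    thresholds = [
        (logging.CRITICAL, 'CRITICAL'),   # >= 50
        (logging.ERROR, 'SIGNIFICANT'),   # >= 40
        (logging.WARNING, 'PROBLEM'),     # >= 30
        (logging.INFO, 'EVENTA'),         # >= 20
        (logging.DEBUG, 'EVENTB'),        # >= 10
    ]
    for cutoff, priority in thresholds:
        if levelno >= cutoff:
            return priority
    return 'VERBOSE'

print(levelno_to_priority(logging.ERROR))  # SIGNIFICANT
```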

39
awips/ConfigFileUtil.py Normal file

@@ -0,0 +1,39 @@
##
##
#
# A set of utility functions for dealing with configuration files.
#
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 09/27/10 dgilling Initial Creation.
#
#
#
def parseKeyValueFile(fileName):
propDict= dict()
try:
propFile= open(fileName, "rU")
for propLine in propFile:
propDef= propLine.strip()
if len(propDef) == 0:
continue
if propDef[0] == '#':
continue
punctuation= [ propDef.find(c) for c in ':= ' ] + [ len(propDef) ]
found= min( [ pos for pos in punctuation if pos != -1 ] )
name= propDef[:found].rstrip()
value= propDef[found:].lstrip(":= ").rstrip()
propDict[name]= value
propFile.close()
except IOError:
pass
return propDict
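The parsing loop above can be exercised without touching the filesystem. A sketch of the same logic over an iterable of lines (function name and sample data are illustrative):

```python
def parse_key_value_lines(lines):
    """Sketch of parseKeyValueFile() above, operating on an iterable of
    lines instead of an open file; '#' comments and blank lines are
    skipped and the key ends at the first ':', '=', or space."""
    props = {}
    for line in lines:
        prop_def = line.strip()
        if not prop_def or prop_def.startswith('#'):
            continue
        punctuation = [prop_def.find(c) for c in ':= '] + [len(prop_def)]
        found = min(pos for pos in punctuation if pos != -1)
        name = prop_def[:found].rstrip()
        value = prop_def[found:].lstrip(':= ').rstrip()
        props[name] = value
    return props

print(parse_key_value_lines(["# comment", "host: edex", "port=9581"]))
# {'host': 'edex', 'port': '9581'}
```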


@@ -0,0 +1,90 @@
# #
# #
#
# Functions for converting between the various "Java" dynamic serialize types
# used by EDEX and the native python datetime type.
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 06/24/15 #4480 dgilling Initial Creation.
#
import datetime
import time
from dynamicserialize.dstypes.java.util import Date
from dynamicserialize.dstypes.java.sql import Timestamp
from dynamicserialize.dstypes.com.raytheon.uf.common.time import TimeRange
MAX_TIME = pow(2, 31) - 1
MICROS_IN_SECOND = 1000000
def convertToDateTime(timeArg):
"""
Converts the given object to a python datetime object. Supports native
python representations like datetime and struct_time, but also
the dynamicserialize types like Date and Timestamp. Raises TypeError
if no conversion can be performed.
Args:
timeArg: a python object representing a date and time. Supported
types include datetime, struct_time, float, int, long and the
dynamicserialize types Date and Timestamp.
Returns:
A datetime that represents the same date/time as the passed in object.
"""
if isinstance(timeArg, datetime.datetime):
return timeArg
elif isinstance(timeArg, time.struct_time):
return datetime.datetime(*timeArg[:6])
elif isinstance(timeArg, float):
# seconds as float, should be avoided due to floating point errors
totalSecs = long(timeArg)
micros = int((timeArg - totalSecs) * MICROS_IN_SECOND)
return _convertSecsAndMicros(totalSecs, micros)
elif isinstance(timeArg, (int, long)):
# seconds as integer
totalSecs = timeArg
return _convertSecsAndMicros(totalSecs, 0)
elif isinstance(timeArg, (Date, Timestamp)):
totalSecs = timeArg.getTime()
return _convertSecsAndMicros(totalSecs, 0)
else:
objType = str(type(timeArg))
raise TypeError("Cannot convert object of type " + objType + " to datetime.")
def _convertSecsAndMicros(seconds, micros):
if seconds < MAX_TIME:
rval = datetime.datetime.utcfromtimestamp(seconds)
else:
extraTime = datetime.timedelta(seconds=(seconds - MAX_TIME))
rval = datetime.datetime.utcfromtimestamp(MAX_TIME) + extraTime
return rval.replace(microsecond=micros)
def constructTimeRange(*args):
"""
Builds a python dynamicserialize TimeRange object from the given
arguments.
Args:
args*: must be a TimeRange or a pair of objects that can be
converted to a datetime via convertToDateTime().
Returns:
A TimeRange.
"""
if len(args) == 1 and isinstance(args[0], TimeRange):
return args[0]
if len(args) != 2:
raise TypeError("constructTimeRange takes exactly 2 arguments, " + str(len(args)) + " provided.")
startTime = convertToDateTime(args[0])
endTime = convertToDateTime(args[1])
return TimeRange(startTime, endTime)
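The float branch of convertToDateTime() plus the rollover handling in _convertSecsAndMicros() can be sketched in isolation (standard library only; the function name is illustrative):

```python
import datetime

MAX_TIME = 2 ** 31 - 1
MICROS_IN_SECOND = 1000000

def seconds_to_datetime(seconds_float):
    """Sketch of the float branch above: split a POSIX timestamp into
    whole seconds and microseconds, handling values past the 32-bit
    rollover the same way as _convertSecsAndMicros()."""
    total_secs = int(seconds_float)
    micros = int((seconds_float - total_secs) * MICROS_IN_SECOND)
    if total_secs < MAX_TIME:
        rval = datetime.datetime.utcfromtimestamp(total_secs)
    else:
        extra = datetime.timedelta(seconds=total_secs - MAX_TIME)
        rval = datetime.datetime.utcfromtimestamp(MAX_TIME) + extra
    return rval.replace(microsecond=micros)

print(seconds_to_datetime(0.5))  # 1970-01-01 00:00:00.500000
```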

166
awips/NotificationMessage.py Executable file

@@ -0,0 +1,166 @@
##
##
from string import Template
import ctypes
import stomp
import socket
import sys
import time
import threading
import xml.etree.ElementTree as ET
import ThriftClient
from dynamicserialize.dstypes.com.raytheon.uf.common.alertviz import AlertVizRequest
from dynamicserialize import DynamicSerializationManager
#
# Provides a capability of constructing notification messages and sending
# them to a STOMP data source.
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 09/30/08 chammack Initial Creation.
# 11/03/10 5849 cjeanbap Moved to awips package from
# com.raytheon.uf.tools.cli
# 01/07/11 5645 cjeanbap Added audio file to Status Message.
# 05/27/11 3050 cjeanbap Added if-statement to check Priority
# value
# 07/27/15 4654 skorolev Added filters
# 11/11/15 5120 rferrel Cannot serialize empty filters.
#
class NotificationMessage:
priorityMap = {
0: 'CRITICAL',
1: 'SIGNIFICANT',
2: 'PROBLEM',
3: 'EVENTA',
4: 'EVENTB',
5: 'VERBOSE'}
def __init__(self, host='localhost', port=61999, message='', priority='PROBLEM', category="LOCAL", source="ANNOUNCER", audioFile="NONE", filters=None):
self.host = host
self.port = port
self.message = message
self.audioFile = audioFile
self.source = source
self.category = category
self.filters = filters
priorityInt = None
try:
priorityInt = int(priority)
except ValueError:
pass
if priorityInt is None:
#UFStatus.java contains mapping of Priority to Logging level mapping
if priority == 'CRITICAL' or priority == 'FATAL':
priorityInt = int(0)
elif priority == 'SIGNIFICANT' or priority == 'ERROR':
priorityInt = int(1)
elif priority == 'PROBLEM' or priority == 'WARN':
priorityInt = int(2)
elif priority == 'EVENTA' or priority == 'INFO':
priorityInt = int(3)
elif priority == 'EVENTB':
priorityInt = int(4)
elif priority == 'VERBOSE' or priority == 'DEBUG':
priorityInt = int(5)
if (priorityInt < 0 or priorityInt > 5):
print "Error occurred, supplied an invalid Priority value: " + str(priorityInt)
print "Priority values are 0, 1, 2, 3, 4 and 5."
sys.exit(1)
if priorityInt is not None:
self.priority = self.priorityMap[priorityInt]
else:
self.priority = priority
def connection_timeout(self, connection):
if (connection is not None and not connection.is_connected()):
print "Connection Retry Timeout"
for tid, tobj in threading._active.items():
if tobj.name == "MainThread":
res = ctypes.pythonapi.PyThreadState_SetAsyncExc(tid, ctypes.py_object(SystemExit))
if res != 0 and res != 1:
# problem, reset state
ctypes.pythonapi.PyThreadState_SetAsyncExc(tid, 0)
def send(self):
# depending on the value of the port number indicates the distribution
# of the message to AlertViz
# 9581 is global distribution thru ThriftClient to Edex
# 61999 is local distribution
if (int(self.port) == 61999):
# use stomp.py
conn = stomp.Connection(host_and_ports=[(self.host, 61999)])
timeout = threading.Timer(5.0, self.connection_timeout, [conn])
try:
timeout.start()
conn.start()
finally:
timeout.cancel()
conn.connect()
sm = ET.Element("statusMessage")
sm.set("machine", socket.gethostname())
sm.set("priority", self.priority)
sm.set("category", self.category)
sm.set("sourceKey", self.source)
sm.set("audioFile", self.audioFile)
if self.filters is not None and len(self.filters) > 0:
sm.set("filters", self.filters)
msg = ET.SubElement(sm, "message")
msg.text = self.message
details = ET.SubElement(sm, "details")
msg = ET.tostring(sm, "UTF-8")
try :
conn.send(msg, destination='/queue/messages')
time.sleep(2)
finally:
conn.stop()
else:
# use ThriftClient
alertVizRequest = createRequest(self.message, self.priority, self.source, self.category, self.audioFile, self.filters)
thriftClient = ThriftClient.ThriftClient(self.host, self.port, "/services")
serverResponse = None
try:
serverResponse = thriftClient.sendRequest(alertVizRequest)
except Exception, ex:
print "Caught exception submitting AlertVizRequest: ", str(ex)
if (serverResponse != "None"):
print "Error occurred submitting Notification Message to AlertViz receiver: ", serverResponse
sys.exit(1)
else:
print "Response: " + str(serverResponse)
def createRequest(message, priority, source, category, audioFile, filters):
obj = AlertVizRequest()
obj.setMachine(socket.gethostname())
obj.setPriority(priority)
obj.setCategory(category)
obj.setSourceKey(source)
obj.setMessage(message)
if (audioFile is not None):
obj.setAudioFile(audioFile)
else:
obj.setAudioFile('\0')
obj.setFilters(filters)
return obj
if __name__ == '__main__':
# send a default message to a local AlertViz instance when run as a script
NotificationMessage().send()
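The priority-resolution logic in __init__ above, sketched as a standalone function (the names and the alias-table layout are mine; the mappings follow the if/elif chain in the source):

```python
# Numeric strings are used directly; otherwise logging-style names map
# onto the 0-5 AlertViz scale, and anything else is rejected.
PRIORITY_ALIASES = {
    'CRITICAL': 0, 'FATAL': 0,
    'SIGNIFICANT': 1, 'ERROR': 1,
    'PROBLEM': 2, 'WARN': 2,
    'EVENTA': 3, 'INFO': 3,
    'EVENTB': 4,
    'VERBOSE': 5, 'DEBUG': 5,
}
PRIORITY_NAMES = ['CRITICAL', 'SIGNIFICANT', 'PROBLEM',
                  'EVENTA', 'EVENTB', 'VERBOSE']

def resolve_priority(priority):
    try:
        idx = int(priority)
    except ValueError:
        idx = PRIORITY_ALIASES.get(priority)
    if idx is None or not 0 <= idx <= 5:
        raise ValueError("invalid priority: %r" % (priority,))
    return PRIORITY_NAMES[idx]

print(resolve_priority('WARN'))  # PROBLEM
print(resolve_priority('3'))     # EVENTA
```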

105
awips/QpidSubscriber.py Normal file

@@ -0,0 +1,105 @@
##
##
#
# Provides a Python-based interface for subscribing to qpid queues and topics.
#
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 11/17/10 njensen Initial Creation.
# 08/15/13 2169 bkowal Optionally gzip decompress any data that is read.
# 08/04/16 2416 tgurney Add queueStarted property
# 02/16/17 6084 bsteffen Support ssl connections
# 09/07/17 6175 tgurney Remove "decompressing" log message
#
#
import os
import os.path
import qpid
import zlib
from Queue import Empty
from qpid.exceptions import Closed
class QpidSubscriber:
def __init__(self, host='127.0.0.1', port=5672, decompress=False, ssl=None):
self.host = host
self.port = port
self.decompress = decompress
socket = qpid.util.connect(host, port)
if "QPID_SSL_CERT_DB" in os.environ:
certdb = os.environ["QPID_SSL_CERT_DB"]
else:
certdb = os.path.expanduser("~/.qpid/")
if "QPID_SSL_CERT_NAME" in os.environ:
certname = os.environ["QPID_SSL_CERT_NAME"]
else:
certname = "guest"
certfile = os.path.join(certdb, certname + ".crt")
if ssl or (ssl is None and os.path.exists(certfile)):
keyfile = os.path.join(certdb, certname + ".key")
trustfile = os.path.join(certdb, "root.crt")
socket = qpid.util.ssl(socket, keyfile=keyfile, certfile=certfile, ca_certs=trustfile)
self.__connection = qpid.connection.Connection(sock=socket, username='guest', password='guest')
self.__connection.start()
self.__session = self.__connection.session(str(qpid.datatypes.uuid4()))
self.subscribed = True
self.__queueStarted = False
def topicSubscribe(self, topicName, callback):
# if the queue is edex.alerts, set decompress to true always for now to
# maintain compatibility with existing python scripts.
if (topicName == 'edex.alerts'):
self.decompress = True
print "Establishing connection to broker on", self.host
queueName = topicName + self.__session.name
self.__session.queue_declare(queue=queueName, exclusive=True, auto_delete=True, arguments={'qpid.max_count':100, 'qpid.policy_type':'ring'})
self.__session.exchange_bind(exchange='amq.topic', queue=queueName, binding_key=topicName)
self.__innerSubscribe(queueName, callback)
def __innerSubscribe(self, serverQueueName, callback):
local_queue_name = 'local_queue_' + serverQueueName
queue = self.__session.incoming(local_queue_name)
self.__session.message_subscribe(serverQueueName, destination=local_queue_name)
queue.start()
print "Connection complete to broker on", self.host
self.__queueStarted = True
while self.subscribed:
try:
message = queue.get(timeout=10)
content = message.body
self.__session.message_accept(qpid.datatypes.RangedSet(message.id))
if (self.decompress):
try:
# http://stackoverflow.com/questions/2423866/python-decompressing-gzip-chunk-by-chunk
d = zlib.decompressobj(16+zlib.MAX_WBITS)
content = d.decompress(content)
except Exception:
# decompression failed, return the original content
pass
callback(content)
except Empty:
pass
except Closed:
self.close()
def close(self):
self.__queueStarted = False
self.subscribed = False
try:
self.__session.close(timeout=10)
except Exception:
pass
@property
def queueStarted(self):
return self.__queueStarted
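The gzip handling in __innerSubscribe() relies on zlib's 16 + MAX_WBITS window flag. A self-contained sketch of that step, assuming a broker payload that may or may not be gzip-compressed (the helper name is illustrative):

```python
import gzip
import zlib

# Sketch of the decompression step above: zlib with 16 + MAX_WBITS
# accepts a gzip header, and the payload is returned untouched if it
# was not gzip-compressed.
def maybe_decompress(content):
    try:
        return zlib.decompressobj(16 + zlib.MAX_WBITS).decompress(content)
    except Exception:
        return content

payload = gzip.compress(b"grid update")
print(maybe_decompress(payload))         # b'grid update'
print(maybe_decompress(b"plain bytes"))  # b'plain bytes'
```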

84
awips/ThriftClient.py Normal file

@@ -0,0 +1,84 @@
##
##
import httplib
from dynamicserialize import DynamicSerializationManager
from dynamicserialize.dstypes.com.raytheon.uf.common.serialization.comm.response import ServerErrorResponse
from dynamicserialize.dstypes.com.raytheon.uf.common.serialization import SerializableExceptionWrapper
#
# Provides a Python-based interface for executing Thrift requests.
#
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 09/20/10 dgilling Initial Creation.
#
#
#
class ThriftClient:
# How to call this constructor:
# 1. Pass in all arguments separately (e.g.,
# ThriftClient.ThriftClient("localhost", 9581, "/services"))
# will return a Thrift client pointed at http://localhost:9581/services.
# 2. Pass in all arguments through the host string (e.g.,
# ThriftClient.ThriftClient("localhost:9581/services"))
# will return a Thrift client pointed at http://localhost:9581/services.
# 3. Pass in host/port arguments through the host string (e.g.,
# ThriftClient.ThriftClient("localhost:9581", "/services"))
# will return a Thrift client pointed at http://localhost:9581/services.
def __init__(self, host, port=9581, uri="/services"):
hostParts = host.split("/", 1)
if (len(hostParts) > 1):
hostString = hostParts[0]
self.__uri = "/" + hostParts[1]
self.__httpConn = httplib.HTTPConnection(hostString)
else:
if (port is None):
self.__httpConn = httplib.HTTPConnection(host)
else:
self.__httpConn = httplib.HTTPConnection(host, port)
self.__uri = uri
self.__dsm = DynamicSerializationManager.DynamicSerializationManager()
def sendRequest(self, request, uri="/thrift"):
message = self.__dsm.serializeObject(request)
self.__httpConn.connect()
self.__httpConn.request("POST", self.__uri + uri, message)
response = self.__httpConn.getresponse()
if (response.status != 200):
raise ThriftRequestException("Unable to post request to server")
rval = self.__dsm.deserializeBytes(response.read())
self.__httpConn.close()
# let's verify we have an instance of ServerErrorResponse
# IF we do, throw an exception up to the caller along
# with the original Java stack trace
# ELSE we have a valid response and pass it back
try:
forceError = rval.getException()
raise ThriftRequestException(forceError)
except AttributeError:
pass
return rval
class ThriftRequestException(Exception):
def __init__(self, value):
self.parameter = value
def __str__(self):
return repr(self.parameter)
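The three constructor conventions listed above amount to a small parsing rule. A sketch of just that rule (the helper name and the returned tuple shape are illustrative, not part of ThriftClient):

```python
# Sketch of how __init__ above resolves (host, port, uri): a host string
# containing '/' carries its own URI, and port may be folded into it.
def parse_endpoint(host, port=9581, uri="/services"):
    host_parts = host.split("/", 1)
    if len(host_parts) > 1:
        return host_parts[0], "/" + host_parts[1]
    if port is None:
        return host, uri
    return "%s:%d" % (host, port), uri

print(parse_endpoint("localhost", 9581, "/services"))  # ('localhost:9581', '/services')
print(parse_endpoint("localhost:9581/services"))       # ('localhost:9581', '/services')
```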

91
awips/TimeUtil.py Normal file

@@ -0,0 +1,91 @@
##
##
# ----------------------------------------------------------------------------
# This software is in the public domain, furnished "as is", without technical
# support, and with no warranty, express or implied, as to its usefulness for
# any purpose.
#
# offsetTime.py
# Handles Displaced Real Time for various applications
#
# Author: hansen/romberg
# ----------------------------------------------------------------------------
import string
import time
# Given the timeStr, return the offset (in seconds)
# from the current time.
# Also return the launchStr i.e. Programs launched from this
# offset application will use the launchStr as the -z argument.
# The offset will be positive for time in the future,
# negative for time in the past.
#
# May still want it to be normalized to the most recent midnight.
#
# NOTES about synchronizing:
# --With synchronizing on, the "current time" for all processes started
# within a given hour will be the same.
# This guarantees that GFE's have the same current time and ISC grid
# time stamps are synchronized and can be exchanged.
# Formatters launched from the GFE in this mode will be synchronized as
# well by setting the launchStr to use the time difference format
# (YYYYMMDD_HHMM,YYYYMMDD_HHMM).
# --This does not solve the problem in the general case.
# For example, if someone starts the GFE at 12:59 and someone
# else starts it at 1:01, they will have different offsets and
# current times.
# --With synchronizing off, when the process starts, the current time
# matches the drtTime in the command line. However, with synchronizing
# on, the current time will be offset by the fraction of the hour at
# which the process was started. Examples:
# Actual Starting time: 20040617_1230
# drtTime 20040616_0000
# Synchronizing off:
# GFE Spatial Editor at StartUp: 20040616_0000
# Synchronizing on:
# GFE Spatial Editor at StartUp: 20040616_0030
#
def determineDrtOffset(timeStr):
launchStr = timeStr
# Check for time difference
if timeStr.find(",") >=0:
times = timeStr.split(",")
t1 = makeTime(times[0])
t2 = makeTime(times[1])
#print "time offset", t1-t2, (t1-t2)/3600
return t1-t2, launchStr
# Check for synchronized mode
synch = 0
if timeStr[0] == "S":
timeStr = timeStr[1:]
synch = 1
drt_t = makeTime(timeStr)
#print "input", year, month, day, hour, minute
gm = time.gmtime()
cur_t = time.mktime(gm)
# Synchronize to most recent hour
# i.e. "truncate" cur_t to most recent hour.
#print "gmtime", gm
if synch:
cur_t = time.mktime((gm[0], gm[1], gm[2], gm[3], 0, 0, 0, 0, 0))
curStr = '%04d%02d%02d_%02d00\n' % (gm[0], gm[1], gm[2], gm[3])
launchStr = timeStr + "," + curStr
#print "drt, cur", drt_t, cur_t
offset = drt_t - cur_t
#print "offset", offset, offset/3600, launchStr
return int(offset), launchStr
def makeTime(timeStr):
year = int(timeStr[0:4])
month = int(timeStr[4:6])
day = int(timeStr[6:8])
hour = int(timeStr[9:11])
minute = int(timeStr[11:13])
# Do not use daylight savings because gmtime is not in daylight
# savings time.
return time.mktime((year, month, day, hour, minute, 0, 0, 0, 0))
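makeTime() above parses a YYYYMMDD_HHMM string and carries a caveat about daylight saving. A sketch of the same conversion using calendar.timegm(), which treats the parsed tuple as UTC and sidesteps that caveat (assumes the same time-string format; the function name is illustrative):

```python
import calendar
import time

# Sketch of makeTime() above: parse the GFE time string, then convert
# the struct_time to epoch seconds as UTC (no DST involved).
def make_time(time_str):
    parsed = time.strptime(time_str, "%Y%m%d_%H%M")
    return calendar.timegm(parsed)

offset = make_time("20040617_1230") - make_time("20040616_0000")
print(offset // 3600)  # 36
```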


@@ -0,0 +1,64 @@
##
##
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------- -------- --------- ---------------------------------------------
# Feb 13, 2017 6092 randerso Added StoreTimeAction
#
##
import argparse
import sys
import time
from dynamicserialize.dstypes.com.raytheon.uf.common.dataplugin.gfe.db.objects import DatabaseID
from dynamicserialize.dstypes.com.raytheon.uf.common.dataplugin.gfe.db.objects import ParmID
TIME_FORMAT = "%Y%m%d_%H%M"
class UsageArgumentParser(argparse.ArgumentParser):
"""
A subclass of ArgumentParser that overrides error() to print the
whole help text, rather than just the usage string.
"""
def error(self, message):
sys.stderr.write('%s: error: %s\n' % (self.prog, message))
self.print_help()
sys.exit(2)
## Custom actions for ArgumentParser objects ##
class StoreDatabaseIDAction(argparse.Action):
def __call__(self, parser, namespace, values, option_string=None):
did = DatabaseID(values)
if did.isValid():
setattr(namespace, self.dest, did)
else:
parser.error("DatabaseID [" + values + "] not a valid identifier")
class AppendParmNameAndLevelAction(argparse.Action):
def __call__(self, parser, namespace, values, option_string=None):
tx = ParmID.parmNameAndLevel(values)
comp = tx[0] + '_' + tx[1]
if (hasattr(namespace, self.dest)) and \
(getattr(namespace, self.dest) is not None):
currentValues = getattr(namespace, self.dest)
currentValues.append(comp)
setattr(namespace, self.dest, currentValues)
else:
setattr(namespace, self.dest, [comp])
class StoreTimeAction(argparse.Action):
"""
argparse.Action subclass to validate GFE formatted time strings
and parse them to time.struct_time
"""
def __call__(self, parser, namespace, values, option_string=None):
try:
timeStruct = time.strptime(values, TIME_FORMAT)
except ValueError:
parser.error(str(values) + " is not a valid time string of the format YYYYMMDD_hhmm")
setattr(namespace, self.dest, timeStruct)


@@ -0,0 +1,21 @@
##
##
import sys
from optparse import OptionParser
class UsageOptionParser(OptionParser):
"""
A subclass of OptionParser that overrides error() to print the
whole help text, rather than just the usage string.
"""
def error(self, msg):
"""
Print the help text and exit.
"""
self.print_help(sys.stderr)
sys.stderr.write("\n")
sys.stderr.write(msg)
sys.stderr.write("\n")
sys.exit(2)

20
awips/__init__.py Normal file

@@ -0,0 +1,20 @@
##
##
#
# __init__.py for awips package
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 09/21/10 dgilling Initial Creation.
#
#
#
__all__ = [
]


@@ -0,0 +1,83 @@
# #
# #
#
# Method for performing a DAF time query where all parameter/level/location
# combinations must be available at the same time.
#
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 06/22/16 #5591 bsteffen Initial Creation.
#
from awips.dataaccess import DataAccessLayer
def getAvailableTimes(request, refTimeOnly=False):
return __getAvailableTimesForEachParameter(request, refTimeOnly)
def __getAvailableTimesForEachParameter(request, refTimeOnly=False):
parameters = request.getParameters()
if parameters:
times = None
for parameter in parameters:
specificRequest = __cloneRequest(request)
specificRequest.setParameters(parameter)
specificTimes = __getAvailableTimesForEachLevel(specificRequest, refTimeOnly)
if times is None:
times = specificTimes
else:
times.intersection_update(specificTimes)
if not times:
break
return times
else:
return __getAvailableTimesForEachLevel(request, refTimeOnly)
def __getAvailableTimesForEachLevel(request, refTimeOnly=False):
levels = request.getLevels()
if levels:
times = None
for level in levels:
specificRequest = __cloneRequest(request)
specificRequest.setLevels(level)
specificTimes = __getAvailableTimesForEachLocation(specificRequest, refTimeOnly)
if times is None:
times = specificTimes
else:
times.intersection_update(specificTimes)
if not times:
break
return times
else:
return __getAvailableTimesForEachLocation(request, refTimeOnly)
def __getAvailableTimesForEachLocation(request, refTimeOnly=False):
locations = request.getLocationNames()
if locations:
times = None
for location in locations:
specificRequest = __cloneRequest(request)
specificRequest.setLocationNames(location)
specificTimes = DataAccessLayer.getAvailableTimes(specificRequest, refTimeOnly)
if times is None:
times = set(specificTimes)
else:
times.intersection_update(specificTimes)
if not times:
break
return times
else:
return DataAccessLayer.getAvailableTimes(request, refTimeOnly)
def __cloneRequest(request):
return DataAccessLayer.newDataRequest(datatype = request.getDatatype(),
parameters = request.getParameters(),
levels = request.getLevels(),
locationNames = request.getLocationNames(),
envelope = request.getEnvelope(),
**request.getIdentifiers())
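The three nested helpers above all apply the same narrowing pattern. A self-contained sketch of that pattern, with plain sets standing in for per-parameter DataTime results (names are illustrative):

```python
# Query each parameter/level/location separately and keep only the
# times common to all of them, bailing out once the intersection is empty.
def common_times(times_by_parameter):
    times = None
    for available in times_by_parameter:
        if times is None:
            times = set(available)
        else:
            times.intersection_update(available)
        if not times:
            break
    return times

print(common_times([{1, 2, 3}, {2, 3, 4}, {3, 5}]))  # {3}
```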


@@ -0,0 +1,266 @@
# #
# #
#
# Published interface for awips.dataaccess package
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 12/10/12 njensen Initial Creation.
# Feb 14, 2013 1614 bsteffen refactor data access framework
# to use single request.
# 04/10/13 1871 mnash move getLatLonCoords to JGridData and add default args
# 05/29/13 2023 dgilling Hook up ThriftClientRouter.
# 03/03/14 2673 bsteffen Add ability to query only ref times.
# 07/22/14 3185 njensen Added optional/default args to newDataRequest
# 07/30/14 3185 njensen Renamed valid identifiers to optional
# Apr 26, 2015 4259 njensen Updated for new JEP API
# Apr 13, 2016 5379 tgurney Add getIdentifierValues()
# Jun 01, 2016 5587 tgurney Add new signatures for
# getRequiredIdentifiers() and
# getOptionalIdentifiers()
# Oct 18, 2016 5916 bsteffen Add setLazyLoadGridLatLon
#
#
import sys
import subprocess
import warnings
THRIFT_HOST = "edex"
USING_NATIVE_THRIFT = False
if 'jep' in sys.modules:
# intentionally do not catch if this fails to import, we want it to
# be obvious that something is configured wrong when running from within
# Java instead of allowing false confidence and fallback behavior
import JepRouter
router = JepRouter
else:
from awips.dataaccess import ThriftClientRouter
router = ThriftClientRouter.ThriftClientRouter(THRIFT_HOST)
USING_NATIVE_THRIFT = True
def getForecastRun(cycle, times):
"""
:param cycle: Forecast cycle reference time
:param times: All available times/cycles
:return: DataTime array for a single forecast run
"""
fcstRun = []
for t in times:
if str(t)[:19] == str(cycle):
fcstRun.append(t)
return fcstRun
def getAvailableTimes(request, refTimeOnly=False):
"""
Get the times of available data to the request.
Args:
request: the IDataRequest to get data for
refTimeOnly: optional, use True if only unique refTimes should be
returned (without a forecastHr)
Returns:
a list of DataTimes
"""
return router.getAvailableTimes(request, refTimeOnly)
def getGridData(request, times=[]):
"""
Gets the grid data that matches the request at the specified times. Each
combination of parameter, level, and dataTime will be returned as a
separate IGridData.
Args:
request: the IDataRequest to get data for
times: a list of DataTimes, a TimeRange, or None if the data is time
agnostic
Returns:
a list of IGridData
"""
return router.getGridData(request, times)
def getGeometryData(request, times=[]):
"""
Gets the geometry data that matches the request at the specified times.
Each combination of geometry, level, and dataTime will be returned as a
separate IGeometryData.
Args:
request: the IDataRequest to get data for
times: a list of DataTimes, a TimeRange, or None if the data is time
agnostic
Returns:
a list of IGeometryData
"""
return router.getGeometryData(request, times)
def getAvailableLocationNames(request):
"""
Gets the available location names that match the request without actually
requesting the data.
Args:
request: the request to find matching location names for
Returns:
a list of strings of available location names.
"""
return router.getAvailableLocationNames(request)
def getAvailableParameters(request):
"""
Gets the available parameter names that match the request without actually
requesting the data.
Args:
request: the request to find matching parameter names for
Returns:
a list of strings of available parameter names.
"""
return router.getAvailableParameters(request)
def getAvailableLevels(request):
"""
Gets the available levels that match the request without actually
requesting the data.
Args:
request: the request to find matching levels for
Returns:
a list of strings of available levels.
"""
return router.getAvailableLevels(request)
def getRequiredIdentifiers(request):
"""
Gets the required identifiers for this request. These identifiers
must be set on a request of this datatype for the request to succeed.
Args:
request: the request to find required identifiers for
Returns:
a list of strings of required identifiers
"""
if str(request) == request:
warnings.warn("Use getRequiredIdentifiers(IDataRequest) instead",
DeprecationWarning)
return router.getRequiredIdentifiers(request)
def getOptionalIdentifiers(request):
"""
Gets the optional identifiers for this request.
Args:
request: the request to find optional identifiers for
Returns:
a list of strings of optional identifiers
"""
if str(request) == request:
warnings.warn("Use getOptionalIdentifiers(IDataRequest) instead",
DeprecationWarning)
return router.getOptionalIdentifiers(request)
def getIdentifierValues(request, identifierKey):
"""
Gets the allowed values for a particular identifier on this datatype.
Args:
request: the request to find identifier values for
identifierKey: the identifier to find values for
Returns:
a list of strings of allowed values for the specified identifier
"""
return router.getIdentifierValues(request, identifierKey)
def newDataRequest(datatype=None, **kwargs):
""""
Creates a new instance of IDataRequest suitable for the runtime environment.
All args are optional and exist solely for convenience.
Args:
datatype: the datatype to create a request for
parameters: a list of parameters to set on the request
levels: a list of levels to set on the request
locationNames: a list of locationNames to set on the request
envelope: an envelope to limit the request
**kwargs: any leftover kwargs will be set as identifiers
Returns:
a new IDataRequest
"""
return router.newDataRequest(datatype, **kwargs)
def getSupportedDatatypes():
"""
Gets the datatypes that are supported by the framework
Returns:
a list of strings of supported datatypes
"""
return router.getSupportedDatatypes()
def changeEDEXHost(newHostName):
"""
Changes the EDEX host the Data Access Framework is communicating with. This
only works when using the native Python client implementation; otherwise this
method raises a TypeError.
Args:
newHostName: the EDEX host to connect to
"""
if USING_NATIVE_THRIFT:
global THRIFT_HOST
THRIFT_HOST = newHostName
global router
router = ThriftClientRouter.ThriftClientRouter(THRIFT_HOST)
else:
raise TypeError("Cannot call changeEDEXHost when using JepRouter.")
def setLazyLoadGridLatLon(lazyLoadGridLatLon):
"""
Provide a hint to the Data Access Framework indicating whether to load the
lat/lon data for a grid immediately or wait until it is needed. This is
provided as a performance tuning hint and should not affect the way the
Data Access Framework is used. Depending on the internal implementation of
the Data Access Framework this hint might be ignored. Examples of when this
should be set to True are when the lat/lon information is not used or when
it is used only if certain conditions within the data are met. It could be
set to False if it is guaranteed that all lat/lon information is needed and
it would be better to get any performance overhead for generating the
lat/lon data out of the way during the initial request.
Args:
lazyLoadGridLatLon: Boolean value indicating whether to lazy load.
"""
try:
router.setLazyLoadGridLatLon(lazyLoadGridLatLon)
except AttributeError:
# The router is not required to support this capability.
pass
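The getForecastRun() helper above keeps every time whose first 19 characters match the cycle's string form, i.e. whose reference time equals the cycle. A minimal sketch of that matching rule, using plain ISO-format strings in place of DataTime objects (the exact DataTime string layout is an assumption here):

```python
# Sketch of getForecastRun()'s matching rule: a time belongs to a cycle
# when the first 19 characters of its string form (the reference time)
# equal the cycle's string form.
def forecast_run(cycle, times):
    return [t for t in times if str(t)[:19] == str(cycle)]

cycle = "2018-09-05 12:00:00"
times = [
    "2018-09-05 12:00:00 (3)",   # 3-hour forecast from the 12Z run
    "2018-09-05 12:00:00 (6)",   # 6-hour forecast from the same run
    "2018-09-05 06:00:00 (6)",   # earlier cycle; filtered out
]
run = forecast_run(cycle, times)
# run contains only the two 12Z entries
```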

@@ -0,0 +1,140 @@
# #
# #
#
# Published interface for retrieving data updates via awips.dataaccess package
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# May 26, 2016 2416 rjpeter Initial Creation.
# Aug 1, 2016 2416 tgurney Finish implementation
#
#
"""
Interface for the DAF's data notification feature, which allows continuous
retrieval of new data as it is coming into the system.
There are two ways to access this feature:
1. The DataQueue module (awips.dataaccess.DataQueue) offers a collection that
automatically fills up with new data as it receives notifications. See that
module for more information.
2. Depending on the type of data you want, use either getGridDataUpdates() or
getGeometryDataUpdates() in this module. Either one will give you back an
object that will retrieve new data for you and will call a function you specify
each time new data is received.
Example code follows. This example prints temperature as observed from KOMA
each time a METAR is received from there.
from awips.dataaccess import DataAccessLayer as DAL
from awips.dataaccess import DataNotificationLayer as DNL
def process_obs(list_of_data):
for item in list_of_data:
print(item.getNumber('temperature'))
request = DAL.newDataRequest('obs')
request.setParameters('temperature')
request.setLocationNames('KOMA')
notifier = DNL.getGeometryDataUpdates(request)
notifier.subscribe(process_obs)
# process_obs will be called with a list of data each time new data comes in
"""
import re
import sys
import subprocess
from awips.dataaccess.PyGeometryNotification import PyGeometryNotification
from awips.dataaccess.PyGridNotification import PyGridNotification
THRIFT_HOST = subprocess.check_output(
"source /awips2/fxa/bin/setup.env; echo $DEFAULT_HOST",
shell=True).strip()
USING_NATIVE_THRIFT = False
JMS_HOST_PATTERN=re.compile('tcp://([^:]+):([0-9]+)')
if 'jep' in sys.modules:
# intentionally do not catch if this fails to import, we want it to
# be obvious that something is configured wrong when running from within
# Java instead of allowing false confidence and fallback behavior
import JepRouter
router = JepRouter
else:
from awips.dataaccess import ThriftClientRouter
router = ThriftClientRouter.ThriftClientRouter(THRIFT_HOST)
USING_NATIVE_THRIFT = True
def _getJmsConnectionInfo(notifFilterResponse):
serverString = notifFilterResponse.getJmsConnectionInfo()
try:
host, port = JMS_HOST_PATTERN.match(serverString).groups()
except AttributeError as e:
raise RuntimeError('Got bad JMS connection info from server: ' + serverString)
return {'host': host, 'port': port}
def getGridDataUpdates(request):
"""
Get a notification object that receives updates to grid data.
Args:
request: the IDataRequest specifying the data you want to receive
Returns:
an update request object that you can listen for updates to by
calling its subscribe() method
"""
response = router.getNotificationFilter(request)
filter = response.getNotificationFilter()
jmsInfo = _getJmsConnectionInfo(response)
notifier = PyGridNotification(request, filter, requestHost=THRIFT_HOST, **jmsInfo)
return notifier
def getGeometryDataUpdates(request):
"""
Get a notification object that receives updates to geometry data.
Args:
request: the IDataRequest specifying the data you want to receive
Returns:
an update request object that you can listen for updates to by
calling its subscribe() method
"""
response = router.getNotificationFilter(request)
filter = response.getNotificationFilter()
jmsInfo = _getJmsConnectionInfo(response)
notifier = PyGeometryNotification(request, filter, requestHost=THRIFT_HOST, **jmsInfo)
return notifier
def changeEDEXHost(newHostName):
"""
Changes the EDEX host the Data Access Framework is communicating with. This
only works when using the native Python client implementation; otherwise this
method raises a TypeError.
Args:
newHostName: the EDEX host to connect to
"""
if USING_NATIVE_THRIFT:
global THRIFT_HOST
THRIFT_HOST = newHostName
global router
router = ThriftClientRouter.ThriftClientRouter(THRIFT_HOST)
else:
raise TypeError("Cannot call changeEDEXHost when using JepRouter.")

@@ -0,0 +1,196 @@
# #
# #
#
# Convenience class for using the DAF's notifications feature. This is a
# collection that, once connected to EDEX by calling start(), fills with
# data as notifications come in. Runs on a separate thread to allow
# non-blocking data retrieval.
#
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 07/29/16 2416 tgurney Initial creation
#
from awips.dataaccess import DataNotificationLayer as DNL
import time
from threading import Thread
import sys
if sys.version_info.major == 2:
from Queue import Queue, Empty
else: # Python 3 module renamed to 'queue'
from queue import Queue, Empty
"""Used to indicate a DataQueue that will produce geometry data."""
GEOMETRY = object()
"""Used to indicate a DataQueue that will produce grid data."""
GRID = object()
"""Default maximum queue size."""
_DEFAULT_MAXSIZE = 100
class Closed(Exception):
"""Raised when attempting to get data from a closed queue."""
pass
class DataQueue(object):
"""
Convenience class for using the DAF's notifications feature. This is a
collection that, once connected to EDEX by calling start(), fills with
data as notifications come in.
Example for getting obs data:
from DataQueue import DataQueue, GEOMETRY
request = DataAccessLayer.newDataRequest('obs')
request.setParameters('temperature')
request.setLocationNames('KOMA')
q = DataQueue(GEOMETRY, request)
q.start()
for item in q:
print(item.getNumber('temperature'))
"""
def __init__(self, dtype, request, maxsize=_DEFAULT_MAXSIZE):
"""
Create a new DataQueue.
Args:
dtype: Either GRID or GEOMETRY; must match the type of data
requested.
request: IDataRequest describing the data you want. It must at
least have datatype set. All data produced will satisfy the
constraints you specify.
maxsize: Maximum number of data objects the queue can hold at
one time. If the limit is reached, any data coming in after
that will not appear until one or more items are removed using
DataQueue.get().
"""
assert maxsize > 0
assert dtype in (GEOMETRY, GRID)
self._maxsize = maxsize
self._queue = Queue(maxsize=maxsize)
self._thread = None
if dtype is GEOMETRY:
self._notifier = DNL.getGeometryDataUpdates(request)
elif dtype is GRID:
self._notifier = DNL.getGridDataUpdates(request)
def start(self):
"""Start listening for notifications and requesting data."""
if self._thread is not None:
# Already started
return
kwargs = {'callback': self._data_received}
self._thread = Thread(target=self._notifier.subscribe, kwargs=kwargs)
self._thread.daemon = True
self._thread.start()
timer = 0
while not self._notifier.subscribed:
time.sleep(0.1)
timer += 1
if timer >= 100: # ten seconds
raise RuntimeError('timed out when attempting to subscribe')
def _data_received(self, data):
for d in data:
if not isinstance(d, list):
d = [d]
for item in d:
self._queue.put(item)
def get(self, block=True, timeout=None):
"""
Get and return the next available data object. By default, if there is
no data yet available, this method will not return until data becomes
available.
Args:
block: Specifies behavior when the queue is empty. If True, wait
until an item is available before returning (the default). If
False, return None immediately if the queue is empty.
timeout: If block is True, wait this many seconds, and return None
if data is not received in that time.
Returns:
IData
"""
if self.closed:
raise Closed
try:
return self._queue.get(block, timeout)
except Empty:
return None
def get_all(self):
"""
Get all data waiting for processing, in a single list. Always returns
immediately. Returns an empty list if no data has arrived yet.
Returns:
List of IData
"""
data = []
for _ in range(self._maxsize):
next_item = self.get(False)
if next_item is None:
break
data.append(next_item)
return data
def close(self):
"""Close the queue. May not be re-opened after closing."""
if not self.closed:
self._notifier.close()
self._thread.join()
def qsize(self):
"""Return number of items in the queue."""
return self._queue.qsize()
def empty(self):
"""Return True if the queue is empty."""
return self._queue.empty()
def full(self):
"""Return True if the queue is full."""
return self._queue.full()
@property
def closed(self):
"""True if the queue has been closed."""
return not self._notifier.subscribed
@property
def maxsize(self):
"""
Maximum number of data objects the queue can hold at one time.
If this limit is reached, any data coming in after that will not appear
until one or more items are removed using get().
"""
return self._maxsize
def __iter__(self):
if self._thread is not None:
while not self.closed:
yield self.get()
def __enter__(self):
self.start()
return self
def __exit__(self, *unused):
self.close()
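DataQueue.get() and get_all() above reduce to a standard non-blocking drain over queue.Queue: get(False) until the queue reports empty. The core pattern, using only the standard library (no EDEX connection involved):

```python
from queue import Queue, Empty

q = Queue(maxsize=100)
for item in (1, 2, 3):
    q.put(item)

def get_nonblocking(q):
    # Mirrors DataQueue.get(block=False): return None instead of raising Empty.
    try:
        return q.get(False)
    except Empty:
        return None

def get_all(q, maxsize=100):
    # Mirrors DataQueue.get_all(): collect until the queue comes up empty.
    data = []
    for _ in range(maxsize):
        next_item = get_nonblocking(q)
        if next_item is None:
            break
        data.append(next_item)
    return data

drained = get_all(q)
# drained == [1, 2, 3]; a second call returns []
```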

@@ -0,0 +1,40 @@
##
##
#
# Implements IData for use by native Python clients to the Data Access
# Framework.
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 06/03/13 dgilling Initial Creation.
#
#
from awips.dataaccess import IData
class PyData(IData):
def __init__(self, dataRecord):
self.__time = dataRecord.getTime()
self.__level = dataRecord.getLevel()
self.__locationName = dataRecord.getLocationName()
self.__attributes = dataRecord.getAttributes()
def getAttribute(self, key):
return self.__attributes[key]
def getAttributes(self):
return self.__attributes.keys()
def getDataTime(self):
return self.__time
def getLevel(self):
return self.__level
def getLocationName(self):
return self.__locationName

@@ -0,0 +1,63 @@
##
##
#
# Implements IGeometryData for use by native Python clients to the Data Access
# Framework.
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 06/03/13 dgilling Initial Creation.
# 01/06/14 2537 bsteffen Share geometry WKT.
# 03/19/14 2882 dgilling Raise an exception when getNumber()
# is called for data that is not a
# numeric Type.
# 06/09/16 5574 mapeters Handle 'SHORT' type in getNumber().
#
#
from awips.dataaccess import IGeometryData
from awips.dataaccess import PyData
class PyGeometryData(IGeometryData, PyData.PyData):
def __init__(self, geoDataRecord, geometry):
PyData.PyData.__init__(self, geoDataRecord)
self.__geometry = geometry
self.__dataMap = {}
tempDataMap = geoDataRecord.getDataMap()
for key, value in tempDataMap.items():
self.__dataMap[key] = (value[0], value[1], value[2])
def getGeometry(self):
return self.__geometry
def getParameters(self):
return self.__dataMap.keys()
def getString(self, param):
value = self.__dataMap[param][0]
return str(value)
def getNumber(self, param):
value = self.__dataMap[param][0]
t = self.getType(param)
# Python 3 has no separate long type; int handles all integer widths,
# and on Python 2 int() promotes to long automatically when needed.
if t in ('INT', 'SHORT', 'LONG'):
return int(value)
elif t in ('FLOAT', 'DOUBLE'):
return float(value)
else:
raise TypeError("Data for parameter " + param + " is not a numeric type.")
def getUnit(self, param):
return self.__dataMap[param][2]
def getType(self, param):
return self.__dataMap[param][1]
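getNumber() above dispatches on the stored type string to pick a converter. The same dispatch expressed as a lookup table, self-contained for illustration (the helper name is hypothetical):

```python
# Type-string to converter mapping equivalent to getNumber()'s if/elif
# chain; Python's int handles arbitrarily large values, so SHORT, INT,
# and LONG all map to int.
_CONVERTERS = {
    'SHORT': int,
    'INT': int,
    'LONG': int,
    'FLOAT': float,
    'DOUBLE': float,
}

def to_number(value, type_name, param='example'):
    try:
        convert = _CONVERTERS[type_name]
    except KeyError:
        raise TypeError("Data for parameter " + param + " is not a numeric type.")
    return convert(value)

# to_number('42', 'LONG') == 42; to_number('2.5', 'DOUBLE') == 2.5
```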

@@ -0,0 +1,37 @@
# #
# #
#
# Notification object that produces geometry data
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 07/22/16 2416 tgurney Initial creation
# 09/07/17 6175 tgurney Override messageReceived
#
import traceback
import dynamicserialize
from awips.dataaccess.PyNotification import PyNotification
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
class PyGeometryNotification(PyNotification):
def messageReceived(self, msg):
dataUriMsg = dynamicserialize.deserialize(msg)
dataUris = dataUriMsg.getDataURIs()
dataTimes = set()
for dataUri in dataUris:
if self.notificationFilter.accept(dataUri):
dataTimes.add(self.getDataTime(dataUri))
if dataTimes:
try:
data = self.getData(self.request, list(dataTimes))
self.callback(data)
except Exception as e:
traceback.print_exc()
def getData(self, request, dataTimes):
return self.DAL.getGeometryData(request, dataTimes)

@@ -0,0 +1,64 @@
# #
# #
#
# Implements IGridData for use by native Python clients to the Data Access
# Framework.
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 06/03/13 #2023 dgilling Initial Creation.
# 10/13/16 #5916 bsteffen Correct grid shape, allow lat/lon
# 11/10/16 #5900 bsteffen Correct grid shape
# to be requested by a delegate
#
#
import numpy
import warnings
from awips.dataaccess import IGridData
from awips.dataaccess import PyData
NO_UNIT_CONVERT_WARNING = """
The ability to unit convert grid data is not currently available in this version of the Data Access Framework.
"""
class PyGridData(IGridData, PyData.PyData):
def __init__(self, gridDataRecord, nx, ny, latLonGrid = None, latLonDelegate = None):
PyData.PyData.__init__(self, gridDataRecord)
self.__parameter = gridDataRecord.getParameter()
self.__unit = gridDataRecord.getUnit()
self.__gridData = numpy.reshape(numpy.array(gridDataRecord.getGridData()), (ny, nx))
self.__latLonGrid = latLonGrid
self.__latLonDelegate = latLonDelegate
def getParameter(self):
return self.__parameter
def getUnit(self):
return self.__unit
def getRawData(self, unit=None):
# TODO: Find a proper python library that deals well with numpy and
# javax.measure style unit strings and hook it in to this method to
# allow end-users to perform unit conversion for grid data.
if unit is not None:
warnings.warn(NO_UNIT_CONVERT_WARNING, stacklevel=2)
return self.__gridData
def getLatLonCoords(self):
if self.__latLonGrid is not None:
return self.__latLonGrid
elif self.__latLonDelegate is not None:
return self.__latLonDelegate()
return self.__latLonGrid
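PyGridData above stores the grid as a (ny, nx) array built from the flat, row-major list the server returns. The reshape step in isolation, with toy data:

```python
import numpy

# A 2x3 grid arrives as a flat, row-major list of ny * nx values.
ny, nx = 2, 3
flat = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]

# Same call shape as PyGridData.__init__: rows vary slowest, so the
# second row starts at index nx.
grid = numpy.reshape(numpy.array(flat), (ny, nx))
# grid.shape == (2, 3); grid[1][0] == 4.0
```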

@@ -0,0 +1,42 @@
# #
# #
#
# Notification object that produces grid data
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 06/03/16 2416 rjpeter Initial Creation.
# 09/06/17 6175 tgurney Override messageReceived
#
import traceback
import dynamicserialize
from awips.dataaccess.PyNotification import PyNotification
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
class PyGridNotification(PyNotification):
def messageReceived(self, msg):
dataUriMsg = dynamicserialize.deserialize(msg)
dataUris = dataUriMsg.getDataURIs()
for dataUri in dataUris:
if not self.notificationFilter.accept(dataUri):
continue
try:
# This improves performance over requesting by datatime since it requests only the
# parameter that the notification was received for (instead of this and all previous
# parameters for the same forecast hour)
# TODO: This utterly fails for derived requests
newReq = self.DAL.newDataRequest(self.request.getDatatype())
newReq.addIdentifier("dataURI", dataUri)
newReq.setParameters(self.request.getParameters())
data = self.getData(newReq, [])
self.callback(data)
except Exception as e:
traceback.print_exc()
def getData(self, request, dataTimes):
return self.DAL.getGridData(request, dataTimes)

@@ -0,0 +1,93 @@
##
##
#
# Implements IData for use by native Python clients to the Data Access
# Framework.
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# Jun 22, 2016 2416 rjpeter Initial creation
# Jul 22, 2016 2416 tgurney Finish implementation
# Sep 07, 2017 6175 tgurney Override messageReceived in subclasses
#
import abc
import time
import traceback
import dynamicserialize
from awips.dataaccess import DataAccessLayer
from awips.dataaccess import INotificationSubscriber
from awips.QpidSubscriber import QpidSubscriber
from awips.ThriftClient import ThriftRequestException
from dynamicserialize.dstypes.com.raytheon.uf.common.time import DataTime
class PyNotification(INotificationSubscriber):
"""
Receives notifications for new data and retrieves the data that meets
specified filtering criteria.
"""
__metaclass__ = abc.ABCMeta
def __init__(self, request, filter, host='localhost', port=5672, requestHost='localhost'):
self.DAL = DataAccessLayer
self.DAL.changeEDEXHost(requestHost)
self.request = request
self.notificationFilter = filter
self.__topicSubscriber = QpidSubscriber(host, port, decompress=True)
self.__topicName = "edex.alerts"
self.callback = None
def subscribe(self, callback):
"""
Start listening for notifications.
Args:
callback: Function to call with a list of received data objects.
Will be called once for each request made for data.
"""
assert hasattr(callback, '__call__'), 'callback arg must be callable'
self.callback = callback
self.__topicSubscriber.topicSubscribe(self.__topicName, self.messageReceived)
# Blocks here
def close(self):
if self.__topicSubscriber.subscribed:
self.__topicSubscriber.close()
def getDataTime(self, dataURI):
dataTimeStr = dataURI.split('/')[2]
return DataTime(dataTimeStr)
@abc.abstractmethod
def messageReceived(self, msg):
"""Called when a message is received from QpidSubscriber.
This method must call self.callback once for each request made for data
"""
pass
@abc.abstractmethod
def getData(self, request, dataTimes):
"""
Retrieve and return data
Args:
request: IDataRequest to send to the server
dataTimes: list of data times
Returns:
list of IData
"""
pass
@property
def subscribed(self):
"""True if currently subscribed to notifications."""
return self.__topicSubscriber.queueStarted
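getDataTime() above relies on the data time being the third slash-separated field of a dataURI. A sketch with a made-up URI (the exact URI layout varies by plugin; this shape is an assumption for illustration):

```python
# dataURI fields are slash-separated with a leading slash, so split('/')
# yields ['', plugin, dataTime, ...] and index 2 is the time string.
def data_time_string(data_uri):
    return data_uri.split('/')[2]

uri = '/grid/2018-09-05 12:00:00.0 (3)/GFS20/SFC'
t = data_time_string(uri)
# t == '2018-09-05 12:00:00.0 (3)'
```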

@@ -0,0 +1,266 @@
# #
# #
#
# Classes for retrieving soundings based on gridded data from the Data Access
# Framework
#
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 06/24/15 #4480 dgilling Initial Creation.
#
from collections import defaultdict
from shapely.geometry import Point
from awips import DateTimeConverter
from awips.dataaccess import DataAccessLayer
from dynamicserialize.dstypes.com.raytheon.uf.common.time import DataTime
from dynamicserialize.dstypes.com.raytheon.uf.common.dataplugin.level import Level
def getSounding(modelName, weatherElements, levels, samplePoint, refTime=None, timeRange=None):
""""
Performs a series of Data Access Framework requests to retrieve a sounding object
based on the specified request parameters.
Args:
modelName: the grid model datasetid to use as the basis of the sounding.
weatherElements: a list of parameters to return in the sounding.
levels: a list of levels to sample the given weather elements at
samplePoint: a lat/lon pair to perform the sampling of data at.
refTime: (optional) the grid model reference time to use for the sounding.
If not specified, the latest ref time in the system will be used.
timeRange: (optional) a TimeRange to specify which forecast hours to use.
If not specified, will default to all forecast hours.
Returns:
A _SoundingCube instance, which acts as a 3-tiered dictionary, keyed
by DataTime, then by level and finally by weather element. If no
data is available for the given request parameters, None is returned.
"""
(locationNames, parameters, levels, envelope, refTime, timeRange) = \
__sanitizeInputs(modelName, weatherElements, levels, samplePoint, refTime, timeRange)
requestArgs = { 'datatype' : 'grid',
'locationNames' : locationNames,
'parameters' : parameters,
'levels' : levels,
'envelope' : envelope,
}
req = DataAccessLayer.newDataRequest(**requestArgs)
forecastHours = __determineForecastHours(req, refTime, timeRange)
if not forecastHours:
return None
response = DataAccessLayer.getGeometryData(req, forecastHours)
soundingObject = _SoundingCube(response)
return soundingObject
def setEDEXHost(host):
"""
Changes the EDEX host the Data Access Framework is communicating with.
Args:
host: the EDEX host to connect to
"""
if host:
DataAccessLayer.changeEDEXHost(str(host))
def __sanitizeInputs(modelName, weatherElements, levels, samplePoint, refTime, timeRange):
locationNames = [str(modelName)]
parameters = __buildStringList(weatherElements)
levels = __buildStringList(levels)
envelope = Point(samplePoint)
if refTime is not None:
refTime = DataTime(refTime=DateTimeConverter.convertToDateTime(refTime))
if timeRange is not None:
timeRange = DateTimeConverter.constructTimeRange(*timeRange)
return (locationNames, parameters, levels, envelope, refTime, timeRange)
def __determineForecastHours(request, refTime, timeRange):
dataTimes = DataAccessLayer.getAvailableTimes(request, False)
timesGen = [(DataTime(refTime=dataTime.getRefTime()), dataTime) for dataTime in dataTimes]
dataTimesMap = defaultdict(list)
for baseTime, dataTime in timesGen:
dataTimesMap[baseTime].append(dataTime)
if refTime is None:
refTime = max(dataTimesMap.keys())
forecastHours = dataTimesMap[refTime]
if timeRange is None:
return forecastHours
else:
return [forecastHour for forecastHour in forecastHours if timeRange.contains(forecastHour.getValidPeriod())]
def __buildStringList(param):
if __notStringIter(param):
return [str(item) for item in param]
else:
return [str(param)]
def __notStringIter(iterable):
# basestring only exists on Python 2; fall back to str on Python 3
try:
stringTypes = basestring
except NameError:
stringTypes = str
if not isinstance(iterable, stringTypes):
try:
iter(iterable)
return True
except TypeError:
return False
class _SoundingCube(object):
"""
The top-level sounding object returned when calling SoundingsSupport.getSounding.
This object acts as a 3-tiered dict which is keyed by time then level
then parameter name. Calling times() will return all valid keys into this
object.
"""
def __init__(self, geometryDataObjects):
self._dataDict = {}
self._sortedTimes = []
if geometryDataObjects:
for geometryData in geometryDataObjects:
dataTime = geometryData.getDataTime()
level = geometryData.getLevel()
for parameter in geometryData.getParameters():
self.__addItem(parameter, dataTime, level, geometryData.getNumber(parameter))
def __addItem(self, parameter, dataTime, level, value):
timeLayer = self._dataDict.get(dataTime, _SoundingTimeLayer(dataTime))
self._dataDict[dataTime] = timeLayer
timeLayer._addItem(parameter, level, value)
if dataTime not in self._sortedTimes:
self._sortedTimes.append(dataTime)
self._sortedTimes.sort()
def __getitem__(self, key):
return self._dataDict[key]
def __len__(self):
return len(self._dataDict)
def times(self):
"""
Returns the valid times for this sounding.
Returns:
A list containing the valid DataTimes for this sounding in order.
"""
return self._sortedTimes
class _SoundingTimeLayer(object):
"""
The second-level sounding object returned when calling SoundingsSupport.getSounding.
This object acts as a 2-tiered dict which is keyed by level then parameter
name. Calling levels() will return all valid keys into this
object. Calling time() will return the DataTime for this particular layer.
"""
def __init__(self, dataTime):
self._dataTime = dataTime
self._dataDict = {}
def _addItem(self, parameter, level, value):
asString = str(level)
levelLayer = self._dataDict.get(asString, _SoundingTimeAndLevelLayer(self._dataTime, asString))
levelLayer._addItem(parameter, value)
self._dataDict[asString] = levelLayer
def __getitem__(self, key):
asString = str(key)
if asString in self._dataDict:
return self._dataDict[asString]
else:
raise KeyError("Level " + str(key) + " is not a valid level for this sounding.")
def __len__(self):
return len(self._dataDict)
def time(self):
"""
Returns the DataTime for this sounding cube layer.
Returns:
The DataTime for this sounding layer.
"""
return self._dataTime
def levels(self):
"""
Returns the valid levels for this sounding.
Returns:
A list containing the valid levels for this sounding in order of
closest to surface to highest from surface.
"""
sortedLevels = [Level(level) for level in self._dataDict.keys()]
sortedLevels.sort()
return [str(level) for level in sortedLevels]
class _SoundingTimeAndLevelLayer(object):
"""
The bottom-level sounding object returned when calling SoundingsSupport.getSounding.
This object acts as a dict which is keyed by parameter name. Calling
parameters() will return all valid keys into this object. Calling time()
will return the DataTime for this particular layer. Calling level() will
return the level for this layer.
"""
def __init__(self, time, level):
self._time = time
self._level = level
self._parameters = {}
def _addItem(self, parameter, value):
self._parameters[parameter] = value
def __getitem__(self, key):
return self._parameters[key]
def __len__(self):
return len(self._parameters)
def level(self):
"""
Returns the level for this sounding cube layer.
Returns:
The level for this sounding layer.
"""
return self._level
def parameters(self):
"""
Returns the valid parameters for this sounding.
Returns:
A list containing the valid parameter names.
"""
return list(self._parameters.keys())
def time(self):
"""
Returns the DataTime for this sounding cube layer.
Returns:
The DataTime for this sounding layer.
"""
return self._time
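__determineForecastHours() above groups every available time under its reference time with a defaultdict, then takes the newest run when no refTime was given. The grouping pattern, with (refTime, forecastHour) tuples standing in for DataTime objects:

```python
from collections import defaultdict

# (refTime, forecastHour) pairs standing in for DataTime objects.
data_times = [
    ('2018-09-05 06:00:00', 3),
    ('2018-09-05 06:00:00', 6),
    ('2018-09-05 12:00:00', 3),
]

# Group each time under its reference time, as __determineForecastHours does.
by_ref_time = defaultdict(list)
for ref_time, fcst_hour in data_times:
    by_ref_time[ref_time].append((ref_time, fcst_hour))

# With no refTime specified, take the latest model run.
latest = max(by_ref_time.keys())
forecast_hours = by_ref_time[latest]
# latest == '2018-09-05 12:00:00'; forecast_hours holds one entry
```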

@@ -0,0 +1,230 @@
# #
# #
#
# Routes requests to the Data Access Framework through Python Thrift.
#
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 05/21/13 2023 dgilling Initial Creation.
# 01/06/14 2537 bsteffen Share geometry WKT.
# 03/03/14 2673 bsteffen Add ability to query only ref times.
# 07/22/14 3185 njensen Added optional/default args to newDataRequest
# 07/23/14 3185 njensen Added new methods
# 07/30/14 3185 njensen Renamed valid identifiers to optional
# 06/30/15 4569 nabowle Use hex WKB for geometries.
# 04/13/15 5379 tgurney Add getIdentifierValues()
# 06/01/16 5587 tgurney Add new signatures for
# getRequiredIdentifiers() and
# getOptionalIdentifiers()
# 08/01/16 2416 tgurney Add getNotificationFilter()
# 10/13/16 5916 bsteffen Correct grid shape, allow lazy grid lat/lon
# 10/26/16 5919 njensen Speed up geometry creation in getGeometryData()
#
import numpy
import shapely.wkb
from dynamicserialize.dstypes.com.raytheon.uf.common.dataaccess.impl import DefaultDataRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataaccess.request import GetAvailableLocationNamesRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataaccess.request import GetAvailableTimesRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataaccess.request import GetGeometryDataRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataaccess.request import GetGridDataRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataaccess.request import GetGridLatLonRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataaccess.request import GetAvailableParametersRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataaccess.request import GetAvailableLevelsRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataaccess.request import GetRequiredIdentifiersRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataaccess.request import GetOptionalIdentifiersRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataaccess.request import GetIdentifierValuesRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataaccess.request import GetSupportedDatatypesRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataaccess.request import GetNotificationFilterRequest
from awips import ThriftClient
from awips.dataaccess import PyGeometryData
from awips.dataaccess import PyGridData
class LazyGridLatLon(object):
def __init__(self, client, nx, ny, envelope, crsWkt):
self._latLonGrid = None
self._client = client
self._request = GetGridLatLonRequest()
self._request.setNx(nx)
self._request.setNy(ny)
self._request.setEnvelope(envelope)
self._request.setCrsWkt(crsWkt)
def __call__(self):
        # It's important that the data is cached internally so that if multiple
# GridData are sharing the same delegate then they can also share a
# single request for the LatLon information.
if self._latLonGrid is None:
response = self._client.sendRequest(self._request)
nx = response.getNx()
ny = response.getNy()
latData = numpy.reshape(numpy.array(response.getLats()), (ny, nx))
lonData = numpy.reshape(numpy.array(response.getLons()), (ny, nx))
self._latLonGrid = (lonData, latData)
return self._latLonGrid
class ThriftClientRouter(object):
def __init__(self, host='localhost'):
self._client = ThriftClient.ThriftClient(host)
self._lazyLoadGridLatLon = False
def setLazyLoadGridLatLon(self, lazyLoadGridLatLon):
self._lazyLoadGridLatLon = lazyLoadGridLatLon
def getAvailableTimes(self, request, refTimeOnly):
timesRequest = GetAvailableTimesRequest()
timesRequest.setRequestParameters(request)
timesRequest.setRefTimeOnly(refTimeOnly)
response = self._client.sendRequest(timesRequest)
return response
def getGridData(self, request, times):
gridDataRequest = GetGridDataRequest()
gridDataRequest.setIncludeLatLonData(not self._lazyLoadGridLatLon)
gridDataRequest.setRequestParameters(request)
        # If times is iterable, the user requested grid data with a list of
        # DataTime objects; otherwise assume a single TimeRange was meant for
        # the request.
try:
iter(times)
gridDataRequest.setRequestedTimes(times)
except TypeError:
gridDataRequest.setRequestedPeriod(times)
response = self._client.sendRequest(gridDataRequest)
locSpecificData = {}
locNames = response.getSiteNxValues().keys()
for location in locNames:
nx = response.getSiteNxValues()[location]
ny = response.getSiteNyValues()[location]
if self._lazyLoadGridLatLon:
envelope = response.getSiteEnvelopes()[location]
crsWkt = response.getSiteCrsWkt()[location]
delegate = LazyGridLatLon(
self._client, nx, ny, envelope, crsWkt)
locSpecificData[location] = (nx, ny, delegate)
else:
latData = numpy.reshape(numpy.array(
response.getSiteLatGrids()[location]), (ny, nx))
lonData = numpy.reshape(numpy.array(
response.getSiteLonGrids()[location]), (ny, nx))
locSpecificData[location] = (nx, ny, (lonData, latData))
retVal = []
for gridDataRecord in response.getGridData():
locationName = gridDataRecord.getLocationName()
locData = locSpecificData[locationName]
if self._lazyLoadGridLatLon:
retVal.append(PyGridData.PyGridData(gridDataRecord, locData[
0], locData[1], latLonDelegate=locData[2]))
else:
retVal.append(PyGridData.PyGridData(
gridDataRecord, locData[0], locData[1], locData[2]))
return retVal
def getGeometryData(self, request, times):
geoDataRequest = GetGeometryDataRequest()
geoDataRequest.setRequestParameters(request)
        # If times is iterable, the user requested geometry data with a list
        # of DataTime objects; otherwise assume a single TimeRange was meant
        # for the request.
try:
iter(times)
geoDataRequest.setRequestedTimes(times)
except TypeError:
geoDataRequest.setRequestedPeriod(times)
response = self._client.sendRequest(geoDataRequest)
geometries = []
for wkb in response.getGeometryWKBs():
            # each wkb is a numpy.ndarray of dtype int8;
            # convert it to a byte string and load it with shapely
geometries.append(shapely.wkb.loads(wkb.tostring()))
retVal = []
for geoDataRecord in response.getGeoData():
geom = geometries[geoDataRecord.getGeometryWKBindex()]
retVal.append(PyGeometryData.PyGeometryData(geoDataRecord, geom))
return retVal
def getAvailableLocationNames(self, request):
locNamesRequest = GetAvailableLocationNamesRequest()
locNamesRequest.setRequestParameters(request)
response = self._client.sendRequest(locNamesRequest)
return response
def getAvailableParameters(self, request):
paramReq = GetAvailableParametersRequest()
paramReq.setRequestParameters(request)
response = self._client.sendRequest(paramReq)
return response
def getAvailableLevels(self, request):
levelReq = GetAvailableLevelsRequest()
levelReq.setRequestParameters(request)
response = self._client.sendRequest(levelReq)
return response
def getRequiredIdentifiers(self, request):
if str(request) == request:
# Handle old version getRequiredIdentifiers(str)
request = self.newDataRequest(request)
idReq = GetRequiredIdentifiersRequest()
idReq.setRequest(request)
response = self._client.sendRequest(idReq)
return response
def getOptionalIdentifiers(self, request):
if str(request) == request:
# Handle old version getOptionalIdentifiers(str)
request = self.newDataRequest(request)
idReq = GetOptionalIdentifiersRequest()
idReq.setRequest(request)
response = self._client.sendRequest(idReq)
return response
def getIdentifierValues(self, request, identifierKey):
idValReq = GetIdentifierValuesRequest()
idValReq.setIdentifierKey(identifierKey)
idValReq.setRequestParameters(request)
response = self._client.sendRequest(idValReq)
return response
    def newDataRequest(self, datatype, parameters=(), levels=(), locationNames=(), envelope=None, **kwargs):
req = DefaultDataRequest()
if datatype:
req.setDatatype(datatype)
if parameters:
req.setParameters(*parameters)
if levels:
req.setLevels(*levels)
if locationNames:
req.setLocationNames(*locationNames)
if envelope:
req.setEnvelope(envelope)
if kwargs:
# any args leftover are assumed to be identifiers
req.identifiers = kwargs
return req
def getSupportedDatatypes(self):
response = self._client.sendRequest(GetSupportedDatatypesRequest())
return response
def getNotificationFilter(self, request):
notifReq = GetNotificationFilterRequest()
notifReq.setRequestParameters(request)
response = self._client.sendRequest(notifReq)
return response
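The getGridData()/getGeometryData() methods above dispatch on whether `times` is iterable (a list of DataTime objects) or a single TimeRange. A minimal, server-free sketch of that duck-typing pattern — the request class here is a hypothetical stand-in, not the real GetGridDataRequest:

```python
class FakeRequest(object):
    """Hypothetical stand-in that records which setter was called."""
    def __init__(self):
        self.requestedTimes = None
        self.requestedPeriod = None

    def setRequestedTimes(self, times):
        self.requestedTimes = list(times)

    def setRequestedPeriod(self, period):
        self.requestedPeriod = period


def applyTimes(request, times):
    # Same pattern as getGridData(): try to iterate over times; a
    # non-iterable argument raises TypeError and is treated as a
    # single TimeRange-like period instead.
    try:
        iter(times)
        request.setRequestedTimes(times)
    except TypeError:
        request.setRequestedPeriod(times)
    return request
```

Note that strings are also iterable, so a bare string would be routed to setRequestedTimes(); the real methods only ever receive DataTime lists or TimeRange objects.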

awips/dataaccess/__init__.py Normal file
##
##
#
# __init__.py for awips.dataaccess package
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 12/10/12 njensen Initial Creation.
# Feb 14, 2013 1614 bsteffen refactor data access framework
# to use single request.
# Apr 09, 2013 1871 njensen Add doc strings
# Jun 03, 2013 2023 dgilling Add getAttributes to IData, add
# getLatLonGrids() to IGridData.
# Aug 01, 2016 2416 tgurney Add INotificationSubscriber
# and INotificationFilter
#
#
__all__ = [
]
import abc
class IDataRequest(object):
"""
An IDataRequest to be submitted to the DataAccessLayer to retrieve data.
"""
__metaclass__ = abc.ABCMeta
@abc.abstractmethod
def setDatatype(self, datatype):
"""
Sets the datatype of the request.
Args:
datatype: A string of the datatype, such as "grid", "radar", "gfe", "obs"
"""
return
@abc.abstractmethod
def addIdentifier(self, key, value):
"""
Adds an identifier to the request. Identifiers are specific to the
datatype being requested.
Args:
key: the string key of the identifier
value: the value of the identifier
"""
return
@abc.abstractmethod
def setParameters(self, params):
"""
Sets the parameters of data to request.
Args:
params: a list of strings of parameters to request
"""
return
@abc.abstractmethod
def setLevels(self, levels):
"""
Sets the levels of data to request. Not all datatypes support levels.
Args:
levels: a list of strings of level abbreviations to request
"""
return
@abc.abstractmethod
def setEnvelope(self, env):
"""
Sets the envelope of the request. If supported by the datatype factory,
the data returned for the request will be constrained to only the data
within the envelope.
Args:
env: a shapely geometry
"""
return
@abc.abstractmethod
def setLocationNames(self, locationNames):
"""
Sets the location names of the request.
Args:
locationNames: a list of strings of location names to request
"""
return
@abc.abstractmethod
def getDatatype(self):
"""
Gets the datatype of the request
Returns:
the datatype set on the request
"""
return
@abc.abstractmethod
def getIdentifiers(self):
"""
Gets the identifiers on the request
Returns:
a dictionary of the identifiers
"""
return
@abc.abstractmethod
def getLevels(self):
"""
Gets the levels on the request
Returns:
a list of strings of the levels
"""
return
@abc.abstractmethod
def getLocationNames(self):
"""
Gets the location names on the request
Returns:
a list of strings of the location names
"""
return
@abc.abstractmethod
def getEnvelope(self):
"""
Gets the envelope on the request
Returns:
a rectangular shapely geometry
"""
return
class IData(object):
"""
An IData representing data returned from the DataAccessLayer.
"""
__metaclass__ = abc.ABCMeta
@abc.abstractmethod
def getAttribute(self, key):
"""
Gets an attribute of the data.
Args:
key: the key of the attribute
Returns:
the value of the attribute
"""
return
@abc.abstractmethod
def getAttributes(self):
"""
Gets the valid attributes for the data.
Returns:
a list of strings of the attribute names
"""
return
@abc.abstractmethod
def getDataTime(self):
"""
Gets the data time of the data.
Returns:
the data time of the data, or None if no time is associated
"""
return
@abc.abstractmethod
def getLevel(self):
"""
Gets the level of the data.
Returns:
the level of the data, or None if no level is associated
"""
return
@abc.abstractmethod
def getLocationName(self, param):
"""
Gets the location name of the data.
Returns:
the location name of the data, or None if no location name is
associated
"""
return
class IGridData(IData):
"""
An IData representing grid data that is returned by the DataAccessLayer.
"""
@abc.abstractmethod
def getParameter(self):
"""
Gets the parameter of the data.
Returns:
the parameter of the data
"""
return
@abc.abstractmethod
def getUnit(self):
"""
Gets the unit of the data.
Returns:
the string abbreviation of the unit, or None if no unit is associated
"""
return
@abc.abstractmethod
def getRawData(self):
"""
Gets the grid data as a numpy array.
Returns:
a numpy array of the data
"""
return
@abc.abstractmethod
def getLatLonCoords(self):
"""
Gets the lat/lon coordinates of the grid data.
Returns:
a tuple where the first element is a numpy array of lons, and the
second element is a numpy array of lats
"""
return
class IGeometryData(IData):
"""
An IData representing geometry data that is returned by the DataAccessLayer.
"""
@abc.abstractmethod
def getGeometry(self):
"""
Gets the geometry of the data.
Returns:
a shapely geometry
"""
return
@abc.abstractmethod
def getParameters(self):
"""Gets the parameters of the data.
Returns:
a list of strings of the parameter names
"""
return
@abc.abstractmethod
def getString(self, param):
"""
Gets the string value of the specified param.
Args:
param: the string name of the param
Returns:
the string value of the param
"""
return
@abc.abstractmethod
def getNumber(self, param):
"""
Gets the number value of the specified param.
Args:
param: the string name of the param
Returns:
the number value of the param
"""
return
@abc.abstractmethod
def getUnit(self, param):
"""
Gets the unit of the specified param.
Args:
param: the string name of the param
Returns:
the string abbreviation of the unit of the param
"""
return
@abc.abstractmethod
def getType(self, param):
"""
Gets the type of the param.
Args:
param: the string name of the param
Returns:
a string of the type of the parameter, such as
"STRING", "INT", "LONG", "FLOAT", or "DOUBLE"
"""
return
class INotificationSubscriber(object):
"""
    An INotificationSubscriber representing a notification subscriber returned
    from the DataNotificationLayer.
"""
__metaclass__ = abc.ABCMeta
@abc.abstractmethod
def subscribe(self, callback):
"""
Subscribes to the requested data. Method will not return until close is
called in a separate thread.
Args:
callback: the method to call with the IGridData/IGeometryData
"""
pass
@abc.abstractmethod
def close(self):
"""Closes the notification subscriber"""
pass
class INotificationFilter(object):
"""
Represents data required to filter a set of URIs and
return a corresponding list of IDataRequest to retrieve data for.
"""
__metaclass__ = abc.ABCMeta
@abc.abstractmethod
    def accept(self, dataUri):
pass
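The abstract interfaces above are implemented by concrete classes elsewhere in the package (e.g. PyGridData, PyGeometryData). A minimal hypothetical implementation of the INotificationFilter contract, just to illustrate the expected shape — the class and prefix scheme here are illustrative, not part of the real package:

```python
class UriPrefixFilter(object):
    """Hypothetical INotificationFilter implementation: accepts any
    data URI that starts with a configured prefix."""

    def __init__(self, prefix):
        self._prefix = prefix

    def accept(self, dataUri):
        # Return True if this URI should produce a notification.
        return dataUri.startswith(self._prefix)
```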

awips/gfe/IFPClient.py Normal file
##
##
from awips import ThriftClient
from dynamicserialize.dstypes.com.raytheon.uf.common.dataplugin.gfe.db.objects import DatabaseID
from dynamicserialize.dstypes.com.raytheon.uf.common.dataplugin.gfe.db.objects import ParmID
from dynamicserialize.dstypes.com.raytheon.uf.common.dataplugin.gfe.request import CommitGridsRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataplugin.gfe.request import GetGridInventoryRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataplugin.gfe.request import GetParmListRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataplugin.gfe.request import GetSelectTimeRangeRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataplugin.gfe.server.request import CommitGridRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.message import WsId
from dynamicserialize.dstypes.com.raytheon.uf.common.site.requests import GetActiveSitesRequest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataplugin.gfe.server.message import ServerResponse
#
# Provides a Python-based interface for executing GFE requests.
#
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 07/26/12 dgilling Initial Creation.
#
#
#
class IFPClient(object):
def __init__(self, host, port, user, site=None, progName=None):
self.__thrift = ThriftClient.ThriftClient(host, port)
self.__wsId = WsId(userName=user, progName=progName)
# retrieve default site
if site is None:
sr = self.getSiteID()
if len(sr.getPayload()) > 0:
site = sr.getPayload()[0]
self.__siteId = site
def commitGrid(self, request):
if type(request) is CommitGridRequest:
return self.__commitGrid([request])
elif self.__isHomogenousIterable(request, CommitGridRequest):
return self.__commitGrid([cgr for cgr in request])
raise TypeError("Invalid type: " + str(type(request)) + " specified to commitGrid(). Only accepts CommitGridRequest or lists of CommitGridRequest.")
def __commitGrid(self, requests):
ssr = ServerResponse()
request = CommitGridsRequest()
request.setCommits(requests)
sr = self.__makeRequest(request)
ssr.setMessages(sr.getMessages())
return ssr
def getParmList(self, id):
argType = type(id)
if argType is DatabaseID:
return self.__getParmList([id])
elif self.__isHomogenousIterable(id, DatabaseID):
return self.__getParmList([dbid for dbid in id])
raise TypeError("Invalid type: " + str(argType) + " specified to getParmList(). Only accepts DatabaseID or lists of DatabaseID.")
def __getParmList(self, ids):
ssr = ServerResponse()
request = GetParmListRequest()
request.setDbIds(ids)
sr = self.__makeRequest(request)
ssr.setMessages(sr.getMessages())
list = sr.getPayload() if sr.getPayload() is not None else []
ssr.setPayload(list)
return ssr
def __isHomogenousIterable(self, iterable, classType):
try:
iterator = iter(iterable)
for item in iterator:
if not isinstance(item, classType):
return False
except TypeError:
return False
return True
def getGridInventory(self, parmID):
if type(parmID) is ParmID:
sr = self.__getGridInventory([parmID])
list = []
try:
list = sr.getPayload()[parmID]
except KeyError:
                # no-op, we've already defaulted the TimeRange list to empty
pass
sr.setPayload(list)
return sr
elif self.__isHomogenousIterable(parmID, ParmID):
return self.__getGridInventory([id for id in parmID])
raise TypeError("Invalid type: " + str(type(parmID)) + " specified to getGridInventory(). Only accepts ParmID or lists of ParmID.")
def __getGridInventory(self, parmIDs):
ssr = ServerResponse()
request = GetGridInventoryRequest()
request.setParmIds(parmIDs)
sr = self.__makeRequest(request)
ssr.setMessages(sr.getMessages())
trs = sr.getPayload() if sr.getPayload() is not None else {}
ssr.setPayload(trs)
return ssr
def getSelectTR(self, name):
request = GetSelectTimeRangeRequest()
request.setName(name)
sr = self.__makeRequest(request)
ssr = ServerResponse()
ssr.setMessages(sr.getMessages())
ssr.setPayload(sr.getPayload())
return ssr
def getSiteID(self):
ssr = ServerResponse()
request = GetActiveSitesRequest()
sr = self.__makeRequest(request)
ssr.setMessages(sr.getMessages())
        ids = sr.getPayload() if sr.getPayload() is not None else []
        ssr.setPayload(ids)
        return ssr
def __makeRequest(self, request):
try:
request.setSiteID(self.__siteId)
except AttributeError:
pass
try:
request.setWorkstationID(self.__wsId)
except AttributeError:
pass
sr = ServerResponse()
response = None
try:
response = self.__thrift.sendRequest(request)
except ThriftClient.ThriftRequestException as e:
sr.setMessages([str(e)])
try:
sr.setPayload(response.getPayload())
except AttributeError:
sr.setPayload(response)
try:
sr.setMessages(response.getMessages())
except AttributeError:
# not a server response, nothing else to do
pass
return sr
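The private __isHomogenousIterable() helper used throughout IFPClient generalizes cleanly; a standalone sketch of the same check:

```python
def isHomogeneousIterable(iterable, classType):
    # True only when every element is an instance of classType;
    # a non-iterable argument yields False instead of raising.
    # Note: an empty iterable is considered homogeneous, matching
    # the behavior of IFPClient.__isHomogenousIterable().
    try:
        for item in iter(iterable):
            if not isinstance(item, classType):
                return False
    except TypeError:
        return False
    return True
```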

awips/gfe/__init__.py Normal file
##
##
#
# __init__.py for awips.gfe package
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 07/26/12 dgilling Initial Creation.
#
#
#
__all__ = [
]

##
##
#
# Library for accessing localization files from python.
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# --------- -------- --------- --------------------------
# 08/09/17 5731 bsteffen Initial Creation.
import urllib2
from json import load as loadjson
from xml.etree.ElementTree import parse as parseXml
from base64 import b64encode
from StringIO import StringIO
from getpass import getuser
import dateutil.parser
import contextlib
import os
from urlparse import urlunparse, urljoin
NON_EXISTENT_CHECKSUM = 'NON_EXISTENT_CHECKSUM'
DIRECTORY_CHECKSUM = 'DIRECTORY_CHECKSUM'
class LocalizationFileVersionConflictException(Exception):
pass
class LocalizationFileDoesNotExistException(Exception):
pass
class LocalizationFileIsNotDirectoryException(Exception):
pass
class LocalizationContext(object):
"""A localization context defines the scope of a localization file.
For example the base localization context includes all the default files
installed with EDEX, while a particular user context has custom files for
that user.
A localization context consists of a level and name. The level defines what
kind of entity this context is valid for, such as 'base', 'site', or 'user'.
The name identifies the specific entity, for example the name of a 'user'
level context is usually the username. The 'base' level does not have a name
    because there can be only one 'base' context.
Attributes:
level: the localization level
name: the context name
"""
def __init__(self, level="base", name=None, type="common_static"):
if level != "base":
assert name is not None
self.level = level
self.name = name
self.type = type
def isBase(self):
return self.level == "base"
def _getUrlComponent(self):
if self.isBase():
return self.type + '/' + "base/"
else:
return self.type + '/' + self.level + '/' + self.name + '/'
def __str__(self):
if self.isBase():
return self.type + ".base"
else:
return self.type + "." + self.level + "." + self.name
def __eq__(self, other):
return self.level == other.level and \
self.name == other.name and \
self.type == other.type
def __hash__(self):
return hash((self.level, self.name, self.type))
class _LocalizationOutput(StringIO):
"""A file-like object for writing a localization file.
The contents being written are stored in memory and written to a
localization server only when the writing is finished.
This object should be used as a context manager, a save operation will be
executed if the context exits with no errors. If errors occur the partial
contents are abandoned and the server is unchanged.
It is also possible to save the contents to the server with the save()
method.
"""
def __init__(self, manager, file):
StringIO.__init__(self)
self._manager = manager
self._file = file
def save(self):
"""Send the currently written contents to the server."""
request = self._manager._buildRequest(self._file.context, self._file.path, method="PUT")
request.add_data(self.getvalue())
request.add_header("If-Match", self._file.checksum)
try:
urllib2.urlopen(request)
except urllib2.HTTPError as e:
if e.code == 409:
raise LocalizationFileVersionConflictException, e.read()
else:
raise e
def __enter__(self):
return self
def __exit__(self, exc_type, exc_value, traceback):
if exc_type is None:
self.save()
def __str__(self):
return '<' + self.__class__.__name__ + " for " + str(self._file) + '>'
class LocalizationFile(object):
"""A specific file stored in localization.
A localization file is uniquely defined by the context and path. There can
only be one valid file for that path and localization at a time. To access
the contents of the file use the open method.
Attributes:
context: A LocalizationContext
path: A path to this file
checksum: A string representation of a checksum generated from the file contents.
        timestamp: A datetime.datetime object indicating when the file was last modified.
"""
def __init__(self, manager, context, path, checksum, timestamp):
"""Initialize a LocalizationFile with the given manager and attributes.
Args:
manager: A LocalizationFileManager to assist with server communication
context: A LocalizationContext
path: A path to this file
checksum: A string representation of a checksum generated from the file contents.
            timestamp: A datetime.datetime object indicating when the file was last modified.
"""
self._manager = manager
self.context = context
self.path = path
self.checksum = checksum
self.timestamp = timestamp
def open(self, mode='r'):
"""Open the file.
        This should always be called as part of a with statement. When
        writing, the content is not saved on the server until the with
        statement exits normally; if an error occurs the server is left
        unchanged.
Example:
with locFile.open('w') as output:
output.write('some content')
Args:
mode: 'r' for reading the file, 'w' for writing
Returns:
A file like object that can be used for reads or writes.
"""
if mode == 'r':
request = self._manager._buildRequest(self.context, self.path)
response = urllib2.urlopen(request)
# Not the recommended way of reading directories.
if not(self.isDirectory()):
checksum = response.headers["Content-MD5"]
if self.checksum != checksum:
raise RuntimeError, "Localization checksum mismatch " + self.checksum + " " + checksum
return contextlib.closing(response)
elif mode == 'w':
return _LocalizationOutput(self._manager, self)
else:
            raise ValueError("mode string must be 'r' or 'w' not " + str(mode))
def delete(self):
"""Delete this file from the server"""
request = self._manager._buildRequest(self.context, self.path, method='DELETE')
request.add_header("If-Match", self.checksum)
try:
urllib2.urlopen(request)
except urllib2.HTTPError as e:
if e.code == 409:
raise LocalizationFileVersionConflictException, e.read()
else:
raise e
def exists(self):
"""Check if this file actually exists.
Returns:
boolean indicating existence of this file
"""
return self.checksum != NON_EXISTENT_CHECKSUM
def isDirectory(self):
"""Check if this file is a directory.
A file must exist to be considered a directory.
Returns:
            boolean indicating whether this file is a directory
"""
return self.checksum == DIRECTORY_CHECKSUM
def getCheckSum(self):
return self.checksum
def getContext(self):
return self.context
def getPath(self):
return self.path
def getTimeStamp(self):
return self.timestamp
def __str__(self):
return str(self.context) + "/" + self.path
def __eq__(self, other):
return self.context == other.context and \
self.path == other.path and \
self.checksum == other.checksum \
and self.timestamp == other.timestamp
def __hash__(self):
return hash((self.context, self.path, self.checksum, self.timestamp))
def _getHost():
import subprocess
host = subprocess.check_output(
"source /awips2/fxa/bin/setup.env; echo $DEFAULT_HOST",
shell=True).strip()
if host:
return host
return 'localhost'
def _getSiteFromServer(host):
try:
from awips import ThriftClient
from dynamicserialize.dstypes.com.raytheon.uf.common.site.requests import GetPrimarySiteRequest
client = ThriftClient.ThriftClient(host)
return client.sendRequest(GetPrimarySiteRequest())
    except Exception:
# Servers that don't have GFE installed will not return a site
pass
def _getSiteFromEnv():
site = os.environ.get('FXA_LOCAL_SITE')
if site is None:
        site = os.environ.get('SITE_IDENTIFIER')
return site
def _getSite(host):
site = _getSiteFromEnv()
if not(site):
site = _getSiteFromServer(host)
return site
def _parseJsonList(manager, response, context, path):
fileList = []
jsonResponse = loadjson(response)
for name, jsonData in jsonResponse.items():
checksum = jsonData["checksum"]
timestampString = jsonData["timestamp"]
timestamp = dateutil.parser.parse(timestampString)
newpath = urljoin(path, name)
fileList.append(LocalizationFile(manager, context, newpath, checksum, timestamp))
return fileList
def _parseXmlList(manager, response, context, path):
fileList = []
for xmlData in parseXml(response).getroot().findall('file'):
name = xmlData.get("name")
checksum = xmlData.get("checksum")
timestampString = xmlData.get("timestamp")
timestamp = dateutil.parser.parse(timestampString)
newpath = urljoin(path, name)
fileList.append(LocalizationFile(manager, context, newpath, checksum, timestamp))
return fileList
class LocalizationFileManager(object):
"""Connects to a server and retrieves LocalizationFiles."""
def __init__(self, host=None, port=9581, path="/services/localization/", contexts=None, site=None, type="common_static"):
"""Initializes a LocalizationFileManager with connection parameters and context information
        All arguments are optional and will use defaults or attempt to figure out appropriate values from the environment.
Args:
host: A hostname of the localization server, such as 'ec'.
port: A port to use to connect to the localization server, usually 9581.
path: A path to reach the localization file service on the server.
contexts: A list of contexts to check for files, the order of the contexts will be used
for the order of incremental results and the priority of absolute results.
site: A site identifier to use for site specific contexts. This is only used if the contexts arg is None.
type: A localization type for contexts. This is only used if the contexts arg is None.
"""
if host is None:
host = _getHost()
if contexts is None:
if site is None :
site = _getSite(host)
contexts = [LocalizationContext("base", None, type)]
if site:
contexts.append(LocalizationContext("configured", site, type))
contexts.append(LocalizationContext("site", site, type))
contexts.append(LocalizationContext("user", getuser(), type))
netloc = host + ':' + str(port)
self._baseUrl = urlunparse(('http', netloc, path, None, None, None))
self._contexts = contexts
def _buildRequest(self, context, path, method='GET'):
url = urljoin(self._baseUrl, context._getUrlComponent())
url = urljoin(url, path)
request = urllib2.Request(url)
username = getuser()
        # Currently the password is ignored by the server;
        # this is the de facto standard for not providing one to this service.
password = username
base64string = b64encode('%s:%s' % (username, password))
request.add_header("Authorization", "Basic %s" % base64string)
if method != 'GET':
request.get_method = lambda: method
return request
def _normalizePath(self, path):
if path == '' or path == '/':
path = '.'
if path[0] == '/':
path = path[1:]
return path
def _list(self, path):
path = self._normalizePath(path)
if path[-1] != '/':
path += '/'
fileList = []
exists = False
for context in self._contexts:
try:
request = self._buildRequest(context, path)
request.add_header("Accept", "application/json, application/xml")
response = urllib2.urlopen(request)
exists = True
if not(response.geturl().endswith("/")):
# For ordinary files the server sends a redirect to remove the slash.
raise LocalizationFileIsNotDirectoryException, "Not a directory: " + path
elif response.headers["Content-Type"] == "application/xml":
fileList += _parseXmlList(self, response, context, path)
else:
fileList += _parseJsonList(self, response, context, path)
except urllib2.HTTPError as e:
if e.code != 404:
raise e
if not(exists):
raise LocalizationFileDoesNotExistException, "No such file or directory: " + path
return fileList
def _get(self, context, path):
path = self._normalizePath(path)
try:
request = self._buildRequest(context, path, method='HEAD')
resp = urllib2.urlopen(request)
if (resp.geturl().endswith("/")):
                checksum = DIRECTORY_CHECKSUM
else:
if "Content-MD5" not in resp.headers:
raise RuntimeError, "Missing Content-MD5 header in response from " + resp.geturl()
checksum = resp.headers["Content-MD5"]
if "Last-Modified" not in resp.headers:
raise RuntimeError, "Missing Last-Modified header in response from " + resp.geturl()
timestamp = dateutil.parser.parse(resp.headers["Last-Modified"])
return LocalizationFile(self, context, path, checksum, timestamp)
except urllib2.HTTPError as e:
if e.code != 404:
raise e
else:
return LocalizationFile(self, context, path, NON_EXISTENT_CHECKSUM, None)
def listAbsolute(self, path):
"""List the files in a localization directory, only a single file is returned for each unique path.
        If a file exists in more than one context then the highest level (furthest from base) is used.
Args:
path: A path to a directory that should be the root of the listing
Returns:
A list of LocalizationFiles
"""
merged = dict()
for file in self._list(path):
merged[file.path] = file
return sorted(merged.values(), key=lambda file: file.path)
def listIncremental(self, path):
"""List the files in a localization directory, this includes all files for all contexts.
Args:
path: A path to a directory that should be the root of the listing
Returns:
A list of tuples, each tuple will contain one or more files for the
same paths but different contexts. Each tuple will be ordered the
same as the contexts in this manager, generally with 'base' first
and 'user' last.
"""
merged = dict()
for file in self._list(path):
if file.path in merged:
merged[file.path] += (file,)
else:
merged[file.path] = (file, )
return sorted(merged.values(), key=lambda t: t[0].path)
def getAbsolute(self, path):
"""Get a single localization file from the highest level context where it exists.
Args:
path: A path to a localization file
Returns:
A Localization File with the specified path or None if the file does not exist in any context.
"""
for context in reversed(self._contexts):
f = self._get(context, path)
if f.exists():
return f
def getIncremental(self, path):
"""Get all the localization files that exist in any context for the provided path.
Args:
path: A path to a localization file
Returns:
A tuple containing all the files that exist for this path in any context. The tuple
will be ordered the same as the contexts in this manager, generally with 'base' first
and 'user' last.
"""
result = ()
for context in self._contexts:
f = self._get(context, path)
if f.exists():
result += (f,)
return result
def getSpecific(self, level, path):
"""Get a specific localization file at a given level, the file may not exist.
The file is returned for whichever context is valid for the provided level in this manager.
For writing new files this is the only way to get access to a file that
does not exist in order to create it.
Args:
level: the name of a localization level, such as "base", "site", "user"
path: A path to a localization file
Returns:
A Localization File with the specified path and a context for the specified level.
"""
for context in self._contexts:
if context.level == level:
return self._get(context, path)
raise ValueError, "No context defined for level " + level
def __str__(self):
contextsStr = '[' + ' '.join((str(c) for c in self._contexts)) + ']'
return '<' + self.__class__.__name__ + " for " + self._baseUrl + ' ' + contextsStr + '>'
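The merge step shared by listAbsolute() and listIncremental() can be shown without a server: with (path, level) pairs standing in for LocalizationFile objects, iterated in context priority order (base first, user last):

```python
def mergeAbsolute(entries):
    # Later (higher-level) entries overwrite earlier ones per path,
    # like listAbsolute(); result sorted by path.
    merged = {}
    for path, level in entries:
        merged[path] = (path, level)
    return sorted(merged.values())


def mergeIncremental(entries):
    # All entries for the same path are grouped into one tuple in
    # the order encountered, like listIncremental().
    merged = {}
    for path, level in entries:
        merged[path] = merged.get(path, ()) + ((path, level),)
    return sorted(merged.values(), key=lambda group: group[0][0])
```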

awips/localization/__init__.py Normal file
##
##
#
# __init__.py for awips.localization package
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# --------- -------- --------- --------------------------
# 08/10/17 5731 bsteffen Initial Creation.
__all__ = [
]

awips/qpidingest.py Normal file
#===============================================================================
# qpidingest.py
#
# @author: Aaron Anderson
# @organization: NOAA/WDTB OU/CIMMS
# @version: 1.0 02/19/2010
# @requires: QPID Python Client available from http://qpid.apache.org/download.html
# The Python Client is located under Single Component Package/Client
#
# From the README.txt Installation Instructions
# = INSTALLATION =
# Extract the release archive into a directory of your choice and set
# your PYTHONPATH accordingly:
#
# tar -xzf qpid-python-<version>.tar.gz -C <install-prefix>
# export PYTHONPATH=<install-prefix>/qpid-<version>/python
#
# ***EDEX and QPID must be running for this module to work***
#
# DESCRIPTION:
# This module is used to connect to QPID and send messages to the external.dropbox queue,
# which tells EDEX to ingest a data file from a specified path. This avoids having to copy
# a data file into an endpoint. Each message also contains a header which is used to determine
# which plugin should be used to decode the file. Each plugin has an xml file located in
# $EDEX_HOME/data/utility/edex_static/base/distribution that contains regular expressions
# that the header is compared to. When the header matches one of these regular expressions
# the file is decoded with that plugin. If you make changes to one of these xml files you
# must restart EDEX for the changes to take effect.
#
# NOTE: If the message is being sent but you do not see it being ingested in the EDEX log,
# check the xml files to make sure the header you are passing matches one of the regular
# expressions. Beware of spaces: some regular expressions require a space while others use
# a wildcard character, making the space optional. You are generally better off including
# the space, since it will then match both kinds of pattern. For the file in the example below,
# 20100218_185755_SAUS46KLOX.metar, SAUS46 KLOX is used as the header to make sure it matches.
#
#
# EXAMPLE:
# Simple example program:
#
#------------------------------------------------------------------------------
# import qpidingest
# #Tell EDEX to ingest a metar file from data_store. The filepath is
# #/data_store/20100218/metar/00/standard/20100218_005920_SAUS46KSEW.metar
#
# conn=qpidingest.IngestViaQPID() #defaults to localhost port 5672
#
# #If EDEX is not on the local machine you can make the connection as follows
# #conn=qpidingest.IngestViaQPID(host='<MACHINE NAME>',port=<PORT NUMBER>)
#
# conn.sendmessage('/data_store/20100218/metar/18/standard/20100218_185755_SAUS46KLOX.metar','SAUS46 KLOX')
# conn.close()
#-------------------------------------------------------------------------------
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# ....
# 06/13/2013 DR 16242 D. Friedman Add Qpid authentication info
# 03/06/2014 DR 17907 D. Friedman Workaround for issue QPID-5569
# 02/16/2017 DR 6084 bsteffen Support ssl connections
#
#===============================================================================
import os
import os.path
import qpid
from qpid.util import connect
from qpid.connection import Connection
from qpid.datatypes import Message, uuid4
QPID_USERNAME = 'guest'
QPID_PASSWORD = 'guest'
class IngestViaQPID:
def __init__(self, host='localhost', port=5672, ssl=None):
'''
Connect to QPID and make bindings to route message to external.dropbox queue
@param host: string hostname of computer running EDEX and QPID (default localhost)
@param port: integer port used to connect to QPID (default 5672)
@param ssl: boolean to determine whether ssl is used, default value of None will use ssl only if a client certificate is found.
'''
        try:
            socket = connect(host, port)
if "QPID_SSL_CERT_DB" in os.environ:
certdb = os.environ["QPID_SSL_CERT_DB"]
else:
certdb = os.path.expanduser("~/.qpid/")
if "QPID_SSL_CERT_NAME" in os.environ:
certname = os.environ["QPID_SSL_CERT_NAME"]
else:
certname = QPID_USERNAME
certfile = os.path.join(certdb, certname + ".crt")
if ssl or (ssl is None and os.path.exists(certfile)):
keyfile = os.path.join(certdb, certname + ".key")
trustfile = os.path.join(certdb, "root.crt")
socket = qpid.util.ssl(socket, keyfile=keyfile, certfile=certfile, ca_certs=trustfile)
self.connection = Connection (sock=socket, username=QPID_USERNAME, password=QPID_PASSWORD)
self.connection.start()
self.session = self.connection.session(str(uuid4()))
self.session.exchange_bind(exchange='amq.direct', queue='external.dropbox', binding_key='external.dropbox')
print 'Connected to Qpid'
        except Exception, e:
            print 'Unable to connect to Qpid:', e
def sendmessage(self, filepath, header):
'''
This function sends a message to the external.dropbox queue providing the path
to the file to be ingested and a header to determine the plugin to be used to
decode the file.
@param filepath: string full path to file to be ingested
@param header: string header used to determine plugin decoder to use
'''
props = self.session.delivery_properties(routing_key='external.dropbox')
head = self.session.message_properties(application_headers={'header':header},
user_id=QPID_USERNAME) # For issue QPID-5569. Fixed in Qpid 0.27
self.session.message_transfer(destination='amq.direct', message=Message(props, head, filepath))
def close(self):
'''
After all messages are sent call this function to close connection and make sure
there are no threads left open
'''
self.session.close(timeout=10)
print 'Connection to Qpid closed'
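The certificate-discovery rule in IngestViaQPID.__init__ above can be factored into pure functions for illustration (these helper names are hypothetical, not part of the module): the cert database comes from QPID_SSL_CERT_DB (default ~/.qpid/), the cert name from QPID_SSL_CERT_NAME (default the username), and SSL is used when requested explicitly, or when ssl is None and the client certificate exists.

```python
import os.path

def resolve_cert(environ, username):
    # Mirror the env-var fallbacks used above; environ is passed in
    # (rather than read from os.environ) so the rule is easy to test.
    certdb = environ.get("QPID_SSL_CERT_DB", os.path.expanduser("~/.qpid/"))
    certname = environ.get("QPID_SSL_CERT_NAME", username)
    return os.path.join(certdb, certname + ".crt")

def should_use_ssl(ssl_flag, cert_exists):
    # ssl=True forces SSL on; ssl=False forces it off;
    # ssl=None (the default) enables SSL only when a cert is found.
    return bool(ssl_flag) or (ssl_flag is None and cert_exists)
```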

awips/stomp.py Normal file

@ -0,0 +1,917 @@
#!/usr/bin/env python
##
##
"""Stomp Protocol Connectivity
This provides basic connectivity to a message broker supporting the 'stomp' protocol.
At the moment ACK, SEND, SUBSCRIBE, UNSUBSCRIBE, BEGIN, ABORT, COMMIT, CONNECT and DISCONNECT operations
are supported.
This changes the previous version, which required a listener per subscription -- now a listener object
is registered via the 'add_listener' method and will receive all messages sent in response to all/any subscriptions.
(The reason for the change is that the handling of an 'ack' becomes problematic unless the listener mechanism
is decoupled from subscriptions).
Note that you must 'start' an instance of Connection to begin receiving messages. For example:
conn = stomp.Connection([('localhost', 62003)], 'myuser', 'mypass')
conn.start()
Meta-Data
---------
Author: Jason R Briggs
License: http://www.apache.org/licenses/LICENSE-2.0
Start Date: 2005/12/01
Last Revision Date: $Date: 2008/09/11 00:16 $
Notes/Attribution
-----------------
* uuid method courtesy of Carl Free Jr:
http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/213761
* patch from Andreas Schobel
* patches from Julian Scheid of Rising Sun Pictures (http://open.rsp.com.au)
* patch from Fernando
* patches from Eugene Strulyov
Updates
-------
* 2007/03/31 : (Andreas Schobel) patch to fix newlines problem in ActiveMQ 4.1
* 2007/09 : (JRB) updated to get stomp.py working in Jython as well as Python
* 2007/09/05 : (Julian Scheid) patch to allow sending custom headers
* 2007/09/18 : (JRB) changed code to use logging instead of just print. added logger for jython to work
* 2007/09/18 : (Julian Scheid) various updates, including:
- change incoming message handling so that callbacks are invoked on the listener not only for MESSAGE, but also for
CONNECTED, RECEIPT and ERROR frames.
- callbacks now get not only the payload but any headers specified by the server
- all outgoing messages now sent via a single method
- only one connection used
- change to use thread instead of threading
- sends performed on the calling thread
- receiver loop now deals with multiple messages in one received chunk of data
- added reconnection attempts and connection fail-over
- changed defaults for "user" and "passcode" to None instead of empty string (fixed transmission of those values)
- added readline support
* 2008/03/26 : (Fernando) added cStringIO for faster performance on large messages
* 2008/09/10 : (Eugene) remove lower() on headers to support case-sensitive header names
* 2008/09/11 : (JRB) fix incompatibilities with RabbitMQ, add wait for socket-connect
* 2008/10/28 : (Eugene) add jms map (from stomp1.1 ideas)
* 2008/11/25 : (Eugene) remove superfluous (incorrect) locking code
* 2009/02/05 : (JRB) remove code to replace underscores with dashes in header names (causes a problem in rabbit-mq)
* 2009/03/29 : (JRB) minor change to add logging config file
(JRB) minor change to add socket timeout, suggested by Israel
* 2009/04/01 : (Gavin) patch to change md5 to hashlib (for 2.6 compatibility)
* 2009/04/02 : (Fernando Ciciliati) fix overflow bug when waiting too long to connect to the broker
"""
import hashlib
import math
import random
import re
import socket
import sys
import thread
import threading
import time
import types
import xml.dom.minidom
from cStringIO import StringIO
#
# stomp.py version number
#
_version = 1.8
def _uuid( *args ):
"""
uuid courtesy of Carl Free Jr:
(http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/213761)
"""
t = long( time.time() * 1000 )
r = long( random.random() * 100000000000000000L )
try:
a = socket.gethostbyname( socket.gethostname() )
except:
# if we can't get a network address, just imagine one
a = random.random() * 100000000000000000L
data = str(t) + ' ' + str(r) + ' ' + str(a) + ' ' + str(args)
md5 = hashlib.md5()
md5.update(data)
data = md5.hexdigest()
return data
class DevNullLogger(object):
"""
dummy logging class for environments without the logging module
"""
def log(self, msg):
print msg
def devnull(self, msg):
pass
debug = devnull
info = devnull
warning = log
error = log
critical = log
exception = log
def isEnabledFor(self, lvl):
return False
#
# add logging if available
#
try:
import logging
import logging.config
logging.config.fileConfig("stomp.log.conf")
log = logging.getLogger('root')
except:
log = DevNullLogger()
class ConnectionClosedException(Exception):
"""
Raised in the receiver thread when the connection has been closed
by the server.
"""
pass
class NotConnectedException(Exception):
"""
Raised by Connection.__send_frame when there is currently no server
connection.
"""
pass
class ConnectionListener(object):
"""
This class should be used as a base class for objects registered
using Connection.add_listener().
"""
def on_connecting(self, host_and_port):
"""
Called by the STOMP connection once a TCP/IP connection to the
STOMP server has been established or re-established. Note that
at this point, no connection has been established on the STOMP
protocol level. For this, you need to invoke the "connect"
method on the connection.
\param host_and_port a tuple containing the host name and port
number to which the connection has been established.
"""
pass
def on_connected(self, headers, body):
"""
Called by the STOMP connection when a CONNECTED frame is
received, that is after a connection has been established or
re-established.
\param headers a dictionary containing all headers sent by the
server as key/value pairs.
\param body the frame's payload. This is usually empty for
CONNECTED frames.
"""
pass
def on_disconnected(self):
"""
Called by the STOMP connection when a TCP/IP connection to the
STOMP server has been lost. No messages should be sent via
the connection until it has been reestablished.
"""
pass
def on_message(self, headers, body):
"""
Called by the STOMP connection when a MESSAGE frame is
received.
\param headers a dictionary containing all headers sent by the
server as key/value pairs.
\param body the frame's payload - the message body.
"""
pass
def on_receipt(self, headers, body):
"""
Called by the STOMP connection when a RECEIPT frame is
received, sent by the server if requested by the client using
the 'receipt' header.
\param headers a dictionary containing all headers sent by the
server as key/value pairs.
\param body the frame's payload. This is usually empty for
RECEIPT frames.
"""
pass
def on_error(self, headers, body):
"""
Called by the STOMP connection when an ERROR frame is
received.
\param headers a dictionary containing all headers sent by the
server as key/value pairs.
\param body the frame's payload - usually a detailed error
description.
"""
pass
class Connection(object):
"""
Represents a STOMP client connection.
"""
def __init__(self,
host_and_ports = [ ('localhost', 61613) ],
user = None,
passcode = None,
prefer_localhost = True,
try_loopback_connect = True,
reconnect_sleep_initial = 0.1,
reconnect_sleep_increase = 0.5,
reconnect_sleep_jitter = 0.1,
reconnect_sleep_max = 60.0):
"""
Initialize and start this connection.
\param host_and_ports
a list of (host, port) tuples.
\param prefer_localhost
if True and the local host is mentioned in the (host,
port) tuples, try to connect to this first
\param try_loopback_connect
if True and the local host is found in the host
tuples, try connecting to it using loopback interface
(127.0.0.1)
\param reconnect_sleep_initial
initial delay in seconds to wait before reattempting
to establish a connection if connection to any of the
hosts fails.
\param reconnect_sleep_increase
factor by which the sleep delay is increased after
each connection attempt. For example, 0.5 means
to wait 50% longer than before the previous attempt,
1.0 means wait twice as long, and 0.0 means keep
the delay constant.
\param reconnect_sleep_max
maximum delay between connection attempts, regardless
of the reconnect_sleep_increase.
\param reconnect_sleep_jitter
random additional time to wait (as a percentage of
the time determined using the previous parameters)
between connection attempts in order to avoid
stampeding. For example, a value of 0.1 means to wait
an extra 0%-10% (randomly determined) of the delay
calculated using the previous three parameters.
"""
sorted_host_and_ports = []
sorted_host_and_ports.extend(host_and_ports)
        def is_local_host(host):
            # Defined unconditionally: also needed by the loopback check
            # below when try_loopback_connect is True but
            # prefer_localhost is False.
            return host in Connection.__localhost_names
        # If localhost is preferred, make sure all (host, port) tuples
        # that refer to the local host come first in the list
        if prefer_localhost:
            sorted_host_and_ports.sort(lambda x, y: (int(is_local_host(y[0]))
                                                     - int(is_local_host(x[0]))))
# If the user wishes to attempt connecting to local ports
# using the loopback interface, for each (host, port) tuple
# referring to a local host, add an entry with the host name
# replaced by 127.0.0.1 if it doesn't exist already
loopback_host_and_ports = []
if try_loopback_connect:
for host_and_port in sorted_host_and_ports:
if is_local_host(host_and_port[0]):
port = host_and_port[1]
if (not ("127.0.0.1", port) in sorted_host_and_ports
and not ("localhost", port) in sorted_host_and_ports):
loopback_host_and_ports.append(("127.0.0.1", port))
# Assemble the final, possibly sorted list of (host, port) tuples
self.__host_and_ports = []
self.__host_and_ports.extend(loopback_host_and_ports)
self.__host_and_ports.extend(sorted_host_and_ports)
self.__recvbuf = ''
self.__listeners = [ ]
self.__reconnect_sleep_initial = reconnect_sleep_initial
self.__reconnect_sleep_increase = reconnect_sleep_increase
self.__reconnect_sleep_jitter = reconnect_sleep_jitter
self.__reconnect_sleep_max = reconnect_sleep_max
self.__connect_headers = {}
if user is not None and passcode is not None:
self.__connect_headers['login'] = user
self.__connect_headers['passcode'] = passcode
self.__socket = None
self.__current_host_and_port = None
self.__receiver_thread_exit_condition = threading.Condition()
self.__receiver_thread_exited = False
#
# Manage the connection
#
def start(self):
"""
Start the connection. This should be called after all
listeners have been registered. If this method is not called,
no frames will be received by the connection.
"""
self.__running = True
self.__attempt_connection()
thread.start_new_thread(self.__receiver_loop, ())
def stop(self):
"""
Stop the connection. This is equivalent to calling
disconnect() but will do a clean shutdown by waiting for the
receiver thread to exit.
"""
self.disconnect()
self.__receiver_thread_exit_condition.acquire()
if not self.__receiver_thread_exited:
self.__receiver_thread_exit_condition.wait()
self.__receiver_thread_exit_condition.release()
def get_host_and_port(self):
"""
Return a (host, port) tuple indicating which STOMP host and
port is currently connected, or None if there is currently no
connection.
"""
return self.__current_host_and_port
def is_connected(self):
try:
return self.__socket is not None and self.__socket.getsockname()[1] != 0
except socket.error:
return False
#
# Manage objects listening to incoming frames
#
def add_listener(self, listener):
self.__listeners.append(listener)
def remove_listener(self, listener):
self.__listeners.remove(listener)
#
# STOMP transmissions
#
def subscribe(self, headers={}, **keyword_headers):
self.__send_frame_helper('SUBSCRIBE', '', self.__merge_headers([headers, keyword_headers]), [ 'destination' ])
def unsubscribe(self, headers={}, **keyword_headers):
self.__send_frame_helper('UNSUBSCRIBE', '', self.__merge_headers([headers, keyword_headers]), [ ('destination', 'id') ])
def send(self, message='', headers={}, **keyword_headers):
if '\x00' in message:
content_length_headers = {'content-length': len(message)}
else:
content_length_headers = {}
self.__send_frame_helper('SEND', message, self.__merge_headers([headers,
keyword_headers,
content_length_headers]), [ 'destination' ])
def ack(self, headers={}, **keyword_headers):
self.__send_frame_helper('ACK', '', self.__merge_headers([headers, keyword_headers]), [ 'message-id' ])
def begin(self, headers={}, **keyword_headers):
use_headers = self.__merge_headers([headers, keyword_headers])
if not 'transaction' in use_headers.keys():
use_headers['transaction'] = _uuid()
self.__send_frame_helper('BEGIN', '', use_headers, [ 'transaction' ])
return use_headers['transaction']
def abort(self, headers={}, **keyword_headers):
self.__send_frame_helper('ABORT', '', self.__merge_headers([headers, keyword_headers]), [ 'transaction' ])
def commit(self, headers={}, **keyword_headers):
self.__send_frame_helper('COMMIT', '', self.__merge_headers([headers, keyword_headers]), [ 'transaction' ])
def connect(self, headers={}, **keyword_headers):
if keyword_headers.has_key('wait') and keyword_headers['wait']:
while not self.is_connected(): time.sleep(0.1)
del keyword_headers['wait']
self.__send_frame_helper('CONNECT', '', self.__merge_headers([self.__connect_headers, headers, keyword_headers]), [ ])
def disconnect(self, headers={}, **keyword_headers):
self.__send_frame_helper('DISCONNECT', '', self.__merge_headers([self.__connect_headers, headers, keyword_headers]), [ ])
self.__running = False
        if self.__socket:
            if hasattr(socket, 'SHUT_RDWR'):
                self.__socket.shutdown(socket.SHUT_RDWR)
            self.__socket.close()
self.__current_host_and_port = None
# ========= PRIVATE MEMBERS =========
# List of all host names (unqualified, fully-qualified, and IP
# addresses) that refer to the local host (both loopback interface
# and external interfaces). This is used for determining
# preferred targets.
__localhost_names = [ "localhost",
"127.0.0.1",
socket.gethostbyname(socket.gethostname()),
socket.gethostname(),
socket.getfqdn(socket.gethostname()) ]
#
# Used to parse STOMP header lines in the format "key:value",
#
__header_line_re = re.compile('(?P<key>[^:]+)[:](?P<value>.*)')
#
# Used to parse the STOMP "content-length" header lines,
#
__content_length_re = re.compile('^content-length[:]\\s*(?P<value>[0-9]+)', re.MULTILINE)
def __merge_headers(self, header_map_list):
        """
        Helper function for combining multiple header maps into one.
        Later maps in the list override earlier ones for duplicate keys.
        """
headers = {}
for header_map in header_map_list:
for header_key in header_map.keys():
headers[header_key] = header_map[header_key]
return headers
def __convert_dict(self, payload):
"""
Encode python dictionary as <map>...</map> structure.
"""
xmlStr = "<map>\n"
for key in payload:
xmlStr += "<entry>\n"
xmlStr += "<string>%s</string>" % key
xmlStr += "<string>%s</string>" % payload[key]
xmlStr += "</entry>\n"
xmlStr += "</map>"
return xmlStr
def __send_frame_helper(self, command, payload, headers, required_header_keys):
"""
Helper function for sending a frame after verifying that a
given set of headers are present.
\param command the command to send
\param payload the frame's payload
\param headers a dictionary containing the frame's headers
\param required_header_keys a sequence enumerating all
required header keys. If an element in this sequence is itself
a tuple, that tuple is taken as a list of alternatives, one of
which must be present.
\throws ArgumentError if one of the required header keys is
not present in the header map.
"""
for required_header_key in required_header_keys:
if type(required_header_key) == tuple:
found_alternative = False
for alternative in required_header_key:
if alternative in headers.keys():
found_alternative = True
if not found_alternative:
raise KeyError("Command %s requires one of the following headers: %s" % (command, str(required_header_key)))
elif not required_header_key in headers.keys():
raise KeyError("Command %s requires header %r" % (command, required_header_key))
self.__send_frame(command, headers, payload)
def __send_frame(self, command, headers={}, payload=''):
"""
Send a STOMP frame.
"""
if type(payload) == dict:
headers["transformation"] = "jms-map-xml"
payload = self.__convert_dict(payload)
if self.__socket is not None:
frame = '%s\n%s\n%s\x00' % (command,
reduce(lambda accu, key: accu + ('%s:%s\n' % (key, headers[key])), headers.keys(), ''),
payload)
self.__socket.sendall(frame)
log.debug("Sent frame: type=%s, headers=%r, body=%r" % (command, headers, payload))
else:
raise NotConnectedException()
def __receiver_loop(self):
"""
Main loop listening for incoming data.
"""
try:
try:
threading.currentThread().setName("StompReceiver")
while self.__running:
log.debug('starting receiver loop')
if self.__socket is None:
break
try:
try:
for listener in self.__listeners:
if hasattr(listener, 'on_connecting'):
listener.on_connecting(self.__current_host_and_port)
while self.__running:
frames = self.__read()
for frame in frames:
(frame_type, headers, body) = self.__parse_frame(frame)
log.debug("Received frame: result=%r, headers=%r, body=%r" % (frame_type, headers, body))
frame_type = frame_type.lower()
if frame_type in [ 'connected',
'message',
'receipt',
'error' ]:
for listener in self.__listeners:
if hasattr(listener, 'on_%s' % frame_type):
eval('listener.on_%s(headers, body)' % frame_type)
else:
log.debug('listener %s has no such method on_%s' % (listener, frame_type))
else:
log.warning('Unknown response frame type: "%s" (frame length was %d)' % (frame_type, len(frame)))
finally:
try:
self.__socket.close()
except:
pass # ignore errors when attempting to close socket
self.__socket = None
self.__current_host_and_port = None
except ConnectionClosedException:
if self.__running:
log.error("Lost connection")
# Notify listeners
for listener in self.__listeners:
if hasattr(listener, 'on_disconnected'):
listener.on_disconnected()
# Clear out any half-received messages after losing connection
self.__recvbuf = ''
continue
else:
break
except:
log.exception("An unhandled exception was encountered in the stomp receiver loop")
finally:
self.__receiver_thread_exit_condition.acquire()
self.__receiver_thread_exited = True
self.__receiver_thread_exit_condition.notifyAll()
self.__receiver_thread_exit_condition.release()
def __read(self):
"""
Read the next frame(s) from the socket.
"""
fastbuf = StringIO()
while self.__running:
try:
c = self.__socket.recv(1024)
except:
c = ''
if len(c) == 0:
raise ConnectionClosedException
fastbuf.write(c)
if '\x00' in c:
break
self.__recvbuf += fastbuf.getvalue()
fastbuf.close()
result = []
if len(self.__recvbuf) > 0 and self.__running:
while True:
pos = self.__recvbuf.find('\x00')
if pos >= 0:
frame = self.__recvbuf[0:pos]
preamble_end = frame.find('\n\n')
if preamble_end >= 0:
content_length_match = Connection.__content_length_re.search(frame[0:preamble_end])
if content_length_match:
content_length = int(content_length_match.group('value'))
content_offset = preamble_end + 2
frame_size = content_offset + content_length
if frame_size > len(frame):
# Frame contains NUL bytes, need to
# read more
if frame_size < len(self.__recvbuf):
pos = frame_size
frame = self.__recvbuf[0:pos]
else:
# Haven't read enough data yet,
# exit loop and wait for more to
# arrive
break
result.append(frame)
self.__recvbuf = self.__recvbuf[pos+1:]
else:
break
return result
def __transform(self, body, transType):
"""
Perform body transformation. Currently, the only supported transformation is
'jms-map-xml', which converts a map into python dictionary. This can be extended
to support other transformation types.
The body has the following format:
<map>
<entry>
<string>name</string>
<string>Dejan</string>
</entry>
<entry>
<string>city</string>
<string>Belgrade</string>
</entry>
</map>
(see http://docs.codehaus.org/display/STOMP/Stomp+v1.1+Ideas)
"""
if transType != 'jms-map-xml':
return body
try:
entries = {}
doc = xml.dom.minidom.parseString(body)
rootElem = doc.documentElement
for entryElem in rootElem.getElementsByTagName("entry"):
pair = []
for node in entryElem.childNodes:
if not isinstance(node, xml.dom.minidom.Element): continue
pair.append(node.firstChild.nodeValue)
assert len(pair) == 2
entries[pair[0]] = pair[1]
return entries
except Exception, ex:
# unable to parse message. return original
return body
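The 'jms-map-xml' transformation described in the docstring above can be sketched as a standalone (Python 3) function; `jms_map_to_dict` is an illustrative name, not part of this module. Each `<entry>` holds two `<string>` children, interpreted as key and value:

```python
import xml.dom.minidom

def jms_map_to_dict(body):
    # Parse a <map><entry><string>k</string><string>v</string></entry>...</map>
    # payload into a plain dictionary, skipping non-element child nodes.
    entries = {}
    doc = xml.dom.minidom.parseString(body)
    for entry in doc.documentElement.getElementsByTagName("entry"):
        pair = [node.firstChild.nodeValue
                for node in entry.childNodes
                if isinstance(node, xml.dom.minidom.Element)]
        assert len(pair) == 2
        entries[pair[0]] = pair[1]
    return entries

sample = ("<map><entry><string>name</string><string>Dejan</string></entry>"
          "<entry><string>city</string><string>Belgrade</string></entry></map>")
```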
def __parse_frame(self, frame):
"""
Parse a STOMP frame into a (frame_type, headers, body) tuple,
where frame_type is the frame type as a string (e.g. MESSAGE),
headers is a map containing all header key/value pairs, and
body is a string containing the frame's payload.
"""
preamble_end = frame.find('\n\n')
preamble = frame[0:preamble_end]
preamble_lines = preamble.split('\n')
body = frame[preamble_end+2:]
# Skip any leading newlines
first_line = 0
while first_line < len(preamble_lines) and len(preamble_lines[first_line]) == 0:
first_line += 1
# Extract frame type
frame_type = preamble_lines[first_line]
# Put headers into a key/value map
headers = {}
for header_line in preamble_lines[first_line+1:]:
header_match = Connection.__header_line_re.match(header_line)
if header_match:
headers[header_match.group('key')] = header_match.group('value')
if 'transformation' in headers:
body = self.__transform(body, headers['transformation'])
return (frame_type, headers, body)
def __attempt_connection(self):
"""
Try connecting to the (host, port) tuples specified at construction time.
"""
sleep_exp = 1
while self.__running and self.__socket is None:
for host_and_port in self.__host_and_ports:
try:
log.debug("Attempting connection to host %s, port %s" % host_and_port)
self.__socket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
self.__socket.settimeout(None)
self.__socket.connect(host_and_port)
self.__current_host_and_port = host_and_port
log.info("Established connection to host %s, port %s" % host_and_port)
break
except socket.error:
self.__socket = None
if type(sys.exc_info()[1]) == types.TupleType:
exc = sys.exc_info()[1][1]
else:
exc = sys.exc_info()[1]
log.warning("Could not connect to host %s, port %s: %s" % (host_and_port[0], host_and_port[1], exc))
if self.__socket is None:
sleep_duration = (min(self.__reconnect_sleep_max,
((self.__reconnect_sleep_initial / (1.0 + self.__reconnect_sleep_increase))
* math.pow(1.0 + self.__reconnect_sleep_increase, sleep_exp)))
* (1.0 + random.random() * self.__reconnect_sleep_jitter))
sleep_end = time.time() + sleep_duration
log.debug("Sleeping for %.1f seconds before attempting reconnect" % sleep_duration)
while self.__running and time.time() < sleep_end:
time.sleep(0.2)
if sleep_duration < self.__reconnect_sleep_max:
sleep_exp += 1
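The backoff formula above, with the random jitter term omitted for determinism, reduces to a small pure function (an illustrative sketch, not part of the module): the delay ramps exponentially from reconnect_sleep_initial by a factor of (1 + reconnect_sleep_increase) per attempt, capped at reconnect_sleep_max.

```python
def backoff(initial, increase, maximum, attempt):
    # attempt starts at 1, matching sleep_exp above, so the first
    # delay equals `initial`; each later attempt grows by (1 + increase).
    return min(maximum, (initial / (1.0 + increase)) * (1.0 + increase) ** attempt)
```

With the constructor defaults (initial=0.1, increase=0.5), the schedule is 0.1s, 0.15s, 0.225s, ... until the 60s cap.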
#
# command line testing
#
if __name__ == '__main__':
# If the readline module is available, make command input easier
try:
import readline
def stomp_completer(text, state):
commands = [ 'subscribe', 'unsubscribe',
'send', 'ack',
'begin', 'abort', 'commit',
'connect', 'disconnect'
]
for command in commands[state:]:
if command.startswith(text):
return "%s " % command
return None
readline.parse_and_bind("tab: complete")
readline.set_completer(stomp_completer)
readline.set_completer_delims("")
except ImportError:
pass # ignore unavailable readline module
class StompTester(object):
def __init__(self, host='localhost', port=61613, user='', passcode=''):
self.c = Connection([(host, port)], user, passcode)
self.c.add_listener(self)
self.c.start()
def __print_async(self, frame_type, headers, body):
print "\r \r",
print frame_type
for header_key in headers.keys():
print '%s: %s' % (header_key, headers[header_key])
print
print body
print '> ',
sys.stdout.flush()
def on_connecting(self, host_and_port):
self.c.connect(wait=True)
def on_disconnected(self):
print "lost connection"
def on_message(self, headers, body):
self.__print_async("MESSAGE", headers, body)
def on_error(self, headers, body):
self.__print_async("ERROR", headers, body)
def on_receipt(self, headers, body):
self.__print_async("RECEIPT", headers, body)
def on_connected(self, headers, body):
self.__print_async("CONNECTED", headers, body)
def ack(self, args):
if len(args) < 3:
self.c.ack(message_id=args[1])
else:
self.c.ack(message_id=args[1], transaction=args[2])
def abort(self, args):
self.c.abort(transaction=args[1])
def begin(self, args):
print 'transaction id: %s' % self.c.begin()
def commit(self, args):
if len(args) < 2:
print 'expecting: commit <transid>'
else:
print 'committing %s' % args[1]
self.c.commit(transaction=args[1])
def disconnect(self, args):
try:
self.c.disconnect()
except NotConnectedException:
pass # ignore if no longer connected
def send(self, args):
if len(args) < 3:
print 'expecting: send <destination> <message>'
else:
self.c.send(destination=args[1], message=' '.join(args[2:]))
def sendtrans(self, args):
if len(args) < 3:
print 'expecting: sendtrans <destination> <transid> <message>'
else:
self.c.send(destination=args[1], message="%s\n" % ' '.join(args[3:]), transaction=args[2])
def subscribe(self, args):
if len(args) < 2:
print 'expecting: subscribe <destination> [ack]'
elif len(args) > 2:
print 'subscribing to "%s" with acknowledge set to "%s"' % (args[1], args[2])
self.c.subscribe(destination=args[1], ack=args[2])
else:
print 'subscribing to "%s" with auto acknowledge' % args[1]
self.c.subscribe(destination=args[1], ack='auto')
def unsubscribe(self, args):
if len(args) < 2:
print 'expecting: unsubscribe <destination>'
else:
print 'unsubscribing from "%s"' % args[1]
self.c.unsubscribe(destination=args[1])
if len(sys.argv) > 5:
print 'USAGE: stomp.py [host] [port] [user] [passcode]'
sys.exit(1)
if len(sys.argv) >= 2:
host = sys.argv[1]
else:
host = "localhost"
if len(sys.argv) >= 3:
port = int(sys.argv[2])
else:
port = 61613
if len(sys.argv) >= 5:
user = sys.argv[3]
passcode = sys.argv[4]
else:
user = None
passcode = None
st = StompTester(host, port, user, passcode)
try:
while True:
line = raw_input("\r> ")
if not line or line.lstrip().rstrip() == '':
continue
elif 'quit' in line or 'disconnect' in line:
break
split = line.split()
command = split[0]
if not command.startswith("on_") and hasattr(st, command):
getattr(st, command)(split)
else:
print 'unrecognized command'
finally:
st.disconnect(None)
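The wire format handled by __send_frame and __parse_frame above can be sketched standalone (illustrative helper names; simplified by omitting the content-length handling): a frame is the command, one "key:value" header per line, a blank line, the body, and a terminating NUL byte.

```python
def build_frame(command, headers, body=''):
    # COMMAND \n header lines \n blank line \n body \x00
    head = ''.join('%s:%s\n' % (k, v) for k, v in headers.items())
    return '%s\n%s\n%s\x00' % (command, head, body)

def parse_frame(frame):
    # Inverse of build_frame: split the preamble from the body at the
    # first blank line, then split header lines on the first ':'.
    frame = frame.rstrip('\x00')
    preamble, _, body = frame.partition('\n\n')
    lines = preamble.split('\n')
    headers = dict(line.split(':', 1) for line in lines[1:] if ':' in line)
    return lines[0], headers, body
```

A round trip through both helpers reproduces the original command, headers, and body.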

awips/test/Record.py Normal file

@ -0,0 +1,31 @@
##
##
#
# Pure python logging mechanism for logging to AlertViz from
# pure python (ie not JEP). DO NOT USE IN PYTHON CALLED
# FROM JAVA.
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 11/03/10 5849 cjeanbap Initial Creation.
#
#
#
import os
import sys
class Record():
def __init__(self, level=0, msg='Test Message'):
self.levelno=level
self.message=msg
self.exc_info=sys.exc_info()
self.exc_text="TEST"
def getMessage(self):
return self.message

awips/test/Test Normal file

@ -0,0 +1,30 @@
##
##
#
# Pure python logging mechanism for logging to AlertViz from
# pure python (ie not JEP). DO NOT USE IN PYTHON CALLED
# FROM JAVA.
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 11/03/10 5849 cjeanbap Initial Creation.
#
#
#
## To execute, type: python Test
import os
import logging
from awips import AlertVizHandler
import Record
avh = AlertVizHandler.AlertVizHandler(host=os.getenv("BROKER_ADDR","localhost"), port=9581, category='LOCAL', source='ANNOUNCER', level=logging.NOTSET)
record = Record.Record(10)
avh.emit(record)

17
awips/test/__init__.py Normal file
@@ -0,0 +1,17 @@
##
##
#
# __init__.py for awips package
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 11/03/10 5489 cjeanbap Initial Creation.
#
#
#

@@ -0,0 +1,19 @@
##
##
#
# __init__.py for awips.test.dafTests package
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 02/09/2016 4795 mapeters Initial creation.
# 04/12/2016 5548 tgurney Cleanup
#
#
#
__all__ = []

@@ -0,0 +1,56 @@
##
##
from awips.dataaccess import DataAccessLayer as DAL
from shapely.geometry import box
import baseDafTestCase
import params
import unittest
#
# Base TestCase for BufrMos* tests.
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 12/07/16 5981 tgurney Parameterize
# 12/15/16 5981 tgurney Add envelope test
#
#
class BufrMosTestCase(baseDafTestCase.DafTestCase):
"""Base class for testing DAF support of bufrmos data"""
data_params = "temperature", "dewpoint"
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
self.runLocationsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.setLocationNames(params.OBS_STATION)
self.runTimesTest(req)
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.setLocationNames(params.OBS_STATION)
req.setParameters(*self.data_params)
self.runGeometryDataTest(req)
def testGetGeometryDataWithEnvelope(self):
req = DAL.newDataRequest(self.datatype)
req.setParameters(*self.data_params)
req.setEnvelope(params.ENVELOPE)
data = self.runGeometryDataTest(req)
for item in data:
self.assertTrue(params.ENVELOPE.contains(item.getGeometry()))

@@ -0,0 +1,214 @@
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
from awips.ThriftClient import ThriftRequestException
import os
import unittest
#
# Base TestCase for DAF tests. This class provides helper methods and
# tests common to all DAF test cases.
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/13/16 5379 tgurney Add identifier values tests
# 04/18/16 5548 tgurney More cleanup, plus new tests
# 04/26/16 5587 tgurney Move identifier values tests
# to subclasses
# 06/01/16 5587 tgurney Add testGet*Identifiers
# 06/07/16 5574 tgurney Make geometry/grid data tests
# return the retrieved data
# 06/10/16 5548 tgurney Make testDatatypeIsSupported
# case-insensitive
# 08/10/16 2416 tgurney Don't test identifier values
# for dataURI
# 10/05/16 5926 dgilling Better checks in runGeometryDataTest.
# 11/08/16 5985 tgurney Do not check data times on
# time-agnostic data
# 03/13/17 5981 tgurney Do not check valid period on
# data time
#
#
class DafTestCase(unittest.TestCase):
sampleDataLimit = 5
"""
Maximum number of levels, locations, times, and geometry/grid data to
display
"""
numTimesToLimit = 3
"""
When limiting geometry/grid data requests with times, only retrieve data
for this many times
"""
datatype = None
"""Name of the datatype"""
@classmethod
def setUpClass(cls):
host = os.environ.get('DAF_TEST_HOST')
if host is None:
host = 'localhost'
DAL.changeEDEXHost(host)
@staticmethod
def getTimesIfSupported(req):
"""Return available times for req. If req refers to a time-agnostic
datatype, return an empty list instead.
"""
times = []
try:
times = DAL.getAvailableTimes(req)
except ThriftRequestException as e:
            if 'TimeAgnosticDataException' not in str(e):
raise
return times
def testDatatypeIsSupported(self):
allSupported = (item.lower() for item in DAL.getSupportedDatatypes())
self.assertIn(self.datatype.lower(), allSupported)
def testGetRequiredIdentifiers(self):
req = DAL.newDataRequest(self.datatype)
required = DAL.getRequiredIdentifiers(req)
self.assertIsNotNone(required)
print("Required identifiers:", required)
def testGetOptionalIdentifiers(self):
req = DAL.newDataRequest(self.datatype)
optional = DAL.getOptionalIdentifiers(req)
self.assertIsNotNone(optional)
print("Optional identifiers:", optional)
def runGetIdValuesTest(self, identifiers):
for id in identifiers:
if id.lower() == 'datauri':
continue
req = DAL.newDataRequest(self.datatype)
idValues = DAL.getIdentifierValues(req, id)
self.assertTrue(hasattr(idValues, '__iter__'))
def runInvalidIdValuesTest(self):
badString = 'id from ' + self.datatype + '; select 1;'
with self.assertRaises(ThriftRequestException) as cm:
req = DAL.newDataRequest(self.datatype)
idValues = DAL.getIdentifierValues(req, badString)
def runNonexistentIdValuesTest(self):
with self.assertRaises(ThriftRequestException) as cm:
req = DAL.newDataRequest(self.datatype)
idValues = DAL.getIdentifierValues(req, 'idthatdoesnotexist')
def runParametersTest(self, req):
params = DAL.getAvailableParameters(req)
self.assertIsNotNone(params)
print(params)
def runLevelsTest(self, req):
levels = DAL.getAvailableLevels(req)
self.assertIsNotNone(levels)
print("Number of levels: " + str(len(levels)))
strLevels = [str(t) for t in levels[:self.sampleDataLimit]]
print("Sample levels:\n" + str(strLevels))
def runLocationsTest(self, req):
locs = DAL.getAvailableLocationNames(req)
self.assertIsNotNone(locs)
print("Number of location names: " + str(len(locs)))
print("Sample location names:\n" + str(locs[:self.sampleDataLimit]))
def runTimesTest(self, req):
times = DAL.getAvailableTimes(req)
self.assertIsNotNone(times)
print("Number of times: " + str(len(times)))
strTimes = [str(t) for t in times[:self.sampleDataLimit]]
print("Sample times:\n" + str(strTimes))
def runTimeAgnosticTest(self, req):
with self.assertRaises(ThriftRequestException) as cm:
times = DAL.getAvailableTimes(req)
self.assertIn('TimeAgnosticDataException', str(cm.exception))
def runGeometryDataTest(self, req, checkDataTimes=True):
"""
Test that we are able to successfully retrieve geometry data for the
given request.
"""
times = DafTestCase.getTimesIfSupported(req)
geomData = DAL.getGeometryData(req, times[:self.numTimesToLimit])
self.assertIsNotNone(geomData)
if times:
self.assertNotEqual(len(geomData), 0)
if not geomData:
raise unittest.SkipTest("No data available")
print("Number of geometry records: " + str(len(geomData)))
print("Sample geometry data:")
for record in geomData[:self.sampleDataLimit]:
if (checkDataTimes and times and
"PERIOD_USED" not in record.getDataTime().getUtilityFlags()):
self.assertIn(record.getDataTime(), times[:self.numTimesToLimit])
print("geometry=" + str(record.getGeometry()), end="")
for p in req.getParameters():
print(" " + p + "=" + record.getString(p), end="")
print()
return geomData
def runGeometryDataTestWithTimeRange(self, req, timeRange):
"""
Test that we are able to successfully retrieve geometry data for the
given request.
"""
geomData = DAL.getGeometryData(req, timeRange)
self.assertIsNotNone(geomData)
if not geomData:
raise unittest.SkipTest("No data available")
print("Number of geometry records: " + str(len(geomData)))
print("Sample geometry data:")
for record in geomData[:self.sampleDataLimit]:
self.assertGreaterEqual(record.getDataTime().getRefTime().getTime(), timeRange.getStartInMillis())
self.assertLessEqual(record.getDataTime().getRefTime().getTime(), timeRange.getEndInMillis())
print("geometry=" + str(record.getGeometry()), end="")
for p in req.getParameters():
print(" " + p + "=" + record.getString(p), end="")
print()
return geomData
def runGridDataTest(self, req, testSameShape=True):
"""
Test that we are able to successfully retrieve grid data for the given
request.
Args:
testSameShape: whether or not to verify that all the retrieved data
have the same shape (most data don't change shape)
"""
times = DafTestCase.getTimesIfSupported(req)
gridData = DAL.getGridData(req, times[:self.numTimesToLimit])
self.assertIsNotNone(gridData)
if not gridData:
raise unittest.SkipTest("No data available")
print("Number of grid records: " + str(len(gridData)))
if len(gridData) > 0:
print("Sample grid data shape:\n" + str(gridData[0].getRawData().shape) + "\n")
print("Sample grid data:\n" + str(gridData[0].getRawData()) + "\n")
print("Sample lat-lon data:\n" + str(gridData[0].getLatLonCoords()) + "\n")
if testSameShape:
correctGridShape = gridData[0].getLatLonCoords()[0].shape
for record in gridData:
rawData = record.getRawData()
self.assertIsNotNone(rawData)
self.assertEqual(rawData.shape, correctGridShape)
return gridData

@@ -0,0 +1,177 @@
##
##
from __future__ import print_function
from shapely.geometry import box
from awips.dataaccess import DataAccessLayer as DAL
from awips.ThriftClient import ThriftRequestException
import baseDafTestCase
import params
import unittest
#
# Tests common to all radar factories
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 04/26/16 5587 tgurney Move identifier values tests
# out of base class
# 06/01/16 5587 tgurney Update testGetIdentifierValues
# 06/08/16 5574 mapeters Add advanced query tests
# 06/13/16 5574 tgurney Fix checks for None
# 06/14/16 5548 tgurney Undo previous change (broke
# test)
# 06/30/16 5725 tgurney Add test for NOT IN
# 08/25/16 2671 tgurney Rename to baseRadarTestCase
# and move factory-specific
# tests
# 12/07/16 5981 tgurney Parameterize
#
#
class BaseRadarTestCase(baseDafTestCase.DafTestCase):
"""Tests common to all radar factories"""
# datatype is specified by subclass
datatype = None
radarLoc = params.RADAR.lower()
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
self.runLocationsTest(req)
def testGetAvailableLevels(self):
req = DAL.newDataRequest(self.datatype)
self.runLevelsTest(req)
def testGetAvailableLevelsWithInvalidLevelIdentifierThrowsException(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('level.one.field', 'invalidLevelField')
with self.assertRaises(ThriftRequestException) as cm:
self.runLevelsTest(req)
self.assertIn('IncompatibleRequestException', str(cm.exception))
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.setEnvelope(params.ENVELOPE)
self.runTimesTest(req)
def testGetIdentifierValues(self):
req = DAL.newDataRequest(self.datatype)
optionalIds = set(DAL.getOptionalIdentifiers(req))
requiredIds = set(DAL.getRequiredIdentifiers(req))
self.runGetIdValuesTest(optionalIds | requiredIds)
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def runConstraintTest(self, key, operator, value):
raise NotImplementedError
def testGetDataWithEqualsString(self):
gridData = self.runConstraintTest('icao', '=', self.radarLoc)
for record in gridData:
self.assertEqual(record.getAttribute('icao'), self.radarLoc)
def testGetDataWithEqualsUnicode(self):
gridData = self.runConstraintTest('icao', '=', unicode(self.radarLoc))
for record in gridData:
self.assertEqual(record.getAttribute('icao'), self.radarLoc)
def testGetDataWithEqualsInt(self):
gridData = self.runConstraintTest('icao', '=', 1000)
for record in gridData:
self.assertEqual(record.getAttribute('icao'), 1000)
def testGetDataWithEqualsLong(self):
gridData = self.runConstraintTest('icao', '=', 1000L)
for record in gridData:
self.assertEqual(record.getAttribute('icao'), 1000)
def testGetDataWithEqualsFloat(self):
gridData = self.runConstraintTest('icao', '=', 1.0)
for record in gridData:
self.assertEqual(round(record.getAttribute('icao'), 1), 1.0)
def testGetDataWithEqualsNone(self):
gridData = self.runConstraintTest('icao', '=', None)
for record in gridData:
self.assertIsNone(record.getAttribute('icao'))
def testGetDataWithNotEquals(self):
gridData = self.runConstraintTest('icao', '!=', self.radarLoc)
for record in gridData:
self.assertNotEqual(record.getAttribute('icao'), self.radarLoc)
def testGetDataWithNotEqualsNone(self):
gridData = self.runConstraintTest('icao', '!=', None)
for record in gridData:
self.assertIsNotNone(record.getAttribute('icao'))
def testGetDataWithGreaterThan(self):
gridData = self.runConstraintTest('icao', '>', self.radarLoc)
for record in gridData:
self.assertGreater(record.getAttribute('icao'), self.radarLoc)
def testGetDataWithLessThan(self):
gridData = self.runConstraintTest('icao', '<', self.radarLoc)
for record in gridData:
self.assertLess(record.getAttribute('icao'), self.radarLoc)
def testGetDataWithGreaterThanEquals(self):
gridData = self.runConstraintTest('icao', '>=', self.radarLoc)
for record in gridData:
self.assertGreaterEqual(record.getAttribute('icao'), self.radarLoc)
def testGetDataWithLessThanEquals(self):
gridData = self.runConstraintTest('icao', '<=', self.radarLoc)
for record in gridData:
self.assertLessEqual(record.getAttribute('icao'), self.radarLoc)
def testGetDataWithInTuple(self):
gridData = self.runConstraintTest('icao', 'in', (self.radarLoc, 'tpbi'))
for record in gridData:
self.assertIn(record.getAttribute('icao'), (self.radarLoc, 'tpbi'))
def testGetDataWithInList(self):
gridData = self.runConstraintTest('icao', 'in', [self.radarLoc, 'tpbi'])
for record in gridData:
self.assertIn(record.getAttribute('icao'), (self.radarLoc, 'tpbi'))
def testGetDataWithInGenerator(self):
generator = (item for item in (self.radarLoc, 'tpbi'))
gridData = self.runConstraintTest('icao', 'in', generator)
for record in gridData:
self.assertIn(record.getAttribute('icao'), (self.radarLoc, 'tpbi'))
def testGetDataWithNotInList(self):
gridData = self.runConstraintTest('icao', 'not in', ['zzzz', self.radarLoc])
for record in gridData:
self.assertNotIn(record.getAttribute('icao'), ('zzzz', self.radarLoc))
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self.runConstraintTest('icao', 'junk', self.radarLoc)
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self.runConstraintTest('icao', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self.runConstraintTest('icao', 'in', [])

@@ -0,0 +1,26 @@
##
##
#
# Site-specific parameters for DAF tests
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 12/07/16 5981 tgurney Initial creation
# 12/15/16 5981 tgurney Add ENVELOPE
#
#
from shapely.geometry import box
AIRPORT = 'OMA'
OBS_STATION = 'KOMA'
SITE_ID = 'OAX'
STATION_ID = '72558'
RADAR = 'KOAX'
SAMPLE_AREA = (-97.0, 41.0, -96.0, 42.0)
ENVELOPE = box(*SAMPLE_AREA)
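ENVELOPE is a shapely box built from SAMPLE_AREA's `(minx, miny, maxx, maxy)` corners, and the envelope tests assert that every returned geometry lies inside it. A shapely-free sketch of the same containment check for point geometries (`point_in_box` is a hypothetical helper, not part of the test suite):

```python
SAMPLE_AREA = (-97.0, 41.0, -96.0, 42.0)  # (minx, miny, maxx, maxy)


def point_in_box(lon, lat, area=SAMPLE_AREA):
    # Equivalent to shapely's box(*area).contains(Point(lon, lat))
    # for points: strict inequalities, so boundary points are excluded.
    minx, miny, maxx, maxy = area
    return minx < lon < maxx and miny < lat < maxy
```

Note that `contains` in shapely excludes the boundary, which this sketch mirrors with strict inequalities; a record sitting exactly on the envelope edge would fail the test.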

@@ -0,0 +1,44 @@
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
import baseDafTestCase
import unittest
#
# Test DAF support for ACARS data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
#
#
class AcarsTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for ACARS data"""
datatype = "acars"
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
self.runLocationsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
self.runTimesTest(req)
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.setParameters("flightLevel", "tailNumber")
self.runGeometryDataTest(req)

@@ -0,0 +1,155 @@
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
import baseDafTestCase
import unittest
#
# Test DAF support for airep data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 06/09/16 5587 bsteffen Add getIdentifierValues tests
# 06/13/16 5574 tgurney Add advanced query tests
# 06/30/16 5725 tgurney Add test for NOT IN
#
#
class AirepTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for airep data"""
datatype = "airep"
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
self.runLocationsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
self.runTimesTest(req)
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.setParameters("flightLevel", "reportType")
self.runGeometryDataTest(req)
def testGetIdentifierValues(self):
req = DAL.newDataRequest(self.datatype)
optionalIds = set(DAL.getOptionalIdentifiers(req))
self.runGetIdValuesTest(optionalIds)
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.setParameters("flightLevel", "reportType")
req.addIdentifier(key, constraint)
return self.runGeometryDataTest(req)
def testGetDataWithEqualsString(self):
geometryData = self._runConstraintTest('reportType', '=', 'AIREP')
for record in geometryData:
self.assertEqual(record.getString('reportType'), 'AIREP')
def testGetDataWithEqualsUnicode(self):
geometryData = self._runConstraintTest('reportType', '=', u'AIREP')
for record in geometryData:
self.assertEqual(record.getString('reportType'), 'AIREP')
# No numeric tests since no numeric identifiers are available.
def testGetDataWithEqualsNone(self):
geometryData = self._runConstraintTest('reportType', '=', None)
for record in geometryData:
self.assertEqual(record.getType('reportType'), 'NULL')
def testGetDataWithNotEquals(self):
geometryData = self._runConstraintTest('reportType', '!=', 'AIREP')
for record in geometryData:
self.assertNotEqual(record.getString('reportType'), 'AIREP')
def testGetDataWithNotEqualsNone(self):
geometryData = self._runConstraintTest('reportType', '!=', None)
for record in geometryData:
self.assertNotEqual(record.getType('reportType'), 'NULL')
def testGetDataWithGreaterThan(self):
geometryData = self._runConstraintTest('reportType', '>', 'AIREP')
for record in geometryData:
self.assertGreater(record.getString('reportType'), 'AIREP')
def testGetDataWithLessThan(self):
geometryData = self._runConstraintTest('reportType', '<', 'AIREP')
for record in geometryData:
self.assertLess(record.getString('reportType'), 'AIREP')
def testGetDataWithGreaterThanEquals(self):
geometryData = self._runConstraintTest('reportType', '>=', 'AIREP')
for record in geometryData:
self.assertGreaterEqual(record.getString('reportType'), 'AIREP')
def testGetDataWithLessThanEquals(self):
geometryData = self._runConstraintTest('reportType', '<=', 'AIREP')
for record in geometryData:
self.assertLessEqual(record.getString('reportType'), 'AIREP')
def testGetDataWithInTuple(self):
collection = ('AIREP', 'AMDAR')
geometryData = self._runConstraintTest('reportType', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('reportType'), collection)
def testGetDataWithInList(self):
collection = ['AIREP', 'AMDAR']
geometryData = self._runConstraintTest('reportType', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('reportType'), collection)
def testGetDataWithInGenerator(self):
collection = ('AIREP', 'AMDAR')
generator = (item for item in collection)
geometryData = self._runConstraintTest('reportType', 'in', generator)
for record in geometryData:
self.assertIn(record.getString('reportType'), collection)
def testGetDataWithNotInList(self):
collection = ['AMDAR']
geometryData = self._runConstraintTest('reportType', 'not in', collection)
for record in geometryData:
self.assertNotIn(record.getString('reportType'), collection)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('reportType', 'junk', 'AIREP')
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('reportType', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('reportType', 'in', [])
def testGetDataWithNestedInConstraintThrowsException(self):
collection = ('AIREP', 'AMDAR', ())
with self.assertRaises(TypeError):
self._runConstraintTest('reportType', 'in', collection)

@@ -0,0 +1,181 @@
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
from awips.ThriftClient import ThriftRequestException
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
import baseDafTestCase
import unittest
#
# Test DAF support for binlightning data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 04/21/16 5551 tgurney Add tests to verify #5551
# 04/25/16 5587 tgurney Enable skipped test added in
# #5551
# 04/26/16 5587 tgurney Move identifier values tests
# out of base class
# 06/01/16 5587 tgurney Update testGetIdentifierValues
# 06/03/16 5574 tgurney Add advanced query tests
# 06/13/16 5574 tgurney Typo
# 06/30/16 5725 tgurney Add test for NOT IN
# 11/08/16 5985 tgurney Do not check data times
#
#
class BinLightningTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for binlightning data"""
datatype = "binlightning"
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier("source", "NLDN")
self.runTimesTest(req)
def testGetGeometryDataSingleSourceSingleParameter(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier("source", "NLDN")
req.setParameters('intensity')
self.runGeometryDataTest(req, checkDataTimes=False)
def testGetGeometryDataInvalidParamRaisesIncompatibleRequestException(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier("source", "NLDN")
req.setParameters('blahblahblah')
with self.assertRaises(ThriftRequestException) as cm:
self.runGeometryDataTest(req)
self.assertIn('IncompatibleRequestException', str(cm.exception))
def testGetGeometryDataSingleSourceAllParameters(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier("source", "NLDN")
req.setParameters(*DAL.getAvailableParameters(req))
self.runGeometryDataTest(req, checkDataTimes=False)
def testGetIdentifierValues(self):
req = DAL.newDataRequest(self.datatype)
optionalIds = set(DAL.getOptionalIdentifiers(req))
requiredIds = set(DAL.getRequiredIdentifiers(req))
self.runGetIdValuesTest(optionalIds | requiredIds)
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
req.setParameters('intensity')
return self.runGeometryDataTest(req, checkDataTimes=False)
def testGetDataWithEqualsString(self):
geomData = self._runConstraintTest('source', '=', 'NLDN')
for record in geomData:
self.assertEqual(record.getAttribute('source'), 'NLDN')
def testGetDataWithEqualsUnicode(self):
geomData = self._runConstraintTest('source', '=', u'NLDN')
for record in geomData:
self.assertEqual(record.getAttribute('source'), 'NLDN')
def testGetDataWithEqualsInt(self):
geomData = self._runConstraintTest('source', '=', 1000)
for record in geomData:
self.assertEqual(record.getAttribute('source'), 1000)
def testGetDataWithEqualsLong(self):
geomData = self._runConstraintTest('source', '=', 1000L)
for record in geomData:
self.assertEqual(record.getAttribute('source'), 1000)
def testGetDataWithEqualsFloat(self):
geomData = self._runConstraintTest('source', '=', 1.0)
for record in geomData:
self.assertEqual(round(record.getAttribute('source'), 1), 1.0)
def testGetDataWithEqualsNone(self):
geomData = self._runConstraintTest('source', '=', None)
for record in geomData:
self.assertIsNone(record.getAttribute('source'))
def testGetDataWithNotEquals(self):
geomData = self._runConstraintTest('source', '!=', 'NLDN')
for record in geomData:
self.assertNotEqual(record.getAttribute('source'), 'NLDN')
def testGetDataWithNotEqualsNone(self):
geomData = self._runConstraintTest('source', '!=', None)
for record in geomData:
self.assertIsNotNone(record.getAttribute('source'))
def testGetDataWithGreaterThan(self):
geomData = self._runConstraintTest('source', '>', 'NLDN')
for record in geomData:
self.assertGreater(record.getAttribute('source'), 'NLDN')
def testGetDataWithLessThan(self):
geomData = self._runConstraintTest('source', '<', 'NLDN')
for record in geomData:
self.assertLess(record.getAttribute('source'), 'NLDN')
def testGetDataWithGreaterThanEquals(self):
geomData = self._runConstraintTest('source', '>=', 'NLDN')
for record in geomData:
self.assertGreaterEqual(record.getAttribute('source'), 'NLDN')
def testGetDataWithLessThanEquals(self):
geomData = self._runConstraintTest('source', '<=', 'NLDN')
for record in geomData:
self.assertLessEqual(record.getAttribute('source'), 'NLDN')
def testGetDataWithInTuple(self):
geomData = self._runConstraintTest('source', 'in', ('NLDN', 'ENTLN'))
for record in geomData:
self.assertIn(record.getAttribute('source'), ('NLDN', 'ENTLN'))
def testGetDataWithInList(self):
geomData = self._runConstraintTest('source', 'in', ['NLDN', 'ENTLN'])
for record in geomData:
self.assertIn(record.getAttribute('source'), ('NLDN', 'ENTLN'))
def testGetDataWithInGenerator(self):
generator = (item for item in ('NLDN', 'ENTLN'))
geomData = self._runConstraintTest('source', 'in', generator)
for record in geomData:
self.assertIn(record.getAttribute('source'), ('NLDN', 'ENTLN'))
def testGetDataWithNotInList(self):
geomData = self._runConstraintTest('source', 'not in', ['NLDN', 'blah'])
for record in geomData:
self.assertNotIn(record.getAttribute('source'), ('NLDN', 'blah'))
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('source', 'junk', 'NLDN')
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('source', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('source', 'in', [])

@@ -0,0 +1,28 @@
##
##
from __future__ import print_function
import baseBufrMosTestCase
import unittest
#
# Test DAF support for bufrmosAVN data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
#
#
class BufrMosAvnTestCase(baseBufrMosTestCase.BufrMosTestCase):
"""Test DAF support for bufrmosAVN data"""
datatype = "bufrmosAVN"
# All tests inherited from superclass

@@ -0,0 +1,28 @@
##
##
from __future__ import print_function
import baseBufrMosTestCase
import unittest
#
# Test DAF support for bufrmosETA data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
#
#
class BufrMosEtaTestCase(baseBufrMosTestCase.BufrMosTestCase):
"""Test DAF support for bufrmosETA data"""
datatype = "bufrmosETA"
# All tests inherited from superclass

@@ -0,0 +1,28 @@
##
##
from __future__ import print_function
import baseBufrMosTestCase
import unittest
#
# Test DAF support for bufrmosGFS data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
#
#
class BufrMosGfsTestCase(baseBufrMosTestCase.BufrMosTestCase):
"""Test DAF support for bufrmosGFS data"""
datatype = "bufrmosGFS"
# All tests inherited from superclass

@@ -0,0 +1,33 @@
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
import baseBufrMosTestCase
import params
import unittest
#
# Test DAF support for bufrmosHPC data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 12/07/16 5981 tgurney Parameterize
# 12/20/16 5981 tgurney Inherit all tests
#
#
class BufrMosHpcTestCase(baseBufrMosTestCase.BufrMosTestCase):
"""Test DAF support for bufrmosHPC data"""
datatype = "bufrmosHPC"
data_params = "forecastHr", "maxTemp24Hour"
# All tests inherited from superclass

@@ -0,0 +1,28 @@
##
##
from __future__ import print_function
import baseBufrMosTestCase
import unittest
#
# Test DAF support for bufrmosLAMP data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
#
#
class BufrMosLampTestCase(baseBufrMosTestCase.BufrMosTestCase):
"""Test DAF support for bufrmosLAMP data"""
datatype = "bufrmosLAMP"
# All tests inherited from superclass

@@ -0,0 +1,33 @@
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
import baseBufrMosTestCase
import params
import unittest
#
# Test DAF support for bufrmosMRF data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 12/07/16 5981 tgurney Parameterize
# 12/20/16 5981 tgurney Inherit all tests
#
#
class BufrMosMrfTestCase(baseBufrMosTestCase.BufrMosTestCase):
"""Test DAF support for bufrmosMRF data"""
datatype = "bufrmosMRF"
data_params = "forecastHr", "maxTempDay"
# All tests inherited from superclass

@@ -0,0 +1,204 @@
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
import baseDafTestCase
import params
import unittest
#
# Test DAF support for bufrua data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 06/09/16 5587 bsteffen Add getIdentifierValues tests
# 06/13/16 5574 tgurney Add advanced query tests
# 06/30/16 5725 tgurney Add test for NOT IN
# 12/07/16 5981 tgurney Parameterize
# 12/15/16 5981 tgurney Add envelope test
#
#
class BufrUaTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for bufrua data"""
datatype = "bufrua"
location = params.STATION_ID
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier("reportType", "2020")
self.runLocationsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.setLocationNames(self.location)
req.addIdentifier("reportType", "2020")
self.runTimesTest(req)
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.setLocationNames(self.location)
req.addIdentifier("reportType", "2020")
req.setParameters("sfcPressure", "staName", "rptType", "tdMan")
print("Testing getGeometryData()")
geomData = DAL.getGeometryData(req)
self.assertIsNotNone(geomData)
print("Number of geometry records: " + str(len(geomData)))
print("Sample geometry data:")
for record in geomData[:self.sampleDataLimit]:
print("level=", record.getLevel(), end="")
# One dimensional parameters are reported on the 0.0UNKNOWN level.
# 2D parameters are reported on MB levels from pressure.
if record.getLevel() == "0.0UNKNOWN":
print(" sfcPressure=" + record.getString("sfcPressure") + record.getUnit("sfcPressure"), end="")
print(" staName=" + record.getString("staName"), end="")
print(" rptType=" + record.getString("rptType") + record.getUnit("rptType"), end="")
else:
print(" tdMan=" + str(record.getNumber("tdMan")) + record.getUnit("tdMan"), end="")
print(" geometry=", record.getGeometry())
print("getGeometryData() complete\n\n")
def testGetGeometryDataWithEnvelope(self):
req = DAL.newDataRequest(self.datatype)
req.setParameters("staName", "rptType")
req.setEnvelope(params.ENVELOPE)
data = self.runGeometryDataTest(req)
for item in data:
self.assertTrue(params.ENVELOPE.contains(item.getGeometry()))
def testGetIdentifierValues(self):
req = DAL.newDataRequest(self.datatype)
optionalIds = set(DAL.getOptionalIdentifiers(req))
self.runGetIdValuesTest(optionalIds)
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
# As an identifier it is "reportType" but as a parameter it is
# "rptType"... this is weird...
req.setParameters("staName", "rptType")
return self.runGeometryDataTest(req)
def testGetDataWithEqualsString(self):
geometryData = self._runConstraintTest('reportType', '=', '2022')
for record in geometryData:
self.assertEqual(record.getString('rptType'), '2022')
def testGetDataWithEqualsUnicode(self):
geometryData = self._runConstraintTest('reportType', '=', u'2022')
for record in geometryData:
self.assertEqual(record.getString('rptType'), '2022')
def testGetDataWithEqualsInt(self):
geometryData = self._runConstraintTest('reportType', '=', 2022)
for record in geometryData:
self.assertEqual(record.getString('rptType'), '2022')
def testGetDataWithEqualsLong(self):
geometryData = self._runConstraintTest('reportType', '=', 2022L)
for record in geometryData:
self.assertEqual(record.getString('rptType'), '2022')
# No float test because no float identifiers are available
def testGetDataWithEqualsNone(self):
geometryData = self._runConstraintTest('reportType', '=', None)
for record in geometryData:
self.assertEqual(record.getType('rptType'), 'NULL')
def testGetDataWithNotEquals(self):
geometryData = self._runConstraintTest('reportType', '!=', 2022)
for record in geometryData:
self.assertNotEqual(record.getString('rptType'), '2022')
def testGetDataWithNotEqualsNone(self):
geometryData = self._runConstraintTest('reportType', '!=', None)
for record in geometryData:
self.assertNotEqual(record.getType('rptType'), 'NULL')
def testGetDataWithGreaterThan(self):
geometryData = self._runConstraintTest('reportType', '>', 2022)
for record in geometryData:
self.assertGreater(record.getString('rptType'), '2022')
def testGetDataWithLessThan(self):
geometryData = self._runConstraintTest('reportType', '<', 2022)
for record in geometryData:
self.assertLess(record.getString('rptType'), '2022')
def testGetDataWithGreaterThanEquals(self):
geometryData = self._runConstraintTest('reportType', '>=', 2022)
for record in geometryData:
self.assertGreaterEqual(record.getString('rptType'), '2022')
def testGetDataWithLessThanEquals(self):
geometryData = self._runConstraintTest('reportType', '<=', 2022)
for record in geometryData:
self.assertLessEqual(record.getString('rptType'), '2022')
def testGetDataWithInTuple(self):
collection = ('2022', '2032')
geometryData = self._runConstraintTest('reportType', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('rptType'), collection)
def testGetDataWithInList(self):
collection = ['2022', '2032']
geometryData = self._runConstraintTest('reportType', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('rptType'), collection)
def testGetDataWithInGenerator(self):
collection = ('2022', '2032')
generator = (item for item in collection)
geometryData = self._runConstraintTest('reportType', 'in', generator)
for record in geometryData:
self.assertIn(record.getString('rptType'), collection)
def testGetDataWithNotInList(self):
collection = ('2022', '2032')
geometryData = self._runConstraintTest('reportType', 'not in', collection)
for record in geometryData:
self.assertNotIn(record.getString('rptType'), collection)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('reportType', 'junk', '2022')
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('reportType', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('rptType', 'in', [])
def testGetDataWithNestedInConstraintThrowsException(self):
collection = ('2022', '2032', ())
with self.assertRaises(TypeError):
self._runConstraintTest('rptType', 'in', collection)

##
##
from __future__ import print_function
import datetime
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
from dynamicserialize.dstypes.com.raytheon.uf.common.time import TimeRange
from awips.dataaccess import DataAccessLayer as DAL
from awips.ThriftClient import ThriftRequestException
import baseDafTestCase
import params
import unittest
#
# Test DAF support for climate data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 04/26/16 5587 tgurney Add identifier values tests
# 06/09/16 5574 mapeters Add advanced query tests, Short parameter test
# 06/13/16 5574 tgurney Fix checks for None
# 06/21/16 5548 tgurney Skip tests that cause errors
# 06/30/16 5725 tgurney Add test for NOT IN
# 10/06/16 5926 dgilling Add additional time and location tests.
# 12/07/16 5981 tgurney Parameterize
# 12/20/16 5981 tgurney Add envelope test
# 08/16/17 6388 tgurney Test for duplicate data
#
#
class ClimateTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for climate data"""
datatype = 'climate'
obsStation = params.OBS_STATION
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_monthly')
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_monthly')
self.runLocationsTest(req)
def testGetAvailableLocationsForRptTable(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.rpt')
self.runLocationsTest(req)
def testGetAvailableLocationsForStationId(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.day_climate_norm')
self.runLocationsTest(req)
def testGetAvailableLocationsForInformId(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_mon_season_yr')
self.runLocationsTest(req)
def testGetAvailableLocationsWithConstraints(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_monthly')
req.addIdentifier('maxtemp_mon', RequestConstraint.new('>', 95))
self.runLocationsTest(req)
def testGetAvailableLocationsWithInvalidTable(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.boolean_values')
with self.assertRaises(ThriftRequestException) as cm:
DAL.getAvailableLocationNames(req)
self.assertIn('IncompatibleRequestException', str(cm.exception))
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_monthly')
req.setParameters('maxtemp_mon', 'min_sea_press')
self.runTimesTest(req)
def testGetAvailableTimesWithLocationNamesForYearMonth(self):
"""
Test retrieval of times for a climo table that uses year and
month columns to build DataTimes.
"""
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_monthly')
req.setLocationNames(self.obsStation, 'KABR', 'KDMO')
req.setParameters('maxtemp_mon', 'min_sea_press')
self.runTimesTest(req)
def testGetAvailableTimesWithLocationNamesForYearDayOfYear(self):
"""
Test retrieval of times for a climo table that uses year and
day_of_year columns to build DataTimes.
"""
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_daily')
req.setLocationNames(self.obsStation, 'KABR', 'KDMO')
req.setParameters('maxtemp_cal', 'min_press')
self.runTimesTest(req)
def testGetAvailableTimesWithLocationNamesForPeriod(self):
"""
Test retrieval of times for a climo table that uses
period_start and period_end columns to build DataTimes.
"""
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_mon_season_yr')
req.setLocationNames(self.obsStation, 'KABR', 'KDMO')
req.setParameters('max_temp', 'precip_total')
self.runTimesTest(req)
def testGetAvailableTimesWithLocationNamesForDate(self):
"""
Test retrieval of times for a climo table that uses a date
column to build DataTimes.
"""
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.daily_climate')
req.setLocationNames(self.obsStation, 'KABR', 'KDMO')
req.setParameters('max_temp', 'precip', 'avg_wind_speed')
self.runTimesTest(req)
def testGetAvailableTimesWithConstraint(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_monthly')
req.addIdentifier('maxtemp_mon', RequestConstraint.new('<', 75))
req.setParameters('maxtemp_mon', 'min_sea_press')
self.runTimesTest(req)
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_monthly')
req.setLocationNames('KFNB')
req.setParameters('maxtemp_mon', 'min_sea_press')
self.runGeometryDataTest(req)
def testGetGeometryDataWithEnvelopeThrowsException(self):
# Envelope is not used
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_monthly')
req.setParameters('maxtemp_mon', 'min_sea_press')
req.setEnvelope(params.ENVELOPE)
with self.assertRaises(Exception):
data = self.runGeometryDataTest(req)
def testGetGeometryDataForYearAndDayOfYearTable(self):
"""
Test retrieval of data for a climo table that uses year and
day_of_year columns to build DataTimes.
"""
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_daily')
req.setLocationNames('KFNB')
req.setParameters('maxtemp_cal', 'min_press')
self.runGeometryDataTest(req)
def testGetGeometryDataForPeriodTable(self):
"""
Test retrieval of data for a climo table that uses period_start and
period_end columns to build DataTimes.
"""
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_mon_season_yr')
req.setLocationNames('KFNB')
req.setParameters('max_temp', 'precip_total')
self.runGeometryDataTest(req)
def testGetGeometryDataForDateTable(self):
"""
Test retrieval of data for a climo table that uses a date column to
build DataTimes.
"""
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.daily_climate')
req.setLocationNames('KFNB')
req.setParameters('max_temp', 'precip', 'avg_wind_speed')
self.runGeometryDataTest(req)
def testGetGeometryDataWithShortParameter(self):
"""
Test that a parameter that is stored in Java as a Short is correctly
retrieved as a number.
"""
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'cli_asos_monthly')
req.setParameters('month')
geometryData = self.runGeometryDataTest(req)
for record in geometryData:
self.assertIsNotNone(record.getNumber('month'))
def testGetTableIdentifierValues(self):
self.runGetIdValuesTest(['table'])
def testGetColumnIdValuesWithTable(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_monthly')
idValues = DAL.getIdentifierValues(req, 'year')
self.assertTrue(hasattr(idValues, '__iter__'))
def testGetColumnIdValuesWithoutTableThrowsException(self):
req = DAL.newDataRequest(self.datatype)
with self.assertRaises(ThriftRequestException):
idValues = DAL.getIdentifierValues(req, 'year')
@unittest.skip('avoid EDEX error')
def testGetColumnIdValuesWithNonexistentTableThrowsException(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'nonexistentjunk')
with self.assertRaises(ThriftRequestException):
idValues = DAL.getIdentifierValues(req, 'year')
@unittest.skip('avoid EDEX error')
def testGetNonexistentColumnIdValuesThrowsException(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_monthly')
with self.assertRaises(ThriftRequestException):
idValues = DAL.getIdentifierValues(req, 'nonexistentjunk')
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'cli_asos_monthly')
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
req.setParameters('station_code', 'avg_daily_max')
return self.runGeometryDataTest(req)
def testGetDataWithEqualsString(self):
geometryData = self._runConstraintTest('station_code', '=', self.obsStation)
for record in geometryData:
self.assertEqual(record.getString('station_code'), self.obsStation)
def testGetDataWithEqualsUnicode(self):
geometryData = self._runConstraintTest('station_code', '=', unicode(self.obsStation))
for record in geometryData:
self.assertEqual(record.getString('station_code'), self.obsStation)
def testGetDataWithEqualsInt(self):
geometryData = self._runConstraintTest('avg_daily_max', '=', 70)
for record in geometryData:
self.assertEqual(record.getNumber('avg_daily_max'), 70)
def testGetDataWithEqualsLong(self):
geometryData = self._runConstraintTest('avg_daily_max', '=', 70L)
for record in geometryData:
self.assertEqual(record.getNumber('avg_daily_max'), 70)
def testGetDataWithEqualsFloat(self):
geometryData = self._runConstraintTest('avg_daily_max', '=', 69.2)
for record in geometryData:
self.assertEqual(round(record.getNumber('avg_daily_max'), 1), 69.2)
def testGetDataWithEqualsNone(self):
geometryData = self._runConstraintTest('station_code', '=', None)
self.assertEqual(len(geometryData), 0)
def testGetDataWithNotEquals(self):
geometryData = self._runConstraintTest('station_code', '!=', self.obsStation)
for record in geometryData:
self.assertNotEqual(record.getString('station_code'), self.obsStation)
def testGetDataWithNotEqualsNone(self):
geometryData = self._runConstraintTest('station_code', '!=', None)
for record in geometryData:
self.assertNotEqual(record.getType('station_code'), 'NULL')
def testGetDataWithGreaterThan(self):
geometryData = self._runConstraintTest('avg_daily_max', '>', 70)
for record in geometryData:
self.assertGreater(record.getNumber('avg_daily_max'), 70)
def testGetDataWithLessThan(self):
geometryData = self._runConstraintTest('avg_daily_max', '<', 70)
for record in geometryData:
self.assertLess(record.getNumber('avg_daily_max'), 70)
def testGetDataWithGreaterThanEquals(self):
geometryData = self._runConstraintTest('avg_daily_max', '>=', 70)
for record in geometryData:
self.assertGreaterEqual(record.getNumber('avg_daily_max'), 70)
def testGetDataWithLessThanEquals(self):
geometryData = self._runConstraintTest('avg_daily_max', '<=', 70)
for record in geometryData:
self.assertLessEqual(record.getNumber('avg_daily_max'), 70)
def testGetDataWithInTuple(self):
collection = (self.obsStation, 'KABR')
geometryData = self._runConstraintTest('station_code', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('station_code'), collection)
def testGetDataWithInList(self):
collection = [self.obsStation, 'KABR']
geometryData = self._runConstraintTest('station_code', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('station_code'), collection)
def testGetDataWithInGenerator(self):
collection = (self.obsStation, 'KABR')
generator = (item for item in collection)
geometryData = self._runConstraintTest('station_code', 'in', generator)
for record in geometryData:
self.assertIn(record.getString('station_code'), collection)
def testGetDataWithNotInList(self):
collection = ['KORD', 'KABR']
geometryData = self._runConstraintTest('station_code', 'not in', collection)
for record in geometryData:
self.assertNotIn(record.getString('station_code'), collection)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('station_code', 'junk', self.obsStation)
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('station_code', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('station_code', 'in', [])
def testGetDataWithTimeRangeWithYearAndMonth1(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_monthly')
req.setLocationNames('KFNB')
req.setParameters('maxtemp_mon', 'min_sea_press')
startTime = datetime.datetime(2009, 1, 1)
endTime = datetime.datetime(2009, 12, 31)
tr = TimeRange(startTime, endTime)
self.runGeometryDataTestWithTimeRange(req, tr)
def testGetDataWithTimeRangeWithYearAndMonth2(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_monthly')
req.setLocationNames('KFNB')
req.setParameters('maxtemp_mon', 'min_sea_press')
startTime = datetime.datetime(2008, 1, 1)
endTime = datetime.datetime(2009, 3, 31)
tr = TimeRange(startTime, endTime)
self.runGeometryDataTestWithTimeRange(req, tr)
def testGetDataWithTimeRangeWithYearAndMonth3(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_monthly')
req.setLocationNames('KFNB')
req.setParameters('maxtemp_mon', 'min_sea_press')
startTime = datetime.datetime(2007, 7, 1)
endTime = datetime.datetime(2009, 3, 31)
tr = TimeRange(startTime, endTime)
self.runGeometryDataTestWithTimeRange(req, tr)
def testGetDataWithTimeRangeWithYearAndDayOfYear1(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_daily')
req.setLocationNames('KFNB')
req.setParameters('maxtemp_cal', 'min_press')
startTime = datetime.datetime(2009, 1, 1)
endTime = datetime.datetime(2009, 7, 31)
tr = TimeRange(startTime, endTime)
self.runGeometryDataTestWithTimeRange(req, tr)
def testGetDataWithTimeRangeWithYearAndDayOfYear2(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_daily')
req.setLocationNames('KFNB')
req.setParameters('maxtemp_cal', 'min_press')
startTime = datetime.datetime(2008, 7, 1)
endTime = datetime.datetime(2009, 3, 31)
tr = TimeRange(startTime, endTime)
self.runGeometryDataTestWithTimeRange(req, tr)
def testGetDataWithTimeRangeWithYearAndDayOfYear3(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_daily')
req.setLocationNames('KFNB')
req.setParameters('maxtemp_cal', 'min_press')
startTime = datetime.datetime(2007, 7, 1)
endTime = datetime.datetime(2009, 3, 31)
tr = TimeRange(startTime, endTime)
self.runGeometryDataTestWithTimeRange(req, tr)
def testGetDataWithTimeRangeWithPeriodTable(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_mon_season_yr')
req.setLocationNames('KFNB')
req.setParameters('max_temp', 'precip_total')
startTime = datetime.datetime(2007, 7, 1)
endTime = datetime.datetime(2009, 3, 31)
tr = TimeRange(startTime, endTime)
self.runGeometryDataTestWithTimeRange(req, tr)
def testGetDataWithTimeRangeWithForDateTable(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.daily_climate')
req.setLocationNames('KFNB')
req.setParameters('max_temp', 'precip', 'avg_wind_speed')
startTime = datetime.datetime(2007, 7, 1)
endTime = datetime.datetime(2009, 3, 31)
tr = TimeRange(startTime, endTime)
self.runGeometryDataTestWithTimeRange(req, tr)
def testNoDuplicateData(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.cli_asos_monthly')
req.setLocationNames('KOMA')
req.setParameters('maxtemp_day1')
rows = DAL.getGeometryData(req, DAL.getAvailableTimes(req)[0:5])
for i in range(len(rows)):
for j in range(len(rows)):
if i != j:
self.assertNotEqual(rows[i].__dict__, rows[j].__dict__)
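The pairwise uniqueness check above visits every ordered pair and re-tests each pair twice; the same invariant can be sketched with `itertools.combinations`, which visits each unordered pair once. This is a standalone illustration using plain dicts in place of the geometry records' `__dict__`:

```python
import itertools

def assert_no_duplicates(rows):
    """Raise AssertionError if any two records carry identical attributes."""
    for a, b in itertools.combinations(rows, 2):
        assert a != b, "duplicate record: %r" % (a,)

# Stand-in records; in the test above these would be rows[i].__dict__.
rows = [{'maxtemp_day1': 71, 'month': 6},
        {'maxtemp_day1': 75, 'month': 7}]
assert_no_duplicates(rows)  # unique records pass silently
```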

##
##
from awips.dataaccess import DataAccessLayer as DAL
from awips.dataaccess import CombinedTimeQuery as CTQ
import unittest
import os
#
# Test the CombinedTimeQuery module
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 06/24/16 5591 bsteffen Initial Creation.
# 11/08/16 5895 tgurney Change grid model
#
#
#
class CombinedTimeQueryTestCase(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
host = os.environ.get('DAF_TEST_HOST')
if host is None:
host = 'localhost'
DAL.changeEDEXHost(host)
def testSuccessfulQuery(self):
req = DAL.newDataRequest('grid')
req.setLocationNames('RUC130')
        req.setParameters('T', 'GH')
        req.setLevels('300MB', '500MB', '700MB')
        times = CTQ.getAvailableTimes(req)
self.assertNotEqual(len(times), 0)
def testNonIntersectingQuery(self):
"""
        Test that no times are returned when a parameter is only
        available on one of the levels.
"""
req = DAL.newDataRequest('grid')
req.setLocationNames('RUC130')
        req.setParameters('T', 'GH', 'LgSP1hr')
        req.setLevels('300MB', '500MB', '700MB', '0.0SFC')
        times = CTQ.getAvailableTimes(req)
self.assertEqual(len(times), 0)
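Conceptually, CombinedTimeQuery returns only the times available at every requested level, which is why the non-intersecting case above yields zero times. A minimal standalone sketch of that intersection, with hypothetical time strings standing in for DataTime objects:

```python
def combined_times(times_per_level):
    """Return only the times available at every level (set intersection)."""
    sets = [set(t) for t in times_per_level]
    common = set.intersection(*sets) if sets else set()
    return sorted(common)

# Hypothetical per-level availability: only '06z' and '12z' are shared.
shared = combined_times([
    ['00z', '06z', '12z'],   # e.g. 300MB
    ['06z', '12z', '18z'],   # e.g. 500MB
])
```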

##
##
from __future__ import print_function
from shapely.geometry import box
from awips.dataaccess import DataAccessLayer as DAL
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
import baseDafTestCase
import params
import unittest
#
# Test DAF support for common_obs_spatial data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 05/26/16 5587 njensen Added testGetIdentifierValues()
# 06/01/16 5587 tgurney Move testIdentifiers() to
# superclass
# 06/13/16 5574 tgurney Add advanced query tests
# 06/21/16 5548 tgurney Skip tests that cause errors
# 06/30/16 5725 tgurney Add test for NOT IN
# 12/07/16 5981 tgurney Parameterize
# 01/06/17 5981 tgurney Do not check data times
#
class CommonObsSpatialTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for common_obs_spatial data"""
datatype = "common_obs_spatial"
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier("country", ["US", "CN"])
self.runLocationsTest(req)
def testGetIdentifierValues(self):
self.runGetIdValuesTest(['country'])
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.setEnvelope(params.ENVELOPE)
req.setParameters("name", "stationid")
self.runGeometryDataTest(req, checkDataTimes=False)
def testRequestingTimesThrowsTimeAgnosticDataException(self):
req = DAL.newDataRequest(self.datatype)
self.runTimeAgnosticTest(req)
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
req.setParameters('catalogtype', 'elevation', 'state')
return self.runGeometryDataTest(req, checkDataTimes=False)
def testGetDataWithEqualsString(self):
geometryData = self._runConstraintTest('state', '=', 'NE')
for record in geometryData:
self.assertEqual(record.getString('state'), 'NE')
def testGetDataWithEqualsUnicode(self):
geometryData = self._runConstraintTest('state', '=', u'NE')
for record in geometryData:
self.assertEqual(record.getString('state'), 'NE')
def testGetDataWithEqualsInt(self):
geometryData = self._runConstraintTest('catalogtype', '=', 32)
for record in geometryData:
self.assertEqual(record.getNumber('catalogtype'), 32)
def testGetDataWithEqualsLong(self):
geometryData = self._runConstraintTest('elevation', '=', 0L)
for record in geometryData:
self.assertEqual(record.getNumber('elevation'), 0)
# No float test since there are no float identifiers available. Attempting
# to filter a non-float identifier on a float value raises an exception.
def testGetDataWithEqualsNone(self):
geometryData = self._runConstraintTest('state', '=', None)
for record in geometryData:
self.assertEqual(record.getType('state'), 'NULL')
def testGetDataWithNotEquals(self):
geometryData = self._runConstraintTest('state', '!=', 'NE')
for record in geometryData:
self.assertNotEqual(record.getString('state'), 'NE')
def testGetDataWithNotEqualsNone(self):
geometryData = self._runConstraintTest('state', '!=', None)
for record in geometryData:
self.assertNotEqual(record.getType('state'), 'NULL')
def testGetDataWithGreaterThan(self):
geometryData = self._runConstraintTest('elevation', '>', 500)
for record in geometryData:
self.assertGreater(record.getNumber('elevation'), 500)
def testGetDataWithLessThan(self):
geometryData = self._runConstraintTest('elevation', '<', 100)
for record in geometryData:
self.assertLess(record.getNumber('elevation'), 100)
def testGetDataWithGreaterThanEquals(self):
geometryData = self._runConstraintTest('elevation', '>=', 500)
for record in geometryData:
self.assertGreaterEqual(record.getNumber('elevation'), 500)
def testGetDataWithLessThanEquals(self):
geometryData = self._runConstraintTest('elevation', '<=', 100)
for record in geometryData:
self.assertLessEqual(record.getNumber('elevation'), 100)
def testGetDataWithInTuple(self):
collection = ('NE', 'TX')
geometryData = self._runConstraintTest('state', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('state'), collection)
def testGetDataWithInList(self):
collection = ['NE', 'TX']
geometryData = self._runConstraintTest('state', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('state'), collection)
def testGetDataWithInGenerator(self):
collection = ('NE', 'TX')
generator = (item for item in collection)
geometryData = self._runConstraintTest('state', 'in', generator)
for record in geometryData:
self.assertIn(record.getString('state'), collection)
def testGetDataWithNotInList(self):
collection = ('NE', 'TX')
geometryData = self._runConstraintTest('state', 'not in', collection)
for record in geometryData:
self.assertNotIn(record.getString('state'), collection)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('state', 'junk', 'NE')
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('state', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('state', 'in', [])

##
##
from dynamicserialize.dstypes.com.raytheon.uf.common.time import DataTime
import unittest
#
# Unit tests for Python implementation of DataTime
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 08/02/16 2416 tgurney Initial creation
#
#
class DataTimeTestCase(unittest.TestCase):
def testFromStrRefTimeOnly(self):
s = '2016-08-02 01:23:45'
expected = s
self.assertEqual(expected, str(DataTime(s)))
s = s.replace(' ', '_')
self.assertEqual(expected, str(DataTime(s)))
def testFromStrRefTimeOnlyZeroMillis(self):
s = '2016-08-02 01:23:45.0'
# result of str() will always drop trailing .0 milliseconds
expected = '2016-08-02 01:23:45'
self.assertEqual(expected, str(DataTime(s)))
s = s.replace(' ', '_')
self.assertEqual(expected, str(DataTime(s)))
def testFromStrRefTimeOnlyWithMillis(self):
s = '2016-08-02 01:23:45.1'
expected = '2016-08-02 01:23:45.001000'
self.assertEqual(expected, str(DataTime(s)))
s = s.replace(' ', '_')
self.assertEqual(expected, str(DataTime(s)))
def testFromStrWithFcstTimeHr(self):
s = '2016-08-02 01:23:45 (17)'
expected = s
self.assertEqual(expected, str(DataTime(s)))
s = s.replace(' ', '_')
self.assertEqual(expected, str(DataTime(s)))
def testFromStrWithFcstTimeHrZeroMillis(self):
s = '2016-08-02 01:23:45.0 (17)'
expected = '2016-08-02 01:23:45 (17)'
self.assertEqual(expected, str(DataTime(s)))
s = s.replace(' ', '_')
self.assertEqual(expected, str(DataTime(s)))
def testFromStrWithFcstTimeHrAndMillis(self):
s = '2016-08-02 01:23:45.1 (17)'
expected = '2016-08-02 01:23:45.001000 (17)'
self.assertEqual(expected, str(DataTime(s)))
s = s.replace(' ', '_')
self.assertEqual(expected, str(DataTime(s)))
def testFromStrWithFcstTimeHrMin(self):
s = '2016-08-02 01:23:45 (17:34)'
expected = s
self.assertEqual(expected, str(DataTime(s)))
s = s.replace(' ', '_')
self.assertEqual(expected, str(DataTime(s)))
def testFromStrWithFcstTimeHrMinZeroMillis(self):
s = '2016-08-02 01:23:45.0 (17:34)'
expected = '2016-08-02 01:23:45 (17:34)'
self.assertEqual(expected, str(DataTime(s)))
s = s.replace(' ', '_')
self.assertEqual(expected, str(DataTime(s)))
def testFromStrWithPeriod(self):
s = '2016-08-02 01:23:45[2016-08-02 02:34:45--2016-08-02 03:45:56]'
expected = s
self.assertEqual(expected, str(DataTime(s)))
s = s.replace(' ', '_')
self.assertEqual(expected, str(DataTime(s)))
def testFromStrWithPeriodZeroMillis(self):
s = '2016-08-02 01:23:45.0[2016-08-02 02:34:45.0--2016-08-02 03:45:56.0]'
expected = '2016-08-02 01:23:45[2016-08-02 02:34:45--2016-08-02 03:45:56]'
self.assertEqual(expected, str(DataTime(s)))
s = s.replace(' ', '_')
self.assertEqual(expected, str(DataTime(s)))
def testFromStrWithEverything(self):
s = '2016-08-02 01:23:45.0_(17:34)[2016-08-02 02:34:45.0--2016-08-02 03:45:56.0]'
expected = '2016-08-02 01:23:45 (17:34)[2016-08-02 02:34:45--2016-08-02 03:45:56]'
self.assertEqual(expected, str(DataTime(s)))
s = s.replace(' ', '_')
self.assertEqual(expected, str(DataTime(s)))
def testDataTimeReconstructItselfFromString(self):
times = [
'2016-08-02 01:23:45',
'2016-08-02 01:23:45.0',
'2016-08-02 01:23:45.1',
'2016-08-02 01:23:45.123000',
'2016-08-02 01:23:45 (17)',
'2016-08-02 01:23:45.0 (17)',
'2016-08-02 01:23:45.1 (17)',
'2016-08-02 01:23:45 (17:34)',
'2016-08-02 01:23:45.0 (17:34)',
'2016-08-02 01:23:45.1 (17:34)',
'2016-08-02 01:23:45.0[2016-08-02_02:34:45.0--2016-08-02_03:45:56.0]',
'2016-08-02 01:23:45.0[2016-08-02_02:34:45.123--2016-08-02_03:45:56.456]',
'2016-08-02 01:23:45.456_(17:34)[2016-08-02_02:34:45.0--2016-08-02_03:45:56.0]'
]
for time in times:
self.assertEqual(DataTime(time), DataTime(str(DataTime(time))), time)
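The cases above pin down an informal grammar for DataTime strings: a reference time (with `' '` and `'_'` interchangeable), an optional forecast time in parentheses (hours or hours:minutes), and an optional valid period in brackets delimited by `--`. A rough, stdlib-only sketch of that grammar, inferred from the expected strings — not the real DataTime implementation in dynamicserialize, and without its millisecond normalization:

```python
import re

# Timestamp with optional fractional seconds; ' ' and '_' are interchangeable.
_DT = r'\d{4}-\d{2}-\d{2}[ _]\d{2}:\d{2}:\d{2}(?:\.\d+)?'
_PATTERN = re.compile(
    r'^(?P<ref>' + _DT + r')'                 # reference time
    r'(?:[ _]\((?P<fcst>\d+(?::\d{2})?)\))?'  # optional fcst time, hr or hr:min
    r'(?:\[(?P<period>.+)\])?$'               # optional valid period
)

def parse_datatime_str(s):
    """Split a DataTime-style string into its components (sketch only)."""
    m = _PATTERN.match(s)
    if m is None:
        raise ValueError('unrecognized DataTime string: %r' % s)
    result = {'refTime': m.group('ref').replace('_', ' '),
              'fcstTime': m.group('fcst'),
              'validStart': None, 'validEnd': None}
    if m.group('period') is not None:
        start, end = m.group('period').split('--')
        result['validStart'] = start.replace('_', ' ')
        result['validEnd'] = end.replace('_', ' ')
    return result
```

For example, `parse_datatime_str('2016-08-02 01:23:45 (17)')` yields a `fcstTime` of `'17'` and no valid period.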

@ -0,0 +1,211 @@
##
##
from __future__ import print_function
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
from awips.dataaccess import DataAccessLayer as DAL
import baseDafTestCase
import params
import unittest
#
# Test DAF support for ffmp data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 04/18/16 5587 tgurney Add test for sane handling of
# zero records returned
# 06/20/16 5587 tgurney Add identifier values tests
# 07/01/16 5728 mapeters Add advanced query tests,
# include huc and accumHrs in
# id values tests, test that
# accumHrs id is never required
# 08/03/16 5728 mapeters Fixed minor bugs, replaced
# PRTM parameter since it isn't
# configured for ec-oma
# 11/08/16 5985 tgurney Do not check data times
# 12/07/16 5981 tgurney Parameterize
# 12/20/16 5981 tgurney Do not check data times
#
#
class FfmpTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for ffmp data"""
datatype = 'ffmp'
location = params.RADAR.lower()
@staticmethod
def addIdentifiers(req):
req.addIdentifier('wfo', params.SITE_ID)
req.addIdentifier('siteKey', 'hpe')
req.addIdentifier('dataKey', 'hpe')
req.addIdentifier('huc', 'ALL')
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
self.addIdentifiers(req)
self.runLocationsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
self.addIdentifiers(req)
req.setParameters('DHRMOSAIC')
self.runTimesTest(req)
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
self.addIdentifiers(req)
req.setParameters('DHRMOSAIC')
self.runGeometryDataTest(req, checkDataTimes=False)
def testGetGeometryDataEmptyResult(self):
req = DAL.newDataRequest(self.datatype)
self.addIdentifiers(req)
req.setParameters('blah blah blah') # force 0 records returned
result = self.runGeometryDataTest(req, checkDataTimes=False)
self.assertEqual(len(result), 0)
def testGetIdentifierValues(self):
req = DAL.newDataRequest(self.datatype)
optionalIds = set(DAL.getOptionalIdentifiers(req))
requiredIds = set(DAL.getRequiredIdentifiers(req))
ids = requiredIds | optionalIds
for id in ids:
req = DAL.newDataRequest(self.datatype)
if id == 'accumHrs':
req.setParameters('ARI6H2YR')
req.addIdentifier('wfo', params.SITE_ID)
req.addIdentifier('siteKey', self.location)
req.addIdentifier('huc', 'ALL')
idValues = DAL.getIdentifierValues(req, id)
self.assertTrue(hasattr(idValues, '__iter__'))
print(id + " values: " + str(idValues))
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
req.addIdentifier('wfo', params.SITE_ID)
req.addIdentifier('huc', 'ALL')
req.setParameters('QPFSCAN')
return self.runGeometryDataTest(req, checkDataTimes=False)
def testGetDataWithEqualsString(self):
geometryData = self._runConstraintTest('siteKey', '=', self.location)
for record in geometryData:
self.assertEqual(record.getAttribute('siteKey'), self.location)
def testGetDataWithEqualsUnicode(self):
# unicode() exists only in Python 2; on Python 3 this raises NameError.
geometryData = self._runConstraintTest('siteKey', '=', unicode(self.location))
for record in geometryData:
self.assertEqual(record.getAttribute('siteKey'), self.location)
# No numeric tests since no numeric identifiers are available that support
# RequestConstraints.
def testGetDataWithEqualsNone(self):
geometryData = self._runConstraintTest('siteKey', '=', None)
for record in geometryData:
self.assertIsNone(record.getAttribute('siteKey'))
def testGetDataWithNotEquals(self):
geometryData = self._runConstraintTest('siteKey', '!=', self.location)
for record in geometryData:
self.assertNotEqual(record.getAttribute('siteKey'), self.location)
def testGetDataWithNotEqualsNone(self):
geometryData = self._runConstraintTest('siteKey', '!=', None)
for record in geometryData:
self.assertIsNotNone(record.getAttribute('siteKey'))
def testGetDataWithGreaterThan(self):
geometryData = self._runConstraintTest('siteKey', '>', self.location)
for record in geometryData:
self.assertGreater(record.getAttribute('siteKey'), self.location)
def testGetDataWithLessThan(self):
geometryData = self._runConstraintTest('siteKey', '<', self.location)
for record in geometryData:
self.assertLess(record.getAttribute('siteKey'), self.location)
def testGetDataWithGreaterThanEquals(self):
geometryData = self._runConstraintTest('siteKey', '>=', self.location)
for record in geometryData:
self.assertGreaterEqual(record.getAttribute('siteKey'), self.location)
def testGetDataWithLessThanEquals(self):
geometryData = self._runConstraintTest('siteKey', '<=', self.location)
for record in geometryData:
self.assertLessEqual(record.getAttribute('siteKey'), self.location)
def testGetDataWithInList(self):
collection = [self.location, 'kuex']
geometryData = self._runConstraintTest('siteKey', 'in', collection)
for record in geometryData:
self.assertIn(record.getAttribute('siteKey'), collection)
def testGetDataWithNotInList(self):
collection = [self.location, 'kuex']
geometryData = self._runConstraintTest('siteKey', 'not in', collection)
for record in geometryData:
self.assertNotIn(record.getAttribute('siteKey'), collection)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('siteKey', 'junk', self.location)
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('siteKey', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('siteKey', 'in', [])
def testGetDataWithSiteKeyAndDataKeyConstraints(self):
siteKeys = [self.location, 'hpe']
dataKeys = ['kuex', 'kdmx']
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('wfo', params.SITE_ID)
req.addIdentifier('huc', 'ALL')
siteKeysConstraint = RequestConstraint.new('in', siteKeys)
req.addIdentifier('siteKey', siteKeysConstraint)
dataKeysConstraint = RequestConstraint.new('in', dataKeys)
req.addIdentifier('dataKey', dataKeysConstraint)
req.setParameters('QPFSCAN')
geometryData = self.runGeometryDataTest(req, checkDataTimes=False)
for record in geometryData:
self.assertIn(record.getAttribute('siteKey'), siteKeys)
# dataKey attr. is comma-separated list of dataKeys that had data
for dataKey in record.getAttribute('dataKey').split(','):
self.assertIn(dataKey, dataKeys)
def testGetGuidanceDataWithoutAccumHrsIdentifierSet(self):
# Test that accumHrs identifier is not required for guidance data
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('wfo', params.SITE_ID)
req.addIdentifier('siteKey', self.location)
req.addIdentifier('huc', 'ALL')
req.setParameters('FFG0124hr')
self.runGeometryDataTest(req, checkDataTimes=False)
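The constraint tests in this module expect a specific validation contract from `RequestConstraint.new()`: an unknown operator raises `ValueError`, an unsupported value type such as a dict raises `TypeError`, and an empty collection for an `in` constraint raises `ValueError`. A minimal stdlib sketch of that contract, illustrative only — the real class lives in dynamicserialize:

```python
_OPERATORS = frozenset(['=', '!=', '>', '<', '>=', '<=', 'in', 'not in'])

def new_constraint(operator, value):
    """Validate (operator, value) the way the tests above expect (sketch)."""
    if operator not in _OPERATORS:
        raise ValueError('invalid constraint operator: %r' % operator)
    if operator in ('in', 'not in'):
        value = list(value)  # accept any iterable, including generators
        if not value:
            raise ValueError('cannot use an empty collection with %r' % operator)
    elif not isinstance(value, (str, int, float, type(None))):
        raise TypeError('unsupported constraint value type: %s' % type(value))
    return (operator, value)
```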

@ -0,0 +1,203 @@
##
##
from __future__ import print_function
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
from awips.dataaccess import DataAccessLayer as DAL
from shapely.geometry import box, Point
import baseDafTestCase
import params
import unittest
#
# Test DAF support for GFE data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 05/23/16 5637 bsteffen Test vectors
# 05/31/16 5587 tgurney Add getIdentifierValues tests
# 06/01/16 5587 tgurney Update testGetIdentifierValues
# 06/17/16 5574 mapeters Add advanced query tests
# 06/30/16 5725 tgurney Add test for NOT IN
# 11/07/16 5991 bsteffen Improve vector tests
# 12/07/16 5981 tgurney Parameterize
# 12/15/16 6040 tgurney Add testGetGridDataWithDbType
# 12/20/16 5981 tgurney Add envelope test
# 10/19/17 6491 tgurney Add test for dbtype identifier
# 11/10/17 6491 tgurney Replace modelName with
# parmId.dbId.modelName
#
#
class GfeTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for GFE data"""
datatype = 'gfe'
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('parmId.dbId.modelName', 'Fcst')
self.runLocationsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('parmId.dbId.modelName', 'Fcst')
req.addIdentifier('parmId.dbId.siteId', params.SITE_ID)
self.runTimesTest(req)
def testGetGridData(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('parmId.dbId.modelName', 'Fcst')
req.addIdentifier('parmId.dbId.siteId', params.SITE_ID)
req.setParameters('T')
gridDatas = self.runGridDataTest(req)
for gridData in gridDatas:
self.assertEqual(gridData.getAttribute('parmId.dbId.dbType'), '')
def testGetGridDataWithEnvelope(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('parmId.dbId.modelName', 'Fcst')
req.addIdentifier('parmId.dbId.siteId', params.SITE_ID)
req.setParameters('T')
req.setEnvelope(params.ENVELOPE)
gridData = self.runGridDataTest(req)
if not gridData:
raise unittest.SkipTest('no data available')
lons, lats = gridData[0].getLatLonCoords()
lons = lons.reshape(-1)
lats = lats.reshape(-1)
# Ensure all points are within one degree of the original box
# to allow slight margin of error for reprojection distortion.
testEnv = box(params.ENVELOPE.bounds[0] - 1, params.ENVELOPE.bounds[1] - 1,
params.ENVELOPE.bounds[2] + 1, params.ENVELOPE.bounds[3] + 1)
for i in range(len(lons)):
self.assertTrue(testEnv.contains(Point(lons[i], lats[i])))
def testGetVectorGridData(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('parmId.dbId.modelName', 'Fcst')
req.addIdentifier('parmId.dbId.siteId', params.SITE_ID)
req.setParameters('Wind')
times = DAL.getAvailableTimes(req)
if not times:
raise unittest.SkipTest('No Wind Data available for testing')
gridData = DAL.getGridData(req, [times[0]])
rawWind = None
rawDir = None
for grid in gridData:
if grid.getParameter() == 'Wind':
self.assertEqual(grid.getUnit(), 'kts')
rawWind = grid.getRawData()
elif grid.getParameter() == 'WindDirection':
self.assertEqual(grid.getUnit(), 'deg')
rawDir = grid.getRawData()
self.assertIsNotNone(rawWind, 'Wind Magnitude grid is not present')
self.assertIsNotNone(rawDir, 'Wind Direction grid is not present')
# rawWind and rawDir are numpy.ndarrays so comparison will result in boolean ndarrays.
self.assertTrue((rawWind >= 0).all(), 'Wind Speed should not contain negative values')
self.assertTrue((rawDir >= 0).all(), 'Wind Direction should not contain negative values')
self.assertTrue((rawDir <= 360).all(), 'Wind Direction should be less than or equal to 360')
self.assertFalse((rawDir == rawWind).all(), 'Wind Direction should be different from Wind Speed')
def testGetIdentifierValues(self):
req = DAL.newDataRequest(self.datatype)
optionalIds = set(DAL.getOptionalIdentifiers(req))
requiredIds = set(DAL.getRequiredIdentifiers(req))
self.runGetIdValuesTest(optionalIds | requiredIds)
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
req.setLocationNames(params.SITE_ID)
req.setParameters('T')
return self.runGridDataTest(req)
def testGetDataWithModelNameEqualsString(self):
gridData = self._runConstraintTest('parmId.dbId.modelName', '=', 'Fcst')
for record in gridData:
self.assertEqual(record.getAttribute('parmId.dbId.modelName'), 'Fcst')
def testGetDataWithDbTypeEqualsString(self):
gridData = self._runConstraintTest('parmId.dbId.dbType', '=', 'Prac')
for record in gridData:
self.assertEqual(record.getAttribute('parmId.dbId.dbType'), 'Prac')
def testGetDataWithEqualsUnicode(self):
gridData = self._runConstraintTest('parmId.dbId.modelName', '=', u'Fcst')
for record in gridData:
self.assertEqual(record.getAttribute('parmId.dbId.modelName'), 'Fcst')
# No numeric tests since no numeric identifiers are available.
def testGetDataWithEqualsNone(self):
gridData = self._runConstraintTest('parmId.dbId.modelName', '=', None)
for record in gridData:
self.assertIsNone(record.getAttribute('parmId.dbId.modelName'))
def testGetDataWithNotEquals(self):
gridData = self._runConstraintTest('parmId.dbId.modelName', '!=', 'Fcst')
for record in gridData:
self.assertNotEqual(record.getAttribute('parmId.dbId.modelName'), 'Fcst')
def testGetDataWithNotEqualsNone(self):
gridData = self._runConstraintTest('parmId.dbId.modelName', '!=', None)
for record in gridData:
self.assertIsNotNone(record.getAttribute('parmId.dbId.modelName'))
def testGetDataWithInTuple(self):
collection = ('Fcst', 'SAT')
gridData = self._runConstraintTest('parmId.dbId.modelName', 'in', collection)
for record in gridData:
self.assertIn(record.getAttribute('parmId.dbId.modelName'), collection)
def testGetDataWithInList(self):
collection = ['Fcst', 'SAT']
gridData = self._runConstraintTest('parmId.dbId.modelName', 'in', collection)
for record in gridData:
self.assertIn(record.getAttribute('parmId.dbId.modelName'), collection)
def testGetDataWithInGenerator(self):
collection = ('Fcst', 'SAT')
generator = (item for item in collection)
gridData = self._runConstraintTest('parmId.dbId.modelName', 'in', generator)
for record in gridData:
self.assertIn(record.getAttribute('parmId.dbId.modelName'), collection)
def testGetDataWithNotInList(self):
collection = ('Fcst', 'SAT')
gridData = self._runConstraintTest('parmId.dbId.modelName', 'not in', collection)
for record in gridData:
self.assertNotIn(record.getAttribute('parmId.dbId.modelName'), collection)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('parmId.dbId.modelName', 'junk', 'Fcst')
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('parmId.dbId.modelName', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('parmId.dbId.modelName', 'in', [])
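`testGetVectorGridData` above checks the usual meteorological invariants: a non-negative wind speed and a direction in [0, 360]. For reference, converting such a speed/direction pair into u/v components uses the "blowing from" convention; a small stdlib sketch, not part of the DAF API:

```python
import math

def wind_to_uv(speed, direction_deg):
    """Meteorological wind (speed, direction-from in degrees) -> (u, v).

    u is the eastward component and v the northward component, in the
    same units as speed. A wind FROM the west (270 deg) gives u > 0, v ~ 0.
    """
    rad = math.radians(direction_deg)
    return (-speed * math.sin(rad), -speed * math.cos(rad))
```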

@ -0,0 +1,203 @@
##
##
from __future__ import print_function
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
from awips.dataaccess import DataAccessLayer as DAL
from awips.ThriftClient import ThriftRequestException
import baseDafTestCase
import params
#
# Test DAF support for GFE edit area data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 06/08/17 6298 mapeters Initial Creation.
# 09/27/17 6463 tgurney Remove GID site identifier
#
#
class GfeEditAreaTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for GFE edit area data"""
datatype = 'gfeEditArea'
siteIdKey = 'siteId'
editAreaNames = ['ISC_NHA', 'SDZ066', 'StormSurgeWW_EditArea']
groupKey = 'group'
groups = ['ISC', 'WFOs', 'FIPS_' + params.SITE_ID]
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier(self.siteIdKey, params.SITE_ID)
with self.assertRaises(ThriftRequestException):
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier(self.siteIdKey, params.SITE_ID)
self.runLocationsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier(self.siteIdKey, params.SITE_ID)
with self.assertRaises(ThriftRequestException):
self.runTimesTest(req)
def testGetGeometryDataWithoutSiteIdThrowsException(self):
req = DAL.newDataRequest(self.datatype)
with self.assertRaises(ThriftRequestException):
self.runGeometryDataTest(req)
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier(self.siteIdKey, params.SITE_ID)
data = self.runGeometryDataTest(req)
for item in data:
self.assertEqual(params.SITE_ID, item.getAttribute(self.siteIdKey))
def testGetGeometryDataWithLocNames(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier(self.siteIdKey, params.SITE_ID)
req.setLocationNames(*self.editAreaNames)
data = self.runGeometryDataTest(req)
for item in data:
self.assertEqual(params.SITE_ID, item.getAttribute(self.siteIdKey))
self.assertIn(item.getLocationName(), self.editAreaNames)
def testGetGeometryDataWithGroups(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier(self.siteIdKey, params.SITE_ID)
req.addIdentifier(self.groupKey, RequestConstraint.new('in', self.groups))
data = self.runGeometryDataTest(req)
for item in data:
self.assertEqual(params.SITE_ID, item.getAttribute(self.siteIdKey))
self.assertIn(item.getAttribute(self.groupKey), self.groups)
def testGetGeometryDataWithLocNamesAndGroupsThrowException(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier(self.siteIdKey, params.SITE_ID)
req.setLocationNames(*self.editAreaNames)
req.addIdentifier(self.groupKey, RequestConstraint.new('in', self.groups))
with self.assertRaises(ThriftRequestException):
self.runGeometryDataTest(req)
def testGetGeometryDataWithEnvelope(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier(self.siteIdKey, params.SITE_ID)
req.setEnvelope(params.ENVELOPE)
data = self.runGeometryDataTest(req)
for item in data:
self.assertEqual(params.SITE_ID, item.getAttribute(self.siteIdKey))
self.assertTrue(params.ENVELOPE.intersects(item.getGeometry()))
def testGetIdentifierValues(self):
req = DAL.newDataRequest(self.datatype)
optionalIds = set(DAL.getOptionalIdentifiers(req))
requiredIds = set(DAL.getRequiredIdentifiers(req))
self.runGetIdValuesTest(optionalIds | requiredIds)
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
req.setLocationNames(*self.editAreaNames)
return self.runGeometryDataTest(req)
def testGetDataWithEqualsString(self):
geomData = self._runConstraintTest(self.siteIdKey, '=', params.SITE_ID)
for record in geomData:
self.assertEqual(record.getAttribute(self.siteIdKey), params.SITE_ID)
def testGetDataWithEqualsUnicode(self):
geomData = self._runConstraintTest(self.siteIdKey, '=', params.SITE_ID.decode('unicode-escape'))
for record in geomData:
self.assertEqual(record.getAttribute(self.siteIdKey), params.SITE_ID)
# No numeric tests since no numeric identifiers are available.
def testGetDataWithEqualsNone(self):
geomData = self._runConstraintTest(self.siteIdKey, '=', None)
for record in geomData:
self.assertIsNone(record.getAttribute(self.siteIdKey))
def testGetDataWithNotEquals(self):
geomData = self._runConstraintTest(self.siteIdKey, '!=', params.SITE_ID)
for record in geomData:
self.assertNotEqual(record.getAttribute(self.siteIdKey), params.SITE_ID)
def testGetDataWithNotEqualsNone(self):
geomData = self._runConstraintTest(self.siteIdKey, '!=', None)
for record in geomData:
self.assertIsNotNone(record.getAttribute(self.siteIdKey))
def testGetDataWithGreaterThan(self):
geomData = self._runConstraintTest(self.siteIdKey, '>', params.SITE_ID)
for record in geomData:
self.assertGreater(record.getAttribute(self.siteIdKey), params.SITE_ID)
def testGetDataWithLessThan(self):
geomData = self._runConstraintTest(self.siteIdKey, '<', params.SITE_ID)
for record in geomData:
self.assertLess(record.getAttribute(self.siteIdKey), params.SITE_ID)
def testGetDataWithGreaterThanEquals(self):
geomData = self._runConstraintTest(self.siteIdKey, '>=', params.SITE_ID)
for record in geomData:
self.assertGreaterEqual(record.getAttribute(self.siteIdKey), params.SITE_ID)
def testGetDataWithLessThanEquals(self):
geomData = self._runConstraintTest(self.siteIdKey, '<=', params.SITE_ID)
for record in geomData:
self.assertLessEqual(record.getAttribute(self.siteIdKey), params.SITE_ID)
def testGetDataWithInTuple(self):
collection = (params.SITE_ID,)
geomData = self._runConstraintTest(self.siteIdKey, 'in', collection)
for record in geomData:
self.assertIn(record.getAttribute(self.siteIdKey), collection)
def testGetDataWithInList(self):
collection = [params.SITE_ID,]
geomData = self._runConstraintTest(self.siteIdKey, 'in', collection)
for record in geomData:
self.assertIn(record.getAttribute(self.siteIdKey), collection)
def testGetDataWithInGenerator(self):
collection = (params.SITE_ID,)
generator = (item for item in collection)
geomData = self._runConstraintTest(self.siteIdKey, 'in', generator)
for record in geomData:
self.assertIn(record.getAttribute(self.siteIdKey), collection)
def testGetDataWithNotInList(self):
collection = [params.SITE_ID,]
geomData = self._runConstraintTest(self.siteIdKey, 'not in', collection)
for record in geomData:
self.assertNotIn(record.getAttribute(self.siteIdKey), collection)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest(self.siteIdKey, 'junk', params.SITE_ID)
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest(self.siteIdKey, '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest(self.siteIdKey, 'in', [])
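`testGetGeometryDataWithEnvelope` above relies on shapely's `intersects` predicate. At its coarsest, that predicate reduces to a bounding-box overlap test; a dependency-free sketch of just the bounding-box stage (a hypothetical helper, not part of this test suite):

```python
def bboxes_intersect(a, b):
    """True if axis-aligned (minx, miny, maxx, maxy) boxes overlap or touch."""
    return not (a[2] < b[0] or b[2] < a[0] or a[3] < b[1] or b[3] < a[1])
```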

@ -0,0 +1,271 @@
##
##
from __future__ import print_function
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
from shapely.geometry import box, Point
from awips.dataaccess import DataAccessLayer as DAL
from awips.ThriftClient import ThriftRequestException
import baseDafTestCase
import params
import unittest
#
# Test DAF support for grid data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 06/09/16 5587 tgurney Typo in id values test
# 07/06/16 5728 mapeters Add advanced query tests
# 08/03/16 5728 mapeters Add additional identifiers to testGetDataWith*
# tests to shorten run time and prevent EOFError
# 10/13/16 5942 bsteffen Test envelopes
# 11/08/16 5985 tgurney Skip certain tests when no
# data is available
# 12/07/16 5981 tgurney Parameterize
# 01/06/17 5981 tgurney Skip envelope test when no
# data is available
#
class GridTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for grid data"""
datatype = 'grid'
model = 'GFS160'
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('info.datasetId', self.model)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('info.datasetId', self.model)
self.runLocationsTest(req)
def testGetAvailableLevels(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('info.datasetId', self.model)
self.runLevelsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('info.datasetId', self.model)
req.setLevels('2FHAG')
self.runTimesTest(req)
def testGetGridData(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('info.datasetId', self.model)
req.setLevels('2FHAG')
req.setParameters('T')
self.runGridDataTest(req)
def testGetIdentifierValues(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('info.datasetId', 'ENSEMBLE')
req.setLevels('2FHAG')
req.setParameters('T')
idValues = DAL.getIdentifierValues(req, 'info.ensembleId')
self.assertTrue(hasattr(idValues, '__iter__'))
if idValues:
self.assertIn('ctl1', idValues)
self.assertIn('p1', idValues)
self.assertIn('n1', idValues)
else:
raise unittest.SkipTest("no data available")
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def testGetDataWithEnvelope(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('info.datasetId', self.model)
req.setLevels('2FHAG')
req.setParameters('T')
req.setEnvelope(params.ENVELOPE)
gridData = self.runGridDataTest(req)
if len(gridData) == 0:
raise unittest.SkipTest("No data available")
lons, lats = gridData[0].getLatLonCoords()
lons = lons.reshape(-1)
lats = lats.reshape(-1)
# Ensure all points are within one degree of the original box
# to allow slight margin of error for reprojection distortion.
testEnv = box(params.ENVELOPE.bounds[0] - 1, params.ENVELOPE.bounds[1] - 1,
params.ENVELOPE.bounds[2] + 1, params.ENVELOPE.bounds[3] + 1)
for i in range(len(lons)):
self.assertTrue(testEnv.contains(Point(lons[i], lats[i])))
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
req.addIdentifier('info.datasetId', self.model)
req.addIdentifier('info.level.masterLevel.name', 'FHAG')
req.addIdentifier('info.level.leveltwovalue', 3000.0)
req.setParameters('T')
return self.runGridDataTest(req)
def testGetDataWithEqualsString(self):
gridData = self._runConstraintTest('info.level.levelonevalue', '=', '2000.0')
for record in gridData:
self.assertEqual(record.getAttribute('info.level.levelonevalue'), 2000.0)
def testGetDataWithEqualsUnicode(self):
gridData = self._runConstraintTest('info.level.levelonevalue', '=', u'2000.0')
for record in gridData:
self.assertEqual(record.getAttribute('info.level.levelonevalue'), 2000.0)
def testGetDataWithEqualsInt(self):
gridData = self._runConstraintTest('info.level.levelonevalue', '=', 2000)
for record in gridData:
self.assertEqual(record.getAttribute('info.level.levelonevalue'), 2000)
def testGetDataWithEqualsLong(self):
# 2000L is a Python 2 long literal, so this module cannot be parsed by Python 3.
gridData = self._runConstraintTest('info.level.levelonevalue', '=', 2000L)
for record in gridData:
self.assertEqual(record.getAttribute('info.level.levelonevalue'), 2000)
def testGetDataWithEqualsFloat(self):
gridData = self._runConstraintTest('info.level.levelonevalue', '=', 2000.0)
for record in gridData:
self.assertEqual(round(record.getAttribute('info.level.levelonevalue'), 1), 2000.0)
def testGetDataWithEqualsNone(self):
gridData = self._runConstraintTest('info.level.levelonevalue', '=', None)
for record in gridData:
self.assertIsNone(record.getAttribute('info.level.levelonevalue'))
def testGetDataWithNotEquals(self):
gridData = self._runConstraintTest('info.level.levelonevalue', '!=', 2000.0)
for record in gridData:
self.assertNotEqual(record.getAttribute('info.level.levelonevalue'), 2000.0)
def testGetDataWithNotEqualsNone(self):
gridData = self._runConstraintTest('info.level.levelonevalue', '!=', None)
for record in gridData:
self.assertIsNotNone(record.getAttribute('info.level.levelonevalue'))
def testGetDataWithGreaterThan(self):
gridData = self._runConstraintTest('info.level.levelonevalue', '>', 2000.0)
for record in gridData:
self.assertGreater(record.getAttribute('info.level.levelonevalue'), 2000.0)
def testGetDataWithLessThan(self):
gridData = self._runConstraintTest('info.level.levelonevalue', '<', 2000.0)
for record in gridData:
self.assertLess(record.getAttribute('info.level.levelonevalue'), 2000.0)
def testGetDataWithGreaterThanEquals(self):
gridData = self._runConstraintTest('info.level.levelonevalue', '>=', 2000.0)
for record in gridData:
self.assertGreaterEqual(record.getAttribute('info.level.levelonevalue'), 2000.0)
def testGetDataWithLessThanEquals(self):
gridData = self._runConstraintTest('info.level.levelonevalue', '<=', 2000.0)
for record in gridData:
self.assertLessEqual(record.getAttribute('info.level.levelonevalue'), 2000.0)
def testGetDataWithInList(self):
collection = [2000.0, 1000.0]
gridData = self._runConstraintTest('info.level.levelonevalue', 'in', collection)
for record in gridData:
self.assertIn(record.getAttribute('info.level.levelonevalue'), collection)
def testGetDataWithNotInList(self):
collection = [2000.0, 1000.0]
gridData = self._runConstraintTest('info.level.levelonevalue', 'not in', collection)
for record in gridData:
self.assertNotIn(record.getAttribute('info.level.levelonevalue'), collection)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('info.level.levelonevalue', 'junk', '2000.0')
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('info.level.levelonevalue', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('info.level.levelonevalue', 'in', [])
def testGetDataWithLevelOneAndLevelTwoConstraints(self):
req = DAL.newDataRequest(self.datatype)
levelOneConstraint = RequestConstraint.new('>=', 2000.0)
req.addIdentifier('info.level.levelonevalue', levelOneConstraint)
levelTwoConstraint = RequestConstraint.new('in', (4000.0, 5000.0))
req.addIdentifier('info.level.leveltwovalue', levelTwoConstraint)
req.addIdentifier('info.datasetId', self.model)
req.addIdentifier('info.level.masterLevel.name', 'FHAG')
req.setParameters('T')
gridData = self.runGridDataTest(req)
for record in gridData:
self.assertGreaterEqual(record.getAttribute('info.level.levelonevalue'), 2000.0)
self.assertIn(record.getAttribute('info.level.leveltwovalue'), (4000.0, 5000.0))
def testGetDataWithMasterLevelNameInConstraint(self):
req = DAL.newDataRequest(self.datatype)
masterLevelConstraint = RequestConstraint.new('in', ('FHAG', 'K'))
req.addIdentifier('info.level.masterLevel.name', masterLevelConstraint)
req.addIdentifier('info.level.levelonevalue', 2000.0)
req.addIdentifier('info.level.leveltwovalue', 3000.0)
req.addIdentifier('info.datasetId', 'GFS160')
req.setParameters('T')
gridData = self.runGridDataTest(req)
for record in gridData:
self.assertIn(record.getAttribute('info.level.masterLevel.name'), ('FHAG', 'K'))
def testGetDataWithDatasetIdInConstraint(self):
req = DAL.newDataRequest(self.datatype)
# gfs160 is alias for GFS160 in this namespace
req.addIdentifier('namespace', 'gfeParamInfo')
datasetIdConstraint = RequestConstraint.new('in', ('gfs160', 'HRRR'))
req.addIdentifier('info.datasetId', datasetIdConstraint)
req.addIdentifier('info.level.masterLevel.name', 'FHAG')
req.addIdentifier('info.level.levelonevalue', 2000.0)
req.addIdentifier('info.level.leveltwovalue', 3000.0)
req.setParameters('T')
gridData = self.runGridDataTest(req, testSameShape=False)
for record in gridData:
self.assertIn(record.getAttribute('info.datasetId'), ('gfs160', 'HRRR'))
def testGetDataWithMasterLevelNameLessThanEqualsConstraint(self):
req = DAL.newDataRequest(self.datatype)
masterLevelConstraint = RequestConstraint.new('<=', 'K')
req.addIdentifier('info.level.masterLevel.name', masterLevelConstraint)
req.addIdentifier('info.level.levelonevalue', 2000.0)
req.addIdentifier('info.level.leveltwovalue', 3000.0)
req.addIdentifier('info.datasetId', 'GFS160')
req.setParameters('T')
gridData = self.runGridDataTest(req)
for record in gridData:
self.assertLessEqual(record.getAttribute('info.level.masterLevel.name'), 'K')
def testGetDataWithComplexConstraintAndNamespaceThrowsException(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('namespace', 'grib')
masterLevelConstraint = RequestConstraint.new('<=', 'K')
req.addIdentifier('info.level.masterLevel.name', masterLevelConstraint)
req.addIdentifier('info.datasetId', 'GFS160')
req.setParameters('T')
with self.assertRaises(ThriftRequestException) as cm:
self.runGridDataTest(req)
self.assertIn('IncompatibleRequestException', str(cm.exception))
self.assertIn('info.level.masterLevel.name', str(cm.exception))


@@ -0,0 +1,251 @@
##
##
from __future__ import print_function
import datetime
from awips.dataaccess import DataAccessLayer as DAL
from awips.ThriftClient import ThriftRequestException
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
from dynamicserialize.dstypes.com.raytheon.uf.common.time import TimeRange
import baseDafTestCase
import unittest
#
# Test DAF support for hydro data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 04/21/16 5596 tgurney Add tests to verify #5596
# 04/26/16 5587 tgurney Add identifier values tests
# 06/09/16 5574 tgurney Add advanced query tests
# 06/13/16 5574 tgurney Fix checks for None
# 06/21/16 5548 tgurney Skip tests that cause errors
# 06/30/16 5725 tgurney Add test for NOT IN
# 10/06/16 5926 dgilling Add additional location tests.
#
#
class HydroTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for hydro data"""
datatype = 'hydro'
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'height')
self.runParametersTest(req)
def testGetAvailableParametersFullyQualifiedTable(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'public.height')
self.runParametersTest(req)
def testGetAvailableParamsNoTableThrowsInvalidIdentifiersException(self):
req = DAL.newDataRequest(self.datatype)
with self.assertRaises(ThriftRequestException) as cm:
self.runParametersTest(req)
self.assertIn('InvalidIdentifiersException', str(cm.exception))
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'height')
self.runLocationsTest(req)
def testGetAvailableLocationsWithConstraint(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'height')
req.addIdentifier('value', RequestConstraint.new('>', 5.0))
self.runLocationsTest(req)
def testGetAvailableLocationsWithInvalidTable(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'city')
with self.assertRaises(ThriftRequestException) as cm:
DAL.getAvailableLocationNames(req)
self.assertIn('IncompatibleRequestException', str(cm.exception))
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'height')
req.setParameters('lid', 'quality_code')
self.runTimesTest(req)
def testGetGeometryDataWithoutLocationSpecified(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'height')
req.setParameters('lid', 'quality_code')
self.runGeometryDataTest(req)
def testGetGeometryDataWithLocationSpecified(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'fcstheight')
locs = DAL.getAvailableLocationNames(req)
if locs:
req.setLocationNames(locs[0])
req.setParameters('probability', 'value')
data = self.runGeometryDataTest(req)
self.assertNotEqual(len(data), 0)
def testGetTableIdentifierValues(self):
self.runGetIdValuesTest(['table'])
def testGetColumnIdValuesWithTable(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'height')
idValues = DAL.getIdentifierValues(req, 'lid')
self.assertTrue(hasattr(idValues, '__iter__'))
@unittest.skip('avoid EDEX error')
def testGetColumnIdValuesWithNonexistentTableThrowsException(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'nonexistentjunk')
with self.assertRaises(ThriftRequestException):
idValues = DAL.getIdentifierValues(req, 'lid')
def testGetColumnIdValuesWithoutTableThrowsException(self):
req = DAL.newDataRequest(self.datatype)
with self.assertRaises(ThriftRequestException):
idValues = DAL.getIdentifierValues(req, 'lid')
@unittest.skip('avoid EDEX error')
def testGetNonexistentColumnIdValuesThrowsException(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'height')
with self.assertRaises(ThriftRequestException):
idValues = DAL.getIdentifierValues(req, 'nonexistentjunk')
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
req.addIdentifier('table', 'height')
req.addIdentifier('ts', 'RG')
req.setParameters('value', 'lid', 'quality_code')
return self.runGeometryDataTest(req)
def testGetDataWithEqualsString(self):
geometryData = self._runConstraintTest('value', '=', '3')
for record in geometryData:
self.assertEqual(record.getNumber('value'), 3)
def testGetDataWithEqualsUnicode(self):
geometryData = self._runConstraintTest('value', '=', u'3')
for record in geometryData:
self.assertEqual(record.getNumber('value'), 3)
def testGetDataWithEqualsInt(self):
geometryData = self._runConstraintTest('value', '=', 3)
for record in geometryData:
self.assertEqual(record.getNumber('value'), 3)
def testGetDataWithEqualsLong(self):
geometryData = self._runConstraintTest('value', '=', 3L)
for record in geometryData:
self.assertEqual(record.getNumber('value'), 3L)
def testGetDataWithEqualsFloat(self):
geometryData = self._runConstraintTest('value', '=', 3.0)
for record in geometryData:
self.assertEqual(round(record.getNumber('value'), 1), 3.0)
def testGetDataWithEqualsNone(self):
geometryData = self._runConstraintTest('value', '=', None)
self.assertEqual(len(geometryData), 0)
def testGetDataWithNotEquals(self):
geometryData = self._runConstraintTest('value', '!=', 3)
for record in geometryData:
self.assertNotEqual(record.getNumber('value'), '3')
def testGetDataWithNotEqualsNone(self):
geometryData = self._runConstraintTest('value', '!=', None)
self.assertNotEqual(len(geometryData), 0)
for record in geometryData:
self.assertNotEqual(record.getType('value'), 'NULL')
def testGetDataWithGreaterThan(self):
geometryData = self._runConstraintTest('value', '>', 3)
for record in geometryData:
self.assertGreater(record.getNumber('value'), 3)
def testGetDataWithLessThan(self):
geometryData = self._runConstraintTest('value', '<', 3)
for record in geometryData:
self.assertLess(record.getNumber('value'), 3)
def testGetDataWithGreaterThanEquals(self):
geometryData = self._runConstraintTest('value', '>=', 3)
for record in geometryData:
self.assertGreaterEqual(record.getNumber('value'), 3)
def testGetDataWithLessThanEquals(self):
geometryData = self._runConstraintTest('value', '<=', 3)
for record in geometryData:
self.assertLessEqual(record.getNumber('value'), 3)
def testGetDataWithInTuple(self):
collection = (3, 4)
geometryData = self._runConstraintTest('value', 'in', collection)
for record in geometryData:
self.assertIn(record.getNumber('value'), collection)
def testGetDataWithInList(self):
collection = [3, 4]
geometryData = self._runConstraintTest('value', 'in', collection)
for record in geometryData:
self.assertIn(record.getNumber('value'), collection)
def testGetDataWithInGenerator(self):
collection = (3, 4)
generator = (item for item in collection)
geometryData = self._runConstraintTest('value', 'in', generator)
for record in geometryData:
self.assertIn(record.getNumber('value'), collection)
def testGetDataWithNotInList(self):
collection = [3, 4]
geometryData = self._runConstraintTest('value', 'not in', collection)
for record in geometryData:
self.assertNotIn(record.getNumber('value'), collection)
def testGetDataWithTimeRange(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'height')
req.addIdentifier('ts', 'RG')
req.setParameters('value', 'lid', 'quality_code')
times = DAL.getAvailableTimes(req)
limitTimes = times[-self.numTimesToLimit:]
startTime = datetime.datetime.utcfromtimestamp(limitTimes[0].getRefTime().getTime()/1000)
endTime = datetime.datetime.utcnow()
tr = TimeRange(startTime, endTime)
self.runGeometryDataTestWithTimeRange(req, tr)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('value', 'junk', 3)
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('value', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('value', 'in', [])
def testGetDataWithNestedInConstraintThrowsException(self):
collection = ('3', '4', ())
with self.assertRaises(TypeError):
self._runConstraintTest('value', 'in', collection)
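The operator matrix exercised above ('=', '!=', '>', '<', '>=', '<=', 'in', 'not in') can be sketched as a small stand-alone evaluator. This is an illustrative sketch only, not the dynamicserialize RequestConstraint implementation; it just mirrors the comparison semantics the tests assert on, including rejecting an unknown operator with ValueError as the invalid-constraint tests expect:

```python
import operator

# Illustrative evaluator for the constraint operators exercised in the
# tests above. NOT the real RequestConstraint class; names are made up.
_OPS = {
    '=': operator.eq,
    '!=': operator.ne,
    '>': operator.gt,
    '<': operator.lt,
    '>=': operator.ge,
    '<=': operator.le,
    'in': lambda value, collection: value in collection,
    'not in': lambda value, collection: value not in collection,
}


def evaluate(op, value, operand):
    """Apply a constraint operator; raise ValueError on an unknown one."""
    try:
        func = _OPS[op]
    except KeyError:
        raise ValueError('unknown constraint operator: %r' % op)
    return func(value, operand)


print(evaluate('>=', 3, 3))        # True
print(evaluate('in', 3, (3, 4)))   # True
```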


@@ -0,0 +1,68 @@
##
##
from __future__ import print_function
from shapely.geometry import Polygon
from awips.dataaccess import DataAccessLayer as DAL
import baseDafTestCase
import unittest
#
# Test DAF support for ldadmesonet data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 01/20/17 6095 tgurney Add null identifiers test
#
#
class LdadMesonetTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for ldadmesonet data"""
datatype = "ldadmesonet"
envelope = None
@classmethod
def getReqEnvelope(cls):
# Restrict the output to only records with latitude and
# longitude between -30 and 30.
if not cls.envelope:
vertices = [(-30, -30), (-30, 30), (30, 30), (30, -30)]
polygon = Polygon(vertices)
cls.envelope = polygon.envelope
return cls.envelope
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
req.setEnvelope(self.getReqEnvelope())
self.runLocationsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.setEnvelope(self.getReqEnvelope())
self.runTimesTest(req)
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.setParameters("highLevelCloud", "pressure")
req.setEnvelope(self.getReqEnvelope())
self.runGeometryDataTest(req)
def testGetGeometryDataNullIdentifiers(self):
req = DAL.newDataRequest(self.datatype)
req.setParameters("highLevelCloud", "pressure")
req.setEnvelope(self.getReqEnvelope())
req.identifiers = None
self.runGeometryDataTest(req)
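The envelope restriction used throughout this test case is built with Shapely's `Polygon.envelope`; as a rough sketch of what that restriction means, an axis-aligned bounding-box containment check in pure Python (no Shapely, coordinates hypothetical) looks like this. It mirrors the -30..30 box built by `getReqEnvelope()` above:

```python
# Minimal sketch of an axis-aligned envelope check, assuming points are
# (lon, lat) tuples. Mirrors the -30..30 box used by getReqEnvelope()
# but does not use Shapely; bounds here are illustrative defaults.

def in_envelope(point, min_xy=(-30.0, -30.0), max_xy=(30.0, 30.0)):
    """Return True if (x, y) lies inside the closed bounding box."""
    x, y = point
    return min_xy[0] <= x <= max_xy[0] and min_xy[1] <= y <= max_xy[1]


print(in_envelope((10.0, -5.0)))   # True: inside the box
print(in_envelope((45.0, 0.0)))    # False: east of the box
```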


@@ -0,0 +1,202 @@
##
##
from __future__ import print_function
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
from awips.dataaccess import DataAccessLayer as DAL
from awips.ThriftClient import ThriftRequestException
import baseDafTestCase
import unittest
#
# Test DAF support for maps data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 04/26/16 5587 tgurney Add identifier values tests
# 06/13/16 5574 mapeters Add advanced query tests
# 06/21/16 5548 tgurney Skip tests that cause errors
# 06/30/16 5725 tgurney Add test for NOT IN
# 01/06/17 5981 tgurney Do not check data times
#
#
class MapsTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for maps data"""
datatype = 'maps'
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'mapdata.county')
req.addIdentifier('geomField', 'the_geom')
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'mapdata.county')
req.addIdentifier('geomField', 'the_geom')
req.addIdentifier('locationField', 'cwa')
self.runLocationsTest(req)
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'mapdata.county')
req.addIdentifier('geomField', 'the_geom')
req.addIdentifier('inLocation', 'true')
req.addIdentifier('locationField', 'cwa')
req.setLocationNames('OAX')
req.addIdentifier('cwa', 'OAX')
req.setParameters('countyname', 'state', 'fips')
self.runGeometryDataTest(req, checkDataTimes=False)
def testRequestingTimesThrowsTimeAgnosticDataException(self):
req = DAL.newDataRequest(self.datatype)
self.runTimeAgnosticTest(req)
def testGetTableIdentifierValues(self):
self.runGetIdValuesTest(['table'])
def testGetGeomFieldIdentifierValues(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'mapdata.county')
idValues = DAL.getIdentifierValues(req, 'geomField')
for idValue in idValues:
self.assertTrue(idValue.startswith('the_geom'))
def testGetGeomFieldIdValuesWithoutTableThrowsException(self):
with self.assertRaises(ThriftRequestException):
self.runGetIdValuesTest(['geomField'])
def testGetColumnIdValuesWithTable(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'mapdata.county')
req.addIdentifier('geomField', 'the_geom')
idValues = DAL.getIdentifierValues(req, 'state')
self.assertIn('NE', idValues)
def testGetColumnIdValuesWithoutTableThrowsException(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('geomField', 'the_geom')
with self.assertRaises(ThriftRequestException):
idValues = DAL.getIdentifierValues(req, 'state')
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier('table', 'mapdata.ffmp_basins')
req.addIdentifier('geomField', 'the_geom')
req.addIdentifier('cwa', 'OAX')
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
req.setParameters('state', 'reservoir', 'area_sq_mi')
return self.runGeometryDataTest(req, checkDataTimes=False)
def testGetDataWithEqualsString(self):
geometryData = self._runConstraintTest('state', '=', 'NE')
for record in geometryData:
self.assertEqual(record.getString('state'), 'NE')
def testGetDataWithEqualsUnicode(self):
geometryData = self._runConstraintTest('state', '=', u'NE')
for record in geometryData:
self.assertEqual(record.getString('state'), 'NE')
def testGetDataWithEqualsInt(self):
geometryData = self._runConstraintTest('reservoir', '=', 1)
for record in geometryData:
self.assertEqual(record.getNumber('reservoir'), 1)
def testGetDataWithEqualsLong(self):
geometryData = self._runConstraintTest('reservoir', '=', 1L)
for record in geometryData:
self.assertEqual(record.getNumber('reservoir'), 1)
def testGetDataWithEqualsFloat(self):
geometryData = self._runConstraintTest('area_sq_mi', '=', 5.00)
for record in geometryData:
self.assertEqual(round(record.getNumber('area_sq_mi'), 2), 5.00)
def testGetDataWithEqualsNone(self):
geometryData = self._runConstraintTest('state', '=', None)
for record in geometryData:
self.assertEqual(record.getType('state'), 'NULL')
def testGetDataWithNotEquals(self):
geometryData = self._runConstraintTest('state', '!=', 'NE')
for record in geometryData:
self.assertNotEqual(record.getString('state'), 'NE')
def testGetDataWithNotEqualsNone(self):
geometryData = self._runConstraintTest('state', '!=', None)
for record in geometryData:
self.assertNotEqual(record.getType('state'), 'NULL')
def testGetDataWithGreaterThan(self):
geometryData = self._runConstraintTest('area_sq_mi', '>', 5)
for record in geometryData:
self.assertGreater(record.getNumber('area_sq_mi'), 5)
def testGetDataWithLessThan(self):
geometryData = self._runConstraintTest('area_sq_mi', '<', 5)
for record in geometryData:
self.assertLess(record.getNumber('area_sq_mi'), 5)
def testGetDataWithGreaterThanEquals(self):
geometryData = self._runConstraintTest('area_sq_mi', '>=', 5)
for record in geometryData:
self.assertGreaterEqual(record.getNumber('area_sq_mi'), 5)
def testGetDataWithLessThanEquals(self):
geometryData = self._runConstraintTest('area_sq_mi', '<=', 5)
for record in geometryData:
self.assertLessEqual(record.getNumber('area_sq_mi'), 5)
def testGetDataWithInTuple(self):
collection = ('NE', 'TX')
geometryData = self._runConstraintTest('state', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('state'), collection)
def testGetDataWithInList(self):
collection = ['NE', 'TX']
geometryData = self._runConstraintTest('state', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('state'), collection)
def testGetDataWithInGenerator(self):
collection = ('NE', 'TX')
generator = (item for item in collection)
geometryData = self._runConstraintTest('state', 'in', generator)
for record in geometryData:
self.assertIn(record.getString('state'), collection)
def testGetDataWithNotInList(self):
collection = ['IA', 'TX']
geometryData = self._runConstraintTest('state', 'not in', collection)
for record in geometryData:
self.assertNotIn(record.getString('state'), collection)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('state', 'junk', 'NE')
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('state', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('state', 'in', [])


@@ -0,0 +1,200 @@
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
import baseDafTestCase
import params
import unittest
#
# Test DAF support for modelsounding data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 06/09/16 5587 bsteffen Add getIdentifierValues tests
# 06/13/16 5574 tgurney Add advanced query tests
# 06/30/16 5725 tgurney Add test for NOT IN
# 11/10/16 5985 tgurney Mark expected failures prior
# to 17.3.1
# 12/07/16 5981 tgurney Parameterize
# 12/19/16 5981 tgurney Remove pre-17.3 expected fails
# 12/20/16 5981 tgurney Add envelope test
#
#
class ModelSoundingTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for modelsounding data"""
datatype = "modelsounding"
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier("reportType", "ETA")
self.runLocationsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier("reportType", "ETA")
req.setLocationNames(params.OBS_STATION)
self.runTimesTest(req)
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier("reportType", "ETA")
req.setLocationNames(params.OBS_STATION)
req.setParameters("temperature", "pressure", "specHum", "sfcPress", "temp2", "q2")
print("Testing getGeometryData()")
geomData = DAL.getGeometryData(req)
print("Number of geometry records: " + str(len(geomData)))
print("Sample geometry data:")
for record in geomData[:self.sampleDataLimit]:
print("level=" + record.getLevel(), end="")
# One-dimensional parameters are reported on the 0.0UNKNOWN level.
# Two-dimensional parameters are reported on MB levels derived from pressure.
if record.getLevel() == "0.0UNKNOWN":
print(" sfcPress=" + record.getString("sfcPress") +
record.getUnit("sfcPress"), end="")
print(" temp2=" + record.getString("temp2") +
record.getUnit("temp2"), end="")
print(" q2=" + record.getString("q2") +
record.getUnit("q2"), end="")
else:
print(" pressure=" + record.getString("pressure") +
record.getUnit("pressure"), end="")
print(" temperature=" + record.getString("temperature") +
record.getUnit("temperature"), end="")
print(" specHum=" + record.getString("specHum") +
record.getUnit("specHum"), end="")
print(" geometry=" + str(record.getGeometry()))
print("getGeometryData() complete\n\n")
def testGetGeometryDataWithEnvelope(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier("reportType", "ETA")
req.setEnvelope(params.ENVELOPE)
req.setParameters("temperature", "pressure", "specHum", "sfcPress", "temp2", "q2")
print("Testing getGeometryData()")
data = DAL.getGeometryData(req)
for item in data:
self.assertTrue(params.ENVELOPE.contains(item.getGeometry()))
def testGetIdentifierValues(self):
req = DAL.newDataRequest(self.datatype)
optionalIds = set(DAL.getOptionalIdentifiers(req))
self.runGetIdValuesTest(optionalIds)
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.setParameters('dataURI')
req.setLocationNames(params.OBS_STATION, 'KORD', 'KOFK', 'KLNK')
req.addIdentifier(key, constraint)
return self.runGeometryDataTest(req)
# We can filter on reportType, but it is not possible to retrieve the
# value of reportType directly, so we look inside the dataURI instead.
#
# For operators like '<=' and '>' the best we can do is send the request
# and check whether it raises an exception.
#
# The number of returned records can also be eyeballed as a sanity check.
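The dataURI workaround described above can be sketched outside the test harness. This standalone check is illustrative only: the dataURI strings below are made-up examples in the slash-delimited style the assertions rely on, not records from a live EDEX server:

```python
# Illustrative sketch: recognize a reportType token inside a
# slash-delimited dataURI, since the value cannot be read back directly.
# The URIs below are hypothetical examples, not real EDEX records.

def uri_has_report_type(data_uri, report_type):
    """Return True if the dataURI contains the given reportType segment."""
    return '/' + report_type + '/' in data_uri


sample_uris = [
    '/modelsounding/2018-09-05_12:00:00.0/ETA/KOAX',
    '/modelsounding/2018-09-05_12:00:00.0/GFS/KOAX',
]

# Keep only the URIs whose reportType segment is ETA.
eta_only = [u for u in sample_uris if uri_has_report_type(u, 'ETA')]
print(eta_only)
```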
def testGetDataWithEqualsString(self):
geometryData = self._runConstraintTest('reportType', '=', 'ETA')
for record in geometryData:
self.assertIn('/ETA/', record.getString('dataURI'))
def testGetDataWithEqualsUnicode(self):
geometryData = self._runConstraintTest('reportType', '=', u'ETA')
for record in geometryData:
self.assertIn('/ETA/', record.getString('dataURI'))
# No numeric tests since no numeric identifiers are available.
def testGetDataWithEqualsNone(self):
geometryData = self._runConstraintTest('reportType', '=', None)
def testGetDataWithNotEquals(self):
geometryData = self._runConstraintTest('reportType', '!=', 'ETA')
for record in geometryData:
self.assertNotIn('/ETA/', record.getString('dataURI'))
def testGetDataWithNotEqualsNone(self):
geometryData = self._runConstraintTest('reportType', '!=', None)
def testGetDataWithGreaterThan(self):
geometryData = self._runConstraintTest('reportType', '>', 'ETA')
def testGetDataWithLessThan(self):
geometryData = self._runConstraintTest('reportType', '<', 'ETA')
def testGetDataWithGreaterThanEquals(self):
geometryData = self._runConstraintTest('reportType', '>=', 'ETA')
def testGetDataWithLessThanEquals(self):
geometryData = self._runConstraintTest('reportType', '<=', 'ETA')
def testGetDataWithInTuple(self):
collection = ('ETA', 'GFS')
geometryData = self._runConstraintTest('reportType', 'in', collection)
for record in geometryData:
dataURI = record.getString('dataURI')
self.assertTrue('/ETA/' in dataURI or '/GFS/' in dataURI)
def testGetDataWithInList(self):
collection = ['ETA', 'GFS']
geometryData = self._runConstraintTest('reportType', 'in', collection)
for record in geometryData:
dataURI = record.getString('dataURI')
self.assertTrue('/ETA/' in dataURI or '/GFS/' in dataURI)
def testGetDataWithInGenerator(self):
collection = ('ETA', 'GFS')
generator = (item for item in collection)
geometryData = self._runConstraintTest('reportType', 'in', generator)
for record in geometryData:
dataURI = record.getString('dataURI')
self.assertTrue('/ETA/' in dataURI or '/GFS/' in dataURI)
def testGetDataWithNotInList(self):
collection = ['ETA', 'GFS']
geometryData = self._runConstraintTest('reportType', 'not in', collection)
for record in geometryData:
dataURI = record.getString('dataURI')
self.assertTrue('/ETA/' not in dataURI and '/GFS/' not in dataURI)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('reportType', 'junk', 'ETA')
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('reportType', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('reportType', 'in', [])
def testGetDataWithNestedInConstraintThrowsException(self):
collection = ('ETA', 'GFS', ())
with self.assertRaises(TypeError):
self._runConstraintTest('reportType', 'in', collection)


@@ -0,0 +1,169 @@
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
import baseDafTestCase
import params
import unittest
#
# Test DAF support for obs data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 06/09/16 5587 bsteffen Add getIdentifierValues tests
# 06/13/16 5574 tgurney Add advanced query tests
# 06/30/16 5725 tgurney Add test for NOT IN
# 12/07/16 5981 tgurney Parameterize
# 12/20/16 5981 tgurney Add envelope test
#
#
class ObsTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for obs data"""
datatype = "obs"
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
self.runLocationsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.setLocationNames(params.OBS_STATION)
self.runTimesTest(req)
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.setLocationNames(params.OBS_STATION)
req.setParameters("temperature", "seaLevelPress", "dewpoint")
data = self.runGeometryDataTest(req)
def testGetGeometryDataWithEnvelope(self):
req = DAL.newDataRequest(self.datatype)
req.setEnvelope(params.ENVELOPE)
req.setParameters("temperature", "seaLevelPress", "dewpoint")
data = self.runGeometryDataTest(req)
for item in data:
self.assertTrue(params.ENVELOPE.contains(item.getGeometry()))
def testGetIdentifierValues(self):
req = DAL.newDataRequest(self.datatype)
optionalIds = set(DAL.getOptionalIdentifiers(req))
self.runGetIdValuesTest(optionalIds)
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.setParameters("temperature", "reportType")
req.setLocationNames(params.OBS_STATION)
req.addIdentifier(key, constraint)
return self.runGeometryDataTest(req)
def testGetDataWithEqualsString(self):
geometryData = self._runConstraintTest('reportType', '=', 'METAR')
for record in geometryData:
self.assertEqual(record.getString('reportType'), 'METAR')
def testGetDataWithEqualsUnicode(self):
geometryData = self._runConstraintTest('reportType', '=', u'METAR')
for record in geometryData:
self.assertEqual(record.getString('reportType'), 'METAR')
# No numeric tests since no numeric identifiers are available.
def testGetDataWithEqualsNone(self):
geometryData = self._runConstraintTest('reportType', '=', None)
for record in geometryData:
self.assertEqual(record.getType('reportType'), 'NULL')
def testGetDataWithNotEquals(self):
geometryData = self._runConstraintTest('reportType', '!=', 'METAR')
for record in geometryData:
self.assertNotEqual(record.getString('reportType'), 'METAR')
def testGetDataWithNotEqualsNone(self):
geometryData = self._runConstraintTest('reportType', '!=', None)
for record in geometryData:
self.assertNotEqual(record.getType('reportType'), 'NULL')
def testGetDataWithGreaterThan(self):
geometryData = self._runConstraintTest('reportType', '>', 'METAR')
for record in geometryData:
self.assertGreater(record.getString('reportType'), 'METAR')
def testGetDataWithLessThan(self):
geometryData = self._runConstraintTest('reportType', '<', 'METAR')
for record in geometryData:
self.assertLess(record.getString('reportType'), 'METAR')
def testGetDataWithGreaterThanEquals(self):
geometryData = self._runConstraintTest('reportType', '>=', 'METAR')
for record in geometryData:
self.assertGreaterEqual(record.getString('reportType'), 'METAR')
def testGetDataWithLessThanEquals(self):
geometryData = self._runConstraintTest('reportType', '<=', 'METAR')
for record in geometryData:
self.assertLessEqual(record.getString('reportType'), 'METAR')
def testGetDataWithInTuple(self):
collection = ('METAR', 'SPECI')
geometryData = self._runConstraintTest('reportType', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('reportType'), collection)
def testGetDataWithInList(self):
collection = ['METAR', 'SPECI']
geometryData = self._runConstraintTest('reportType', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('reportType'), collection)
def testGetDataWithInGenerator(self):
collection = ('METAR', 'SPECI')
generator = (item for item in collection)
geometryData = self._runConstraintTest('reportType', 'in', generator)
for record in geometryData:
self.assertIn(record.getString('reportType'), collection)
def testGetDataWithNotInList(self):
collection = ['METAR', 'SPECI']
geometryData = self._runConstraintTest('reportType', 'not in', collection)
for record in geometryData:
self.assertNotIn(record.getString('reportType'), collection)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('reportType', 'junk', 'METAR')
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('reportType', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('reportType', 'in', [])
def testGetDataWithNestedInConstraintThrowsException(self):
collection = ('METAR', 'SPECI', ())
with self.assertRaises(TypeError):
self._runConstraintTest('reportType', 'in', collection)


@@ -0,0 +1,74 @@
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
import baseDafTestCase
import params
import unittest
#
# Test DAF support for pirep data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 12/07/16 5981 tgurney Parameterize
# 12/20/16 5981 tgurney Add envelope test
#
#
class PirepTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for pirep data"""
datatype = "pirep"
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
self.runLocationsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.setLocationNames(params.AIRPORT)
self.runTimesTest(req)
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.setLocationNames(params.AIRPORT)
req.setParameters("temperature", "windSpeed", "hazardType", "turbType")
print("Testing getGeometryData()")
geomData = DAL.getGeometryData(req)
self.assertIsNotNone(geomData)
print("Number of geometry records: " + str(len(geomData)))
print("Sample geometry data:")
for record in geomData[:self.sampleDataLimit]:
print("level=", record.getLevel(), end="")
# One dimensional parameters are reported on the 0.0UNKNOWN level.
# 2D parameters are reported on MB levels from pressure.
if record.getLevel() == "0.0UNKNOWN":
print(" temperature=" + record.getString("temperature") + record.getUnit("temperature"), end="")
print(" windSpeed=" + record.getString("windSpeed") + record.getUnit("windSpeed"), end="")
else:
print(" hazardType=" + record.getString("hazardType"), end="")
print(" turbType=" + record.getString("turbType"), end="")
print(" geometry=", record.getGeometry())
print("getGeometryData() complete\n")
def testGetGeometryDataWithEnvelope(self):
req = DAL.newDataRequest(self.datatype)
req.setParameters("temperature", "windSpeed", "hazardType", "turbType")
req.setEnvelope(params.ENVELOPE)
print("Testing getGeometryData()")
data = DAL.getGeometryData(req)
for item in data:
self.assertTrue(params.ENVELOPE.contains(item.getGeometry()))


@@ -0,0 +1,32 @@
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
import baseDafTestCase
import testWarning
import unittest
#
# Test DAF support for practicewarning data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 06/10/16 5548 tgurney Inherit all tests from
# warning
#
class PracticeWarningTestCase(testWarning.WarningTestCase):
"""Test DAF support for practicewarning data"""
datatype = "practicewarning"
# All tests taken from testWarning


@@ -0,0 +1,63 @@
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
import baseDafTestCase
import unittest
#
# Test DAF support for profiler data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
#
#
class ProfilerTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for profiler data"""
datatype = "profiler"
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
self.runLocationsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
self.runTimesTest(req)
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.setParameters("temperature", "pressure", "uComponent", "vComponent")
print("Testing getGeometryData()")
geomData = DAL.getGeometryData(req)
self.assertIsNotNone(geomData)
print("Number of geometry records: " + str(len(geomData)))
print("Sample geometry data:")
for record in geomData[:self.sampleDataLimit]:
print("level:", record.getLevel(), end="")
# One dimensional parameters are reported on the 0.0UNKNOWN level.
# 2D parameters are reported on MB levels from pressure.
if record.getLevel() == "0.0UNKNOWN":
print(" temperature=" + record.getString("temperature") + record.getUnit("temperature"), end="")
print(" pressure=" + record.getString("pressure") + record.getUnit("pressure"), end="")
else:
print(" uComponent=" + record.getString("uComponent") + record.getUnit("uComponent"), end="")
print(" vComponent=" + record.getString("vComponent") + record.getUnit("vComponent"), end="")
print(" geometry:", record.getGeometry())
print("getGeometryData() complete\n\n")


@@ -0,0 +1,78 @@
##
##
import unittest
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
from awips.dataaccess import DataAccessLayer as DAL
import baseRadarTestCase
import params
#
# Test DAF support for radar graphics data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 08/25/16 2671 tgurney Initial creation.
# 08/31/16 2671 tgurney Add mesocyclone
# 09/08/16 2671 tgurney Add storm track
# 09/27/16 2671 tgurney Add hail index
# 09/30/16 2671 tgurney Add TVS
# 12/07/16 5981 tgurney Parameterize
# 12/19/16 5981 tgurney Do not check data times on
# returned data
#
#
class RadarGraphicsTestCase(baseRadarTestCase.BaseRadarTestCase):
"""Test DAF support for radar data"""
datatype = 'radar'
def runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
req.setParameters('166')
# TODO: Cannot check datatimes on the result because the times returned
# by getAvailableTimes have level = -1.0, while the time on the actual
# data has the correct level set (>= 0.0).
return self.runGeometryDataTest(req, checkDataTimes=False)
def testGetGeometryDataMeltingLayer(self):
req = DAL.newDataRequest(self.datatype)
req.setEnvelope(params.ENVELOPE)
req.setLocationNames(self.radarLoc)
req.setParameters('166')
self.runGeometryDataTest(req, checkDataTimes=False)
def testGetGeometryDataMesocyclone(self):
req = DAL.newDataRequest(self.datatype)
req.setEnvelope(params.ENVELOPE)
req.setLocationNames(self.radarLoc)
req.setParameters('141')
self.runGeometryDataTest(req, checkDataTimes=False)
def testGetGeometryDataStormTrack(self):
req = DAL.newDataRequest(self.datatype)
req.setEnvelope(params.ENVELOPE)
req.setLocationNames(self.radarLoc)
req.setParameters('58')
self.runGeometryDataTest(req, checkDataTimes=False)
def testGetGeometryDataHailIndex(self):
req = DAL.newDataRequest(self.datatype)
req.setEnvelope(params.ENVELOPE)
req.setLocationNames(self.radarLoc)
req.setParameters('59')
self.runGeometryDataTest(req, checkDataTimes=False)
def testGetGeometryDataTVS(self):
req = DAL.newDataRequest(self.datatype)
req.setEnvelope(params.ENVELOPE)
req.setLocationNames(self.radarLoc)
req.setParameters('61')
self.runGeometryDataTest(req, checkDataTimes=False)
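The numeric strings passed to `setParameters()` above are NEXRAD Level 3 product codes. The mapping below is inferred from the test method names (the descriptive labels are this note's wording, not values taken from the awips source):

```python
# Product codes exercised by the radar graphics tests above, labeled
# per the test method names (testGetGeometryDataStormTrack, etc.).
RADAR_GRAPHICS_PRODUCTS = {
    '58': 'Storm Track',
    '59': 'Hail Index',
    '61': 'Tornado Vortex Signature (TVS)',
    '141': 'Mesocyclone',
    '166': 'Melting Layer',
}
```

Keeping the codes as strings matches how the tests hand them to `setParameters()`.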


@@ -0,0 +1,44 @@
##
##
from awips.dataaccess import DataAccessLayer as DAL
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
import baseRadarTestCase
import params
import unittest
#
# Test DAF support for radar grid data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 08/25/16 2671 tgurney Initial creation
#
#
class RadarTestCase(baseRadarTestCase.BaseRadarTestCase):
"""Test DAF support for radar data"""
datatype = 'radar'
parameterList = ['94']
def runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
req.setParameters(*self.parameterList)
# Don't test shapes since they may differ.
return self.runGridDataTest(req, testSameShape=False)
def testGetGridData(self):
req = DAL.newDataRequest(self.datatype)
req.setEnvelope(params.ENVELOPE)
req.setLocationNames(self.radarLoc)
req.setParameters(*self.parameterList)
# Don't test shapes since they may differ.
self.runGridDataTest(req, testSameShape=False)


@@ -0,0 +1,163 @@
##
##
from __future__ import print_function
from shapely.geometry import box
from awips.dataaccess import DataAccessLayer as DAL
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
import baseDafTestCase
import params
import unittest
#
# Test DAF support for radar_spatial data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 05/26/16 5587 njensen Added testGetIdentifierValues()
# 06/01/16 5587 tgurney Move testIdentifiers() to
# superclass
# 06/13/16 5574 tgurney Add advanced query tests
# 06/30/16 5725 tgurney Add test for NOT IN
# 12/07/16 5981 tgurney Parameterize
# 01/06/17 5981 tgurney Do not check data times
#
#
class RadarSpatialTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for radar_spatial data"""
datatype = "radar_spatial"
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
req.setEnvelope(params.ENVELOPE)
self.runLocationsTest(req)
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetIdentifierValues(self):
self.runGetIdValuesTest(['wfo_id'])
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.setLocationNames("TORD", "TMDW")
req.setParameters("wfo_id", "name", "elevmeter")
self.runGeometryDataTest(req, checkDataTimes=False)
def testRequestingTimesThrowsTimeAgnosticDataException(self):
req = DAL.newDataRequest(self.datatype)
self.runTimeAgnosticTest(req)
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
req.setParameters('elevmeter', 'eqp_elv', 'wfo_id', 'immutablex')
return self.runGeometryDataTest(req, checkDataTimes=False)
def testGetDataWithEqualsString(self):
geometryData = self._runConstraintTest('wfo_id', '=', params.SITE_ID)
for record in geometryData:
self.assertEqual(record.getString('wfo_id'), params.SITE_ID)
def testGetDataWithEqualsUnicode(self):
geometryData = self._runConstraintTest('wfo_id', '=', unicode(params.SITE_ID))
for record in geometryData:
self.assertEqual(record.getString('wfo_id'), params.SITE_ID)
def testGetDataWithEqualsInt(self):
geometryData = self._runConstraintTest('immutablex', '=', 57)
for record in geometryData:
self.assertEqual(record.getNumber('immutablex'), 57)
def testGetDataWithEqualsLong(self):
geometryData = self._runConstraintTest('immutablex', '=', 57L)
for record in geometryData:
self.assertEqual(record.getNumber('immutablex'), 57)
def testGetDataWithEqualsFloat(self):
geometryData = self._runConstraintTest('immutablex', '=', 57.0)
for record in geometryData:
self.assertEqual(round(record.getNumber('immutablex'), 1), 57.0)
def testGetDataWithEqualsNone(self):
geometryData = self._runConstraintTest('wfo_id', '=', None)
for record in geometryData:
self.assertEqual(record.getType('wfo_id'), 'NULL')
def testGetDataWithNotEquals(self):
geometryData = self._runConstraintTest('wfo_id', '!=', params.SITE_ID)
for record in geometryData:
            self.assertNotEqual(record.getString('wfo_id'), params.SITE_ID)
def testGetDataWithNotEqualsNone(self):
geometryData = self._runConstraintTest('wfo_id', '!=', None)
for record in geometryData:
self.assertNotEqual(record.getType('wfo_id'), 'NULL')
def testGetDataWithGreaterThan(self):
geometryData = self._runConstraintTest('elevmeter', '>', 1000)
for record in geometryData:
self.assertGreater(record.getNumber('elevmeter'), 1000)
def testGetDataWithLessThan(self):
geometryData = self._runConstraintTest('elevmeter', '<', 1000)
for record in geometryData:
self.assertLess(record.getNumber('elevmeter'), 1000)
def testGetDataWithGreaterThanEquals(self):
geometryData = self._runConstraintTest('eqp_elv', '>=', 1295)
for record in geometryData:
self.assertGreaterEqual(record.getNumber('eqp_elv'), 1295)
def testGetDataWithLessThanEquals(self):
geometryData = self._runConstraintTest('eqp_elv', '<=', 138)
for record in geometryData:
self.assertLessEqual(record.getNumber('eqp_elv'), 138)
def testGetDataWithInTuple(self):
collection = (params.SITE_ID, 'GID')
geometryData = self._runConstraintTest('wfo_id', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('wfo_id'), collection)
def testGetDataWithInList(self):
collection = [params.SITE_ID, 'GID']
geometryData = self._runConstraintTest('wfo_id', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('wfo_id'), collection)
def testGetDataWithInGenerator(self):
collection = (params.SITE_ID, 'GID')
generator = (item for item in collection)
geometryData = self._runConstraintTest('wfo_id', 'in', generator)
for record in geometryData:
self.assertIn(record.getString('wfo_id'), collection)
def testGetDataWithNotInList(self):
collection = [params.SITE_ID, 'GID']
geometryData = self._runConstraintTest('wfo_id', 'not in', collection)
for record in geometryData:
self.assertNotIn(record.getString('wfo_id'), collection)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('wfo_id', 'junk', params.SITE_ID)
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('wfo_id', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('wfo_id', 'in', [])


@@ -0,0 +1,228 @@
##
##
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
import unittest
#
# Unit tests for Python implementation of RequestConstraint
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 07/22/16 2416 tgurney Initial creation
#
#
class RequestConstraintTestCase(unittest.TestCase):
def _newRequestConstraint(self, constraintType, constraintValue):
constraint = RequestConstraint()
constraint.constraintType = constraintType
constraint.constraintValue = constraintValue
return constraint
def testEvaluateEquals(self):
new = RequestConstraint.new
self.assertTrue(new('=', 3).evaluate(3))
self.assertTrue(new('=', 3).evaluate('3'))
self.assertTrue(new('=', '3').evaluate(3))
self.assertTrue(new('=', 12345).evaluate(12345L))
self.assertTrue(new('=', 'a').evaluate('a'))
self.assertTrue(new('=', 'a').evaluate(u'a'))
self.assertTrue(new('=', 1.0001).evaluate(2.0 - 0.999999))
self.assertTrue(new('=', 1.00001).evaluate(1))
self.assertFalse(new('=', 'a').evaluate(['a']))
self.assertFalse(new('=', 'a').evaluate(['b']))
self.assertFalse(new('=', 3).evaluate(4))
self.assertFalse(new('=', 4).evaluate(3))
self.assertFalse(new('=', 'a').evaluate('z'))
def testEvaluateNotEquals(self):
new = RequestConstraint.new
self.assertTrue(new('!=', 'a').evaluate(['a']))
self.assertTrue(new('!=', 'a').evaluate(['b']))
self.assertTrue(new('!=', 3).evaluate(4))
self.assertTrue(new('!=', 4).evaluate(3))
self.assertTrue(new('!=', 'a').evaluate('z'))
self.assertFalse(new('!=', 3).evaluate('3'))
self.assertFalse(new('!=', '3').evaluate(3))
self.assertFalse(new('!=', 3).evaluate(3))
self.assertFalse(new('!=', 12345).evaluate(12345L))
self.assertFalse(new('!=', 'a').evaluate('a'))
self.assertFalse(new('!=', 'a').evaluate(u'a'))
self.assertFalse(new('!=', 1.0001).evaluate(2.0 - 0.9999))
def testEvaluateGreaterThan(self):
new = RequestConstraint.new
self.assertTrue(new('>', 1.0001).evaluate(1.0002))
self.assertTrue(new('>', 'a').evaluate('b'))
self.assertTrue(new('>', 3).evaluate(4))
self.assertFalse(new('>', 20).evaluate(3))
self.assertFalse(new('>', 12345).evaluate(12345L))
self.assertFalse(new('>', 'a').evaluate('a'))
self.assertFalse(new('>', 'z').evaluate('a'))
self.assertFalse(new('>', 4).evaluate(3))
def testEvaluateGreaterThanEquals(self):
new = RequestConstraint.new
self.assertTrue(new('>=', 3).evaluate(3))
self.assertTrue(new('>=', 12345).evaluate(12345L))
self.assertTrue(new('>=', 'a').evaluate('a'))
self.assertTrue(new('>=', 1.0001).evaluate(1.0002))
self.assertTrue(new('>=', 'a').evaluate('b'))
self.assertTrue(new('>=', 3).evaluate(20))
self.assertFalse(new('>=', 1.0001).evaluate(1.0))
self.assertFalse(new('>=', 'z').evaluate('a'))
self.assertFalse(new('>=', 40).evaluate(3))
def testEvaluateLessThan(self):
new = RequestConstraint.new
self.assertTrue(new('<', 'z').evaluate('a'))
self.assertTrue(new('<', 30).evaluate(4))
self.assertFalse(new('<', 3).evaluate(3))
self.assertFalse(new('<', 12345).evaluate(12345L))
self.assertFalse(new('<', 'a').evaluate('a'))
self.assertFalse(new('<', 1.0001).evaluate(1.0002))
self.assertFalse(new('<', 'a').evaluate('b'))
self.assertFalse(new('<', 3).evaluate(40))
def testEvaluateLessThanEquals(self):
new = RequestConstraint.new
self.assertTrue(new('<=', 'z').evaluate('a'))
self.assertTrue(new('<=', 20).evaluate(3))
self.assertTrue(new('<=', 3).evaluate(3))
self.assertTrue(new('<=', 12345).evaluate(12345L))
self.assertTrue(new('<=', 'a').evaluate('a'))
self.assertFalse(new('<=', 1.0001).evaluate(1.0002))
self.assertFalse(new('<=', 'a').evaluate('b'))
self.assertFalse(new('<=', 4).evaluate(30))
def testEvaluateIsNull(self):
new = RequestConstraint.new
self.assertTrue(new('=', None).evaluate(None))
self.assertTrue(new('=', None).evaluate('null'))
self.assertFalse(new('=', None).evaluate(()))
self.assertFalse(new('=', None).evaluate(0))
self.assertFalse(new('=', None).evaluate(False))
def testEvaluateIsNotNull(self):
new = RequestConstraint.new
self.assertTrue(new('!=', None).evaluate(()))
self.assertTrue(new('!=', None).evaluate(0))
self.assertTrue(new('!=', None).evaluate(False))
self.assertFalse(new('!=', None).evaluate(None))
self.assertFalse(new('!=', None).evaluate('null'))
def testEvaluateIn(self):
new = RequestConstraint.new
self.assertTrue(new('in', [3]).evaluate(3))
self.assertTrue(new('in', ['a', 'b', 3]).evaluate(3))
self.assertTrue(new('in', 'a').evaluate('a'))
self.assertTrue(new('in', [3, 4, 5]).evaluate('5'))
self.assertTrue(new('in', [1.0001, 2, 3]).evaluate(2.0 - 0.9999))
self.assertFalse(new('in', ['a', 'b', 'c']).evaluate('d'))
self.assertFalse(new('in', 'a').evaluate('b'))
def testEvaluateNotIn(self):
new = RequestConstraint.new
self.assertTrue(new('not in', ['a', 'b', 'c']).evaluate('d'))
self.assertTrue(new('not in', [3, 4, 5]).evaluate(6))
self.assertTrue(new('not in', 'a').evaluate('b'))
self.assertFalse(new('not in', [3]).evaluate(3))
self.assertFalse(new('not in', ['a', 'b', 3]).evaluate(3))
self.assertFalse(new('not in', 'a').evaluate('a'))
self.assertFalse(new('not in', [1.0001, 2, 3]).evaluate(2.0 - 0.9999))
def testEvaluateLike(self):
# cannot make "like" with RequestConstraint.new()
new = self._newRequestConstraint
self.assertTrue(new('LIKE', 'a').evaluate('a'))
self.assertTrue(new('LIKE', 'a%').evaluate('a'))
self.assertTrue(new('LIKE', 'a%').evaluate('abcd'))
self.assertTrue(new('LIKE', '%a').evaluate('a'))
self.assertTrue(new('LIKE', '%a').evaluate('bcda'))
self.assertTrue(new('LIKE', '%').evaluate(''))
self.assertTrue(new('LIKE', '%').evaluate('anything'))
self.assertTrue(new('LIKE', 'a%d').evaluate('ad'))
self.assertTrue(new('LIKE', 'a%d').evaluate('abcd'))
self.assertTrue(new('LIKE', 'aa.()!{[]^%$').evaluate('aa.()!{[]^zzz$'))
self.assertTrue(new('LIKE', 'a__d%').evaluate('abcdefg'))
self.assertFalse(new('LIKE', 'a%').evaluate('b'))
self.assertFalse(new('LIKE', 'a%').evaluate('ba'))
self.assertFalse(new('LIKE', '%a').evaluate('b'))
self.assertFalse(new('LIKE', '%a').evaluate('ab'))
self.assertFalse(new('LIKE', 'a%').evaluate('A'))
self.assertFalse(new('LIKE', 'A%').evaluate('a'))
self.assertFalse(new('LIKE', 'a%d').evaluate('da'))
self.assertFalse(new('LIKE', 'a__d%').evaluate('abccdefg'))
self.assertFalse(new('LIKE', '....').evaluate('aaaa'))
self.assertFalse(new('LIKE', '.*').evaluate('anything'))
def testEvaluateILike(self):
# cannot make "ilike" with RequestConstraint.new()
new = self._newRequestConstraint
self.assertTrue(new('ILIKE', 'a').evaluate('a'))
self.assertTrue(new('ILIKE', 'a%').evaluate('a'))
self.assertTrue(new('ILIKE', 'a%').evaluate('abcd'))
self.assertTrue(new('ILIKE', '%a').evaluate('a'))
self.assertTrue(new('ILIKE', '%a').evaluate('bcda'))
self.assertTrue(new('ILIKE', '%').evaluate(''))
self.assertTrue(new('ILIKE', '%').evaluate('anything'))
self.assertTrue(new('ILIKE', 'a%d').evaluate('ad'))
self.assertTrue(new('ILIKE', 'a%d').evaluate('abcd'))
self.assertTrue(new('ILIKE', 'a').evaluate('A'))
self.assertTrue(new('ILIKE', 'a%').evaluate('A'))
self.assertTrue(new('ILIKE', 'a%').evaluate('ABCD'))
self.assertTrue(new('ILIKE', '%a').evaluate('A'))
self.assertTrue(new('ILIKE', '%a').evaluate('BCDA'))
self.assertTrue(new('ILIKE', '%').evaluate(''))
self.assertTrue(new('ILIKE', '%').evaluate('anything'))
self.assertTrue(new('ILIKE', 'a%d').evaluate('AD'))
self.assertTrue(new('ILIKE', 'a%d').evaluate('ABCD'))
self.assertTrue(new('ILIKE', 'A').evaluate('a'))
self.assertTrue(new('ILIKE', 'A%').evaluate('a'))
self.assertTrue(new('ILIKE', 'A%').evaluate('abcd'))
self.assertTrue(new('ILIKE', '%A').evaluate('a'))
self.assertTrue(new('ILIKE', '%A').evaluate('bcda'))
self.assertTrue(new('ILIKE', '%').evaluate(''))
self.assertTrue(new('ILIKE', '%').evaluate('anything'))
self.assertTrue(new('ILIKE', 'A%D').evaluate('ad'))
self.assertTrue(new('ILIKE', 'A%D').evaluate('abcd'))
self.assertTrue(new('ILIKE', 'aa.()!{[]^%$').evaluate('AA.()!{[]^zzz$'))
self.assertTrue(new('ILIKE', 'a__d%').evaluate('abcdefg'))
self.assertTrue(new('ILIKE', 'a__d%').evaluate('ABCDEFG'))
self.assertFalse(new('ILIKE', 'a%').evaluate('b'))
self.assertFalse(new('ILIKE', 'a%').evaluate('ba'))
self.assertFalse(new('ILIKE', '%a').evaluate('b'))
self.assertFalse(new('ILIKE', '%a').evaluate('ab'))
self.assertFalse(new('ILIKE', 'a%d').evaluate('da'))
self.assertFalse(new('ILIKE', 'a__d%').evaluate('abccdefg'))
self.assertFalse(new('ILIKE', '....').evaluate('aaaa'))
self.assertFalse(new('ILIKE', '.*').evaluate('anything'))
def testEvaluateBetween(self):
# cannot make "between" with RequestConstraint.new()
new = self._newRequestConstraint
self.assertTrue(new('BETWEEN', '1--1').evaluate(1))
self.assertTrue(new('BETWEEN', '1--10').evaluate(1))
self.assertTrue(new('BETWEEN', '1--10').evaluate(5))
self.assertTrue(new('BETWEEN', '1--10').evaluate(10))
self.assertTrue(new('BETWEEN', '1.0--1.1').evaluate(1.0))
self.assertTrue(new('BETWEEN', '1.0--1.1').evaluate(1.05))
self.assertTrue(new('BETWEEN', '1.0--1.1').evaluate(1.1))
self.assertTrue(new('BETWEEN', 'a--x').evaluate('a'))
self.assertTrue(new('BETWEEN', 'a--x').evaluate('j'))
self.assertTrue(new('BETWEEN', 'a--x').evaluate('x'))
self.assertFalse(new('BETWEEN', '1--1').evaluate(2))
self.assertFalse(new('BETWEEN', '1--2').evaluate(10))
self.assertFalse(new('BETWEEN', '1--10').evaluate(0))
self.assertFalse(new('BETWEEN', '1--10').evaluate(11))
self.assertFalse(new('BETWEEN', '1.0--1.1').evaluate(0.99))
self.assertFalse(new('BETWEEN', '1.0--1.1').evaluate(1.11))
self.assertFalse(new('BETWEEN', 'a--x').evaluate(' '))
self.assertFalse(new('BETWEEN', 'a--x').evaluate('z'))
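The LIKE/ILIKE and BETWEEN expectations above can be reproduced with a short sketch (an illustration of the semantics the tests pin down, not the dynamicserialize implementation): LIKE translates `%` to "any run of characters" and `_` to "exactly one character" while treating everything else literally, and BETWEEN splits its `low--high` operand, comparing numerically when both sides parse as numbers and lexically otherwise.

```python
import re

def like_matches(pattern, value, case_sensitive=True):
    # Translate a SQL LIKE pattern into an anchored regex: '%' -> '.*',
    # '_' -> '.', everything else escaped so it matches literally
    # (which is why '....' does NOT match 'aaaa' in the tests above).
    regex = ''.join('.*' if c == '%' else '.' if c == '_' else re.escape(c)
                    for c in pattern)
    flags = 0 if case_sensitive else re.IGNORECASE
    return re.match('^%s$' % regex, value, flags) is not None

def between_matches(bounds, value):
    # BETWEEN encodes its bounds as 'low--high'; compare numerically when
    # both bounds and the value parse as numbers, otherwise as strings.
    low, high = bounds.split('--')
    try:
        return float(low) <= float(value) <= float(high)
    except ValueError:
        return low <= str(value) <= high
```

Passing `case_sensitive=False` gives the ILIKE behavior; note this sketch's `between_matches` does not handle negative bounds, since `'--'` is the separator.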


@@ -0,0 +1,176 @@
#!/usr/bin/env python
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
import baseDafTestCase
import unittest
#
# Test DAF support for satellite data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 04/26/16 5587 tgurney Move identifier values tests
# out of base class
# 06/01/16 5587 tgurney Update testGetIdentifierValues
# 06/07/16 5574 tgurney Add advanced query tests
# 06/13/16 5574 tgurney Typo
# 06/30/16 5725 tgurney Add test for NOT IN
#
#
class SatelliteTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for satellite data"""
datatype = "satellite"
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
self.runLocationsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.setLocationNames("West CONUS")
self.runTimesTest(req)
def testGetGridData(self):
req = DAL.newDataRequest(self.datatype)
req.setParameters("Imager 11 micron IR")
req.setLocationNames("West CONUS")
self.runGridDataTest(req)
def testGetIdentifierValues(self):
req = DAL.newDataRequest(self.datatype)
optionalIds = set(DAL.getOptionalIdentifiers(req))
requiredIds = set(DAL.getRequiredIdentifiers(req))
self.runGetIdValuesTest(optionalIds | requiredIds)
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
req.setParameters("Imager 11 micron IR")
req.setLocationNames("West CONUS")
return self.runGridDataTest(req)
def testGetDataWithEqualsString(self):
gridData = self._runConstraintTest('creatingEntity', '=', 'Composite')
for record in gridData:
self.assertEqual(record.getAttribute('creatingEntity'), 'Composite')
def testGetDataWithEqualsUnicode(self):
gridData = self._runConstraintTest('creatingEntity', '=', u'Composite')
for record in gridData:
self.assertEqual(record.getAttribute('creatingEntity'), 'Composite')
def testGetDataWithEqualsInt(self):
gridData = self._runConstraintTest('creatingEntity', '=', 1000)
for record in gridData:
self.assertEqual(record.getAttribute('creatingEntity'), 1000)
def testGetDataWithEqualsLong(self):
gridData = self._runConstraintTest('creatingEntity', '=', 1000L)
for record in gridData:
self.assertEqual(record.getAttribute('creatingEntity'), 1000)
def testGetDataWithEqualsFloat(self):
gridData = self._runConstraintTest('creatingEntity', '=', 1.0)
for record in gridData:
self.assertEqual(round(record.getAttribute('creatingEntity'), 1), 1.0)
def testGetDataWithEqualsNone(self):
gridData = self._runConstraintTest('creatingEntity', '=', None)
for record in gridData:
self.assertIsNone(record.getAttribute('creatingEntity'))
def testGetDataWithNotEquals(self):
gridData = self._runConstraintTest('creatingEntity', '!=', 'Composite')
for record in gridData:
self.assertNotEqual(record.getAttribute('creatingEntity'), 'Composite')
def testGetDataWithNotEqualsNone(self):
gridData = self._runConstraintTest('creatingEntity', '!=', None)
for record in gridData:
self.assertIsNotNone(record.getAttribute('creatingEntity'))
def testGetDataWithGreaterThan(self):
gridData = self._runConstraintTest('creatingEntity', '>', 'Composite')
for record in gridData:
self.assertGreater(record.getAttribute('creatingEntity'), 'Composite')
def testGetDataWithLessThan(self):
gridData = self._runConstraintTest('creatingEntity', '<', 'Composite')
for record in gridData:
self.assertLess(record.getAttribute('creatingEntity'), 'Composite')
def testGetDataWithGreaterThanEquals(self):
gridData = self._runConstraintTest('creatingEntity', '>=', 'Composite')
for record in gridData:
self.assertGreaterEqual(record.getAttribute('creatingEntity'), 'Composite')
def testGetDataWithLessThanEquals(self):
gridData = self._runConstraintTest('creatingEntity', '<=', 'Composite')
for record in gridData:
self.assertLessEqual(record.getAttribute('creatingEntity'), 'Composite')
def testGetDataWithInTuple(self):
collection = ('Composite', 'Miscellaneous')
gridData = self._runConstraintTest('creatingEntity', 'in', collection)
for record in gridData:
self.assertIn(record.getAttribute('creatingEntity'), collection)
def testGetDataWithInList(self):
        collection = ['Composite', 'Miscellaneous']
gridData = self._runConstraintTest('creatingEntity', 'in', collection)
for record in gridData:
self.assertIn(record.getAttribute('creatingEntity'), collection)
def testGetDataWithInGenerator(self):
collection = ('Composite', 'Miscellaneous')
generator = (item for item in collection)
gridData = self._runConstraintTest('creatingEntity', 'in', generator)
for record in gridData:
self.assertIn(record.getAttribute('creatingEntity'), collection)
def testGetDataWithNotInList(self):
collection = ('Composite', 'Miscellaneous')
gridData = self._runConstraintTest('creatingEntity', 'not in', collection)
for record in gridData:
self.assertNotIn(record.getAttribute('creatingEntity'), collection)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('creatingEntity', 'junk', 'Composite')
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('creatingEntity', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('creatingEntity', 'in', [])
def testGetDataWithNestedInConstraintThrowsException(self):
collection = ('Composite', 'Miscellaneous', ())
with self.assertRaises(TypeError):
self._runConstraintTest('creatingEntity', 'in', collection)


@@ -0,0 +1,175 @@
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
import baseDafTestCase
import unittest
#
# Test DAF support for sfcobs data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 06/09/16 5587 bsteffen Add getIdentifierValues tests
# 06/13/16 5574 tgurney Add advanced query tests
# 06/30/16 5725 tgurney Add test for NOT IN
# 01/20/17 6095 tgurney Add null identifiers test
#
#
class SfcObsTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for sfcobs data"""
datatype = "sfcobs"
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
self.runLocationsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.setLocationNames("14547")
self.runTimesTest(req)
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.setLocationNames("14547")
req.setParameters("temperature", "seaLevelPress", "dewpoint")
self.runGeometryDataTest(req)
def testGetGeometryDataNullIdentifiers(self):
req = DAL.newDataRequest(self.datatype)
req.setLocationNames("14547")
req.setParameters("temperature", "seaLevelPress", "dewpoint")
req.identifiers = None
self.runGeometryDataTest(req)
def testGetIdentifierValues(self):
req = DAL.newDataRequest(self.datatype)
optionalIds = set(DAL.getOptionalIdentifiers(req))
self.runGetIdValuesTest(optionalIds)
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
req.setParameters("temperature", "reportType")
return self.runGeometryDataTest(req)
def testGetDataWithEqualsString(self):
geometryData = self._runConstraintTest('reportType', '=', '1004')
for record in geometryData:
self.assertEqual(record.getString('reportType'), '1004')
def testGetDataWithEqualsUnicode(self):
geometryData = self._runConstraintTest('reportType', '=', u'1004')
for record in geometryData:
self.assertEqual(record.getString('reportType'), '1004')
def testGetDataWithEqualsInt(self):
geometryData = self._runConstraintTest('reportType', '=', 1004)
for record in geometryData:
self.assertEqual(record.getString('reportType'), '1004')
def testGetDataWithEqualsLong(self):
geometryData = self._runConstraintTest('reportType', '=', 1004L)
for record in geometryData:
self.assertEqual(record.getString('reportType'), '1004')
# No float test because no float identifiers are available
def testGetDataWithEqualsNone(self):
geometryData = self._runConstraintTest('reportType', '=', None)
for record in geometryData:
self.assertEqual(record.getType('reportType'), 'NULL')
def testGetDataWithNotEquals(self):
geometryData = self._runConstraintTest('reportType', '!=', 1004)
for record in geometryData:
self.assertNotEqual(record.getString('reportType'), '1004')
def testGetDataWithNotEqualsNone(self):
geometryData = self._runConstraintTest('reportType', '!=', None)
for record in geometryData:
self.assertNotEqual(record.getType('reportType'), 'NULL')
def testGetDataWithGreaterThan(self):
geometryData = self._runConstraintTest('reportType', '>', 1004)
for record in geometryData:
self.assertGreater(record.getString('reportType'), '1004')
def testGetDataWithLessThan(self):
geometryData = self._runConstraintTest('reportType', '<', 1004)
for record in geometryData:
self.assertLess(record.getString('reportType'), '1004')
def testGetDataWithGreaterThanEquals(self):
geometryData = self._runConstraintTest('reportType', '>=', 1004)
for record in geometryData:
self.assertGreaterEqual(record.getString('reportType'), '1004')
def testGetDataWithLessThanEquals(self):
geometryData = self._runConstraintTest('reportType', '<=', 1004)
for record in geometryData:
self.assertLessEqual(record.getString('reportType'), '1004')
def testGetDataWithInTuple(self):
collection = ('1004', '1005')
geometryData = self._runConstraintTest('reportType', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('reportType'), collection)
def testGetDataWithInList(self):
collection = ['1004', '1005']
geometryData = self._runConstraintTest('reportType', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('reportType'), collection)
def testGetDataWithInGenerator(self):
collection = ('1004', '1005')
generator = (item for item in collection)
geometryData = self._runConstraintTest('reportType', 'in', generator)
for record in geometryData:
self.assertIn(record.getString('reportType'), collection)
def testGetDataWithNotInList(self):
collection = ['1004', '1005']
geometryData = self._runConstraintTest('reportType', 'not in', collection)
for record in geometryData:
self.assertNotIn(record.getString('reportType'), collection)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('reportType', 'junk', '1004')
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('reportType', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('reportType', 'in', [])
def testGetDataWithNestedInConstraintThrowsException(self):
collection = ('1004', '1005', ())
with self.assertRaises(TypeError):
self._runConstraintTest('reportType', 'in', collection)
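The last three tests above pin down how `RequestConstraint.new('in', ...)` is expected to treat its collection argument: generators are materialized (the `InGenerator` test passes), an empty collection is rejected with `ValueError`, and a nested collection is rejected with `TypeError`. A minimal sketch of that validation rule (a hypothetical stand-in, not the dynamicserialize implementation) looks like:

```python
def validate_in_values(values):
    # Materialize generators and tuples into a list, as the tests allow.
    values = list(values)
    if not values:
        raise ValueError("'in' constraint requires at least one value")
    for v in values:
        # Nested collections (or other unsupported types) are rejected.
        if not isinstance(v, (str, int, float)):
            raise TypeError("unsupported constraint value: %r" % (v,))
    return values
```

Under these assumed rules, the tuple, list, and generator cases all succeed while the empty and nested cases raise, matching the `testGetDataWith*InConstraint*` assertions.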


@@ -0,0 +1,79 @@
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
from awips.ThriftClient import ThriftRequestException
import baseDafTestCase
import shapely.geometry
import unittest
#
# Test DAF support for topo data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 05/26/16 5587 tgurney Add test for
# getIdentifierValues()
# 06/01/16 5587 tgurney Update testGetIdentifierValues
# 07/18/17     6253     randerso    Removed references to GMTED
#
class TopoTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for topo data"""
datatype = "topo"
def testGetGridData(self):
print("defaultTopo")
req = DAL.newDataRequest(self.datatype)
req.addIdentifier("group", "/")
req.addIdentifier("dataset", "full")
poly = shapely.geometry.LinearRing(((-70, 40), (-71, 40), (-71, 42), (-70, 42)))
req.setEnvelope(poly)
gridData = DAL.getGridData(req)
self.assertIsNotNone(gridData)
print("Number of grid records: " + str(len(gridData)))
print("Sample grid data shape:\n" + str(gridData[0].getRawData().shape) + "\n")
print("Sample grid data:\n" + str(gridData[0].getRawData()) + "\n")
for topoFile in ["gtopo30"]:
print("\n" + topoFile)
req.addIdentifier("topoFile", topoFile)
gridData = DAL.getGridData(req)
self.assertIsNotNone(gridData)
print("Number of grid records: " + str(len(gridData)))
print("Sample grid data shape:\n" + str(gridData[0].getRawData().shape) + "\n")
print("Sample grid data:\n" + str(gridData[0].getRawData()) + "\n")
def testRequestingTooMuchDataThrowsResponseTooLargeException(self):
req = DAL.newDataRequest(self.datatype)
req.addIdentifier("group", "/")
req.addIdentifier("dataset", "full")
points = ((-180, 90), (180, 90), (180, -90), (-180, -90))
poly = shapely.geometry.LinearRing(points)
req.setEnvelope(poly)
with self.assertRaises(ThriftRequestException) as cm:
DAL.getGridData(req)
self.assertIn('ResponseTooLargeException', str(cm.exception))
def testGetIdentifierValues(self):
req = DAL.newDataRequest(self.datatype)
optionalIds = set(DAL.getOptionalIdentifiers(req))
requiredIds = set(DAL.getRequiredIdentifiers(req))
self.runGetIdValuesTest(optionalIds | requiredIds)
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()


@@ -0,0 +1,216 @@
##
##
from __future__ import print_function
from awips.dataaccess import DataAccessLayer as DAL
from dynamicserialize.dstypes.com.raytheon.uf.common.dataquery.requests import RequestConstraint
import baseDafTestCase
import unittest
#
# Test DAF support for warning data
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 01/19/16 4795 mapeters Initial Creation.
# 04/11/16 5548 tgurney Cleanup
# 04/18/16 5548 tgurney More cleanup
# 04/26/16 5587 tgurney Add identifier values tests
# 06/08/16 5574 tgurney Add advanced query tests
# 06/10/16 5548 tgurney Clean up references to name
# of data type
# 06/13/16 5574 tgurney Fix checks for None
# 06/21/16 5548 tgurney Skip tests that cause errors
# 06/30/16 5725 tgurney Add test for NOT IN
# 12/12/16 5981 tgurney Improve test performance
#
#
class WarningTestCase(baseDafTestCase.DafTestCase):
"""Test DAF support for warning data"""
datatype = "warning"
def _getLocationNames(self):
req = DAL.newDataRequest()
req.setDatatype(self.datatype)
return DAL.getAvailableLocationNames(req)
def _getAllRecords(self):
req = DAL.newDataRequest()
req.setDatatype(self.datatype)
req.setParameters('id')
return DAL.getGeometryData(req)
def testGetAvailableParameters(self):
req = DAL.newDataRequest(self.datatype)
self.runParametersTest(req)
def testGetAvailableLocations(self):
req = DAL.newDataRequest(self.datatype)
self.runLocationsTest(req)
def testGetAvailableTimes(self):
req = DAL.newDataRequest(self.datatype)
req.setParameters("etn", "wmoid")
self.runTimesTest(req)
def testGetGeometryData(self):
req = DAL.newDataRequest(self.datatype)
req.setParameters("etn", "wmoid")
self.runGeometryDataTest(req)
def testFilterOnLocationName(self):
allLocationNames = self._getLocationNames()
if len(allLocationNames) == 0:
errmsg = "No {0} data exists on {1}. Try again with {0} data."
raise unittest.SkipTest(errmsg.format(self.datatype, DAL.THRIFT_HOST))
testCount = 3 # number of different location names to test
for locationName in allLocationNames[:testCount]:
req = DAL.newDataRequest()
req.setDatatype(self.datatype)
req.setParameters('id')
req.setLocationNames(locationName)
geomData = DAL.getGeometryData(req)
for geom in geomData:
self.assertEqual(geom.getLocationName(), locationName)
def testFilterOnNonexistentLocationReturnsEmpty(self):
req = DAL.newDataRequest()
req.setDatatype(self.datatype)
req.setParameters('id')
req.setLocationNames('ZZZZ')
self.assertEqual(len(DAL.getGeometryData(req)), 0)
def testFilterOnInvalidLocationThrowsIncompatibleRequestException(self):
req = DAL.newDataRequest()
req.setDatatype(self.datatype)
req.setParameters('id')
req.setLocationNames(') and 0=1')
with self.assertRaises(Exception) as cm:
DAL.getGeometryData(req)
self.assertIn('IncompatibleRequestException', str(cm.exception))
def testGetColumnIdentifierValues(self):
self.runGetIdValuesTest(['act'])
@unittest.skip('avoid EDEX error')
def testGetInvalidIdentifierValuesThrowsException(self):
self.runInvalidIdValuesTest()
@unittest.skip('avoid EDEX error')
def testGetNonexistentIdentifierValuesThrowsException(self):
self.runNonexistentIdValuesTest()
def _runConstraintTest(self, key, operator, value):
req = DAL.newDataRequest(self.datatype)
constraint = RequestConstraint.new(operator, value)
req.addIdentifier(key, constraint)
req.setParameters("etn", "wmoid", "sig")
return self.runGeometryDataTest(req)
def testGetDataWithEqualsString(self):
geometryData = self._runConstraintTest('sig', '=', 'Y')
for record in geometryData:
self.assertEqual(record.getString('sig'), 'Y')
def testGetDataWithEqualsUnicode(self):
geometryData = self._runConstraintTest('sig', '=', u'Y')
for record in geometryData:
self.assertEqual(record.getString('sig'), 'Y')
def testGetDataWithEqualsInt(self):
geometryData = self._runConstraintTest('etn', '=', 1000)
for record in geometryData:
self.assertEqual(record.getString('etn'), '1000')
def testGetDataWithEqualsLong(self):
geometryData = self._runConstraintTest('etn', '=', 1000L)
for record in geometryData:
self.assertEqual(record.getString('etn'), '1000')
def testGetDataWithEqualsFloat(self):
geometryData = self._runConstraintTest('etn', '=', 1.0)
for record in geometryData:
self.assertEqual(round(float(record.getString('etn')), 1), 1.0)
def testGetDataWithEqualsNone(self):
geometryData = self._runConstraintTest('sig', '=', None)
for record in geometryData:
self.assertEqual(record.getType('sig'), 'NULL')
def testGetDataWithNotEquals(self):
geometryData = self._runConstraintTest('sig', '!=', 'Y')
for record in geometryData:
self.assertNotEqual(record.getString('sig'), 'Y')
def testGetDataWithNotEqualsNone(self):
geometryData = self._runConstraintTest('sig', '!=', None)
for record in geometryData:
self.assertNotEqual(record.getType('sig'), 'NULL')
def testGetDataWithGreaterThan(self):
geometryData = self._runConstraintTest('sig', '>', 'Y')
for record in geometryData:
self.assertGreater(record.getString('sig'), 'Y')
def testGetDataWithLessThan(self):
geometryData = self._runConstraintTest('sig', '<', 'Y')
for record in geometryData:
self.assertLess(record.getString('sig'), 'Y')
def testGetDataWithGreaterThanEquals(self):
geometryData = self._runConstraintTest('sig', '>=', 'Y')
for record in geometryData:
self.assertGreaterEqual(record.getString('sig'), 'Y')
def testGetDataWithLessThanEquals(self):
geometryData = self._runConstraintTest('sig', '<=', 'Y')
for record in geometryData:
self.assertLessEqual(record.getString('sig'), 'Y')
def testGetDataWithInTuple(self):
collection = ('Y', 'A')
geometryData = self._runConstraintTest('sig', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('sig'), collection)
def testGetDataWithInList(self):
collection = ['Y', 'A']
geometryData = self._runConstraintTest('sig', 'in', collection)
for record in geometryData:
self.assertIn(record.getString('sig'), collection)
def testGetDataWithInGenerator(self):
collection = ('Y', 'A')
generator = (item for item in collection)
geometryData = self._runConstraintTest('sig', 'in', generator)
for record in geometryData:
self.assertIn(record.getString('sig'), collection)
def testGetDataWithNotInList(self):
collection = ['Y', 'W']
geometryData = self._runConstraintTest('sig', 'not in', collection)
for record in geometryData:
self.assertNotIn(record.getString('sig'), collection)
def testGetDataWithInvalidConstraintTypeThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('sig', 'junk', 'Y')
def testGetDataWithInvalidConstraintValueThrowsException(self):
with self.assertRaises(TypeError):
self._runConstraintTest('sig', '=', {})
def testGetDataWithEmptyInConstraintThrowsException(self):
with self.assertRaises(ValueError):
self._runConstraintTest('sig', 'in', [])
def testGetDataWithNestedInConstraintThrowsException(self):
collection = ('Y', 'A', ())
with self.assertRaises(TypeError):
self._runConstraintTest('sig', 'in', collection)


@@ -0,0 +1,15 @@
##
##
#
# __init__.py for awips.test.localization package
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# --------- -------- --------- --------------------------
# 08/07/17 5731 bsteffen Initial Creation.
__all__ = []


@@ -0,0 +1,155 @@
##
##
#
# Tests for the LocalizationFileManager
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# --------- -------- --------- --------------------------
# 08/09/17 5731 bsteffen Initial Creation.
import unittest
from awips.localization.LocalizationFileManager import (LocalizationFileManager,
LocalizationFileVersionConflictException,
LocalizationContext,
LocalizationFileIsNotDirectoryException,
LocalizationFileDoesNotExistException)
testFile = "purge/defaultPurgeRules.xml"
testContent = "<purgeRuleSet><defaultRule><period>05-05:05:05</period></defaultRule></purgeRuleSet>"
testDir = "purge/"
testNewFile = "purge/testPurgeRules.xml"
class ContextTestCase(unittest.TestCase):
def test_eq(self):
c1 = LocalizationContext()
c2 = LocalizationContext()
self.assertEqual(c1,c2)
c3 = LocalizationContext("site", "test")
c4 = LocalizationContext("site", "test")
self.assertEqual(c3,c4)
self.assertNotEqual(c1,c3)
def test_hash(self):
c1 = LocalizationContext()
c2 = LocalizationContext()
self.assertEqual(hash(c1),hash(c2))
c3 = LocalizationContext("site", "test")
c4 = LocalizationContext("site", "test")
self.assertEqual(hash(c3),hash(c4))
class LFMTestCase(unittest.TestCase):
def setUp(self):
self.manager = LocalizationFileManager()
userFile = self.manager.getSpecific("user", testFile)
if userFile.exists():
userFile.delete()
newFile = self.manager.getSpecific("user", testNewFile)
if newFile.exists():
newFile.delete()
def test_gets(self):
startingIncremental = self.manager.getIncremental(testFile)
baseFile = self.manager.getSpecific("base", testFile)
self.assertEqual(baseFile, startingIncremental[0])
self.assertTrue(baseFile.exists())
self.assertFalse(baseFile.isDirectory())
userFile = self.manager.getSpecific("user", testFile)
self.assertFalse(userFile.exists())
with userFile.open("w") as stream:
stream.write(testContent)
userFile = self.manager.getSpecific("user", testFile)
self.assertTrue(userFile.exists())
with userFile.open('r') as stream:
self.assertEqual(stream.read(), testContent)
absFile = self.manager.getAbsolute(testFile)
self.assertEqual(absFile, userFile)
endingIncremental = self.manager.getIncremental(testFile)
self.assertEqual(len(startingIncremental) + 1, len(endingIncremental))
self.assertEqual(userFile, endingIncremental[-1])
self.assertEqual(baseFile, endingIncremental[0])
userFile.delete()
userFile = self.manager.getSpecific("user", testFile)
self.assertFalse(userFile.exists())
def test_concurrent_edit(self):
userFile1 = self.manager.getSpecific("user", testFile)
userFile2 = self.manager.getSpecific("user", testFile)
self.assertFalse(userFile1.exists())
self.assertFalse(userFile2.exists())
with self.assertRaises(LocalizationFileVersionConflictException):
with userFile1.open("w") as stream1:
stream1.write(testContent)
with userFile2.open("w") as stream2:
stream2.write(testContent)
userFile = self.manager.getSpecific("user", testFile)
userFile.delete()
def test_dir(self):
dir = self.manager.getAbsolute(testDir)
self.assertTrue(dir.isDirectory())
with self.assertRaises(Exception):
dir.delete()
def test_list(self):
abs1 = self.manager.listAbsolute(testDir)
inc1 = self.manager.listIncremental(testDir)
self.assertEqual(len(abs1), len(inc1))
for i in range(len(abs1)):
self.assertEquals(abs1[i], inc1[i][-1])
userFile = self.manager.getSpecific("user", testNewFile)
self.assertNotIn(userFile, abs1)
with userFile.open("w") as stream:
stream.write(testContent)
userFile = self.manager.getSpecific("user", testNewFile)
abs2 = self.manager.listAbsolute(testDir)
inc2 = self.manager.listIncremental(testDir)
self.assertEqual(len(abs2), len(inc2))
for i in range(len(abs2)):
self.assertEquals(abs2[i], inc2[i][-1])
self.assertEquals(len(abs1) + 1, len(abs2))
self.assertIn(userFile, abs2)
userFile.delete()
def test_list_file(self):
with self.assertRaises(LocalizationFileIsNotDirectoryException):
self.manager.listIncremental(testFile)
def test_list_nonexistant(self):
with self.assertRaises(LocalizationFileDoesNotExistException):
self.manager.listIncremental('dontNameYourDirectoryThis')
def test_root_variants(self):
list1 = self.manager.listAbsolute(".")
list2 = self.manager.listAbsolute("")
list3 = self.manager.listAbsolute("/")
self.assertEquals(list1,list2)
self.assertEquals(list2,list3)
def test_slashiness(self):
raw = testDir
if raw[0] == '/':
raw = raw[1:]
if raw[-1] == '/':
raw = raw[:-1]
list1 = self.manager.listAbsolute(raw)
list2 = self.manager.listAbsolute(raw + "/")
list3 = self.manager.listAbsolute("/" + raw)
self.assertEquals(list1,list2)
self.assertEquals(list2,list3)
if __name__ == '__main__':
unittest.main()
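The `test_root_variants` and `test_slashiness` cases above assert that leading and trailing slashes, and the `"."`, `""`, and `"/"` spellings of the root, are all equivalent. One plausible way to get that behavior (a sketch under that assumption, not the actual LocalizationFileManager code) is to normalize paths to a canonical form before lookup:

```python
def normalize_localization_path(path):
    # Drop empty segments and '.' so '', '.', '/', 'purge', 'purge/',
    # and '/purge' all reduce to the same canonical key.
    return '/'.join(p for p in path.split('/') if p and p != '.')
```

With this rule, all three root variants normalize to the empty string, and the three slash variants of `purge` normalize to `purge`.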


@@ -0,0 +1,342 @@
##
##
import unittest
import urllib2
from HTMLParser import HTMLParser
from xml.etree.ElementTree import parse as parseXml
from json import load as loadjson
from urlparse import urljoin
from base64 import b64encode
#
# Test the localization REST service.
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# --------- -------- --------- --------------------------
# 08/07/17 5731 bsteffen Initial Creation.
baseURL = "http://localhost:9581/services/localization/"
testSite = "OAX"
testDir = "menus"
testFile = "test.xml"
username = "test"
password = username
base64string = b64encode('%s:%s' % (username, password))
authString = "Basic %s" % base64string
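The `authString` above is a standard HTTP Basic credential (RFC 7617): `base64(username + ':' + password)` prefixed with `Basic `. The module builds it with Python 2's `b64encode`, which accepts `str`; a version that also works on Python 3, where `b64encode` requires bytes, could look like this sketch:

```python
from base64 import b64encode

def basic_auth_header(username, password):
    # Encode to bytes first (required on Python 3), then decode the
    # base64 result back to text for use as a header value.
    token = b64encode(('%s:%s' % (username, password)).encode('ascii'))
    return 'Basic ' + token.decode('ascii')
```

For the test credentials here, `basic_auth_header('test', 'test')` yields `'Basic dGVzdDp0ZXN0'`.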
class ValidHTMLParser(HTMLParser):
"""Simple HTML parser that performs very minimal validation.
This ensures that all start and end tags match, and also that there are
some tags. It also accumulates the text of all links in the html file
in the link_texts attribute, which can be used for further validation.
"""
def __init__(self, testcase):
HTMLParser.__init__(self)
self._testcase = testcase
self._tags = []
self._any = False
self.link_texts = []
def handle_starttag(self, tag, attrs):
self._tags.append(tag)
self._any = True
def handle_endtag(self, tag):
self._testcase.assertNotEquals([], self._tags, "Unstarted end tag " + tag)
self._testcase.assertEquals(tag, self._tags.pop())
def handle_data(self, data):
if self._tags[-1] == "a":
self.link_texts.append(data)
def close(self):
HTMLParser.close(self)
self._testcase.assertTrue(self._any)
self._testcase.assertEquals([], self._tags)
class AbstractListingTestCase():
"""Base test case for testing listings, retrieves data as html, xml, and json.
Sub classes should implement assertValidHtml, assertValidXml, and
assertValidJson to ensure that the content returned matches what was
expected.
"""
def assertRequestGetsHtml(self, request):
response = urllib2.urlopen(request)
self.assertEquals(response.headers["Content-Type"], "text/html")
body = response.read()
parser = ValidHTMLParser(self)
parser.feed(body)
parser.close()
self.assertValidHtml(parser)
def assertValidHtml(self, parser):
"""Intended to be overriden by subclasses to validate HTML content.
The argument is a populated instance of ValidHTMLParser.
"""
pass
def test_default(self):
request = urllib2.Request(self.url)
self.assertRequestGetsHtml(request)
def test_last_slash(self):
if self.url.endswith("/"):
request = urllib2.Request(self.url[:-1])
else:
request = urllib2.Request(self.url + "/")
self.assertRequestGetsHtml(request)
def test_wild_mime(self):
request = urllib2.Request(self.url)
request.add_header("Accept", "*/*")
self.assertRequestGetsHtml(request)
request.add_header("Accept", "text/*")
self.assertRequestGetsHtml(request)
def test_html(self):
request = urllib2.Request(self.url)
request.add_header("Accept", "text/html")
self.assertRequestGetsHtml(request)
def test_json(self):
request = urllib2.Request(self.url)
request.add_header("Accept", "application/json")
response = urllib2.urlopen(request)
self.assertEquals(response.headers["Content-Type"], "application/json")
jsonData = loadjson(response)
self.assertValidJson(jsonData)
def assertValidJson(self, jsonData):
"""Intended to be overriden by subclasses to validate JSON content.
The argument is a python object as returned from json.load
"""
pass
def test_xml(self):
request = urllib2.Request(self.url)
request.add_header("Accept", "application/xml")
response = urllib2.urlopen(request)
self.assertEquals(response.headers["Content-Type"], "application/xml")
xmlData = parseXml(response)
self.assertValidXml(xmlData)
def assertValidXml(self, xmlData):
"""Intended to be overriden by subclasses to validate XML content.
The argument is an ElementTree
"""
pass
def test_delete(self):
request = urllib2.Request(self.url)
request.get_method = lambda: "DELETE"
with self.assertRaises(urllib2.HTTPError) as cm:
response = urllib2.urlopen(request)
self.assertEqual(405, cm.exception.code)
def test_put(self):
request = urllib2.Request(self.url)
request.get_method = lambda: "PUT"
request.add_data("Test Data")
with self.assertRaises(urllib2.HTTPError) as cm:
response = urllib2.urlopen(request)
self.assertEqual(405, cm.exception.code)
def test_unacceptable(self):
request = urllib2.Request(self.url)
request.add_header("Accept", "application/fakemimetype")
with self.assertRaises(urllib2.HTTPError) as cm:
response = urllib2.urlopen(request)
self.assertEqual(406, cm.exception.code)
request.add_header("Accept", "fakemimetype/*")
with self.assertRaises(urllib2.HTTPError) as cm:
response = urllib2.urlopen(request)
self.assertEqual(406, cm.exception.code)
def test_accept_quality_factor(self):
request = urllib2.Request(self.url)
request.add_header("Accept", "application/xml; q=0.8, application/json; q=0.2")
response = urllib2.urlopen(request)
self.assertEquals(response.headers["Content-Type"], "application/xml")
xmlData = parseXml(response)
self.assertValidXml(xmlData)
request.add_header("Accept", "application/xml; q=0.2, application/json; q=0.8")
response = urllib2.urlopen(request)
self.assertEquals(response.headers["Content-Type"], "application/json")
jsonData = loadjson(response)
self.assertValidJson(jsonData)
request.add_header("Accept", "application/xml, application/json; q=0.8")
response = urllib2.urlopen(request)
self.assertEquals(response.headers["Content-Type"], "application/xml")
xmlData = parseXml(response)
self.assertValidXml(xmlData)
request.add_header("Accept", "application/fakemimetype, application/json; q=0.8")
response = urllib2.urlopen(request)
self.assertEquals(response.headers["Content-Type"], "application/json")
jsonData = loadjson(response)
self.assertValidJson(jsonData)
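`test_accept_quality_factor` exercises ordinary HTTP content negotiation: each media type in the `Accept` header may carry a quality factor `q` (defaulting to 1.0 when omitted), and the server answers with the supported type that has the highest `q`. A self-contained sketch of that selection rule (an illustration of the expected behavior, not the EDEX implementation) is:

```python
def pick_media_type(accept_header, supported):
    """Return the supported media type with the highest q value, or None."""
    candidates = []
    for part in accept_header.split(','):
        fields = part.strip().split(';')
        mtype = fields[0].strip()
        q = 1.0  # RFC 7231: quality defaults to 1 when no q parameter is given
        for param in fields[1:]:
            name, _, value = param.strip().partition('=')
            if name.strip() == 'q':
                q = float(value)
        if mtype in supported:
            candidates.append((q, mtype))
    return max(candidates)[0 + 1] if candidates else None
```

Applied to the four requests in the test, this rule reproduces the asserted `Content-Type` outcomes, including the fallback to JSON when the preferred type is unsupported.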
class RootTestCase(AbstractListingTestCase, unittest.TestCase):
"""Test that the root of the localization service returns listing of localization types."""
def setUp(self):
self.url = baseURL
def assertValidHtml(self, parser):
self.assertIn("common_static/", parser.link_texts)
def assertValidJson(self, jsonData):
self.assertIn("common_static/", jsonData)
def assertValidXml(self, xmlData):
root = xmlData.getroot()
self.assertEquals(root.tag, "entries")
names = [e.text for e in root.findall("entry")]
self.assertIn("common_static/", names)
class TypeTestCase(AbstractListingTestCase, unittest.TestCase):
"""Test that common_static will list context levels."""
def setUp(self):
self.url = urljoin(baseURL, "common_static/")
def assertValidHtml(self, parser):
self.assertIn("base/", parser.link_texts)
self.assertIn("site/", parser.link_texts)
def assertValidJson(self, jsonData):
self.assertIn("base/", jsonData)
self.assertIn("site/", jsonData)
def assertValidXml(self, xmlData):
root = xmlData.getroot()
self.assertEquals(root.tag, "entries")
names = [e.text for e in root.findall("entry")]
self.assertIn("base/", names)
self.assertIn("site/", names)
class LevelTestCase(AbstractListingTestCase, unittest.TestCase):
"""Test that common_static/site will list sites."""
def setUp(self):
self.url = urljoin(baseURL, "common_static/site/")
def assertValidHtml(self, parser):
self.assertIn(testSite +"/", parser.link_texts)
def assertValidJson(self, jsonData):
self.assertIn(testSite +"/", jsonData)
def assertValidXml(self, xmlData):
root = xmlData.getroot()
self.assertEquals(root.tag, "entries")
names = [e.text for e in root.findall("entry")]
self.assertIn(testSite +"/", names)
class AbstractFileListingTestCase(AbstractListingTestCase):
"""Base test case for a file listing"""
def assertValidHtml(self, parser):
self.assertIn(testDir +"/", parser.link_texts)
self.assertEquals(parser.link_texts, sorted(parser.link_texts))
def assertValidJson(self, jsonData):
self.assertIn(testDir +"/", jsonData)
def assertValidXml(self, xmlData):
root = xmlData.getroot()
self.assertEquals(root.tag, "files")
names = [e.get("name") for e in root.findall("file")]
self.assertIn(testDir +"/", names)
self.assertEquals(names, sorted(names))
class BaseFileListingTestCase(AbstractFileListingTestCase, unittest.TestCase):
"""Test that common_static/base lists files"""
def setUp(self):
self.url = urljoin(baseURL, "common_static/base/")
class SiteFileListingTestCase(AbstractFileListingTestCase, unittest.TestCase):
"""Test that common_static/site/<testSite>/ lists files"""
def setUp(self):
self.url = urljoin(baseURL, "common_static/site/" + testSite + "/")
class FileTestCase(unittest.TestCase):
"""Test retrieval, modification and deletion of an individual."""
def setUp(self):
self.url = urljoin(baseURL, "common_static/user/" + username + "/" + testFile)
# The file should not exist before the test, but if it does then delete it
# This is some of the same functionality we are testing so if setup fails
# then the test would probably fail anyway
try:
request = urllib2.Request(self.url)
response = urllib2.urlopen(request)
request = urllib2.Request(self.url)
request.get_method = lambda: "DELETE"
request.add_header("Authorization", authString)
request.add_header("If-Match", response.headers["Content-MD5"])
response = urllib2.urlopen(request)
except urllib2.HTTPError as e:
if e.code != 404:
raise e
def test_file_operations(self):
"""Run through a typical set of file interactions and verify everything works correctly."""
request = urllib2.Request(self.url)
request.get_method = lambda: "PUT"
request.add_data("Test Data")
with self.assertRaises(urllib2.HTTPError) as cm:
response = urllib2.urlopen(request)
self.assertEqual(401, cm.exception.code)
request.add_header("Authorization", authString)
with self.assertRaises(urllib2.HTTPError) as cm:
response = urllib2.urlopen(request)
self.assertEqual(409, cm.exception.code)
request.add_header("If-Match", "NON_EXISTENT_CHECKSUM")
response = urllib2.urlopen(request)
request = urllib2.Request(self.url)
response = urllib2.urlopen(request)
self.assertEquals(response.read(), "Test Data")
request = urllib2.Request(self.url + "/")
response = urllib2.urlopen(request)
self.assertEquals(response.read(), "Test Data")
request = urllib2.Request(self.url)
request.get_method = lambda: "PUT"
request.add_data("Test Data2")
request.add_header("If-Match", response.headers["Content-MD5"])
request.add_header("Authorization", authString)
response = urllib2.urlopen(request)
checksum = response.headers["Content-MD5"]
request = urllib2.Request(self.url)
response = urllib2.urlopen(request)
self.assertEquals(response.read(), "Test Data2")
request = urllib2.Request(self.url)
request.get_method = lambda: "DELETE"
with self.assertRaises(urllib2.HTTPError) as cm:
response = urllib2.urlopen(request)
self.assertEqual(401, cm.exception.code)
request.add_header("Authorization", authString)
with self.assertRaises(urllib2.HTTPError) as cm:
response = urllib2.urlopen(request)
self.assertEqual(409, cm.exception.code)
request.add_header("If-Match", checksum)
response = urllib2.urlopen(request)
request = urllib2.Request(self.url)
with self.assertRaises(urllib2.HTTPError) as cm:
response = urllib2.urlopen(request)
self.assertEqual(404, cm.exception.code)
if __name__ == '__main__':
unittest.main()
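`FileTestCase` revolves around optimistic concurrency: every response carries a `Content-MD5` checksum, and a `PUT` or `DELETE` succeeds only when its `If-Match` header matches the checksum of the file's current content (with `NON_EXISTENT_CHECKSUM` standing in when no file exists yet); a stale checksum yields a 409. The handshake can be modeled in a few lines (a toy model of the assumed protocol, not the server code):

```python
import hashlib

NO_FILE = "NON_EXISTENT_CHECKSUM"  # sentinel presented when the file does not exist yet

class VersionedFile:
    """Toy model of the If-Match / Content-MD5 optimistic-locking handshake."""
    def __init__(self):
        self.content = None
        self.checksum = NO_FILE

    def put(self, data, if_match):
        # A write is accepted only against the current version's checksum.
        if if_match != self.checksum:
            raise ValueError("409 Conflict: stale or missing If-Match checksum")
        self.content = data
        self.checksum = hashlib.md5(data.encode("utf-8")).hexdigest()
        return self.checksum  # analogous to the Content-MD5 response header
```

The first writer creates the file against the sentinel; a second writer reusing an old checksum gets the conflict, mirroring the 409 assertions in `test_file_operations` above.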


@@ -0,0 +1,87 @@
##
##
#
#
#
#
#
# SOFTWARE HISTORY
#
# Date Ticket# Engineer Description
# ------------ ---------- ----------- --------------------------
# 03/09/11 njensen Initial Creation.
# 08/15/13 2169 bkowal Decompress data read from the queue
#
#
#
import time
import threading
import dynamicserialize
TIME_TO_SLEEP = 300
class ListenThread(threading.Thread):
def __init__(self, hostname, portNumber, topicName):
self.hostname = hostname
self.portNumber = portNumber
self.topicName = topicName
self.nMessagesReceived = 0
self.waitSecond = 0
self.stopped = False
threading.Thread.__init__(self)
def run(self):
from awips import QpidSubscriber
self.qs = QpidSubscriber.QpidSubscriber(self.hostname, self.portNumber, True)
self.qs.topicSubscribe(self.topicName, self.receivedMessage)
def receivedMessage(self, msg):
print "Received message"
self.nMessagesReceived += 1
if self.waitSecond == 0:
fmsg = open('/tmp/rawMessage', 'w')
fmsg.write(msg)
fmsg.close()
while self.waitSecond < TIME_TO_SLEEP and not self.stopped:
if self.waitSecond % 60 == 0:
print(time.strftime('%H:%M:%S') + " Sleeping and stuck in not so infinite while loop")
self.waitSecond += 1
time.sleep(1)
print("%s Received %d messages" % (time.strftime('%H:%M:%S'), self.nMessagesReceived))
def stop(self):
print "Stopping"
self.stopped = True
self.qs.close()
def main():
print "Starting up at", time.strftime('%H:%M:%S')
topic = 'edex.alerts'
host = 'localhost'
port = 5672
thread = ListenThread(host, port, topic)
try:
thread.start()
while True:
time.sleep(3)
except KeyboardInterrupt:
pass
finally:
thread.stop()
if __name__ == '__main__':
main()


@@ -0,0 +1,12 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd">
<bean id="acarsDataAccessFactory" class="com.raytheon.uf.common.pointdata.dataaccess.PointDataAccessFactory" />
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="acars"/>
<constructor-arg ref="acarsDataAccessFactory"/>
</bean>
</beans>


@@ -0,0 +1,12 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd">
<bean id="binLightningDataAccessFactory" class="com.raytheon.uf.common.dataplugin.binlightning.dataaccess.BinLightningAccessFactory" />
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="binlightning"/>
<constructor-arg ref="binLightningDataAccessFactory"/>
</bean>
</beans>


@@ -0,0 +1,41 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd">
<bean id="bufrmosDataAccessFactory" class="com.raytheon.uf.common.pointdata.dataaccess.PointDataAccessFactory" />
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="bufrmosAVN"/>
<constructor-arg ref="bufrmosDataAccessFactory"/>
</bean>
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="bufrmosETA"/>
<constructor-arg ref="bufrmosDataAccessFactory"/>
</bean>
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="bufrmosGFS"/>
<constructor-arg ref="bufrmosDataAccessFactory"/>
</bean>
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="bufrmosHPC"/>
<constructor-arg ref="bufrmosDataAccessFactory"/>
</bean>
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="bufrmosLAMP"/>
<constructor-arg ref="bufrmosDataAccessFactory"/>
</bean>
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="bufrmosMRF"/>
<constructor-arg ref="bufrmosDataAccessFactory"/>
</bean>
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="bufrmosNGM"/>
<constructor-arg ref="bufrmosDataAccessFactory"/>
</bean>
</beans>


@ -0,0 +1,82 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd">
<bean id="bufruaDataAccessFactory" class="com.raytheon.uf.common.pointdata.dataaccess.PointDataAccessFactory" />
<bean factory-bean="bufruaDataAccessFactory" factory-method="register2D">
<constructor-arg value="numMand"/>
<constructor-arg value="prMan"/>
<constructor-arg value="MB"/>
<constructor-arg>
<list>
<value>prMan</value>
<value>htMan</value>
<value>tpMan</value>
<value>tdMan</value>
<value>wdMan</value>
<value>wsMan</value>
</list>
</constructor-arg>
</bean>
<bean factory-bean="bufruaDataAccessFactory" factory-method="register2D">
<constructor-arg value="numTrop"/>
<constructor-arg value="prTrop"/>
<constructor-arg value="MB"/>
<constructor-arg>
<list>
<value>prTrop</value>
<value>tpTrop</value>
<value>tdTrop</value>
<value>wdTrop</value>
<value>wsTrop</value>
</list>
</constructor-arg>
</bean>
<bean factory-bean="bufruaDataAccessFactory" factory-method="register2D">
<constructor-arg value="numMwnd"/>
<constructor-arg value="prMaxW"/>
<constructor-arg value="MB"/>
<constructor-arg>
<list>
<value>prMaxW</value>
<value>wdMaxW</value>
<value>wsMaxW</value>
</list>
</constructor-arg>
</bean>
<bean factory-bean="bufruaDataAccessFactory" factory-method="register2D">
<constructor-arg value="numSigT"/>
<constructor-arg value="prSigT"/>
<constructor-arg value="MB"/>
<constructor-arg>
<list>
<value>prSigT</value>
<value>tpSigT</value>
<value>tdSigT</value>
</list>
</constructor-arg>
</bean>
<bean factory-bean="bufruaDataAccessFactory" factory-method="register2D">
<constructor-arg value="numSigW"/>
<constructor-arg value="htSigW"/>
<constructor-arg value="FHAG"/>
<constructor-arg>
<list>
<value>htSigW</value>
<value>wdSigW</value>
<value>wsSigW</value>
</list>
</constructor-arg>
</bean>
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="bufrua"/>
<constructor-arg ref="bufruaDataAccessFactory"/>
</bean>
</beans>
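The bufrua wiring above shows both halves of the data-access pattern: `register2D` declares a two-dimensional parameter group (a counter field, a level field, a level unit, and the parameters in that group), and `register` binds the datatype name to its factory. A minimal Python sketch of the same registry pattern — class and method names here only mirror the bean wiring, they are not the real `com.raytheon` PointDataAccessFactory API:

```python
# Illustrative sketch of the registry pattern expressed by the Spring XML
# above. Names mirror the bean wiring; they are not the actual AWIPS classes.

class PointDataAccessFactory:
    """Serves point data; 2-D parameter groups are registered with a
    counter field, a level field, and a level unit."""

    def __init__(self):
        self._groups_2d = []

    def register2D(self, counter, level_field, level_unit, parameters):
        # counter may be None when the data has no explicit counter field,
        # matching the <null/> constructor-arg used by some plugins.
        self._groups_2d.append({
            "counter": counter,
            "level_field": level_field,
            "level_unit": level_unit,
            "parameters": list(parameters),
        })


class DataAccessRegistry:
    """Maps datatype names (e.g. 'bufrua') to their factories."""

    def __init__(self):
        self._factories = {}

    def register(self, datatype, factory):
        self._factories[datatype] = factory

    def get_factory(self, datatype):
        return self._factories[datatype]


# Re-create the bufrua mandatory-level registration from the XML above.
registry = DataAccessRegistry()
bufrua_factory = PointDataAccessFactory()
bufrua_factory.register2D("numMand", "prMan", "MB",
                          ["prMan", "htMan", "tpMan", "tdMan",
                           "wdMan", "wsMan"])
registry.register("bufrua", bufrua_factory)
```

The same shape repeats for every plugin in these files: one shared factory bean, one `register` call per datatype name, and zero or more `register2D` calls for layered parameters.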


@ -0,0 +1,11 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd">
<bean id="climateGeometryFactory" class="com.raytheon.uf.common.dataplugin.climate.ClimateGeometryFactory" />
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="climate"/>
<constructor-arg ref="climateGeometryFactory"/>
</bean>
</beans>


@ -0,0 +1,12 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd">
<bean id="ldadmesonetDataAccessFactory" class="com.raytheon.uf.common.pointdata.dataaccess.PointDataAccessFactory" />
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="ldadmesonet"/>
<constructor-arg ref="ldadmesonetDataAccessFactory"/>
</bean>
</beans>


@ -0,0 +1,29 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd">
<bean id="mdlsndDataAccessFactory" class="com.raytheon.uf.common.pointdata.dataaccess.PointDataAccessFactory" />
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="modelsounding"/>
<constructor-arg ref="mdlsndDataAccessFactory"/>
</bean>
<bean factory-bean="mdlsndDataAccessFactory" factory-method="register2D">
<constructor-arg value="numProfLvls"/>
<constructor-arg value="pressure"/>
<constructor-arg value="MB"/>
<constructor-arg>
<list>
<value>pressure</value>
<value>temperature</value>
<value>specHum</value>
<value>omega</value>
<value>uComp</value>
<value>vComp</value>
<value>cldCvr</value>
</list>
</constructor-arg>
</bean>
</beans>


@ -0,0 +1,39 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd">
<bean id="obsDataAccessFactory" class="com.raytheon.uf.common.pointdata.dataaccess.PointDataAccessFactory" />
<bean factory-bean="obsDataAccessFactory" factory-method="register2D">
<!-- There is no counter field. -->
<constructor-arg ><null /></constructor-arg>
<constructor-arg value="skyLayerBase"/>
<constructor-arg value="FHAG"/>
<constructor-arg>
<list>
<value>skyCover</value>
<value>skyLayerBase</value>
<value>skyCoverType</value>
<value>skyCoverGenus</value>
</list>
</constructor-arg>
</bean>
<bean factory-bean="obsDataAccessFactory" factory-method="register2D">
<!-- There are no counter or layer fields. -->
<constructor-arg><null /></constructor-arg>
<constructor-arg><null /></constructor-arg>
<constructor-arg value="UNKNOWN"/>
<constructor-arg>
<list>
<value>presWeather</value>
</list>
</constructor-arg>
</bean>
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="obs"/>
<constructor-arg ref="obsDataAccessFactory"/>
</bean>
</beans>


@ -0,0 +1,32 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd">
<bean id="profilerDataAccessFactory" class="com.raytheon.uf.common.pointdata.dataaccess.PointDataAccessFactory" />
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="profiler"/>
<constructor-arg ref="profilerDataAccessFactory"/>
</bean>
<bean factory-bean="profilerDataAccessFactory" factory-method="register2D">
<constructor-arg value="numProfLvls"/>
<constructor-arg value="height"/>
<constructor-arg value="FHAG"/>
<constructor-arg>
<list>
<value>height</value>
<value>uComponent</value>
<value>vComponent</value>
<value>HorizSpStdDev</value>
<value>wComponent</value>
<value>VertSpStdDev</value>
<value>peakPower</value>
<value>levelMode</value>
<value>uvQualityCode</value>
<value>consensusNum</value>
</list>
</constructor-arg>
</bean>
</beans>


@ -0,0 +1,12 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd">
<bean id="sfcobsDataAccessFactory" class="com.raytheon.uf.common.pointdata.dataaccess.PointDataAccessFactory" />
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="sfcobs"/>
<constructor-arg ref="sfcobsDataAccessFactory"/>
</bean>
</beans>


@ -0,0 +1,17 @@
<beans xmlns="http://www.springframework.org/schema/beans"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.1.xsd">
<bean id="warningGeometryFactory" class="com.raytheon.uf.common.dataplugin.warning.dataaccess.WarningGeometryFactory" />
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="warning"/>
<constructor-arg ref="warningGeometryFactory"/>
</bean>
<bean factory-bean="dataAccessRegistry" factory-method="register">
<constructor-arg value="practicewarning"/>
<constructor-arg ref="warningGeometryFactory"/>
</bean>
</beans>

docs/Makefile Normal file

@ -0,0 +1,218 @@
# Makefile for Sphinx documentation
#
# You can set these variables from the command line.
SPHINXOPTS =
SPHINXBUILD = sphinx-build
PAPER =
BUILDDIR = build
NBCONVERT = jupyter nbconvert
# User-friendly check for sphinx-build
ifeq ($(shell which $(SPHINXBUILD) >/dev/null 2>&1; echo $$?), 1)
$(error The '$(SPHINXBUILD)' command was not found. Make sure you have Sphinx installed, then set the SPHINXBUILD environment variable to point to the full path of the '$(SPHINXBUILD)' executable. Alternatively you can add the directory with the executable to your PATH. If you don't have Sphinx installed, grab it from http://sphinx-doc.org/)
endif
# Internal variables.
PAPEROPT_a4 = -D latex_paper_size=a4
PAPEROPT_letter = -D latex_paper_size=letter
ALLSPHINXOPTS = -d $(BUILDDIR)/doctrees $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source
# the i18n builder cannot share the environment and doctrees with the others
I18NSPHINXOPTS = $(PAPEROPT_$(PAPER)) $(SPHINXOPTS) source
.PHONY: help
help:
@echo "Please use \`make <target>' where <target> is one of"
@echo " html to make standalone HTML files"
@echo " dirhtml to make HTML files named index.html in directories"
@echo " singlehtml to make a single large HTML file"
@echo " pickle to make pickle files"
@echo " json to make JSON files"
@echo " htmlhelp to make HTML files and a HTML help project"
@echo " qthelp to make HTML files and a qthelp project"
@echo " applehelp to make an Apple Help Book"
@echo " devhelp to make HTML files and a Devhelp project"
@echo " epub to make an epub"
@echo " latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter"
@echo " latexpdf to make LaTeX files and run them through pdflatex"
@echo " latexpdfja to make LaTeX files and run them through platex/dvipdfmx"
@echo " text to make text files"
@echo " man to make manual pages"
@echo " texinfo to make Texinfo files"
@echo " info to make Texinfo files and run them through makeinfo"
@echo " gettext to make PO message catalogs"
@echo " changes to make an overview of all changed/added/deprecated items"
@echo " xml to make Docutils-native XML files"
@echo " pseudoxml to make pseudoxml-XML files for display purposes"
@echo " linkcheck to check all external links for integrity"
@echo " doctest to run all doctests embedded in the documentation (if enabled)"
@echo " coverage to run coverage check of the documentation (if enabled)"
.PHONY: clean
clean:
rm -rf $(BUILDDIR)/* source/examples/generated/*
.PHONY: html
html:
make clean
$(SPHINXBUILD) -vb html $(ALLSPHINXOPTS) $(BUILDDIR)/html
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/html."
.PHONY: dirhtml
dirhtml:
$(SPHINXBUILD) -b dirhtml $(ALLSPHINXOPTS) $(BUILDDIR)/dirhtml
@echo
@echo "Build finished. The HTML pages are in $(BUILDDIR)/dirhtml."
.PHONY: singlehtml
singlehtml:
$(SPHINXBUILD) -b singlehtml $(ALLSPHINXOPTS) $(BUILDDIR)/singlehtml
@echo
@echo "Build finished. The HTML page is in $(BUILDDIR)/singlehtml."
.PHONY: pickle
pickle:
$(SPHINXBUILD) -b pickle $(ALLSPHINXOPTS) $(BUILDDIR)/pickle
@echo
@echo "Build finished; now you can process the pickle files."
.PHONY: json
json:
$(SPHINXBUILD) -b json $(ALLSPHINXOPTS) $(BUILDDIR)/json
@echo
@echo "Build finished; now you can process the JSON files."
.PHONY: htmlhelp
htmlhelp:
$(SPHINXBUILD) -b htmlhelp $(ALLSPHINXOPTS) $(BUILDDIR)/htmlhelp
@echo
@echo "Build finished; now you can run HTML Help Workshop with the" \
".hhp project file in $(BUILDDIR)/htmlhelp."
.PHONY: qthelp
qthelp:
$(SPHINXBUILD) -b qthelp $(ALLSPHINXOPTS) $(BUILDDIR)/qthelp
@echo
@echo "Build finished; now you can run "qcollectiongenerator" with the" \
".qhcp project file in $(BUILDDIR)/qthelp, like this:"
@echo "# qcollectiongenerator $(BUILDDIR)/qthelp/python-awips.qhcp"
@echo "To view the help file:"
@echo "# assistant -collectionFile $(BUILDDIR)/qthelp/python-awips.qhc"
.PHONY: applehelp
applehelp:
$(SPHINXBUILD) -b applehelp $(ALLSPHINXOPTS) $(BUILDDIR)/applehelp
@echo
@echo "Build finished. The help book is in $(BUILDDIR)/applehelp."
@echo "N.B. You won't be able to view it unless you put it in" \
"~/Library/Documentation/Help or install it in your application" \
"bundle."
.PHONY: devhelp
devhelp:
$(SPHINXBUILD) -b devhelp $(ALLSPHINXOPTS) $(BUILDDIR)/devhelp
@echo
@echo "Build finished."
@echo "To view the help file:"
@echo "# mkdir -p $$HOME/.local/share/devhelp/python-awips"
@echo "# ln -s $(BUILDDIR)/devhelp $$HOME/.local/share/devhelp/python-awips"
@echo "# devhelp"
.PHONY: epub
epub:
$(SPHINXBUILD) -b epub $(ALLSPHINXOPTS) $(BUILDDIR)/epub
@echo
@echo "Build finished. The epub file is in $(BUILDDIR)/epub."
.PHONY: latex
latex:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo
@echo "Build finished; the LaTeX files are in $(BUILDDIR)/latex."
@echo "Run \`make' in that directory to run these through (pdf)latex" \
"(use \`make latexpdf' here to do that automatically)."
.PHONY: latexpdf
latexpdf:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through pdflatex..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
.PHONY: latexpdfja
latexpdfja:
$(SPHINXBUILD) -b latex $(ALLSPHINXOPTS) $(BUILDDIR)/latex
@echo "Running LaTeX files through platex and dvipdfmx..."
$(MAKE) -C $(BUILDDIR)/latex all-pdf-ja
@echo "pdflatex finished; the PDF files are in $(BUILDDIR)/latex."
.PHONY: text
text:
$(SPHINXBUILD) -b text $(ALLSPHINXOPTS) $(BUILDDIR)/text
@echo
@echo "Build finished. The text files are in $(BUILDDIR)/text."
.PHONY: man
man:
$(SPHINXBUILD) -b man $(ALLSPHINXOPTS) $(BUILDDIR)/man
@echo
@echo "Build finished. The manual pages are in $(BUILDDIR)/man."
.PHONY: texinfo
texinfo:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo
@echo "Build finished. The Texinfo files are in $(BUILDDIR)/texinfo."
@echo "Run \`make' in that directory to run these through makeinfo" \
"(use \`make info' here to do that automatically)."
.PHONY: info
info:
$(SPHINXBUILD) -b texinfo $(ALLSPHINXOPTS) $(BUILDDIR)/texinfo
@echo "Running Texinfo files through makeinfo..."
make -C $(BUILDDIR)/texinfo info
@echo "makeinfo finished; the Info files are in $(BUILDDIR)/texinfo."
.PHONY: gettext
gettext:
$(SPHINXBUILD) -b gettext $(I18NSPHINXOPTS) $(BUILDDIR)/locale
@echo
@echo "Build finished. The message catalogs are in $(BUILDDIR)/locale."
.PHONY: changes
changes:
$(SPHINXBUILD) -b changes $(ALLSPHINXOPTS) $(BUILDDIR)/changes
@echo
@echo "The overview file is in $(BUILDDIR)/changes."
.PHONY: linkcheck
linkcheck:
$(SPHINXBUILD) -b linkcheck $(ALLSPHINXOPTS) $(BUILDDIR)/linkcheck
@echo
@echo "Link check complete; look for any errors in the above output " \
"or in $(BUILDDIR)/linkcheck/output.txt."
.PHONY: doctest
doctest:
$(SPHINXBUILD) -b doctest $(ALLSPHINXOPTS) $(BUILDDIR)/doctest
@echo "Testing of doctests in the sources finished, look at the " \
"results in $(BUILDDIR)/doctest/output.txt."
.PHONY: coverage
coverage:
$(SPHINXBUILD) -b coverage $(ALLSPHINXOPTS) $(BUILDDIR)/coverage
@echo "Testing of coverage in the sources finished, look at the " \
"results in $(BUILDDIR)/coverage/python.txt."
.PHONY: xml
xml:
$(SPHINXBUILD) -b xml $(ALLSPHINXOPTS) $(BUILDDIR)/xml
@echo
@echo "Build finished. The XML files are in $(BUILDDIR)/xml."
.PHONY: pseudoxml
pseudoxml:
$(SPHINXBUILD) -b pseudoxml $(ALLSPHINXOPTS) $(BUILDDIR)/pseudoxml
@echo
@echo "Build finished. The pseudo-XML files are in $(BUILDDIR)/pseudoxml."

docs/make.bat Normal file

@ -0,0 +1,263 @@
@ECHO OFF
REM Command file for Sphinx documentation
if "%SPHINXBUILD%" == "" (
set SPHINXBUILD=sphinx-build
)
set BUILDDIR=build
set ALLSPHINXOPTS=-d %BUILDDIR%/doctrees %SPHINXOPTS% source
set I18NSPHINXOPTS=%SPHINXOPTS% source
if NOT "%PAPER%" == "" (
set ALLSPHINXOPTS=-D latex_paper_size=%PAPER% %ALLSPHINXOPTS%
set I18NSPHINXOPTS=-D latex_paper_size=%PAPER% %I18NSPHINXOPTS%
)
if "%1" == "" goto help
if "%1" == "help" (
:help
echo.Please use `make ^<target^>` where ^<target^> is one of
echo. html to make standalone HTML files
echo. dirhtml to make HTML files named index.html in directories
echo. singlehtml to make a single large HTML file
echo. pickle to make pickle files
echo. json to make JSON files
echo. htmlhelp to make HTML files and a HTML help project
echo. qthelp to make HTML files and a qthelp project
echo. devhelp to make HTML files and a Devhelp project
echo. epub to make an epub
echo. latex to make LaTeX files, you can set PAPER=a4 or PAPER=letter
echo. text to make text files
echo. man to make manual pages
echo. texinfo to make Texinfo files
echo. gettext to make PO message catalogs
echo. changes to make an overview over all changed/added/deprecated items
echo. xml to make Docutils-native XML files
echo. pseudoxml to make pseudoxml-XML files for display purposes
echo. linkcheck to check all external links for integrity
echo. doctest to run all doctests embedded in the documentation if enabled
echo. coverage to run coverage check of the documentation if enabled
goto end
)
if "%1" == "clean" (
for /d %%i in (%BUILDDIR%\*) do rmdir /q /s %%i
del /q /s %BUILDDIR%\*
goto end
)
REM Check if sphinx-build is available and fall back to the Python version if not
%SPHINXBUILD% 1>NUL 2>NUL
if errorlevel 9009 goto sphinx_python
goto sphinx_ok
:sphinx_python
set SPHINXBUILD=python -m sphinx.__init__
%SPHINXBUILD% 2> nul
if errorlevel 9009 (
echo.
echo.The 'sphinx-build' command was not found. Make sure you have Sphinx
echo.installed, then set the SPHINXBUILD environment variable to point
echo.to the full path of the 'sphinx-build' executable. Alternatively you
echo.may add the Sphinx directory to PATH.
echo.
echo.If you don't have Sphinx installed, grab it from
echo.http://sphinx-doc.org/
exit /b 1
)
:sphinx_ok
if "%1" == "html" (
%SPHINXBUILD% -b html %ALLSPHINXOPTS% %BUILDDIR%/html
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/html.
goto end
)
if "%1" == "dirhtml" (
%SPHINXBUILD% -b dirhtml %ALLSPHINXOPTS% %BUILDDIR%/dirhtml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/dirhtml.
goto end
)
if "%1" == "singlehtml" (
%SPHINXBUILD% -b singlehtml %ALLSPHINXOPTS% %BUILDDIR%/singlehtml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The HTML pages are in %BUILDDIR%/singlehtml.
goto end
)
if "%1" == "pickle" (
%SPHINXBUILD% -b pickle %ALLSPHINXOPTS% %BUILDDIR%/pickle
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can process the pickle files.
goto end
)
if "%1" == "json" (
%SPHINXBUILD% -b json %ALLSPHINXOPTS% %BUILDDIR%/json
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can process the JSON files.
goto end
)
if "%1" == "htmlhelp" (
%SPHINXBUILD% -b htmlhelp %ALLSPHINXOPTS% %BUILDDIR%/htmlhelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can run HTML Help Workshop with the ^
.hhp project file in %BUILDDIR%/htmlhelp.
goto end
)
if "%1" == "qthelp" (
%SPHINXBUILD% -b qthelp %ALLSPHINXOPTS% %BUILDDIR%/qthelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished; now you can run "qcollectiongenerator" with the ^
.qhcp project file in %BUILDDIR%/qthelp, like this:
echo.^> qcollectiongenerator %BUILDDIR%\qthelp\python-awips.qhcp
echo.To view the help file:
echo.^> assistant -collectionFile %BUILDDIR%\qthelp\python-awips.qhc
goto end
)
if "%1" == "devhelp" (
%SPHINXBUILD% -b devhelp %ALLSPHINXOPTS% %BUILDDIR%/devhelp
if errorlevel 1 exit /b 1
echo.
echo.Build finished.
goto end
)
if "%1" == "epub" (
%SPHINXBUILD% -b epub %ALLSPHINXOPTS% %BUILDDIR%/epub
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The epub file is in %BUILDDIR%/epub.
goto end
)
if "%1" == "latex" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
if errorlevel 1 exit /b 1
echo.
echo.Build finished; the LaTeX files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "latexpdf" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
cd %BUILDDIR%/latex
make all-pdf
cd %~dp0
echo.
echo.Build finished; the PDF files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "latexpdfja" (
%SPHINXBUILD% -b latex %ALLSPHINXOPTS% %BUILDDIR%/latex
cd %BUILDDIR%/latex
make all-pdf-ja
cd %~dp0
echo.
echo.Build finished; the PDF files are in %BUILDDIR%/latex.
goto end
)
if "%1" == "text" (
%SPHINXBUILD% -b text %ALLSPHINXOPTS% %BUILDDIR%/text
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The text files are in %BUILDDIR%/text.
goto end
)
if "%1" == "man" (
%SPHINXBUILD% -b man %ALLSPHINXOPTS% %BUILDDIR%/man
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The manual pages are in %BUILDDIR%/man.
goto end
)
if "%1" == "texinfo" (
%SPHINXBUILD% -b texinfo %ALLSPHINXOPTS% %BUILDDIR%/texinfo
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The Texinfo files are in %BUILDDIR%/texinfo.
goto end
)
if "%1" == "gettext" (
%SPHINXBUILD% -b gettext %I18NSPHINXOPTS% %BUILDDIR%/locale
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The message catalogs are in %BUILDDIR%/locale.
goto end
)
if "%1" == "changes" (
%SPHINXBUILD% -b changes %ALLSPHINXOPTS% %BUILDDIR%/changes
if errorlevel 1 exit /b 1
echo.
echo.The overview file is in %BUILDDIR%/changes.
goto end
)
if "%1" == "linkcheck" (
%SPHINXBUILD% -b linkcheck %ALLSPHINXOPTS% %BUILDDIR%/linkcheck
if errorlevel 1 exit /b 1
echo.
echo.Link check complete; look for any errors in the above output ^
or in %BUILDDIR%/linkcheck/output.txt.
goto end
)
if "%1" == "doctest" (
%SPHINXBUILD% -b doctest %ALLSPHINXOPTS% %BUILDDIR%/doctest
if errorlevel 1 exit /b 1
echo.
echo.Testing of doctests in the sources finished, look at the ^
results in %BUILDDIR%/doctest/output.txt.
goto end
)
if "%1" == "coverage" (
%SPHINXBUILD% -b coverage %ALLSPHINXOPTS% %BUILDDIR%/coverage
if errorlevel 1 exit /b 1
echo.
echo.Testing of coverage in the sources finished, look at the ^
results in %BUILDDIR%/coverage/python.txt.
goto end
)
if "%1" == "xml" (
%SPHINXBUILD% -b xml %ALLSPHINXOPTS% %BUILDDIR%/xml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The XML files are in %BUILDDIR%/xml.
goto end
)
if "%1" == "pseudoxml" (
%SPHINXBUILD% -b pseudoxml %ALLSPHINXOPTS% %BUILDDIR%/pseudoxml
if errorlevel 1 exit /b 1
echo.
echo.Build finished. The pseudo-XML files are in %BUILDDIR%/pseudoxml.
goto end
)
:end

docs/requirements.txt Normal file

@ -0,0 +1,4 @@
sphinx>=1.3
nbconvert>=4.1
enum34
jupyter

docs/source/about.rst Normal file

@ -0,0 +1,181 @@
===================
About Unidata AWIPS
===================
AWIPS is a weather forecasting display and analysis package being
developed by the National Weather Service and Raytheon. AWIPS is a
Java application consisting of a data-rendering client (CAVE, which runs
on Red Hat/CentOS Linux and Mac OS X) and a backend data server (EDEX,
which runs only on Linux).
AWIPS takes a unified approach to data ingest, and most data types
follow a standard path through the system. At a high level, data flow
describes the path taken by a piece of data from its source to its
display by a client system. This path starts with data requested and
stored by an `LDM <#ldm>`_ client and includes the decoding of the data
and storing of decoded data in a form readable and displayable by the
end user.
The AWIPS ingest and request processes form a highly distributed
system, and the messaging broker `Qpid <#qpid>`_ is used for
inter-process communication.
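The ingest path just described — the LDM stores a file, a message announces it, and EDEX routes it to the right decoder — can be sketched as a toy message flow. Everything here (function names, the decoder table, the use of a plain queue in place of Qpid) is illustrative, not the real AWIPS or Qpid API:

```python
# Toy sketch of the ingest flow described above: the LDM writes a file and
# posts a "data available" message to a broker (standing in for
# Qpid/edexBridge); an EDEX-like consumer routes each message by its header.
# All names are illustrative, not real AWIPS interfaces.

from queue import Queue

# Header-to-server routing: grib goes to the separate ingestGrib server,
# everything else to the default ingest server.
DECODERS = {
    "GRIB": "ingestGrib",
    "BUFR": "ingest",
    "TEXT": "ingest",
}

def ldm_notify(broker, path, header):
    """LDM side: announce that a stored file is ready for processing."""
    broker.put({"path": path, "header": header})

def edex_consume(broker):
    """EDEX side: drain the broker, routing each file to an ingest server."""
    routed = []
    while not broker.empty():
        msg = broker.get()
        server = DECODERS.get(msg["header"], "ingest")
        routed.append((msg["path"], server))
    return routed

broker = Queue()
ldm_notify(broker, "/data/ldm/grib/gfs.grib2", "GRIB")
ldm_notify(broker, "/data/ldm/text/metar.txt", "TEXT")
print(edex_consume(broker))
```

In the real system the broker is Qpid, the announcement is posted by edexBridge, and the routing decision is made from the file header information carried in the Qpid message, as described in the EDEX section below.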
.. figure:: http://www.unidata.ucar.edu/software/awips2/images/awips2_coms.png
:align: center
:alt: image
License
-------
The AWIPS software package released by the Unidata Program Center is considered to
be in the public domain since it is released without proprietary code. As such, export
controls do not apply. Any person is free to download, modify, distribute, or share
Unidata AWIPS in any form. Entities who modify or re-distribute Unidata AWIPS
software are encouraged to conduct their own FOSS/COTS entitlement/license review
to ensure that they remain compatible with the associated terms (see
FOSS_COTS_License.pdf at `https://github.com/Unidata/awips2 <https://github.com/Unidata/awips2>`_).
About AWIPS
-----------
The primary AWIPS application for data ingest, processing, and
storage is the Environmental Data EXchange (**EDEX**) server; the
primary AWIPS application for visualization/data manipulation is the
Common AWIPS Visualization Environment (**CAVE**) client, which is
typically installed on a workstation separate from other AWIPS
components.
In addition to programs developed specifically for AWIPS, AWIPS uses
several commercial off-the-shelf (COTS) and Free or Open Source software
(FOSS) products to assist in its operation. The following components,
working together and communicating, compose the entire AWIPS system.
EDEX
----
The main server for AWIPS. Qpid sends alerts to EDEX when data stored
by the LDM is ready for processing. These Qpid messages include file
header information which allows EDEX to determine the appropriate data
decoder to use. The default ingest server (simply named ingest) handles
all data ingest other than grib messages, which are processed by a
separate ingestGrib server. After decoding, EDEX writes metadata to the
database via Postgres and saves the processed data in HDF5 via PyPIES. A
third EDEX server, request, feeds requested data to CAVE clients. EDEX
ingest and request servers are started and stopped with the commands
``edex start`` and ``edex stop``, which run the system script
``/etc/rc.d/init.d/edex_camel``.
CAVE
----
Common AWIPS Visualization Environment. The data rendering and
visualization tool for AWIPS. CAVE consists of a number of different
data display configurations called perspectives. Perspectives used in
operational forecasting environments include **D2D** (Display
Two-Dimensional), **GFE** (Graphical Forecast Editor), and **NCP**
(National Centers Perspective). CAVE is started with the command
``/awips2/cave/cave.sh`` or ``cave.sh``.
.. figure:: http://www.unidata.ucar.edu/software/awips2/images/Unidata_AWIPS2_CAVE.png
:align: center
:alt: CAVE
CAVE
Alertviz
--------
**Alertviz** is a modernized version of an AWIPS I application, designed
to present various notifications, error messages, and alarms to the user
(forecaster). AlertViz can be executed either independently or from CAVE
itself. In the Unidata CAVE client, Alertviz is run within CAVE and is
not required to be run separately. The toolbar is also **hidden from
view** and is accessed by right-clicking the desktop taskbar icon.
LDM
---
`http://www.unidata.ucar.edu/software/ldm/ <http://www.unidata.ucar.edu/software/ldm/>`_
The **LDM** (Local Data Manager), developed and supported by Unidata, is
a suite of client and server programs designed for data distribution,
and is the fundamental component comprising the Unidata Internet Data
Distribution (IDD) system. In AWIPS, the LDM provides data feeds for
grids, surface observations, upper-air profiles, satellite and radar
imagery and various other meteorological datasets. The LDM writes data
directly to file and alerts EDEX via Qpid when a file is available for
processing. The LDM is started and stopped with the commands
``edex start`` and ``edex stop``, which run the commands
``service edex_ldm start`` and ``service edex_ldm stop``.
edexBridge
----------
edexBridge, invoked in the LDM configuration file
``/awips2/ldm/etc/ldmd.conf``, is used by the LDM to post "data
available" messages to Qpid, which alert the EDEX Ingest server that a
file is ready for processing.
Qpid
----
`http://qpid.apache.org <http://qpid.apache.org>`_
**Apache Qpid**, the Queue Processor Interface Daemon, is the messaging
system used by AWIPS to facilitate communication between services.
When the LDM receives a data file to be processed, it employs
**edexBridge** to send EDEX ingest servers a message via Qpid. When EDEX
has finished decoding the file, it sends CAVE a message via Qpid that
data are available for display or further processing. Qpid is started
and stopped by ``edex start`` and ``edex stop``, and is controlled by
the system script ``/etc/rc.d/init.d/qpidd``.
PostgreSQL
----------
`http://www.postgresql.org <http://www.postgresql.org>`_
**PostgreSQL**, known simply as Postgres, is a relational database
management system (DBMS) which handles the storage and retrieval of
metadata, database tables and some decoded data. The storage and reading
of EDEX metadata is handled by the Postgres DBMS. Users may query the
metadata tables by using the terminal-based front-end for Postgres
called **psql**. Postgres is started and stopped by ``edex start`` and
``edex stop``, and is controlled by the system script
``/etc/rc.d/init.d/edex_postgres``.
HDF5
----
`http://www.hdfgroup.org/HDF5/ <http://www.hdfgroup.org/HDF5/>`_
**Hierarchical Data Format (v.5)** is
the primary data storage format used by AWIPS for processed grids,
satellite and radar imagery and other products. Similar to netCDF,
developed and supported by Unidata, HDF5 supports multiple types of data
within a single file. For example, a single HDF5 file of radar data may
contain multiple volume scans of base reflectivity and base velocity as
well as derived products such as composite reflectivity. The file may
also contain data from multiple radars. HDF5 is stored in
``/awips2/edex/data/hdf5/``.
PyPIES (httpd-pypies)
---------------------
**PyPIES**, Python Process Isolated Enhanced Storage, was created for
AWIPS to isolate the management of HDF5 Processed Data Storage from
the EDEX processes. PyPIES manages access, i.e., reads and writes, of
data in the HDF5 files. In a sense, PyPIES provides functionality
similar to a DBMS (i.e., PostgreSQL for metadata); all data being written
to an HDF5 file is sent to PyPIES, and requests for data stored in HDF5
are processed by PyPIES.
PyPIES is implemented in two parts: 1. The PyPIES manager is a Python
application that runs as part of an Apache HTTP server, and handles
requests to store and retrieve data. 2. The PyPIES logger is a Python
process that coordinates logging. PyPIES is started and stopped by
``edex start`` and ``edex stop``, and is controlled by the system script
``/etc/rc.d/init.d/httpd-pypies``.


@ -0,0 +1,7 @@
===============
DataAccessLayer
===============
.. automodule:: awips.dataaccess.DataAccessLayer
:members:
:undoc-members:


@ -0,0 +1,7 @@
=================
DateTimeConverter
=================
.. automodule:: awips.DateTimeConverter
:members:
:undoc-members:


@ -0,0 +1,7 @@
===============================
IDataRequest (newDataRequest())
===============================
.. autoclass:: awips.dataaccess.IDataRequest
:members:
:special-members:


@ -0,0 +1,7 @@
======================
PyData
======================
.. automodule:: awips.dataaccess.PyData
:members:
:undoc-members:


@ -0,0 +1,7 @@
======================
PyGeometryData
======================
.. automodule:: awips.dataaccess.PyGeometryData
:members:
:undoc-members:


@ -0,0 +1,7 @@
======================
PyGridData
======================
.. automodule:: awips.dataaccess.PyGridData
:members:
:undoc-members:

Some files were not shown because too many files have changed in this diff.