awips2/search/search_index.json


{
"docs": [
{
"location": "/",
"text": "Unidata AWIPS User Manual\n\uf0c1\n\n\nhttps://www.unidata.ucar.edu/software/awips2\n \n\n\n\n\nThe Advanced Weather Interactive Processing System (AWIPS) is a meteorological software package. It is used for decoding, displaying, and analyzing data, and was originally developed for the National Weather Service (NWS) by Raytheon. The Unidata Program Center (UCP), a division of UCAR, develops and supports a modified non-operational version of AWIPS for use in research and education by \nUCAR member institutions\n. This is released as open source software, free to download and use by anyone.\n\n\nAWIPS takes a unified approach to data ingest, where most data ingested into the system comes through the \nLDM\n client pulling data feeds from the \nUnidata IDD\n. Various raw data and product files (netCDF, grib, BUFR, ASCII text, gini, AREA) are decoded and stored as HDF5 files and Postgres metadata by \nEDEX\n, which serves products and data over http.\n\n\nUnidata supports two data visualization frameworks: \nCAVE\n (an Eclipse-built Java application which runs on Linux, Mac, and Windows), and \npython-awips\n.\n\n\n\n\nNote\n: Our version of CAVE is a \nnon-operational\n version. It does not support some features of NWS AWIPS. Warnings and alerts cannot be issued from Unidata's CAVE. Some additional functionality may also be unavailable.\n\n\n\n\n\n\n\n\nDownload and Install CAVE\n\uf0c1\n\n\n\n\nDownload and Install EDEX\n\uf0c1\n\n\n\n\nLicense\n\uf0c1\n\n\nUnidata AWIPS source code and binaries (RPMs) are considered to be in the public domain, meaning there are no restrictions on any download, modification, or distribution in any form (original or modified). 
Unidata AWIPS contains no proprietary content and is therefore not subject to export controls as stated in the Master Rights licensing file.\n\n\n\n\nAWIPS Data in the Cloud\n\uf0c1\n\n\nUnidata and XSEDE Jetstream have partnered to offer an EDEX data server in the cloud, open to the community. Select the server in the Connectivity Preferences dialog, or enter \nedex-cloud.unidata.ucar.edu\n (without \nhttp://\n before, or \n:9581/services\n after).\n\n\n\n\n\n\nDistributed Computing\n\uf0c1\n\n\nAWIPS makes use of service-oriented architecture to request, process, and serve real-time meteorological data. Because AWIPS was originally developed for use on internal NWS forecast office networks, where operational installations of AWIPS can consist of a dozen servers or more, Unidata modified the package to be more applicable in the university setting. Because the AWIPS source code was hard-coded with the NWS network configuration, the early Unidata releases were stripped of operation-specific configurations and plugins, and released specifically for standalone installation. This made sense given that a single EDEX instance with a Solid State Drive (SSD) could handle most of the entire NOAAport data volume. However, with GOES-R(16) now online, and more gridded forecast models being created at finer temporal and spatial resolutions, there was a need to distribute EDEX data decoding in order to handle this firehose of data.\n\n\n\n\nRead More: \nDistributed EDEX\n\n\n\n\n\n\n\n\nSoftware Components\n\uf0c1\n\n\n\n\nEDEX\n\n\nCAVE\n\n\nLDM\n\n\nedexBridge\n\n\nQpid\n\n\nPostgreSQL\n\n\nHDF5\n\n\nPyPIES\n\n\n\n\nEDEX\n\uf0c1\n\n\nThe main server for AWIPS. Qpid sends alerts to EDEX when data stored by the LDM is ready for processing. These Qpid messages include file header information which allows EDEX to determine the appropriate data decoder to use. 
The default ingest server (simply named ingest) handles all data ingest other than grib messages, which are processed by a separate ingestGrib server. After decoding, EDEX writes metadata to the database via Postgres and saves the processed data in HDF5 via PyPIES. A third EDEX server, request, feeds requested data to CAVE clients. EDEX ingest and request servers are started and stopped with the commands \nedex start\n and \nedex stop\n, which run the system script /etc/rc.d/init.d/edex_camel",
"title": "Home"
},
{
"location": "/#unidata-awips-user-manual",
"text": "https://www.unidata.ucar.edu/software/awips2 The Advanced Weather Interactive Processing System (AWIPS) is a meteorological software package. It is used for decoding, displaying, and analyzing data, and was originally developed for the National Weather Service (NWS) by Raytheon. The Unidata Program Center (UCP), a division of UCAR, develops and supports a modified non-operational version of AWIPS for use in research and education by UCAR member institutions . This is released as open source software, free to download and use by anyone. AWIPS takes a unified approach to data ingest, where most data ingested into the system comes through the LDM client pulling data feeds from the Unidata IDD . Various raw data and product files (netCDF, grib, BUFR, ASCII text, gini, AREA) are decoded and stored as HDF5 files and Postgres metadata by EDEX , which serves products and data over http. Unidata supports two data visualization frameworks: CAVE (an Eclipse-built Java application which runs on Linux, Mac, and Windows), and python-awips . Note : Our version of CAVE is a non-operational version. It does not support some features of NWS AWIPS. Warnings and alerts cannot be issued from Unidata's CAVE. Some additional functionality may also be unavailable.",
"title": "Unidata AWIPS User Manual"
},
{
"location": "/#download-and-install-cave",
"text": "",
"title": "Download and Install CAVE"
},
{
"location": "/#download-and-install-edex",
"text": "",
"title": "Download and Install EDEX"
},
{
"location": "/#license",
"text": "Unidata AWIPS source code and binaries (RPMs) are considered to be in the public domain, meaning there are no restrictions on any download, modification, or distribution in any form (original or modified). Unidata AWIPS contains no proprietary content and is therefore not subject to export controls as stated in the Master Rights licensing file.",
"title": "License"
},
{
"location": "/#awips-data-in-the-cloud",
"text": "Unidata and XSEDE Jetstream have partnered to offer an EDEX data server in the cloud, open to the community. Select the server in the Connectivity Preferences dialog, or enter edex-cloud.unidata.ucar.edu (without http:// before, or :9581/services after).",
"title": "AWIPS Data in the Cloud"
},
{
"location": "/#distributed-computing",
"text": "AWIPS makes use of service-oriented architecture to request, process, and serve real-time meteorological data. Because AWIPS was originally developed for use on internal NWS forecast office networks, where operational installations of AWIPS can consist of a dozen servers or more, Unidata modified the package to be more applicable in the university setting. Because the AWIPS source code was hard-coded with the NWS network configuration, the early Unidata releases were stripped of operation-specific configurations and plugins, and released specifically for standalone installation. This made sense given that a single EDEX instance with a Solid State Drive (SSD) could handle most of the entire NOAAport data volume. However, with GOES-R(16) now online, and more gridded forecast models being created at finer temporal and spatial resolutions, there was a need to distribute EDEX data decoding in order to handle this firehose of data. Read More: Distributed EDEX",
"title": "Distributed Computing"
},
{
"location": "/#software-components",
"text": "EDEX CAVE LDM edexBridge Qpid PostgreSQL HDF5 PyPIES",
"title": "Software Components"
},
{
"location": "/#edex",
"text": "The main server for AWIPS. Qpid sends alerts to EDEX when data stored by the LDM is ready for processing. These Qpid messages include file header information which allows EDEX to determine the appropriate data decoder to use. The default ingest server (simply named ingest) handles all data ingest other than grib messages, which are processed by a separate ingestGrib server. After decoding, EDEX writes metadata to the database via Postgres and saves the processed data in HDF5 via PyPIES. A third EDEX server, request, feeds requested data to CAVE clients. EDEX ingest and request servers are started and stopped with the commands edex start and edex stop , which run the system script /etc/rc.d/init.d/edex_camel Read More: How to Install EDEX",
"title": "EDEX"
},
{
"location": "/#cave",
"text": "Common AWIPS Visualization Environment. The data rendering and visualization tool for AWIPS. CAVE contains a number of different data display configurations called perspectives. Perspectives used in operational forecasting environments include D2D (Display Two-Dimensional), GFE (Graphical Forecast Editor), and NCP (National Centers Perspective). CAVE is started with the command /awips2/cave/cave.sh or cave.sh Read More: How to Install CAVE",
"title": "CAVE"
},
{
"location": "/#ldm",
"text": "https://www.unidata.ucar.edu/software/ldm/ The LDM (Local Data Manager), developed and supported by Unidata, is a suite of client and server programs designed for data distribution, and is the fundamental component comprising the Unidata Internet Data Distribution (IDD) system. In AWIPS, the LDM provides data feeds for grids, surface observations, upper-air profiles, satellite and radar imagery and various other meteorological datasets. The LDM writes data directly to file and alerts EDEX via Qpid when a file is available for processing. The LDM is started and stopped with the commands edex start and edex stop , which run the commands service edex_ldm start and service edex_ldm stop",
"title": "LDM"
},
{
"location": "/#edexbridge",
"text": "edexBridge, invoked in the LDM configuration file /awips2/ldm/etc/ldmd.conf , is used by the LDM to post \"data available\" messages to Qpid, which alerts the EDEX Ingest server that a file is ready for processing.",
"title": "edexBridge"
},
{
"location": "/#qpid",
"text": "http://qpid.apache.org Apache Qpid , the Queue Processor Interface Daemon, is the messaging system used by AWIPS to facilitate communication between services. When the LDM receives a data file to be processed, it employs edexBridge to send EDEX ingest servers a message via Qpid. When EDEX has finished decoding the file, it sends CAVE a message via Qpid that data are available for display or further processing. Qpid is started and stopped by edex start and edex stop , and is controlled by the system script /etc/rc.d/init.d/qpidd",
"title": "Qpid"
},
{
"location": "/#postgresql",
"text": "http://www.postgresql.org PostgreSQL , known simply as Postgres, is a relational database management system (DBMS) which handles the storage and retrieval of metadata, database tables and some decoded data. The storage and reading of EDEX metadata is handled by the Postgres DBMS. Users may query the metadata tables by using the terminal-based front-end for Postgres called psql . Postgres is started and stopped by edex start and edex stop , and is controlled by the system script /etc/rc.d/init.d/edex_postgres",
"title": "PostgreSQL"
},
{
"location": "/#hdf5",
"text": "http://www.hdfgroup.org/HDF5/ Hierarchical Data Format (v.5) is the primary data storage format used by AWIPS for processed grids, satellite and radar imagery and other products. Similar to netCDF, developed and supported by Unidata, HDF5 supports multiple types of data within a single file. For example, a single HDF5 file of radar data may contain multiple volume scans of base reflectivity and base velocity as well as derived products such as composite reflectivity. The file may also contain data from multiple radars. HDF5 data is stored on the EDEX server in /awips2/edex/data/hdf5/ .",
"title": "HDF5"
},
{
"location": "/#pypies",
"text": "PyPIES , Python Process Isolated Enhanced Storage, (httpd-pypies) was created for AWIPS to isolate the management of HDF5 Processed Data Storage from the EDEX processes. PyPIES manages access, i.e., reads and writes, of data in the HDF5 files. In a sense, PyPIES provides functionality similar to a DBMS (i.e., PostgreSQL for metadata); all data being written to an HDF5 file is sent to PyPIES, and requests for data stored in HDF5 are processed by PyPIES. PyPIES is implemented in two parts: 1. The PyPIES manager is a Python application that runs as part of an Apache HTTP server, and handles requests to store and retrieve data. 2. The PyPIES logger is a Python process that coordinates logging. PyPIES is started and stopped by edex start and edex stop , and is controlled by the system script /etc/rc.d/init.d/httpd-pypies .",
"title": "PyPIES"
},
{
"location": "/install/install-cave/",
"text": "Install CAVE\n\uf0c1\n\n\nCAVE is the \nC\nommon \nA\nWIPS \nV\nisualization \nE\nnvironment that is used for rendering and analyzing data for AWIPS. Unidata supports CAVE on three platforms: \nCentOS (Red Hat) Linux\n, \nWindows\n, and \nmacOS\n. The installer may require administrator privileges to install and may require other system changes (environment variables, etc.) as well.\n\n\n\n\nGeneral Requirements\n\uf0c1\n\n\nRegardless of what Operating System CAVE is running on, these general requirements are recommended in order for CAVE to perform optimally:\n\n\n\n\nJava 1.8\n\n\nOpenGL 2.0 Compatible Devices\n\n\nAt least 4GB RAM\n\n\nAt least 2GB Disk Space for Caching\n\n\nNVIDIA Graphics Card\n\n\nLatest NVIDIA Driver\n\n\nNote: While other graphics cards \nmay\n work, an NVIDIA Quadro graphics card is recommended for full visualization capability\n\n\n\n\n\n\n\n\n\n\nLinux \n\uf0c1\n\n\nSystem Requirements\n\uf0c1\n\n\n\n\n64 bit CentOS/Red Hat 7\n\n\n\n\nDownload and Installation Instructions\n\uf0c1\n\n\n\n\nDownload the following installer: \nawips_install.sh\n \n\n\nIn a terminal, go to the download directory \n\n\nMake the installer an executable by running: \nchmod 755 awips_install.sh\n\n\nRun the installer: \nsudo ./awips_install.sh --cave\n\n\nThis will install the application in \n/awips2/cave/\n and set the local cache to \n~/caveData/\n\n\n\n\n\n\n\n\nTo run CAVE either:\n\n\n\n\nUse the terminal and type the command \ncave\n\n\nFind the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE\n\n\n\n\nWindows \n\uf0c1\n\n\nFor Windows, Unidata offers two installation options: a \nLinux Virtual Machine\n, or a \nDirect Windows Installation\n.\n\n\nCurrently, the \nvirtual machine (VM)\n is the recommended form of install for those who do not have administrative privileges on the machine, or beginners who want a simpler installation process. 
\n\n\n\n\nNote: At the moment, the VM option may not render all products in CAVE (ex. RGB composites of satellite imagery)\n\n\n\n\nThe \ndirect installation method\n is recommended for those who have administrative privileges and a little bit of experience installing software.\n\n\nMethod 1: Linux Virtual Machine\n\uf0c1\n\n\nThis method is recommended for beginners, or those with less computer knowledge, as it is a very simple installation; however, at this time, some CAVE functionality may be missing (ex: rendering RGB satellite images).\n\n\nSystem Requirements\n\uf0c1\n\n\n\n\nVMWare Workstation Player\n must be installed (free software)\n\n\n\n\nDownload and Installation Instructions\n\uf0c1\n\n\n\n\nDownload the zipped file containing the virtual machine: \nunidata_cave.zip\n \n\n\nUnzip the folder by right-clicking and selecting \"Extract All\". All files will be extracted into a new folder.\n\n\nOpen VMWare Player and go to \nPlayer\n > \nFile...\n > \nOpen\n and locate the folder that was created from the downloaded zipped file. Select the file called \n\"CentOS 7 - Unidata CAVE\"\n.\n\n\nRun this new VM option. If it asks if it's been moved or copied, select \ncopied\n.\n\n\nThere will be a user in the Linux machine named \"awips\" and the password is \"awips\"\n\n\nThe root password is \"unidataAWIPS\" if ever needed\n\n\n\n\n\n\n\n\nOnce inside the VM, to run CAVE either:\n\n\n\n\nUse the desktop icon \n\n\nUse the terminal and type the command \ncave\n\n\nFind the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE\n\n\n\n\nMethod 2: Direct Windows Install\n\uf0c1\n\n\nThis method is recommended for personal use and requires administrative privileges. It should enable full CAVE capability, but it is a bit lengthy and may take about 20 minutes to complete. 
\n\n\nSystem Requirements\n\uf0c1\n\n\n\n\n64-bit Miniconda3 (4.8.2)\n\n\nPython3 (comes with Miniconda installation)\n\n\n64-bit Java JDK 1.8 (1.8_181)\n\n\n64-bit Visual C++ Build Tools 2015 Update 3 (14.1)\n\n\nNumpy (1.15.1)\n\n\nJep (3.8.2)\n\n\nUser Variable PATH must have miniconda3 location\n\n\nUser Variables PYTHONHOME and PYTHONPATH must be defined\n\n\nSystem Variable JAVA_HOME must be defined",
"title": "Install CAVE"
},
{
"location": "/install/install-cave/#install-cave",
"text": "CAVE is the C ommon A WIPS V isualization E nvironment that is used for rendering and analyzing data for AWIPS. Unidata supports CAVE on three platforms: CentOS (Red Hat) Linux , Windows , and macOS . The installer may require administrator privileges to install and may require other system changes (environment variables, etc.) as well.",
"title": "Install CAVE"
},
{
"location": "/install/install-cave/#general-requirements",
"text": "Regardless of what Operating System CAVE is running on, these general requirements are recommended in order for CAVE to perform optimally: Java 1.8 OpenGL 2.0 Compatible Devices At least 4GB RAM At least 2GB Disk Space for Caching NVIDIA Graphics Card Latest NVIDIA Driver Note: While other graphics cards may work, an NVIDIA Quadro graphics card is recommended for full visualization capability",
"title": "General Requirements"
},
{
"location": "/install/install-cave/#linux",
"text": "",
"title": "Linux "
},
{
"location": "/install/install-cave/#system-requirements",
"text": "64 bit CentOS/Red Hat 7",
"title": "System Requirements"
},
{
"location": "/install/install-cave/#download-and-installation-instructions",
"text": "Download the following installer: awips_install.sh In a terminal, go to the download directory Make the installer an executable by running: chmod 755 awips_install.sh Run the installer: sudo ./awips_install.sh --cave This will install the application in /awips2/cave/ and set the local cache to ~/caveData/ To run CAVE either: Use the terminal and type the command cave Find the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE",
"title": "Download and Installation Instructions"
},
{
"location": "/install/install-cave/#windows",
"text": "For Windows, Unidata offers two installation options: a Linux Virtual Machine , or a Direct Windows Installation . Currently, the virtual machine (VM) is the recommended form of install for those who do not have administrative privileges on the machine, or beginners who want a simpler installation process. Note: At the moment, the VM option may not render all products in CAVE (ex. RGB composites of satellite imagery) The direct installation method is recommended for those who have administrative privileges and a little bit of experience installing software.",
"title": "Windows "
},
{
"location": "/install/install-cave/#method-1-linux-virtual-machine",
"text": "This method is recommended for beginners, or those with less computer knowledge, as it is a very simple installation; however, at this time, some CAVE functionality may be missing (ex: rendering RGB satellite images).",
"title": "Method 1: Linux Virtual Machine"
},
{
"location": "/install/install-cave/#system-requirements_1",
"text": "VMWare Workstation Player must be installed (free software)",
"title": "System Requirements"
},
{
"location": "/install/install-cave/#download-and-installation-instructions_1",
"text": "Download the zipped file containing the virtual machine: unidata_cave.zip Unzip the folder by right-clicking and selecting \"Extract All\". All files will be extracted into a new folder. Open VMWare Player and go to Player > File... > Open and locate the folder that was created from the downloaded zipped file. Select the file called \"CentOS 7 - Unidata CAVE\" . Run this new VM option. If it asks if it's been moved or copied, select copied . There will be a user in the Linux machine named \"awips\" and the password is \"awips\" The root password is \"unidataAWIPS\" if ever needed Once inside the VM, to run CAVE either: Use the desktop icon Use the terminal and type the command cave Find the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE",
"title": "Download and Installation Instructions"
},
{
"location": "/install/install-cave/#method-2-direct-windows-install",
"text": "This method is recommended for personal use and requires administrative privileges. It should enable full CAVE capability, but it is a bit lengthy and may take about 20 minutes to complete.",
"title": "Method 2: Direct Windows Install"
},
{
"location": "/install/install-cave/#system-requirements_2",
"text": "64-bit Miniconda3 (4.8.2) Python3 (comes with Miniconda installation) 64-bit Java JDK 1.8 (1.8_181) 64-bit Visual C++ Build Tools 2015 Update 3 (14.1) Numpy (1.15.1) Jep (3.8.2) User Variable PATH must have miniconda3 location User Variables PYTHONHOME and PYTHONPATH must be defined System Variable JAVA_HOME must be defined",
"title": "System Requirements"
},
{
"location": "/install/install-cave/#download-and-installation-instructions_2",
"text": "Download and install 64-bit Miniconda Python 4.8.2 for Windows Allow Miniconda3 to set PATH and other environment variables Register miniconda as the default python Download and install the 64-bit Java JDK 1.8_181 (this is necessary so Jep can install properly). Select Development Tools as the installation options Make note of where it installs on your computer (the default is C:\\Program Files\\Java) Set the environment variables: Access the Environment Variables window by typing \"env\" in the start bar, hitting enter, and clicking on the \"Environment Variables...\" button at the bottom of the \"System Properties\" window User Variables: PYTHONPATH and PYTHONHOME System Variable: JAVA_HOME Note: If PYTHONHOME is not set, the gridslice Python module will not be installed or available Download and install 64-bit Microsoft Visual Studio C++ Build Tools To access the page linked above you will need a Microsoft account Once at that webpage, search for \"build tools c+\" in order for the proper download to be returned Download 64-bit Visual C++ Build Tools 2015 Update 3 When running the installer, choose the Default Installation Install dependent Python packages Open a terminal by typing \"cmd\" into the start bar and hitting enter Run the following command: pip install numpy==1.15.1 jep==3.8.2 Download and install: awips-cave.msi In addition to the application directory, the MSI installer will attempt to copy the gridslice shared library to $PYTHONHOME/Dlls/ . If the $PYTHONHOME environment variable is not defined gridslice will not be installed. You can check to see if it was installed in the Dlls directory after you have completed steps 1-3. Note: CAVE will still run without gridslice, but certain bundles which use derived parameters, such as isentropic analysis , will not load. To run CAVE, either: Type \"cave\" in the start bar and hit enter Find and run CAVE app in the file browser: C:\\Program Files\\Unidata\\AWIPS CAVE\\cave.exe",
"title": "Download and Installation Instructions"
},
{
"location": "/install/install-cave/#macos",
"text": "",
"title": "macOS "
},
{
"location": "/install/install-cave/#system-requirements_3",
"text": "NVIDIA Graphics card is recommended; some Intel Graphics cards will work Note: Most AMD graphics cards are not supported",
"title": "System Requirements"
},
{
"location": "/install/install-cave/#download-and-installation-instructions_3",
"text": "Download and install both: awips-cave.dmg and awips-python.pkg This will install CAVE as an application and set the local cache to ~/Library/caveData Note: The awips-python.pkg is not necessarily required, and CAVE will still run without it, but any derived data such as barbs, arrows, and various grid products will not render without having jep installed (it is assumed to be in /Library/Python/2.7/site-packages/jep/) To run CAVE either: Use the System Menu Go > Applications > CAVE Type \u2318 + Spacebar and then type \"cave\", the application should appear and you can hit enter to run it",
"title": "Download and Installation Instructions"
},
{
"location": "/install/install-cave/#edex-connection",
"text": "Unidata and XSEDE Jetstream have partnered to offer an EDEX data server in the cloud, open to the Unidata university community. Select the server in the Connectivity Preferences dialog, or enter edex-cloud.unidata.ucar.edu (without http:// before, or :9581/services after).",
"title": "EDEX Connection"
},
{
"location": "/install/install-cave/#local-cache",
"text": "After connecting to an EDEX server, you will have a local directory named caveData which contains files synced from EDEX as well as a client-side cache for data and map resources. You can reset CAVE by removing the caveData directory and reconnecting to an EDEX server. Your local files have been removed, but if you are re-connecting to an EDEX server you have used before, the remote files will sync again to your local ~/caveData (bundles, colormaps, etc.). Linux: /home/<user>/caveData/ macOS: /Users/<user>/Library/caveData/ Windows: C:\\Users\\<user>\\caveData\\",
"title": "Local Cache"
},
{
"location": "/install/install-edex/",
"text": "Install EDEX \n\uf0c1\n\n\nEDEX is the \nE\nnvironmental \nD\nata \nEx\nchange system that represents the backend server for AWIPS. EDEX is only supported for Linux systems: CentOS and RHEL, and ideally, it should be on its own dedicated machine. It requires administrator privileges to make root-level changes. EDEX can run on a single machine or be spread across multiple machines. To learn more, see \nDistributed EDEX, Installing Across Multiple Machines\n\n\n\n\nSystem requirements\n\uf0c1\n\n\n\n\n64-bit CentOS/RHEL 7\n\n\n16+ CPU cores (each CPU core can run a decoder in parallel)\n\n\n24GB RAM\n\n\n700GB+ Disk Space\n\n\ngcc-c++ package\n\n\nRun \nrpm -qa | grep gcc-c++\n to verify if the package is installed\n\n\nIf it is not installed, run \nyum install gcc-c++\n to install the package\n\n\n\n\n\n\nA \nSolid State Drive (SSD)\n is recommended\n\n\nAn SSD should be mounted either to \n/awips2\n (to contain the entire EDEX system) or to \n/awips2/edex/data/hdf5\n (to contain the large files in the decoded data store). EDEX can scale to any system by adjusting the incoming LDM data feeds or adjusting the resources (CPU threads) allocated to each data type.\n\n\n\n\n\n\n\n\n\n\nNote: EDEX is only supported for 64-bit CentOS and RHEL 7 Operating Systems. You may have luck with Fedora Core 12 to 14 and Scientific Linux.\n\n\n\n\n\n\nEDEX is \nnot\n supported in Debian, Ubuntu, SUSE, Solaris, macOS, or Windows.\n\n\n\n\n\n\nDownload and Installation Instructions\n\uf0c1\n\n\nAll of these commands should be run as \nroot\n\n\n1. Create AWIPS User\n\uf0c1\n\n\nCreate user awips and group fxalpha\n\n\ngroupadd fxalpha && useradd -G fxalpha awips\n\n\n\nor if the awips account already exists:\n\n\ngroupadd fxalpha && usermod -G fxalpha awips\n\n\n\n2. 
Install EDEX\n\uf0c1\n\n\nDownload and run the installer: \nawips_install.sh\n \n\n\nwget https://www.unidata.ucar.edu/software/awips2/awips_install.sh\nchmod 755 awips_install.sh\nsudo ./awips_install.sh --edex\n\n\n\n\n\nawips_install.sh --edex\n will perform the following steps (it's always a good idea to review downloaded shell scripts):\n\n\n\n\nSaves the appropriate Yum repo file to \n/etc/yum.repos.d/awips2.repo\n\n\nIncreases process and file limits for the \nawips\n account in \n/etc/security/limits.conf\n\n\nCreates \n/awips2/data_store\n if it does not exist already\n\n\nRuns \nyum groupinstall awips2-server\n\n\nAttempts to configure the EDEX hostname defined in \n/awips2/edex/bin/setup.env\n\n\nAlerts the user if the \nawips\n account does not exist (the RPMs will still install)\n\n\n\n\n\n\n3. EDEX Setup\n\uf0c1\n\n\nChange user and run edex setup:\n\n\nsudo su - awips\nsudo edex setup\n\n\n\nThe command \nedex setup\n will try to determine your fully-qualified domain name and set it in \n/awips2/edex/bin/setup.env\n. EDEX Server Administrators should double-check that the addresses and names defined in setup.env are resolvable from both inside and outside the server, and make appropriate edits to \n/etc/hosts\n if necessary.\n\n\nSetup Example\n\uf0c1\n\n\nFor example, in the XSEDE Jetstream cloud, the fully-qualified domain name defined in \n/awips2/edex/bin/setup.env\n\n\nexport EXT_ADDR=js-196-132.jetstream-cloud.org\nexport DB_ADDR=localhost\nexport DB_PORT=5432\nexport BROKER_ADDR=localhost\nexport PYPIES_SERVER=http://${EXT_ADDR}:9582\n\n\n\nThe external address needs to resolve to localhost in \n/etc/hosts\n\n\n127.0.0.1 localhost localhost.localdomain js-196-132.jetstream-cloud.org\n\n\n\n4. 
Configure iptables\n\uf0c1\n\n\nConfigure iptables to allow TCP connections on ports 9581 and 9582 if you want to serve data to CAVE clients and the Python API.\n\n\nOpen Port 9588\n\uf0c1\n\n\nIf you are running a Registry (Data Delivery) server, you will also want to open port \n9588\n.\n\n\nTo open ports to all connections\n\uf0c1\n\n\nvi /etc/sysconfig/iptables\n\n*filter\n:INPUT ACCEPT [0:0]\n:FORWARD ACCEPT [0:0]\n:OUTPUT ACCEPT [0:0]\n-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT\n-A INPUT -p icmp -j ACCEPT",
"title": "Install EDEX"
},
{
"location": "/install/install-edex/#install-edex",
"text": "EDEX is the E nvironmental D ata Ex change system that represents the backend server for AWIPS. EDEX is only supported for Linux systems: CentOS and RHEL, and ideally, it should be on its own dedicated machine. It requires administrator privileges to make root-level changes. EDEX can run on a single machine or be spread across multiple machines. To learn more, see Distributed EDEX, Installing Across Multiple Machines",
"title": "Install EDEX "
},
{
"location": "/install/install-edex/#system-requirements",
"text": "64-bit CentOS/RHEL 7 16+ CPU cores (each CPU core can run a decoder in parallel) 24GB RAM 700GB+ Disk Space gcc-c++ package Run rpm -qa | grep gcc-c++ to verify if the package is installed If it is not installed, run yum install gcc-c++ to install the package A Solid State Drive (SSD) is recommended An SSD should be mounted either to /awips2 (to contain the entire EDEX system) or to /awips2/edex/data/hdf5 (to contain the large files in the decoded data store). EDEX can scale to any system by adjusting the incoming LDM data feeds or adjusting the resources (CPU threads) allocated to each data type. Note: EDEX is only supported for 64-bit CentOS and RHEL 7 Operating Systems. You may have luck with Fedora Core 12 to 14 and Scientific Linux. EDEX is not supported in Debian, Ubuntu, SUSE, Solaris, macOS, or Windows.",
"title": "System requirements"
},
{
"location": "/install/install-edex/#download-and-installation-instructions",
"text": "All of these commands should be run as root",
"title": "Download and Installation Instructions"
},
{
"location": "/install/install-edex/#1-create-awips-user",
"text": "Create user awips and group fxalpha groupadd fxalpha && useradd -G fxalpha awips or if the awips account already exists: groupadd fxalpha && usermod -G fxalpha awips",
"title": "1. Create AWIPS User"
},
{
"location": "/install/install-edex/#2-install-edex",
"text": "Download and run the installer: awips_install.sh wget https://www.unidata.ucar.edu/software/awips2/awips_install.sh\nchmod 755 awips_install.sh\nsudo ./awips_install.sh --edex awips_install.sh --edex will perform the following steps (it's always a good idea to review downloaded shell scripts): Saves the appropriate Yum repo file to /etc/yum.repos.d/awips2.repo Increases process and file limits for the awips account in /etc/security/limits.conf Creates /awips2/data_store if it does not exist already Runs yum groupinstall awips2-server Attempts to configure the EDEX hostname defined in /awips2/edex/bin/setup.env Alerts the user if the awips account does not exist (the RPMs will still install)",
"title": "2. Install EDEX"
},
{
"location": "/install/install-edex/#3-edex-setup",
"text": "Change user and run edex setup: sudo su - awips\nsudo edex setup The command edex setup will try to determine your fully-qualified domain name and set it in /awips2/edex/bin/setup.env . EDEX Server Administrators should double-check that the addresses and names defined in setup.env are resolvable from both inside and outside the server, and make appropriate edits to /etc/hosts if necessary.",
"title": "3. EDEX Setup"
},
{
"location": "/install/install-edex/#setup-example",
"text": "For example, in the XSEDE Jetstream cloud, the fully-qualified domain name defined in /awips2/edex/bin/setup.env export EXT_ADDR=js-196-132.jetstream-cloud.org\nexport DB_ADDR=localhost\nexport DB_PORT=5432\nexport BROKER_ADDR=localhost\nexport PYPIES_SERVER=http://${EXT_ADDR}:9582 The external address needs to direct to localhost in /etc/hosts 127.0.0.1 localhost localhost.localdomain js-196-132.jetstream-cloud.org",
"title": "Setup Example"
},
{
"location": "/install/install-edex/#4-configure-iptables",
"text": "Configure iptables to allow TCP connections on ports 9581 and 9582 if you want to serve data to CAVE clients and the Python API.",
"title": "4. Configure iptables"
},
{
"location": "/install/install-edex/#open-port-9588",
"text": "If you are running a Registry (Data Delivery) server, you will also want to open port 9588 .",
"title": "Open Port 9588"
},
{
"location": "/install/install-edex/#to-open-ports-to-all-connections",
"text": "vi /etc/sysconfig/iptables\n\n*filter\n:INPUT ACCEPT [0:0]\n:FORWARD ACCEPT [0:0]\n:OUTPUT ACCEPT [0:0]\n-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT\n-A INPUT -p icmp -j ACCEPT\n-A INPUT -i lo -j ACCEPT\n-A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT\n-A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT\n-A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT\n#-A INPUT -m state --state NEW -m tcp -p tcp --dport 9588 -j ACCEPT # for registry/dd\n-A INPUT -j REJECT --reject-with icmp-host-prohibited\n-A FORWARD -j REJECT --reject-with icmp-host-prohibited\nCOMMIT",
"title": "To open ports to all connections"
},
{
"location": "/install/install-edex/#to-open-ports-to-specific-ip-addresses",
"text": "In this example, the IP range 128.117.140.0/24 will match all 128.117.140.* addresses, while 128.117.156.0/24 will match 128.117.156.*. vi /etc/sysconfig/iptables\n\n*filter\n:INPUT DROP [0:0]\n:FORWARD DROP [0:0]\n:OUTPUT ACCEPT [0:0]\n:EXTERNAL - [0:0]\n:EDEX - [0:0]\n-A INPUT -i lo -j ACCEPT\n-A INPUT -p icmp --icmp-type any -j ACCEPT\n-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT\n-A INPUT -s 128.117.140.0/24 -j EDEX\n-A INPUT -s 128.117.156.0/24 -j EDEX\n-A INPUT -j EXTERNAL\n-A EXTERNAL -j REJECT\n-A EDEX -m state --state NEW -p tcp --dport 22 -j ACCEPT\n-A EDEX -m state --state NEW -p tcp --dport 9581 -j ACCEPT\n-A EDEX -m state --state NEW -p tcp --dport 9582 -j ACCEPT\n#-A EDEX -m state --state NEW -p tcp --dport 9588 -j ACCEPT # for registry/dd\n-A EDEX -j REJECT\nCOMMIT",
"title": "To open ports to specific IP addresses"
},
{
"location": "/install/install-edex/#restart-iptables",
"text": "service iptables restart",
"title": "Restart iptables"
},
{
"location": "/install/install-edex/#troubleshooting",
"text": "For CentOS 7 error: Redirecting to /bin/systemctl restart iptables.service \nFailed to restart iptables.service: Unit iptables.service failed to load: No such >file or directory. The solution is: yum install iptables-services\nsystemctl enable iptables\nservice iptables restart",
"title": "Troubleshooting"
},
{
"location": "/install/install-edex/#5-start-edex",
"text": "edex start To manually start, stop, and restart: service edex_postgres start\nservice httpd-pypies start\nservice qpidd start\nservice edex_camel start The fifth service, edex_ldm , does not run at boot to prevent filling up disk space if EDEX is not running. ldmadmin start To start all services except the LDM (good for troubleshooting): edex start base To restart EDEX edex restart",
"title": "5. Start EDEX"
},
{
"location": "/install/install-edex/#additional-steps",
"text": "",
"title": "Additional Steps"
},
{
"location": "/install/install-edex/#increase-process-limit",
"text": "/etc/security/limits.conf defines the number of user processes and files (this step is automatically performed by ./awips_install.sh --edex ). Without these definitions, Qpid is known to crash during periods of high ingest. awips soft nproc 65536\nawips soft nofile 65536",
"title": "Increase Process Limit"
},
{
"location": "/install/install-edex/#ensure-selinux-is-disabled",
"text": "This step is no longer necessary with version LDM-6.13 or higher. The version shipped with Unidata's EDEX is higher than this cutoff. vi /etc/sysconfig/selinux\n\n# This file controls the state of SELinux on the system.\n# SELINUX= can take one of these three values:\n# enforcing - SELinux security policy is enforced.\n# permissive - SELinux prints warnings instead of enforcing.\n# disabled - No SELinux policy is loaded.\nSELINUX=disabled\n# SELINUXTYPE= can take one of these two values:\n# targeted - Targeted processes are protected,\n# mls - Multi Level Security protection.\nSELINUXTYPE=targeted Read more about selinux at redhat.com",
"title": "Ensure SELinux is Disabled"
},
{
"location": "/install/install-edex/#ssd-mount",
"text": "Though a Solid State Drive is not required, it is strongly encouraged in order to handle the amount of disk IO for real-time IDD feeds. The simplest configuration would be to mount an 500GB+ SSD to /awips2 to contain both the installed software (approx. 20GB) and the real-time data (approx. 150GB per day). The default purge rules are configured such that the processed data in /awips2 does not exceed 450GB. The raw data is located in /awips2/data_store , and is scoured every hour and should not exceed 50GB. If you want to increase EDEX data retention you should mount a large disk to /awips2/edex/data/hdf5 since this will be where the archived processed data exists, and any case studies created. Filesystem Size Used Avail Use% Mounted on\n/dev/sda1 30G 2.5G 26G 9% /\ntmpfs 28G 0 28G 0% /dev/shm\n/dev/sdc1 788G 81G 667G 11% /awips2\n/dev/sdb1 788G 41G 708G 10% /awips2/edex/data/hdf5",
"title": "SSD Mount"
},
{
"location": "/install/install-edex/#configure-ldm-feeds",
"text": "EDEX installs its own version of the LDM to the directory /awips2/ldm . As with a the default LDM configuration, two files are used to control what IDD feeds are ingested:",
"title": "Configure LDM Feeds"
},
{
"location": "/install/install-edex/#configuration-file-awips2ldmetcldmdconf",
"text": "This file specifies an upstream LDM server to request data from, and what feeds to request: REQUEST NEXRAD3 \"./p(DHR|DPR|DSP|DTA|DAA|DVL|EET|HHC|N0Q|N0S|N0U|OHA|NVW|NTV|NST).\" idd.unidata.ucar.edu\nREQUEST FNEXRAD|IDS|DDPLUS|UNIWISC \".*\" idd.unidata.ucar.edu\nREQUEST NGRID \".*\" idd.unidata.ucar.edu\nREQUEST NOTHER \"^TIP... KNES.*\" idd.unidata.ucar.edu Read more about ldmd.conf in the LDM User Manual",
"title": "Configuration file: /awips2/ldm/etc/ldmd.conf"
},
{
"location": "/install/install-edex/#configuration-file-awips2ldmetcpqactconf",
"text": "This file specifies the WMO headers and file pattern actions to request: # Redbook graphics\nANY ^([PQ][A-Z0-9]{3,5}) (....) (..)(..)(..) !redbook [^/]*/([^/]*)/([^/]*)/([^/]*)/([0-9]{8})\n FILE -overwrite -close -edex /awips2/data_store/redbook/\\8/\\4\\5Z_\\8_\\7_\\6-\\1_\\2_(seq).rb.%Y%m%d%H\n# NOAAPORT GINI images\nNIMAGE ^(sat[^/]*)/ch[0-9]/([^/]*)/([^/]*)/([^ ]*) ([^/]*)/([^/]*)/([^/]*)/ (T[^ ]*) ([^ ]*) (..)(..)(..)\n FILE -overwrite -close -edex /awips2/data_store/sat/\\(11)\\(12)Z_\\3_\\7_\\6-\\8_\\9_(seq).satz.%Y%m%d%H Read more about pqact.conf in the LDM User Manual See available AWIPS LDM feeds",
"title": "Configuration File: /awips2/ldm/etc/pqact.conf"
},
{
"location": "/install/install-edex/#directories-to-know",
"text": "/awips2 - Contains all of the installed AWIPS software. /awips2/edex/logs - EDEX logs. /awips2/httpd_pypies/var/log/httpd - httpd-pypies logs. /awips2/database/data/pg_log - PostgreSQL logs. /awips2/qpid/log - Qpid logs. /awips2/edex/data/hdf5 - HDF5 data store. /awips2/edex/data/utility - Localization store and configuration files. /awips2/ldm/etc - Location of ldmd.conf and pqact.conf /awips2/ldm/logs - LDM logs. /awips2/data_store - Raw data store. /awips2/data_store/ingest - Manual data ingest endpoint.",
"title": "Directories to Know"
},
{
"location": "/install/install-edex/#what-version-is-my-edex",
"text": "rpm -qa | grep awips2-edex",
"title": "What Version is my EDEX?"
},
{
"location": "/install/common-problems/",
"text": "Common Problems\n\uf0c1\n\n\nWindows CAVE Start Up Error\n\uf0c1\n\n\nOne common error some users are seeing manifests itself just after selecting an EDEX server to connect to. The following error dialogs may show up:\n\n\n\n\n\n\nThese errors are actually happening because the Windows machine is using IPv6, which is not compatible with AWIPS at this time.\n\n\nTo fix the issue simply follow these steps:\n\n\n\n\nNote\n: These screenshots may vary from your system.\n\n\n\n\n1. Close all error windows and any open windows associated with CAVE.\n\n\n2. In the Windows 10 search field, search for \"control panel\".\n\n\n\n\n3. Once in the Control Panel, look for \"Network and Sharing Center\".\n\n\n\n\n\n\n4. Select the adapter for your current connection (should be either \"Ethernet\" or \"Wi-Fi\").\n\n\n\n\n5. Click on \"Properties\".\n\n\n\n\n6. Uncheck \"Internet Protocol Version 6 (TCP/IPv6)\" and select OK.\n\n\n\n\n7. Restart CAVE.",
"title": "Common Problems"
},
{
"location": "/install/common-problems/#common-problems",
"text": "",
"title": "Common Problems"
},
{
"location": "/install/common-problems/#windows-cave-start-up-error",
"text": "One common error some users are seeing manifests itself just after selecting an EDEX server to connect to. The following error dialogs may show up: These errors are actually happening because the Windows machine is using IPv6, which is not compatible with AWIPS at this time. To fix the issue simply follow these steps: Note : These screenshots may vary from your system. 1. Close all error windows and any open windows associated with CAVE. 2. In the Windows 10 search field, search for \"control panel\". 3. Once in the Control Panel, look for \"Network and Sharing Center\". 4. Select the adapter for your current connection (should be either \"Ethernet\" or \"Wi-Fi\"). 5. Click on \"Properties\". 6. Uncheck \"Internet Protocol Version 6 (TCP/IPv6)\" and select OK. 7. Restart CAVE.",
"title": "Windows CAVE Start Up Error"
},
{
"location": "/cave/d2d-perspective/",
"text": "D2D Perspective\n\uf0c1\n\n\nD2D (Display 2-Dimensions) is the default AWIPS CAVE perspective, designed to mimmic the look and feel of the legacy AWIPS I system.\n\n\n\n\nSystem menus include \nCAVE\n, \nFile\n, \nView\n, \nOptions\n, and \nTools\n.\n\n\nData menus include \nModels\n, \nSurface\n, \nNCEP/Hydro\n, \nUpper Air\n, \nSatellite\n, \nLocal Radar Stations\n, \nRadar\n, \nMRMS\n, and \nMaps\n.\n\n\nMap projection, image properties, frame control, and a few featured applications (\nWarngen\n, \nNsharp\n, and \nBrowser\n) make up the the primary D2D toolbar.\n\n\n\n\nNote\n: Depending on which Operating System version of CAVE there may be other application options (\nPGEN\n, \nGEMPAK\n).\n\n\n\n\n\n\nResource Stack\n\uf0c1\n\n\nAt bottom-right of the map window the the Resource Stack, which displays all loaded resources and map overlays, and allows for interaction and customization with the resource via a \nright-click menu\n.\n\n\nThere are three available views of the Resource Stack, the default will show all Product Resources. The other two views are the Simple view, which shows the time, and the Map Resources. To switch between views see the \nRight-Click Functionality\n.\n\n\nIt's important to understand that Product Resources and Map Resources are handled differently given the time-based nature of Products, compared to the static nature of maps. Selecting the \nClear\n button will remove all Products but not remove any Map Products.\n\n\nLeft-Click Resource Name to Hide\n\uf0c1\n\n\nA left click on any resource in the stack will hide the resource and turn the label gray. Clicking the name again makes the resource visible.\n\n\n\n\nRight-Click Background to Cycle Resource Views\n\uf0c1\n\n\nThe default display in the resource stack is the Product Resources. Right Click the mouse on the map background (anywhere but on the stack itself) to switch to a Simple View, which just shows the current displayed time if product data is loaded. 
Right Click again to show all Map Resources. Right Click again to switch back to Product Resources.\n\n\nHold-Right-Click Resource Name for Menu\n\uf0c1\n\n\nDrag the mouse over a loaded resource and \nhold\n the right mouse button until a menu appears (simply clicking the resource with the right mouse button will toggle its visibility).\n\n\nThe hold-right-click menu allows you to control individual resource \nImage Properties\n, \nChange Colormaps\n, change resource color, width, density, and magnification, \nmove resources up and down\n in the stack, as well as configure custom options with other interactive resources.\n\n\n\n\n\n\nDisplay Menu\n\uf0c1\n\n\nThe display menu has many options which can alter the functionality in CAVE.\n\n\nHold-Right-Click Background for Display Menu\n\uf0c1\n\n\nHolding down the right mouse button anywhere in the map view will open a right-click menu\n\n\n\n\nShow Map Legends\n\uf0c1\n\n\nFrom the above menu select \nShow Map Legends\n and watch the Resource Stack show only map resources which are loaded to the view.\n\n\n\n\nToggle 2 or 4-Panel Layout\n\uf0c1\n\n\nRight-click hold in the view and select \nTwo Panel Layout\n or \nFour Panel Layout\n to create duplicates of the current view (note that any data loaded to the \nview\n will be loaded to \nboth displays within the view\n).\n\n\nFrom this multi-pane display, hold-right-click again and you will see the \nSingle Panel Layout\n option to switch back to a standard view (defaulting to the left of two, and top-left of four).\n\n\nSample Loaded Resources\n\uf0c1\n\n\nMost data types have a right-click menu option for reading out the pixel value, displayed as multi-line text for multiple resources. This can be toggled on and off by selecting the \nSample\n option in the Display Menu.\n\n\n\n\n\n\nProduct Browser\n\uf0c1\n\n\nThe Product Browser allows users to browse a complete data inventory in a side window, organized by data type. 
To open the Product Browser, either select the icon in the toolbar (\n), or go to the menu: \nCAVE\n > \nData Browsers\n > \nProduc
"title": "D2D Perspective"
},
{
"location": "/cave/d2d-perspective/#d2d-perspective",
"text": "D2D (Display 2-Dimensions) is the default AWIPS CAVE perspective, designed to mimmic the look and feel of the legacy AWIPS I system. System menus include CAVE , File , View , Options , and Tools . Data menus include Models , Surface , NCEP/Hydro , Upper Air , Satellite , Local Radar Stations , Radar , MRMS , and Maps . Map projection, image properties, frame control, and a few featured applications ( Warngen , Nsharp , and Browser ) make up the the primary D2D toolbar. Note : Depending on which Operating System version of CAVE there may be other application options ( PGEN , GEMPAK ).",
"title": "D2D Perspective"
},
{
"location": "/cave/d2d-perspective/#resource-stack",
"text": "At bottom-right of the map window the the Resource Stack, which displays all loaded resources and map overlays, and allows for interaction and customization with the resource via a right-click menu . There are three available views of the Resource Stack, the default will show all Product Resources. The other two views are the Simple view, which shows the time, and the Map Resources. To switch between views see the Right-Click Functionality . It's important to understand that Product Resources and Map Resources are handled differently given the time-based nature of Products, compared to the static nature of maps. Selecting the Clear button will remove all Products but not remove any Map Products.",
"title": "Resource Stack"
},
{
"location": "/cave/d2d-perspective/#left-click-resource-name-to-hide",
"text": "A left click on any resource in the stack will hide the resource and turn the label gray. Clicking the name again makes the resource visible.",
"title": "Left-Click Resource Name to Hide"
},
{
"location": "/cave/d2d-perspective/#right-click-background-to-cycle-resource-views",
"text": "The default display in the resource stack is the Product Resources. Right Click the mouse on the map background (anywhere but on the stack itself) to switch to a Simple View, which just shows the current displayed time if product data is loaded. Right Click again to show all Map Resources. Right Click again to switch back to Product Resources.",
"title": "Right-Click Background to Cycle Resource Views"
},
{
"location": "/cave/d2d-perspective/#hold-right-click-resource-name-for-menu",
"text": "Drag the mouse over a loaded resource and hold the right mouse button until a menu appears (simply clicking the resource with the right mouse button will toggle its visibility). The hold-right-click menu allows you to control individual resource Image Properties , Change Colormaps , change resource color, width, density, and magnification, move resources up and down in the stack, as well as configure custom options with other interactive resources.",
"title": "Hold-Right-Click Resource Name for Menu"
},
{
"location": "/cave/d2d-perspective/#display-menu",
"text": "The display menu has many options which can alter the functionality in CAVE.",
"title": "Display Menu"
},
{
"location": "/cave/d2d-perspective/#hold-right-click-background-for-display-menu",
"text": "Holding down the right mouse button anywhere in the map view will open a right-click menu",
"title": "Hold-Right-Click Background for Display Menu"
},
{
"location": "/cave/d2d-perspective/#show-map-legends",
"text": "From the above menu select Show Map Legends and watch the Resource Stack show only map resources which are loaded to the view.",
"title": "Show Map Legends"
},
{
"location": "/cave/d2d-perspective/#toggle-2-or-4-panel-layout",
"text": "Right-click hold in the view and select Two Panel Layout or Four Panel Layout to create duplicates of the current view (note that any data loaded to the view will be loaded to both displays within the view ). From this multi-pane display, hold-right-click again and you will see the Single Panel Layout option to switch back to a standard view (defaulting to the left of two, and top-left of four).",
"title": "Toggle 2 or 4-Panel Layout"
},
{
"location": "/cave/d2d-perspective/#sample-loaded-resources",
"text": "Most data types have a right-click menu option for reading out the pixel value, displayed as multi-line text for multiple resources. This can be toggled on and off by selecting the Sample option in the Display Menu.",
"title": "Sample Loaded Resources"
},
{
"location": "/cave/d2d-perspective/#product-browser",
"text": "The Product Browser allows users to browse a complete data inventory in a side window, organized by data type. To open the Product Browser, either select the icon in the toolbar ( ), or go to the menu: CAVE > Data Broswers > Product Broswers . Selections for Grid , Lightning , Maps , Radar , Redbook , and Satellite are available. All products loaded with the Product Browser are given default settings. Note : The Linux and Mac version also have a selection for GFE available.",
"title": "Product Browser"
},
{
"location": "/cave/d2d-perspective/#options-menu",
"text": "There are several toggle options and options dialogs that are available under the Options menu found at the top of the application.",
"title": "Options Menu"
},
{
"location": "/cave/d2d-perspective/#time-options-ctrl-t",
"text": "This check button enables/disables the ability to select the time interval between frames of real-time or model data. This feature has the added benefit of allowing you to view extended amounts of data (temporally) but stay within the limits of 64 frames. For example, METAR surface plots, which typically display every hour, can be set to display every three hours via the Select Valid Time and Time Resolution Dialog Box. When the Time Options check button is selected, the next product you choose to display in the Main Display Pane launches either the Select Valid Time and Time Resolution dialog box or the Select\nOffset and Tolerance dialog box. When you are loading data to an empty display and the Time Options check button is enabled, the Select Valid Time and Time Resolution dialog box opens. Valid Time: In this column of dates/times, you may choose the one that will be the first frame loaded onto the Large Display Pane. The Default option is the most recent data. Time Resolution: This column contains various time increments in which the data can be displayed. Once you make a selection, the Valid Time Column indents the exact times that will\nbe displayed. The Default resolution displays the most recent frames available. With the Time Options check button enabled for a display that already contains data, when you choose the data to be overlaid in the Main Display Pane, the Select Offset and Tolerance dialog\nbox appears, providing the following options: Offset : This column contains various time increments at intervals before, at, or after the time you selected for the first product that is displayed in the Main Display Pane. Tolerance : The options in this column refer to how strict the time matching is. \"None\" means an exact match, while \"Infinite\" will put the closest match in each frame, regardless of how\nfar off it is.",
"title": "Time Options (Ctrl + T)"
},
{
"location": "/cave/d2d-perspective/#image-combination-insert",
"text": "This check button enables/disables the ability to display two images at once. Combined-image displays have been improved by removing the valid time for non-forecast products and removing the date string (time is kept) from the left side of the legend. In particular, this makes All-Tilts radar legends more usable.",
"title": "Image Combination (Insert)"
},
{
"location": "/cave/d2d-perspective/#display-properties",
"text": "This menu option opens the Display Properties dialog box. All the options available in this dialog box are also available on the Toolbar.",
"title": "Display Properties"
},
{
"location": "/cave/d2d-perspective/#loop-properties-ctrl-l",
"text": "Loop Properties is another dialog box that can be opened from the Options menu or from the Loop Properties iconified button on the D2D Toolbar, or by using the Ctrl + L keyboard shortcut. The dialog allows you to adjust the forward and backward speeds, with 0 = off and 10 = maximum speed. You can set the duration of the first and last frame dwell times to between zero and 2.5 seconds. You can turn looping on or off by checking the Looping check button. There is also a Looping button located on the Toolbar that enables/disables the animation in the large display pane. Finally, you can turn looping on and increase/decrease forward speed by pressing Page Up/Page Down on your keyboard, and turn looping off with the Left or Right Arrow keys. On the toolbar, you can use the button to start/stop looping.",
"title": "Loop Properties (Ctrl + L)"
},
{
"location": "/cave/d2d-perspective/#image-properties-ctrl-i",
"text": "The Image Properties dialog box can be opened here (in the Options menu) or by using the Image Properties iconified button on the D2D Toolbar ( ), or using using the Ctrl + I keyboard shortcut. This dialog box provides options that allow you to change the color table; adjust the brightness, contrast, and alpha of either a single image or combined images; fade between combined images; and/or interpolate the displayed data.",
"title": "Image Properties (Ctrl + I)"
},
{
"location": "/cave/d2d-perspective/#set-time",
"text": "This option allows you to set the CAVE clock, located on the bottom of the screen, to an earlier time for reviewing archived data.",
"title": "Set Time"
},
{
"location": "/cave/d2d-perspective/#set-background-color",
"text": "You can now set the background display color on your workstation. You can also set the background display color for a single pane via mouse Button 3 (B3).",
"title": "Set Background Color"
},
{
"location": "/cave/d2d-perspective/#switching-perspectives",
"text": "Switching perspectives in CAVE can be found in the CAVE > Perspective menu. D2D is one of many available CAVE perspectives. By selecting the CAVE > Perspective menu you can switch into the GFE , or Localization perspective. Note : The National Centers Perspective (which is available in the Other... submenu) is available on the Linux version of CAVE. And the GFE perspective is not available on the Windows version.",
"title": "Switching Perspectives"
},
{
"location": "/cave/d2d-perspective/#cave-preferences",
"text": "Preferences and settings for the CAVE client can be found in the CAVE > Preferences menu. Set the Localization Site and server for the workstation; configure mouse operations, change performance levels, font magnification, and text workstation hostname.",
"title": "CAVE Preferences"
},
{
"location": "/cave/d2d-perspective/#load-mode",
"text": "Within the Display Properties dialog is the Load Mode option, which provides different ways to display data by manipulating previous model runs and inventories of data sets. The selected load mode is shown on the toolbar when the Load Mode menu is closed, and can also be changed by using this toolbar option as well. A description of the Load Mode options follow. Latest : Displays forecast data only from the latest model run, but also backfills at the beginning of the loop with available frames from previous runs to satisfy the requested number of\nframes. Valid time seq : Displays the most recent data and fills empty frames with previous data. For models, it provides the product from the latest possible run for every available valid time. No Backfill : Displays model data only from the most recent model run time with no backfilling to fill out a loop. Using this Load Mode prevents the mixing of old and new data. Previous run : Displays the previous model run, backfilling with frames from previous runs at the beginning of the loop to satisfy the requested number of frames. Prev valid time seq : Displays the previous model run and fills empty frames with previous model data or analyses. Prognosis loop : Shows a sequence of n-hour forecasts from successive model runs. Analysis loop : Loads a sequence of model analyses but no forecasts. dProg/dt : Selects forecasts from different model runs that all have the same valid times. This load mode is available only when there are no other products loaded in the large display\npane. Forced : Puts the latest version of a selected product in all frames without time-matching. Forecast match : Overlays a model product only when its forecast times match those of an initially loaded product. This load mode is available only when another product is already\nloaded in the large display pane. 
Inventory : Selecting a product when the load mode is set to Inventory brings up a Dialog Box with the available forecast and inventory times from which you can select the product you\nwant. Inventory loads into the currently displayed frame. Slot : Puts the latest version of a selected product in the currently displayed frame.",
"title": "Load Mode"
},
{
"location": "/cave/maps-views-projections/",
"text": "Maps, Views, Projections\n\uf0c1\n\n\nDefault Map Scales\n\uf0c1\n\n\nThe first toolbar menu item is a dropdown menu for different geographic areas and map projections. The default view is always \nCONUS\n, which is a North Polar Steregraphic projection centered on the Continental United States.\n\n\nDefault projections and areas available in the menu\n\n\n\n\nCONUS\n\n\nN. Hemisphere\n (North Polar Stereographic)\n\n\nRegional\n (for the selected localization site)\n\n\nWFO\n (for the selected localization site)\n\n\nWorld - Mercator\n\n\nWorld - CED\n\n\nWorld - Mollweide\n\n\nGOES East Full Disk\n (Geostationary)\n\n\nGOES West Full Disk\n (Geostationary)\n\n\nRegional\n Mercator projections for\n\n\nAfrica\n\n\nAlaska\n\n\nAntarctica\n\n\nArctic\n\n\nAustralia,New Zealand\n\n\nEurope\n\n\nHawaii\n\n\nJapan\n\n\nPacific Ocean\n\n\nPuerto Rico\n\n\nSouth America\n\n\n\n\n\n\nWFO\n (Has a submenu which contains a map scale for every NWS localization site)\n\n\n\n\n\n\n\n\nNew Map Editor / View\n\uf0c1\n\n\nFile > New Map\n\n\nOpens a new map editor tab with the default projection (CONUS Polar Stereographic).\n\n\n\n\nThis can also be done by \nright-click\n on any tab and selecting \nNew Editor\n\n\n\n\n\n\nNew Projection\n\uf0c1\n\n\nFile > New Projection\n\n\nCreate a new map projection.",
"title": "Maps, Views, Projections"
},
{
"location": "/cave/maps-views-projections/#maps-views-projections",
"text": "",
"title": "Maps, Views, Projections"
},
{
"location": "/cave/maps-views-projections/#default-map-scales",
"text": "The first toolbar menu item is a dropdown menu for different geographic areas and map projections. The default view is always CONUS , which is a North Polar Steregraphic projection centered on the Continental United States. Default projections and areas available in the menu CONUS N. Hemisphere (North Polar Stereographic) Regional (for the selected localization site) WFO (for the selected localization site) World - Mercator World - CED World - Mollweide GOES East Full Disk (Geostationary) GOES West Full Disk (Geostationary) Regional Mercator projections for Africa Alaska Antarctica Arctic Australia,New Zealand Europe Hawaii Japan Pacific Ocean Puerto Rico South America WFO (Has a submenu which contains a map scale for every NWS localization site)",
"title": "Default Map Scales"
},
{
"location": "/cave/maps-views-projections/#new-map-editor-view",
"text": "File > New Map Opens a new map editor tab with the default projection (CONUS Polar Stereographic). This can also be done by right-click on any tab and selecting New Editor",
"title": "New Map Editor / View"
},
{
"location": "/cave/maps-views-projections/#new-projection",
"text": "File > New Projection Create a new map projection.",
"title": "New Projection"
},
{
"location": "/cave/bundles-and-procedures/",
"text": "Bundles and Procedures\n\uf0c1\n\n\nAWIPS contains two methods for saving and loading data resources: \nBundles\n are a simple way to save loaded resources to access in future CAVE sessions. \nProcedures\n are similar to Bundles, but can be thought of as \ngroups of bundles\n and allows the user to manage saved resources with more control.\n\n\nBundles\n\uf0c1\n\n\nFile > Load Display\n\uf0c1\n\n\nLoad a previously-saved bundle from within the AWIPS system. The Open Bundle dialog allows you to select your own saved bundles as well as those saved by other users.\n\n\nEach selected bundle will load its contents to new tabs which are named after the bundle file name (e.g. NAM_ThetaE).\n\n\n\n\nMost saved bundles will consist of a single Map Editor (tab), but with multiple tabs saved each will open again in its own Map Editor.\n\n\n\n\nFile > Save Display\n\uf0c1\n\n\nSave a product display within the AWIPS system, synching the bundle between CAVE and the EDEX server.\n\n\n\n\nFile > Manage Bundles\n\uf0c1\n\n\nSelect and remove a saved bundle under File > Manage Bundles, this will open the Delete Bundle dialog. 
Select the file name and click \nOK\n and then confirm deletion to remove the saved file permanently.\n\n\n\n\n\n\nLoad Bundle from Local Disk\n\uf0c1\n\n\nTo load a previously-saved display from a path within the file directory of the workstation, select \nFile > Load Display\n and then select the \nFile\n button on the right to browse your local directories.\n\n\n\n\n\n\nSave Bundle to Local Disk\n\uf0c1\n\n\nTo save a product display to a path within the file directory of the workstation, select \nFile > Save Display\n and then select the \nFile\n button on the right.\n\n\n\n\nProcedures\n\uf0c1\n\n\nNew Procedure\n\uf0c1\n\n\n\n\nSelect the menu \nFile > Procedures > New\n\n\nSelect \nCopy Into\n to add all loaded resources to the Procedure Stack\n\n\nSelect \nSave\n (or \nSave As\n) and then enter a name for the Procedure before clicking \nOK\n to save.\n\n\n\n\n\n\nOpen Procedure\n\uf0c1\n\n\nSimilar to creating a new Procedure, select \nFile > Procedures > Open\n, select the saved resources and click \nLoad\n to load them to CAVE.\n\n\nDelete Procedure\n\uf0c1\n\n\nFrom the menu \nFile > Procedures > Delete\n you can delete existing Procedure files in a way similar to deleting saved Bundle files.",
"title": "Bundles and Procedures"
},
{
"location": "/cave/bundles-and-procedures/#bundles-and-procedures",
"text": "AWIPS contains two methods for saving and loading data resources: Bundles are a simple way to save loaded resources to access in future CAVE sessions. Procedures are similar to Bundles, but can be thought of as groups of bundles and allow the user to manage saved resources with more control.",
"title": "Bundles and Procedures"
},
{
"location": "/cave/bundles-and-procedures/#bundles",
"text": "",
"title": "Bundles"
},
{
"location": "/cave/bundles-and-procedures/#file-load-display",
"text": "Load a previously-saved bundle from within the AWIPS system. The Open Bundle dialog allows you to select your own saved bundles as well as those saved by other users. Each selected bundle will load its contents to new tabs which are named after the bundle file name (e.g. NAM_ThetaE). Most saved bundles will consist of a single Map Editor (tab), but with multiple tabs saved each will open again in its own Map Editor.",
"title": "File &gt; Load Display"
},
{
"location": "/cave/bundles-and-procedures/#file-save-display",
"text": "Save a product display within the AWIPS system, syncing the bundle between CAVE and the EDEX server.",
"title": "File &gt; Save Display"
},
{
"location": "/cave/bundles-and-procedures/#file-manage-bundles",
"text": "Select and remove a saved bundle under File > Manage Bundles; this will open the Delete Bundle dialog. Select the file name and click OK and then confirm deletion to remove the saved file permanently.",
"title": "File &gt; Manage Bundles"
},
{
"location": "/cave/bundles-and-procedures/#load-bundle-from-local-disk",
"text": "To load a previously-saved display from a path within the file directory of the workstation, select File > Load Display and then select the File button on the right to browse your local directories.",
"title": "Load Bundle from Local Disk"
},
{
"location": "/cave/bundles-and-procedures/#save-bundle-to-local-disk",
"text": "To save a product display to a path within the file directory of the workstation, select File > Save Display and then select the File button on the right.",
"title": "Save Bundle to Local Disk"
},
{
"location": "/cave/bundles-and-procedures/#procedures",
"text": "",
"title": "Procedures"
},
{
"location": "/cave/bundles-and-procedures/#new-procedure",
"text": "Select the menu File > Procedures > New Select Copy Into to add all loaded resources to the Procedure Stack Select Save (or Save As ) and then enter a name for the Procedure before clicking OK to save.",
"title": "New Procedure"
},
{
"location": "/cave/bundles-and-procedures/#open-procedure",
"text": "Similar to creating a new Procedure, select File > Procedures > Open , select the saved resources and click Load to load them to CAVE.",
"title": "Open Procedure"
},
{
"location": "/cave/bundles-and-procedures/#delete-procedure",
"text": "From the menu File > Procedures > Delete you can delete existing Procedure files in a way similar to deleting saved Bundle files.",
"title": "Delete Procedure"
},
{
"location": "/cave/localization-perspective/",
"text": "Localization perspective\n\uf0c1\n\n\nLocalization Levels\n\uf0c1\n\n\nAWIPS uses a hierarchical system known as \nLocalization\n to configure many aspects of EDEX and CAVE, such as available menu items, color maps, and derived parameters. This system allows a user to override existing configurations and customize CAVE. For example, a \nUser\n-level\n localization file will supersede any similar file in a higher level (such as \nSite\n).\n\n\n\n\nThere are three \nlevels of localization\n, starting with the default \nBASE\n\n\n\n\nBASE\n - default\n\n\nSITE\n - 3-letter WFO ID (required) overrides base\n\n\nUSER\n - user-level localization overrides site and base\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nLocalization Editor\n\uf0c1\n\n\nThe Localization Perspective acts as a file editor for the XML, Python, and text files which customize the look and feel of CAVE. This perspective is available in the menu \nCAVE > Perspective > Localization\n.\n\n\nUsers may copy and add files to available directories at their own \nUser\n localization version.\n\n\nExamples of things that can be accessed through the perspective include (this list is not all-inclusive):\n\n\n\n\n\n\nNCP Predefined Areas, Color Maps and Style Rules\n\n\n\n\n\n\nD2D Volume Browser Controls\n\n\n\n\n\n\nD2D Bundles - Scales (WFO, State(s), etc.)\n\n\n\n\n\n\nCAVE Map Overlays, Color Maps and Style Rules\n\n\n\n\n\n\nGFE Tools and Utilities\n\n\n\n\n\n\n\n\nThe left panel contains a directory hierarchy of CAVE files for D2D, GFE, and NCP, which can be copied and edited as \nuser\n localization files.\n\n\n\n\nThere may be several versions of each file including \nBASE\n, \nCONFIGURED\n (GFE only), \nSITE\n, and \nUSER\n. Each file version is listed separately under the actual file name.\n\n\nThe \nFile Editor\n view opens the selected configuration file in an appropriate editor. 
For example, a Python file is opened in a Python editor, and an XML file is opened in an XML editor.\n\n\n\n\nCustomizing CAVE Menus\n\uf0c1\n\n\nNavigate to \nD2D > Menus\n and select a submenu (e.g. \nsatellite\n). This directory lists all of the menu file contributions made by this data plugin. Most data menu directories will have an \nindex.xml\n file from which you can investigate the menu structure and make needed changes.\n\n\nSelecting a file such as \nindex.xml\n (by double clicking, or expanding) will show a sub-menu with a default localization level (typically \nBASE\n or \nCONFIGURED\n). Double-click this file to open in the file editor (you may need to click \nSource\n at the bottom of the view to see the raw XML). Right-click this file and select \nCopy To\n > \nUser (\nusername\n)\n and you will see the file localization versions update with the new copy. Select this file to edit, and override, the existing version.",
"title": "Localization Perspective"
},
{
"location": "/cave/localization-perspective/#localization-perspective",
"text": "",
"title": "Localization perspective"
},
{
"location": "/cave/localization-perspective/#localization-levels",
"text": "AWIPS uses a hierarchical system known as Localization to configure many aspects of EDEX and CAVE, such as available menu items, color maps, and derived parameters. This system allows a user to override existing configurations and customize CAVE. For example, a User -level localization file will supersede any similar file in a higher level (such as Site ). There are three levels of localization , starting with the default BASE BASE - default SITE - 3-letter WFO ID (required) overrides base USER - user-level localization overrides site and base",
"title": "Localization Levels"
},
{
"location": "/cave/localization-perspective/#localization-editor",
"text": "The Localization Perspective acts as a file editor for the XML, Python, and text files which customize the look and feel of CAVE. This perspective is available in the menu CAVE > Perspective > Localization . Users may copy and add files to available directories at their own User localization version. Examples of things that can be accessed through the perspective include (this list is not all-inclusive): NCP Predefined Areas, Color Maps and Style Rules D2D Volume Browser Controls D2D Bundles - Scales (WFO, State(s), etc.) CAVE Map Overlays, Color Maps and Style Rules GFE Tools and Utilities The left panel contains a directory hierarchy of CAVE files for D2D, GFE, and NCP, which can be copied and edited as user localization files. There may be several versions of each file including BASE , CONFIGURED (GFE only), SITE , and USER . Each file version is listed separately under the actual file name. The File Editor view opens the selected configuration file in an appropriate editor. For example, a Python file is opened in a Python editor, and an XML file is opened in an XML editor.",
"title": "Localization Editor"
},
{
"location": "/cave/localization-perspective/#customizing-cave-menus",
"text": "Navigate to D2D > Menus and select a submenu (e.g. satellite ). This directory lists all of the menu file contributions made by this data plugin. Most data menu directories will have an index.xml file from which you can investigate the menu structure and make needed changes. Selecting a file such as index.xml (by double clicking, or expanding) will show a sub-menu with a default localization level (typically BASE or CONFIGURED ). Double-click this file to open in the file editor (you may need to click Source at the bottom of the view to see the raw XML). Right-click this file and select Copy To > User ( username ) and you will see the file localization versions update with the new copy. Select this file to edit, and override, the existing version.",
"title": "Customizing CAVE Menus"
},
{
"location": "/cave/nsharp/",
"text": "NSHARP\n\uf0c1\n\n\nNSHARP, which stands for the \nN\national Center \nS\nounding and \nH\nodograph \nA\nnalysis and \nR\nesearch \nP\nrogram, is an AWIPS plugin originally based on NAWIPS NSHAREP, SPC's \nBigSHARP\n sounding display tool, and the Python package \nSHARPpy\n.\n\n\nNSHARP is available in a number of ways in CAVE:\n\n\n\n\nFrom the \nD2D toolbar\n select the NSHARP icon: \n\n\nFrom the \nUpper Air\n menu select \nNSHARP Soundings\n\n\nFrom the \nUpper Air\n menu select a station from the \nRAOB\n menus\n\n\nFrom the \nUpper Air\n menu select \nNUCAPS Soundings\n\n\n\n\nFrom the \nModels\n or \nTools\n menu select \nVolume Browser\n\n\n\n\n\n\nMake sure \nSounding\n is selected from the menu at the top\n\n\n\n\n\n\n\n\nSelect a source that has data (signified by a green box to the right)\n\n\n\n\n\n\n\n\nSelect \nSoundings\n from the Fields menu\n\n\n\n\n\n\n\n\nSelect any point from the Planes menu and an option will load in the table\n\n\n\n\nTo create a new point, select \nTools\n > \nPoints\n and use the right-click-hold menu to create a new point anywhere on the map\n\n\n\n\n\n\n\n\nUse the \nLoad\n button to load data and open the NSharp display\n\n\n\n\n\n\n\n\n\n\n\n\n\n\nNSHARP Configurations\n\uf0c1\n\n\nNSHARP has four configurations for use in different operational settings:\n\n\n\n\nSPC Wide\n - more insets and graphs at the expense of timeline/station inventory.\n\n\nD2D Skewt Standard\n - default for WFOs, larger SkewT with inventory, no Wind/Height, temperature advection, insets, or graphs.\n\n\nD2D Lite\n - Skew-T, table, and inventory only.\n\n\nOPC\n - Ocean Prediction Center display.\n\n\n\n\nTo change the NSHARP configuration:\n\n\n\n\nOpen the \nNSHARP(D2D)\n controls tab by clicking on the Nsharp toolbar (\n) icon again\n\n\nClick the \nConfigure\n button\n\n\nClick \nDisplay Pane Configuration\n (third from the bottom)\n\n\nUse the dropdown to choose a configuration, apply, save, close\n\n\n\n\nIf you would like to 
interactively explore the different graphical areas in NSHARP \non the Web\n, see the \nNSHARP Interactive Overview\n.\n\n\n\n\nSkew-T Display\n\uf0c1\n\n\n\n\nThe Skew-T display renders a vertical profile of temperature, dew point, and wind for RAOBs and model point soundings using a Skew-T Log-P diagram. The box in the upper-left of the main display is linked to the cursor readout when over the SkewT chart. It reports the temperature, dewpoint, wind direction and speed, pressure, height AGL, and relative humidity of the trace.\n\n\nSkew-T is the default upper air chart in AWIPS, and can be changed to \nturbulence display\n (\nT\n) or an \nicing display\n (\nI\n). These options are available as buttons at the bottom of the NSHARP(D2D) controls tab (mentioned in \nNSHARP Configurations\n). Use the \nAWIPS-2 NSHARP Interactive Overview page\n for more information about the Skew-T display.\n\n\n\n\nWindspeed vs Height and Inferred Temperature Advection\n\uf0c1\n\n\n\n\nThe windspeed vs height and inferred temperature advection with height plot is situated next to the SkewT to show the values at the same heights. Inferred temperature advection is from the thermal wind. Use the \nAWIPS-2 NSHARP Interactive Overview page\n for more information.\n\n\n\n\nHodograph Display\n\uf0c1\n\n\n\n\nThis panel contains the hodograph display from the sounding data. The rings in the hodograph represent the wind speed in 20 knot increments. The hodograph trace uses different colors to highlight wind observations in 3 km height increments. This display also contains information such as the mean wind, Bunkers Left/Right Moving storm motion, upshear and downshear Corfidi vectors, and a user-defined motion. 
Use the \nAWIPS NSHARP Interactive Overview page\n for more information about the hodograph display.\n\n\n\n\nInsets\n\uf0c1\n\n\n\n\nIn the SPC Wide Screen Configuration there are four small insets beneath the hodograph containing storm-relative windspeed versus height, a Storm Slinky, Theta-E vs Pressure, Possible Watch Type, Theta-E vs Height, and storm-relative wind vectors. There are buttons in the NSHARP(D2D) control button tab that toggle the six possible contents in the four boxes. There are also two buttons \nPvIn\n and \nNxIn\n in the control tab that can be used to cycle through the previous and next inset. Use the \nAWIPS NSHARP Interactive Overview page\n for more information.",
"title": "NSHARP"
},
{
"location": "/cave/nsharp/#nsharp",
"text": "NSHARP, which stands for the N ational Center S ounding and H odograph A nalysis and R esearch P rogram, is an AWIPS plugin originally based on NAWIPS NSHAREP, SPC's BigSHARP sounding display tool, and the Python package SHARPpy . NSHARP is available in a number of ways in CAVE: From the D2D toolbar select the NSHARP icon: From the Upper Air menu select NSHARP Soundings From the Upper Air menu select a station from the RAOB menus From the Upper Air menu select NUCAPS Soundings From the Models or Tools menu select Volume Browser Make sure Sounding is selected from the menu at the top Select a source that has data (signified by a green box to the right) Select Soundings from the Fields menu Select any point from the Planes menu and an option will load in the table To create a new point, select Tools > Points and use the right-click-hold menu to create a new point anywhere on the map Use the Load button to load data and open the NSharp display",
"title": "NSHARP"
},
{
"location": "/cave/nsharp/#nsharp-configurations",
"text": "NSHARP has four configurations for use in different operational settings: SPC Wide - more insets and graphs at the expense of timeline/station inventory. D2D Skewt Standard - default for WFOs, larger SkewT with inventory, no Wind/Height, temperature advection, insets, or graphs. D2D Lite - Skew-T, table, and inventory only. OPC - Ocean Prediction Center display. To change the NSHARP configuration: Open the NSHARP(D2D) controls tab by clicking on the Nsharp toolbar ( ) icon again Click the Configure button Click Display Pane Configuration (third from the bottom) Use the dropdown to choose a configuration, apply, save, close If you would like to interactively explore the different graphical areas in NSHARP on the Web , see the NSHARP Interactive Overview .",
"title": "NSHARP Configurations"
},
{
"location": "/cave/nsharp/#skew-t-display",
"text": "The Skew-T display renders a vertical profile of temperature, dew point, and wind for RAOBs and model point soundings using a Skew-T Log-P diagram. The box in the upper-left of the main display is linked to the cursor readout when over the SkewT chart. It reports the temperature, dewpoint, wind direction and speed, pressure, height AGL, and relative humidity of the trace. Skew-T is the default upper air chart in AWIPS, and can be changed to turbulence display ( T ) or an icing display ( I ). These options are available as buttons at the bottom of the NSHARP(D2D) controls tab (mentioned in NSHARP Configurations ). Use the AWIPS-2 NSHARP Interactive Overview page for more information about the Skew-T display.",
"title": "Skew-T Display"
},
{
"location": "/cave/nsharp/#windspeed-vs-height-and-inferred-temperature-advection",
"text": "The windspeed vs height and inferred temperature advection with height plot is situated next to the SkewT to show the values at the same heights. Inferred temperature advection is from the thermal wind. Use the AWIPS-2 NSHARP Interactive Overview page for more information.",
"title": "Windspeed vs Height and Inferred Temperature Advection"
},
{
"location": "/cave/nsharp/#hodograph-display",
"text": "This panel contains the hodograph display from the sounding data. The rings in the hodograph represent the wind speed in 20 knot increments. The hodograph trace uses different colors to highlight wind observations in 3 km height increments. This display also contains information such as the mean wind, Bunkers Left/Right Moving storm motion, upshear and downshear Corfidi vectors, and a user-defined motion. Use the AWIPS NSHARP Interactive Overview page for more information about the hodograph display.",
"title": "Hodograph Display"
},
{
"location": "/cave/nsharp/#insets",
"text": "In the SPC Wide Screen Configuration there are four small insets beneath the hodograph containing storm-relative windspeed versus height, a Storm Slinky, Theta-E vs Pressure, Possible Watch Type, Theta-E vs Height, and storm-relative wind vectors. There are buttons in the NSHARP(D2D) control button tab that toggle the six possible contents in the four boxes. There are also two buttons PvIn and NxIn in the control tab that can be used to cycle through the previous and next inset. Use the AWIPS NSHARP Interactive Overview page for more information.",
"title": "Insets"
},
{
"location": "/cave/nsharp/#table-output-displays",
"text": "The Table Output Displays contain five different pages of parameters ranging from parcel instability to storm relative shear to severe hazards potential. There are two buttons PtDt and NxDt in the controls tab that can be used to cycle through the previous and next tables. Use the AWIPS NSHARP Interactive Overview page for more information on the tables and a list/definition of the parameters available.",
"title": "Table Output Displays"
},
{
"location": "/cave/nsharp/#graphsstatistics",
"text": "In the SPC Wide Screen Configuration there are two graph boxes under the insets, and they can display information on Enhanced Bulk Shear, Significant Tornado Parameter, Significant Hail Parameter (SHIP), Winter Weather, Fire Weather, Hail model (not implemented), and the Sounding Analog Retrieval System (SARS). There are buttons in the NSHARP(D2D) control button tab that toggle the six possible contents in the two boxes. Use the AWIPS NSHARP Interactive Overview page for more information.",
"title": "Graphs/Statistics"
},
{
"location": "/cave/nsharp/#sounding-inventory",
"text": "This section controls the inventory of the soundings that have been loaded for potential display in NSHARP. The different colors of the text represent variously that a sounding/station is being displayed, available for display, or not available for display. Use the AWIPS NSHARP Interactive Overview page for more information on how to use the sounding inventory and time line.",
"title": "Sounding Inventory"
},
{
"location": "/cave/warngen/",
"text": "WarnGen Walkthrough\n\uf0c1\n\n\nWarnGen is an AWIPS graphics application for creating and issuing warnings as is done by National Weather Service offices. In the Unidata AWIPS release it is a \nnon-operational\n forecasting tool, meaning it allows users to experiment and simulate with the drawing and text-generation tools, but \nprevents you from transmitting a generated warning upstream\n.\n\n\n\n\nIn order to select a feature it must be within your \nCAVE localization\n coverage (load \nMaps\n > \nCounty Warning Areas\n to see coverages)\n\n\nQuick Steps - Using WarnGen in Unidata AWIPS CAVE\n\uf0c1\n\n\n\n\nLoad NEXRAD Display\n from the Radar menu\n\n\nChoose a CWA with active severe weather (PAH is used in the video below)\n\n\nRe-localize\n to this site in the \nCAVE\n > \nPreferences\n > \nLocalization\n menu\n\n\nExit out of CAVE and reload (you should notice the new CWA at the top of CAVE)\n\n\nLoad radar data\n from the local radar menu \nkpah\n > \nZ + SRM8\n\n\nUse the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM\n\n\nClick \nWarnGen\n toolbar button or load from \nTools\n > \nWarnGen\n\n\nDrag the storm marker\n to the center of a storm feature\n\n\nStep through frames back and forth and adjust the marker to match the trajectory of the storm feature\n\n\nClick \nTrack\n in the Warngen GUI to update the polygon shape and trajectory\n\n\nFrom the WarnGen dialog select the type of warning to generate, time range, basis of the warning, and any threats (wind, hail, etc)\n\n\n\n\nClick \nCreate Text\n at the bottom of the WarnGen dialog to generate a text warning product in a new window\n\n\n\n\nNote: Since you are not \"issuing\" the warning, leave the top two rows blank (\"TTAAii\" and \"CCCC\") and click \"Enter\" and a separate text window should open \n\n\n\n\n\n\n\n\n\n\nClick \nReset\n at the top of the WarnGen dialog to reset the storm marker at any time\n\n\n\n\nSelect \nLine of Storms\n to enable a 
two-pointed vector which is to be positioned parallel to a storm line\n\n\nTo \nadd another vertex\n, middle button click along the polygon\n\n\n\n\nVideo - Using WarnGen in AWIPS\n\uf0c1\n\n\nThe video below walks through creating a warning polygon and text in AWIPS. More detailed information can be found in the text below the video.\n\n\n\n\n\n\n\nLoad NEXRAD level 3 display\n\uf0c1\n\n\nSelect the menu \nRadar\n > \nNEXRAD Display\n and note coverage areas of current severe weather. We choose a CWA ID that contains some active severe weather (PAH Paducah, Kentucky, in this example).\n\n\nSelect SITE Localization\n\uf0c1\n\n\nOpen \nCAVE\n > \nPreferences\n > \nLocalization\n, select the CWA site ID (PAH) for the coverage area you want to use, followed by \nApply\n and \nOkay\n and restart CAVE. Once CAVE is restarted, you should notice the new CWA at the top of the CAVE window.\n\n\n\n\nLoad single radar data from the local radars\n\uf0c1\n\n\nClick on the local radar \nkpah\n > \nZ + SRM8\n. Use the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM.\n\n\n\n\nLaunch WarnGen\n\uf0c1\n\n\nSelect \nWarnGen\n from the D2D Toolbar or from the \nTools\n > \nWarnGen\n menu. 
When started, the storm centroid marker appears and the WarnGen GUI will pop up as a separate window.\n\n\n\n\nGenerate a Storm Motion Vector\n\uf0c1\n\n\n\n\nClick and drag \nDrag Me to Storm\n to the feature you want to track (WarnGen uses a dot to track a single storm and a line to track a line of storms).\n\n\nStep back 3 to 4 frames.\n\n\nDrag the dot to the previous position of the feature you first marked to create the storm motion vector.\n\n\nClick the \nTrack\n button in the WarnGen GUI to update the polygon based off the storm motion.\n\n\nReview the product loop and make adjustments to ensure the vector is accurate.\n\n\n\n\n\n\nThe initial polygon may have unhatched areas that will be removed from the warning due to crossing CWAs or not meeting area thresholds in the county for inclusion. The Warned/Hatched Area button allows you to preview the polygon shape that will be issued, so you can make further edits.",
"title": "WarnGen Walkthrough"
},
{
"location": "/cave/warngen/#warngen-walkthrough",
"text": "WarnGen is an AWIPS graphics application for creating and issuing warnings as is done by National Weather Service offices. In the Unidata AWIPS release it is a non-operational forecasting tool, meaning it allows users to experiment and simulate with the drawing and text-generation tools, but prevents you from transmitting a generated warning upstream . In order to select a feature it must be within your CAVE localization coverage (load Maps > County Warning Areas to see coverages)",
"title": "WarnGen Walkthrough"
},
{
"location": "/cave/warngen/#quick-steps-using-warngen-in-unidata-awips-cave",
"text": "Load NEXRAD Display from the Radar menu Choose a CWA with active severe weather (PAH is used in the video below) Re-localize to this site in the CAVE > Preferences > Localization menu Exit out of CAVE and reload (you should notice the new CWA at the top of CAVE) Load radar data from the local radar menu kpah > Z + SRM8 Use the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM Click WarnGen toolbar button or load from Tools > WarnGen Drag the storm marker to the center of a storm feature Step through frames back and forth and adjust the marker to match the trajectory of the storm feature Click Track in the Warngen GUI to update the polygon shape and trajectory From the WarnGen dialog select the type of warning to generate, time range, basis of the warning, and any threats (wind, hail, etc) Click Create Text at the bottom of the WarnGen dialog to generate a text warning product in a new window Note: Since you are not \"issuing\" the warning, leave the top two rows blank (\"TTAAii\" and \"CCCC\") and click \"Enter\" and a separate text window should open Click Reset at the top of the WarnGen dialog to reset the storm marker at any time Select Line of Storms to enable a two-pointed vector which is to be positioned parallel to a storm line To add another vertex , middle button click along the polygon",
"title": "Quick Steps - Using WarnGen in Unidata AWIPS CAVE"
},
{
"location": "/cave/warngen/#video-using-warngen-in-awips",
"text": "The video below walks through creating a warning polygon and text in AWIPS. More detailed information can be found in the text below the video.",
"title": "Video - Using WarnGen in AWIPS"
},
{
"location": "/cave/warngen/#load-nexrad-level-3-display",
"text": "Select the menu Radar > NEXRAD Display and note coverage areas of current severe weather. We choose a CWA ID that contains some active severe weather (PAH Paducah, Kentucky, in this example).",
"title": "Load NEXRAD level 3 display"
},
{
"location": "/cave/warngen/#select-site-localization",
"text": "Open CAVE > Preferences > Localization , select the CWA site ID (PAH) for the coverage area you want to use, followed by Apply and Okay and restart CAVE. Once CAVE is restarted, you should notice the new CWA at the top of the CAVE window.",
"title": "Select SITE Localization"
},
{
"location": "/cave/warngen/#load-single-radar-data-from-the-local-radars",
"text": "Click on the local radar kpah > Z + SRM8 . Use the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM.",
"title": "Load single radar data from the local radars"
},
{
"location": "/cave/warngen/#launch-warngen",
"text": "Select WarnGen from the D2D Toolbar or from the Tools > WarnGen menu. When started, the storm centroid marker appears and the WarnGen GUI will pop up as a separate window.",
"title": "Launch WarnGen"
},
{
"location": "/cave/warngen/#generate-a-storm-motion-vector",
"text": "Click and drag Drag Me to Storm to the feature you want to track (WarnGen uses a dot to track a single storm and a line to track a line of storms). Step back 3 to 4 frames. Drag the dot to the previous position of the feature you first marked to create the storm motion vector. Click the Track button in the WarnGen GUI to update the polygon based off the storm motion. Review the product loop and make adjustments to ensure the vector is accurate. The initial polygon may have unhatched areas that will be removed from the warning due to crossing CWAs or not meeting area thresholds in the county for inclusion. The Warned/Hatched Area button allows you to preview the polygon shape that will be issued, so you can make further edits.",
"title": "Generate a Storm Motion Vector"
},
{
"location": "/cave/warngen/#moving-vertex-points",
"text": "Vertices can be moved by clicking and dragging with the mouse. The warning polygon, including stippling, will update automatically. When reshaping your warning polygon in this manner, the philosophy is to include all areas that are at risk of experiencing severe weather covered by that warning type. Effective polygons account for uncertainty over time and typically widen downstream.",
"title": "Moving Vertex Points"
},
{
"location": "/cave/warngen/#add-and-remove-vertex-points",
"text": "There will be some occasions where you will want to add vertices to your warning polygon. Most often, these situations will involve line warnings with bowing segments or single storm warnings where you want to account for storm motion uncertainty or multiple threat areas that may have differing storm motions. New vertices are added to the warning polygon two ways. Either by Right Mouse Button \"click and hold\" or a simple Middle Mouse Button click on the warning polygon line segment where you want to add the vertex. Vertex points are removed from the warning polygon using the same context relative menu. Instead of selecting a line segment, you select the vertex you wish to remove and then right mouse button click and hold and select remove vertex .",
"title": "Add and Remove Vertex Points"
},
{
"location": "/cave/warngen/#redrawing-a-polygon",
"text": "Click the Reset button to clear the current polygon and vector and reset the storm centroid marker. Generate a new storm motion by moving the storm markers and select the Track button in the WarnGen GUI to draw the new polygon.",
"title": "Redrawing a Polygon"
},
{
"location": "/cave/warngen/#text-window",
"text": "Once you are satisfied with your polygon and have chosen your selections, click Create Text in the WarnGen GUI. Initially the AWIPS Header Block window appears. Leave the top two rows blank and click Enter for the text window to open. Using the customized settings in the WarnGen GUI, WarnGen translates the information into a text product that is displayed in a text window on the Text Display. The auto-generated text contains the storm speed and direction, the counties and cities affected by the warning/advisory, the valid times of the product, the warning/advisory body text (including any optional bullets selected in the GUI), and additional code to help our partners to efficiently process and disseminate the warning/advisory. The locked parts of the text are highlighted in blue and most of your text should not need to be edited if you configured your WarnGen window correctly. The Unidata AWIPS release is non-operational . You will be allowed to simulate the drawing and text-generation of warnings, but are prevented from transmitting a generated warning upstream. Note: Edits made to product text in the editor window should be limited to items such as forecaster name/initials, call-to-action text, etc. If changes are warranted for items such as storm motion, warned counties, or Latitude/Longitude points, close the editor window and make changes using the D-2D and WarnGen graphical tools, then recreate the polygon and/or the text.",
"title": "Text Window"
},
{
"location": "/cave/goes-16-satellite/",
"text": "GOES 16/17\n\uf0c1\n\n\nThe GOES-R decoder supports the ingest and display of NOAAport products (currently on the \nNIMAGE\n feed), Derived products (Level 2b netCDF files), and the Geostationary Lightning Mapper (GLM) products.\n\n\nGOES-R products are accessible in the \nSatellite\n menu. The menu is broken into sections and has submenus for each of the separate geospatial products: CONUS, Full Disk, Mesoscale Sectors, Hawaii, Alaska, and Puerto Rico. Each submenu has products for \nindividual channels\n and \nRGB Composites\n, as well as submenus for \nchannel differences\n and \nderived products\n. \nGLM data\n can also be found with its own submenu option a little lower down the menu and under the \nSurface\n menu.\n\n\n\n\nNOTE\n: The RGB products are not available on MacOS.\n\n\n\n\n\n\n\n\nLDM Pattern Actions\n\uf0c1\n\n\nThis is found in the LDM's pqact.conf file which is located in /awips2/ldm/etc. The entries for GOES data are shown below. There are three different feeds that supply the individual channel data, derived products, and GLM data: NIMAGE, HDS and DIFAX, respectively.\n\n\n#\n# GOES 16/17 ABI\n#\nNIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/CloudAndMoistureImagery/([^/]*)/([^/]*)/([0-9]{8})/([^/]*).nc\n FILE -close -edex\n /awips2/data_store/GOES/CMI/\\5.nc4\n#\n# GOES derived products\n#\nHDS ^(IXT.99) KNES (......)\n FILE -close -edex\n /awips2/data_store/GOES/derived/KNES_\\1_\\2-(seq)\n#\n# GLM\n#\nDIFAX ^/data/cspp-geo/(EAST|WEST|GRB-R)/OR_GLM-L2-([^/]*).nc\n FILE -close -edex\n /awips2/data_store/GOES/GLM/\\1_OR_GLM-L2-\\2.nc\n\n\n\n\n\nIndividual Channels\n\uf0c1\n\n\nAll geospatial sectors have individual channel products that can be viewed. Below are samples of Channel 02 (0.64\u03bcm) for each of the sectors. 
These products come through the \nNIMAGE\n feed in the LDM.\n\n\nCONUS 1km\n\uf0c1\n\n\n\n\nFull Disk 6km\n\uf0c1\n\n\n\n\nMesoscale Sectors (TMESO-1, TMESO-2)\n\uf0c1\n\n\nTwo floating mesoscale sectors (will vary from image shown)\n\n\n\n\nPuerto Rico (PRREGI)\n\uf0c1\n\n\n\n\nAlaska\n\uf0c1\n\n\n\n\nHawaii\n\uf0c1\n\n\n\n\n\n\nRGB Composites\n\uf0c1\n\n\nRGB Composites are also available for each sector. The three RGB products are: Icing - Ch 5, 3, 2 (1.61\u03bcm, 0.87\u03bcm, 0.64\u03bcm), Composite - Ch 2, 5, 14 #1 (0.64\u03bcm, 1.61\u03bcm, 11.20\u03bcm) and Composite #5 - Ch 2, 3, 2 (0.64\u03bcm, 0.87\u03bcm, 0.64\u03bcm). These products are generated \non the fly in AWIPS\n using the existing channel products from EDEX.\n\n\n\n\nGOES RGB Imagery is NOT SUPPORTED on macOS\n\n\nOpenGL Shading Language limitations prevent multi-channel imagery from displaying correctly on Mac. Please use the Linux or Windows installs to view RGB products.\n\n\n\n\nIcing Composite\n\uf0c1\n\n\n\n\nDaytime Composite 1\n\uf0c1\n\n\n\n\nDaytime Composite 5\n\uf0c1\n\n\n\n\n\n\nChannel Differences\n\uf0c1\n\n\nChannel differences are the result of subtracting one channel from another to produce a new product. 
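The subtraction itself is plain per-pixel arithmetic; a minimal sketch (with made-up brightness temperature values, not real ABI data):

```python
# Hypothetical 2x2 grids of brightness temperatures (K) for two ABI
# channels; the values are invented for illustration only.
ch13 = [[265.0, 250.0], [240.0, 230.0]]   # 10.3 um
ch15 = [[263.5, 249.0], [241.0, 232.5]]   # 12.3 um

# Per-pixel difference, i.e. the "Split Window (10.3 - 12.3 um)" idea
split_window = [[a - b for a, b in zip(r13, r15)]
                for r13, r15 in zip(ch13, ch15)]
print(split_window)  # [[1.5, 1.0], [-1.0, -2.5]]
```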
These products are generated \non the fly in AWIPS\n using the existing channel products from EDEX.\n\n\n\n\nNOTE\n: These may not be available for all sectors.\n\n\n\n\nThere are currently 10 channel differences offered in CAVE:\n\n\n\n\nSplit Window (10.3 - 12.3 \u03bcm)\n\n\nSplit Cloud Top Phase (11.2 - 8.4 \u03bcm)\n\n\nNight Fog (10.3 - 2.9 \u03bcm)\n\n\nDay Fog (3.9 - 10.3 \u03bcm)\n\n\nSplit Fire (2.2 - 1.6 \u03bcm)\n\n\nSplit Ozone (9.6 - 10.3 \u03bcm)\n\n\nSplit Water Vapor (6.19 - 7.3 \u03bcm)\n\n\nSplit Snow (1.6 - 0.64 \u03bcm)\n\n\nVegetation (0.64 - 0.87 \u03bcm)\n\n\nUpper Level Info (11.2 - 6.19 \u03bcm)\n\n\n\n\n\n\nNOTE\n: The rendering of these products uses the \nJep\n package in Python, which is known to have issues installing on the MacOS and Windows versions.\n\n\n\n\n\n\nDerived Products\n\uf0c1\n\n\nDerived products are also known as \nLevel 2+\n products and come through the \nHDS\n feed in the LDM. There are over 20 products currently available in AWIPS. To find out some more
"title": "GOES 16/17"
},
{
"location": "/cave/goes-16-satellite/#goes-1617",
"text": "The GOES-R decoder supports the ingest and display of NOAAport products (currently on the NIMAGE feed), Derived products (Level 2b netCDF files), and the Geostationary Lightning Mapper (GLM) products. GOES-R products are accessible in the Satellite menu. The menu is broken into sections and has submenus for each of the separate geospatial products: CONUS, Full Disk, Mesoscale Sectors, Hawaii, Alaska, and Puerto Rico. Each submenu has products for individual channels and RGB Composites , as well as submenus for channel differences and derived products . GLM data can also be found with its own submenu option a little lower down the menu and under the Surface menu. NOTE : The RGB products are not available on MacOS.",
"title": "GOES 16/17"
},
{
"location": "/cave/goes-16-satellite/#ldm-pattern-actions",
"text": "This is found in the LDM's pqact.conf file which is located in /awips2/ldm/etc. The entries for GOES data are shown below. There are three different feeds that supply the individual channel data, derived products, and GLM data: NIMAGE, HDS and DIFAX, respectively. #\n# GOES 16/17 ABI\n#\nNIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/CloudAndMoistureImagery/([^/]*)/([^/]*)/([0-9]{8})/([^/]*).nc\n FILE -close -edex\n /awips2/data_store/GOES/CMI/\\5.nc4\n#\n# GOES derived products\n#\nHDS ^(IXT.99) KNES (......)\n FILE -close -edex\n /awips2/data_store/GOES/derived/KNES_\\1_\\2-(seq)\n#\n# GLM\n#\nDIFAX ^/data/cspp-geo/(EAST|WEST|GRB-R)/OR_GLM-L2-([^/]*).nc\n FILE -close -edex\n /awips2/data_store/GOES/GLM/\\1_OR_GLM-L2-\\2.nc",
"title": "LDM Pattern Actions"
},
{
"location": "/cave/goes-16-satellite/#individual-channels",
"text": "All geospatial sectors have individual channel products that can be viewed. Below are samples of Channel 02 (0.64\u03bcm) for each of the sectors. These products come through the NIMAGE feed in the LDM.",
"title": "Individual Channels"
},
{
"location": "/cave/goes-16-satellite/#conus-1km",
"text": "",
"title": "CONUS 1km"
},
{
"location": "/cave/goes-16-satellite/#full-disk-6km",
"text": "",
"title": "Full Disk 6km"
},
{
"location": "/cave/goes-16-satellite/#mesoscale-sectors-tmeso-1-tmeso-2",
"text": "Two floating mesoscale sectors (will vary from image shown)",
"title": "Mesoscale Sectors (TMESO-1, TMESO-2)"
},
{
"location": "/cave/goes-16-satellite/#puerto-rico-prregi",
"text": "",
"title": "Puerto Rico (PRREGI)"
},
{
"location": "/cave/goes-16-satellite/#alaska",
"text": "",
"title": "Alaska"
},
{
"location": "/cave/goes-16-satellite/#hawaii",
"text": "",
"title": "Hawaii"
},
{
"location": "/cave/goes-16-satellite/#rgb-composites",
"text": "RGB Composites are also available for each sector. The three RGB products are: Icing - Ch 5, 3, 2 (1.61\u03bcm, 0.87\u03bcm, 0.64\u03bcm), Composite - Ch 2, 5, 14 #1 (0.64\u03bcm, 1.61\u03bcm, 11.20\u03bcm) and Composite #5 - Ch 2, 3, 2 (0.64\u03bcm, 0.87\u03bcm, 0.64\u03bcm). These products are generated on the fly in AWIPS using the existing channel products from EDEX. GOES RGB Imagery is NOT SUPPORTED on macOS OpenGL Shading Language limitations prevent multi-channel imagery from displaying correctly on Mac. Please use the Linux or Windows installs to view RGB products.",
"title": "RGB Composites"
},
{
"location": "/cave/goes-16-satellite/#icing-composite",
"text": "",
"title": "Icing Composite"
},
{
"location": "/cave/goes-16-satellite/#daytime-composite-1",
"text": "",
"title": "Daytime Composite 1"
},
{
"location": "/cave/goes-16-satellite/#daytime-composite-5",
"text": "",
"title": "Daytime Composite 5"
},
{
"location": "/cave/goes-16-satellite/#channel-differences",
"text": "Channel differences are the result of subtracting one channel from another to produce a new product. These products are generated on the fly in AWIPS using the existing channel products from EDEX. NOTE : These may not be available for all sectors. There currently 10 channel differences that are offered in CAVE: Split Window (10.3 - 12.3 \u03bcm) Split Cloud Top Phase (11.2 - 8.4 \u03bcm) Night Fog (10.3 - 2.9 \u03bcm) Day Fog (3.9 - 10.3 \u03bcm) Split Fire (2.2 - 1.6 \u03bcm) Split Ozone (9.6 - 10.3 \u03bcm) Split Water Vapor (6.19 - 7.3 \u03bcm) Split Snow (1.6 - 0.64 \u03bcm) Vegetation (0.64 - 0.87 \u03bcm) Upper Level Info (11.2 - 6.19 \u03bcm) NOTE : The rendering of these products uses the Jep package in Python, which has known to have issues installing on the MacOS and Windows versions.",
"title": "Channel Differences"
},
{
"location": "/cave/goes-16-satellite/#derived-products",
"text": "Derived products are also known as Level 2+ products and are come through the HDS feed in the LDM. There are over 20 products currently available in AWIPS. To find out some more information on some of the products please the Quick Guides compiled by CIRA. NOTE : These may not all be available for each sector. The current products offered in CAVE are: Aerosol Detection Aerosol Optical Depth Clear Sky Mask Cloud Optical Depth Cloud Particle Size Cloud Top Height Cloud Top Phase Cloud Top Pressure Cloud Top Temperature Derived CAPE Derived K-Index Derived Lifted Index Derived Showalter Index Derived Total Totals Fire Area Fire Power Fire Temperature Land Skin Temperature RR/QPE Sea Surface Temperature Snow Cover Total Precip Water Ash Cloud Height Ash Mass Load",
"title": "Derived Products"
},
{
"location": "/cave/goes-16-satellite/#hdf5-data-store",
"text": "Decoded GOES-R satellite images are stored in /awips2/edex/data/hdf5/satellite/ under sector subdirectories: drwxr-xr-x. 18 awips fxalpha AKREGI\ndrwxr-xr-x. 235 awips fxalpha ECONUS\ndrwxr-xr-x. 38 awips fxalpha EFD\ndrwxr-xr-x. 30 awips fxalpha EMESO-1\ndrwxr-xr-x. 30 awips fxalpha EMESO-2\ndrwxr-xr-x. 18 awips fxalpha HIREGI\ndrwxr-xr-x. 18 awips fxalpha PRREGI\ndrwxr-xr-x. 18 awips fxalpha WCONUS\ndrwxr-xr-x. 18 awips fxalpha WFD\ndrwxr-xr-x. 18 awips fxalpha WMESO-1\ndrwxr-xr-x. 18 awips fxalpha WMESO-2",
"title": "HDF5 Data Store"
},
{
"location": "/cave/goes-16-satellite/#geostationary-lightning-mapper-glm",
"text": "NASA's SPoRT MSFC Earth Science Office has contributed plugins to decode GLM level2 products, displayed as point data in CAVE. GLM data is located in the menu structure: Satellite > Geostationary Lightning Data (GLM) . Data are displayable is available for Flash , Event , and Group products. There is also additional GLM data available in the Surface > GLM - Geostationary Lightning Mapper submenus.",
"title": "Geostationary Lightning Mapper (GLM)"
},
{
"location": "/cave/d2d-gridded-models/",
"text": "Volume Browser\n\uf0c1\n\n\nThe Volume Browser provides access to numerical models, other gridded data, sounding data, and selected point data sources, such as RAOB, METAR, and Profiler. Through the Browser interface, you can choose the data source(s), field(s), plane(s), and point(s), and generate a customized list of model graphics or images for display.\n\n\nThe Volume Browser window is divided into four areas:\n\n\n\n\n\n\nThe \nMenu Bar\n along the top\n\n\nThe Data Selection Menus\n\n\nThe Product Selection List\n\n\nThe Load Buttons (Diff and Load) to load items from the Product Selection List\n\n\n\n\nEach area is then subdivided into menu components. The menu bar along the top of the Volume Browser window has dropdown lists that contain options for controlling all the various menu choices of the Volume Browser.\n\n\n\n\nVolume Browser Menu Bar\n\uf0c1\n\n\nThe dropdown menus in the Volume Browser menu bar contain options for controlling and manipulating the Volume Browser or the products chosen through the Volume Browser\n\n\n\n\nFile\n\n\nClone\n\n\nExit\n\n\n\n\n\n\nEdit\n\n\nClear All\n\n\nClear Sources\n\n\nClear Fields\n\n\nClear Panes\n\n\nSelect None\n\n\nSelect All\n\n\nFind (Ctrl+F)\n\n\n\n\n\n\nTools\n\n\nDisplay Types\n\n\nAnimation Types\n\n\n\n\n\n\nVB Tools\n\uf0c1\n\n\n\n\nBaselines\n\uf0c1\n\n\nSelecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be\nconstructed from within the Volume Browser. These baseline resources are \neditable\n.\n\n\nIf you are zoomed in over an area when you load baselines and none appear, press the middle mouse button (B3) to \"snap\" a baseline to where the mouse cursor is. The system chooses a baseline that has not been recently used. 
If you\nare working with a baseline, a second click with B3 will return you to the original baseline, even\nif you modified another baseline.\n\n\nPoints\n\uf0c1\n\n\nPoints are used to generate model soundings, time-height cross-sections, time series, and variable vs.\nheight plots using the Volume Browser. As with the Baselines, the locations of these Points can be\nedited in the following manner:\n\n\n\n\n\"Snapping\" an Interactive Point\n: If you are zoomed in over an area when you load Interactive\nPoints and no Points appear, click B3 to \"snap\" a Point to where the mouse cursor is positioned.\nThe system chooses a Point that has not been recently used. If you are currently working with a\nPoint, then a second B3 click will place another Point at the location of your cursor.\n\n\nDynamic Reference Map\n: When you generate a model sounding, a time-height cross-section, a\ntime series, or a variable vs. height plot, a small reference map indicating the location(s) of the\nplotted sounding(s) is provided in the upper left corner of the Main Display Pane.\n\n\n\n\nPoints may be created, deleted, hidden, and manipulated (location, name, font, and color). Points are\nnot limited in terms of number, location, or designation. Points may also be assigned to different\ngroups to facilitate their use.\n\n\nChoose By ID\n\uf0c1\n\n\nChoose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place. 
You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations.\n\n\n\n\nDisplay Types\n\uf0c1\n\n\n\n\nPlan View (default)\n\uf0c1\n\n\nThis is the default option for the Volume Browser. From the Plan-view perspective, data are plotted onto horizontal surfaces. The additional options menu that appears in the Volume Browser menu bar allows you to choose whether you want the Plan view data to Animate in Time or Animate in Space.\n\n\nCross Section\n\uf0c1\n\n\nAllows
"title": "Volume Browser"
},
{
"location": "/cave/d2d-gridded-models/#volume-browser",
"text": "The Volume Browser provides access to numerical models, other gridded data, sounding data, and selected point data sources, such as RAOB, METAR, and Profiler. Through the Browser interface, you can choose the data source(s), field(s), plane(s), and point(s), and generate a customized list of model graphics or images for display. The Volume Browser window is divided into four areas: The Menu Bar along the top The Data Selection Menus The Product Selection List The Load Buttons (Diff and Load) to load items from the Product Selection List Each area is then subdivided into menu components. The menu bar along the top of the Volume Browser window has dropdown lists that contain options for controlling all the various menu choices of the Volume Browser.",
"title": "Volume Browser"
},
{
"location": "/cave/d2d-gridded-models/#volume-browser-menu-bar",
"text": "The dropdown menus in the Volume Browser menu bar contain options for controlling and manipulating the Volume Browser or the products chosen through the Volume Browser File Clone Exit Edit Clear All Clear Sources Clear Fields Clear Panes Select None Select All Find (Ctrl+F) Tools Display Types Animation Types",
"title": "Volume Browser Menu Bar"
},
{
"location": "/cave/d2d-gridded-models/#vb-tools",
"text": "",
"title": "VB Tools"
},
{
"location": "/cave/d2d-gridded-models/#baselines",
"text": "Selecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be\nconstructed from within the Volume Browser. These baseline resources are editable . If you are zoomed in over an area when you load baselines and none appear, press the middle mouse button (B3) to \"snap\" a baseline to where the mouse cursor is. The system chooses a baseline that has not been recently used. If you\nare working with a baseline, a second click with B3 will return you to the original baseline, even\nif you modified another baseline.",
"title": "Baselines"
},
{
"location": "/cave/d2d-gridded-models/#points",
"text": "Points are used to generate model soundings, time-height cross-sections, time series, and variable vs.\nheight plots using the Volume Browser. As with the Baselines, the locations of these Points can be\nedited in the following manner: \"Snapping\" an Interactive Point : If you are zoomed in over an area when you load Interactive\nPoints and no Points appear, click B3 to \"snap\" a Point to where the mouse cursor is positioned.\nThe system chooses a Point that has not been recently used. If you are currently working with a\nPoint, then a second B3 click will place another Point at the location of your cursor. Dynamic Reference Map : When you generate a model sounding, a time-height cross-section, a\ntime series, or a variable vs. height plot, a small reference map indicating the location(s) of the\nplotted sounding(s) is provided in the upper left corner of the Main Display Pane. Points may be created, deleted, hidden, and manipulated (location, name, font, and color). Points are\nnot limited in terms of number, location, or designation. Points may also be assigned to different\ngroups to facilitate their use.",
"title": "Points"
},
{
"location": "/cave/d2d-gridded-models/#choose-by-id",
"text": "Choose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place. You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations.",
"title": "Choose By ID"
},
{
"location": "/cave/d2d-gridded-models/#display-types",
"text": "",
"title": "Display Types"
},
{
"location": "/cave/d2d-gridded-models/#plan-view-default",
"text": "This is the default option for the Volume Browser. From the Plan-view perspective, data are plotted onto horizontal surfaces. The additional options menu that appears in the Volume Browser menu bar allows you to choose whether you want the Plan view data to Animate in Time or Animate in Space.",
"title": "Plan View (default)"
},
{
"location": "/cave/d2d-gridded-models/#cross-section",
"text": "Allows you to view gridded data as vertical slices along specific baselines. You need to use either the Interactive Baseline Tool or the predefined latitude/longitude baselines to specify the slice you wish to see. One of the additional options menus that appear in the Volume Browser menu bar allows you to choose whether you want the cross-section data to animate in time or space, while the other options menu allows you to adjust the vertical resolution. Descriptions of these options follows. (Note that the Fields and Planes submenu labels have changed after selecting \"Cross section.\")",
"title": "Cross Section"
},
{
"location": "/cave/d2d-gridded-models/#time-height",
"text": "Used in conjunction with the Interactive Points Tool to enable you to view a time height cross section of a full run of gridded model data for a specific location. Additional options menus in the Volume Browser menu bar allow you to choose the direction in which you want the data to be plotted, and to adjust the vertical resolution.",
"title": "Time Height"
},
{
"location": "/cave/d2d-gridded-models/#var-vs-hgt",
"text": "Enables you to view a profile of a meteorological model field as it changes through height, which is displayed in millibars. By using the Interactive Points Tool, you can select one or more locations from which to plot the data.",
"title": "Var vs Hgt"
},
{
"location": "/cave/d2d-gridded-models/#sounding",
"text": "Works in conjunction with the Interactive Points Tool to enable you to generate a Skew-T chart for a specific location, no additional menus appear in the Volume Browser when the Soundings setting is chosen.",
"title": "Sounding"
},
{
"location": "/cave/d2d-gridded-models/#time-series",
"text": "Used in conjunction with the Interactive Points Tool to enable you to plot gridded data on a time versus data value graph for a specified point.",
"title": "Time Series"
},
{
"location": "/cave/d2d-gridded-models/#animation-types",
"text": "",
"title": "Animation Types"
},
{
"location": "/cave/d2d-gridded-models/#time",
"text": "The default option for the Volume Browser. It allows you to view model data through time",
"title": "Time"
},
{
"location": "/cave/d2d-gridded-models/#space",
"text": "Allows you to loop through a series of predefined latitude or longitude cross-sectional slices at a fixed time.",
"title": "Space"
},
{
"location": "/cave/d2d-tools/",
"text": "Display Tools\n\uf0c1\n\n\nThe display tools are a subset of the tools available in CAVE. These programs are accessible though the \nTools\n dropdown menu.\n\n\n\n\nMany of the tools listed under the Tools menu can be placed into an \neditable state\n. Do not enable the \"Hide Legends\" feature if you want to place a tool in an editable state, because access to editability is done by clicking the center mouse button, or right-clicking over the \nProduct Legend\n.\n\n\n\n\nNote\n: To see information about some of the other options in the Tools menu, check out the \nRadar Tools\n page.\n\n\n\n\n\n\nAz/Ran Overlay\n\uf0c1\n\n\nThis tool displays a movable azimuth/range radar map overlay. The overlay is in the \"editable\" state when displayed, and can be relocated by clicking the right mouse button.\n\n\n\n\n\n\nBaselines\n\uf0c1\n\n\nSelecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be constructed from within the Volume Browser. Baselines come up editable.\n\n\n\"Snapping\" an Interactive Baseline: If you are zoomed in over an area when you load Interactive Baselines and no Baselines appear, press the right mouse button to \"snap\" a Baseline to where the mouse cursor is. The system chooses a Baseline that has not been recently used. If you are working with a Baseline, a second click with the right mouse button will return you to the original Baseline, even if you modified another Baseline in the meantime.\n\n\n\n\n\n\nChoose By ID\n\uf0c1\n\n\nChoose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place. 
You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations. You can also access the Choose By ID tool from the Tools menu on the Volume Browser.\n\n\n\n\n\n\nDistance Bearing\n\uf0c1\n\n\nSelecting this tool displays six editable lines, each of which shows the azimuth and range of the labeled end of the line relative to the unlabeled end of the line. You can make the lines editable by clicking the center mouse button over the legend at the lower right of the display. Once in edit mode, a line can be moved as a unit and/or either of its end points can be adjusted.\n\n\n\n\n\n\nDistance Speed\n\uf0c1\n\n\nThis tool can be used to determine the speed and direction of a storm or any other meteorological feature of interest. Selecting Distance Speed displays a Centroid Marker to move to the location of the storm or feature of interest in any two or more frames of displayed imagery (e.g., a satellite or radar loop). The system then displays a storm track with the direction (degrees) and speed (knots) of movement. When you select the Distance Speed option, the Distance Speed dialog box opens.\n\n\n\n\n\n\n\n\nMode\n: You have the following selections from this option.\n\n\n\n\n\n\nPoint\n: A radio button that allows you to set the Centroid Marker as a single point.\n\n\n\n\n\n\nPolyline\n: A radio button that allows you to set the Centroid Marker as a polyline.\n\n\n\n\n\n\n\n\n\n\nLegend\n: You have the following selections from this option.\n\n\n\n\n\n\nTime\n: A radio button that allows you to display time with the Centroid Marker.\n\n\n\n\n\n\nSpeed\n: A radio button that allows you to display speed with the Centroid Marker.\n\n\n\n\n\n\n\n\n\n\n\n\nDistance Scale\n\uf0c1\n\n\nEnabling this feature adds a scalebar to the bottom right hand of the main D2D display. 
This tool can be used to determine the size of a storm or any other meteorological feature of interest.\n\n\n\n\n\n\nFeature Following Zoom\n\uf0c1\n\n\nWhen you zoom in over a small area to be able to view a feature in detail, animation will often cause the feature to m
"title": "Display Tools"
},
{
"location": "/cave/d2d-tools/#display-tools",
"text": "The display tools are a subset of the tools available in CAVE. These programs are accessible though the Tools dropdown menu. Many of the tools listed under the Tools menu can be placed into an editable state . Do not enable the \"Hide Legends\" feature if you want to place a tool in an editable state, because access to editability is done by clicking the center mouse button, or right-clicking over the Product Legend . Note : To see information about some of the other options in the Tools menu, check out the Radar Tools page.",
"title": "Display Tools"
},
{
"location": "/cave/d2d-tools/#azran-overlay",
"text": "This tool displays a movable azimuth/range radar map overlay. The overlay is in the \"editable\" state when displayed, and can be relocated by clicking the right mouse button.",
"title": "Az/Ran Overlay"
},
{
"location": "/cave/d2d-tools/#baselines",
"text": "Selecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be constructed from within the Volume Browser. Baselines come up editable. \"Snapping\" an Interactive Baseline: If you are zoomed in over an area when you load Interactive Baselines and no Baselines appear, press the right mouse button to \"snap\" a Baseline to where the mouse cursor is. The system chooses a Baseline that has not been recently used. If you are working with a Baseline, a second click with the right mouse button will return you to the original Baseline, even if you modified another Baseline in the meantime.",
"title": "Baselines"
},
{
"location": "/cave/d2d-tools/#choose-by-id",
"text": "Choose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place. You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations. You can also access the Choose By ID tool from the Tools menu on the Volume Browser.",
"title": "Choose By ID"
},
{
"location": "/cave/d2d-tools/#distance-bearing",
"text": "Selecting this tool displays six editable lines, each of which shows the azimuth and range of the labeled end of the line relative to the unlabeled end of the line. You can make the lines editable by clicking the center mouse button over the legend at the lower right of the display. Once in edit mode, a line can be moved as a unit and/or either of its end points can be adjusted.",
"title": "Distance Bearing"
},
{
"location": "/cave/d2d-tools/#distance-speed",
"text": "This tool can be used to determine the speed and direction of a storm or any other meteorological feature of interest. Selecting Distance Speed displays a Centroid Marker to move to the location of the storm or feature of interest in any two or more frames of displayed imagery (e.g., a satellite or radar loop). The system then displays a storm track with the direction (degrees) and speed (knots) of movement. When you select the Distance Speed option, the Distance Speed dialog box opens. Mode : You have the following selections from this option. Point : A radio button that allows you to set the Centroid Marker as a single point. Polyline : A radio button that allows you to set the Centroid Marker as a polyline. Legend : You have the following selections from this option. Time : A radio button that allows you to display time with the Centroid Marker. Speed : A radio button that allows you to display speed with the Centroid Marker.",
"title": "Distance Speed"
},
{
"location": "/cave/d2d-tools/#distance-scale",
"text": "Enabling this feature adds a scalebar to the bottom right hand of the main D2D display. This tool can be used to determine the size of a storm or any other meteorological feature of interest.",
"title": "Distance Scale"
},
{
"location": "/cave/d2d-tools/#feature-following-zoom",
"text": "When you zoom in over a small area to be able to view a feature in detail, animation will often cause the feature to move into and then out of the field of view. This tool allows you to follow a feature of interest even when zoomed in to a small area. To use this feature, first, you need to identify the location and motion of the feature, using Distance Speed or the WarnGen tracker. Once satisfied that the tracking icon is following the feature of interest, load this tool, and the center of the zoom area will track with the Distance Speed icon. Toggling the overlay off will resume the standard zooming behavior, and toggling it back on will reinvoke the feature following zoom.",
"title": "Feature Following Zoom"
},
{
"location": "/cave/d2d-tools/#home",
"text": "Selecting the Home option displays a marker, which is an \"X\" with the word \"Home\" next to it.\nClicking on the Home Location Legend with the center mouse button makes the marker editable; drag the \"X\" or click with the right mouse button to change its location. When the Home Marker is displayed, use the Sample feature (click and hold to access the menu to turn on sampling) to display the range in miles and azimuth (in degrees) of the pointer location relative to the Home location.",
"title": "Home"
},
{
"location": "/cave/d2d-tools/#points",
"text": "The Points option initially displays a circular 10-point pattern, labeled A through J on the Map display. Points are used to generate model soundings, time-height cross-sections, time series, and variable vs. height plots using the Volume Browser. As with the Baselines, the locations of these Points can be edited in the following manner: \"Snapping\" an Interactive Point : If you are zoomed in over an area when you load Interactive\nPoints and no Points appear, click the right mouse button to \"snap\" a Point to where the mouse cursor is positioned. The system chooses a Point that has not been recently used. If you are currently working with a Point, then a second right mouse button click will place another Point at the location of your cursor. Dynamic Reference Map : When you generate a model sounding, a time-height cross-section, a\ntime series, or a variable vs. height plot, a small reference map indicating the location(s) of the\nplotted sounding(s) is provided in the upper left corner of the Main Display Pane. Points may be created, deleted, hidden, and manipulated (location, name, font, and color). Points are not limited in terms of number, location, or designation. Points may also be assigned to different groups to facilitate their use. Once the Points tools have been loaded, the addition, deletion, or manipulation of Points can be accomplished in three ways: Create Point Dialog Box : The Create Point dialog box is opened by clicking and holding the right mouse button on the map (but not on any exisiting Point) and selecting the \"New Point...\" option. The Create Point dialog box opens with the Lat and Lon text boxes populated with the latitude and longiture values at the point where you had clicked the right mouse button. The latitude and longitude values can be viewed in \"Degrees : Minutes : Seconds,\" \"Degrees : Minutes,\" or \"Degrees Only\" (N and S refer to North and South; W and E refer to West and East). 
In the Create Point dialog box, you must : Enter the Point's name And may do any of the following: Modify the latitude and longitude values Assign the Point's color and font use Assign the Point to a group Select whether the Point is movable or hidden By default, individual Points do not have an assigned color. They inherit the color of the Interactive Points layer reflected in the Interactive Points product legend. You can change the color of the Interactive Points layer by right clicking on the Interactive Points product legend and selecting a color from the dropdown list. The selected color then changes all points not having an assigned color to the new color. Points can be assigned to \" <No Group> \" which will organize them in the root location containing the group names when accessed by the Edit Points dialog box (see below). Edit Point Dialog Box : The Edit Point dialog box is opened by clicking and holding the right mouse button on a Point on the map and selecting the \"Edit Point...\" option. The latitude and longitude values can be viewed in \"Degrees : Minutes : Seconds,\" \"Degrees : Minutes,\" or \"Degrees Only\" (N and S refer to North and South; W and E refer to West and East). Besides the option of selecting the Edit Points dialog box, you also have the option\nof selecting \"Hide Point,\" \"Delete Point,\" or \"Move Point.\" Once hidden, the Point can be\nunhidden using the Points List dialog box, where you would uncheck the checkbox under\nthe \"Hidden\" column adjacent to the Point that was hidden (see below). If \"Delete Point\" is\nselected, a pop-up opens to confirm whether you want to delete the Point. Selecting the\n\"Move Point\" option moves the Point to wherever you place the cursor on the map. Points List Dialog Box : The Points List dialog box is opened by clicking and holding the right mouse button on the Interactive Points product legend and selecting the \"Edit Points...\" option. 
The Points List dialog box lists all the available groups and Points. Groups can be expanded to\nreview the list of Points assigned to
"title": "Points"
},
{
"location": "/cave/d2d-tools/#put-home-cursor",
"text": "The Put home cursor tool provides an easy way to locate a METAR observation station, a city and\nstate, or a latitude/longitude coordinate. For Canada and Mexico, only the METAR observation stations and latitude/longitude coordinates are accessible. When you select Put home cursor from the Tools dropdown menu, the Home marker X is displayed and the Put Home Cursor dialog box opens. You can use the Home marker, as previously described in the Home Tool, and the new Home location\n(station, city/state, or latitude/longitude) is identified in the Put Home Cursor dialog box.\nAnother way to use this tool is to type in the station, city and state, or latitude and longitude, and select Go, or hit Enter on the keypad, to move the Home marker to the specified location. The new location's nearest METAR site, city and state, and latitude and longitude appear in the Put Home Cursor dialog box. The Put Home Cursor dialog box contains the following options. Location Selection : There are three ways to find a desired location. Once you choose the\nStation, City/State, or Lat/Lon radio button, an Entry Box is activated next to the respective label within the Put Home Cursor dialog box. Enter the desired location information. Go : This menu button initiates the search for the desired station, city/state, or latitude/longitude. The Home marker jumps to the newly specified location.",
"title": "Put home cursor"
},
{
"location": "/cave/d2d-tools/#range-rings",
"text": "The Range Rings Tool displays adjustable range rings around locations of interest to your local office. When you select Range Rings from the Tools dropdown menu, the Range Rings legend appears in the Main Display Pane. The tool comes up editable, and the rangeRing dialog box opens. (Clicking the middle mouse button over the legend toggles tool editability and closes/opens the rangeRing dialog box.) Within this dialog box, you can toggle on/off any of the target locations using the square selectors. Adjust the size of the radii (in nautical miles) by typing a new value in the entry boxes associated with each location and pressing the Apply button. You can also add labels at the center of the range ring and/or at any of the radial distances using the Labels Options menu associated with each location. Using the Movable Rings, you can add a new location at a specific point by using the Interactive Points Tool, or by typing in latitude/longitude coordinates. There is no practical limit on the number of new locations you can add to the display. The list of locations is pre-set but can be customized at a field site.",
"title": "Range Rings"
},
{
"location": "/cave/d2d-tools/#sunsetsunrise",
"text": "By typing a date, as well as the latitude and longitude of a location into the Sunrise/Sunset Tool dialog box, you can obtain the time (for any time zone) of sunrise and sunset, as well as the total length of daylight for that date. Additional features include the ability to calculate the sunrise/sunset in a different hemisphere, and the azimuthal angles, relative to true north, of the sunrise and sunset.",
"title": "Sunset/Sunrise"
},
{
"location": "/cave/d2d-tools/#text-window",
"text": "Selecting this option brings up a Text Display window that behaves in the same way as a window on the Text Workstation , except that the scripts menu is disabled.",
"title": "Text Window"
},
{
"location": "/cave/d2d-tools/#time-of-arrival-lead-time",
"text": "Selecting the Time Of Arrival / Lead Time option displays a tracking line from a feature's initial starting point in a past frame to its final position in the current frame. Once the final position is set, an Arrival Point is displayed. You can drag this point anywhere along the line to get the Time Of Arrival / Lead Time and Distance. You can also change the Mode from Point to Circular Front or Polyline anywhere along the line to better represent the feature(s).",
"title": "Time of Arrival / Lead Time"
},
{
"location": "/cave/d2d-tools/#units-calculator",
"text": "This tool converts the units of the first column into differing units of the second column. The units are grouped into temperature, speed, distance, time, and atmospheric pressure. First, simply type the number and select the units of the value you wish to convert in the firstcolumn entry box. Then in the second column, select the desired units to which you want the original value converted. The new value will appear in the second column entry box.",
"title": "Units Calculator"
},
{
"location": "/cave/d2d-tools/#text-workstation",
"text": "By selecting one of the \"Text\" buttons, a text window opens up. In National Weather Service operations, the text workstation is used to edit new warning text as well as look up past warnings, METARs, and TAFs. This functionality is disabled in the Unidata AWIPS version.",
"title": "Text Workstation"
},
{
"location": "/cave/d2d-radar-tools/",
"text": "Radar Tools\n\uf0c1\n\n\nThe radar tools are a subset of the tools available in CAVE. These programs are accessible though the \nTools\n dropdown menu, and in individual site radar menus.\n\n\n\n\n\n\nEstimated Actual Velocity (EAV)\n\uf0c1\n\n\nA velocity (V) display from the radar shows only the radial component of the wind, so the indicated speed depends on the direction of the wind and the azimuth (direction) from the radar. Consider, for example, a north wind. Straight north of the radar, the full speed of the wind will be seen on the V product. As one moves around to the east of the radar, the radial component gets smaller, eventually reaching zero straight east of the radar. If the wind direction is known, then the actual wind speed can be computed by dividing the observed radial speed by the cosine of the angle between the radar radial and the actual direction. The EAV tool allows you to provide that angle and use the sampling function of the display to show the actual wind speed.\n\n\n\n\n\n\nRadar Display Controls\n\uf0c1\n\n\n\n\n\n\n\n\nThe Radar Display Controls dialog box is derived from the Radar Tools submenu and provides options that control the appearance of the Storm Track Information (STI), the Hail Index (HI), the Tornado Vortex Signature (TVS), the Digital Mesocyclone Display (DMD) products, the Microburst Alert (MBA) products, the Storm Relative Motion (SRM), and the SAILS products. The Radar Display Controls dialog box options are described below.\n\n\n\n\nNote\n: Our version of CAVE may not have all the products that these options are applicable to.\n\n\n\n\nThe Radar Display Controls dialog box is divided into eight sections: \nSTI\n, \nHI\n, \nTVS\n, \nDMD/MD/TVS\n, \nDMD\n, \nMBA\n, \nSRM\n, and \nSAILS\n. 
Each section has the following options:\n\n\nSTI (Storm Track Information)\n\uf0c1\n\n\nThis section has options to adjust the appearance of the STI graphic product.\n\n\nNumber of storms to show\n: This slider bar lets you choose the maximum number of storms (0 to 100) you wish to display on the STI product. The default value is 20 storms.\n\n\nType of track to show\n: This options menu allows you to choose the type of storm track that you want displayed.\n\n\n\n\nHI (Hail Index)\n\uf0c1\n\n\nThis portion of the Radar Display Controls dialog box contains options that alter the appearance of the HI radar graphic product. You can set the low and high algorithm thresholds of the Probability of Hail (POH) and the Probability of Severe Hail (POSH). Storms that meet the low POH threshold are indicated by small open triangles, while small solid triangles mark those that meet the high POH threshold. Similarly, large open triangles or solid triangles are plotted for the POSH low and high thresholds, respectively.\n\n\n\n\nLow hail probability (POH)\n: The storms that meet or exceed the threshold are indicated by small open triangles. The default setting is 30.\n\n\nLow severe hail probability (POSH)\n: The storms that meet or exceed the threshold are indicated by large open triangles. The default setting is 30.\n\n\nHigh hail probability\n: The storms that meet or exceed the threshold are indicated by small solid triangles. The default setting is 50.\n\n\nHigh severe hail probability\n: The storms that meet or exceed the threshold are indicated by large solid triangles. 
The default setting is 50.\n\n\n\n\nTVS (Tornado Vortex Signature)\n\uf0c1\n\n\nThere is one option in this section of the Radar Display Controls dialog box.\n\n\n\n\nShow elevated TVS\n: This toggle button lets you control the appearance of the elevated TVS radar graphic product.\n\n\n\n\nDMD, MD, TVS\n\uf0c1\n\n\nThere is one option in this section of the Radar Display Controls dialog box.\n\n\n\n\nShow extrapolated features\n: With this option, you can choose whether to show the time-extrapolated features using DMD, MD, or TVS.\n\n\n\n\nDMD (Digital Mesocyclone Display)\n\uf0c1\n\n\n\n\nMinimum feature strength\n: A mesocyclone clutter filter which specifies the minimum 3D strength rank used to display a
"title": "Radar Tools"
},
{
"location": "/cave/d2d-radar-tools/#radar-tools",
"text": "The radar tools are a subset of the tools available in CAVE. These programs are accessible though the Tools dropdown menu, and in individual site radar menus.",
"title": "Radar Tools"
},
{
"location": "/cave/d2d-radar-tools/#estimated-actual-velocity-eav",
"text": "A velocity (V) display from the radar shows only the radial component of the wind, so the indicated speed depends on the direction of the wind and the azimuth (direction) from the radar. Consider, for example, a north wind. Straight north of the radar, the full speed of the wind will be seen on the V product. As one moves around to the east of the radar, the radial component gets smaller, eventually reaching zero straight east of the radar. If the wind direction is known, then the actual wind speed can be computed by dividing the observed radial speed by the cosine of the angle between the radar radial and the actual direction. The EAV tool allows you to provide that angle and use the sampling function of the display to show the actual wind speed.",
"title": "Estimated Actual Velocity (EAV)"
},
{
"location": "/cave/d2d-radar-tools/#radar-display-controls",
"text": "The Radar Display Controls dialog box is derived from the Radar Tools submenu and provides options that control the appearance of the Storm Track Information (STI), the Hail Index (HI), the Tornado Vortex Signature (TVS), the Digital Mesocyclone Display (DMD) products, the Microburst Alert (MBA) products, the Storm Relative Motion (SRM), and the SAILS products. The Radar Display Controls dialog box options are described below. Note : Our version of CAVE may not have all the products that these options are applicable to. The Radar Display Controls dialog box is divided into eight sections: STI , HI , TVS , DMD/MD/TVS , DMD , MBA , SRM , and SAILS . Each section has the following options:",
"title": "Radar Display Controls"
},
{
"location": "/cave/d2d-radar-tools/#sti-storm-track-information",
"text": "This section has options to adjust the appearance of the STI graphic product. Number of storms to show : This slider bar lets you choose the maximum number of storms (0 to 100) you wish to display on the STI product. The default value is 20 storms. Type of track to show : This options menu allows you to choose the type of storm track that you want displayed.",
"title": "STI (Storm Track Information)"
},
{
"location": "/cave/d2d-radar-tools/#hi-hail-index",
"text": "This portion of the Radar Display Controls dialog box contains options that alter the appearance of the HI radar graphic product. You can set the low and high algorithm thresholds of the Probability of Hail (POH) and the Probability of Severe Hail (POSH). Storms that meet the low POH threshold are indicated by small open triangles, while small solid triangles mark those that meet the high POH threshold. Similarly, large open triangles or solid triangles are plotted for the POSH low and high thresholds, respectively. Low hail probability (POH) : The storms that meet or exceed the threshold are indicated by small open triangles. The default setting is 30. Low severe hail probability (POSH) : The storms that meet or exceed the threshold are indicated by large open triangles. The default setting is 30. High hail probability : The storms that meet or exceed the threshold are indicated by small solid triangles. The default setting is 50. High severe hail probability : The storms that meet or exceed the threshold are indicated by small solid triangles. The default setting is 50.",
"title": "HI (Hail Index)"
},
{
"location": "/cave/d2d-radar-tools/#tvs-tornado-vortex-signature",
"text": "There is one option in this section of the Radar Display Controls dialog box. Show elevated TVS : This toggle button lets you control the appearance of the elevated TVS radar graphic product.",
"title": "TVS (Tornado Vortex Signature)"
},
{
"location": "/cave/d2d-radar-tools/#dmd-md-tvs",
"text": "There is one option in this section of the Radar Display Controls dialog box. Show extrapolated features : With this option, you can choose whether to show the time-extrapolated features using DMD, MD, or TVS.",
"title": "DMD, MD, TVS"
},
{
"location": "/cave/d2d-radar-tools/#dmd-digital-mesocyclone-display",
"text": "Minimum feature strength : A mesocyclone clutter filter which specifies the minimum 3D strength rank use to display a mesocyclone (default is 5). Show overlapping Mesos : Toggles whether to show overlapping mesocyclones. Type of track to show : This dropdown has option available for whether to display past and/or forcast tracks.",
"title": "DMD (Digital Mesocyclone Display)"
},
{
"location": "/cave/d2d-radar-tools/#mba-microburst-alert",
"text": "Show Wind Shear : This option allows you to choose whether to display wind shear associated with microburts alerts.",
"title": "MBA (Microburst Alert)"
},
{
"location": "/cave/d2d-radar-tools/#srm-storm-relative-motion",
"text": "The first three options in the SRM section allow you to choose where you want to derive the storm motion from. Storm Motion from WarnGen Track : Selecting this option will display the storm motion from a WarnGen Track. Average Storm Motion from STI : Selecting this option will display the average storm motion from from the storm track information (STI). Custom Storm Motion : Selecting this option allows you to specify a custom storm motion with the selections below. Direction : This slider allows you to choose the direction (in degrees??) of the storm motion. Speed : This slider allows you to specify the speed (in mph??) of the storm motion.",
"title": "SRM (Storm Relative Motion)"
},
{
"location": "/cave/d2d-radar-tools/#sails-supplemental-adaptive-intra-volume-low-level-scan",
"text": "Enable SAILS Frame Coordinator : Enabled (default) : keyboard shortcuts change where tilting up from 0.5 degree SAILS tilt will step to the next higher tilt (similar to GR2 Analyst) and Ctrl right arrow will step to the most recent tilt available for any elevation angle. Disabled : keyboard shortcuts change where tilting up from 0.5 degree SAILS tilt will not go anywhere (old confusing behavior) and Ctrl right arrow will step to the most recent time of the current tilt.",
"title": "SAILS (Supplemental Adaptive Intra-Volume Low Level Scan)"
},
{
"location": "/cave/d2d-radar-tools/#vr-shear",
"text": "This tool is used in conjunction with Doppler velocity data to calculate the velocity difference (or \"shear\") of the data directly under the end points. As with the Baselines, this feature comes up editable and the end points can be dragged to specific gates of velocity data. When in place, the speed difference (kts), distance between end points (nautical miles), shear (s-1), and distance from radar (Nmi) are automatically plotted next to the end points and in the upper left corner of the Main Display Pane. A positive shear value indicates cyclonic shear, while a negative value indicates anticyclonic shear. If either end point is not directly over velocity data, the phrase \"no data\" is reported for the shear value. This tool is also useful in determining gate-to-gate shear. Simply place the two end points directly over adjacent gates of velocity data. \"Snapping\" VR Shear : If you are zoomed in over an area when you load VR - Shear, and the VR - Shear Baseline does not appear, click the right mouse button to \"snap\" the Baseline to where the mouse cursor is located. VR - Shear in 4 Panel : You can use the VR - Shear Tool when the large display is in 4 panel\nmode. The VR - Shear overlay is loaded in different colors for each panel. There are actually\nfour copies of the program running, and each behaves independently. This means that you can\nget accurate readings in any one of the four panels \u2014 one VR - Shear panel is editable at a time. To activate, click the center mouse button on the VR - Shear legend in the desired panel and position the query line to the echoes of interest.",
"title": "VR - Shear"
},
{
"location": "/cave/d2d-edit-menus/",
"text": "Editing Menus\n\uf0c1\n\n\nAny of the menus in the menubar can be customized in the \nLocalization Perspective\n. \n\n\n\n\nModifying Menus\n\uf0c1\n\n\nOnce in the \nLocalization Perspective\n, menus can be modified by going to the \nD2D\n > \nMenus\n directory in the File Browser. Here there are submenus for different data types and menu structures. Usually the \nindex.xml\n file found in these submenus is the \nmaster\n file which the actual menu is based off of. This file can reference other xml files and you may have to modify these \nchild\n xml files to get the results you are looking for. \n\n\nIn order to modify any file, you must right click on it and \nselect \nCopy To > USER (my-username)\n. Then you may open this copy and begin to modify it. Once this process has been completed and a change has been made and saved, CAVE will need to be restarted and opened in the D2D perspective to see the change.\n\n\nThis example covers how to add a new menu entry to an existing menu.\n\n\n\n\nSwitch to the \nLocalization Perspective\n\n\nFind the \ngrid\n folder under \nD2D\n > \nMenus\n\n\nDouble-click to expand \nindex.xml\n\n\nRight-click to \nBASE (common_static)\n and select \nCopy To...\n, then select \nUSER\n level\n\n\n\n\nDouble-click \nUSER\n to open the editor and copy an existing include tag, and update the \nmodelName\n (this must match an existing product found in the Product Browser) and the \nmenuName\n (this can be anything)\n\n\n<include installTo=\"menu:models\" fileName=\"menus/grid/allFamilies.xml\">\n <substitute key=\"modelName\" value=\"GEFS\" />\n <substitute key=\"menuName\" value=\"GEFS\" />\n <substitute key=\"frameCount\" value=\"41\" />\n <substitute key=\"TP\" value=\"TP\"/>\n</include>\n\n\n\n\n\n\n\nOnce this is completed, save the file and restart CAVE\n\n\n\n\nNavigate to the \nModels\n menu and you should see a new entry with \nGEFS\n\n\n\n\n\n\nRemoving Menus\n\uf0c1\n\n\nThis example covers how to remove a menu (in this 
case \nMRMS\n) from D2D:\n\n\n\n\nSwitch to the \nLocalization Perspective\n\n\nFind the \nmrms\n folder under \nD2D\n > \nMenus\n\n\nDouble-click to expand \nindex.xml\n\n\nRight-click \nBASE\n and select \nCopy To...\n, then select \nUSER\n level\n\n\nRight-click the \nmrms\n entry and refresh it\n\n\n\n\nDouble-click \nUSER\n to open the editor and change\n\n\n<menuContributionFile>\n <include installTo=\"menu:mrms?after=MRMS_MENU_START\" fileName=\"menus/mrms/mrms.xml\"/>\n</menuContributionFile>\n\n\n\nto \n\n\n<menuContributionFile>\n</menuContributionFile>\n\n\n\n\n\n\n\nWith this completed, you can now restart CAVE and will not see the MRMS menu anymore. Repeat this example for other product menus, such as \nradar\n, \nupperair\n, \ntools\n, etc., to further customize D2D data menus for any level of localization.",
"title": "Editing Menus"
},
{
"location": "/cave/d2d-edit-menus/#editing-menus",
"text": "Any of the menus in the menubar can be customized in the Localization Perspective .",
"title": "Editing Menus"
},
{
"location": "/cave/d2d-edit-menus/#modifying-menus",
"text": "Once in the Localization Perspective , menus can be modified by going to the D2D > Menus directory in the File Browser. Here there are submenus for different data types and menu structures. Usually the index.xml file found in these submenus is the master file which the actual menu is based off of. This file can reference other xml files and you may have to modify these child xml files to get the results you are looking for. In order to modify any file, you must right click on it and \nselect Copy To > USER (my-username) . Then you may open this copy and begin to modify it. Once this process has been completed and a change has been made and saved, CAVE will need to be restarted and opened in the D2D perspective to see the change. This example covers how to add a new menu entry to an existing menu. Switch to the Localization Perspective Find the grid folder under D2D > Menus Double-click to expand index.xml Right-click to BASE (common_static) and select Copy To... , then select USER level Double-click USER to open the editor and copy an existing include tag, and update the modelName (this must match an existing product found in the Product Browser) and the menuName (this can be anything) <include installTo=\"menu:models\" fileName=\"menus/grid/allFamilies.xml\">\n <substitute key=\"modelName\" value=\"GEFS\" />\n <substitute key=\"menuName\" value=\"GEFS\" />\n <substitute key=\"frameCount\" value=\"41\" />\n <substitute key=\"TP\" value=\"TP\"/>\n</include> Once this is completed, save the file and restart CAVE Navigate to the Models menu and you should see a new entry with GEFS",
"title": "Modifying Menus"
},
{
"location": "/cave/d2d-edit-menus/#removing-menus",
"text": "This example covers how to remove a menu (in this case MRMS ) from D2D: Switch to the Localization Perspective Find the mrms folder under D2D > Menus Double-click to expand index.xml Right-click BASE and select Copy To... , then select USER level Right-click refresh the mrms entry Double click USER to open the editor and change <menuContributionFile>\n <include installTo=\"menu:mrms?after=MRMS_MENU_START\" fileName=\"menus/mrms/mrms.xml\"/>\n</menuContributionFile> to <menuContributionFile>\n</menuContributionFile> With this completed, you can now restart CAVE and will not see the MRMS menu anymore. Repeat this example for other product menus, such as radar , upperair , tools , etc., to further customize D2D data menus for any level of localization.",
"title": "Removing Menus"
},
{
"location": "/cave/cave-localization/",
"text": "Change Localization\n\uf0c1\n\n\nLocalization Preferences\n\uf0c1\n\n\nThe default localization site for Unidata AWIPS is OAK (Omaha, Nebraska, where the Raytheon team is located). When you are prompted to connect to an EDEX server, you can change the WFO ID as well. \n\n\n\n\nSince release 16.1.4, CAVE users can switch the localization site to any valid NWS WFO from \nCAVE > Preferences > Localization\n, where edits can be made to both the site ID and EDEX server name. Click \nRestart\n after changes are applied.\n\nThis window also has the option to \nPrompt for settings on startup\n, which if checked, would ask for the EDEX Server and Site location every time CAVE is started (this can be useful if you are used to switching between servers and/or sites).\n\n\n\n\nChange the site (example shows TBW Tampa Bay) and click \nApply\n or \nOK\n and confirm the popup dialog, which informs you that you must \nrestart\n CAVE for the changes to take effect.",
"title": "Change Localization"
},
{
"location": "/cave/cave-localization/#change-localization",
"text": "",
"title": "Change Localization"
},
{
"location": "/cave/cave-localization/#localization-preferences",
"text": "The default localization site for Unidata AWIPS is OAK (Omaha, Nebraska, where the Raytheon team is located). When you are prompted to connect to an EDEX server, you can change the WFO ID as well. Since release 16.1.4, CAVE users can switch the localization site to any valid NWS WFO from CAVE > Preferences > Localization , where edits can be made to both the site ID and EDEX server name. Click Restart after changes are applied. \nThis window also has the option to Prompt for settings on startup , which if checked, would ask for the EDEX Server and Site location every time CAVE is started (this can be useful if you are used to switching between servers and/or sites). Change the site (example shows TBW Tampa Bay) and click Apply or OK and confirm the popup dialog, which informs you that you must restart CAVE for the changes to take effect.",
"title": "Localization Preferences"
},
{
"location": "/cave/import-export/",
"text": "Import/Export\n\uf0c1\n\n\nExport Images/GIFs\n\uf0c1\n\n\nThe D2D screen can be exported as a PNG image as well as an animated GIF using the \nFile > Export > Image\n menu option. This captures the current state of the screen, and allows you to set animation options (frame number, dwell time, etc) for exporting GIFs. If you choose to animate, you will either need to rename the destination file to have the \n.gif\n extension, or CAVE will pop up a dialog when you go to save, asking you to confirm that you want to output a GIF.\n\n\n\n\nNote\n: This functionality \ndoes not\n currently work on \nMac OS\n because it implements OGL libraries which are not compatible on Mac.\n\n\n\n\n\n\n\n\nExport KML\n\uf0c1\n\n\nThe \nExport\n submenu also includes a \nKML\n option (\nFile > Export > KML\n), which allows users to save D2D displays or GFE grids in the KML (Keyhole Markup Language) file format. When zipped (compressed), the KML file format forms a KMZ file, which can be used in applications such as Google Earth.\n\n\n\n\nThe KML dialog box includes options to select frames to export. This includes exporting all frames,\nthe current/displayed frame, a range of frames, and, in GFE, the selected time range as highlighted in\nthe Grid Manager. Additional options are available for selection under the \"Other Options\" section:\n\n\n\n\n\n\nExport Hidden\n: When selected, all displayed and hidden products listed in the Product Legend section of the Main Display Pane will be exported.\n\n\n\n\n\n\nExport Maps\n: When selected, all enabled maps displayed within the Main Display Pane will be\nexported.\n\n\n\n\n\n\nShade Earth\n: When selected, a shaded background is applied to the exported product. 
If loaded in Google Earth, the earth will be overlaid with a black backdrop, and data will be displayed as it would in D2D with a black background.\n\n\n\n\n\n\nShow Background Tiles\n: When selected, data (such as plot data) will display on top of black\ntiles when loaded in Google Earth.\n\n\n\n\n\n\n\n\nCAVE Import Formats\n\uf0c1\n\n\nCAVE supports the following geo-referenced data formats, which can be imported through the \nFile > Import\n menu.\n\n\n\n\nBackground...\n\n\nImage...\n\n\nBCD File\n\n\nGeoTIFF\n\n\nLPI File\n\n\nSPI File\n \n\n\nDisplays\n\n\n\n\n\n\n\n\nCAVE Export Formats\n\uf0c1\n\n\nCAVE can export to the following through the \nFile > Export\n menu.\n\n\n\n\nImage\n\n\nPrint Screen\n\n\nKML\n\n\nEditor Display...\n\n\nPerspective Displays...",
"title": "Import/Export"
},
{
"location": "/cave/import-export/#importexport",
"text": "",
"title": "Import/Export"
},
{
"location": "/cave/import-export/#export-imagesgifs",
"text": "The D2D screen can be exported as a PNG image as well as an animated GIF using the File > Export > Image menu option. This captures the current state of the screen, and allows you to set animation options (frame number, dwell time, etc) for exporting GIFs. If you choose to animate, you will either need to rename the destination file to have the .gif extension, or CAVE will pop up a dialog when you go to save, asking you to confirm that you want to output a GIF. Note : This functionality does not currently work on Mac OS because it implements OGL libraries which are not compatible on Mac.",
"title": "Export Images/GIFs"
},
{
"location": "/cave/import-export/#export-kml",
"text": "The Export submenu also includes a KML option ( File > Export > KML ), which allows users to save D2D displays or GFE grids in the KML (Keyhole Markup Language) file format. When zipped (compressed), the KML file format forms a KMZ file, which can be used in applications such as Google Earth. The KML dialog box includes options to select frames to export. This includes exporting all frames,\nthe current/displayed frame, a range of frames, and, in GFE, the selected time range as highlighted in\nthe Grid Manager. Additional options are available for selection under the \"Other Options\" section: Export Hidden : When selected, all displayed and hidden products listed in the Product Legend section of the Main Display Pane will be exported. Export Maps : When selected, all enabled maps displayed within the Main Display Pane will be\nexported. Shade Earth : When selected, a shaded background is applied to the exported product. If loaded in Google Earth, the earth will be overlaid with a black backdrop, and data will be displayed as it would in D2D with a black background. Show Background Tiles : When selected, data (such as plot data) will display on top of black\ntiles when loaded in Google Earth.",
"title": "Export KML"
},
{
"location": "/cave/import-export/#cave-import-formats",
"text": "CAVE supported the following geo-referenced data files. CAVE can import the following through formats through the File > Import menu. Background... Image... BCD File GeoTIFF LPI File SPI File Displays",
"title": "CAVE Import Formats"
},
{
"location": "/cave/import-export/#cave-export-formats",
"text": "CAVE can export to the following through the File > Export menu. Image Print Screen KML Editor Display... Perspective Displays...",
"title": "CAVE Export Formats"
},
{
"location": "/install/start-edex/",
"text": "EDEX Basic Commands\n\uf0c1\n\n\nUnidata's EDEX install also comes with a \nsimple \nedex\n program\n that can help execute basic EDEX utilities. The most basic of the commands are the following:\n\n\nTo start all EDEX services:\n\n\nedex start\n\n\n\nTo stop all EDEX services:\n\n\nedex stop\n\n\n\n\n\nService and Boot Settings\n\uf0c1\n\n\nThese commands will start and stop five EDEX service files installed into \n/etc/init.d/\n, four of which are run on boot:\n\n\nservice postgres start\nservice httpd-pypies start\nservice qpidd start\nservice edex_camel start\n\n\n\nThe fifth, \nedex_ldm\n, does \nnot run at boot\n to prevent filling up disk space if EDEX is not running:\n\n\nservice edex_ldm start\n\n\n\nAll of these services are started and stopped by the single program: \nedex\n as mentioned above.\n\n\nLDM Troubleshooting\n\uf0c1\n\n\nIf the EDEX machine is shut down abruptly, when restarted, it should start up the processes mentioned \nabove\n. If \nsudo service edex_ldm start\n does not start up LDM smoothly, please try these steps:\n\n\n\n\nAll of the following commands should be run as user \nawips\n and the \nservice\n commands may need to be run with \nsudo\n.\n\n\n\n\n\n\n\n\nRun \nsudo service edex_ldm start\n or \nldmadmin start\n and recieve this message:\n\n\nldmadmin start\n\nstart_ldm(): PID-file \"/awips2/ldm/ldmd.pid\" exists. Verify that all \nis well and then execute \"ldmadmin clean\" to remove the PID-file.\n\n\n\n\n\n\n\nRun \nldmadmin clean\n and \nsudo service edex_ldm start\n and receive this error:\n\n\nldmadmin clean\nsudo service edex_ldm start\n\nChecking the product-queue...\nThe writer-counter of the product-queue isn't zero. 
Either a process\nhas the product-queue open for writing or the queue might be corrupt.\nTerminate the process and recheck or use\n pqcat -l- -s -q /awips2/ldm/var/queues/ldm.pq && pqcheck -F -q\n /awips2/ldm/var/queues/ldm.pq\nto validate the queue and set the writer-counter to zero.\nLDM not started\n\n\n\n\n\n\n\nTo resolve the above, run:\n\n\npqcat -l- -s -q /awips2/ldm/var/queues/ldm.pq && pqcheck -F -q /awips2/ldm/var/queues/ldm.pq \nldmadmin delqueue\nldmadmin mkqueue\nsudo service edex_ldm start\n\n\n\n\n\n\n\n\n\nEDEX Commands\n\uf0c1\n\n\nUnidata's version of EDEX installs with a helpful \nedex\n script that can be used for basic EDEX tasks.\n\n\nedex start\n\uf0c1\n\n\nedex start\n\nStarting EDEX PostgreSQL: [ OK ]\nStarting httpd: [ OK ]\nStarting QPID [ OK ]\nStarting EDEX Camel (request): \nStarting EDEX Camel (ingest): \nStarting EDEX Camel (ingestGrib): \nStarting AWIPS LDM:The product-queue is OK.\n...\n\n\n\nedex start base\n\uf0c1\n\n\nTo start all EDEX services \nexcept\n the LDM:\n\n\nedex start base\n\n\n\n\n\nedex stop\n\uf0c1\n\n\nedex stop\n\nStopping EDEX Camel (request): \nStopping EDEX Camel (ingest): \nStopping EDEX Camel (ingestGrib): \nStopping QPID [ OK ]\nStopping httpd: [ OK ]\nStopping EDEX PostgreSQL: [ OK ]\nStopping AWIPS LDM:Stopping the LDM server...\n...\n\n\n\n\n\nedex setup\n\uf0c1\n\n\nedex setup\n\n[edex] EDEX IP and Hostname Setup\n Checking /awips2/database/data/pg_hba.conf [OK]\n Checking /awips2/edex/bin/setup.env [OK]\n\n[edit] Hostname edex.unidata.ucar.edu added to /awips2/ldm/etc/ldmd.conf\n[done]\n\n\n\nThis command configures and/or confirms that the EDEX hostname and IP address definitions exist (\nedex setup\n is run by \nedex start\n).\n\n\n\n\nNote\n: If your EDEX server is running but you see the message \n\"Connectivity Error: Unable to validate localization preferences\"\n in CAVE, it may mean that the domain name defined in \n/awips2/edex/bin/setup.env\n can not be resolved from \noutside\n the server. 
Some machines have different \ninternally-resolved\n and \nexternally-resolved\n domain names (cloud-based especially). The name defined in \nsetup.env\n must be \nexternally-resolvable\n.",
"title": "EDEX Basic Commands"
},
{
"location": "/install/start-edex/#edex-basic-commands",
"text": "Unidata's EDEX install also comes with a simple edex program that can help execute basic EDEX utilities. The most basic of the commands are the following: To start all EDEX services: edex start To stop all EDEX services: edex stop",
"title": "EDEX Basic Commands"
},
{
"location": "/install/start-edex/#service-and-boot-settings",
"text": "These commands will start and stop five EDEX service files installed into /etc/init.d/ , four of which are run on boot: service postgres start\nservice httpd-pypies start\nservice qpidd start\nservice edex_camel start The fifth, edex_ldm , does not run at boot to prevent filling up disk space if EDEX is not running: service edex_ldm start All of these services are started and stopped by the single program: edex as mentioned above.",
"title": "Service and Boot Settings"
},
{
"location": "/install/start-edex/#ldm-troubleshooting",
"text": "If the EDEX machine is shut down abruptly, when restarted, it should start up the processes mentioned above . If sudo service edex_ldm start does not start up LDM smoothly, please try these steps: All of the following commands should be run as user awips and the service commands may need to be run with sudo . Run sudo service edex_ldm start or ldmadmin start and recieve this message: ldmadmin start\n\nstart_ldm(): PID-file \"/awips2/ldm/ldmd.pid\" exists. Verify that all \nis well and then execute \"ldmadmin clean\" to remove the PID-file. Run ldmadmin clean and sudo service edex_ldm start and receive this error: ldmadmin clean\nsudo service edex_ldm start\n\nChecking the product-queue...\nThe writer-counter of the product-queue isn't zero. Either a process\nhas the product-queue open for writing or the queue might be corrupt.\nTerminate the process and recheck or use\n pqcat -l- -s -q /awips2/ldm/var/queues/ldm.pq && pqcheck -F -q\n /awips2/ldm/var/queues/ldm.pq\nto validate the queue and set the writer-counter to zero.\nLDM not started To resolve the above, run: pqcat -l- -s -q /awips2/ldm/var/queues/ldm.pq && pqcheck -F -q /awips2/ldm/var/queues/ldm.pq \nldmadmin delqueue\nldmadmin mkqueue\nsudo service edex_ldm start",
"title": "LDM Troubleshooting"
},
{
"location": "/install/start-edex/#edex-commands",
"text": "Unidata's version of EDEX installs with a helpful edex script that can be used for basic EDEX tasks.",
"title": "EDEX Commands"
},
{
"location": "/install/start-edex/#edex-start",
"text": "edex start\n\nStarting EDEX PostgreSQL: [ OK ]\nStarting httpd: [ OK ]\nStarting QPID [ OK ]\nStarting EDEX Camel (request): \nStarting EDEX Camel (ingest): \nStarting EDEX Camel (ingestGrib): \nStarting AWIPS LDM:The product-queue is OK.\n...",
"title": "edex start"
},
{
"location": "/install/start-edex/#edex-start-base",
"text": "To start all EDEX services except the LDM: edex start base",
"title": "edex start base"
},
{
"location": "/install/start-edex/#edex-stop",
"text": "edex stop\n\nStopping EDEX Camel (request): \nStopping EDEX Camel (ingest): \nStopping EDEX Camel (ingestGrib): \nStopping QPID [ OK ]\nStopping httpd: [ OK ]\nStopping EDEX PostgreSQL: [ OK ]\nStopping AWIPS LDM:Stopping the LDM server...\n...",
"title": "edex stop"
},
{
"location": "/install/start-edex/#edex-setup",
"text": "edex setup\n\n[edex] EDEX IP and Hostname Setup\n Checking /awips2/database/data/pg_hba.conf [OK]\n Checking /awips2/edex/bin/setup.env [OK]\n\n[edit] Hostname edex.unidata.ucar.edu added to /awips2/ldm/etc/ldmd.conf\n[done] This command configures and/or confirms that the EDEX hostname and IP address definitions exist ( edex setup is run by edex start ). Note : If your EDEX server is running but you see the message \"Connectivity Error: Unable to validate localization preferences\" in CAVE, it may mean that the domain name defined in /awips2/edex/bin/setup.env can not be resolved from outside the server. Some machines have different internally-resolved and externally-resolved domain names (cloud-based especially). The name defined in setup.env must be externally-resolvable .",
"title": "edex setup"
},
{
"location": "/install/start-edex/#edex-log",
"text": "edex log\n\n[edex] EDEX Log Viewer\n\n :: No log specified - Defaulting to ingest log\n :: Viewing /awips2/edex/logs/edex-ingest-20151209.log. Press CTRL+C to exit\n\nINFO [Ingest.binlightning-1] /awips2/data_store/SFPA42_KWBC_091833_38031177.2015120918 processed in: 0.0050 (sec) Latency: 0.0550 (sec)\nINFO [Ingest.obs-1] /awips2/data_store/metar/SAIN31_VABB_091830_131392869.2015120918 processed in: 0.0810 (sec) Latency: 0.1800 (sec) More edex logs... edex log grib\nedex log request\nedex log ldm\nedex log radar\nedex log satellite\nedex log text",
"title": "edex log"
},
{
"location": "/install/start-edex/#edex-qpid",
"text": "Shows a list of the the Qpid message queue to monitor data ingest (messages in vs messages out, i.e. decoded): [centos@js-156-89 ~]$ edex qpid\nQueues\n queue dur excl msg msgIn msgOut bytes bytesIn bytesOut cons bind\n ================================================================================================\n external.dropbox Y Y 11 1.26m 1.26m 621 79.6m 79.6m 5 1\n Ingest.Radar Y Y 4 589k 589k 184 27.1m 27.1m 5 1\n Ingest.GribDecode Y Y 0 370k 370k 0 103m 103m 11 1\n Ingest.GribSplit Y Y 2 361k 361k 201 31.9m 31.9m 5 1\n Ingest.modelsounding Y Y 0 100k 100k 0 6.54m 6.54m 1 1\n Ingest.Text Y Y 0 97.8k 97.8k 0 5.25m 5.25m 2 1\n Ingest.GOESR Y Y 0 83.4k 83.4k 0 6.92m 6.92m 2 1\n Ingest.obs Y Y 0 46.2k 46.2k 0 2.40m 2.40m 1 1\n Grid.PostProcess Y Y 0 20.2k 20.2k 0 6.68m 6.68m 1 1\n Ingest.sfcobs Y Y 0 10.5k 10.5k 0 577k 577k 1 1\n Ingest.goessounding Y Y 0 6.68k 6.68k 0 427k 427k 1 1\n Ingest.Glm Y Y 0 5.61k 5.61k 0 581k 581k 1 1\n Ingest.aww Y Y 0 3.32k 3.32k 0 182k 182k 1 1",
"title": "edex qpid"
},
{
"location": "/install/start-edex/#edex-users",
"text": "To see a list of clients connecting to your EDEX server, use the edex users [YYYYMMDD] command, where [YYYYMMDD] is the optional date string. edex users\n\n -- EDEX Users 20160826 --\nuser@101.253.20.225\nuser@192.168.1.67\nawips@0.0.0.0\nawips@sdsmt.edu",
"title": "edex users"
},
{
"location": "/install/start-edex/#edex-purge",
"text": "To view any stuck purge jobs in PortgreSQL (a rare but serious problem if your disk fills up). The solution to this is to run edex purge reset .",
"title": "edex purge"
},
{
"location": "/edex/settings/",
"text": "EDEX Settings\n\uf0c1\n\n\nPlugin Configuration\n\uf0c1\n\n\nThe directory \n/awips2/edex/conf/resources\n contains configuration text files for specific plugins, which allow for user-defined values which are read by AWIPS plugins on EDEX start:\n\n\nacarssounding.properties\nautobldsrv.properties\ncom.raytheon.edex.plugin.gfe.properties\ncom.raytheon.edex.text.properties\ncom.raytheon.uf.common.registry.ebxml.properties\ncom.raytheon.uf.edex.archive.cron.properties\ncom.raytheon.uf.edex.database.properties\ncom.raytheon.uf.edex.registry.ebxml.properties\ndistribution.properties\nedex-localization-http.properties\nedex-requestsrv.properties\nedex-uengine.properties\neventBus.properties\nftp.properties\ngoesr.properties\ngrib.properties\nmaintenance.properties\nproxy.properties\npurge.properties\nquartz.properties\nradar.properties\nstats.properties\ntextdbsrv.properties\nwarning.properties\n\n\n\nLook at \npurge.properties\n for example:\n\n\n# Master switch to enable and disable purging\npurge.enabled=true\n\n# Interval at which the purge job kicks off\npurge.cron=0+0/15+*+*+*+?\n\n# Interval at which the outgoing files are purged\npurge.outgoing.cron=0+30+*+*+*+?\n\n# Interval at which the logs are purged\npurge.logs.cron=0+30+0+*+*+?\n\n# Interval at which hdf5 orphans are purged\npurge.orphan.period=24h\n\n# Number of days older than the earliest known data to delete.\npurge.orphan.buffer=7\n...\n\n\n\nIn \ngrib.properties\n, \ngoesr.properties\n, and \nradar.properties\n you can adjust the number of decoder threads for each plugin.\n\n\ncat radar.properties\n\n# Number threads for radar products ingested from the SBN\nradar-decode.sbn.threads=5\n\n\n\n\n\nIngest Modes\n\uf0c1\n\n\nBy default, EDEX starts three \"modes\": \ningest\n, \ningestGrib\n, and \nrequest\n (each as its own JVM).\n\n\nThe file \n/awips2/edex/conf/modes/modes.xml\n contains all available mode definitions, including some specific modes for Hydro Server Applications, ebXML 
Registries, Data Delivery, and more.\n\n\nEDEX services are registered through spring, and by including or excluding specific spring files (usually by datatype plugin name) we can finely customize EDEX startup. \n\n\nIn \n/awips2/edex/conf/modes/modes.xml\n there are a number of unused plugin decoders excluded because the data are not available outside of the SBN:\n\n\n...\n<mode name=\"ingest\">\n <exclude>.*request.*</exclude>\n <exclude>edex-security.xml</exclude>\n <exclude>ebxml.*\\.xml</exclude>\n <exclude>grib-decode.xml</exclude>\n <exclude>grid-staticdata-process.xml</exclude>\n <exclude>.*(dpa|taf|nctext).*</exclude>\n <exclude>webservices.xml</exclude>\n <exclude>.*datadelivery.*</exclude>\n <exclude>.*bandwidth.*</exclude>\n <exclude>.*sbn-simulator.*</exclude>\n <exclude>hydrodualpol-ingest.xml</exclude>\n <exclude>grid-metadata.xml</exclude>\n <exclude>.*ogc.*</exclude>\n <exclude>obs-ingest-metarshef.xml</exclude>\n <exclude>ffmp-ingest.xml</exclude>\n <exclude>scan-ingest.xml</exclude>\n <exclude>cwat-ingest.xml</exclude>\n <exclude>fog-ingest.xml</exclude>\n <exclude>vil-ingest.xml</exclude>\n <exclude>preciprate-ingest.xml</exclude>\n <exclude>qpf-ingest.xml</exclude>\n <exclude>fssobs-ingest.xml</exclude>\n <exclude>cpgsrv-spring.xml</exclude>\n</mode>\n...\n\n\n\nIn this example, request, ebXML, grib plugins, OGC and other plugins are excluded because they are included in their own mode/JVM.\n\n\n\n\nNote\n: TAF and NCTEXT plugins are disabled here due to performance issues.\n\n\n\n\n\n\nJVM Memory\n\uf0c1\n\n\nThe directory \n/awips2/edex/etc/\n contains files which define the amount of memory used for each of the three EDEX JVMs (ingest, ingestGrib, request):\n\n\nls -al /awips2/edex/etc/\n-rw-r--r-- 1 awips fxalpha 1287 Jul 24 18:41 centralRegistry.sh\n-rw-r--r-- 1 awips fxalpha 1155 Jul 24 18:42 default.sh\n-rw-r--r-- 1 awips fxalpha 1956 Jul 24 18:41 ingestGrib.sh\n-rw-r--
"title": "EDEX Settings"
},
{
"location": "/edex/settings/#edex-settings",
"text": "",
"title": "EDEX Settings"
},
{
"location": "/edex/settings/#plugin-configuration",
"text": "The directory /awips2/edex/conf/resources contains configuration text files for specific plugins, which allow for user-defined values which are read by AWIPS plugins on EDEX start: acarssounding.properties\nautobldsrv.properties\ncom.raytheon.edex.plugin.gfe.properties\ncom.raytheon.edex.text.properties\ncom.raytheon.uf.common.registry.ebxml.properties\ncom.raytheon.uf.edex.archive.cron.properties\ncom.raytheon.uf.edex.database.properties\ncom.raytheon.uf.edex.registry.ebxml.properties\ndistribution.properties\nedex-localization-http.properties\nedex-requestsrv.properties\nedex-uengine.properties\neventBus.properties\nftp.properties\ngoesr.properties\ngrib.properties\nmaintenance.properties\nproxy.properties\npurge.properties\nquartz.properties\nradar.properties\nstats.properties\ntextdbsrv.properties\nwarning.properties Look at purge.properties for example: # Master switch to enable and disable purging\npurge.enabled=true\n\n# Interval at which the purge job kicks off\npurge.cron=0+0/15+*+*+*+?\n\n# Interval at which the outgoing files are purged\npurge.outgoing.cron=0+30+*+*+*+?\n\n# Interval at which the logs are purged\npurge.logs.cron=0+30+0+*+*+?\n\n# Interval at which hdf5 orphans are purged\npurge.orphan.period=24h\n\n# Number of days older than the earliest known data to delete.\npurge.orphan.buffer=7\n... In grib.properties , goesr.properties , and radar.properties you can adjust the number of decoder threads for each plugin. cat radar.properties\n\n# Number threads for radar products ingested from the SBN\nradar-decode.sbn.threads=5",
"title": "Plugin Configuration"
},
{
"location": "/edex/settings/#ingest-modes",
"text": "By default, EDEX starts three \"modes\": ingest , ingestGrib , and request (each as its own JVM). The file /awips2/edex/conf/modes/modes.xml contains all available mode definitions, including some specific modes for Hydro Server Applications, ebXML Registries, Data Delivery, and more. EDEX services are registered through spring, and by including or excluding specific spring files (usually by datatype plugin name) we can finely customize EDEX startup. In /awips2/edex/conf/modes/modes.xml there are a number of unused plugin decoders excluded because the data are not available outside of the SBN: ...\n<mode name=\"ingest\">\n <exclude>.*request.*</exclude>\n <exclude>edex-security.xml</exclude>\n <exclude>ebxml.*\\.xml</exclude>\n <exclude>grib-decode.xml</exclude>\n <exclude>grid-staticdata-process.xml</exclude>\n <exclude>.*(dpa|taf|nctext).*</exclude>\n <exclude>webservices.xml</exclude>\n <exclude>.*datadelivery.*</exclude>\n <exclude>.*bandwidth.*</exclude>\n <exclude>.*sbn-simulator.*</exclude>\n <exclude>hydrodualpol-ingest.xml</exclude>\n <exclude>grid-metadata.xml</exclude>\n <exclude>.*ogc.*</exclude>\n <exclude>obs-ingest-metarshef.xml</exclude>\n <exclude>ffmp-ingest.xml</exclude>\n <exclude>scan-ingest.xml</exclude>\n <exclude>cwat-ingest.xml</exclude>\n <exclude>fog-ingest.xml</exclude>\n <exclude>vil-ingest.xml</exclude>\n <exclude>preciprate-ingest.xml</exclude>\n <exclude>qpf-ingest.xml</exclude>\n <exclude>fssobs-ingest.xml</exclude>\n <exclude>cpgsrv-spring.xml</exclude>\n</mode>\n... In this example, request, ebXML, grib plugins, OGC and other plugins are excluded because they are included in their own mode/JVM. Note : TAF and NCTEXT plugins are disabled here due to performance issues.",
"title": "Ingest Modes"
},
{
"location": "/edex/settings/#jvm-memory",
"text": "The directory /awips2/edex/etc/ contains files which define the amount of memory used for each of the three EDEX JVMs (ingest, ingestGrib, request): ls -al /awips2/edex/etc/\n-rw-r--r-- 1 awips fxalpha 1287 Jul 24 18:41 centralRegistry.sh\n-rw-r--r-- 1 awips fxalpha 1155 Jul 24 18:42 default.sh\n-rw-r--r-- 1 awips fxalpha 1956 Jul 24 18:41 ingestGrib.sh\n-rw-r--r-- 1 awips fxalpha 337 Jul 24 18:36 ingest.sh\n-rw-r--r-- 1 awips fxalpha 848 Jul 24 18:42 profiler.sh\n-rw-r--r-- 1 awips fxalpha 1188 Jul 24 18:41 registry.sh\n-rw-r--r-- 1 awips fxalpha 601 Jul 24 18:36 request.sh Each file contains the Xmx definition for maximum memory: ...\nexport INIT_MEM=512 # in Meg\nexport MAX_MEM=4096 # in Meg\n... After editing these files, you must restart : service edex_camel restart .",
"title": "JVM Memory"
},
{
"location": "/edex/distributed-computing/",
"text": "Distributed EDEX\n\uf0c1\n\n\nAWIPS makes use of service-oriented architecture to request, process, and serve real-time meteorological data. While originally developed for use on internal NWS forecast office networks, where operational installations of AWIPS could consist of a dozen servers or more, the early Unidata releases were stripped of operations-specific configurations and plugins, and released as a standalone server. This worked, since (at the time) a single EDEX instance with an attached SSD could handle most of NOAAport. However, with GOES-R(16) coming online in 2017, and more gridded forecast models being created at finer temporal and spatial resolutions, there is now a need to distribute the data decoding across multiple machines to handle this firehose of data.\n\n\n\n\nUnidata's Current EDEX Server\n\uf0c1\n\n\nCurrently, with our specific EDEX server we use a Database/Request instance that also decodes and ingests a good portion of the data. It handles all data requests from CAVE users, as well as the majority of the decoding and ingesting for data feeds coming down on the LDM. The \nradar\n data has been specifically exluded (from the decoding and ingest) and it has its own \nIngest/Decode Server\n which is explained in more detail below.\n\n\nFor our EDEX we have designated an instance of the ingest/decoding server to be dedicated to handling the radar data. Our \nRadar-EDEX\n recieves and decodes all radar down from the LDM and then stores it back on our main \nDatabase/Request EDEX\n in the form of HDF5 data files and PostgreSQL metadata.\n\n\n\n\nExample Installation\n\uf0c1\n\n\nThis walkthrough will install different EDEX components on two machines in the XSEDE Jetstream Cloud, the first is used to \nstore and serve\n while the second is used to \ningest and decode\n data.\n\n\n\n\n\n\nDatabase/Request Server\n\uf0c1\n\n\nFor this example, this server will be referred to by the IP address \n10.0.0.9\n.\n\n\n1. 
Install\n\uf0c1\n\n\ngroupadd fxalpha && useradd -G fxalpha awips\nmkdir /awips2\nwget -O /etc/yum.repos.d/awips2.repo https://www.unidata.ucar.edu/software/awips2/doc/awips2.repo\nyum clean all\nyum groupinstall awips2-database\n\n\n\n2. IPtables Config\n\uf0c1\n\n\nIt is required that ports 5432 and 5672 be open for the specific IP addresses of outside EDEX ingest servers. It is \nnot recommended\n that you leave port 5432 open to all connections (since the default awips database password is known, and is not meant as a security measure). Further, it \nis recommended\n that you change the default postgres awips user password (which then requires a reconfiguration of every remote EDEX ingest server in order to connect to this database/request server).\n\n\nvi /etc/sysconfig/iptables\n\n*filter\n:INPUT DROP [0:0]\n:FORWARD DROP [0:0]\n:OUTPUT ACCEPT [0:0]\n:EXTERNAL - [0:0]\n:EDEX - [0:0]\n-A INPUT -i lo -j ACCEPT\n-A INPUT -p icmp --icmp-type any -j ACCEPT\n-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT\n-A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT\n-A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT\n-A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT\n-A INPUT -s 10.0.0.7 -j EDEX\n-A INPUT -j EXTERNAL\n-A EXTERNAL -j REJECT\n-A EDEX -m state --state NEW -p tcp --dport 5432 -j ACCEPT\n-A EDEX -m state --state NEW -p tcp --dport 5672 -j ACCEPT\n-A EDEX -j REJECT\nCOMMIT\n\n\n\nNote the line \n-A INPUT -s 10.0.0.7 -j EDEX\n as well as the following \n-A EDEX ...\n rules for ports 5432 (PostgreSQL) and 5672 (PyPIES/HDF5). \n\n\n\n\nThe two ports left open to all connections (9581,9582) in addition to default port 22 are for outside CAVE client connections\n\n\n\n\n3. 
Database Config\n\uf0c1\n\n\nIn the file \n/awips2/database/data/pg_hba.conf\n you define remote connections for all postgres tables with as \n<IP address>/32\n, after the block of IPv4 local connections:\n\n\nvi /awips2/database/data/pg_hba.conf\n\n# \"local\" is for Unix domain socket connections only\nlocal all all
"title": "Distributed EDEX"
},
{
"location": "/edex/distributed-computing/#distributed-edex",
"text": "AWIPS makes use of service-oriented architecture to request, process, and serve real-time meteorological data. While originally developed for use on internal NWS forecast office networks, where operational installations of AWIPS could consist of a dozen servers or more, the early Unidata releases were stripped of operations-specific configurations and plugins, and released as a standalone server. This worked, since (at the time) a single EDEX instance with an attached SSD could handle most of NOAAport. However, with GOES-R(16) coming online in 2017, and more gridded forecast models being created at finer temporal and spatial resolutions, there is now a need to distribute the data decoding across multiple machines to handle this firehose of data.",
"title": "Distributed EDEX"
},
{
"location": "/edex/distributed-computing/#unidatas-current-edex-server",
"text": "Currently, with our specific EDEX server we use a Database/Request instance that also decodes and ingests a good portion of the data. It handles all data requests from CAVE users, as well as the majority of the decoding and ingesting for data feeds coming down on the LDM. The radar data has been specifically exluded (from the decoding and ingest) and it has its own Ingest/Decode Server which is explained in more detail below. For our EDEX we have designated an instance of the ingest/decoding server to be dedicated to handling the radar data. Our Radar-EDEX recieves and decodes all radar down from the LDM and then stores it back on our main Database/Request EDEX in the form of HDF5 data files and PostgreSQL metadata.",
"title": "Unidata's Current EDEX Server"
},
{
"location": "/edex/distributed-computing/#example-installation",
"text": "This walkthrough will install different EDEX components on two machines in the XSEDE Jetstream Cloud, the first is used to store and serve while the second is used to ingest and decode data.",
"title": "Example Installation"
},
{
"location": "/edex/distributed-computing/#databaserequest-server",
"text": "For this example, this server will be referred to by the IP address 10.0.0.9 .",
"title": "Database/Request Server"
},
{
"location": "/edex/distributed-computing/#1-install",
"text": "groupadd fxalpha && useradd -G fxalpha awips\nmkdir /awips2\nwget -O /etc/yum.repos.d/awips2.repo https://www.unidata.ucar.edu/software/awips2/doc/awips2.repo\nyum clean all\nyum groupinstall awips2-database",
"title": "1. Install"
},
{
"location": "/edex/distributed-computing/#2-iptables-config",
"text": "It is required that ports 5432 and 5672 be open for the specific IP addresses of outside EDEX ingest servers. It is not recommended that you leave port 5432 open to all connections (since the default awips database password is known, and is not meant as a security measure). Further, it is recommended that you change the default postgres awips user password (which then requires a reconfiguration of every remote EDEX ingest server in order to connect to this database/request server). vi /etc/sysconfig/iptables\n\n*filter\n:INPUT DROP [0:0]\n:FORWARD DROP [0:0]\n:OUTPUT ACCEPT [0:0]\n:EXTERNAL - [0:0]\n:EDEX - [0:0]\n-A INPUT -i lo -j ACCEPT\n-A INPUT -p icmp --icmp-type any -j ACCEPT\n-A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT\n-A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT\n-A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT\n-A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT\n-A INPUT -s 10.0.0.7 -j EDEX\n-A INPUT -j EXTERNAL\n-A EXTERNAL -j REJECT\n-A EDEX -m state --state NEW -p tcp --dport 5432 -j ACCEPT\n-A EDEX -m state --state NEW -p tcp --dport 5672 -j ACCEPT\n-A EDEX -j REJECT\nCOMMIT Note the line -A INPUT -s 10.0.0.7 -j EDEX as well as the following -A EDEX ... rules for ports 5432 (PostgreSQL) and 5672 (PyPIES/HDF5). The two ports left open to all connections (9581,9582) in addition to default port 22 are for outside CAVE client connections",
"title": "2. IPtables Config"
},
{
"location": "/edex/distributed-computing/#3-database-config",
"text": "In the file /awips2/database/data/pg_hba.conf you define remote connections for all postgres tables with as <IP address>/32 , after the block of IPv4 local connections: vi /awips2/database/data/pg_hba.conf\n\n# \"local\" is for Unix domain socket connections only\nlocal all all trust\nhostssl all all 10.0.0.7/32 cert clientcert=1\nhostssl all all 162.0.0.0/8 cert clientcert=1\nhostssl all all 127.0.0.1/32 cert clientcert=1\n# IPv6 local connections:\nhostssl all all ::1/128 cert clientcert=1\nhostnossl all all ::1/128 md5",
"title": "3. Database Config"
},
{
"location": "/edex/distributed-computing/#4-start-edex",
"text": "edex start database This will start PostgreSQL, httpd-pypies, Qpid, and the EDEX Request JVM (and will not start the LDM or the EDEX Ingest and IngestGrib JVMs)",
"title": "4. Start EDEX"
},
{
"location": "/edex/distributed-computing/#5-monitor-services",
"text": "The command edex will show which services are running, and for a Database/Request server, will not include the LDM, EDEXingest, or EDEXgrib: edex\n\n[edex status]\npostgres :: running :: pid 571\npypies :: running :: pid 639\nqpid :: running :: pid 674\nEDEXingest :: not running\nEDEXgrib :: not running\nEDEXrequest :: running :: pid 987 1029 23792 Since this Database/Request server is not running the main edexIngest JVM, we won't see anything from edex log , instead watch the Request Server with the command edex log request Confirm that EDEX Request connects to PostgreSQL! With the above edex log request , ensure that the log progresses past this point : Spring-enabled Plugins:\n-----------------------\nacars-common, acars-common-dataaccess, acarssounding-common, activetable-common,\nactivetable-request, airep-common, airep-common-dataaccess, airmet-common, \natcf-common, atcf-request, auth-request, awipstools-request, aww-common...\n\nJAXB context for PersistencePathKeySet inited in: 5ms\nINFO 20:21:09,134 5584 [EDEXMain] Reflections: Reflections took 436 ms to scan 258 urls, producing 31 keys and 3637 values\nFound 499 db classes in 720 ms If the log stops at the Found db classes... line, that means EDEX is not connecting to PostgreSQL - double-check DB_ADDR in /awips2/edex/bin/setup.env",
"title": "5. Monitor Services"
},
{
"location": "/edex/distributed-computing/#ingestdecode-server",
"text": "For this example, this server will be referred to by the IP address 10.0.0.7 .",
"title": "Ingest/Decode Server"
},
{
"location": "/edex/distributed-computing/#1-install_1",
"text": "groupadd fxalpha && useradd -G fxalpha awips\nwget -O /etc/yum.repos.d/awips2.repo https://www.unidata.ucar.edu/software/awips2/doc/awips2.repo\nyum clean all\nyum groupinstall awips2-ingest",
"title": "1. Install"
},
{
"location": "/edex/distributed-computing/#2-edex-config",
"text": "vi /awips2/edex/bin/setup.env Here you should redefine DB_ADDR and PYPIES_SERVER to point to the Database/Request server (10.0.0.9) export EDEX_SERVER=10.0.0.7\n\n# postgres connection\nexport DB_ADDR=10.0.0.9\nexport DB_PORT=5432\n\n# pypies hdf5 connection\nexport PYPIES_SERVER=http://10.0.0.9:9582\n\n# qpid connection\nexport BROKER_ADDR=${EDEX_SERVER} Notice that EDEX_SERVER and BROKER_ADDR (qpid) should remain defined as the localhost IP address (10.0.0.7)",
"title": "2. EDEX Config"
},
{
"location": "/edex/distributed-computing/#3-start-edex",
"text": "edex start ingest This will start Qpid and the EDEX Ingest and IngestGrib JVMs (and not start PostgreSQL, httpd-pypies, or the EDEX Request JVM)",
"title": "3. Start EDEX"
},
{
"location": "/edex/distributed-computing/#4-monitor-services",
"text": "Watch the edex JVM log with the command edex log Confirm that EDEX connects to PostgreSQL! With the above edex log , ensure that the log progresses past this point : Spring-enabled Plugins:\n-----------------------\nacars-common, acars-common-dataaccess, acarssounding-common, activetable-common,\nactivetable-ingest, airep-common, airep-common-dataaccess, airmet-common, \natcf-common, atcf-ingest, aww-common...\n\nJAXB context for PersistencePathKeySet inited in: 5ms\nINFO 20:21:09,134 5584 [EDEXMain] Reflections: Reflections took 436 ms to scan 258 urls, producing 31 keys and 3637 values\nFound 499 db classes in 720 ms If the log stops at the Found db classes... line, that means EDEX is not connecting to the remote PostgreSQL instance - double-check DB_ADDR in /awips2/edex/bin/setup.env You can manually check remote PostgreSQL connectivity on any EDEX Ingest server from the command line: su - awips\npsql -U awips -h <remote IP address> -p 5432 metadata Where the default passwd is awips and is defined in files in /awips2/edex/conf/db/hibernateConfig/",
"title": "4. Monitor Services"
},
{
"location": "/edex/distributed-computing/#additional-notes",
"text": "Be mindful of what IP address and hostnames are used in /awips2/edex/bin/setup.env and /awips2/database/data/pg_hba.conf , and that they are resolvable from the command line. Consult or edit /etc/hosts as needed. You can install multiple awips2-ingest servers, each decoding a different dataset or feed, all pointing to the same Database/Request server ( DB_ADDR and PYPIES_SERVER in /awips2/edex/bin/setup.env ): Every EDEX Ingest IP address must be allowed in both iptables and pg_hba.conf as shown above .",
"title": "Additional Notes"
},
{
"location": "/edex/edex-ingest-docker-container/",
"text": "Docker EDEX\n\uf0c1\n\n\nProject home: \nhttps://github.com/Unidata/edex-docker\n\n\n\n\nEDEX can be run inside a docker container, which allows you to process data into an AWIPS system without requiring accessing and altering the machine's native CentOS installation and configuration.\n\n\nThe \nEDEX Docker Image\n is built on CentOS 7 and contains the latest Unidata AWIPS release (18.1.1). \n\n\nThis container is an \ningest-only\n install, meaning there is \nno database or request server\n. This example requires a Database/Request server be configured for you to access remotely. See the \nDistributed EDEX\n document for more. \n\n\n\n\nDownload and Install Docker\n\uf0c1\n\n\nDownload and install Docker and Docker Compose:\n\n\n\n\nDocker for CentOS 7 Linux\n\n\nDocker for Mac\n\n\nDocker for Windows\n\n\ndocker-compose\n (it should be bundled with Docker by default on Mac and Windows)\n\n\n\n\n\n\nRun the EDEX Ingest Container\n\uf0c1\n\n\nClone the source repository:\n\n\ngit clone https://github.com/Unidata/edex-docker.git\ncd edex-docker\n\n\n\nRun the container with docker-compose:\n\n\ndocker-compose up -d edex-ingest\n\n\n\nConfirm the container is running:\n\n\ndocker ps -a\n\n\n\nEnter the container:\n\n\ndocker exec -it edex-ingest bash\n\n\n\nStop the container:\n\n\ndocker-compose stop\n\n\n\nDelete the container (keep the image):\n\n\ndocker-compose rm -f\n\n\n\nRun commands inside the container, such as:\n\n\ndocker exec edex-ingest edex\n\n\n\nwhich should return something like:\n\n\n[edex status]\n qpid :: running :: pid 22474\n EDEXingest :: running :: pid 21860 31513\n EDEXgrib :: not running\n ldmadmin :: running :: pid 22483\n\n edex (status|start|stop|setup|log|purge|qpid|users)\n\n\n\nTo update to the latest version and restart:\n\n\ndocker pull unidata/edex-ingest:latest\ndocker-compose stop\ndocker-compose up -d edex-ingest\n\n\n\n\n\nConfiguration and Customization\n\uf0c1\n\n\nThe file \ndocker-compose.yml\n defines files to 
mount to the container and which ports to open:\n\n\nedex-ingest:\n image: unidata/edex-ingest:latest\n container_name: edex-ingest\n volumes:\n - ./etc/ldmd.conf:/awips2/ldm/etc/ldmd.conf\n - ./etc/pqact.conf:/awips2/ldm/etc/pqact.conf\n - ./bin/setup.env:/awips2/edex/bin/setup.env\n - ./bin/runedex.sh:/awips2/edex/bin/runedex.sh\n ports:\n - \"388:388\"\n ulimits:\n nofile:\n soft: 1024\n hard: 1024\n\n\n\n\n\nMounted Files\n\uf0c1\n\n\netc/ldmd.conf\n\uf0c1\n\n\nDefines which data feeds to receive. By default there is only one active request line (\nREQUEST IDS|DDPLUS \".*\" idd.unidata.ucar.edu\n) so as not to overwhelm small EDEX containers ingesting large volumes of radar and gridded data files. Any updates to the file \netc/ldmd.conf\n will be read the next time you restart the container.\n\n\netc/pqact.conf\n\uf0c1\n\n\nDefines how products are processed and where they are written to on the filesystem. This is the full set of pattern actions used in Unidata AWIPS, and generally you do not need to edit this file. Instead control which data feeds are requested in \nldmd.conf\n (above).\n\n\nbin/setup.env\n\uf0c1\n\n\nDefines the remote EDEX Database/Request server:\n\n\n### EDEX localization related variables ###\nexport AW_SITE_IDENTIFIER=OAX\nexport EXT_ADDR=js-157-198.jetstream-cloud.org\n\n\n\n\n\nNote\n: \nEXT_ADDR\n must be set to an allowed EDEX Database/Request Server. In this example we are using a JetStream Cloud instance, which controls our \nedex-ingest\n access with IPtables, SSL certificates, and PostgreSQL pg_hba.conf rules. This server will not allow outside connections; you must change this to point to an appropriate server. 
\n\n\n\n\nbin/runedex.sh\n\uf0c1\n\n\nThe default script run when the container is started, acts as a sort-of service manager for EDEX and the LDM (see \nENTRYPOINT [\"/awips2/edex/bin/runedex.sh\"]\n in \nDockerfile.edex\n), essentially:\n\n\n/awips2/qpid/bin/qpid-wrapper &\n/awips2/edex/bin/start.sh -noConsole ingest &\nldmadmin mkqueue\nldmadmin start",
"title": "Docker EDEX"
},
{
"location": "/edex/edex-ingest-docker-container/#docker-edex",
"text": "Project home: https://github.com/Unidata/edex-docker EDEX can be run inside a docker container, which allows you to process data into an AWIPS system without requiring accessing and altering the machine's native CentOS installation and configuration. The EDEX Docker Image is built on CentOS 7 and contains the latest Unidata AWIPS release (18.1.1). This container is an ingest-only install, meaning there is no database or request server . This example requires a Database/Request server be configured for you to access remotely. See the Distributed EDEX document for more.",
"title": "Docker EDEX"
},
{
"location": "/edex/edex-ingest-docker-container/#download-and-install-docker",
"text": "Download and install Docker and Docker Compose: Docker for CentOS 7 Linux Docker for Mac Docker for Windows docker-compose (it should be bundled with Docker by default on Mac and Windows)",
"title": "Download and Install Docker"
},
{
"location": "/edex/edex-ingest-docker-container/#run-the-edex-ingest-container",
"text": "Clone the source repository: git clone https://github.com/Unidata/edex-docker.git\ncd edex-docker Run the container with docker-compose: docker-compose up -d edex-ingest Confirm the container is running: docker ps -a Enter the container: docker exec -it edex-ingest bash Stop the container: docker-compose stop Delete the container (keep the image): docker-compose rm -f Run commands inside the container, such as: docker exec edex-ingest edex which should return something like: [edex status]\n qpid :: running :: pid 22474\n EDEXingest :: running :: pid 21860 31513\n EDEXgrib :: not running\n ldmadmin :: running :: pid 22483\n\n edex (status|start|stop|setup|log|purge|qpid|users) To update to the latest version and restart: docker pull unidata/edex-ingest:latest\ndocker-compose stop\ndocker-compose up -d edex-ingest",
"title": "Run the EDEX Ingest Container"
},
{
"location": "/edex/edex-ingest-docker-container/#configuration-and-customization",
"text": "The file docker-compose.yml defines files to mount to the container and which ports to open: edex-ingest:\n image: unidata/edex-ingest:latest\n container_name: edex-ingest\n volumes:\n - ./etc/ldmd.conf:/awips2/ldm/etc/ldmd.conf\n - ./etc/pqact.conf:/awips2/ldm/etc/pqact.conf\n - ./bin/setup.env:/awips2/edex/bin/setup.env\n - ./bin/runedex.sh:/awips2/edex/bin/runedex.sh\n ports:\n - \"388:388\"\n ulimits:\n nofile:\n soft: 1024\n hard: 1024",
"title": "Configuration and Customization"
},
{
"location": "/edex/edex-ingest-docker-container/#mounted-files",
"text": "",
"title": "Mounted Files"
},
{
"location": "/edex/edex-ingest-docker-container/#etcldmdconf",
"text": "Defines which data feeds to receive. By default there is only one active request line ( REQUEST IDS|DDPLUS \".*\" idd.unidata.ucar.edu ) to not overwhelm small EDEX containers ingesting large volumes of radar and gridded data files. Any updates to the file etc/ldmd.conf will be read the next time you restart the container.",
"title": "etc/ldmd.conf"
},
{
"location": "/edex/edex-ingest-docker-container/#etcpqactconf",
"text": "Defines how products are processed and where they are written to on the filesystem. This is the full set of pattern actions used in Unidata AWIPS, and generally you do not need to edit this file. Instead control which data feeds are requested in ldmd.conf (above).",
"title": "etc/pqact.conf"
},
{
"location": "/edex/edex-ingest-docker-container/#binsetupenv",
"text": "Defines the remote EDEX Database/Request server: ### EDEX localization related variables ###\nexport AW_SITE_IDENTIFIER=OAX\nexport EXT_ADDR=js-157-198.jetstream-cloud.org Note : EXT_ADDR must be set to an allowed EDEX Database/Request Server. In this example we are using a JetStream Cloud instance, which controls our edex-ingest access with IPtables, SSL certificates, and PostgreSQL pg_hba.conf rules. This server will not allow outside connections, you must change this to point to an appropriate server.",
"title": "bin/setup.env"
},
{
"location": "/edex/edex-ingest-docker-container/#binrunedexsh",
"text": "The default script run when the container is started, acts as a sort-of service manager for EDEX and the LDM (see ENTRYPOINT [\"/awips2/edex/bin/runedex.sh\"] in Dockerfile.edex ), essentially: /awips2/qpid/bin/qpid-wrapper &\n/awips2/edex/bin/start.sh -noConsole ingest &\nldmadmin mkqueue\nldmadmin start",
"title": "bin/runedex.sh"
},
{
"location": "/edex/ldm/",
"text": "LDM Feeds\n\uf0c1\n\n\nDefault LDM Feeds for EDEX\n\uf0c1\n\n\nData feeds are defined by the \nldmd.conf\n file in \n/awips2/ldm/etc/ldmd.conf\n. The default feeds that come \"turned on\" with our EDEX are the following:\n\n\nREQUEST NEXRAD3 \"./p(DHR|DPR|DSP|DTA|DAA|HHC|N.P|N.Q|N.S|N.U|OHA).\" idd.unidata.ucar.edu\nREQUEST FNEXRAD|IDS|DDPLUS \".*\" idd.unidata.ucar.edu\nREQUEST UNIWISC|NIMAGE \".*\" idd.unidata.ucar.edu # AREA/GINI\nREQUEST EXP \"WwWind\" idd.unidata.ucar.edu # ESPL/PSD Profilers\nREQUEST DIFAX \"GLM\" idd.unidata.ucar.edu # GOES GLM\nREQUEST EXP \".*\" lead.unidata.ucar.edu # GOES ABI netCDF4 (full sector)\nREQUEST NGRID \".*\" idd.unidata.ucar.edu\nREQUEST HDS \".*\" idd.unidata.ucar.edu\nREQUEST CONDUIT \"nam\" idd.unidata.ucar.edu # NAM12\nREQUEST CONDUIT \"pgrb2\" idd.unidata.ucar.edu # GFS0p25\n\n\n\n\n\nOptional LDM Feeds\n\uf0c1\n\n\nSome additional feeds are included but commented out using '#'. To activate the feed, simply remove the #, save the file, and \nrestart the LDM\n.\n\n\nFNMOC and CMC models\n\uf0c1\n\n\nREQUEST FNMOC \".*\" idd.unidata.ucar.edu\nREQUEST CMC \".*\" idd.unidata.ucar.edu\n\n\n\nLightning (restricted to educational use with rebroadcasting restricted)\n\uf0c1\n\n\nREQUEST LIGHTNING \".*\" striker2.atmos.albany.edu\nREQUEST LIGHTNING \".*\" idd.unidata.ucar.edu\n\n\n\nFSL/GSD Experimental HRRR (Sub-hourly)\n\uf0c1\n\n\nREQUEST FSL2 \"^GRIB2.FSL.HRRR\" hrrr.unidata.ucar.edu\n\n\n\n\n\nRestart the LDM\n\uf0c1\n\n\nUse the following commands to restart the LDM:\n\n\nsudo service edex_ldm restart\n\nldmadmin restart\n\n\n\n\n\nMonitor Incoming Data Feeds\n\uf0c1\n\n\nTo watch incoming data in real-time:\n\n\nnotifyme -vl -\n\n\n\nTo watch for a specific product and feed and time (360 sec = 6 min):\n\n\nnotifyme -vl - -h localhost -f NEXRAD3 -p DHR -o 360\n\n\n\nTo watch the same on a remote queue:\n\n\nnotifyme -vl - -h idd.unidata.ucar.edu -f NEXRAD3 -p DHR -o 360\n\n\n\n\n\nLDM Logging\n\uf0c1\n\n\nTo open a 
real-time readout of LDM logging you can use the \nedex\n command. To exit, press \nCTRL+C\n.\n\n\nedex log ldm\n\n[edex] EDEX Log Viewer\n\n :: Viewing /awips2/ldm/logs/ldmd.log. Press CTRL+C to exit\n\nAug 26 15:05:10 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_MXUPHL01-21387192.grib2\": 406227 20160826210510.477 NGRID 21387192 YZCG86 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/MXUPHL01/5000-2000 m HGHT\nAug 26 15:05:11 edextest edexBridge[5812] NOTE: Sent 2 messages (0 at the end of the queue, 2 normally).\nAug 26 15:05:11 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_CICEP-21387200.grib2\": 369464 20160826210511.484 NGRID 21387200 YMCG98 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/CICEP/0 - NONE\nAug 26 15:05:12 edextest edexBridge[5812] NOTE: Sent 9 messages (0 at the end of the queue, 9 normally).\nAug 26 15:05:12 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_LTNG-21387205.grib2\": 482800 20160826210512.254 NGRID 21387205 YZCG98 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/LTNG/0 - EATM\nAug 26 15:05:13 edextest edexBridge[5812] NOTE: Sent 1 messages (0 at the end of the queue, 1 normally).",
"title": "LDM Feeds"
},
{
"location": "/edex/ldm/#ldm-feeds",
"text": "",
"title": "LDM Feeds"
},
{
"location": "/edex/ldm/#default-ldm-feeds-for-edex",
"text": "Data feeds are defined by the ldmd.conf file in /awips2/ldm/etc/ldmd.conf . The default feeds that come \"turned on\" with our EDEX are the following: REQUEST NEXRAD3 \"./p(DHR|DPR|DSP|DTA|DAA|HHC|N.P|N.Q|N.S|N.U|OHA).\" idd.unidata.ucar.edu\nREQUEST FNEXRAD|IDS|DDPLUS \".*\" idd.unidata.ucar.edu\nREQUEST UNIWISC|NIMAGE \".*\" idd.unidata.ucar.edu # AREA/GINI\nREQUEST EXP \"WwWind\" idd.unidata.ucar.edu # ESPL/PSD Profilers\nREQUEST DIFAX \"GLM\" idd.unidata.ucar.edu # GOES GLM\nREQUEST EXP \".*\" lead.unidata.ucar.edu # GOES ABI netCDF4 (full sector)\nREQUEST NGRID \".*\" idd.unidata.ucar.edu\nREQUEST HDS \".*\" idd.unidata.ucar.edu\nREQUEST CONDUIT \"nam\" idd.unidata.ucar.edu # NAM12\nREQUEST CONDUIT \"pgrb2\" idd.unidata.ucar.edu # GFS0p25",
"title": "Default LDM Feeds for EDEX"
},
{
"location": "/edex/ldm/#optional-ldm-feeds",
"text": "Some additional feeds are included but commented out using '#'. To activate the feed, simply remove the #, save the file, and restart the LDM .",
"title": "Optional LDM Feeds"
},
{
"location": "/edex/ldm/#fnmoc-and-cmc-models",
"text": "REQUEST FNMOC \".*\" idd.unidata.ucar.edu\nREQUEST CMC \".*\" idd.unidata.ucar.edu",
"title": "FNMOC and CMC models"
},
{
"location": "/edex/ldm/#lightning-restricted-to-educational-use-with-rebroadcasting-restricted",
"text": "REQUEST LIGHTNING \".*\" striker2.atmos.albany.edu\nREQUEST LIGHTNING \".*\" idd.unidata.ucar.edu",
"title": "Lightning (restricted to educational use with rebroadcasting restricted)"
},
{
"location": "/edex/ldm/#fslgsd-experimental-hrrr-sub-hourly",
"text": "REQUEST FSL2 \"^GRIB2.FSL.HRRR\" hrrr.unidata.ucar.edu",
"title": "FSL/GSD Experimental HRRR (Sub-hourly)"
},
{
"location": "/edex/ldm/#restart-the-ldm",
"text": "Use the following commands to restart the LDM: sudo service edex_ldm restart\n\nldmadmin restart",
"title": "Restart the LDM"
},
{
"location": "/edex/ldm/#monitor-incoming-data-feeds",
"text": "To watch incoming data in real-time: notifyme -vl - To watch for a specific product and feed and time (360 sec = 6 min): notifyme -vl - -h localhost -f NEXRAD3 -p DHR -o 360 To watch the same on a remote queue: notifyme -vl - -h idd.unidata.ucar.edu -f NEXRAD3 -p DHR -o 360",
"title": "Monitor Incoming Data Feeds"
},
{
"location": "/edex/ldm/#ldm-logging",
"text": "To open a real-time readout of LDM logging you can run use the edex command. To exit, press CTRL+C . edex log ldm\n\n[edex] EDEX Log Viewer\n\n :: Viewing /awips2/ldm/logs/ldmd.log. Press CTRL+C to exit\n\nAug 26 15:05:10 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_MXUPHL01-21387192.grib2\": 406227 20160826210510.477 NGRID 21387192 YZCG86 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/MXUPHL01/5000-2000 m HGHT\nAug 26 15:05:11 edextest edexBridge[5812] NOTE: Sent 2 messages (0 at the end of the queue, 2 normally).\nAug 26 15:05:11 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_CICEP-21387200.grib2\": 369464 20160826210511.484 NGRID 21387200 YMCG98 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/CICEP/0 - NONE\nAug 26 15:05:12 edextest edexBridge[5812] NOTE: Sent 9 messages (0 at the end of the queue, 9 normally).\nAug 26 15:05:12 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_LTNG-21387205.grib2\": 482800 20160826210512.254 NGRID 21387205 YZCG98 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/LTNG/0 - EATM\nAug 26 15:05:13 edextest edexBridge[5812] NOTE: Sent 1 messages (0 at the end of the queue, 1 normally).",
"title": "LDM Logging"
},
{
"location": "/edex/data-distribution-files/",
"text": "Data Distribution Files\n\uf0c1\n\n\nOverview\n\uf0c1\n\n\nEDEX uses \ndistribution files\n to alert the appropriate decoding plug-in that new data has been recieved. These files do so by use of XML and regular expressions. If the WMO header, or file name*, matches a regular expression listed in a distribution XML, then EDEX will put a message into the QPID queue for its corresponding decoder to recognize and process. It is worth noting that more than one distribution file can recognize a single peice of data and notify their decoders to act.\n\n\n\n\n*Sometimes the distribution file will not look at the filename. If this file is coming in through the LDM using a proper FILE action, then the it is possible the distribution file will only look at the header and not the filename. If the file is ingested using the \nmanual\n endpoint (/awips2/data_store/ingest/), then this behaviour could be different.\n\n\n\n\nIf a piece of data \ndoes not\n match any distribution XML, EDEX will:\n\n\n\n\n\n\nCreate an entry in \n/awips2/edex/logs/edex-ingest-unrecognized-files-yyyymmdd.log\n\n\n\n\n\n\nSkip processing of the unrecognized file.\n\n\n\n\n\n\nDistribution files are stored in the \ncommon_static\n branch of the \nLocalization Store\n (series of directories that exist in \n/awips2/edex/data/utility/\n), and a list of available files can be found in the base-level directory. The base directory is: \n/awips2/edex/data/utility/common_static/base/distribution/\n.\n\n\nFor each plug-in, the distribution file is named \n[data-type].xml\n. For example, the distribution file for radar data is \nradar.xml\n. 
The distribution files follow the AWIPS base/site localization pattern: \n\n\n[root@edex]# cd /awips2/edex/data/utility/common_static/base/distribution\n[root@edex distribution]# ls\nacars.xml goesr.xml poessounding.xml\nairep.xml goessounding.xml profiler.xml\nairmet.xml grib.xml radar.xml\natcf.xml intlsigmet.xml redbook.xml\naww.xml lsr.xml satellite.gini.xml\n...\n\n\n\n\n\nCreating a Site Override\n\uf0c1\n\n\n\n\n\n\nBase\n files are located in \n/awips2/edex/data/utility/common_static/base/distribution/\n\n\n\n\n\n\nSite\n override distribution files are located in \n/awips2/edex/data/utility/common_static/\nsite/XXX\n/distribution/\n, where \nXXX\n is the site identifier.\n\n\n\n\n\n\nNote that site-level files override the base files; as a result, local modifications to distribution files must be made as follows:\n\n\n\n\n\n\nThe base distribution file must be copied from \n/awips2/edex/data/utility/common_static/base/distribution\n to \n/awips2/edex/data/utility/common_static/site/XXX/distribution\n\n\n\n\n\n\nThe local modification must be made to the file in \n/awips2/edex/data/utility/common_static/site/XXX/distribution\n\n\n\n\n\n\nThe basic structure of the distribution file is:\n\n\n<requestPatterns xmlns:ns2=\"group\">\n <regex>[pattern]</regex>\n <regex>[pattern]</regex>\n</requestPatterns>\n\n\n\nIn each \n<regex>\n tag, \n[pattern]\n is replaced with a regular expression that will match either the filename or the WMO header of the raw data. Only data that matches a pattern in the distribution file will be processed.\n\n\nThe contents of the base version of the radar distribution file:\n\n\n[root@edex]# cd /awips2/edex/data/utility/common_static/base/distribution/\n[root@edex]# tail -4 radar.xml\n\n<requestPatterns >\n <regex>^SDUS[234578]. .*</regex>\n <regex>^Level3.*</regex>\n</requestPatterns>\n\n\n\nLooking at the base radar.xml distribution file in this example, there are two regular expressions. 
The first regular expression matches the standard WMO ID of radar products. Using edexBridge the LDM will place a message in the \nexternal.dropbox\n QPID queue, indicating a radar product has arrived. EDEX will then take the message containing the radar WMO ID (which comes from the file header) and compare it against the regular expressions in radar.xml. If a match is found, EDEX places a
"title": "Data Distribution Files"
},
{
"location": "/edex/data-distribution-files/#data-distribution-files",
"text": "",
"title": "Data Distribution Files"
},
{
"location": "/edex/data-distribution-files/#overview",
"text": "EDEX uses distribution files to alert the appropriate decoding plug-in that new data has been recieved. These files do so by use of XML and regular expressions. If the WMO header, or file name*, matches a regular expression listed in a distribution XML, then EDEX will put a message into the QPID queue for its corresponding decoder to recognize and process. It is worth noting that more than one distribution file can recognize a single peice of data and notify their decoders to act. *Sometimes the distribution file will not look at the filename. If this file is coming in through the LDM using a proper FILE action, then the it is possible the distribution file will only look at the header and not the filename. If the file is ingested using the manual endpoint (/awips2/data_store/ingest/), then this behaviour could be different. If a piece of data does not match any distribution XML, EDEX will: Create an entry in /awips2/edex/logs/edex-ingest-unrecognized-files-yyyymmdd.log Skip processing of the unrecognized file. Distribution files are stored in the common_static branch of the Localization Store (series of directories that exist in /awips2/edex/data/utility/ ), and a list of available files can be found in the base-level directory. The base directory is: /awips2/edex/data/utility/common_static/base/distribution/ . For each plug-in, the distribution file is named [data-type].xml . For example, the distribution file for radar data is radar.xml . The distribution files follow the AWIPS base/site localization pattern: [root@edex]# cd /awips2/edex/data/utility/common_static/base/distribution\n[root@edex distribution]# ls\nacars.xml goesr.xml poessounding.xml\nairep.xml goessounding.xml profiler.xml\nairmet.xml grib.xml radar.xml\natcf.xml intlsigmet.xml redbook.xml\naww.xml lsr.xml satellite.gini.xml\n...",
"title": "Overview"
},
{
"location": "/edex/data-distribution-files/#creating-a-site-override",
"text": "Base files are located in /awips2/edex/data/utility/common_static/base/distribution/ Site override distribution files are located in /awips2/edex/data/utility/common_static/ site/XXX /distribution/ , where XXX is the site identifier. Note that site-level files override the base files; as a result, local modifications to distribution files must be made as follows: The base distribution file must be copied from /awips2/edex/data/utility/common_static/base/distribution to /awips2/edex/data/utility/common_static/site/XXX/distribution The local modification must be made to the file in /awips2/edex/data/utility/common_static/site/XXX/distribution The basic structure of the distribution file is: <requestPatterns xmlns:ns2=\"group\">\n <regex>[pattern]</regex>\n <regex>[pattern]</regex>\n</requestPatterns> In each tag, [pattern] is replaced with a regular expression that will match either the filename or the WMO header of the raw data. Only data that matches a pattern in the distribution file will be processed. The contents of the base version of the radar distribution file: [root@edex]# cd /awips2/edex/data/utility/common_static/base/distribution/\n[root@edex]# tail -4 radar.xml\n\n<requestPatterns >\n <regex>^SDUS[234578]. .*</regex>\n <regex>^Level3.*</regex>\n</requestPatterns> Looking at the base radar.xml distribution file in this example, there are two regular expressions. The first regular expression matches the standard WMO ID of radar products. Using edexBridge the LDM will place a message in the external.dropbox QPID queue, indicating a radar product has arrived. EDEX will then take the message containing the radar WMO ID (which comes from the file header) and compare it against the regular expressions in radar.xml. If a match is found, EDEX places a message in the QPID queue Ingest.radar. The radar decoder will then consume the message and process the radar data accordingly.",
"title": "Creating a Site Override"
},
{
"location": "/edex/data-distribution-files/#adding-a-regex-to-the-satellite-data-distribution-file",
"text": "As a quick example, suppose we have a local data source for satellite imagery that does not have a WMO header; also suppose that the data source writes to files whose names start with LOCAL.sat . To add this locally produced satellite data file to the EDEX distribution; perform the following steps. Copy the base version of satellite.gini.xml from the base distribution directory /awips2/edex/data/utility/common_static/base/distribution into the site distribution directory /awips2/edex/data/utility/common_static/site/XXX/distribution Edit the site version of satellite.gini.xml , adding a new <regex> </regex> tag immediately below the existing regular expression ( <regex> </regex> ) tag. The contents of the tag will be ^LOCAL.sat . The final result will be: \n <requestPatterns xmlns:ns2=\"group\">\n <regex>TI[CGT]... ....</regex>\n <regex>rad/NEXRCOMP</regex>\n <regex>.\\*.gini.\\*</regex>\n <regex>^LOCAL.sat.*</regex> \n </requestPatterns> Save the file and exit the editor. EDEX will automatically pick up the new distribution pattern. Raw files are written to subdirectories in /awips2/data_store/ , and a message is sent via QPID to the EDEX distribution service from the LDM. When a regular expression match is found in a data distribution file, the raw data file is placed in a queue for the matching plugin to decode and process. The distribution files are used to match file headers as well as filenames, which is how files dropped into EDEX's manual endpoint ( /awips2/data_store/ingest/ ) are processed.",
"title": "Adding a REGEX to the Satellite Data Distribution File"
},
{
"location": "/edex/data-distribution-files/#editing-an-edex-data-distribution-file",
"text": "Because these files are in the common/_static directory, they have to be manually edited using a text editor. You should not edit the base files; rather, as stated above, you should copy the base version to your site and then edit the site version . The regular expressions in the distribution files do not necessarily need to correspond with the regular expressions in the LDM pqact.conf file. It is important to note that: The regex in the pqact.conf file applies to the productID that is passed through the LDM. and The regex in the distribution files (.xml) typically applies to the header in the file. It can also apply to the filename, if the file is coming through the manual endpoint, or if the data has no header to begin with. If patterns exist in pqact.conf but there are no corresponding matching regex expressions in any distribution file, then raw data files will be written to /awips2/data_store/ but will not be ingested and processed by EDEX. Entries for these non-ingested files would be written to the unrecognized files log in /awips/edex/logs .",
"title": "Editing an EDEX Data Distribution File"
},
{
"location": "/edex/data-distribution-files/#examples",
"text": "",
"title": "Examples"
},
{
"location": "/edex/data-distribution-files/#surface-obs",
"text": "Its distribution file is located at: /awips2/edex/data/utility/common_static/base/distribution/obs.xml : <requestPatterns xmlns:ns2=\"group\">\n <regex>^S[AP].*</regex>\n</requestPatterns> It will process any file header that starts with SA or SP , which should match any WMO header that contains METAR data (e.g. SAUS , SPUS , SACN , SAMX ).",
"title": "Surface Obs"
},
{
"location": "/edex/data-distribution-files/#text-data",
"text": "Its distribution file is located at /awips2/edex/data/utility/common_static/base/distribution/text.xml : <requestPatterns>\n <regex>^[ACFNRUW][A-Z][A-Z0-9]{4} [A-Z0-9]{4}</regex>\n <regex>^S[A-CEG-Z].*</regex>\n <!-- Only have AFOS mapping for T[BCX] -->\n <regex>^T[BCX].*</regex>\n <regex>^SF[A-OQ-TV-Z].*</regex>\n <regex>^SDUS1.*</regex>\n <regex>^SDUS4[1-6].*</regex>\n <regex>^SDUS9[^7].*</regex>\n <regex>^SFU[^S].*</regex>\n <regex>^SFUS4[^1].*</regex>\n <regex>^SFP[^A].*</regex>\n <regex>^SFPA[^4].*</regex>\n <regex>^SFPA4[^12].*</regex>\n <regex>^BMBB91.*</regex>\n <regex>^N[A-Z][A-Z0-9]{4} [A-Z0-9]{4}</regex>\n <regex>^F[EHIJKLMQVWX].*</regex>\n <regex>wcl_decrypted</regex>\n <regex>ecmwf_mos_decrypted</regex>\n</requestPatterns> Processes lots of WM patterns. The second pattern ^S[A-CEG-Z].* matches any header that starts with S except for SD or SF . This is because it matches A through C ( A-C ), E, and G through Z ( G-Z ). So it also matches the SA and SP files that the obs.xml plugin matches. This means that METARs are processed by both plugins simultaneously.",
"title": "Text Data"
},
{
"location": "/edex/data-distribution-files/#grib-data",
"text": "Its distribution file is located at /awips2/edex/data/utility/common_static/base/distribution/grib.xml : <requestPatterns>\n <!-- Super Set of all possible WMO grib patterns -->\n <!-- Is specifically not restricting on CCCC since HPE isn't populating it -->\n <regex>^[EHLMOYZ][A-Z]{3}\\d{2}</regex>\n <!-- Exclude Data Delivery specific patterns -->\n <regexExclude>^LZ[ABC][ABC]9[123] (KWBC|KNCF)</regexExclude>\n\n <!-- ECMWF decrypted -->\n <regex>ecmwf_decrypted</regex>\n\n <!-- NWPS pattern -->\n <regex>\\p{Alpha}{3}_nwps_CG1</regex>\n <regex>\\p{Alpha}{3}_nwps_CG0_Trkng</regex>\n\n <!-- grib files without WMO headers -->\n <regex>.*grib.*</regex>\n <regex>.*GRIB.*</regex>\n <regex>.*grb.*</regex>\n <regex>^US058.*</regex>\n <regex>^CMC_reg.*</regex>\n</requestPatterns> The grib/grid decoder distribution file matches all numerical grids distributed over the IDD NGRID feed by matching WMO header, and from CONDUIT by matching various .grib file extensions. It also includes an example of a regexExclude message which can be used to single out matching values that aren't to be included.",
"title": "Grib Data"
},
{
"location": "/edex/data-distribution-files/#addtional-information",
"text": "Important notes about regular expressions: Any time a new entry is placed in the pqact.conf file on LDM, it is likely a corresponding entry needs to be added to the appropriate Data Distribution file in the data distribution directory, or the data file will be logged to edex-ingest-unrecognized-files-YYYYMMDD.log . The exception to this rule is if the new data coming from the LDM is a type of data that already exists and EDEX already has a distribution file with a matching regex that will recognize it. If you have written a new regex for a distribution file to match on a filename, and it is not matching, then the file most likely has a header. In this case EDEX will only look at the header to match the regex. You must change your regex to something that matches the header, not the filename.",
"title": "Addtional Information"
},
{
"location": "/edex/new-grid/",
"text": "Ingest a New Grid\n\uf0c1\n\n\nUnrecognized grids can be decoded by EDEX simply by dropping \n*.grib\n or \n*.grib2\n files into \n/awips2/data_store/ingest/\n\n\nTo add support for a new grid, two edits must be made:\n\n\n\n\nGeospatial projection\n must be defined in a \ngrid navigation file\n\n\nGrid name\n, \ncenter\n, \nsubcenter\n, and \nprocess ID\n must be defined in a \nmodel definition file\n\n\n\n\nIf the parameters in the grib file haven't been previously specified, another change \nmay\n be needed as well:\n\n\n\n\nCenter\n, \nsubcenter\n, \ndiscipline\n, \ncategory\n, and possibly \nparameter ID\n information may need to be defined in a \ntable\n\n\n\n\n\n\nIngest an Unsupported Grid\n\uf0c1\n\n\nGrib Products\n\uf0c1\n\n\n\n\n\n\nDownload an example grib1 file and rename to a \n*.grib\n extension, then copy to the manual ingest point \n/awips2/data_store/ingest/\n \n\n\nwget https://www.unidata.ucar.edu/software/awips2/14102318_nmm_d01.GrbF00600 -O wrf.grib\n\ncp wrf.grib /awips2/data_store/ingest/\n\n\n\nRemember that the data distribution file (\n/awips2/edex/data/utility/common_static/base/distribution/grib.xml\n) will match filenames which have the \n*.grib*\n extension.\n\n\n\n\n\n\nConfirm that the grib file decodes in the grib log file:\n\n\nedex log grib\n\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.1200 (sec) Latency: 21.8080 (sec)\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.1180 (sec) Latency: 21.8140 (sec)\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.4230 (sec) Latency: 21.8360 (sec)\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.2240 (sec) Latency: 21.9140 (sec)\n\n...\n\n\n\n\n\n\n\nCheck that the hdf5 data directory exists for our unnamed grid\n\n\nls -latr /awips2/edex/data/hdf5/grid/GribModel:7:0:89\n\n\n\nThough the grib file has 
been decoded, it has been given a generic name with its center, subcenter, and process IDs (7, 0, 89, respectively). \n\n\n\n\n\n\nGrib2 Products\n\uf0c1\n\n\n\n\n\n\nDownload an example grib2 file and rename to a \n*.grib2\n extension, then copy to the manual ingest point \n/awips2/data_store/ingest/\n \n\n\nwget https://www.unidata.ucar.edu/software/awips2/CPTI_00.50_20180502-000144.grib2 -O cpti.grib2\n\ncp cpti.grib2 /awips2/data_store/ingest/\n\n\n\nRemember that the data distribution file (\n/awips2/edex/data/utility/common_static/base/distribution/grib.xml\n) will match filenames which have the \n*.grib*\n extension.\n\n\n\n\n\n\nConfirm that the grib file decodes in the grib log file:\n\n\nedex log grib\n\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.1200 (sec) Latency: 21.8080 (sec)\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.1180 (sec) Latency: 21.8140 (sec)\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.4230 (sec) Latency: 21.8360 (sec)\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.2240 (sec) Latency: 21.9140 (sec)\n\n...\n\n\n\nNote:\n This step will likely fail, because the parameter is not yet defined. 
The error will look like:\n\n\nINFO 2020-07-20 20:34:17,710 2565 [GribPersist-1] GridDao: EDEX - Discarding record due to missing or unknown parameter mapping: /grid/2018-05-02_00:01:44.0_(0)/GribModel:161:0:97/null/null/403/Missing/FH/500.0/-999999.0\nINFO 2020-07-20 20:34:17,710 2566 [GribPersist-1] Ingest: EDEX: Ingest - grib2:: /awips2/data_store/ingest/CPTI_00.50_20180502-000144.grib2 processed in: 2.3550 (sec)\nINFO 2020-07-20 20:34:17,827 2567 [Ingest.GribDecode-6] grib: EDEX - No parameter information for center[161], subcenter[0], tableName[4.2.209.3], parameter value[61]\n\n\n\nIn order to successfully ingest the example file, \ndefine the appropriate table\n.\n\n\n\n\n\n\nCheck that the hdf5 data directory exists for our unnamed grid\n\n\nls -latr /awips2/edex/data/hdf5/grid/GribModel:161:0:97
"title": "Ingest a New Grid"
},
{
"location": "/edex/new-grid/#ingest-a-new-grid",
"text": "Unrecognized grids can be decoded by EDEX simply by dropping *.grib or *.grib2 files into /awips2/data_store/ingest/ To add support for a new grid, two edits must be made: Geospatial projection must be defined in a grid navigation file Grid name , center , subcenter , and process ID must be defined in a model definition file If the parameters in the grib file haven't been previously specified, another change may be needed as well: Center , subcenter , discipline , category , and possibly parameter ID information may need to be defined in a table",
"title": "Ingest a New Grid"
},
{
"location": "/edex/new-grid/#ingest-an-unsupported-grid",
"text": "",
"title": "Ingest an Unsupported Grid"
},
{
"location": "/edex/new-grid/#grib-products",
"text": "Download an example grib1 file and rename to a *.grib extension, then copy to the manual ingest point /awips2/data_store/ingest/ wget https://www.unidata.ucar.edu/software/awips2/14102318_nmm_d01.GrbF00600 -O wrf.grib\n\ncp wrf.grib /awips2/data_store/ingest/ Remember that the data distribution file ( /awips2/edex/data/utility/common_static/base/distribution/grib.xml ) will match filenames which have the *.grib* extension. Confirm that the grib file decodes in the grib log file: edex log grib\n\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.1200 (sec) Latency: 21.8080 (sec)\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.1180 (sec) Latency: 21.8140 (sec)\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.4230 (sec) Latency: 21.8360 (sec)\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.2240 (sec) Latency: 21.9140 (sec)\n\n... Check that the hdf5 data directory exists for our unnamed grid ls -latr /awips2/edex/data/hdf5/grid/GribModel:7:0:89 Though the grib file has been decoded, it has been given a generic name with its center, subcenter, and process IDs (7, 0, 89, respectively).",
"title": "Grib Products"
},
{
"location": "/edex/new-grid/#grib2-products",
"text": "Download an example grib2 file and rename to a *.grib2 extension, then copy to the manual ingest point /awips2/data_store/ingest/ wget https://www.unidata.ucar.edu/software/awips2/CPTI_00.50_20180502-000144.grib2 -O cpti.grib2\n\ncp cpti.grib2 /awips2/data_store/ingest/ Remember that the data distribution file ( /awips2/edex/data/utility/common_static/base/distribution/grib.xml ) will match filenames which have the *.grib* extension. Confirm that the grib file decodes in the grib log file: edex log grib\n\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.1200 (sec) Latency: 21.8080 (sec)\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.1180 (sec) Latency: 21.8140 (sec)\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.4230 (sec) Latency: 21.8360 (sec)\nINFO [Ingest.GribDecode] /awips2/data_store/ingest/cpti.grib2 processed in: 0.2240 (sec) Latency: 21.9140 (sec)\n\n... Note: This step will likely fail, because the parameter is not yet defined. The error will look like: INFO 2020-07-20 20:34:17,710 2565 [GribPersist-1] GridDao: EDEX - Discarding record due to missing or unknown parameter mapping: /grid/2018-05-02_00:01:44.0_(0)/GribModel:161:0:97/null/null/403/Missing/FH/500.0/-999999.0\nINFO 2020-07-20 20:34:17,710 2566 [GribPersist-1] Ingest: EDEX: Ingest - grib2:: /awips2/data_store/ingest/CPTI_00.50_20180502-000144.grib2 processed in: 2.3550 (sec)\nINFO 2020-07-20 20:34:17,827 2567 [Ingest.GribDecode-6] grib: EDEX - No parameter information for center[161], subcenter[0], tableName[4.2.209.3], parameter value[61] In order to successfully ingest the example file, define the appropriate table . Check that the hdf5 data directory exists for our unnamed grid ls -latr /awips2/edex/data/hdf5/grid/GribModel:161:0:97 Though the grib file has been decoded, it has been given a generic name with its center, subcenter, and process IDs (161, 0, 97, respectively).",
"title": "Grib2 Products"
},
{
"location": "/edex/new-grid/#determine-grid-projection",
"text": "",
"title": "Determine Grid Projection"
},
{
"location": "/edex/new-grid/#grib-products_1",
"text": "When the grid was ingested a record was added to the gridcoverage table with its navigation information: psql metadata\n\nmetadata=# select nx,ny,dx,dy,majoraxis,minoraxis,la1,lo1,lov,latin1,latin2 from gridcoverage where id=(select distinct(location_id) from grid_info where datasetid='GribModel:7:0:89');\n nx | ny | dx | dy | majoraxis | minoraxis | la1 | lo1 | lov | latin1 | latin2 \n-----+-----+------------------+------------------+-----------+-----------+------------------+-------------------+-------------------+------------------+------------------\n 201 | 155 | 4.29699993133545 | 4.29699993133545 | 6378160 | 6356775 | 42.2830009460449 | -72.3610000610352 | -67.0770034790039 | 45.3680000305176 | 45.3680000305176\n(1 row) Compare with the projection info returned by wgrib on the original file (look at the bolded sections below and make sure they match up with the corresponding entries returned from the database above): \nwgrib -V wrf.grib \nrec 799:27785754:date 2014102318 ALBDO kpds5=84 kpds6=1 kpds7=0 levels=(0,0) grid=255 sfc 6hr fcst: bitmap: 736 undef\n ALBDO=Albedo [%]\n timerange 0 P1 6 P2 0 TimeU 1 nx 201 ny 155 GDS grid 3 num_in_ave 0 missing 0\n center 7 subcenter 0 process 89 Table 2 scan: WE:SN winds(grid) \n Lambert Conf: Lat1 42.283000 Lon1 -72.361000 Lov -67.077000 \n Latin1 45.368000 Latin2 45.368000 LatSP 0.000000 LonSP 0.000000\n North Pole (201 x 155) Dx 4.297000 Dy 4.297000 scan 64 mode 8\n min/max data 5 21.9 num bits 8 BDS_Ref 50 DecScale 1 BinScale 0 Notice that our grib file has a Lambert Conformal projection. We will need these values for the next step. Note that there is a tolerance of +/- 0.1 degrees to keep in mind when defining your coverage area.",
"title": "Grib Products"
},
{
"location": "/edex/new-grid/#grib2-products_1",
"text": "When the grid was ingested a record was added to the gridcoverage table with its navigation information: psql metadata\n\nmetadata=# select nx,ny,dx,dy,majoraxis,minoraxis,la1,lo1,lov,latin1,latin2 from gridcoverage where id=(select distinct(location_id) from grid_info where datasetid='GribModel:161:0:97');\n\nnx | ny | dx | dy | majoraxis | minoraxis | la1 | lo1 | lov | latin1 | latin2 \n-----+-----+-------+-------+-----------+-----------+-----------+-----+-----+--------+--------\n600 | 640 | 0.005 | 0.005 | | | 40.799999 | 261 | | | \n(1 row) Compare with the projection info returned by wgrib2 on the original file (look at the bolded sections below and make sure they match up with the corresponding entries returned from the database above): \nwgrib2 -grid -nxny cpti.grib2\n1:0:grid_template=0:winds(N/S):\n lat-lon grid:(600 x 640) units 1e-06 input WE:NS output WE:SN res 48\n lat 40.799999 to 37.599999 by 0.005000 \n lon 260.999999 to 263.999999 by 0.005000 #points=384000:(600 x 640)\n ... Notice that our grib2 file has a Lat/lon Grid projection. Where: nx is 600 ny is 640 dx is 0.005 dy is 0.005 la1 is 40.799999 lo1 is 261 We will need these values for the next step. Note that there is a tolerance of +/- 0.1 degrees to keep in mind when defining your coverage (la1 and lo1) area.",
"title": "Grib2 Products"
},
{
"location": "/edex/new-grid/#create-grid-projection-file",
"text": "Grid projection files are stored in /awips2/edex/data/utility/common_static/base/grib/grids/ and there are four grid coverage types available: lambertConformalGridCoverage (example: RUCIcing.xml ) <lambertConformalGridCoverage>\n <name>305</name>\n <description>Regional - CONUS (Lambert Conformal)</description>\n <la1>16.322</la1>\n <lo1>-125.955</lo1>\n <firstGridPointCorner>LowerLeft</firstGridPointCorner>\n <nx>151</nx>\n <ny>113</ny>\n <dx>40.63525</dx>\n <dy>40.63525</dy>\n <spacingUnit>km</spacingUnit>\n <minorAxis>6356775.0</minorAxis>\n <majorAxis>6378160.0</majorAxis>\n <lov>-95.0</lov>\n <latin1>25.0</latin1>\n <latin2>25.0</latin2>\n</lambertConformalGridCoverage> polarStereoGridCoverage (example seaice_south1_grid.xml ) <polarStereoGridCoverage>\n <name>405</name>\n <description>Sea Ice south 690X710 13km grid</description>\n <la1>-36.866</la1>\n <lo1>139.806</lo1>\n <firstGridPointCorner>LowerLeft</firstGridPointCorner>\n <nx>690</nx>\n <ny>710</ny>\n <dx>12.7</dx>\n <dy>12.7</dy>\n <spacingUnit>km</spacingUnit>\n <minorAxis>6371229.0</minorAxis>\n <majorAxis>6371229.0</majorAxis>\n <lov>100.0</lov>\n</polarStereoGridCoverage> latLonGridCoverage (example UkmetHR-SHemisphere.xml ) <latLonGridCoverage>\n <name>864162002</name>\n <description>UKMet HiRes combined - Southern Hemisphere\n Longitude range 71.25E - 70.416E </description>\n <la1>-89.721</la1>\n <lo1>71.25</lo1>\n <firstGridPointCorner>LowerLeft</firstGridPointCorner>\n <nx>864</nx>\n <ny>162</ny>\n <dx>0.833</dx>\n <dy>0.556</dy>\n <spacingUnit>degree</spacingUnit>\n <la2>-0.278</la2>\n <lo2>70.416</lo2>\n</latLonGridCoverage> mercatorGridCoverage (example gridNBM_PR.xml ) <mercatorGridCoverage>\n <name>NBM_PR</name>\n <description> National Blend Grid over Puerto Rico - (1.25 km)</description>\n <la1>16.9775</la1>\n <lo1>-68.0278</lo1>\n <firstGridPointCorner>LowerLeft</firstGridPointCorner>\n <nx>339</nx>\n <ny>225</ny>\n <dx>1.25</dx>\n <dy>1.25</dy>\n <la2>19.3750032477232</la2>\n 
<lo2>-63.984399999999994</lo2>\n <latin>20</latin>\n <spacingUnit>km</spacingUnit>\n <minorAxis>6371200</minorAxis>\n <majorAxis>6371200</majorAxis>\n</mercatorGridCoverage>",
"title": "Create Grid Projection File"
},
{
"location": "/edex/new-grid/#grib-products_2",
"text": "Copy an existing xml file with the same grid projection type (in this case lambertConformalGridCoverage ) to a new file wrf.xml : cd /awips2/edex/data/utility/common_static/base/grib/grids/\ncp RUCIcing.xml wrf.xml And edit the new wrf.xml to define the projection values using the output from wgrib or the database (example provided): vi wrf.xml\n\n<lambertConformalGridCoverage>\n <name>201155</name>\n <description>Regional - CONUS (Lambert Conformal)</description>\n <la1>42.2830009460449</la1>\n <lo1>-72.3610000610352</lo1>\n <firstGridPointCorner>LowerLeft</firstGridPointCorner>\n <nx>201</nx>\n <ny>155</ny>\n <dx>4.29699993133545</dx>\n <dy>4.29699993133545</dy>\n <spacingUnit>km</spacingUnit>\n <minorAxis>6356775.0</minorAxis>\n <majorAxis>6378160.0</majorAxis>\n <lov>-67.0770034790039</lov>\n <latin1>45.3680000305176</latin1>\n <latin2>45.3680000305176</latin2>\n</lambertConformalGridCoverage> Note : Notice the <name>201155</name> tag was defined from the number of grid points (201 and 155). This value will be matched against an entry in our models file (below) to set the name of the model (e.g. WRF).",
"title": "Grib Products"
},
{
"location": "/edex/new-grid/#grib2-products_2",
"text": "Copy an existing xml file with the same grid projection type (in this case latLonGridCoverage ) to a new file cpti.xml : cd /awips2/edex/data/utility/common_static/base/grib/grids/\ncp MRMS-1km.xml cpti.xml And edit the new cpti.xml to define the projection values using the output from wgrib2 or the database (example provided): vi cpti.xml\n\n<latLonGridCoverage>\n <name>600640</name>\n <description>Small domain for CPTI products</description>\n <la1>40.799999</la1>\n <lo1>261</lo1>\n <firstGridPointCorner>UpperLeft</firstGridPointCorner>\n <nx>600</nx>\n <ny>640</ny>\n <dx>0.005</dx>\n <dy>0.005</dy>\n <spacingUnit>degree</spacingUnit>\n</latLonGridCoverage> Note : Notice the <name>600640</name> tag was defined from the number of grid points (600 and 640). This value will be matched against an entry in our models file (below) to set the name of the model (e.g. CPTI).",
"title": "Grib2 Products"
},
{
"location": "/edex/new-grid/#create-model-definition",
"text": "Model definition XML files are found in /awips2/edex/data/utility/common_static/base/grib/models/ .",
"title": "Create Model Definition"
},
{
"location": "/edex/new-grid/#grid-prodcuts",
"text": "Since our grib file has a center ID of 7 (NCEP) we will edit the gribModels_NCEP-7.xml file. cd /awips2/edex/data/utility/common_static/base/grib/models/\n\nvi gribModels_NCEP-7.xml In <gribModelSet> add an entry: <model>\n <name>WRF</name>\n <center>7</center>\n <subcenter>0</subcenter>\n <grid>201155</grid>\n <process>\n <id>89</id>\n </process>\n </model> Save the file and restart EDEX for the changes to take effect: sudo service edex_camel restart ingestGrib Now copy the wrf.grib file again to /awips2/data_store/ingest/ . If everything is correct we will not see any persistence errors since the grid is now named WRF and not GribModel:7:0:89 . cp wrf.grib /awips2/data_store/ingest/\n\nedex log grib After you have confirmed that the grid was ingested with the given name, you can edit the D2D product menus to display the new grid .",
"title": "Grid Products"
},
{
"location": "/edex/new-grid/#grib2-products_3",
"text": "Since our grib2 file has a center of 161 (NOAA) we will edit the gribModels_NOAA-161.xml file. cd /awips2/edex/data/utility/common_static/base/grib/models/\n\nvi gribModels_NOAA-161.xml In <gribModelSet> , under the <-- Subcenter 0 --> comment, add an entry: <model>\n <name>CPTI</name>\n <center>161</center>\n <subcenter>0</subcenter>\n <grid>600640</grid>\n <process>\n <id>97</id>\n </process>\n</model> Save the model file and restart edex: sudo service edex_camel restart ingestGrib Now if you drop cpti.grib2 into the manual endpoint again, it should ingest without any persistence errors.",
"title": "Grib2 Products"
},
{
"location": "/edex/new-grid/#adding-a-table",
"text": "If you ingest a piece of data and the parameter appears as unknown in the metadata database, ensure that the correct parameter tables are in place for the center/subcenter. The tables are located in /awips2/edex/data/utility/common_static/base/grib/tables/ . They are then broken into subdirectories using the following structure: /[Center]/[Subcenter]/4.2.[Discipline].[Category].table .",
"title": "Adding a Table"
},
{
"location": "/edex/new-grid/#grib-products_3",
"text": "The center and subcenter have been identified previously here and here , as 7 and 0, respectively. So, the corresponding directory is: /awips2/edex/data/utility/common_static/base/grib/tables/7/0/ To find the discipline of a grib product, you need the process and table values from the grib file. These are output with the wgrib -V command: \nwgrib -V wrf.grib \nrec 799:27785754:date 2014102318 ALBDO kpds5=84 kpds6=1 kpds7=0 levels=(0,0) grid=255 sfc 6hr fcst: bitmap: 736 undef\n ALBDO=Albedo [%]\n timerange 0 P1 6 P2 0 TimeU 1 nx 201 ny 155 GDS grid 3 num_in_ave 0 missing 0\n center 7 subcenter 0 process 89 Table 2 scan: WE:SN winds(grid) \n Lambert Conf: Lat1 42.283000 Lon1 -72.361000 Lov -67.077000\n Latin1 45.368000 Latin2 45.368000 LatSP 0.000000 LonSP 0.000000\n North Pole (201 x 155) Dx 4.297000 Dy 4.297000 scan 64 mode 8\n min/max data 5 21.9 num bits 8 BDS_Ref 50 DecScale 1 BinScale 0 For our example, the process is 89 and table is 2 . Next, take a look in: /awips2/edex/data/utility/common_static/base/grid/grib1ParameterConvTable.xml And find the entry that has grib1 data with TableVersion 2 and Value 89: <grib1Parameter>\n <center>7</center>\n <grib1TableVersion>2</grib1TableVersion>\n <grib1Value>89</grib1Value>\n <grib2discipline>0</grib2discipline>\n <grib2category>3</grib2category>\n <grib2Value>10</grib2Value>\n </grib1Parameter> Here, we can see the discipline and category values (the [Discipline] and [Category] placeholders in the table path above) are 0 and 3, respectively. So, the table needed for our example file is: /awips2/edex/data/utility/common_static/base/grib/tables/7/0/4.2.0.3.table",
"title": "Grib products"
},
{
"location": "/edex/new-grid/#grib2-products_4",
"text": "If you are using a grib2 file, then you can use either the log output or the -center , -subcenter , and -full_name options on wgrib2 to get the center, subcenter, discipline, category, and parameter information: The table would be found in the directory structure using this file's center and subcenter. The center can be found by either: Running the following command: wgrib2 -center cpti.grib2\n 1:0:center=US NOAA Office of Oceanic and Atmospheric Research\n... And then looking up the corresponding value for \"US NOAA Office of Oceanic and Atmospheric Research\" at this website , where it happens to be 161 . OR: Running the following command: \nwgrib2 -varX cpti.grib2\n 1:0:var209_255_1_ 161 _3_61\n... Where the 4th argument after \"var\" is the center id, in this case 161 . To get the subcenter, simply run: \nwgrib2 -subcenter cpti.grib2\n 1:0:subcenter= 0 \n... The subcenter of this file is 0 . So based on the center and subcenter, the corresponding directory is: /awips2/edex/data/utility/common_static/base/grib/tables/161/0/ To find the exact table, we need the discipline and category: \nwgrib2 -full_name cpti.grib2\n 1:0:var 209 _ 3 _ 61 .500_m_above_mean_sea_level\n ... In this case the discipline is 209 and category is 3 , so the corresponding table is: 4.2.209.3.table So, the full path to the corresponding table would be: /awips2/edex/data/utility/common_static/base/grib/tables/161/0/4.2.209.3.table The parameter ID was also listed in that output as 61 . 
Make sure that specific parameter information is defined in the table: \n...\n56:56:Reflectivity at -20C:dBZ:ReflectivityM20C\n57:57:Reflectivity At Lowest Altitude (RALA):dBZ:ReflectivityAtLowestAltitude\n58:58:Merged Reflectivity At Lowest Altitude (RALA):dBZ:MergedReflectivityAtLowestAltitude\n59:59:CPTI 80mph+:%:CPTI80mph 61:61:CPTI 110mph+:%:CPTI110mph You will have to restart ingestGrib for the changes to take place: sudo service edex_camel restart ingestGrib Now you can try re-ingesting the grib2 file .",
"title": "Grib2 Products"
},
{
"location": "/edex/new-grid/#troubleshooting-grib-ingest",
"text": "Make sure the latitude and longitude entries in your coverage specification file match those of your ingested raw grib file. There is a tolerance of +/- 0.1 degree to keep in mind when defining your coverage area. If some of the information is unknown, a grib utility application such as wgrib or wgrib2 can be useful in determining the information that must be added to correctly process a new grib file. If you are experiencing Segmentation fault errors when running wgrib2, it may be best to install the latest version using the following command: yum install wgrib2 Then you may either need to update which wgrib2 is found first on your PATH, or run /bin/wgrib2 directly to use the newly installed version.",
"title": "Troubleshooting Grib Ingest"
},
{
"location": "/edex/data-purge/",
"text": "Purging and Retention\n\uf0c1\n\n\nPurge Types\n\uf0c1\n\n\nThere are two main forms of data purging in AWIPS. The more familiar is the purging of \nprocessed data\n, which determines how long data is stored \nafter\n it has been decoded and processed.\n\n\nThe second type applies to \nraw data\n, and determines how long data is stored \nbefore\n it has been decoded.\n\n\nProcessed Data Purging\n\uf0c1\n\n\nAWIPS uses a plugin-based purge strategy for processed \nHDF5 data\n. This allows the user to change the purge frequency for each plugin individually, and even set purge rules for specific products for a particular plugin. There is also a default purge rules file for those products which do not have specific rules written.\n\n\n\n\nNote\n: Purging is triggered by a quartz timer event that fires at 30 minutes after each hour. \n\n\n\n\nPurging rules are defined in XML files in the Localization Store. On EDEX, most are located in \n/awips2/edex/data/utility/common_static/base/purge\n, and follow the \nbase/site\n localization pattern (e.g. 
site purge files are in \nsite/XXX/purge\n rather than \nbase/purge\n, where XXX is the site identifier).\n\n\nEach data set can have a purge rule defined, and the xml file is named after the data set:\n\n\nls /awips2/edex/data/utility/common_static/base/purge/\n\nacarsPurgeRules.xml bufruaPurgeRules.xml pirepPurgeRules.xml\nacarssoundingPurgeRules.xml ccfpPurgeRules.xml poessoundingPurgeRules.xml\naggregatePurgeRules.xml convsigmetPurgeRules.xml pointsetPurgeRules.xml\nairepPurgeRules.xml cwaPurgeRules.xml profilerPurgeRules.xml\n...\n\n\n\n\n\nTime-based purge\n\uf0c1\n\n\nIf a plugin has no XML file, the default rule of 1 day (24 hours) is used, from \n/awips2/edex/data/utility/common_static/base/purge/defaultPurgeRules.xml\n:\n\n\n\n\n\n<purgeRuleSet>\n <defaultRule>\n \n<period>01-00:00:00</period>\n\n </defaultRule>\n</purgeRuleSet>\n\n\n\n\nTime-based purging is set with the \nperiod\n tag and uses the \nreference time\n of the data. The reference time of the data is determined by the decoder. \n\n\n\n\n30-day NEXRAD3 Example\n\uf0c1\n\n\nModify \n/awips2/edex/data/utility/common_static/base/purge/radarPurgeRules.xml\n to increase the data retention period from 1 to 31 days:\n\n\n\n<purgeRuleSet>\n <defaultRule>\n \n<period>31-00:00:00</period>\n\n </defaultRule>\n</purgeRuleSet>\n\n\n\n\n\n\nNote\n: you do NOT have to restart EDEX when you change a purge rule!\n\n\n\n\n\n\nFrame-Based Purge\n\uf0c1\n\n\nSome plugins use frame-based purging, retaining a certain number of product \"versions\". 
\n\n\n/awips2/edex/data/utility/common_static/base/purge/gridPurgeRules.xml\n\n\n \n\n<defaultRule>\n <versionsToKeep>2</versionsToKeep>\n\n <period>07-00:00:00</period>\n </defaultRule>\n <rule>\n \n<keyValue>LAPS</keyValue>\n <versionsToKeep>30</versionsToKeep>\n\n </rule>\n <rule regex=\"true\">\n \n<keyValue>NAM(?:12|20|40)</keyValue>\n <versionsToKeep>2</versionsToKeep>\n <modTimeToWait>00-00:15:00</modTimeToWait>\n\n </rule>\n ...\n\n\n\n\nIn the above example, notice a \ndefault rule\n (2) is specified, as well as specific models with their own rules.\n\nThe tag \nmodTimeToWait\n can be used in conjunction with \nversionsToKeep\n and will increase the versionsToKeep by 1 if data matching this rule has been stored within modTimeToWait.\n\n\n\n\nPurge Logs\n\uf0c1\n\n\nData purge events are logged to the file \nedex-ingest-purge-[yyyymmdd].log\n, where \n[yyyymmdd]\n is the date stamp. \n\n\ntail -f edex-ingest-purge-20120327.log\n\n--------START LOG PURGE---------\nINFO 2012-03-27 00:30:00,027 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Skipped file with invalid fileName: afos-trigger.log\nINFO 2012-03-27 00:30:00,193 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Removed 1 old files\nINFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3]
"title": "Purging and Retention"
},
{
"location": "/edex/data-purge/#purging-and-retention",
"text": "",
"title": "Purging and Retention"
},
{
"location": "/edex/data-purge/#purge-types",
"text": "There are two main forms of data purging in AWIPS. The more familiar is the purging of processed data , which determines how long data is stored after it has been decoded and processed. The second type applies to raw data , and determines how long data is stored before it has been decoded.",
"title": "Purge Types"
},
{
"location": "/edex/data-purge/#processed-data-purging",
"text": "AWIPS uses a plugin-based purge strategy for processed HDF5 data . This allows the user to change the purge frequency for each plugin individually, and even set purge rules for specific products for a particular plugin. There is also a default purge rules file for those products which do not have specific rules written. Note : Purging is triggered by a quartz timer event that fires at 30 minutes after each hour. Purging rules are defined in XML files in the Localization Store. On EDEX, most are located in /awips2/edex/data/utility/common_static/base/purge , and follow the base/site localization pattern (e.g. site purge files are in site/XXX/purge rather than base/purge , where XXX is the site identifier). Each data set can have a purge rule defined, and the xml file is named after the data set: ls /awips2/edex/data/utility/common_static/base/purge/\n\nacarsPurgeRules.xml bufruaPurgeRules.xml pirepPurgeRules.xml\nacarssoundingPurgeRules.xml ccfpPurgeRules.xml poessoundingPurgeRules.xml\naggregatePurgeRules.xml convsigmetPurgeRules.xml pointsetPurgeRules.xml\nairepPurgeRules.xml cwaPurgeRules.xml profilerPurgeRules.xml\n...",
"title": "Processed Data Purging"
},
{
"location": "/edex/data-purge/#time-based-purge",
"text": "If a plugin has no XML file, the default rule of 1 day (24 hours) is used, from /awips2/edex/data/utility/common_static/base/purge/defaultPurgeRules.xml : \n<purgeRuleSet>\n <defaultRule>\n <period>01-00:00:00</period> \n </defaultRule>\n</purgeRuleSet> Time-based purging is set with the period tag and uses the reference time of the data. The reference time of the data is determined by the decoder.",
"title": "Time-based purge"
},
{
"location": "/edex/data-purge/#30-day-nexrad3-example",
"text": "Modify /awips2/edex/data/utility/common_static/base/purge/radarPurgeRules.xml to increase the data retention period from 1 to 31 days: \n<purgeRuleSet>\n <defaultRule>\n <period>31-00:00:00</period> \n </defaultRule>\n</purgeRuleSet> Note : you do NOT have to restart EDEX when you change a purge rule!",
"title": "30-day NEXRAD3 Example"
},
{
"location": "/edex/data-purge/#frame-based-purge",
"text": "Some plugins use frame-based purging, retaining a certain number of product \"versions\". /awips2/edex/data/utility/common_static/base/purge/gridPurgeRules.xml <defaultRule>\n <versionsToKeep>2</versionsToKeep> \n <period>07-00:00:00</period>\n </defaultRule>\n <rule>\n <keyValue>LAPS</keyValue>\n <versionsToKeep>30</versionsToKeep> \n </rule>\n <rule regex=\"true\">\n <keyValue>NAM(?:12|20|40)</keyValue>\n <versionsToKeep>2</versionsToKeep>\n <modTimeToWait>00-00:15:00</modTimeToWait> \n </rule>\n ... In the above example, notice a default rule (2) is specified, as well as specific models with their own rules. \nThe tag modTimeToWait can be used in conjunction with versionsToKeep and will increase the versionsToKeep by 1 if data matching this rule has been stored within modTimeToWait.",
"title": "Frame-Based Purge"
},
{
"location": "/edex/data-purge/#purge-logs",
"text": "Data purge events are logged to the file edex-ingest-purge-[yyyymmdd].log , where [yyyymmdd] is the date stamp. tail -f edex-ingest-purge-20120327.log\n\n--------START LOG PURGE---------\nINFO 2012-03-27 00:30:00,027 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Skipped file with invalid fileName: afos-trigger.log\nINFO 2012-03-27 00:30:00,193 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Removed 1 old files\nINFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Archived 14 files\nINFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Skipped processing 1 files\nINFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::---------END LOG PURGE-----------",
"title": "Purge Logs"
},
{
"location": "/edex/data-purge/#all-purge-rules",
"text": "To see all purge rule directories (base, site, configured): find /awips2/edex/data/utility -name purge\n\n/awips2/edex/data/utility/common_static/base/purge If any overrides have been made, then it's possible that site directories may show up as results from the find command as well.",
"title": "All Purge Rules"
},
{
"location": "/edex/data-purge/#raw-data-purging",
"text": "Raw data are files that have been brought in by the LDM and recognized by an action in the pqact.conf file. These files are written to subdirectories of /awips2/data_store/ . This data will wait here until it is purged, from the purging rules defined in /awips2/edex/data/utility/common_static/base/archiver/purger/RAW_DATA.xml . If the purge time is too short, and the processing latencies on EDEX are too long, it is possible that EDEX will miss some of this data, and the purge times will need to be adjusted by changing the <defaultRetentionHours> or <selectedRetentionHours> tag on the relevant data sets.",
"title": "Raw Data Purging"
},
{
"location": "/edex/data-purge/#default-retention",
"text": "The defaultRetentionHours tag is defined at the beginning of the RAW_DATA.xml file. It is the duration that will apply to any piece of data that does not fall under an explicitly defined category . The default value for our EDEX is 1 hour: \n<archive>\n <name>Raw</name>\n <rootDir>/awips2/data_store/</rootDir>\n <defaultRetentionHours>1</defaultRetentionHours> \n <category>\n ...",
"title": "Default Retention"
},
{
"location": "/edex/data-purge/#selected-retention",
"text": "Data sets are broken up into categories in the RAW_DATA.xml file. These categories are groupings of similar data. Each category has a selectedRetentionHours tag which specifies how long the matching data will be kept for. For example, there is a Model category which sets the purge time to 3 hours for all grib, bufrmos, and modelsounding data: \n... <category>\n <name>Model</name>\n <selectedRetentionHours>3</selectedRetentionHours> \n <dataSet>\n <dirPattern>(grib|grib2)/(\\d{4})(\\d{2})(\\d{2})/(\\d{2})/(.*)</dirPattern> \n <displayLabel>{1} - {6}</displayLabel>\n <dateGroupIndices>2,3,4,5</dateGroupIndices>\n </dataSet>\n <dataSet>\n <dirPattern>(bufrmos|modelsounding)/(\\d{4})(\\d{2})(\\d{2})/(\\d{2})</dirPattern> \n <displayLabel>{1}</displayLabel>\n <dateGroupIndices>2,3,4,5</dateGroupIndices>\n </dataSet>\n</category>\n...",
"title": "Selected Retention"
},
{
"location": "/edex/data-purge/#logging",
"text": "Raw data purging can be seen in the purge logs as well ( /awips2/edex/logs/edex-ingest-purge-[yyyymmdd].log where [yyyymmdd] is the date stamp). [centos@tg-atm160027-edex-dev purge]$ grep -i 'archive' /awips2/edex/logs/edex-ingest-purge-20200728.log\nINFO 2020-07-28 20:05:23,959 2329 [Purge-Archive] ArchivePurgeManager: EDEX - Start purge of category Raw - Observation, directory \"/awips2/data_store/bufrhdw\".\nINFO 2020-07-28 20:05:23,960 2330 [Purge-Archive] ArchivePurgeManager: EDEX - End purge of category Raw - Observation, directory \"/awips2/data_store/bufrhdw\", deleted 0 files and directories.\nINFO 2020-07-28 20:05:23,961 2331 [Purge-Archive] ArchivePurgeManager: EDEX - Unlocked: \"/awips2/data_store/bufrhdw\"\nINFO 2020-07-28 20:05:23,963 2332 [Purge-Archive] ArchivePurgeManager: EDEX - Locked: \"/awips2/data_store/xml\"\nINFO 2020-07-28 20:05:23,963 2333 [Purge-Archive] ArchivePurgeManager: EDEX - Start purge of category Raw - Products, directory \"/awips2/data_store/xml\".\nINFO 2020-07-28 20:05:23,964 2334 [Purge-Archive] ArchivePurgeManager: EDEX - End purge of category Raw - Products, directory \"/awips2/data_store/xml\", deleted 5 files and directories.\nINFO 2020-07-28 20:05:23,967 2335 [Purge-Archive] ArchivePurgeManager: EDEX - Unlocked: \"/awips2/data_store/xml\"\nINFO 2020-07-28 20:05:23,967 2336 [Purge-Archive] ArchivePurger: EDEX - Raw::Archive Purged 28387 files in 23.8s.\nINFO 2020-07-28 20:05:23,979 2337 [Purge-Archive] ArchivePurgeManager: EDEX - Purging directory: \"/awips2/edex/data/archive\".\nINFO 2020-07-28 20:05:23,992 2338 [Purge-Archive] ArchivePurger: EDEX - Processed::Archive Purged 0 files in 25ms.\nINFO 2020-07-28 20:05:23,992 2339 [Purge-Archive] ArchivePurger: EDEX - Archive Purge finished. Time to run: 23.9s\n...",
"title": "Logging"
},
{
"location": "/edex/edex-users/",
"text": "Monitor Users\n\uf0c1\n\n\nTo see a list of clients connecting to your EDEX server, use the \nedex users [YYYYMMDD]\n command, where \n[YYYYMMDD]\n is the optional date string.\n\n\nedex users\n\n -- EDEX Users 20160826 --\nuser@101.253.20.225\nuser@192.168.1.67\nawips@0.0.0.0\nawips@sdsmt.edu\n...\n\n\n\nLogging Daily EDEX Users\n\uf0c1\n\n\nTo get a running log of who has accessed EDEX, you can create a short script. \n\n\nThe example below is a script that runs once daily at 20 minutes after 00 UTC, appending each day's \nedex users\n list to a logfile \n/home/awips/edex-users.log\n:\n\n\n\n\n\n\nvi ~/edexUsers.sh\n\n#!/bin/bash\n/awips2/edex/bin/edex users >> /home/awips/edex-users.log\n\n\n\n\n\n\n\ncrontab -e\n\n20 0 * * * /home/awips/edexUsers.sh 1>> /dev/null 2>&1",
"title": "Monitor Users"
},
{
"location": "/edex/edex-users/#monitor-users",
"text": "To see a list of clients connecting to your EDEX server, use the edex users [YYYYMMDD] command, where [YYYYMMDD] is the optional date string. edex users\n\n -- EDEX Users 20160826 --\nuser@101.253.20.225\nuser@192.168.1.67\nawips@0.0.0.0\nawips@sdsmt.edu\n...",
"title": "Monitor Users"
},
{
"location": "/edex/edex-users/#logging-daily-edex-users",
"text": "To get a running log of who has accessed EDEX, you can create a short script. The example below is a script that runs once daily at 20 minutes after 00 UTC, appending each day's edex users list to a logfile /home/awips/edex-users.log : vi ~/edexUsers.sh\n\n#!/bin/bash\n/awips2/edex/bin/edex users >> /home/awips/edex-users.log crontab -e\n\n20 0 * * * /home/awips/edexUsers.sh 1>> /dev/null 2>&1",
"title": "Logging Daily EDEX Users"
},
{
"location": "/edex/data-plugins/",
"text": "AWIPS Plugins and Supported Data Types\n\uf0c1\n\n\n\n\n\n\n\n\nNAME\n\n\nDESCRIPTION\n\n\n\n\n\n\n\n\n\n\naqi\n\n\nAir Quality Index\n data\n\n\n\n\n\n\nbufrmos\n\n\nModel Output Statistics\n\n\n\n\n\n\nbufrua\n\n\nUpper air radiosonde data\n\n\n\n\n\n\nclimate-hmdb\n\n\nClimate text products\n\n\n\n\n\n\ngeodata\n\n\nNetCDF \nJTS Geometry\n records\n\n\n\n\n\n\ngeomag\n\n\nSWPC Geomagnetic Forecast\n (RTKP)\n\n\n\n\n\n\ngfe\n\n\nGraphical Forecast Editor\n grids\n\n\n\n\n\n\nghcd\n\n\nSWPC Generic High Cadence Data\n\n\n\n\n\n\ngpd\n\n\nNCEP Generic Point Data\n\n\n\n\n\n\ngrid\n\n\nBinary gridded data \ngrib1/grib2\n\n\n\n\n\n\nidft\n\n\nIce Drift Forecasts\n\n\n\n\n\n\nmadis\n\n\nNCEP Meteorological Assimilation Data Ingest System (\nMADIS\n)\n\n\n\n\n\n\nmanualIngest\n\n\nManual data ingest plugin\n\n\n\n\n\n\nmetartohmdb\n\n\nAdds metar records to the \nVerification and Climate database\n\n\n\n\n\n\nmodelsounding\n\n\nIndividual grid point soundings from the GFS and NAM models\n\n\n\n\n\n\nmping\n\n\nMeteorological Phenomena Identification Near the Ground (\nmPING\n)\n\n\n\n\n\n\nncpafm\n\n\nPoint/Area Forecast Matrices\n data\n\n\n\n\n\n\nnctext\n\n\nNCEP Text decoders\n\n\n\n\n\n\nncuair\n\n\nNCEP Upper Air decoder\n\n\n\n\n\n\nndm\n\n\nNational Dataset Maintenance ingester\n\n\n\n\n\n\nntrans\n\n\nNCEP Ntrans Metafiles\n\n\n\n\n\n\nobs\n\n\nSurface observations from METARs\n\n\n\n\n\n\npgen\n\n\nNCEP NAWIPS PGEN decoder\n\n\n\n\n\n\nredbook\n\n\nRedbook graphics\n\n\n\n\n\n\nsfcobs\n\n\nSurface observations other than METAR format including buoys\n\n\n\n\n\n\nsolarimage\n\n\nSWPC Solar imagery\n\n\n\n\n\n\nssha\n\n\nNCEP Sea Surface Height Anomaly BUFR data\n\n\n\n\n\n\ntext\n\n\nVarious Text Products\n\n\n\n\n\n\nvaa\n\n\nVolcanic ash advisories\n\n\n\n\n\n\n\n\nAWIPS Plugins for Remote Sensing/Lightning\n\uf0c1\n\n\n\n\n\n\n\n\nNAME\n\n\nDESCRIPTION\n\n\n\n\n\n\n\n\n\n\nbinlightning\n\n\nLightning 
data from the National Lightning Detection Network\n\n\n\n\n\n\nbufrascat\n\n\nAdvanced Scatterometer wind data\n\n\n\n\n\n\nbufrhdw\n\n\nGOES High Density Winds\n\n\n\n\n\n\nbufrmthdw\n\n\nMTSAT (Japanese Multi-Functional Transport Satellite) High Density Winds\n\n\n\n\n\n\nbufrssmi\n\n\nSpecial Sensor Microwave/Imager data from DMSP (Defense Meteorological Satellite Program) satellites\n\n\n\n\n\n\ncrimss\n\n\nNPP/NPOESS CrIMSS (Cross Track Infrared and Microwave Sounding Suite) soundings\n\n\n\n\n\n\ndmw\n\n\nGOES-R Derived Motion Winds\n\n\n\n\n\n\nglm\n\n\nGOES Geostationary Lightning Mapper\n\n\n\n\n\n\ngoesr\n\n\nPlugins to decode and display GOES-R products\n\n\n\n\n\n\ngoessounding\n\n\nGOES Satellite Soundings\n\n\n\n\n\n\nlma\n\n\nLightning Mapping Array\n\n\n\n\n\n\nmcidas\n\n\nNCEP decoder for McIDAS AREA files\n\n\n\n\n\n\nmodis\n\n\nNASA Moderate-resolution Imaging Spectroradiometer\n\n\n\n\n\n\nncscat\n\n\nNCEP ASCAT/Quikscat records\n\n\n\n\n\n\nnpp\n\n\nNational Polar-Orbiting Partnership Satellites Soundings\n\n\n\n\n\n\nnucaps\n\n\nSoundings from NOAA Unique CrIS/ATMS Processing System from NPP (National Polar-Orbiting Partnership) Satellites\n\n\n\n\n\n\npoessounding\n\n\nPolar Operational Environmental Satellite soundings\n\n\n\n\n\n\nradar\n\n\nWSR-88D and TDWR Level 3 data\n\n\n\n\n\n\nregionalsat\n\n\nDecoder implementation for netcdf3 files generated by the Alaska Region and GOES-R Proving Ground\n\n\n\n\n\n\nsatellite-gini\n\n\nGINI-formatted satellite imagery (GOES, POES, VIIRS, FNEXRAD)\n\n\n\n\n\n\nsatellite-mcidas\n\n\nMcIDAS area files (Raytheon/D2D-developed)\n\n\n\n\n\n\nviirs\n\n\nNPP Visible Infrared Imaging Radiometer Suite data\n\n\n\n\n\n\nsgwh\n\n\nNCEP BUFR Significant Wave Height data - SGWH (Jason-1), SGWHA (Altika), SGWHC (CryoSat), SGWHE (Envisat), SGWHG (GFO), SGWH2 (Jason-2), or Jason-3\n\n\n\n\n\n\ntextlightning\n\n\nText lightning data\n\n\n\n\n\n\n\n\nAWIPS Plugins for Decision Assistance 
(Watch/Warn/Hazards/Hydro)\n\uf0c1\n\n\n\n\n\n\n\n\nNAME\n\n\nDESCRIPTION\n\n\n\n\n\n\n\n\n\n\natcf\n\n\nAutomated
"title": "Data Plugins"
},
{
"location": "/edex/data-plugins/#awips-plugins-and-supported-data-types",
"text": "NAME DESCRIPTION aqi Air Quality Index data bufrmos Model Output Statistics bufrua Upper air radiosonde data climate-hmdb Climate text products geodata NetCDF JTS Geometry records geomag SWPC Geomagnetic Forecast (RTKP) gfe Graphical Forecast Editor grids ghcd SWPC Generic High Cadence Data gpd NCEP Generic Point Data grid Binary gridded data grib1/grib2 idft Ice Drift Forecasts madis NCEP Meteorological Assimilation Data Ingest System ( MADIS ) manualIngest Manual data ingest plugin metartohmdb Adds metar records to the Verification and Climate database modelsounding Individual grid point soundings from the GFS and NAM models mping Meteorological Phenomena Identification Near the Ground ( mPING ) ncpafm Point/Area Forecast Matrices data nctext NCEP Text decoders ncuair NCEP Upper Air decoder ndm National Dataset Maintenance ingester ntrans NCCEP Ntrans Metafiles obs Surface observations from METARs pgen NCEP NAWIPS PGEN decoder redbook Redbook graphics sfcobs Surface observations other than METAR format including buoys solarimage SWPC Solar imagery ssha NCEP Sea Surface Height Anomaly BUFR data text Various Text Products vaa Volcanic ash advisories",
"title": "AWIPS Plugins and Supported Data Types"
},
{
"location": "/edex/data-plugins/#awips-plugins-for-remote-sensinglightning",
"text": "NAME DESCRIPTION binlightning Lightning data from the National Lightning Detection Network bufrascat Advanced Scatterometer wind data bufrhdw GOES High Density Winds bufrmthdw MTSAT (Japanese Multi-Functional Transport Satellite) High Density Winds bufrssmi Special Sensor Microwave/Imager data from DMSP (Defesne Meteorological Satellite Program) satellites crimss NPP/NPOESS CrIMSS (Cross Track Infrared and Microwave Sounding Suite) soundings dmw GOES-R Derived Motion Winds glm GOES Geostationary Lightning Mapper goesr Plugins to decode and display GOES-R products goessounding GOES Satellite Soundings lma Lightning Mapping Array mcidas NCEP decoder for McIDAS AREA files modis NASA Moderate-resolution Imaging Spectroradiometer ncscat NCEP ASCAT/Quikscat records npp National Polar-Orbiting Partnership Satellites Soundings nucaps Soundings from NOAA Unique CrIS/ATMS Processing System from NPP (National Polar-Orbiting Partnership) Satellites poessounding Polar Operational Environmental Satellite soundings radar WSR-88D and TDWR Level 3 data regionalsat Decoder implementation for netcdf3 files generated by the Alaska Region and GOES-R Proving Ground satellite-gini GINI-formatted satellite imagery (GOES, POES, VIIRS, FNEXRAD) satellite-mcidas McIDAS area files (Raytheon/D2D-developed) viirs NPP Visible Infrared Imaging Radiometer Suite data sgwh NCEP BUFR Significant Wave Height data - SGWH (Jason-1), SGWHA (Altika), SGWHC (CryoSat), SGWHE (Envisat), SGWHG (GFO), SGWH2 (Jason-2), or Jason-3 textlightning Text lightning data",
"title": "AWIPS Plugins for Remote Sensing/Lightning"
},
{
"location": "/edex/data-plugins/#awips-plugins-for-decision-assistance-watchwarnhazardshydro",
"text": "NAME DESCRIPTION atcf Automated Tropical Cyclone Forecast convectprob NOAA/CIMSS Prob Severe Model editedregions Hazard Services Edited Regions editedevents Hazard Services Edited Events cwat County Warning Area Threat produced by SCAN (System for Convection Analysis and Nowcasting). CWAT was formerly called SCAN Convective Threat Index (SCTI). ffg Flash flood guidance metadata (countybased ffg from RFCs) ffmp Flash Flood Monitoring and Prediction data (raw data inputs: radar, gridded flash flood guidance from River Forecast Centers, highresolution precipitation estimates [HPE] and nowcasts [HPN], QPF from SCAN and gage data from the IHFS [Integrated Hydrologic Forecast System] database. Radar data [with WSR-88D product mnemonics and numbers] needed for FFMP are Digital Hybrid Reflectivity [DHR, 32] and Digital Precipitation Rate [DPR, 176]. The raw GRIB files containing RFC Flash Flood Guidance are identified in the tables in Part 2 of this document as NWS_151 or FFG-XXX, where XXX is an RFC identifier such as TUA, KRF, or ALR. The WMO header for the RFC FFG begins with \u201cZEGZ98\u201d. ) fog Fog Monitor . Raw data inputs: METAR, Mesonet, maritime, buoys, MAROBs, and satellite [visible, 3.9 \u00b5m, and 10.7 \u00b5m]) freezingLevel MPE Rapid Refresh Freezing Level scheduled process (MpeRUCFreezingLevel) fssobs Observations for the Fog monitor, SNOW, and SAFESEAS (raw data inputs: METAR, Mesonet, maritime, buoys, MAROBs). lsr Local Storm Reports mpe Multi-sensor Precipitation Estimation preciprate Precipitation Rate from SCAN. Raw data input: radar data [with WSR-88D product mnemonic and number] needed for preciprate are Digital Hybrid Reflectivity [DHR, 32]. qpf Quantitative Precipitation Forecast from SCAN. (raw data inputs: radar and some RAP13 fields. Radar data [with WSR-88D product mnemonics and numbers] needed for SCAN\u2019s QPF are 0.5 degree Base Reflectivity [Z, 19], 4 km Vertically Integrated Liquid [VIL, 57], and Storm Track [STI, 58]. 
The RAP13 field needed is 700 mb Wind, as defined in the SCANRunSiteConfig.xml file.) satpre Satellite-estimated Pecipiration (hydroApps) scan SCAN (System for Convection Analysis and Nowcasting). (Inputs for the SCAN Table include radar, cloud-to-ground lightning from the NLDN, fields from RAP13, and CWAT. Specific radar products [with WSR-88D product mnemonics and numbers] are: 1 km Composite Reflectivity [CZ, 37]; 0.5 degree Base Reflectivity [Z, 19]; 4 km Vertically Integrated Liquid [VIL, 57]; Storm Track [STI, 58]; Mesocyclone Detections [MD, 141]; and Tornadic Vortex Signature [TVS, 61]. shef Standard Hydrometeorological Exchange Format data. warning Watches, Warnings, and Advisories wcp SPC Convective Watches svrwx SPC Local Storm Report Summaries tcg Tropical Cyclone Guidance tcm Tropical Cyclone Forecast/Advisory tcs Tropical Cyclone Forecast/Advisory stormtrack NCEP StormTrack Plug-In (Automatic Tropical Cyclone Forecast & Ensemble cyclones) vil Cell-based Vertically Integrated Liquid from SCAN (Input is radar) spc Storm Prediction Center Convective Outlook KML files",
"title": "AWIPS Plugins for Decision Assistance (Watch/Warn/Hazards/Hydro)"
},
{
"location": "/edex/data-plugins/#awips-plugins-for-aviation",
"text": "NAME DESCRIPTION acars Aircraft Communications Addressing and Reporting System (ACARS) observations acarssounding Vertical profiles derived from ACARS data airep Automated Aircraft Reports airmet \u201cAirmen\u2019s Meteorological Information\u201d: aviation weather advisories for potentially hazardous, but non-severe weather asdi FAA Aircraft Situation Data for Industry aww Airport Weather Warning bufrncwf National Convective Weather Forecast for Aviation bufrsigwx Aviation Significant Weather ccfp Aviation Collaborative Convective Forecast Product convsigmet Aviation Significant Meteorological Information for convective weather cwa Aviation Center Weather Advisory, issued by CWSUs (Center Weather Service Units) intlsigmet International Significant Meteorological Information for Aviation nctaf NCEP TAF decoders nonconvsigmet Aviation Significant Meteorological Information for non-convective weather pirep Pilot Reports taf Terminal Aerodrome Forecasts",
"title": "AWIPS Plugins for Aviation"
},
{
"location": "/edex/case-studies/",
"text": "Case Study Server Configuration\n\uf0c1\n\n\nThis document covers what is necessary to install and run AWIPS EDEX as an archive and case study server (no purging of processed data).\n\n\n\n\nQuick Install\n\uf0c1\n\n\nFollow the \nEDEX Install Instructions\n including iptables config and an optional SSD mount (for large data volumes).\n\n\ngroupadd fxalpha && useradd -G fxalpha awips\nmkdir -p /awips2/data_store\nwget -O /etc/yum.repos.d/awips2.repo https://www.unidata.ucar.edu/software/awips2/doc/el7.repo\nyum clean all\nyum groupinstall awips2-server -y\n\n\n\n\n\nDisable Data Purging\n\uf0c1\n\n\nThe easiest way to disable data purging is to add an \n<exclude>purge.*</exclude>\n entry in \n/awips2/edex/conf/modes/modes.xml\n so that the purge plugin is not loaded when the EDEX ingest JVM is started:\n\n\nvi /awips2/edex/conf/modes/modes.xml\n\n<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"yes\"?>\n<edexModes>\n <mode name=\"ingest\">\n <exclude>.*request.*</exclude>\n <exclude>edex-security.xml</exclude>\n ...\n <exclude>purge.*</exclude>\n </mode>\n ...\n</edexModes>\n\n\n\n\n\nStart EDEX\n\uf0c1\n\n\nStart EDEX without running the LDM, since we do not want current data. Run the following command:\n\n\nedex start base\n\n\n\nDouble check everything is running, except the LDM:\n\n\nedex\n\n[edex status]\n postgres :: running :: pid 43644\n pypies :: running :: pid 3557\n qpid :: running :: pid 43742\n EDEXingest :: running :: pid 6564 44301 44597\n EDEXgrib :: running :: pid 6565 44302 44598\n EDEXrequest :: running :: pid 6566 44303 44599\n ldmadmin :: not running\n\n\n\n\n\nIngest Case Study Data\n\uf0c1\n\n\nRaw data files of any type can be copied or moved into \n/awips2/data_store/ingest/\n to be picked up and decoded by EDEX. Most data types are recognized by regular expression matching of the WMO Header or filename. 
\n\n\nIndividual files can be ingested on the command line with the regex header/pattern supplied as the last argument:\n\n\nqpidNotify.py /full/path/to/data.file [regex match]\n\n\n\nFor example:\n\n\nqpidNotify.py /home/awips/uniwisc_U5_132GOES-15_IMG10.7um_4km_20171024_1830.area.png uniwisc\n\nqpidNotify.py /awips2/data_store/grid/NAM12/conduit/NAM_CONUS_12km_conduit_20171025_1200Z_F084_TMPK-7.000007.grib2 grib\n\nqpidNotify.py /awips2/data_store/radar/FTG_N0Q_20171015_1815 Level3\n\n\n\n\n\nViewing Archive Data in CAVE\n\uf0c1\n\n\nBecause we are installing and configuring a standalone EDEX archive server without real-time LDM data ingest (and with purge disabled), any case study data that is ingested will be the \"latest available\" to CAVE, and you will see CAVE product menu times fill in with the latest of all data ingested.\n\n\nHowever, to display specific time-based data (in case you ingest more than one case study), there are two options:\n\n\nSet Load Mode to Inventory\n\uf0c1\n\n\nIn the top-left toolbar change \nValid time seq\n to \nInventory\n.\n\n\n\n\nNow any data product selected from the menus or the Product Browser should prompt you to select the exact time.\n\n\n\n\nSet Data Display Time in CAVE\n\uf0c1\n\n\nAt the bottom of the CAVE application, double-click the \nTime:\n entry to bring up a dialog window where you can set CAVE to a previous time, and choose the option of freezing CAVE at that time or allowing CAVE to \"move forward in time\" from that position as if it were real-time.",
"title": "Archive Case Studies"
},
{
"location": "/edex/case-studies/#case-study-server-configuration",
"text": "This document covers what is necessary to install and run AWIPS EDEX as an archive and case study server (no purging of processed data).",
"title": "Case Study Server Configuration"
},
{
"location": "/edex/case-studies/#quick-install",
"text": "Follow the EDEX Install Instructions including iptables config and an optional SSD mount (for large data volumes). groupadd fxalpha && useradd -G fxalpha awips\nmkdir -p /awips2/data_store\nwget -O /etc/yum.repos.d/awips2.repo https://www.unidata.ucar.edu/software/awips2/doc/el7.repo\nyum clean all\nyum groupinstall awips2-server -y",
"title": "Quick Install"
},
{
"location": "/edex/case-studies/#disable-data-purging",
"text": "The easiest way to disable data purging is to add an <exclude>purge.*</exclude> entry in /awips2/edex/conf/modes/modes.xml so that the purge plugin is not loaded when the EDEX ingest JVM is started: vi /awips2/edex/conf/modes/modes.xml\n\n<?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"yes\"?>\n<edexModes>\n <mode name=\"ingest\">\n <exclude>.*request.*</exclude>\n <exclude>edex-security.xml</exclude>\n ...\n <exclude>purge.*</exclude>\n </mode>\n ...\n</edexModes>",
"title": "Disable Data Purging"
},
{
"location": "/edex/case-studies/#start-edex",
"text": "Start EDEX without running the LDM, since we do not want current data. Run the following command: edex start base Double check everything is running, except the LDM: edex\n\n[edex status]\n postgres :: running :: pid 43644\n pypies :: running :: pid 3557\n qpid :: running :: pid 43742\n EDEXingest :: running :: pid 6564 44301 44597\n EDEXgrib :: running :: pid 6565 44302 44598\n EDEXrequest :: running :: pid 6566 44303 44599\n ldmadmin :: not running",
"title": "Start EDEX"
},
{
"location": "/edex/case-studies/#ingest-case-study-data",
"text": "Raw data files of any type can be copied or moved into /awips2/data_store/ingest/ to be picked up and decoded by EDEX. Most data types are recognized by regular expression matching of the WMO Header or filename. Individual files can be ingested on the command line with the regex header/pattern supplied as the last argument: qpidNotify.py /full/path/to/data.file [regex match] For example: qpidNotify.py /home/awips/uniwisc_U5_132GOES-15_IMG10.7um_4km_20171024_1830.area.png uniwisc\n\nqpidNotify.py /awips2/data_store/grid/NAM12/conduit/NAM_CONUS_12km_conduit_20171025_1200Z_F084_TMPK-7.000007.grib2 grib\n\nqpidNotify.py /awips2/data_store/radar/FTG_N0Q_20171015_1815 Level3",
"title": "Ingest Case Study Data"
},
{
"location": "/edex/case-studies/#viewing-archive-data-in-cave",
"text": "Because we are installing and configuring a standalone EDEX archive server without real-time LDM data ingest (and with purge disabled), any case study data that is ingested will be the \"latest available\" to CAVE, and you will see CAVE product menu time fill in with the latest of all data ingested. However, to display specific time-based data (in case you ingest more than one case study), there are two options:",
"title": "Viewing Archive Data in CAVE"
},
{
"location": "/edex/case-studies/#set-load-mode-to-inventory",
"text": "In the top-left toolbar change Valid time seq to Inventory . Now any data product selected from the menus or the Product Browser should prompt you to select the exact time.",
"title": "Set Load Mode to Inventory"
},
{
"location": "/edex/case-studies/#set-data-display-time-in-cave",
"text": "At the bottom of the CAVE application, double-click the Time: entry to bring up a dialog window where you can set CAVE to a previous time, and choose the option of freezing CAVE at that time or allowing CAVE to \"move forward in time\" from that position as if it were real-time.",
"title": "Set Data Display Time in CAVE"
},
{
"location": "/dev/awips-development-environment/",
"text": "AWIPS Development Environment (ADE)\n\uf0c1\n\n\nQuick instructions on how to download the latest source code and run CAVE from Eclipse.\n\n\n\n\nNote\n: It is important to keep in mind these instructions are intended for a system that is specifically used for developing AWIPS. It should not be used in conjunction with installed production versions of AWIPS.\n\n\n\n\n1. Remove AWIPS Instances\n\uf0c1\n\n\nFirst, make sure to remove any instances of AWIPS that are already installed, this can potentially cause problems when setting up the development environment. Below is an example that had CAVE installed.\n\n\nUninstall with yum:\n\n\nyum clean all\nyum groupremove awips2-cave\n\n\n\nCheck to make sure all rpms have been removed:\n\n\nrpm -qa | grep awips2\n\n\n\nRemove the awips2 directory:\n\n\nrm -rf /awips2\n\n\n\n\n\n2. Set Up AWIPS Repo\n\uf0c1\n\n\nCreate a repo file named \n/etc/yum.repos.d/awips2.repo\n, and set the contents to the following:\n\n\n\n[awips2repo]\nname=AWIPS II Repository\nbaseurl=https://www.unidata.ucar.edu/repos/yum/\nel7-dev\n/\nenabled=1\nprotect=0\ngpgcheck=0\nproxy=_none_\n\n\n\n\n\nNote\n: This file may already exist if AWIPS had been previously installed on the machine, so make sure to edit the baseurl.\n\n\n\n\n\n\n3. Install the ADE\n\uf0c1\n\n\nInstall the AWIPS Development Environment (ADE) using yum. This will install Eclipse (4.6.1), Java (1.8), Ant (1.9.6), Python 2.7 and its modules (Numpy, Matplotlib, Shapely, Jep, and others). \n\n\nyum clean all\nyum groupinstall awips2-ade\n\n\n\n\n\n4. 
Download the Source Code\n\uf0c1\n\n\nIf it's not already installed, install git:\n\n\nyum install git\n\n\n\nNext clone all of the required repositories for AWIPS:\n\n\ngit clone https://github.com/Unidata/awips2.git\ngit clone https://github.com/Unidata/awips2-core.git\ngit clone https://github.com/Unidata/awips2-core-foss.git\ngit clone https://github.com/Unidata/awips2-foss.git\ngit clone https://github.com/Unidata/awips2-ncep.git\ngit clone https://github.com/Unidata/awips2-nws.git\ngit clone https://github.com/Unidata/awips2-gsd.git\ngit clone https://github.com/Unidata/awips2-drawing.git\ngit clone https://github.com/Unidata/awips2-cimss.git\n\n\n\n\n\n5. Set Up Eclipse\n\uf0c1\n\n\nOpen eclipse by running: \n/awips2/eclipse/eclipse.sh\n\n\nVerify or make the following changes to set up eclipse for AWIPS development:\n\n\n\n\n\n\nPreferences > Java \n\n\nSet to \n/awips2/java\n\n\n\n\n\n\nPreferences > PyDev > Python Interpreter\n\n\nSet to \n/awips2/python/bin/python\n\n\n\n\n\n\nThere might be some unresolved errors. These should be made to warnings instead.\n\n\nPreferences > Java > Compiler > Building > \nCircular Dependencies\n > Change to Warning\nPreferences > Plug-in Development > API Baselines > \nMissing API Baseline\n > Change to Warning\n\n\n\n\n\n\nTurn off automatic building\n (you will turn this back on after importing the repos)\n\n\nProject > Uncheck \"Build Automatically\"\n\n\n\n\n\n\nFile > Import > General > Existing Projects Into Workspace\n\n\nImport all of the git cloned project folders \nEXCEPT\n for the main (first) \ngithub.com/Unidata/awips2.git\n directory (which should be \n~/awips2\n).\n\nSelect \nawips2-core\n, \nawips2-core-foss\n, \nawips2-foss\n, \nawips2-ncep\n, etc. 
> Select All Projects > Finish \n\n\nYou'll want to import \n~/awips2\n in two parts to ensure a clean and error-free Eclipse build:\n\n\n\n\nImport \nawips2/cave\n > Select All Projects > Finish\n\n\nImport \nawips2/edexOsgi\n > Select All Projects > Finish\n\n\n\n\n\n\n\n\nProject > Clean\n\n\nClean the build and ensure no errors are reported. \n\n\n\n\n\n\nTurn automatic building back on\n\n\nProject > Check \"Build Automatically\"\n\n\n\n\n\n\n\n\n6. Run CAVE\n\uf0c1\n\n\nLaunch CAVE from eclipse using \ncom.raytheon.viz.product.awips/developer.product\n.\n\n\nDouble-click the \ndeveloper.product\n file to open the Product View in Eclipse. Select \nOverview\n > \nSynchronize\n and then right-click the file in the left-side package explorer:\n\n\nSelect \nRu
"title": "Development"
},
{
"location": "/dev/awips-development-environment/#awips-development-environment-ade",
"text": "Quick instructions on how to download the latest source code and run CAVE from Eclipse. Note : It is important to keep in mind these instructions are intended for a system that is specifically used for developing AWIPS. It should not be used in conjunction with installed production versions of AWIPS.",
"title": "AWIPS Development Environment (ADE)"
},
{
"location": "/dev/awips-development-environment/#1-remove-awips-instances",
"text": "First, make sure to remove any instances of AWIPS that are already installed, this can potentially cause problems when setting up the development environment. Below is an example that had CAVE installed. Uninstall with yum: yum clean all\nyum groupremove awips2-cave Check to make sure all rpms have been removed: rpm -qa | grep awips2 Remove the awips2 directory: rm -rf /awips2",
"title": "1. Remove AWIPS Instances"
},
{
"location": "/dev/awips-development-environment/#2-set-up-awips-repo",
"text": "Create a repo file named /etc/yum.repos.d/awips2.repo , and set the contents to the following: \n[awips2repo]\nname=AWIPS II Repository\nbaseurl=https://www.unidata.ucar.edu/repos/yum/ el7-dev /\nenabled=1\nprotect=0\ngpgcheck=0\nproxy=_none_ Note : This file may already exist if AWIPS had been previously installed on the machine, so make sure to edit the baseurl.",
"title": "2. Set Up AWIPS Repo"
},
{
"location": "/dev/awips-development-environment/#3-install-the-ade",
"text": "Install the AWIPS Development Environment (ADE) using yum. This will install Eclipse (4.6.1), Java (1.8), Ant (1.9.6), Python 2.7 and its modules (Numpy, Matplotlib, Shapely, Jep, and others). yum clean all\nyum groupinstall awips2-ade",
"title": "3. Install the ADE"
},
{
"location": "/dev/awips-development-environment/#4-download-the-source-code",
"text": "If it's not already installed, install git: yum install git Next clone all of the required repositories for AWIPS: git clone https://github.com/Unidata/awips2.git\ngit clone https://github.com/Unidata/awips2-core.git\ngit clone https://github.com/Unidata/awips2-core-foss.git\ngit clone https://github.com/Unidata/awips2-foss.git\ngit clone https://github.com/Unidata/awips2-ncep.git\ngit clone https://github.com/Unidata/awips2-nws.git\ngit clone https://github.com/Unidata/awips2-gsd.git\ngit clone https://github.com/Unidata/awips2-drawing.git\ngit clone https://github.com/Unidata/awips2-cimss.git",
"title": "4. Download the Source Code"
},
{
"location": "/dev/awips-development-environment/#5-set-up-eclipse",
"text": "Open eclipse by running: /awips2/eclipse/eclipse.sh Verify or make the following changes to set up eclipse for AWIPS development: Preferences > Java Set to /awips2/java Preferences > PyDev > Python Interpreter Set to /awips2/python/bin/python There might be some unresolved errors. These should be made to warnings instead. Preferences > Java > Compiler > Building > Circular Dependencies > Change to Warning\nPreferences > Plug-in Development > API Baselines > Missing API Baseline > Change to Warning Turn off automatic building (you will turn this back on after importing the repos) Project > Uncheck \"Build Automatically\" File > Import > General > Existing Projects Into Workspace Import all of the git cloned project folders EXCEPT for the main (first) github.com/Unidata/awips2.git directory (which should be ~/awips2 ). \nSelect awips2-core , awips2-core-foss , awips2-foss , awips2-ncep , etc. > Select All Projects > Finish You'll want to import ~/awips2 in two parts to ensure a clean and error-free Eclipse build: Import awips2/cave > Select All Projects > Finish Import awips2/edexOsgi > Select All Projects > Finish Project > Clean Clean the build and ensure no errors are reported. Turn automatic building back on Project > Check \"Build Automatically\"",
"title": "5. Set Up Eclipse"
},
{
"location": "/dev/awips-development-environment/#6-run-cave",
"text": "Launch CAVE from eclipse using com.raytheon.viz.product.awips/developer.product . Double-click the developer.product file to open the Product View in Eclipse. Select Overview > Synchronize and then right-click the file in the left-side package explorer: Select Run As > Eclipse Application to launch CAVE in the development environment. Select Debug > Eclipse Application to launch CAVE in in debug mode.",
"title": "6. Run CAVE"
},
{
"location": "/dev/awips-development-environment/#troubleshooting",
"text": "If you are getting a lot of errors, try changing your Java Compiler to 1.7, build the project, then change back to 1.8 and rebuild.",
"title": "Troubleshooting"
},
{
"location": "/python/overview/",
"text": "Python API\n\uf0c1\n\n\nThe \npython-awips\n package provides a data access framework (DAF) for requesting grid and geometry datasets from an EDEX server. The DAF can be used to interact in command line, or scripting form, with the EDEX. This is an alternative to using CAVE to interact with the data.\n\n\nThorough documentation for python-awips can be found \nhere\n.",
"title": "Python API"
},
{
"location": "/python/overview/#python-api",
"text": "The python-awips package provides a data access framework (DAF) for requesting grid and geometry datasets from an EDEX server. The DAF can be used to interact in command line, or scripting form, with the EDEX. This is an alternative to using CAVE to interact with the data. Thorough documentation for python-awips can be found here .",
"title": "Python API"
},
{
"location": "/appendix/appendix-grid-parameters/",
"text": "Abbreviation\n\n\nDescription\n\n\nUnits\n\n\n\n\n\n\n\n\n\n\nWSPD\n\n\n10 Metre neutral wind speed over waves\n\n\nm/s\n\n\n\n\n\n\nWDRT\n\n\n10 Metre Wind Direction Over Waves\n\n\nDegree\n\n\n\n\n\n\nARI12H1000YR\n\n\n12H Average Recurrance Interval Accumulation 1000 Year\n\n\nin*1000\n\n\n\n\n\n\nARI12H100YR\n\n\n12H Average Recurrance Interval Accumulation 100 Year\n\n\nin*1000\n\n\n\n\n\n\nARI12H10YR\n\n\n12H Average Recurrance Interval Accumulation 10 Year\n\n\nin*1000\n\n\n\n\n\n\nARI12H1YR\n\n\n12H Average Recurrance Interval Accumulation 1 Year\n\n\nin*1000\n\n\n\n\n\n\nARI12H200YR\n\n\n12H Average Recurrance Interval Accumulation 200 Year\n\n\nin*1000\n\n\n\n\n\n\nARI12H25YR\n\n\n12H Average Recurrance Interval Accumulation 25 Year\n\n\nin*1000\n\n\n\n\n\n\nARI12H2YR\n\n\n12H Average Recurrance Interval Accumulation 2 Year\n\n\nin*1000\n\n\n\n\n\n\nARI12H500YR\n\n\n12H Average Recurrance Interval Accumulation 500 Year\n\n\nin*1000\n\n\n\n\n\n\nARI12H50YR\n\n\n12H Average Recurrance Interval Accumulation 50 Year\n\n\nin*1000\n\n\n\n\n\n\nARI12H5YR\n\n\n12H Average Recurrance Interval Accumulation 5 Year\n\n\nin*1000\n\n\n\n\n\n\nPRP12H\n\n\n12 hour Precipitation Accumulation Return Period\n\n\nyear\n\n\n\n\n\n\nGaugeInfIndex12H\n\n\n12 hour QPE Gauge Influence Index\n\n\n\n\n\n\n\n\nFFG12\n\n\n12-hr flash flood guidance\n\n\nmm\n\n\n\n\n\n\nFFR12\n\n\n12-hr flash flood runoff values\n\n\nmm\n\n\n\n\n\n\nEchoTop18\n\n\n18 dBZ Echo Top\n\n\nkm\n\n\n\n\n\n\nARI1H1000YR\n\n\n1H Average Recurrance Interval Accumulation 1000 Year\n\n\nin*1000\n\n\n\n\n\n\nARI1H100YR\n\n\n1H Average Recurrance Interval Accumulation 100 Year\n\n\nin*1000\n\n\n\n\n\n\nARI1H10YR\n\n\n1H Average Recurrance Interval Accumulation 10 Year\n\n\nin*1000\n\n\n\n\n\n\nARI1H1YR\n\n\n1H Average Recurrance Interval Accumulation 1 Year\n\n\nin*1000\n\n\n\n\n\n\nARI1H200YR\n\n\n1H Average Recurrance Interval Accumulation 200 Year\n\n\nin*1000\n\n\n\n\n\n\nARI1H25YR\n\n\n1H Average 
Recurrance Interval Accumulation 25 Year\n\n\nin*1000\n\n\n\n\n\n\nARI1H2YR\n\n\n1H Average Recurrance Interval Accumulation 2 Year\n\n\nin*1000\n\n\n\n\n\n\nARI1H500YR\n\n\n1H Average Recurrance Interval Accumulation 500 Year\n\n\nin*1000\n\n\n\n\n\n\nARI1H50YR\n\n\n1H Average Recurrance Interval Accumulation 50 Year\n\n\nin*1000\n\n\n\n\n\n\nARI1H5YR\n\n\n1H Average Recurrance Interval Accumulation 5 Year\n\n\nin*1000\n\n\n\n\n\n\nPRP01H\n\n\n1 hour Precipitation Accumulation Return Period\n\n\nyear\n\n\n\n\n\n\nGaugeInfIndex01H\n\n\n1 hour QPE Gauge Influence Index\n\n\n\n\n\n\n\n\nQPEFFG01H\n\n\n1 hour QPE-to-FFG Ratio\n\n\n%\n\n\n\n\n\n\nFFG01\n\n\n1-hr flash flood guidance\n\n\nmm\n\n\n\n\n\n\nFFR01\n\n\n1-hr flash flood runoff values\n\n\nmm\n\n\n\n\n\n\nQPE01\n\n\n1-hr Quantitative Precip Estimate\n\n\nmm\n\n\n\n\n\n\nQPE01_ACR\n\n\n1-hr Quantitative Precip Estimate\n\n\nmm\n\n\n\n\n\n\nQPE01_ALR\n\n\n1-hr Quantitative Precip Estimate\n\n\nmm\n\n\n\n\n\n\nQPE01_FWR\n\n\n1-hr Quantitative Precip Estimate\n\n\nmm\n\n\n\n\n\n\nQPE01_KRF\n\n\n1-hr Quantitative Precip Estimate\n\n\nmm\n\n\n\n\n\n\nQPE01_MSR\n\n\n1-hr Quantitative Precip Estimate\n\n\nmm\n\n\n\n\n\n\nQPE01_ORN\n\n\n1-hr Quantitative Precip Estimate\n\n\nmm\n\n\n\n\n\n\nQPE01_PTR\n\n\n1-hr Quantitative Precip Estimate\n\n\nmm\n\n\n\n\n\n\nQPE01_RHA\n\n\n1-hr Quantitative Precip Estimate\n\n\nmm\n\n\n\n\n\n\nQPE01_RSA\n\n\n1-hr Quantitative Precip Estimate\n\n\nmm\n\n\n\n\n\n\nQPE01_STR\n\n\n1-hr Quantitative Precip Estimate\n\n\nmm\n\n\n\n\n\n\nQPE01_TAR\n\n\n1-hr Quantitative Precip Estimate\n\n\nmm\n\n\n\n\n\n\nQPE01_TIR\n\n\n1-hr Quantitative Precip Estimate\n\n\nmm\n\n\n\n\n\n\nQPE01_TUA\n\n\n1-hr Quantitative Precip Estimate\n\n\nmm\n\n\n\n\n\n\nEVEC1\n\n\n1st Vector Component of Electric Field\n\n\nV*m^1\n\n\n\n\n\n\nBVEC1\n\n\n1st Vector Component of Magnetic Field\n\n\nT\n\n\n\n\n\n\nVEL1\n\n\n1st Vector Component of Velocity (Coordinate system 
dependent)\n\n\nm*s^1\n\n\n\n\n\n\nTCSRG20\n\n\n20% Tropical Cyclone Storm Surge Exceedance\n\n\nm\n\n\n\n\n\n\nARI24H1000YR\n\n\n24H
"title": "AWIPS Grid Parameters"
},
{
"location": "/python/maps-database/",
"text": "mapdata.airport\n\uf0c1\n\n\n\n\n\n\n\n\nColumn\n\n\nType\n\n\n\n\n\n\n\n\n\n\narpt_id\n\n\ncharacter varying(4)\n\n\n\n\n\n\nname\n\n\ncharacter varying(42)\n\n\n\n\n\n\ncity\n\n\ncharacter varying(40)\n\n\n\n\n\n\nstate\n\n\ncharacter varying(2)\n\n\n\n\n\n\nsiteno\n\n\ncharacter varying(9)\n\n\n\n\n\n\nsite_type\n\n\ncharacter varying(1)\n\n\n\n\n\n\nfac_use\n\n\ncharacter varying(2)\n\n\n\n\n\n\nowner_type\n\n\ncharacter varying(2)\n\n\n\n\n\n\nelv\n\n\ninteger\n\n\n\n\n\n\nlatitude\n\n\ncharacter varying(16)\n\n\n\n\n\n\nlongitude\n\n\ncharacter varying(16)\n\n\n\n\n\n\nlon\n\n\ndouble precision\n\n\n\n\n\n\nlat\n\n\ndouble precision\n\n\n\n\n\n\nthe_geom\n\n\ngeometry(Point,4326)\n\n\n\n\n\n\n\n\nok\n\n\nmapdata.allrivers\n\uf0c1\n\n\n\n\n\n\n\n\nColumn\n\n\nType\n\n\n\n\n\n\n\n\n\n\nihabbsrf_i\n\n\ndouble precision\n\n\n\n\n\n\nrr\n\n\ncharacter varying(11)\n\n\n\n\n\n\nhuc\n\n\ninteger\n\n\n\n\n\n\ntype\n\n\ncharacter varying(1)\n\n\n\n\n\n\npmile\n\n\ndouble precision\n\n\n\n\n\n\npname\n\n\ncharacter varying(30)\n\n\n\n\n\n\nowname\n\n\ncharacter varying(30)\n\n\n\n\n\n\npnmcd\n\n\ncharacter varying(11)\n\n\n\n\n\n\nownmcd\n\n\ncharacter varying(11)\n\n\n\n\n\n\ndsrr\n\n\ndouble precision\n\n\n\n\n\n\ndshuc\n\n\ninteger\n\n\n\n\n\n\nusdir\n\n\ncharacter varying(1)\n\n\n\n\n\n\nlev\n\n\nsmallint\n\n\n\n\n\n\nj\n\n\nsmallint\n\n\n\n\n\n\ntermid\n\n\ninteger\n\n\n\n\n\n\ntrmblv\n\n\nsmallint\n\n\n\n\n\n\nk\n\n\nsmallint\n\n\n\n\n\n\nthe_geom\n\n\ngeometry(MultiLineString,4326)\n\n\n\n\n\n\n\n\nmapdata.artcc\n\uf0c1\n\n\n\n\n\n\n\n\nColumn\n\n\nType\n\n\n\n\n\n\n\n\n\n\nartcc\n\n\ncharacter varying(4)\n\n\n\n\n\n\nalt\n\n\ncharacter varying(1)\n\n\n\n\n\n\nname\n\n\ncharacter varying(30)\n\n\n\n\n\n\ntype\n\n\ncharacter varying(5)\n\n\n\n\n\n\ncity\n\n\ncharacter varying(40)\n\n\n\n\n\n\nid\n\n\ndouble 
precision\n\n\n\n\n\n\nthe_geom\n\n\ngeometry(MultiPolygon,4326)\n\n\n\n\n\n\n\n\nmapdata.basins\n\uf0c1\n\n\n\n\n\n\n\n\nColumn\n\n\nType\n\n\n\n\n\n\n\n\n\n\nrfc\n\n\ncharacter varying(7)\n\n\n\n\n\n\ncwa\n\n\ncharacter varying(5)\n\n\n\n\n\n\nid\n\n\ncharacter varying(8)\n\n\n\n\n\n\nname\n\n\ncharacter varying(64)\n\n\n\n\n\n\nlon\n\n\ndouble precision\n\n\n\n\n\n\nlat\n\n\ndouble precision\n\n\n\n\n\n\nthe_geom\n\n\ngeometry(MultiPolygon,4326)\n\n\n\n\n\n\nthe_geom_0\n\n\ngeometry(MultiPolygon,4326)\n\n\n\n\n\n\nthe_geom_0_064\n\n\ngeometry(MultiPolygon,4326)\n\n\n\n\n\n\nthe_geom_0_016\n\n\ngeometry(MultiPolygon,4326)\n\n\n\n\n\n\nthe_geom_0_004\n\n\ngeometry(MultiPolygon,4326)\n\n\n\n\n\n\nthe_geom_0_001\n\n\ngeometry(MultiPolygon,4326)\n\n\n\n\n\n\n\n\nmapdata.canada\n\uf0c1\n\n\n\n\n\n\n\n\nColumn\n\n\nType\n\n\n\n\n\n\n\n\n\n\nf_code\n\n\ncharacter varying(5)\n\n\n\n\n\n\nname_en\n\n\ncharacter varying(25)\n\n\n\n\n\n\nnom_fr\n\n\ncharacter varying(25)\n\n\n\n\n\n\ncountry\n\n\ncharacter varying(3)\n\n\n\n\n\n\ncgns_fid\n\n\ncharacter varying(32)\n\n\n\n\n\n\nthe_geom\n\n\ngeometry(MultiPolygon,4326)\n\n\n\n\n\n\n\n\nmapdata.city\n\uf0c1\n\n\n\n\n\n\n\n\nColumn\n\n\nType\n\n\n\n\n\n\n\n\n\n\nst_fips\n\n\ncharacter varying(4)\n\n\n\n\n\n\nsfips\n\n\ncharacter varying(2)\n\n\n\n\n\n\ncounty_fip\n\n\ncharacter varying(4)\n\n\n\n\n\n\ncfips\n\n\ncharacter varying(4)\n\n\n\n\n\n\npl_fips\n\n\ncharacter varying(7)\n\n\n\n\n\n\nid\n\n\ncharacter varying(20)\n\n\n\n\n\n\nname\n\n\ncharacter varying(39)\n\n\n\n\n\n\nelevation\n\n\ncharacter varying(60)\n\n\n\n\n\n\npop_1990\n\n\nnumeric\n\n\n\n\n\n\npopulation\n\n\ncharacter varying(30)\n\n\n\n\n\n\nst\n\n\ncharacter varying(6)\n\n\n\n\n\n\nwarngenlev\n\n\ncharacter varying(16)\n\n\n\n\n\n\nwarngentyp\n\n\ncharacter varying(16)\n\n\n\n\n\n\nwatch_warn\n\n\ncharacter varying(3)\n\n\n\n\n\n\nzwatch_war\n\n\ndouble precision\n\n\n\n\n\n\nprog_disc\n\n\ninteger\n\n\n\n\n\n\nzprog_disc\n\n\ndouble 
precision\n\n\n\n\n\n\ncomboflag\n\n\ndouble precision\n\n\n\n\n\n\nland_water\n\n\ncharacter varying(16)\n\n\n\n\n\n\nrecnum\n\n\ndouble precision\n\n\n\n\n\n\nlon\n\n\ndouble precision\n\n\n\n\n\n\nlat\n\n\ndouble precision\n\n\n\n\n\n\nf3\n\n\ndouble pr
"title": "Maps Database"
},
{
"location": "/python/maps-database/#mapdataairport",
"text": "Column Type arpt_id character varying(4) name character varying(42) city character varying(40) state character varying(2) siteno character varying(9) site_type character varying(1) fac_use character varying(2) owner_type character varying(2) elv integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326)",
"title": "mapdata.airport"
},
{
"location": "/python/maps-database/#mapdataallrivers",
"text": "Column Type ihabbsrf_i double precision rr character varying(11) huc integer type character varying(1) pmile double precision pname character varying(30) owname character varying(30) pnmcd character varying(11) ownmcd character varying(11) dsrr double precision dshuc integer usdir character varying(1) lev smallint j smallint termid integer trmblv smallint k smallint the_geom geometry(MultiLineString,4326)",
"title": "mapdata.allrivers"
},
{
"location": "/python/maps-database/#mapdataartcc",
"text": "Column Type artcc character varying(4) alt character varying(1) name character varying(30) type character varying(5) city character varying(40) id double precision the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.artcc"
},
{
"location": "/python/maps-database/#mapdatabasins",
"text": "Column Type rfc character varying(7) cwa character varying(5) id character varying(8) name character varying(64) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) the_geom_0 geometry(MultiPolygon,4326) the_geom_0_064 geometry(MultiPolygon,4326) the_geom_0_016 geometry(MultiPolygon,4326) the_geom_0_004 geometry(MultiPolygon,4326) the_geom_0_001 geometry(MultiPolygon,4326)",
"title": "mapdata.basins"
},
{
"location": "/python/maps-database/#mapdatacanada",
"text": "Column Type f_code character varying(5) name_en character varying(25) nom_fr character varying(25) country character varying(3) cgns_fid character varying(32) the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.canada"
},
{
"location": "/python/maps-database/#mapdatacity",
"text": "Column Type st_fips character varying(4) sfips character varying(2) county_fip character varying(4) cfips character varying(4) pl_fips character varying(7) id character varying(20) name character varying(39) elevation character varying(60) pop_1990 numeric population character varying(30) st character varying(6) warngenlev character varying(16) warngentyp character varying(16) watch_warn character varying(3) zwatch_war double precision prog_disc integer zprog_disc double precision comboflag double precision land_water character varying(16) recnum double precision lon double precision lat double precision f3 double precision f4 character varying(254) f6 double precision state character varying(25) the_geom geometry(Point,4326)",
"title": "mapdata.city"
},
{
"location": "/python/maps-database/#mapdatacounty",
"text": "Column Type state character varying(2) cwa character varying(9) countyname character varying(24) fips character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.county"
},
{
"location": "/python/maps-database/#mapdatacustomlocations",
"text": "Column Type bullet character varying(16) name character varying(64) cwa character varying(12) rfc character varying(8) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.customlocations"
},
{
"location": "/python/maps-database/#mapdatacwa",
"text": "Column Type cwa character varying(9) wfo character varying(3) lon numeric lat numeric region character varying(2) fullstaid character varying(4) citystate character varying(50) city character varying(50) state character varying(50) st character varying(2) the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.cwa"
},
{
"location": "/python/maps-database/#mapdatafirewxaor",
"text": "Column Type cwa character varying(3) wfo character varying(3) the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.firewxaor"
},
{
"location": "/python/maps-database/#mapdatafirewxzones",
"text": "Column Type state character varying(2) zone character varying(3) cwa character varying(3) name character varying(254) state_zone character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.firewxzones"
},
{
"location": "/python/maps-database/#mapdatafix",
"text": "Column Type id character varying(30) type character varying(2) use character varying(5) state character varying(2) min_alt integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326)",
"title": "mapdata.fix"
},
{
"location": "/python/maps-database/#mapdatahighaltitude",
"text": "Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326)",
"title": "mapdata.highaltitude"
},
{
"location": "/python/maps-database/#mapdatahighsea",
"text": "Column Type wfo character varying(3) name character varying(250) lat numeric lon numeric id character varying(5) the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.highsea"
},
{
"location": "/python/maps-database/#mapdatahighway",
"text": "Column Type prefix character varying(2) pretype character varying(6) name character varying(30) type character varying(6) suffix character varying(2) class character varying(1) class_rte character varying(1) hwy_type character varying(1) hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326)",
"title": "mapdata.highway"
},
{
"location": "/python/maps-database/#mapdatahsa",
"text": "Column Type wfo character varying(3) lon double precision lat double precision the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.hsa"
},
{
"location": "/python/maps-database/#mapdatainterstate",
"text": "Column Type prefix character varying(2) pretype Ushy Hwy Ave Cord Rt Loop I Sthy name character varying(30) type character varying(6) suffix character varying(2) hwy_type I U S hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326)",
"title": "mapdata.interstate"
},
{
"location": "/python/maps-database/#mapdataisc",
"text": "Column Type wfo character varying(3) cwa character varying(3) the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.isc"
},
{
"location": "/python/maps-database/#mapdatalake",
"text": "Column Type name character varying(40) feature character varying(40) lon double precision lat double precision the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.lake"
},
{
"location": "/python/maps-database/#mapdatalatlon10",
"text": "Column Type the_geom geometry(MultiLineString,4326)",
"title": "mapdata.latlon10"
},
{
"location": "/python/maps-database/#mapdatalowaltitude",
"text": "Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326)",
"title": "mapdata.lowaltitude"
},
{
"location": "/python/maps-database/#mapdatamajorrivers",
"text": "Column Type rf1_150_id double precision huc integer seg smallint milept double precision seqno double precision rflag character varying(1) owflag character varying(1) tflag character varying(1) sflag character varying(1) type character varying(1) segl double precision lev smallint j smallint k smallint pmile double precision arbsum double precision usdir character varying(1) termid integer trmblv smallint pname character varying(30) pnmcd character varying(11) owname character varying(30) ownmcd character varying(11) dshuc integer dsseg smallint dsmlpt double precision editrf1_ double precision demand double precision ftimped double precision tfimped double precision dir double precision rescode double precision center double precision erf1__ double precision reservoir_ double precision pname_res character varying(30) pnmcd_res character varying(11) meanq double precision lowq double precision meanv double precision lowv double precision worka double precision gagecode double precision strahler double precision rr character varying(11) dsrr double precision huc2 smallint huc4 smallint huc6 integer the_geom geometry(MultiLineString,4326)",
"title": "mapdata.majorrivers"
},
{
"location": "/python/maps-database/#mapdatamarinesites",
"text": "Column Type st character varying(3) name character varying(50) prog_disc bigint warngenlev character varying(14) the_geom geometry(Point,4326)",
"title": "mapdata.marinesites"
},
{
"location": "/python/maps-database/#mapdatamarinezones",
"text": "Column Type id character varying(6) wfo character varying(3) gl_wfo character varying(3) name character varying(254) ajoin0 character varying(6) ajoin1 character varying(6) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.marinezones"
},
{
"location": "/python/maps-database/#mapdatamexico",
"text": "Column Type area double precision perimeter double precision st_mx_ double precision st_mx_id double precision name character varying(66) country character varying(127) continent character varying(127) the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.mexico"
},
{
"location": "/python/maps-database/#mapdatanavaid",
"text": "Column Type id character varying(30) clscode character varying(11) city character varying(40) elv integer freq double precision name character varying(30) status character varying(30) type character varying(25) oprhours character varying(11) oprname character varying(50) latdms character varying(16) londms character varying(16) airway character varying(254) sym smallint lon double precision lat double precision the_geom geometry(Point,4326)",
"title": "mapdata.navaid"
},
{
"location": "/python/maps-database/#mapdataoffshore",
"text": "Column Type id character varying(50) wfo character varying(10) lon numeric lat numeric location character varying(70) name character varying(90) the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.offshore"
},
{
"location": "/python/maps-database/#mapdatarailroad",
"text": "Column Type fnode_ double precision tnode_ double precision lpoly_ double precision rpoly_ double precision length numeric railrdl021 double precision railrdl020 double precision feature character varying(18) name character varying(43) state character varying(2) state_fips character varying(2) the_geom geometry(MultiLineString,4326)",
"title": "mapdata.railroad"
},
{
"location": "/python/maps-database/#mapdatarfc",
"text": "Column Type site_id character varying(3) state character varying(2) rfc_name character varying(18) rfc_city character varying(25) basin_id character varying(5) the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.rfc"
},
{
"location": "/python/maps-database/#mapdataspecialuse",
"text": "Column Type name character varying(32) code character varying(16) yn smallint alt_desc character varying(128) artcc character varying(4) ctr_agen character varying(128) sch_agen character varying(128) state character varying(2) the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.specialuse"
},
{
"location": "/python/maps-database/#mapdatastates",
"text": "Column Type state character varying(2) name character varying(24) fips character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.states"
},
{
"location": "/python/maps-database/#mapdatatimezones",
"text": "Column Type name character varying(50) time_zone character varying(1) standard character varying(9) advanced character varying(10) unix_time character varying(19) lon double precision lat double precision the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.timezones"
},
{
"location": "/python/maps-database/#mapdatawarngenloc",
"text": "Column Type name character varying(254) st character varying(3) state character varying(20) population integer warngenlev integer cwa character varying(4) goodness double precision lat numeric lon numeric usedirs numeric(10,0) supdirs character varying(20) landwater character varying(3) recnum integer the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.warngenloc"
},
{
"location": "/python/maps-database/#mapdataworld",
"text": "Column Type name character varying(30) count double precision first_coun character varying(2) first_regi character varying(1) the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.world"
},
{
"location": "/python/maps-database/#mapdatazone",
"text": "Column Type state character varying(2) cwa character varying(9) time_zone character varying(2) fe_area character varying(2) zone character varying(3) name character varying(254) state_zone character varying(5) lon numeric lat numeric shortname character varying(32) the_geom geometry(MultiPolygon,4326)",
"title": "mapdata.zone"
},
{
"location": "/appendix/appendix-acronyms/",
"text": "A\n\uf0c1\n\n\n\n\nACARS - Aircraft Communications Addressing and Reporting System\n\n\nAEV - AFOS-Era Verification\n\n\nAFOS - Automation of Field Operations and Services\n\n\nAGL - above ground level\n\n\nAI - AWIPS Identifier\n\n\nAMSU - Advanced Microwave Sounding Unit\n\n\nARD - AWIPS Remote Display\n\n\nASL - Above Sea Level\n\n\nASOS - Automated Surface Observing System\n\n\nASR - Airport Surveillance Radar\n\n\nATMS - Advanced Technology Microwave Sounder\n\n\nAvnFPS - Aviation Forecast Preparation System\n\n\nAVP - AWIPS Verification Program\n\n\nAWC - Aviation Weather Center\n\n\nAWIPS - Advanced Weather Interactive Processing System\n\n\n\n\nB\n\uf0c1\n\n\n\n\nBGAN - Broadboand Global Area Network\n\n\nBUFR - Binary Universal Form for the Representation of meteorological data\n\n\n\n\nC\n\uf0c1\n\n\n\n\nCAPE - Convective Available Potential Energy\n\n\nCAVE - Common AWIPS Visualization Environment\n\n\nCC - Correlation Coefficient\n\n\nCCF - Coded Cities Forecast\n\n\nCCFP - Collaborative Convective Forecast Product\n\n\nCCL - Convective Condensation Level\n\n\nCDP - Cell Display Parameters\n\n\nCFC - Clutter Filter Control\n\n\nCGI - Common Gateway Interface\n\n\nCIN - Convective Inhibition\n\n\nCITR - Commerce Information Technology Requirement\n\n\nCONUS - Conterminous/Contiguous/Continental United States\n\n\nCOOP - Continuity Of Operations Planning\n\n\nCOTS - commercial off-the-shelf\n\n\nCrIMSS - Cross-track Infrared and Microwave Sounder Suite\n\n\nCrIS - Cross-track Infrared Sounder\n\n\nCWA - County Warning Area\n\n\nCWSU - Center Weather Service Unit\n\n\nCZ - Composite Reflectivity\n\n\n\n\nD\n\uf0c1\n\n\n\n\nD2D - Display 2 Dimensions\n\n\nDFM - Digital Forecast Matrix\n\n\nDMD - Digital Mesocyclone Display\n\n\nDMS - Data Monitoring System\n\n\nDOC - Department of Commerce\n\n\nDPA - Digital Precipitation Array\n\n\n\n\nE\n\uf0c1\n\n\n\n\nECMWF - European Centre for Medium-Range Forecasts\n\n\nEDEX - Environmental Data 
EXchange\n\n\nEMC - Environmental Modeling Center\n\n\nEL - Equilibrium Level\n\n\nESA - Electronic Systems Analyst\n\n\nESRL - Earth System Research Laboratory\n\n\n\n\nF\n\uf0c1\n\n\n\n\nFFG - Flash Flood Guidance\n\n\nFFFG - Forced Flash Flood Guidance\n\n\nFFMP - Flash Flood Monitoring and Prediction\n\n\nFFMPA - Flash Flood Monitoring and Prediction: Advanced\n\n\nFFTI - Flash Flood Threat Index\n\n\nFFW - Flash Flood Warning\n\n\nFSL - Forecast Systems Laboratory\n\n\n\n\nG\n\uf0c1\n\n\n\n\nGFE - Graphical Forecast Editor\n\n\nGFS - Global Forecasting Systems\n\n\nGHG - Graphical Hazards Generator\n\n\nGIS - Geographic Information Systems\n\n\nGMT - Greenwich Mean Time\n\n\nGOES - Geostationary Operational Environmental Satellite\n\n\nGSD - Global System Division\n\n\n\n\nH\n\uf0c1\n\n\n\n\nHC - Hydrometeor Classification\n\n\nHI - Hail Index\n\n\nHM - Hydromet\n\n\nHPC - Hydrologic Precipitation Center\n\n\nHWR - Hourly Weather Roundup\n\n\n\n\nI\n\uf0c1\n\n\n\n\nICAO - International Civil Aviation Organization\n\n\nIFP - Interactive Forecast Program\n\n\nIFPS - Interactive Forecast Preparation System\n\n\nIHFS - Integrated Hydrologic Forecast System\n\n\nIMET - Incident Meteorologist\n\n\nIR - infrared\n\n\nISS - Incident Support Specialist IST - Interactive Skew-T\n\n\n\n\nJ\n\uf0c1\n\n\n\n\nJMS - Java Messaging System\n\n\n\n\nK\n\uf0c1\n\n\n\n\nKDP - Specific Differential Phase\n\n\nKML - Keyhole Markup Language\n\n\nKMZ - KML zipped (compressed).\n\n\n\n\nL\n\uf0c1\n\n\n\n\nLAC - Listening Area Code\n\n\nLAMP - Localized Aviation MOS Program\n\n\nLAN - Local Area Network\n\n\nLAPS - Local Analysis and Prediction System\n\n\nLARC - Local Automatic Remote Collector\n\n\nLCL - Lifting Condensation Level\n\n\nLDAD - Local Data Acquisition and Dissemination\n\n\nLFC - Level of Free Convection\n\n\nLSR - Local Storm Report\n\n\n\n\nM\n\uf0c1\n\n\n\n\nMAPS - Mesoscale Analysis and Prediction System\n\n\nmb - millibar; pressure\n\n\nMDCRS - Meteorological Data 
Collection and Receiving System\n\n\nMDL - Meteorological Development Laboratory\n\n\nMDP - Mesoc
"title": "Acronyms and Abbreviations"
},
{
"location": "/appendix/appendix-acronyms/#a",
"text": "ACARS - Aircraft Communications Addressing and Reporting System AEV - AFOS-Era Verification AFOS - Automation of Field Operations and Services AGL - above ground level AI - AWIPS Identifier AMSU - Advanced Microwave Sounding Unit ARD - AWIPS Remote Display ASL - Above Sea Level ASOS - Automated Surface Observing System ASR - Airport Surveillance Radar ATMS - Advanced Technology Microwave Sounder AvnFPS - Aviation Forecast Preparation System AVP - AWIPS Verification Program AWC - Aviation Weather Center AWIPS - Advanced Weather Interactive Processing System",
"title": "A"
},
{
"location": "/appendix/appendix-acronyms/#b",
"text": "BGAN - Broadband Global Area Network BUFR - Binary Universal Form for the Representation of meteorological data",
"title": "B"
},
{
"location": "/appendix/appendix-acronyms/#c",
"text": "CAPE - Convective Available Potential Energy CAVE - Common AWIPS Visualization Environment CC - Correlation Coefficient CCF - Coded Cities Forecast CCFP - Collaborative Convective Forecast Product CCL - Convective Condensation Level CDP - Cell Display Parameters CFC - Clutter Filter Control CGI - Common Gateway Interface CIN - Convective Inhibition CITR - Commerce Information Technology Requirement CONUS - Conterminous/Contiguous/Continental United States COOP - Continuity Of Operations Planning COTS - commercial off-the-shelf CrIMSS - Cross-track Infrared and Microwave Sounder Suite CrIS - Cross-track Infrared Sounder CWA - County Warning Area CWSU - Center Weather Service Unit CZ - Composite Reflectivity",
"title": "C"
},
{
"location": "/appendix/appendix-acronyms/#d",
"text": "D2D - Display 2 Dimensions DFM - Digital Forecast Matrix DMD - Digital Mesocyclone Display DMS - Data Monitoring System DOC - Department of Commerce DPA - Digital Precipitation Array",
"title": "D"
},
{
"location": "/appendix/appendix-acronyms/#e",
"text": "ECMWF - European Centre for Medium-Range Weather Forecasts EDEX - Environmental Data EXchange EMC - Environmental Modeling Center EL - Equilibrium Level ESA - Electronic Systems Analyst ESRL - Earth System Research Laboratory",
"title": "E"
},
{
"location": "/appendix/appendix-acronyms/#f",
"text": "FFG - Flash Flood Guidance FFFG - Forced Flash Flood Guidance FFMP - Flash Flood Monitoring and Prediction FFMPA - Flash Flood Monitoring and Prediction: Advanced FFTI - Flash Flood Threat Index FFW - Flash Flood Warning FSL - Forecast Systems Laboratory",
"title": "F"
},
{
"location": "/appendix/appendix-acronyms/#g",
"text": "GFE - Graphical Forecast Editor GFS - Global Forecast System GHG - Graphical Hazards Generator GIS - Geographic Information Systems GMT - Greenwich Mean Time GOES - Geostationary Operational Environmental Satellite GSD - Global Systems Division",
"title": "G"
},
{
"location": "/appendix/appendix-acronyms/#h",
"text": "HC - Hydrometeor Classification HI - Hail Index HM - Hydromet HPC - Hydrologic Precipitation Center HWR - Hourly Weather Roundup",
"title": "H"
},
{
"location": "/appendix/appendix-acronyms/#i",
"text": "ICAO - International Civil Aviation Organization IFP - Interactive Forecast Program IFPS - Interactive Forecast Preparation System IHFS - Integrated Hydrologic Forecast System IMET - Incident Meteorologist IR - infrared ISS - Incident Support Specialist IST - Interactive Skew-T",
"title": "I"
},
{
"location": "/appendix/appendix-acronyms/#j",
"text": "JMS - Java Messaging System",
"title": "J"
},
{
"location": "/appendix/appendix-acronyms/#k",
"text": "KDP - Specific Differential Phase KML - Keyhole Markup Language KMZ - KML zipped (compressed).",
"title": "K"
},
{
"location": "/appendix/appendix-acronyms/#l",
"text": "LAC - Listening Area Code LAMP - Localized Aviation MOS Program LAN - Local Area Network LAPS - Local Analysis and Prediction System LARC - Local Automatic Remote Collector LCL - Lifting Condensation Level LDAD - Local Data Acquisition and Dissemination LFC - Level of Free Convection LSR - Local Storm Report",
"title": "L"
},
{
"location": "/appendix/appendix-acronyms/#m",
"text": "MAPS - Mesoscale Analysis and Prediction System mb - millibar; pressure MDCRS - Meteorological Data Collection and Receiving System MDL - Meteorological Development Laboratory MDP - Mesocyclone Display Parameters MDPI - Microburst-Day Potential Index MEF - Manually Entered Forecast METAR - Meteorological Aviation Report MHS - message handling system ML - Melting Layer MND - Mass News Dissemination MOS - Model Output Statistics MPC - Marine Prediction Center MPE - Multisensor Precipitation Estimator MRD - Message Reference Descriptor MRU - Meso Rapid Update MSAS - MAPS Surface Assimilation System MSL - Mean Sea Level",
"title": "M"
},
{
"location": "/appendix/appendix-acronyms/#n",
"text": "NAM - North American Mesoscale model NCEP - National Centers for Environmental Prediction NCF - Network Control Facility NDFD - National Digital Forecast Database NE-PAC - Northeastern Pacific NESDIS - National Environmental Satellite, Data and Information Service NH - Northern Hemisphere nMi - nautical miles NOAA - National Oceanic and Atmospheric Administration NPN - NOAA Profiler Network NPP - Suomi National Polar-orbiting Partnership NUCAPS - NOAA Unique CrIS/ATMS Processing System NWP - Numerical Weather Prediction NWR - NOAA Weather Radio NWS - National Weather Service NWRWAVES - NOAA Weather Radio With All-Hazards VTEC Enhanced Software NWSRFS - National Weather Service River Forecast System NWWS - NOAA Weather Wire Service",
"title": "N"
},
{
"location": "/appendix/appendix-acronyms/#o",
"text": "OCP - Ocean Prediction Center OH - Office of Hydrology OPC - Ocean Prediction Center ORPG - Open Radar Products Generator OSD - One Hour Snow Depth OSW - One Hour Snow Water OTR - One Time Request",
"title": "O"
},
{
"location": "/appendix/appendix-acronyms/#p",
"text": "PID - Product Identification PIL - Product Inventory List PIREP - Pilot Weather Report POES - Polar Operational Environmental Satellite POSH - Probability of Severe Hail POH - Probability of Hail POP - Probability of Precipitation PQPF - Probabilistic QPF PRF - Pulse Repetition Frequency",
"title": "P"
},
{
"location": "/appendix/appendix-acronyms/#q",
"text": "QC - quality control QCMS - Quality Control and Monitoring System QPE - Quantitative Precipitation Estimator QPF - Quantitative Precipitation Forecast QPS - Quantitative Precipitation Summary",
"title": "Q"
},
{
"location": "/appendix/appendix-acronyms/#r",
"text": "RAOB - Radiosonde Observation RAP - Rapid Refresh (Replaced RUC) RCM - Radar Coded Message RER - Record Report RFC - River Forecast Center RGB - Red, Green, Blue RHI - Range Height Indicator RMR - Radar Multiple Request ROSA - Remote Observing System Automation RPG - Radar Product Generator RPS - routine product set RTD - Requirements Traceability Document; Routine, Delayed RTMA - Real-Time Mesoscale Analysis RUC - Rapid Update Cycle (Replaced by RAP)",
"title": "R"
},
{
"location": "/appendix/appendix-acronyms/#s",
"text": "SAFESEAS - System on AWIPS for Forecasting and Evaluation of Seas and Lakes SBN - Satellite Broadcast Network SCAN - System for Convection Analysis and Nowcasting SCD - Supplementary Climatological Data SCID - Storm Cell Identification Display SCP - Satellite Cloud Product SCTI - SCAN CWA Threat Index SDC - State Distribution Circuit SNOW - System for Nowcasting Of Winter Weather SOO - Science and Operations Officer SPC - Storm Prediction Center SPE - Satellite Precipitation Estimate SREF - Short Range Ensemble Forecast SRG - Supplemental Product Generator SRM - Storm Relative Motion SSD - Storm-Total Snow Depth SSM/I - Special Sensor Microwave/Imager SSW - Storm-Total Snow Water STI - Storm Track Information Suomi NPP - Suomi National Polar-orbiting Partnership SW - Spectrum Width SWEAT Index - Severe Weather Threat Index SWP - Severe Weather Probability",
"title": "S"
},
{
"location": "/appendix/appendix-acronyms/#t",
"text": "TAF - Terminal Aerodrome Forecast (international code) TAFB - Tropical Analysis and Forecast Branch TCM - Marine/Tropical Cyclone Advisory TCP - Public Tropical Cyclone Advisory TDWR - Terminal Doppler Weather Radar TE-PAC - Tropical Eastern Pacific TMI - Text Message Intercept TRU - TVS Rapid Update TT - Total Totals TVS - Tornado Vortex Signature TWB - Transcribed Weather Broadcasts",
"title": "T"
},
{
"location": "/appendix/appendix-acronyms/#u",
"text": "UGC - Universal Geographic Code ULR - User Selectable Layer Reflectivity URL - Uniform Resource Locator USD - User Selectable Snow Depth USW - User Selectable Snow Water UTC - Coordinated Universal Time",
"title": "U"
},
{
"location": "/appendix/appendix-acronyms/#v",
"text": "VAD - Velocity Azimuth Display VCP - volume coverage pattern VIIRS - Visible Infrared Imaging Radiometer Suite VIL - Vertically Integrated Liquid VTEC - Valid Time and Event Code VWP - VAD Wind Profile",
"title": "V"
},
{
"location": "/appendix/appendix-acronyms/#w",
"text": "W-ATL - Western Atlantic WFO - Weather Forecast Office WINDEX - Wind Index WMO - World Meteorological Organization WSFO - Weather Service Forecast Office WSO - Weather Service Office WSOM - Weather Service Operations Manual WSR-88D - Weather Surveillance Radar-1988 Doppler WWA - Watch Warning Advisory WV - water vapor",
"title": "W"
},
{
"location": "/appendix/appendix-acronyms/#z",
"text": "Z - Reflectivity ZDR - Differential Reflectivity",
"title": "Z"
},
{
"location": "/appendix/appendix-wsr88d/",
"text": "Product Name\n\n\nMnemonic\n\n\nID\n\n\nLevels\n\n\nRes\n\n\nElevation\n\n\n\n\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n19\n\n\n16\n\n\n100\n\n\n.5\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n19\n\n\n16\n\n\n100\n\n\n1.5\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n19\n\n\n16\n\n\n100\n\n\n2.5\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n19\n\n\n16\n\n\n100\n\n\n3.5\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n20\n\n\n16\n\n\n200\n\n\n.5\n\n\n\n\n\n\nVelocity (V)\n\n\nV\n\n\n27\n\n\n16\n\n\n100\n\n\n.5\n\n\n\n\n\n\nVelocity (V)\n\n\nV\n\n\n27\n\n\n16\n\n\n100\n\n\n1.5\n\n\n\n\n\n\nVelocity (V)\n\n\nV\n\n\n27\n\n\n16\n\n\n100\n\n\n2.5\n\n\n\n\n\n\nVelocity (V)\n\n\nV\n\n\n27\n\n\n16\n\n\n100\n\n\n3.5\n\n\n\n\n\n\nStorm Rel Velocity (SRM)\n\n\nSRM\n\n\n56\n\n\n16\n\n\n100\n\n\n.5\n\n\n\n\n\n\nStorm Rel Velocity (SRM)\n\n\nSRM\n\n\n56\n\n\n16\n\n\n100\n\n\n1.5\n\n\n\n\n\n\nStorm Rel Velocity (SRM)\n\n\nSRM\n\n\n56\n\n\n16\n\n\n100\n\n\n2.5\n\n\n\n\n\n\nStorm Rel Velocity (SRM)\n\n\nSRM\n\n\n56\n\n\n16\n\n\n100\n\n\n3.5\n\n\n\n\n\n\nComposite Ref (CZ)\n\n\nCZ\n\n\n37\n\n\n16\n\n\n100\n\n\n-1\n\n\n\n\n\n\nComposite Ref (CZ)\n\n\nCZ\n\n\n38\n\n\n16\n\n\n400\n\n\n-1\n\n\n\n\n\n\nLyr Comp Ref Max (LRM) Level 1\n\n\nLRM\n\n\n65\n\n\n8\n\n\n0\n\n\n-1\n\n\n\n\n\n\nLyr Comp Ref Max (LRM) Level 2\n\n\nLRM\n\n\n66\n\n\n8\n\n\n0\n\n\n-1\n\n\n\n\n\n\nLyr Comp Ref Max (LRM) Level 3\n\n\nLRM\n\n\n90\n\n\n8\n\n\n0\n\n\n-1\n\n\n\n\n\n\nLyr Comp Ref MAX (APR)\n\n\nAPR\n\n\n67\n\n\n16\n\n\n0\n\n\n-1\n\n\n\n\n\n\nEcho Tops (ET)\n\n\nET\n\n\n41\n\n\n16\n\n\n0\n\n\n-1\n\n\n\n\n\n\nVert Integ Liq (VIL)\n\n\nVIL\n\n\n57\n\n\n16\n\n\n0\n\n\n-1\n\n\n\n\n\n\nOne Hour Precip (OHP)\n\n\nOHP\n\n\n78\n\n\n16\n\n\n0\n\n\n-1\n\n\n\n\n\n\nStorm Total Precip (STP)\n\n\nSTP\n\n\n80\n\n\n16\n\n\n0\n\n\n-1\n\n\n\n\n\n\nVAD Wind Profile (VWP)\n\n\nVWP\n\n\n48\n\n\n0\n\n\n0\n\n\n-1\n\n\n\n\n\n\nDigital Precip Array (DPA)\n\n\nDPA\n\n\n81\n\n\n256\n\n\n400\n\n\n-1\n\n\n\n\n\n\nVelocity 
(V)\n\n\nV\n\n\n25\n\n\n16\n\n\n100\n\n\n.5\n\n\n\n\n\n\nBase Spectrum Width (SW)\n\n\nSW\n\n\n28\n\n\n8\n\n\n100\n\n\n.5\n\n\n\n\n\n\nBase Spectrum Width (SW)\n\n\nSW\n\n\n30\n\n\n8\n\n\n100\n\n\n.5\n\n\n\n\n\n\nSevere Weather Probablilty (SWP)\n\n\nSWP\n\n\n47\n\n\n0\n\n\n100\n\n\n-1\n\n\n\n\n\n\nStorm Tracking Information (STI)\n\n\nSTI\n\n\n58\n\n\n0\n\n\n100\n\n\n-1\n\n\n\n\n\n\nHail Index (HI)\n\n\nHI\n\n\n59\n\n\n0\n\n\n100\n\n\n-1\n\n\n\n\n\n\nMesocyclone (M)\n\n\nM\n\n\n60\n\n\n0\n\n\n100\n\n\n-1\n\n\n\n\n\n\nMesocyclone (MD)\n\n\nMD\n\n\n141\n\n\n0\n\n\n0\n\n\n1\n\n\n\n\n\n\nTornadic Vortex Signature (TVS)\n\n\nTVS\n\n\n61\n\n\n0\n\n\n100\n\n\n-1\n\n\n\n\n\n\nStorm Structure (SS)\n\n\nSS\n\n\n62\n\n\n0\n\n\n100\n\n\n-1\n\n\n\n\n\n\nSupplemental Precipitation Data (SPD)\n\n\nSPD\n\n\n82\n\n\n0\n\n\n100\n\n\n-1\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n94\n\n\n256\n\n\n100\n\n\n.5\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n94\n\n\n256\n\n\n100\n\n\n1.5\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n94\n\n\n256\n\n\n100\n\n\n2.4\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n94\n\n\n256\n\n\n100\n\n\n3.4\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n94\n\n\n256\n\n\n100\n\n\n4.3\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n94\n\n\n256\n\n\n100\n\n\n5.3\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n94\n\n\n256\n\n\n100\n\n\n6.2\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n94\n\n\n256\n\n\n100\n\n\n7.5\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n94\n\n\n256\n\n\n100\n\n\n8.7\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n94\n\n\n256\n\n\n100\n\n\n10.0\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n94\n\n\n256\n\n\n100\n\n\n12.0\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n94\n\n\n256\n\n\n100\n\n\n14.0\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n94\n\n\n256\n\n\n100\n\n\n16.7\n\n\n\n\n\n\nReflectivity (Z)\n\n\nZ\n\n\n94\n\n\n256\n\n\n100\n\n\n19.5\n\n\n\n\n\n\nVelocity (V)\n\n\nV\n\n\n99\n\n\n256\n\n\n25\n\n\n.5\n\n\n\n\n\n\nVelocity (V)\n\n\nV\n\n\n99\n\n\n256\n\n\n25\n\n\n1.5\n\n\n\n\n\n\nVelocity 
(V)\n\n\nV\n\n\n99\n\n\n256\n\n\n25\n\n\n2.4\n\n\n\n\n\n\nVelocity (V)\n\n\nV\n\n\n99\n\n\n256\n\n\n25\n\n\n3
"title": "WSR-88D Product Table"
},
{
"location": "/cave/cave-keyboard-shortcuts/",
"text": "Keyboard Shortcuts\n\uf0c1\n\n\n\n\n\nD2D Menu Shortcuts\n\uf0c1\n\n\n\n\n\n\n\n\nAction\n\n\nCommand\n\n\n\n\n\n\n\n\n\n\nOpen a New Map\n\n\nCtrl + N\n\n\n\n\n\n\nOpen a Bundle\n\n\nCtrl + O\n\n\n\n\n\n\nSave Bundle\n\n\nCtrl + S\n\n\n\n\n\n\nSave Bundle Locally\n\n\nCtrl + Shift + S\n\n\n\n\n\n\nSave KML\n\n\nCtrl + K\n\n\n\n\n\n\nExit CAVE\n\n\nAlt + F4\n\n\n\n\n\n\nExit CAVE\n\n\nCtrl + Q\n\n\n\n\n\n\nClear Data\n\n\nCtrl + C\n\n\n\n\n\n\nFirst Frame\n\n\nCtrl + \u2190\n\n\n\n\n\n\nLast Frame\n\n\nCtrl + \u2192\n\n\n\n\n\n\nStep Back\n\n\n\u2190\n\n\n\n\n\n\nStep Forward\n\n\n\u2192\n\n\n\n\n\n\nIncrease Loop Speed\n\n\nPage Up\n\n\n\n\n\n\nDecrease Loop Speed\n\n\nPage Down\n\n\n\n\n\n\nOpen Time Options\n\n\nCtrl + T\n\n\n\n\n\n\nToggle Image Combination\n\n\nInsert\n\n\n\n\n\n\nOpen Loop Properties\n\n\nCtrl + L\n\n\n\n\n\n\nOpen Image Properties\n\n\nCtrl + I\n\n\n\n\n\n\n\n\nD2D All Tilts Shortcuts\n\uf0c1\n\n\n\n\nNote\n: Requires all tilts product in main display panel\n\n\n\n\n\n\n\n\n\n\nAction\n\n\nCommand\n\n\n\n\n\n\n\n\n\n\nStep Back 1 Volume\n\n\n\u2190\n\n\n\n\n\n\nStep Forward 1 Volume\n\n\n\u2192\n\n\n\n\n\n\nStep up 1 Elevation Angle\n\n\n\u2191\n\n\n\n\n\n\nStep down 1 Elevation Angle\n\n\n\u2193\n\n\n\n\n\n\nJump to First Frame\n\n\nCtrl + \u2190\n\n\n\n\n\n\nJump to Last Frame\n\n\nCtrl + \u2192\n\n\n\n\n\n\nJump to Highest Elevation Angle\n\n\nCtrl + \u2191\n\n\n\n\n\n\nJump to Lowest Elevation Angle\n\n\nCtrl + \u2193\n\n\n\n\n\n\n\n\nD2D Numeric Keypad Shortcuts\n\uf0c1\n\n\n\n\nNote\n: Num Lock must be enabled for these keystrokes to work\n\n\n\n\n\n\n\n\n\n\nAction\n\n\nCommand\n\n\n\n\n\n\n\n\n\n\nIncrease Brightness of Image 1, Decrease Image 2\n\n\n[Numpad] +\n\n\n\n\n\n\nDecrease Brightness of Image 1, Increase Image 2\n\n\n[Numpad] -\n\n\n\n\n\n\nToggle Image Product in Main Map On/Off\n\n\n[Numpad] 0\n\n\n\n\n\n\nToggle First 9 Graphic Products On/Off\n\n\n[Numpad] 1-9\n\n\n\n\n\n\nToggle Next 10 Graphic Products 
On/Off\n\n\nShift + [Numpad] 0-9\n\n\n\n\n\n\nToggle Between Images 1/2 at Full Brightness\n\n\n[Numpad] .\n\n\n\n\n\n\nToggle Legend\n\n\n[Numpad] Enter\n\n\n\n\n\n\n\n\nPanel Combo Rotate (PCR) Shortcuts\n\uf0c1\n\n\n\n\nNote\n: These numbers refer to the ones at the top of the keyboard\n\n\n\n\n\n\n\n\n\n\nAction\n\n\nCommand\n\n\n\n\n\n\n\n\n\n\nCycle Through PCR Products\n\n\nDelete\n\n\n\n\n\n\nReturn to 4 Panel View\n\n\nEnd\n\n\n\n\n\n\nCycle Back Through PCR Products\n\n\nBackspace\n\n\n\n\n\n\nDisplay Corresponding Product\n\n\n1-8\n\n\n\n\n\n\n\n\nText Editor Shortcuts\n\uf0c1\n\n\n\n\n\n\n\n\nAction\n\n\nCommand\n\n\n\n\n\n\n\n\n\n\nExtend Selection to Start of Line\n\n\nShift + Home\n\n\n\n\n\n\nExtend Selection to End of Line\n\n\nShift + End\n\n\n\n\n\n\nExtend Selection to Start of Document\n\n\nCtrl + Shift + Home\n\n\n\n\n\n\nExtend Selection to End of Document\n\n\nCtrl + Shift + End\n\n\n\n\n\n\nExtend Selection Up 1 Screen\n\n\nShift + Page Up\n\n\n\n\n\n\nExtend Selection Down 1 Screen\n\n\nShift + Page Down\n\n\n\n\n\n\nExtend Selection to Previous Character\n\n\nShift + \u2190\n\n\n\n\n\n\nExtend Selection by Previous Word\n\n\nCtrl + Shift + \u2190\n\n\n\n\n\n\nExtend Selection to Next Character\n\n\nShift + \u2192\n\n\n\n\n\n\nExtend Selection by Next Word\n\n\nCtrl + Shift + \u2192\n\n\n\n\n\n\nExtend Selection Up 1 Line\n\n\nShift + \u2191\n\n\n\n\n\n\nExtend Selection Down 1 Line\n\n\nShift + \u2193\n\n\n\n\n\n\nDelete Previous Word\n\n\nCtrl + Backspace\n\n\n\n\n\n\nDelete Next Word\n\n\nCtrl + Delete\n\n\n\n\n\n\nClose the Window\n\n\nCtrl + Shift + F4\n\n\n\n\n\n\nUndo\n\n\nCtrl + Z\n\n\n\n\n\n\nCopy\n\n\nCtrl + C\n\n\n\n\n\n\nPaste\n\n\nCtrl + V\n\n\n\n\n\n\nCut\n\n\nCtrl + X",
"title": "Keyboard Shortcuts"
},
{
"location": "/cave/cave-keyboard-shortcuts/#keyboard-shortcuts",
"text": "",
"title": "Keyboard Shortcuts"
},
{
"location": "/cave/cave-keyboard-shortcuts/#d2d-menu-shortcuts",
"text": "Action Command Open a New Map Ctrl + N Open a Bundle Ctrl + O Save Bundle Ctrl + S Save Bundle Locally Ctrl + Shift + S Save KML Ctrl + K Exit CAVE Alt + F4 Exit CAVE Ctrl + Q Clear Data Ctrl + C First Frame Ctrl + \u2190 Last Frame Ctrl + \u2192 Step Back \u2190 Step Forward \u2192 Increase Loop Speed Page Up Decrease Loop Speed Page Down Open Time Options Ctrl + T Toggle Image Combination Insert Open Loop Properties Ctrl + L Open Image Properties Ctrl + I",
"title": "D2D Menu Shortcuts"
},
{
"location": "/cave/cave-keyboard-shortcuts/#d2d-all-tilts-shortcuts",
"text": "Note : Requires all tilts product in main display panel Action Command Step Back 1 Volume \u2190 Step Forward 1 Volume \u2192 Step up 1 Elevation Angle \u2191 Step down 1 Elevation Angle \u2193 Jump to First Frame Ctrl + \u2190 Jump to Last Frame Ctrl + \u2192 Jump to Highest Elevation Angle Ctrl + \u2191 Jump to Lowest Elevation Angle Ctrl + \u2193",
"title": "D2D All Tilts Shortcuts"
},
{
"location": "/cave/cave-keyboard-shortcuts/#d2d-numeric-keypad-shortcuts",
"text": "Note : Num Lock must be enabled for these keystrokes to work Action Command Increase Brightness of Image 1, Decrease Image 2 [Numpad] + Decrease Brightness of Image 1, Increase Image 2 [Numpad] - Toggle Image Product in Main Map On/Off [Numpad] 0 Toggle First 9 Graphic Products On/Off [Numpad] 1-9 Toggle Next 10 Graphic Products On/Off Shift + [Numpad] 0-9 Toggle Between Images 1/2 at Full Brightness [Numpad] . Toggle Legend [Numpad] Enter",
"title": "D2D Numeric Keypad Shortcuts"
},
{
"location": "/cave/cave-keyboard-shortcuts/#panel-combo-rotate-pcr-shortcuts",
"text": "Note : These numbers refer to the ones at the top of the keyboard Action Command Cycle Through PCR Products Delete Return to 4 Panel View End Cycle Back Through PCR Products Backspace Display Corresponding Product 1-8",
"title": "Panel Combo Rotate (PCR) Shortcuts"
},
{
"location": "/cave/cave-keyboard-shortcuts/#text-editor-shortcuts",
"text": "Action Command Extend Selection to Start of Line Shift + Home Extend Selection to End of Line Shift + End Extend Selection to Start of Document Ctrl + Shift + Home Extend Selection to End of Document Ctrl + Shift + End Extend Selection Up 1 Screen Shift + Page Up Extend Selection Down 1 Screen Shift + Page Down Extend Selection to Previous Character Shift + \u2190 Extend Selection by Previous Word Ctrl + Shift + \u2190 Extend Selection to Next Character Shift + \u2192 Extend Selection by Next Word Ctrl + Shift + \u2192 Extend Selection Up 1 Line Shift + \u2191 Extend Selection Down 1 Line Shift + \u2193 Delete Previous Word Ctrl + Backspace Delete Next Word Ctrl + Delete Close the Window Ctrl + Shift + F4 Undo Ctrl + Z Copy Ctrl + C Paste Ctrl + V Cut Ctrl + X",
"title": "Text Editor Shortcuts"
}
]
}