awips2/search/search_index.json
2024-05-09 13:39:29 +00:00


{"config":{"indexing":"full","lang":["en"],"min_search_length":3,"prebuild_index":false,"separator":"[\\s\\-]+"},"docs":[{"location":"","text":"NSF Unidata AWIPS Manual \uf0c1 https://www.unidata.ucar.edu/software/awips2 The Advanced Weather Interactive Processing System (AWIPS) is a meteorological software package. It is used for decoding, displaying, and analyzing data, and was originally developed for the National Weather Service (NWS) by Raytheon. There is a division here at UCAR called the NSF Unidata Program Center (UPC) which develops and supports a modified non-operational version of AWIPS for use in research and education by UCAR member institutions . This is released as open source software, free to download and use by anyone. AWIPS takes a unified approach to data ingest, where most data ingested into the system comes through the LDM client pulling data feeds from the NSF Unidata IDD . Various raw data and product files (netCDF, grib, BUFR, ASCII text, gini, AREA) are decoded and stored as HDF5 files and Postgres metadata by EDEX , which serves products and data over http. We support two data visualization frameworks: CAVE (an Eclipse-built Java application which runs on Linux, Mac, and Windows), and python-awips (a python package). Download and Install CAVE \uf0c1 Download and Install EDEX \uf0c1 Work with Python-AWIPS \uf0c1 License \uf0c1 NSF Unidata AWIPS source code and binaries (RPMs) are considered to be in the public domain, meaning there are no restrictions on any download, modification, or distribution in any form (original or modified). NSF Unidata AWIPS license information can be found here . AWIPS Data in the Cloud \uf0c1 NSF Unidata and XSEDE Jetstream have partnered to offer an EDEX data server in the cloud, open to the community. Select the server in the Connectivity Preferences dialog, or enter edex-cloud.unidata.ucar.edu (without http:// before, or :9581/services after). 
Distributed Computing \uf0c1 AWIPS makes use of service-oriented architecture to request, process, and serve real-time meteorological data. Because AWIPS was originally developed for use on internal NWS forecast office networks, where operational installations of AWIPS can consist of a dozen servers or more, we have modified the package to be more applicable in the university setting. Our releases of AWIPS are stripped of operation-specific configurations and plugins. Originally, our EDEX installations were released specifically for standalone installation. This made sense given that a single EDEX instance with a Solid State Drive (SSD) could handle most of the NOAAport data volume. However, with GOES-R(16) now online, and more gridded forecast models being created at finer temporal and spatial resolutions, there was a need to distribute EDEX data decoding in order to handle this firehose of data. Read More: Distributed EDEX Software Components \uf0c1 EDEX CAVE LDM edexBridge Qpid PostgreSQL HDF5 PyPIES EDEX \uf0c1 The main server for AWIPS. Qpid sends alerts to EDEX when data stored by the LDM is ready for processing. These Qpid messages include file header information which allows EDEX to determine the appropriate data decoder to use. The default ingest server (simply named ingest) handles all data ingest other than grib messages, which are processed by a separate ingestGrib server. After decoding, EDEX writes metadata to the database via Postgres and saves the processed data in HDF5 via PyPIES. A third EDEX server, request, feeds requested data to CAVE clients. EDEX ingest and request servers are started and stopped with the commands edex start and edex stop , which run the system script /etc/rc.d/init.d/edex_camel Read More: How to Install EDEX CAVE \uf0c1 Common AWIPS Visualization Environment. The data rendering and visualization tool for AWIPS. CAVE contains a number of different data display configurations called perspectives. 
Perspectives used in operational forecasting environments include D2D (Display Two-Dimensional), GFE (Graphical Forecast Editor), and NCP (National Centers Perspective). CAVE is started with the command /awips2/cave/cave.sh or cave.sh Read More: How to Install CAVE LDM \uf0c1 https://www.unidata.ucar.edu/software/ldm/ The LDM (Local Data Manager), developed and supported by NSF Unidata, is a suite of client and server programs designed for data distribution, and is the fundamental component comprising the NSF Unidata Internet Data Distribution (IDD) system. In AWIPS, the LDM provides data feeds for grids, surface observations, upper-air profiles, satellite and radar imagery and various other meteorological datasets. The LDM writes data directly to file and alerts EDEX via Qpid when a file is available for processing. The LDM is started and stopped with the commands edex start and edex stop , which run the commands service edex_ldm start and service edex_ldm stop edexBridge \uf0c1 edexBridge, invoked in the LDM configuration file /awips2/ldm/etc/ldmd.conf , is used by the LDM to post \"data available\" messages to Qpid, which alerts the EDEX Ingest server that a file is ready for processing. Qpid \uf0c1 http://qpid.apache.org Apache Qpid , the Queue Processor Interface Daemon, is the messaging system used by AWIPS to facilitate communication between services. When the LDM receives a data file to be processed, it employs edexBridge to send EDEX ingest servers a message via Qpid. When EDEX has finished decoding the file, it sends CAVE a message via Qpid that data are available for display or further processing. Qpid is started and stopped by edex start and edex stop , and is controlled by the system script /etc/rc.d/init.d/qpidd PostgreSQL \uf0c1 http://www.postgresql.org PostgreSQL , known simply as Postgres, is a relational database management system (DBMS) which handles the storage and retrieval of metadata, database tables and some decoded data. 
The storage and reading of EDEX metadata is handled by the Postgres DBMS. Users may query the metadata tables by using the terminal-based front-end for Postgres called psql . Postgres is started and stopped by edex start and edex stop , and is controlled by the system script /etc/rc.d/init.d/edex_postgres HDF5 \uf0c1 http://www.hdfgroup.org/HDF5/ Hierarchical Data Format (v.5) is the primary data storage format used by AWIPS for processed grids, satellite and radar imagery and other products. Similar to netCDF, developed and supported by NSF Unidata, HDF5 supports multiple types of data within a single file. For example, a single HDF5 file of radar data may contain multiple volume scans of base reflectivity and base velocity as well as derived products such as composite reflectivity. The file may also contain data from multiple radars. HDF5 data is stored on the EDEX server in /awips2/edex/data/hdf5/ . PyPIES \uf0c1 PyPIES , Python Process Isolated Enhanced Storage (httpd-pypies), was created for AWIPS to isolate the management of HDF5 Processed Data Storage from the EDEX processes. PyPIES manages access, i.e., reads and writes, of data in the HDF5 files. In a sense, PyPIES provides functionality similar to a DBMS (i.e., PostgreSQL for metadata); all data being written to an HDF5 file is sent to PyPIES, and requests for data stored in HDF5 are processed by PyPIES. PyPIES is implemented in two parts: 1. The PyPIES manager is a Python application that runs as part of an Apache HTTP server, and handles requests to store and retrieve data. 2. The PyPIES logger is a Python process that coordinates logging. PyPIES is started and stopped by edex start and edex stop , and is controlled by the system script /etc/rc.d/init.d/httpd-pypies .","title":"Home"},{"location":"#nsf-unidata-awips-manual","text":"https://www.unidata.ucar.edu/software/awips2 The Advanced Weather Interactive Processing System (AWIPS) is a meteorological software package. 
It is used for decoding, displaying, and analyzing data, and was originally developed for the National Weather Service (NWS) by Raytheon. There is a division here at UCAR called the NSF Unidata Program Center (UPC) which develops and supports a modified non-operational version of AWIPS for use in research and education by UCAR member institutions . This is released as open source software, free to download and use by anyone. AWIPS takes a unified approach to data ingest, where most data ingested into the system comes through the LDM client pulling data feeds from the NSF Unidata IDD . Various raw data and product files (netCDF, grib, BUFR, ASCII text, gini, AREA) are decoded and stored as HDF5 files and Postgres metadata by EDEX , which serves products and data over http. We support two data visualization frameworks: CAVE (an Eclipse-built Java application which runs on Linux, Mac, and Windows), and python-awips (a python package).","title":"NSF Unidata AWIPS Manual"},{"location":"#download-and-install-cave","text":"","title":"Download and Install CAVE"},{"location":"#download-and-install-edex","text":"","title":"Download and Install EDEX"},{"location":"#work-with-python-awips","text":"","title":"Work with Python-AWIPS"},{"location":"#license","text":"NSF Unidata AWIPS source code and binaries (RPMs) are considered to be in the public domain, meaning there are no restrictions on any download, modification, or distribution in any form (original or modified). NSF Unidata AWIPS license information can be found here .","title":"License"},{"location":"#awips-data-in-the-cloud","text":"NSF Unidata and XSEDE Jetstream have partnered to offer an EDEX data server in the cloud, open to the community. 
Select the server in the Connectivity Preferences dialog, or enter edex-cloud.unidata.ucar.edu (without http:// before, or :9581/services after).","title":"AWIPS Data in the Cloud"},{"location":"#distributed-computing","text":"AWIPS makes use of service-oriented architecture to request, process, and serve real-time meteorological data. Because AWIPS was originally developed for use on internal NWS forecast office networks, where operational installations of AWIPS can consist of a dozen servers or more, we have modified the package to be more applicable in the University setting. Our releases of AWIPS are stripped of operation-specific configurations and plugins. Originally, our EDEX installations were released specifically for standalone installation. This made sense given that a single EDEX instance with a Solid State Drive (SSD) could handle most of the entire NOAAport data volume. However, with GOES-R(16) now online, and more gridded forecast models being created at finer temporal and spatial resolutions, there was a need to distribute EDEX data decoding in order to handle this firehose of data. Read More: Distributed EDEX","title":"Distributed Computing"},{"location":"#software-components","text":"EDEX CAVE LDM edexBridge Qpid PostgreSQL HDF5 PyPIES","title":"Software Components"},{"location":"#edex","text":"The main server for AWIPS. Qpid sends alerts to EDEX when data stored by the LDM is ready for processing. These Qpid messages include file header information which allows EDEX to determine the appropriate data decoder to use. The default ingest server (simply named ingest) handles all data ingest other than grib messages, which are processed by a separate ingestGrib server. After decoding, EDEX writes metadata to the database via Postgres and saves the processed data in HDF5 via PyPIES. A third EDEX server, request, feeds requested data to CAVE clients. 
EDEX ingest and request servers are started and stopped with the commands edex start and edex stop , which run the system script /etc/rc.d/init.d/edex_camel Read More: How to Install EDEX","title":"EDEX"},{"location":"#cave","text":"Common AWIPS Visualization Environment. The data rendering and visualization tool for AWIPS. CAVE contains a number of different data display configurations called perspectives. Perspectives used in operational forecasting environments include D2D (Display Two-Dimensional), GFE (Graphical Forecast Editor), and NCP (National Centers Perspective). CAVE is started with the command /awips2/cave/cave.sh or cave.sh Read More: How to Install CAVE","title":"CAVE"},{"location":"#ldm","text":"https://www.unidata.ucar.edu/software/ldm/ The LDM (Local Data Manager), developed and supported by NSF Unidata, is a suite of client and server programs designed for data distribution, and is the fundamental component comprising the NSF Unidata Internet Data Distribution (IDD) system. In AWIPS, the LDM provides data feeds for grids, surface observations, upper-air profiles, satellite and radar imagery and various other meteorological datasets. The LDM writes data directly to file and alerts EDEX via Qpid when a file is available for processing. The LDM is started and stopped with the commands edex start and edex stop , which run the commands service edex_ldm start and service edex_ldm stop","title":"LDM"},{"location":"#edexbridge","text":"edexBridge, invoked in the LDM configuration file /awips2/ldm/etc/ldmd.conf , is used by the LDM to post \"data available\" messages to Qpid, which alerts the EDEX Ingest server that a file is ready for processing.","title":"edexBridge"},{"location":"#qpid","text":"http://qpid.apache.org Apache Qpid , the Queue Processor Interface Daemon, is the messaging system used by AWIPS to facilitate communication between services. 
When the LDM receives a data file to be processed, it employs edexBridge to send EDEX ingest servers a message via Qpid. When EDEX has finished decoding the file, it sends CAVE a message via Qpid that data are available for display or further processing. Qpid is started and stopped by edex start and edex stop , and is controlled by the system script /etc/rc.d/init.d/qpidd","title":"Qpid"},{"location":"#postgresql","text":"http://www.postgresql.org PostgreSQL , known simply as Postgres, is a relational database management system (DBMS) which handles the storage and retrieval of metadata, database tables and some decoded data. The storage and reading of EDEX metadata is handled by the Postgres DBMS. Users may query the metadata tables by using the terminal-based front-end for Postgres called psql . Postgres is started and stopped by edex start and edex stop , and is controlled by the system script /etc/rc.d/init.d/edex_postgres","title":"PostgreSQL"},{"location":"#hdf5","text":"http://www.hdfgroup.org/HDF5/ Hierarchical Data Format (v.5) is the primary data storage format used by AWIPS for processed grids, satellite and radar imagery and other products. Similar to netCDF, developed and supported by NSF Unidata, HDF5 supports multiple types of data within a single file. For example, a single HDF5 file of radar data may contain multiple volume scans of base reflectivity and base velocity as well as derived products such as composite reflectivity. The file may also contain data from multiple radars. HDF5 data is stored on the EDEX server in /awips2/edex/data/hdf5/ .","title":"HDF5"},{"location":"#pypies","text":"PyPIES , Python Process Isolated Enhanced Storage (httpd-pypies), was created for AWIPS to isolate the management of HDF5 Processed Data Storage from the EDEX processes. PyPIES manages access, i.e., reads and writes, of data in the HDF5 files. 
In a sense, PyPIES provides functionality similar to a DBMS (i.e., PostgreSQL for metadata); all data being written to an HDF5 file is sent to PyPIES, and requests for data stored in HDF5 are processed by PyPIES. PyPIES is implemented in two parts: 1. The PyPIES manager is a Python application that runs as part of an Apache HTTP server, and handles requests to store and retrieve data. 2. The PyPIES logger is a Python process that coordinates logging. PyPIES is started and stopped by edex start and edex stop , and is controlled by the system script /etc/rc.d/init.d/httpd-pypies .","title":"PyPIES"},{"location":"appendix/appendix-acronyms/","text":"A \uf0c1 ACARS - Aircraft Communications Addressing and Reporting System AEV - AFOS-Era Verification AFOS - Automation of Field Operations and Services AGL - above ground level AI - AWIPS Identifier AMSU - Advanced Microwave Sounding Unit ARD - AWIPS Remote Display ASL - Above Sea Level ASOS - Automated Surface Observing System ASR - Airport Surveillance Radar ATMS - Advanced Technology Microwave Sounder AvnFPS - Aviation Forecast Preparation System AVP - AWIPS Verification Program AWC - Aviation Weather Center AWIPS - Advanced Weather Interactive Processing System B \uf0c1 BGAN - Broadband Global Area Network BUFR - Binary Universal Form for the Representation of meteorological data C \uf0c1 CAPE - Convective Available Potential Energy CAVE - Common AWIPS Visualization Environment CC - Correlation Coefficient CCF - Coded Cities Forecast CCFP - Collaborative Convective Forecast Product CCL - Convective Condensation Level CDP - Cell Display Parameters CFC - Clutter Filter Control CGI - Common Gateway Interface CIN - Convective Inhibition CITR - Commerce Information Technology Requirement CONUS - Conterminous/Contiguous/Continental United States COOP - Continuity Of Operations Planning COTS - commercial off-the-shelf CrIMSS - Cross-track Infrared and Microwave Sounder Suite CrIS - Cross-track Infrared Sounder CWA - County 
Warning Area CWSU - Center Weather Service Unit CZ - Composite Reflectivity D \uf0c1 D2D - Display 2 Dimensions DFM - Digital Forecast Matrix DMD - Digital Mesocyclone Display DMS - Data Monitoring System DOC - Department of Commerce DPA - Digital Precipitation Array E \uf0c1 ECMWF - European Centre for Medium-Range Weather Forecasts EDEX - Environmental Data EXchange EMC - Environmental Modeling Center EL - Equilibrium Level ESA - Electronic Systems Analyst ESRL - Earth System Research Laboratory F \uf0c1 FFG - Flash Flood Guidance FFFG - Forced Flash Flood Guidance FFMP - Flash Flood Monitoring and Prediction FFMPA - Flash Flood Monitoring and Prediction: Advanced FFTI - Flash Flood Threat Index FFW - Flash Flood Warning FSL - Forecast Systems Laboratory G \uf0c1 GFE - Graphical Forecast Editor GFS - Global Forecast System GHG - Graphical Hazards Generator GIS - Geographic Information Systems GMT - Greenwich Mean Time GOES - Geostationary Operational Environmental Satellite GSD - Global Systems Division H \uf0c1 HC - Hydrometeor Classification HI - Hail Index HM - Hydromet HPC - Hydrologic Precipitation Center HWR - Hourly Weather Roundup I \uf0c1 ICAO - International Civil Aviation Organization IFP - Interactive Forecast Program IFPS - Interactive Forecast Preparation System IHFS - Integrated Hydrologic Forecast System IMET - Incident Meteorologist IR - infrared ISS - Incident Support Specialist IST - Interactive Skew-T J \uf0c1 JMS - Java Messaging System K \uf0c1 KDP - Specific Differential Phase KML - Keyhole Markup Language KMZ - KML zipped (compressed). 
L \uf0c1 LAC - Listening Area Code LAMP - Localized Aviation MOS Program LAN - Local Area Network LAPS - Local Analysis and Prediction System LARC - Local Automatic Remote Collector LCL - Lifting Condensation Level LDAD - Local Data Acquisition and Dissemination LFC - Level of Free Convection LSR - Local Storm Report M \uf0c1 MAPS - Mesoscale Analysis and Prediction System mb - millibar; pressure MDCRS - Meteorological Data Collection and Receiving System MDL - Meteorological Development Laboratory MDP - Mesocyclone Display Parameters MDPI - Microburst-Day Potential Index MEF - Manually Entered Forecast METAR - Meteorological Aviation Report MHS - message handling system ML - Melting Layer MND - Mass News Dissemination MOS - Model Output Statistics MPC - Marine Prediction Center MPE - Multisensor Precipitation Estimator MRD - Message Reference Descriptor MRU - Meso Rapid Update MSAS - MAPS Surface Assimilation System MSL - Mean Sea Level N \uf0c1 NAM - North American Mesoscale model NCEP - National Centers for Environmental Prediction NCF - Network Control Facility NDFD - National Digital Forecast Database NE-PAC - Northeastern Pacific NESDIS - National Environmental Satellite, Data and Information Service NH - Northern Hemisphere nMi - nautical miles NOAA - National Oceanic and Atmospheric Administration NPN - NOAA Profiler Network NPP - Suomi National Polar-orbiting Partnership NUCAPS - NOAA Unique CrIS/ATMS Processing Systems NWP - Numerical Weather Prediction NWR - NOAA Weather Radio NWS - National Weather Service NWRWAVES - NOAA Weather Radio With All-Hazards VTEC Enhanced Software NWSRFS - National Weather Service River Forecast System NWWS - NOAA Weather Wire Service O \uf0c1 OCP - Ocean Prediction Center OH - Office of Hydrology OPC - Ocean Prediction Center ORPG - Open Radar Products Generator OSD - One Hour Snow Depth OSW - One Hour Snow Water OTR - One Time Request P \uf0c1 PID - Product Identification PIL - Product Inventory List PIREP - Pilot Weather 
Report POES - Polar Operational Environmental Satellite POSH - Probability of Severe Hail POH - Probability of Hail POP - Probability of Precipitation PQPF - Probabilistic QPF PRF - Pulse Repetition Frequency Q \uf0c1 QC - quality control QCMS - Quality Control and Monitoring System QPE - Quantitative Precipitation Estimator QPF - Quantitative Precipitation Forecast QPS - Quantitative Precipitation Summary R \uf0c1 RAOB - Radiosonde Observation RAP - Rapid Refresh (Replaced RUC) RCM - Radar Coded Message RER - Record Report RFC - River Forecast Center RGB - Red, Green, Blue RHI - Range Height Indicator RMR - Radar Multiple Request ROSA - Remote Observing System Automation RPG - Radar Product Generator RPS - routine product set RTD - Requirements Traceability Document; Routine, Delayed RTMA - Real-Time Mesoscale Analysis RUC - Rapid Update Cycle (Replaced by RAP) S \uf0c1 SAFESEAS - System on AWIPS for Forecasting and Evaluation of Seas and Lakes SBN - Satellite Broadcast Network SCAN - System for Convection Analysis and Nowcasting SCD - Supplementary Climatological Data SCID - Storm Cell Identification Display SCP - Satellite Cloud Product SCTI - SCAN CWA Threat Index SDC - State Distribution Circuit SNOW - System for Nowcasting Of Winter Weather SOO - Science and Operations Officer SPC - Storm Prediction Center SPE - Satellite Precipitation Estimate SREF - Short Range Ensemble Forecast SRG - Supplemental Product Generator SRM - Storm Relative Motion SSD - Storm-Total Snow Depth SSM/I - Special Sensor Microwave/Imager SSW - Storm-Total Snow Water STI - Storm Track Information Suomi NPP - Suomi National Polar-orbiting Partnership SW - Spectrum Width SWEAT Index - Severe Weather Threat Index SWP - Severe Weather Probability T \uf0c1 TAF - Terminal Aerodrome Forecast (international code) TAFB - Tropical Analysis and Forecast Branch TCM - Marine/Tropical Cyclone Advisory TCP - Public Tropical Cyclone Advisory TDWR - Terminal Doppler Weather Radar TE-PAC - Tropical 
Pacific TMI - Text Message Intercept TRU - TVS Rapid Update TT - Total Totals TVS - Tornado Vortex Signature TWB - Transcribed Weather Broadcasts U \uf0c1 UGC - Universal Geographic Code ULR - User Selectable Layer Reflectivity URL - Universal Resource Locator USD - User Selectable Snow Depth USW - User Selectable Snow Water UTC - Coordinated Universal Time V \uf0c1 VAD - Velocity Azimuth Display VCP - volume coverage pattern VIIRS - Visible Infrared Imaging Radiometer Suite VIL - Vertically Integrated Liquid VTEC - Valid Time and Event Code VWP - VAD Wind Profile W \uf0c1 W-ATL - Western Atlantic WFO - Weather Forecast Office WINDEX - Wind Index WMO - World Meteorological Organization WSFO - Weather Service Forecast Office WSO - Weather Service Office WSOM - Weather Service Operations Manual WSR-88D - Weather Surveillance Radar-1988 Doppler WWA - Watch Warning Advisory WV - water vapor Z \uf0c1 Z - Reflectivity ZDR - Differential Reflectivity","title":"Acronyms and Abbreviations"},{"location":"appendix/appendix-acronyms/#a","text":"ACARS - Aircraft Communications Addressing and Reporting System AEV - AFOS-Era Verification AFOS - Automation of Field Operations and Services AGL - above ground level AI - AWIPS Identifier AMSU - Advanced Microwave Sounding Unit ARD - AWIPS Remote Display ASL - Above Sea Level ASOS - Automated Surface Observing System ASR - Airport Surveillance Radar ATMS - Advanced Technology Microwave Sounder AvnFPS - Aviation Forecast Preparation System AVP - AWIPS Verification Program AWC - Aviation Weather Center AWIPS - Advanced Weather Interactive Processing System","title":"A"},{"location":"appendix/appendix-acronyms/#b","text":"BGAN - Broadband Global Area Network BUFR - Binary Universal Form for the Representation of meteorological data","title":"B"},{"location":"appendix/appendix-acronyms/#c","text":"CAPE - Convective Available Potential Energy CAVE - Common AWIPS Visualization Environment CC - Correlation Coefficient CCF - Coded Cities 
Forecast CCFP - Collaborative Convective Forecast Product CCL - Convective Condensation Level CDP - Cell Display Parameters CFC - Clutter Filter Control CGI - Common Gateway Interface CIN - Convective Inhibition CITR - Commerce Information Technology Requirement CONUS - Conterminous/Contiguous/Continental United States COOP - Continuity Of Operations Planning COTS - commercial off-the-shelf CrIMSS - Cross-track Infrared and Microwave Sounder Suite CrIS - Cross-track Infrared Sounder CWA - County Warning Area CWSU - Center Weather Service Unit CZ - Composite Reflectivity","title":"C"},{"location":"appendix/appendix-acronyms/#d","text":"D2D - Display 2 Dimensions DFM - Digital Forecast Matrix DMD - Digital Mesocyclone Display DMS - Data Monitoring System DOC - Department of Commerce DPA - Digital Precipitation Array","title":"D"},{"location":"appendix/appendix-acronyms/#e","text":"ECMWF - European Centre for Medium-Range Weather Forecasts EDEX - Environmental Data EXchange EMC - Environmental Modeling Center EL - Equilibrium Level ESA - Electronic Systems Analyst ESRL - Earth System Research Laboratory","title":"E"},{"location":"appendix/appendix-acronyms/#f","text":"FFG - Flash Flood Guidance FFFG - Forced Flash Flood Guidance FFMP - Flash Flood Monitoring and Prediction FFMPA - Flash Flood Monitoring and Prediction: Advanced FFTI - Flash Flood Threat Index FFW - Flash Flood Warning FSL - Forecast Systems Laboratory","title":"F"},{"location":"appendix/appendix-acronyms/#g","text":"GFE - Graphical Forecast Editor GFS - Global Forecast System GHG - Graphical Hazards Generator GIS - Geographic Information Systems GMT - Greenwich Mean Time GOES - Geostationary Operational Environmental Satellite GSD - Global Systems Division","title":"G"},{"location":"appendix/appendix-acronyms/#h","text":"HC - Hydrometeor Classification HI - Hail Index HM - Hydromet HPC - Hydrologic Precipitation Center HWR - Hourly Weather 
Roundup","title":"H"},{"location":"appendix/appendix-acronyms/#i","text":"ICAO - International Civil Aviation Organization IFP - Interactive Forecast Program IFPS - Interactive Forecast Preparation System IHFS - Integrated Hydrologic Forecast System IMET - Incident Meteorologist IR - infrared ISS - Incident Support Specialist IST - Interactive Skew-T","title":"I"},{"location":"appendix/appendix-acronyms/#j","text":"JMS - Java Messaging System","title":"J"},{"location":"appendix/appendix-acronyms/#k","text":"KDP - Specific Differential Phase KML - Keyhole Markup Language KMZ - KML zipped (compressed).","title":"K"},{"location":"appendix/appendix-acronyms/#l","text":"LAC - Listening Area Code LAMP - Localized Aviation MOS Program LAN - Local Area Network LAPS - Local Analysis and Prediction System LARC - Local Automatic Remote Collector LCL - Lifting Condensation Level LDAD - Local Data Acquisition and Dissemination LFC - Level of Free Convection LSR - Local Storm Report","title":"L"},{"location":"appendix/appendix-acronyms/#m","text":"MAPS - Mesoscale Analysis and Prediction System mb - millibar; pressure MDCRS - Meteorological Data Collection and Receiving System MDL - Meteorological Development Laboratory MDP - Mesocyclone Display Parameters MDPI - Microburst-Day Potential Index MEF - Manually Entered Forecast METAR - Meteorological Aviation Report MHS - message handling system ML - Melting Layer MND - Mass News Dissemination MOS - Model Output Statistics MPC - Marine Prediction Center MPE - Multisensor Precipitation Estimator MRD - Message Reference Descriptor MRU - Meso Rapid Update MSAS - MAPS Surface Assimilation System MSL - Mean Sea Level","title":"M"},{"location":"appendix/appendix-acronyms/#n","text":"NAM - North American Mesoscale model NCEP - National Centers for Environmental Prediction NCF - Network Control Facility NDFD - National Digital Forecast Database NE-PAC - Northeastern Pacific NESDIS - National Environmental Satellite, Data and Information 
Service NH - Northern Hemisphere nMi - nautical miles NOAA - National Oceanic and Atmospheric Administration NPN - NOAA Profiler Network NPP - Suomi National Polar-orbiting Partnership NUCAPS - NOAA Unique CrIS/ATMS Processing Systems NWP - Numerical Weather Prediction NWR - NOAA Weather Radio NWS - National Weather Service NWRWAVES - NOAA Weather Radio With All-Hazards VTEC Enhanced Software NWSRFS - National Weather Service River Forecast System NWWS - NOAA Weather Wire Service","title":"N"},{"location":"appendix/appendix-acronyms/#o","text":"OCP - Ocean Prediction Center OH - Office of Hydrology OPC - Ocean Prediction Center ORPG - Open Radar Products Generator OSD - One Hour Snow Depth OSW - One Hour Snow Water OTR - One Time Request","title":"O"},{"location":"appendix/appendix-acronyms/#p","text":"PID - Product Identification PIL - Product Inventory List PIREP - Pilot Weather Report POES - Polar Operational Environmental Satellite POSH - Probability of Severe Hail POH - Probability of Hail POP - Probability of Precipitation PQPF - Probabilistic QPF PRF - Pulse Repetition Frequency","title":"P"},{"location":"appendix/appendix-acronyms/#q","text":"QC - quality control QCMS - Quality Control and Monitoring System QPE - Quantitative Precipitation Estimator QPF - Quantitative Precipitation Forecast QPS - Quantitative Precipitation Summary","title":"Q"},{"location":"appendix/appendix-acronyms/#r","text":"RAOB - Radiosonde Observation RAP - Rapid Refresh (Replaced RUC) RCM - Radar Coded Message RER - Record Report RFC - River Forecast Center RGB - Red, Green, Blue RHI - Range Height Indicator RMR - Radar Multiple Request ROSA - Remote Observing System Automation RPG - Radar Product Generator RPS - routine product set RTD - Requirements Traceability Document; Routine, Delayed RTMA - Real-Time Mesoscale Analysis RUC - Rapid Update Cycle (Replaced by RAP)","title":"R"},{"location":"appendix/appendix-acronyms/#s","text":"SAFESEAS - System on AWIPS for Forecasting and 
Evaluation of Seas and Lakes SBN - Satellite Broadcast Network SCAN - System for Convection Analysis and Nowcasting SCD - Supplementary Climatological Data SCID - Storm Cell Identification Display SCP - Satellite Cloud Product SCTI - SCAN CWA Threat Index SDC - State Distribution Circuit SNOW - System for Nowcasting Of Winter Weather SOO - Science and Operations Officer SPC - Storm Prediction Center SPE - Satellite Precipitation Estimate SREF - Short Range Ensemble Forecast SRG - Supplemental Product Generator SRM - Storm Relative Motion SSD - Storm-Total Snow Depth SSM/I - Special Sensor Microwave/Imager SSW - Storm-Total Snow Water STI - Storm Track Information Suomi NPP - Suomi National Polar-orbiting Partnership SW - Spectrum Width SWEAT Index - Severe Weather Threat Index SWP - Severe Weather Probability","title":"S"},{"location":"appendix/appendix-acronyms/#t","text":"TAF - Terminal Aerodrome Forecast (international code) TAFB - Tropical Analysis and Forecast Branch TCM - Marine/Tropical Cyclone Advisory TCP - Public Tropical Cyclone Advisory TDWR - Terminal Doppler Weather Radar TE-PAC - Tropical Pacific TMI - Text Message Intercept TRU - TVS Rapid Update TT - Total Totals TVS - Tornado Vortex Signature TWB - Transcribed Weather Broadcasts","title":"T"},{"location":"appendix/appendix-acronyms/#u","text":"UGC - Universal Geographic Code ULR - User Selectable Layer Reflectivity URL - Universal Resource Locator USD - User Selectable Snow Depth USW - User Selectable Snow Water UTC - Coordinated Universal Time","title":"U"},{"location":"appendix/appendix-acronyms/#v","text":"VAD - Velocity Azimuth Display VCP - volume coverage pattern VIIRS - Visible Infrared Imaging Radiometer Suite VIL - Vertically Integrated Liquid VTEC - Valid Time and Event Code VWP - VAD Wind Profile","title":"V"},{"location":"appendix/appendix-acronyms/#w","text":"W-ATL - Western Atlantic WFO - Weather Forecast Office WINDEX - Wind Index WMO - World Meteorological Organization WSFO - Weather 
Service Forecast Office WSO - Weather Service Office WSOM - Weather Service Operations Manual WSR-88D - Weather Surveillance Radar-1988 Doppler WWA - Watch Warning Advisory WV - Water Vapor","title":"W"},{"location":"appendix/appendix-acronyms/#z","text":"Z - Reflectivity ZDR - Differential Reflectivity","title":"Z"},{"location":"appendix/appendix-grid-parameters/","text":"Abbreviation Description Units 0to5 t-5Day Mean Hgt m 2xTP6hr 12Hr Accum Precip from 2 6hr mm 36SHRMi S=Shear incr > 10kts 3-6km 50dbzZ 50dbz Hgt for 1 in. Svr Hail m accum_altimeter24 accum_altimeter24 Pa accum_dewpoint24 accum_dewpoint24 F accum_dpFromTenths24 accum_dpFromTenths24 accum_GH12 accum_GH12 m accum_htMan12 accum_htMan12 m accum_numMand12 accum_numMand12 accum_precip1Hour3 accum_precip1Hour3 in accum_precip1Hour6 accum_precip1Hour6 in accum_precip6Hour24 accum_precip6Hour24 in accum_prMan12 accum_prMan12 Pa accum_rawMETAR24 accum_rawMETAR24 accum_sfcPress3 accum_sfcPress3 Pa accum_temperature24 accum_temperature24 in accum_tempFromTenths24 accum_tempFromTenths24 in accum_windDir24 accum_windDir24 in accum_windSpeed24 accum_windSpeed24 in ACOND Aerodynamic conductance m/s adimc Additional Impervious Area Water Content % ageoVC Ageo Vert Circ ageoW Ageo Wind m/s ageoWM Magnitude Ageo Wind m/s ALBDO Albedo % Along Component Along m/s Alt24Chg Alt24Chg Pa Alti Altimeter hPa ANCConvectiveOutlook ANC Convective Outlook ANCFinalForecast ANC Final Forecast dBZ ANCLayerCompositeReflectivity ANC Layer Composite Reflectivity dBZ AppT Apparent Temperature \u00b0F AV Absolute Vorticity /s AV Vorticity /s BARO Barometric Velocity Vectors m/s BASSW Spectrum Width kts BdEPT06 Max ThetaE Difference (3-6km Min minus 0-3km Max) K BGRUN Baseflow-Groundwater Runoff kg/m^2 BLI Best (4 layer) Lifted Index K BLI Best Lifted Index K BlkMag Bulk Shear Magnitude m/s BlkShr Bulk Shear Vectors m/s BMIXL Blackadar's Mixing Length Scale m BREFMaxHourly Hourly Base Reflectivity Maximum dBZ BrightBandBottomHeight 
Bright Band Bottom Height m BrightBandTopHeight Bright Band Top Height m BRN Net Bulk Richardson Number BRNEHIi 72% Supercell Cases Tornadic BRNmag m/s BRNSHR BRN Shear BRNvec m/s BRTMP Brightness Temperature K CAPE Convective Available Potential Energy J/kg CAPEc1 Prob CAPE > 500 J/kg % CAPEc2 Prob CAPE > 1000 J/kg % CAPEc3 Prob CAPE > 2000 J/kg % CAPEc4 Prob CAPE > 3000 J/kg % CAPEc5 Prob CAPE > 4000 J/kg % CapeStk Cape Stack capeToLvl cape up to level CAT Clear Air Turbulence % cCape Computed CAPE J/kg cCin Computed CIN J/kg CCOND Canopy Conductance m/s CCP Cloud Cover % CCPerranl Cloud Cover Analysis Uncertainty % CD Drag Coefficient Numeric CDCON Convective Cloud Cover % CDUVB Clear sky UV-B Downward Solar Flux W/m^2 CEIL Ceiling m CFRZR Categorical Freezing Rain CFRZR Categorical Freezing Rain bit CFRZRc1 Chc of Measurable FZRA (Dominant) % CFRZRmean Categorical Freezing Precip mean CFRZRsprd Categorical Freezing Precip sprd CIce Cloud Ice g/m^3 CICE Cloud Ice kg/m^2 CICEP Categorical Ice Pellets CICEP Categorical Ice Pellets bit CICEPc1 Chc of Measurable IP (Dominant) % CICEPmean Categorical Ice Pellets mean CICEPsprd Categorical Ice Pellets sprd Cig Ceiling Height Cigc1 Prob Ceiling Hgt < 500 ft % Cigc2 Prob Ceiling Hgt < 1000 ft % Cigc3 Prob Ceiling Hgt < 3000 ft % CIn Convective Inhibition J/kg ClCond Cloud Condensate g/m^3 CLGTN Categorical Lightning Potential CLGTN2hr 2hr Categorical Lightning Potential climoPW PW % of normal % climoPWimp Import NARR PW in CloudCover Cloud Cover K CLWMR Cloud Mixing Ratio kg/kg CnvP2hr 2hr Convective probability % CnvPcat Categorical convective potential CNWAT Plant Canopy Surface Water mm COCO Correlation Coefficient CompositeReflectivityMaxHourly Hourly Composite Reflectivity Maximum dBZ CONUSMergedReflectivity CONUS Merged Reflectivity dBZ CONUSMergedRHV CONUS Merged RhoHV CONUSMergedZDR CONUS Merged ZDR dB CONUSPlusMergedReflectivity CONUS-Plus Merged Reflectivity dBZ CONVP Categorical Convection Potential CONVP2hr 
2hr Convection potential Corf Corfidi Vectors m/s CorfF Corfidi Vectors-Forward Prop kn CorfFM Corfidi Vec-Forward Mag kn CorfM Corfidi Vec Mag kn covCat Coverage Category % CP Conv Precip mm CP Convective Precipitation mm CP12hr Convective Precipitation(12 hours) mm CP1hr Convective Precipitation(1 hour) mm CP3hr Convective Precipitation(3 hours) mm CP6hr Convective Precipitation(6 hours) mm CP9hr Convective Precipitation(9 hours) mm CP-GFS Convective Precipitation for GFS mm CPOFP Percent of Frozen Precipitation % CPOFP Probability of Frozen precip % CPOFP Probability of Frozen Precip % CPOLP Probability of liquid precip % CPOP Categorical POP CPOZP Probability of Freezing Precip % CPOZP Probability of Freezing Precip % CPr Condensation Pressure hPa CPRAT Convective Precipitation Rate mm/s CPrD Condensation Pressure Deficit hPa CRAIN Categorical Rain CRAIN Categorical Rain bit CRAINc1 Chc of Measurable Rain (Dominant) % CRAINmean Categorical Rain mean CRAINsprd Categorical Rain sprd CritT1 Layer Min Temperature -6C, -10C K CSDLF Clear Sky Downward Long Wave Flux W/m^2 CSDSF Clear Sky Downward Solar Flux W/m^2 CSNOW Categorical Snow CSNOW Categorical Snow bit CSNOWc1 Chc of Measurable Snow (Dominant) % CSNOWmean Categorical Snow mean CSNOWsprd Categorical Snow sprd CSSI CO Svr Storm Idx CSULF Clear Sky Upward Long Wave Flux W/m^2 CSUSF Clear Sky Upward Solar Flux W/m^2 cTOT Cross Totals C CTSTM Categorical Tstorm CTyp Cloud Type CUEFI Convective Cloud Efficiency non-dim CumNrm Normalized Cumulative Shear /s CumShr Cumulative Shear m/s CURU Cu Rule 0>SKC,-1>SCT,-4<BKN,-6<OVC CW Cloud Water g/m^3 CWAT Cloud Water mm CWORK Cloud Work Function J/kg CXR Comp Refl dBZ dCape Downdraft CAPE J/kg defV Deformation sec^-1 del2gH df Duct Function dGH12 12hr Height Change m DIABi Omega from Diabatic Effects dPa/s diam Feature Diameter km Dir24Chg Dir24Chg \u00b0 DIRC Current Direction \u00b0 DIRC Surface Current Direction degree dirPW Primary Wave Direction DIRPW Primary Wave 
Direction deg dirSW Secondary Wave Direction DIRSW Secondary Wave Direction deg DivF Frontogenesis Vector Divergence K/m/s^2 DivFn Fn Vector Divergence K/m/s^2 DivFs Fs Vector Divergence K/m/s^2 DLWRF Downward Long-Wave Rad. Flux W/m^2 dP Pressure Thickness mb dP1hr 1hr MSL Press Change hPa Dp24Chg Dp24Chg \u00b0F dP3hr 3hr MSL Press Change hPa dP6hr 6hr MSL Press Change hPa DpD Dew Point Depression K DpD Dewpoint depression K DpDt Local Pressure Derivative hPa/s Dpress Pressure Difference hPa DpT Dew Point Temperature K DpT Dewpoint temperature K DpTerranl Dew Point Temperature Error Analysis K DpTerranl Dewpoint Analysis Uncertainty K DpTmean Dewpoint Temp mean K DpTsprd Dewpoint Temp sprd K dPW1hr 1hr Precipitable Water Change in dPW3hr 3hr Precipitable Water Change in dPW6hr 6hr Precipitable Water Change in DSLM Deviation Of Sea Level from Mean m DSWRF Downward Short-Wave Radiation Flux W/m^2 dT Vrt Temp Chg \u2103 DthDt Total Theta Tendency K/S DUVB UV-B Downward Solar Flux W/m^2 dVAdv Diff vort Adv /s*1.0E9 dZ Thickness m EchoTop18 18 dBZ Echo Top km EchoTop30 30 dBZ Echo Top km EchoTop50 50 dBZ Echo Top km EchoTop60 60 dBZ Echo Top km EHI Energy Helicity Index EHI01 Energy Helicity Index 0-1km m*m/s*s EHIi Energy Helicity Index ELEV Ocean Surface Elevation Relative to Geoid m ELEV Tidal Height m ELON East Longitude (0 to 360) deg EMSP MSLP (ETA Reduction) Pa EPT Equiv Pot Temp K EPT Equivalent Potential Temperature K EPTA Equiv Pot Temp Adv K/s EPTC Equiv Pot Temp Conv K/s EPTGrd Theta-E Gradient K/m EPTGrdM Theta-E Grad Mag K/m EPTs Saturated Equiv Pot Temp K EPVg Geo Equiv Pot Vort K/hPa/s EPVs Saturated Equiv Pot Vort K/hPa/s EPVt1 Instability is Slantwise=S Upright=U EPVt2 EPV* Instability is Slantwise=S Upright=U ESP Enhanced Stretching Potential (ML) ESP2 ESP gamma ETCWL Extra Tropical Storm Surge Combined Surge and Tide m ETSRG Extra Tropical Storm Surge m EVBS Direct Evaporation from Bare Soil W/m^2 EVCW Canopy water evaporation W/m^2 EVP Evaporation 
kg/m^2 FD Fire Danger FeatMot Feature Motion kn fGen QG Frontogenesis K^2/m^2/s FLDCP Field Capacity Fraction fnD Qn Div K/m^2/s FnVecs Fn Vectors K/m/s FRICV Frictional Velocity m/s FROZR Frozen Rain kg/m^2 FRZR Ice Accum m FRZR12hr 12 Hr Ice Accum mm FRZR6hr 6 Hr Ice Accum mm FRZRmodel Model Run Ice mm FRZRrun Model Run Ice Accum mm fsD Qs Div K/m^2/s FsVecs Fs Vectors K/m/s FVecs Frontogenesis Vectors K/m/s Fzra1 850-1000 fz thk Fzra2 Thickness: FZRA/FZDZ g2gsh Gate to Gate Shear kts gamma Lapse Rate K/m gammaE ThetaE Lapse Rate K/m GaugeCorrQPE01H QPE - Radar with Gauge Bias Correction (1 hr. accum.) mm GaugeCorrQPE03H QPE - Radar with Gauge Bias Correction (3 hr. accum.) mm GaugeCorrQPE06H QPE - Radar with Gauge Bias Correction (6 hr. accum.) mm GaugeCorrQPE12H QPE - Radar with Gauge Bias Correction (12 hr. accum.) mm GaugeCorrQPE24H QPE - Radar with Gauge Bias Correction (24 hr. accum.) mm GaugeCorrQPE48H QPE - Radar with Gauge Bias Correction (48 hr. accum.) mm GaugeCorrQPE72H QPE - Radar with Gauge Bias Correction (72 hr. accum.) mm GaugeInfIndex01HP1 1 hour QPE Gauge Influence Index Pass 1 GaugeInfIndex01HP2 1 hour QPE Gauge Influence Index Pass 2 GaugeInfIndex03HP1 3 hour QPE Gauge Influence Index Pass 1 GaugeInfIndex03HP2 3 hour QPE Gauge Influence Index Pass 2 GaugeInfIndex06HP1 6 hour QPE Gauge Influence Index Pass 1 GaugeInfIndex06HP2 6 hour QPE Gauge Influence Index Pass 2 GaugeInfIndex12HP1 12 hour QPE Gauge Influence Index Pass 1 GaugeInfIndex12HP2 12 hour QPE Gauge Influence Index Pass 2 GaugeInfIndex24HP1 24 hour QPE Gauge Influence Index Pass 1 GaugeInfIndex24HP2 24 hour QPE Gauge Influence Index Pass 2 GaugeInfIndex48HP1 48 hour QPE Gauge Influence Index Pass 1 GaugeInfIndex48HP2 48 hour QPE Gauge Influence Index Pass 2 GaugeInfIndex72HP1 72 hour QPE Gauge Influence Index Pass 1 GaugeInfIndex72HP2 72 hour QPE Gauge Influence Index Pass 2 GaugeOnlyQPE01H QPE - Radar Gauge Only (1 hr. accum.) mm GaugeOnlyQPE03H QPE - Radar Gauge Only (3 hr. 
accum.) mm GaugeOnlyQPE06H QPE - Radar Gauge Only (6 hr. accum.) mm GaugeOnlyQPE12H QPE - Radar Gauge Only (12 hr. accum.) mm GaugeOnlyQPE24H QPE - Radar Gauge Only (24 hr. accum.) mm GaugeOnlyQPE48H QPE - Radar Gauge Only (48 hr. accum.) mm GaugeOnlyQPE72H QPE - Radar Gauge Only (72 hr. accum.) mm GeH Geometric Height m geoVort Geo Vorticity /s geoW Geostrophic Wind m/s geoWM Magnitude Geo Wind m/s GFLUX Ground Heat Flux W/m^2 GH Geopotential Height gpm GH Height m GH12hour 12 hour Height m GH2day 2 day Height m GH5day 5 day Height m GH_avg Height Ensemble Mean m GH_perts Height Perturbations m GH_std Height Ensemble Std Dev m GHmean Geopotential Height mean m GHsprd Geopotential Height spread m GHxSM Filtered-500km Hgt m GHxSM2 Filtered-250km Hgt m gOvf Gust Wind Gust m/s GVV Geometric Vertical Velocity m/s GVV Geometric Vertical Velocity m/s GVV1hr Mean 1hr Geometric Vertical Velocity m/s H50Above0C Height of 50 dBZ Echo Above 0C km H50AboveM20C Height of 50 dBZ Echo Above -20C km H60Above0C Height of 60 dBZ Echo Above 0C km H60AboveM20C Height of 60 dBZ Echo Above -20C km HAILPROB Hail Probability % HC Hydrometeor Class HCDC High Cloud Cover % HeightCompositeReflectivity Composite Reflectivity Height m HeightLLCompositeReflectivity Low-Level Composite Reflectivity Height m Heli Helicity m*m/s*s Heli Storm Relative Helicity m^2/s^2 HeliC Helicity (for > 300J/Kg MLCape) HeliD Helicity (NCEP Delivered) m\u00b2/s\u00b2 HI Haines Index HI Haines Index Numeric HI1 Haines Stab Term HI3 HI1 Index Assign HI4 Moist Term Index Assign HIdx Heat Index K HIdx Heat Index K HighLayerCompositeReflectivity High Layer Composite Reflectivity (24-60 kft) dBZ HIWC HiWc K HPBL Height of Planetary Boundary Layer m HPBL Planetary Boundary Layer Height m HTSGW Total Significant Wave Height m HyC Hydrometeor Conc g/m^3 ICAHT ICAO Standard Atmosphere Reference Height m ICEC Derived Radar Composite Proportion ICEC Ice Cover ICEC Ice Cover Proportion ICEG Ice growth rate m/s ICETK Ice 
Thickness m ICI Icing Severity Index ICIP Icing Probability % ICMR Ice Water Mixing Ratio ICNG Icing Potential % ICPRB Icing Probability % ICSEV Icing Severity Index ICSEV Icing severity non-dim ILW Int Liquid Water g/m^2 Into Component Into m/s INV Height of MaxTw above FrzLvl ft IP Icing Pot IPLayer SFC Cold Lyr Probs Toward SLEET ft IRBand4 Infrared Imagery K JFWPRB9-20 Fire Wx: Prob Wind >= 17.5 kts and RH < 20% % KDP Specific Differential Phase deg/km KI K Index K KI K Index K L-I Computed LI \u2103 L3EchoTop Level III High Resolution Enhanced Echo Top Mosaic kft L3VIL Level III High Resolution VIL Mosaic kg/m^2 LAND Land Cover (0=sea, 1=land) Proportion LANDN Land-sea coverage (nearest neighbor) [land=1,sea=0] LAPR Lapse Rate K/m latitude Latitude \u00b0 LatLon Earth Location LCDC Low Cloud Cover % LgSP Large Scale Precipitation mm LgSP1hr Large Scale Precipitation(1 hour) mm LgSP3hr Large Scale Precipitation(3 hour) mm LHF Latent Heat Flux W/m^2 LightningDensity15min CG Lightning Density (15 min.) Flashes/km^2/min LightningDensity1min CG Lightning Density (1 min.) Flashes/km^2/min LightningDensity30min CG Lightning Density (30 min.) Flashes/km^2/min LightningDensity5min CG Lightning Density (5 min.) Flashes/km^2/min LightningJumpGrid Lightning Jump LightningJumpGridMax5min Lightning Jump Max LightningProbabilityNext30min CG Lightning Probability (0-30 min.) % LightningProbabilityNext60min CG Lightning Probability (0-60 min.) 
% LIsfc2x Lifted Index Sfc to \u2103 LLCompositeReflectivity Low-Level Composite Reflectivity dBZ LLWSWind LLWSWind kts LM5 Bunkers Left-Moving Supercell m/s LM6 Elevated Left-Moving Supercell m/s loCape CAPE to 3kmAGL (Tv) J/kg longitude Longitude \u00b0 LowLayerCompositeReflectivity Low Layer Composite Reflectivity (0-24 kft) dBZ LSOIL Liquid soil moisture content (non-frozen) kg/m^2 lsrSample LSR Sample LtgP2hr 2hr Lightning probability % LtgPcat Categorical lightning potential LTNG Lightning non-dim LTNG Max 1hr Lightning Threat (flashes/km^2) LWHR Long-Wave Radiative Heating Rate K/s lzfpc Lower Zone Primary Free Water Content % lzfsc Lower Zone Secondary Free Water Content % lztwc Lower Zone Tension Water Content % MAdv Moisture Adv (g/kg)/s maritimeObscuredSkyIFR ft maritimeObscuredSkyLIFR ft maritimeObscuredSkyMVFR ft maritimeObscuredSkySymIFR maritimeObscuredSkySymLIFR maritimeObscuredSkySymMVFR maritimeObscuredSkySymVFR maritimeObscuredSkyVFR ft maritimeWind20T34 kn maritimeWind34T48 kn maritimeWind48T64 kn maritimeWind64P kn maritimeWindDir20T34 deg maritimeWindDir34T48 deg maritimeWindDir48T64 deg maritimeWindDir64P deg maritimeWindDirLow deg maritimeWindGust20T34 kn maritimeWindGust34T48 kn maritimeWindGust48T64 kn maritimeWindGust64P kn maritimeWindGustLow kn maritimeWindLow kn MaxDVV Max 1hr Downdraft Vertical Velocity m/s maxEPT Max ThetaE (0-3kmAgl) K MaxGRPL1hr Max Hourly Graupel kg/m^2 MaxREF1hr Max Hourly Reflectivity dBZ MAXRH Maximum Relative Humidity % MAXRH12hr 12-hour Maximum Rel Humidity % MAXRH3hr 3-hour Maximum Rel Humidity % MAXUPHL Max 1hr Updraft Helicity m^2/s^2 MAXUPHL Max Updraft Helicity m^2/s^2 MaxUPHL1hr Max Hourly Updft Helicity m^2/s^2 MaxUVV Max 1hr Updraft Vertical Velocity m/s MAXUW U Component of Hourly Maximum Wind Speed m/s MAXVW V Component of Hourly Maximum Wind Speed m/s MaxWGS1hr Max Hourly Wind Gust m/s MaxWHRRR Maximum 1hr Wind Gust m/s MaxWind1hr MaxWind1hr m/s MCDC Medium Cloud Cover % MCon Moisture Flux Div 
(g/kg)/s MCon2 Moisture Flux Div (Conv only) (g/kg)/s MCONV Horizontal Moisture Convergence kg/kg*s^m^2/s MergedAzShear02kmAGL Low-Level Azimuthal Shear (0-2km AGL) 1/s MergedAzShear36kmAGL Mid-Level Azimuthal Shear (3-6km AGL) 1/s MergedBaseReflectivity Raw Merged Base Reflectivity dBZ MergedBaseReflectivityQC Merged Base Reflectivity dBZ MergedReflectivityAtLowestAltitude Merged Reflectivity At Lowest Altitude (RALA) dBZ MergedReflectivityComposite Raw Composite Reflectivity Mosaic dBZ MergedReflectivityQCComposite Composite Reflectivity dBZ MergedReflectivityQComposite Composite Reflectivity Mosaic dBZ MESH Maximum Estimated Size of Hail (MESH) mm MESHTrack120min MESH Tracks (120 min. accum.) mm MESHTrack1440min MESH Tracks (1440 min. accum.) mm MESHTrack240min MESH Tracks (240 min. accum.) mm MESHTrack30min MESH Tracks (30 min. accum.) mm MESHTrack360min MESH Tracks (360 min. accum.) mm MESHTrack60min MESH Tracks (60 min. accum.) mm minEPT Min ThetaE (3-6kmAgl) K MINRH Minimum Relative Humidity % MINRH12hr 12-hour Minimum Rel Humidity % MINRH3hr 3-hour Minimum Rel Humidity % Mix1 850-1000 mx thk Mix2 Thickness: Wintery MIX MIXR Humidity Mixing Ratio kg/kg mixRat Mixing Ratio g/kg MLLCL ML LCL Height m Mmag Moisture Trans Mag g\u00b7m/(kg\u00b7s) MMP MCS Maintenance Probability % MMSP MSLP (MAPS Reduction) Pa MnT Minimum Temperature K MnT Minimum Temperature K MnT12hr 12-hr Minimum Temperature K MnT3hr 3-hr Minimum Temperature K MnT6hr 6-hr Minimum Temperature K MnT_avg Min Temp Ensemble Mean K MnT_perts Min Temp Perturbations K MnT_std Min Temp Ensemble Std Dev K ModelHeight0C Freezing Level Height m ModelSurfaceTemperature Surface Temperature C ModelWetbulbTemperature Wet Bulb Temperature C MountainMapperQPE01H QPE - Mountain Mapper (1 hr. accum.) mm MountainMapperQPE03H QPE - Mountain Mapper (3 hr. accum.) mm MountainMapperQPE06H QPE - Mountain Mapper (6 hr. accum.) mm MountainMapperQPE12H QPE - Mountain Mapper (12 hr. accum.) 
mm MountainMapperQPE24H QPE - Mountain Mapper (24 hr. accum.) mm MountainMapperQPE48H QPE - Mountain Mapper (48 hr. accum.) mm MountainMapperQPE72H QPE - Mountain Mapper (72 hr. accum.) mm MpV Saturated Geo Pot Vort K/hPa/s MRETag Echo Tops m MRMSVIL Vertically Integrated Liquid (VIL) kg/m^2 MRMSVIL120min VIL Max (120 min.) kg/m^2 MRMSVIL1440min VIL Max (1440 min.) kg/m^2 MRMSVILDensity Vertically Integrated Liquid (VIL) Density g/m^3 MSFDi Isen Moisture Stability Flux Div (g*hPa*m)/(kg*K*s^2) MSFi Isentropic Moisture Stability Flux g\u00b7hPa\u00b7m/(kg\u00b7K\u00b7s) MSFmi Isen Moisture Stability Flux Mag g\u00b7hPa\u00b7m/(kg\u00b7K\u00b7s) MSG Mont Strm Func m MSG Montgomery Stream Function m^2/s^2 msl-P MSL Pressure hPa msl-P2 MSL Pressure (2) hPa msl-P_avg MSL Press Ensemble Mean hPa msl-P_perts MSL Press Perturbations hPa msl-P_std MSL Press Ensemble Std Dev hPa MSL1 MSL1 ft MSL2 MSL2 ft MSL3 MSL3 ft MSL4 MSL4 ft MSL5 MSL5 ft MSLSA Altimeter hPa MTV Moisture Trans Vecs g\u00b7m/(kg\u00b7s) muCape Most Unstable CAPE J/kg MultiSensorP1QPE01H QPE - Multi Sensor P1 (1 hr. accum.) mm MultiSensorP1QPE03H QPE - Multi Sensor P1 (3 hr. accum.) mm MultiSensorP1QPE06H QPE - Multi Sensor P1 (6 hr. accum.) mm MultiSensorP1QPE12H QPE - Multi Sensor P1 (12 hr. accum.) mm MultiSensorP1QPE24H QPE - Multi Sensor P1 (24 hr. accum.) mm MultiSensorP1QPE48H QPE - Multi Sensor P1 (48 hr. accum.) mm MultiSensorP1QPE72H QPE - Multi Sensor P1 (72 hr. accum.) mm MultiSensorP2QPE01H QPE - Multi Sensor P2 (1 hr. accum.) mm MultiSensorP2QPE03H QPE - Multi Sensor P2 (3 hr. accum.) mm MultiSensorP2QPE06H QPE - Multi Sensor P2 (6 hr. accum.) mm MultiSensorP2QPE12H QPE - Multi Sensor P2 (12 hr. accum.) mm MultiSensorP2QPE24H QPE - Multi Sensor P2 (24 hr. accum.) mm MultiSensorP2QPE48H QPE - Multi Sensor P2 (48 hr. accum.) mm MultiSensorP2QPE72H QPE - Multi Sensor P2 (72 hr. accum.) 
mm MXDVV Max Downdraft Vertical Velocity m/s MXREF Max 1hr CAPPI dB MXSALB Maximum Snow Albedo % MxT Maximum Temperature K MxT Maximum Temperature K MxT12hr 12-hr Maximum Temperature K MxT3hr 3-hr Maximum Temperature K MxT6hr 6-hr Maximum Temperature K MxT_avg Max Temp Ensemble Mean K MxT_perts Max Temp Perturbations K MxT_std Max Temp Ensemble Std Dev K MXUVV Max Updraft Vertical Velocity m/s NBDSF Near IR Beam Downward Solar Flux W/m^2 NBE Neg Buoy Energy J/kg NDDSF Near IR Diffuse Downward Solar Flux W/m^2 NetIO Net Isen Adiabatic Omega Pa/s NLAT Latitude (-90 to 90) deg NST Nonsupercell Tornado (>1 NST Threat) NST1 Nonsupercell Tornado (>1 NST Threat NST2 Nonsupercell Tornado (>1 NST Threat numLevels Number of Levels O3MR Ozone Mixing Ratio kg/kg obscuredSky2IFR ft obscuredSky2LIFR ft obscuredSky2MVFR ft obscuredSky2VFR ft obscuredSky3IFR ft obscuredSky3LIFR ft obscuredSky3MVFR ft obscuredSky3VFR ft obscuredSkyIFR ft obscuredSkyLIFR ft obscuredSkyMVFR ft obscuredSkySym2IFR obscuredSkySym2LIFR obscuredSkySym2MVFR obscuredSkySym2VFR obscuredSkySym3IFR obscuredSkySym3LIFR obscuredSkySym3MVFR obscuredSkySym3VFR obscuredSkySymIFR obscuredSkySymLIFR obscuredSkySymMVFR obscuredSkySymVFR obscuredSkyVFR ft obsWind30T50 kn obsWind50P kn obsWindDir30T50 deg obsWindDir50P deg obsWindDirLow deg obsWindGust30T50 kn obsWindGust50P kn obsWindGustLow kn obsWindLow kn obVis Obstruction to Vision OGRD Current Vectors m/s OmDiff mb between -15C Omega and MaxOmega hPa ONE One OTIM Observation Time OZCON Ozone Concentration ppb OZMAX1 Ozone Daily Max from 1-hour Average ppbV OZMAX8 Ozone Daily Max from 8-hour Average ppbV P Pressure hPa P Pressure Pa PAdv Pressure Adv hPa/s PBE Pos Buoy Energy J/kg PBLREG Planetary Boundary Layer Regime PEC Precipitation Potential Placement in PEC_TT24 24h Cumulative Precip Potential Placement in PERPW Primary Wave Mean Period s PERPW Primary Wave Period s Perranl Pressure Analysis Uncertainty Pa Perranl Pressure Error Analysis Pa PERSW Secondary 
wave mean period s PERSW Secondary Wave Mean Period s PEVAP Potential Evaporation mm PEVPR Potential Evaporation Rate W/m^2 PFrnt 2-D Frontogenesis/Mag Fn K/m/s PGrd Pressure Gradient hPa/m PGrd1 Pressure Gradient dPa/km PGrdM Pressure Grad Mag hPa/m PICE Precipitating ice content g/m^3 PIVA Thermal Wind Vort Adv /s pkPwr Peak Power dB PLI Parcel Lifted Index (to 500 mb) K PLIxc1 Prob LI < 0 % PLIxc2 Prob LI < -2 % PLIxc3 Prob LI < -4 % PLIxc4 Prob LI < -6 % PLIxc5 Prob LI < -8 % PMSL Pressure Reduced to MSL Pa PMSLmean Mean Sea Level Pressure mean hPa PMSLsprd Mean Sea Level Pressure sprd hPa poesDif11u3_7uIR POES 11u-3.7u Satellite GenericPixel POP Probability of precip % POP12hr 12hr precip probability % POP3hr 3hr precip probability % POP6 POP 6hr % POP6hr 6hr precip probability % POP_001 Prob of .1in/6hr Precip % POP_002 Prob of .3in/6hr Precip % POP_003 Prob of .6in/6hr Precip % POP_004 Prob of 1in/6hr Precip % POP_005 Prob of 2in/6hr Precip % POP_006 Prob of .1in/12hr Precip % POP_007 Prob of .3in/12hr Precip % POP_008 Prob of .6in/12hr Precip % POP_009 Prob of 1in/12hr Precip % POP_010 Prob of 2in/12hr Precip % POP_011 Prob of .05in/6hr Precip % POP_012 Prob of .05in/12hr Precip % POP_013 Prob of 1in/24hr Precip % POP_014 Prob of 2in/24hr Precip % POP_015 Prob of 2in/36hr Precip % POP_016 Prob of 2in/48hr Precip % POROS Soil Porosity Proportion POSH Probability of Severe Hail (POSH) % PoT Potential Temp K PoT Potential Temperature K PoTA Pot Temp Adv K/s PPAM Prob Precip abv nrml % PPAN Prob Precip abv nrml % PPAS Prob Precip abv nrml % PPBM Prob Precip blw nrml % PPBN Prob Precip blw nrml % PPBS Prob Precip blw nrml % PPFFG Probability of excessive rain % PPI Precipitation Probability Index % PPI1hr Precipitation Probability Index(1 hour) % PPI6hr Precipitation Probability Index(6 hour) % PPNN Prob Precip near nrml % PR Precip Rate mm/s PR Precipitation Rate mm/s prCloudHgt prCloud converted to Hgt m prCloudHgtHi prCloudHgt when in hi layer m prCloudHgtLow 
prCloudHgt when in low layer m prCloudHgtMid prCloudHgt when in mid layer m prcp12hr 12hr probability of 0.01 inch of precip % prcp3hr 3hr probability of 0.01 inch of precip % prcp6hr 6hr probability of 0.01 inch of precip % Precip24Hr Precip24Hr in Precip3Hr Precip3Hr in Precip6Hr Precip6Hr in PrecipRate Radar Precipitation Rate (SPR) mm/hr PrecipType Surface Precipitation Type (SPT) PRESA Pressure Anomaly Pa PresStk Obsolete, replace later presWeather Present Weather Prob34 Prob of Wind Speed > 34 knots m/s Prob50 Prob of Wind Speed > 50 knots m/s Prob64 Prob of Wind Speed > 64 knots m/s ProbDpT50 Probability of Dewpoint temp > 50 degF % ProbDpT55 Probability of Dewpoint temp > 55 degF % ProbDpT60 Probability of Dewpoint temp > 60 degF % ProbDpT65 Probability of Dewpoint temp > 65 degF % ProbDpT70 Probability of Dewpoint temp > 70 degF % ProbVSS10p3Layer Prob Vertical Speed Shear > 20 kts % ProbVSS10p3Sfc Prob 0-2kft Shear > 20 kts % PROCON Probability of convection % PROCON2hr 2hr Convection probability % PROLGHT Lightning probability % PROLGHT2hr 2hr Lightning probability % PRP01H 1hr MRMS Radar-Only ARI year PRP03H 3hr MRMS Radar-Only ARI year PRP06H 6hr MRMS Radar-Only ARI year PRP12H 12hr MRMS Radar-Only ARI year PRP24H 24hr MRMS Radar-Only ARI year PRP30M 30min MRMS Radar-Only ARI year PRPMax Maximum MRMS Radar-Only ARI year PRSIGSV Total Probability of Extreme Severe Thunderstorms % PRSVR Total Probability of Severe Thunderstorms % Psfc Surface pressure hPa PT3 3 hr Pres Change hPa PTAM Prob Temp abv nrml % PTAN Prob Temp abv nrml % PTAS Prob Temp abv nrml % PTBM Prob Temp blw nrml % PTBN Prob Temp blw nrml % PTBS Prob Temp blw nrml % PTNN Prob Temp near nrml % Ptopo Surface pressure hPa PTOR Tornado Probability % PTvA Pot Vorticity Adv K/hPa/s*1.0E5 PTyp Precip Type PTypeRefIP Prob Precip Type is Refreezing Ice Pellets % pV Potential Vorticity K/hPa/s pVeq Equiv Pot Vort K/hPa/s PVORT Potential Vorticity m^2 kg^-1 s^-1 PVV Omega Pa/s PVV Vertical Velocity 
Pressure Pa/s PW Precipitable Water mm PW Precipitable H2O in PW2 Precipitable H2O >1.4 in. in PWmean Precipitable Water mean mm PWS34 Incremental Prob of wind speed >= 34 knots % PWS50 Incremental Prob of wind speed >= 50 knots % PWS64 Incremental Prob of wind speed >= 64 knots % PWsprd Precipitable Water sprd mm qDiv Div Q K/m^2/s*1.0E-12 QMAX Maximum specific humidity at 2m kg/kg QMIN Minimum specific humidity at 2m kg/kg qnVec Qn Vectors K/m^2/s QPECrestSoilMoisture QPE-CREST Soil Moisture % QPECrestStreamflow QPE-CREST Maximum Streamflow (m^3)*(s^-1) QPECrestUStreamflow QPE-CREST Maximum Unit Streamflow (m^3) (s^-1) (km^-2) QPEFFG01H 1hr MRMS Radar-Only QPE-to-FFG Ratio QPEFFG03H 3hr MRMS Radar-Only QPE-to-FFG Ratio QPEFFG06H 6hr MRMS Radar-Only QPE-to-FFG Ratio QPEFFGMax Maximum MRMS Radar-Only QPE-to-FFG Ratio QPEHPStreamflow QPE-Hydrophobic Maximum Streamflow (m^3)*(s^-1) QPEHPUStreamflow QPE-Hydrophobic Maximum Unit Streamflow (m^3) (s^-1) (km^-2) QPESacSoilMoisture QPE-SAC-SMA Soil Moisture % QPESacStreamflow QPE-SAC-SMA Maximum Streamflow (m^3)*(s^-1) QPESacUStreamflow QPE-SAC-SMA Maximum Unit Streamflow (m^3) (s^-1) (km^-2) QPV1 QVec Conv K/m^2/s*1.0E-12 QPV2 Negative EPV* K/hPa/s QPV3 QPV Net QPV4 QG-EPV, RH>75% qsVec Qs Vectors K/m^2/s qVec Q Vectors K/m^2/s RadarAQI01H Radar Accumulation Quality Index 1 hour RadarAQI03H Radar Accumulation Quality Index 3 hour RadarAQI06H Radar Accumulation Quality Index 6 hour RadarAQI12H Radar Accumulation Quality Index 12 hour RadarAQI24H Radar Accumulation Quality Index 24 hour RadarAQI48H Radar Accumulation Quality Index 48 hour RadarAQI72H Radar Accumulation Quality Index 72 hour RadarOnlyQPE01H QPE - Radar Only (1 hr. accum.) mm RadarOnlyQPE03H QPE - Radar Only (3 hr. accum.) mm RadarOnlyQPE06H QPE - Radar Only (6 hr. accum.) mm RadarOnlyQPE12H QPE - Radar Only (12 hr. accum.) mm RadarOnlyQPE12Z QPE - Radar Only (Since 12Z accum.) mm RadarOnlyQPE15M QPE - Radar Only (15 min accum.) 
mm RadarOnlyQPE24H QPE - Radar Only (24 hr. accum.) mm RadarOnlyQPE48H QPE - Radar Only (48 hr. accum.) mm RadarOnlyQPE72H QPE - Radar Only (72 hr. accum.) mm RadarQualityIndex Radar Quality Index (RQI) RAIN Rain content g/m^3 Rain1 850-1000 ra thk Rain2 700-850 ra thk Rain3 Thickness: Rain Likely Raob Raob Interleaved Data rawMETAR24Chg rawMETAR24Chg \u2103 RCQ Humidity parameter in canopy conductance Proportion RCS Solar parameter in canopy conductance Proportion RCSOL Soil moisture parameter in canopy conductance Proportion Reflectivity0C Reflectivity at 0C dBZ ReflectivityAtLowestAltitude Reflectivity At Lowest Altitude (RALA) dBZ ReflectivityM10C Reflectivity at -10C dBZ ReflectivityM15C Reflectivity at -15C dBZ ReflectivityM20C Reflectivity at -20C dBZ ReflectivityM5C Reflectivity at -5C dBZ RETOP Echo Top m RH Rel Humidity % RH Relative Humidity % RH_001 Prob of RH Grtn 70 percent % RH_001_bin Binary Prob of RH Grtn 70 percent RH_001_perts Prob of RH Grtn 70 percent Perts RH_002 Prob of RH Grtn 90 percent % RH_002_bin Binary Prob of RH Grtn 90 percent RH_002_perts Prob of RH Grtn 90 percent Perts RH_avg Rel Humidity Ensemble Mean % RH_perts Rel Humidity Perturbations % RH_std Rel Humidity Ensemble Std Dev % RHmean Relative Humidity mean % RHsprd Relative Humidity spread % RIME Rime Factor non-dim RLYRS Number of Soil Layers in Root Zone Numeric RM5 Bunkers Right-Moving Supercell m/s RM6 Elevated Right-Moving Supercell m/s RMGH2 t-2Day Mean Hgt m RMprop Right Mover Propagation Vector RMprop2 Elevated Right Mover Propagation Vector rms root mean square kn Ro Rossby Number Vag/Vg RotationTrackLL120min Low-Level Rotation Tracks 0-2km AGL (120 min. accum.) 1/s RotationTrackLL1440min Low-Level Rotation Tracks 0-2km AGL (1440 min. accum.) 1/s RotationTrackLL240min Low-Level Rotation Tracks 0-2km AGL (240 min. accum.) 1/s RotationTrackLL30min Low-Level Rotation Tracks 0-2km AGL (30 min. accum.) 1/s RotationTrackLL360min Low-Level Rotation Tracks 0-2km AGL (360 min. 
accum.) 1/s RotationTrackLL60min Low-Level Rotation Tracks 0-2km AGL (60 min. accum.) 1/s RotationTrackML120min Mid-Level Rotation Tracks 3-6km AGL (120 min. accum.) 1/s RotationTrackML1440min Mid-Level Rotation Tracks 3-6km AGL (1440 min. accum.) 1/s RotationTrackML240min Mid-Level Rotation Tracks 3-6km AGL (240 min. accum.) 1/s RotationTrackML30min Mid-Level Rotation Tracks 3-6km AGL (30 min. accum.) 1/s RotationTrackML360min Mid-Level Rotation Tracks 3-6km AGL (360 min. accum.) 1/s RotationTrackML60min Mid-Level Rotation Tracks 3-6km AGL (60 min. accum.) 1/s routed_flow Channel Routed Flow [Low] routed_flow_c Channel Routed Flow [Combo] routed_flow_h Channel Routed Flow [Hi] routed_flow_m Channel Routed Flow [Mid] RR Reflectivity dBZ RRtype Radar w/PType dBZ RRV Radial Velocity kts RSMIN Minimal Stomatal Resistance s/m RV Rel Vorticity /s RWMR Rain Mixing Ratio kg/kg s2H2O_CLIMO Climatological -SON/DJF/MAM- Snow-to-water ratio s2H2O_GFS GFS Snow-to-water ratio s2H2O_MEAN HPC Mean Snow-to-water ratio s2H2O_NAM NAM Snow-to-water ratio SA12hr 12 Hr Snow Accum mm SA1hr 1 Hr Snow Accum mm SA24hr 24 Hr Snow Accum mm SA36hr 36 Hr Snow Accum mm SA3hr 3 Hr Snow Accum mm SA48hr 48 Hr Snow Accum mm SA6hr 6 Hr Snow Accum mm SAcc Snow Accum via Thickness mm SALIN Practical Salinity SALTY Salinity kg/kg SAmodel Model Run Snow via Thickness mm SArun Model Run Snow Accum via Thickness mm satCloudPhase Satellite Cloud Phase[8.5-11.2 um] K SATD Saturation Deficit Pa satDif11u12uIR 11u-12u Satellite GenericPixel satDif11u13uIR 11u-13u Satellite GenericPixel satDif11u3_9uIR 11u-3.9u Satellite GenericPixel satDivWVIR IR in WV Satellite DerivedWV satFog Satellite Fog[3.9-11.2 um] K satMoisture Satellite Moisture[11.2-12.3 um] K satSnow Satellite Snow[0.64-1.61 um] satUpperLevelInfo Satellite Upper Level Info[11.2-6.19 um] K satVegetation Satellite Vegetation[0.64-0.87 um] SBSNO Sublimation (evaporation from snow) W/m^2 SBT113 Simulated Brightness Temperature for GOES 11, Channel 3 K 
SBT114 Simulated Brightness Temperature for GOES 11, Channel 4 K SBT123 Simulated Brightness Temperature for GOES 12, Channel 3 K SBT124 Simulated Brightness Temperature for GOES 12, Channel 4 K sce NOHRSC Snow Coverage Elevation kft SCP Snow Cover SCP Snow Cover % SCWind SCWind m/s SDEN Snow Density kg/m\u00b3 SDENCLIMO Climatological -SON/DJF/MAM- Snow Density kg/m\u00b3 SDENGFS GFS Snow Density kg/m\u00b3 SDENMEAN HPC Mean Snow Density kg/m\u00b3 SDENNAM NAM Snow Density kg/m\u00b3 SeamlessHSR Seamless Hybrid Scan Reflectivity (SHSR) dBZ SeamlessHSRHeight Seamless Hybrid Scan Reflectivity (SHSR) Height km SFCR Surface Roughness m SH Spec Humidity SH Specific Humidity % Shear Shear (Vector) /s SHF Sensible Heat Flux W/m^2 SHI Severe Hail Index (SHI) ShrMag Shear Magnitude /s shWlt Showalter Index \u2103 SHx Spec Humidity g/kg SIGHAILPROB Significant Hail Probability % SIGTRNDPROB Significant Tornado Probability % SIGWINDPROB Significant Wind Probability % SIPD Supercooled Large Droplet Threat SLDP Supercooled Large Droplet Threat SLI Lifted Index K SLI Surface Lifted Index K SLTYP Surface Slope Type Index SMC Soil Moisture % SMDRY Direct Evaporation Cease (soil moisture) Proportion SMREF Transpiration Stress-onset (soil moisture) Proportion SnD Snow Depth m SnD Snow Depth m SNFALB Snow-Free Albedo SNMR Snow Mixing Ratio kg/kg SNOL12c1 Prob 12-hr SNOW > 1 in % SNOL12c10 Prob 12-hr SNOW > 24 in % SNOL12c2 Prob 12-hr SNOW > 2 in % SNOL12c3 Prob 12-hr SNOW > 4 in % SNOL12c4 Prob 12-hr SNOW > 6 in % SNOL12c5 Prob 12-hr SNOW > 7.5 in % SNOL12c6 Prob 12-hr SNOW > 8 in % SNOL12c7 Prob 12-hr SNOW > 10 in % SNOL12c8 Prob 12-hr SNOW > 12 in % SNOL12c9 Prob 12-hr SNOW > 16 in % SNOL12mean 12-hr Snowfall mean mm SNOL12sprd 12-hr Large scale Snowfall sprd mm SNOM Snow Melt kg/m^2 snoRat snoRatCrocus Snow Ratio - Crocus/ECMWF snoRatEMCSREF Snow Ratio: EMC SREF snoRatOv2 snoRatSPC Snow Ratio - SPC snoRatSPCdeep Snow Ratio - SPC 0-3km MaxT snoRatSPCsurface Snow Ratio - SPCsurface 
snoRatWPC Snow Ratio - WPC Mean SNOW Snow content g/m^3 Snow1 850-1000 sn thk Snow2 700-850 sn thk Snow3 Thickness: Snow Likely snowd3hr 3hr Snow Depth m snowd6hr 6hr Snow Depth m SNOWLVL Snow Level m SnowT Preferred Ice Growth K SNSQ Snow Sql Parameter SNW Sect Norm Wind m/s SNWA Ageo Sect Norm Wind kn SOILM Soil Moisture Content kg/m^2 SOILW Volumetric Soil Moisture Content Proportion SOTYP Soil Type SPAcc Storm Total Precip mm SPBARO Barotropic Velocity m/s SPC Current Speed m/s SPC Surface Current Speed m/s Spd24Chg Spd24Chg kn sRank Feature Strength Rank SRMl Storm Relative Flow Vectors LM m/s SRMlM Storm Relative Flow Mag LM m/s SRMm Storm Relative Flow Vecs (Mean Wind) m/s SRMmM Storm Relative Flow Mag (Mean Wind) m/s SRMr Storm Relative Flow Vecs (RM) m/s SRMrM Storm Relative Flow Mag (RM) m/s SSAcc Storm Total Snow mm SSi Isentropic Static Stability hPa/K SSP Significant Severe Parameter SSRUN Storm Surface Runoff kg/m^2 St-Pr Stable Precipitation mm St-Pr1hr 1 hr Stable Precipitation mm St-Pr2hr 2 hr Stable Precipitation mm St-Pr3hr 3 hr Stable Precipitation mm staName StaName stationId Station Id C stdDewpoint Std Dewpoint K stdMaxWindSpeed Std Max Wind Speed m/s stdSkyCover Std Sky Cover stdTemperature Std Temperature K stdWindDir Std Wind Direction stdWindSpeed Std Wind Speed m/s STP Sig. Tornado Parameter (>1 Sig Tor) STP1 Sig. 
Tornado Parameter (>1 Sig Tor) STRM Stream Function m^2/s StrmMot Storm Motion kn StrTP Strong Tornado Parameter m/s^2 SuCP Supercell Composite Parameter SUNSD Sunshine Duration s SuperLayerCompositeReflectivity Super Layer Composite Reflectivity (33-60 kft) dBZ SVV Sigma Coordinate Vertical Velocity /s SWDIR Direction of Swell Waves deg SWdir Swell Direction swe NOHRSC Snow Water Equivalent in SWELL Significant Height of Swell Waves m SWELL Swell Height m SWHR Solar Radiative Heating Rate K/s SWLEN Mean length of swell waves m SWPER Mean Period of Swell Waves s SWPER Swell Period s SWSTP Steepness of swell waves swtIdx Sweat Index SynPrecip24Hr SynPrecip24Hr mm SynthPrecipRateID QPE - Synthetic Precip Rate ID T Temperature K T Temperature K T24Chg T24Chg \u00b0F T24hr 24 hr Temperature K T_001 Prob of Temp Lstn 0C % T_001_bin Binary Prob of Temp Lstn 0C T_001_perts Prob of Temp Lstn 0C Perturbations T_avg Temperature Ensemble Mean K T_perts Temperature Perturbations K T_std Temperature Ensemble Std Dev K Ta Temperature Anomaly K TAdv Temperature Adv K/s Tc1 Prob Temp < 0 C % TCC Total Cloud Cover % TCCerranl Total Cloud Cover Error Analysis % TCICON Total Column-Integrated Condensate kg/m^2 TCLSW Total Column Integrated Supercooled Liquid Water kg/m^2 TCOLG Total Column Integrated Graupel kg/m^2 TCOLI Total Column-Integrated Cloud Ice kg/m^2 TCOLM Total Column Integrated Melting Ice kg/m^2 TCOLR Total Column Integrated Rain kg/m^2 TCOLS Total Column Integrated Snow kg/m^2 TCOLW Total Column-Integrated Cloud Water kg/m^2 TCOND Total Condensate kg/kg Tdef Total Deformation /s*100000.0 Tdend Dendritic Growth Temperatures K Terranl Temperature Analysis Uncertainty K Terranl Temperature Error Analysis K TGrd Temperature Gradient K/m TGrdM Temperature Grad Mag K/m ThetaE Theta E K ThGrd Temperature Gradient \u2103/m Thom5 S-R Flow Thom5a S-R Flow Thom6 S-R Flow Suggests Tor Supercells ThP Thunderstorm probability % ThP Thunderstorm Probability % ThP12hr 12hr 
Thunderstorm probability % ThP3hr 3hr Thunderstorm probability % ThP6hr 6hr Thunderstorm probability % ThPcat Categorical thunderstorm TiltAng Radar Tilt Angle deg TKE Turb Kin Energy J/kg TKE Turbulent Kinetic Energy J/kg Tmax Layer Max Temperature K TmDpD Temp minus Dewp Dep Tmean Temperature mean K Tmin Layer Min Temperature K Topo Topography m TORi BRNSHR,EHI,LRate>3C/km,CIN < 150 TORi2 BRNSHR,EHI,0-2km LRate > 3C/km TotQi Isentropic Total Moisture g\u00b7hPa/(kg\u00b7K) TOTSN 24hr Snowfall m TOTSN12hr 12hr Snowfall m TOZNE Total Ozone DU TP Precipitation mm TP Total Precipitation mm TP120hr 5 Day Total Gridded Precip in TP12c1 12-hr POP > 0.01 in % TP12c2 12-hr POP > 0.05 in % TP12c3 12-hr POP > 0.10 in % TP12c4 12-hr POP > 0.25 in % TP12c5 12-hr POP > 0.50 in % TP12c6 12-hr POP > 1.00 in % TP12c7 12-hr POP > 1.50 in % TP12c8 12-hr POP > 2.00 in % TP12hr 12 Hr Accum Precip mm TP12hr Total Precipitation(12 hours) mm TP12mean 12-hr Total Precip mean mm TP12sprd 12-hr Total Precip sprd mm TP168hr 7 Day Total Gridded Precip mm TP18hr Total Precipitation(18 hours) mm TP1hr 1 Hr Accum Precip mm TP1hr Total Precipitation(1 hour) mm TP24c1 24-hr POP > 0.01 in % TP24c2 24-hr POP > 0.05 in % TP24c3 24-hr POP > 0.10 in % TP24c4 24-hr POP > 0.25 in % TP24c5 24-hr POP > 0.50 in % TP24c6 24-hr POP > 1.00 in % TP24c7 24-hr POP > 1.50 in % TP24c8 24-hr POP > 2.00 in % TP24hr 24 Hr Accum Precip mm TP24hr Total Precipitation(24 hours) mm TP24hr_avg 24hr Precip Ensemble Mean mm TP24hr_perts 24hr Precip Perturbations mm TP24hr_std 24hr Precip Ensemble Std Dev mm TP24mean 24-hr Total Precip mean mm TP24sprd 24-hr Total Precip sprd mm TP36hr 36 Hr Accum Precip mm TP3c1 3-hr POP > 0.01 in % TP3c2 3-hr POP > 0.05 in % TP3c3 3-hr POP > 0.10 in % TP3c4 3-hr POP > 0.25 in % TP3c5 3-hr POP > 0.50 in % TP3c6 3-hr POP > 1.00 in % TP3c7 3-hr POP > 1.50 in % TP3c8 3-hr POP > 2.00 in % TP3hr 3 Hr Accum Precip mm TP3hr Total Precipitation(3 hours) mm TP3mean 3-hr Total Precip mean mm TP3sprd 
3-hr Total Precip sprd mm TP48hr 48 Hr Accum Precip mm TP48hr Total Precipitation(48 hours) mm TP6c1 6-hr POP > 0.01 in % TP6c2 6-hr POP > 0.05 in % TP6c3 6-hr POP > 0.10 in % TP6c4 6-hr POP > 0.25 in % TP6c5 6-hr POP > 0.50 in % TP6c6 6-hr POP > 1.00 in % TP6c7 6-hr POP > 1.50 in % TP6c8 6-hr POP > 2.00 in % TP6hr 6 Hr Accum Precip mm TP6hr Total Precipitation(6 hours) mm TP6hr_avg 6hr Precip Ensemble Mean mm TP6hr_perts 6hr Precip Perturbations mm TP6hr_std 6hr Precip Ensemble Std Dev mm TP6mean 6-hr Total Precip mean mm TP6sprd 6-hr Total Precip sprd mm TP72hr 3 Day Total Gridded Precip mm TP9hr Total Precipitation(9 hours) mm TP_ACR ACR Precip in TP_ALR ALR Precip in TP_avg Precip Ensemble Mean mm TP_ECMWF ECMWF Precipitation in TP_ECMWF12hr ECMWF 12 Hr Accum Precip in TP_FWR FWR Precip in TP_HPC HPC Precip in TP_KRF KRF Precip in TP_MSR MSR Precip in TP_ORN ORN Precip in TP_perts Precip Perturbations mm TP_PTR PTR Precip in TP_RHA RHA Precip in TP_RSA RSA Precip in TP_std Precip Ensemble Std Dev mm TP_STR STR Precip in TP_TAR TAR Precip in TP_TIR TIR Precip in TP_TUA TUA Precip in TPFI Turbulence Index TPFI Turbulence Potential Forecast Index TP-GFS Total Precipitation for GFS mm tpHPC HPC Precip in tpHPCndfd Precipitation mm TPmodel Model Run Precip mm TPrun Run Accum Pcpn mm TPrun_avg Accum Precip Ensemble Mean mm TPrun_perts Accum Precip Perturbations mm TPrun_std Accum Precip Ensemble Std Dev mm TPx12x6 12-6 Hr Accum Precip mm TPx1x3 3x1 Hr Accum Precip mm TPx3 3 Hr Accum Precip mm TQIND TQ Index 12=Cold Pool 17=Embedded Convection C TRANS Transpiration W/m^2 transparentMaritimeSky ft transparentMaritimeSkySym ft transparentSky ft transparentSky2 ft transparentSky3 ft transparentSkySym ft transparentSkySym2 ft transparentSkySym3 ft TransWind TransWind kts TShrMi S=0-6km Shear Supports Scells TSLSA 3 hr Pres Change hPa TSNOW Total Snow kg/m^2 TSOIL Soil Temperature K Tsprd Temperature spread K TSRWE Total Snowfall Rate Water Equivalent kg/m^2/s Tstk Temp 
Stack K tTOT Total Totals C TURB Turbulence Index TV Virtual Temperature K TW Wet Bulb Temp K tWind Thermal Wind kn tWindU U Component of Thermal Wind kn tWindV V Component of Thermal Wind kn TwMax Layer Max Wet-bulb Temperature K TwMin Layer Min Wet-bulb Temperature K TWO Two Twstk Wet-bulb Temp Stack K TxSM Filtered-500km Temp C U-GWD Zonal Flux of Gravity Wave Stress N/m^2 UFLX Momentum Flux, U-Component N/m^2 uFX Geo Momentum m/s ulSnoRat ULWRF Comp Refl dBZ ULWRF Upward Long-Wave Rad. Flux W/m^2 UPHL Updraft Helicity m^2/s^2 USTM U-Component of Storm Motion m/s USWRF Reflectivity dBZ USWRF Upward Short-Wave Radiation Flux W/m^2 uv2 Horz Variance m^2/s^2 uW u Component of Wind m/s uW U-Component of Wind m/s uWerranl uWmean m/s uWsprd uWStk U Stack m/s uzfwc Upper Zone Free Water Content % uztwc Upper Zone Tension Water Content % V-GWD Meridional Flux of Gravity Wave Stress N/m^2 VAdv Vorticity Adv /s*1.0E9 VAdvAdvection Vorticity Adv /s VAPP Vapor Pressure Pa VBDSF Visible Beam Downward Solar Flux W/m^2 VEG Vegetation % vertCirc Vertical Circulation VFLX Momentum Flux, V-Component N/m^2 VGP Vort Gen Param VGTYP Vegetation Type Integer (0-13) VII Vertically Integrated Ice (VII) kg/m^2 VILIQ Vertically Integrated Liquid (VIL) kg/m^2 Vis Visibility m Vis Visibility m visbyIFR mi visbyLIFR mi visbyMVFR mi visbyVFR mi Visc1 Prob Sfc Visibility < 1 mile % Visc2 Prob Sfc Visibility < 3 miles % Visc23 Prob Sfc Visibility < 5 miles % visCat Categorical visibility Viserranl Visibility Analysis Uncertainty m Viserranl Visibility Error Analysis m Visible Visible Imagery VPT Virtual Potential Temperature K VRATE Ventilation Rate m^2/s vSmthW Vertically Smoothed Wind m/s VSS Vertical Shear Speed /s VSTM V-Component of Storm Motion m/s VTMP Virtual Temperature K vTOT Vertical Totals VUCSH Vertical u-component shear /s VV Vertical velocity m/s VVCSH Vertical v-component shear /s vW v Component of Wind m/s vW V-Component of Wind m/s vWerranl vWmean m/s vwpSample VWP Sample VWSH 
Vertical Speed Shear /s vWsprd vWStk V Stack m/s w2 Vert Variance m^2/s^2 WarmRainProbability Probability of Warm Rain % water_depth Hillslope Water Depth in WaterVapor Water Vapor Imagery K WATR Water Runoff kg/m^2 WCD Warm Cloud Depth Approx.: Frzlvl-LCL Thickness m WD Wind Direction (from which blowing) deg WD Wind direction deg WDea Wind Direction Analysis Uncertainty deg WDEPTH Geometric Depth Below Sea Surface m WDerranl Wind Direction Error Analysis deg wDiv Wind Divergence /s WDmean Wind Direction mean deg WEASD Water Equiv accum snow depth m WEASD Water Equivalent of Accumulated Snow Depth mm WGH 5-Wave Geopotential Height gpm WGH 5-wave geopotential height m WGS Wind Gust Speed m/s WGS Wind Gust Speed m/s WGS1hr Max 1-hr Wind Gust Speed m/s WGSea Wind Gust Speed Analysis Uncertainty m/s WGSerranl Wind Gust Speed Error Analysis m/s WGSMX1hr Max Hourly Wind Gust m/s WILT Wilting Point Proportion Wind Wind m/s Wind_avg Wind Ensemble Mean m/s Wind_perts Wind Perturbations m/s Windmean Mean Wind kn WINDPROB Wind Probability % WMIXE Wind Mixing Energy J WndChl Wind Chill K WS Wind Speed m/s WSc1 Prob SFC wind speed > 25 kt % WSc2 Prob SFC wind speed > 34 kt % WSc3 Prob SFC wind speed > 48 kt % WSc4 Prob SFC wind speed > 50 kt % WSc6 Prob SFC wind speed > 20 kt % WSc7 Prob SFC wind speed > 30 kt % WSc8 Prob SFC wind speed > 40 kt % WSerranl Wind Speed Error Analysis m/s WSmean Wind Speed mean m/s wSp Wind speed m/s wSp_001 Prob of Wind Grtn 40kts % wSp_001_bin Binary Prob of Wind Grtn 40kts wSp_001_perts Prob of Wind Grtn 40kts Perts wSp_002 Prob of Wind Grtn 50kts % wSp_002_bin Binary Prob of Wind Grtn 50kts wSp_002_perts Prob of Wind Grtn 50kts Perts wSp_003 Prob of Wind Grtn 60kts % wSp_003_bin Binary Prob of Wind Grtn 60kts wSp_003_perts Prob of Wind Grtn 60kts Perts wSp_004 Prob of Wind Grtn 30kts % wSp_004_bin Binary Prob of Wind Grtn 30kts wSp_004_perts Prob of Wind Grtn 30kts Perts wSp_avg Windspeed Ensemble Mean m/s wSp_perts Windspeed Perturbations 
m/s wSp_std Windspeed Ensemble Std Dev m/s wSpea Wind Speed Analysis Uncertainty kn wSpmean Mean Windspeed kt wSpsprd Windspeed spread kt WSsprd Wind Speed sprd m/s WVDIR Direction of Wind Waves deg WVdir Wind Wave Direction wvHeight wvHeight m WVHGT Significant Height of Wind Waves m WVHGT Wind Wave Height m WVLEN Mean length of wind waves m WVPER Mean Period of Wind Waves s WVPER Wind Wave Period s wvPeriod wvPeriod WVSTP Steepness of wind waves wvType wvType wW w Component of Wind cm/s wx Weather zAGL Height AGL m ZDR Differential Reflectivity dB","title":"AWIPS Grid Parameters"},{"location":"appendix/appendix-wsr88d/","text":"Product Name Mnemonic ID Levels Res Elevation Reflectivity (Z) Z 19 16 100 .5 Reflectivity (Z) Z 19 16 100 1.5 Reflectivity (Z) Z 19 16 100 2.5 Reflectivity (Z) Z 19 16 100 3.5 Reflectivity (Z) Z 20 16 200 .5 Velocity (V) V 27 16 100 .5 Velocity (V) V 27 16 100 1.5 Velocity (V) V 27 16 100 2.5 Velocity (V) V 27 16 100 3.5 Storm Rel Velocity (SRM) SRM 56 16 100 .5 Storm Rel Velocity (SRM) SRM 56 16 100 1.5 Storm Rel Velocity (SRM) SRM 56 16 100 2.5 Storm Rel Velocity (SRM) SRM 56 16 100 3.5 Composite Ref (CZ) CZ 37 16 100 -1 Composite Ref (CZ) CZ 38 16 400 -1 Lyr Comp Ref Max (LRM) Level 1 LRM 65 8 0 -1 Lyr Comp Ref Max (LRM) Level 2 LRM 66 8 0 -1 Lyr Comp Ref Max (LRM) Level 3 LRM 90 8 0 -1 Lyr Comp Ref MAX (APR) APR 67 16 0 -1 Echo Tops (ET) ET 41 16 0 -1 Vert Integ Liq (VIL) VIL 57 16 0 -1 One Hour Precip (OHP) OHP 78 16 0 -1 Storm Total Precip (STP) STP 80 16 0 -1 VAD Wind Profile (VWP) VWP 48 0 0 -1 Digital Precip Array (DPA) DPA 81 256 400 -1 Velocity (V) V 25 16 100 .5 Base Spectrum Width (SW) SW 28 8 100 .5 Base Spectrum Width (SW) SW 30 8 100 .5 Severe Weather Probability (SWP) SWP 47 0 100 -1 Storm Tracking Information (STI) STI 58 0 100 -1 Hail Index (HI) HI 59 0 100 -1 Mesocyclone (M) M 60 0 100 -1 Mesocyclone (MD) MD 141 0 0 1 Tornadic Vortex Signature (TVS) TVS 61 0 100 -1 Storm Structure (SS) SS 62 0 100 -1 Supplemental 
Precipitation Data (SPD) SPD 82 0 100 -1 Reflectivity (Z) Z 94 256 100 .5 Reflectivity (Z) Z 94 256 100 1.5 Reflectivity (Z) Z 94 256 100 2.4 Reflectivity (Z) Z 94 256 100 3.4 Reflectivity (Z) Z 94 256 100 4.3 Reflectivity (Z) Z 94 256 100 5.3 Reflectivity (Z) Z 94 256 100 6.2 Reflectivity (Z) Z 94 256 100 7.5 Reflectivity (Z) Z 94 256 100 8.7 Reflectivity (Z) Z 94 256 100 10.0 Reflectivity (Z) Z 94 256 100 12.0 Reflectivity (Z) Z 94 256 100 14.0 Reflectivity (Z) Z 94 256 100 16.7 Reflectivity (Z) Z 94 256 100 19.5 Velocity (V) V 99 256 25 .5 Velocity (V) V 99 256 25 1.5 Velocity (V) V 99 256 25 2.4 Velocity (V) V 99 256 25 3.4 Velocity (V) V 99 256 25 4.3 Velocity (V) V 99 256 25 5.3 Velocity (V) V 99 256 25 6.2 Velocity (V) V 99 256 25 7.5 Velocity (V) V 99 256 25 8.7 Velocity (V) V 99 256 25 10.0 Velocity (V) V 99 256 25 12.0 Velocity (V) V 99 256 25 14.0 Velocity (V) V 99 256 25 16.7 Velocity (V) V 99 256 25 19.5 Super Res Reflectivity (Z) HZ 153 256 25 .5 Super Res Reflectivity (Z) HZ 153 256 25 1.5 Super Res Velocity (V) HV 154 256 25 .5 Super Res Velocity (V) HV 154 256 25 1.5 Super Res Spec Width (SW) HSW 155 256 25 .5 Super Res Spec Width (SW) HSW 155 256 25 1.5 Spectrum Width (SW) SW 30 8 100 1.5 Spectrum Width (SW) SW 28 8 25 1.5 Digital Vert Integ Liq (DVL) DVL 134 256 100 -1 Digital Hybrid Scan Refl (DHR) DHR 32 256 100 -1 Enhanced Echo Tops (EET) EET 135 256 100 -1 Digital Meso Detection (DMD) DMD 149 0 0 16384 TVS Rapid Update (TRU) TRU 143 0 0 16384 User Selectable Lyr Refl (ULR) ULR 137 16 100 -1 Storm Total Precip (STP) STP 138 256 200 -1 1-Hour Snow-Water Equiv (OSW) OSW 144 16 100 -1 1-Hour Snow Depth (OSD) OSD 145 16 100 -1 Storm Tot Snow Depth (SSD) SSD 147 16 100 -1 Storm Tot Snow-Water Equiv (SSW) SSW 146 16 100 -1 Differential Refl (ZDR) ZDR 158 16 100 .5 Differential Refl (ZDR) ZDR 159 256 25 16384 Correlation Coeff (CC) CC 160 16 100 .5 Correlation Coeff (CC) CC 161 256 25 16384 Specific Diff Phase (KDP) KDP 162 16 100 .5 Specific Diff 
Phase (KDP) KDP 163 256 25 16384 Hydrometeor Class (HC) HC 164 16 100 .5 Hydrometeor Class (HC) HC 165 256 25 16384 Melting Layer (ML) ML 166 0 0 16384 Hybrid Hydrometeor Class (HHC) HHC 177 256 25 -1 Digital Inst Precip Rate (DPR) DPR 176 0 25 -1 One Hour Accum (OHA) OHA 169 16 200 -1 User Select Accum (DUA) DUA 173 256 25 -1 User Select Accum (DUA) DUA 173 256 25 -1 Storm Total Accum (STA) STA 171 16 200 -1 Storm Total Accum (DSA) DSA 172 256 25 -1 One Hour Diff (DOD) DOD 174 256 25 -1 Storm Total Diff (DSD) DSD 175 256 25 -1","title":"WSR-88D Product Table"},{"location":"appendix/common-problems/","text":"Common Problems \uf0c1 All Operating Systems \uf0c1 Removing caveData \uf0c1 Removing caveData (flushing the local cache) should be one of the first troubleshooting steps to take when experiencing weird behavior in CAVE. The cache lives in a folder called caveData , hence why this process is also referred to as removing or deleting caveData. Linux \uf0c1 For Linux users, the easiest way is to open a new terminal and run the following command: rm -rf ~/caveData Windows \uf0c1 For Windows users, simply delete the caveData folder in your home user directory: Mac \uf0c1 For Mac users, the easiest way is to open a new terminal and run the following command: rm -rf ~/Library/caveData Disappearing Configurations \uf0c1 If you ever notice some of the following settings you've configured/saved disappear from CAVE: Saved Displays or Procedures NSHARP settings (line thickness, etc) Colormap settings StyleRule settings This is not a fully exhaustive list, so if something else has disappeared it might be the same underlying issue still. Then it is likely we have recently changed our production EDEX server. There is a good chance we can recover your settings. To do so, please send a short email to support-awips@unidata.ucar.edu with the topic \"Missing Configurations\", and include the username(s) of the computer(s) you use to run CAVE. 
Remotely Connecting to CAVE \uf0c1 Since the pandemic began, many users have asked if they can use X11 forwarding or ssh tunneling to remotely connect to CAVE machines. This is not recommended or supported , and CAVE crashes in many different ways and expresses strange behavior as well. We highly recommend you download the appropriate CAVE installer on your local machine, if that is an option. If that is not an option, then the only remote access we recommend is using some type of VNC. RealVNC and nomachine are two options that are in use with positive outcomes. UltraVNC may be another option, but may have quite a delay. There may also be other free or paid software available that we are not aware of. It is likely that any VNC option you choose will also require some software or configuration to be set on the remote machine, and this will likely require administrative privileges. CAVE Spring Start Up Error \uf0c1 If you encounter the error below, please see one of our solution methods for resolving: CAVE's Spring container did not initialize correctly and CAVE must shut down. We have found the reason for this failure is because the host machine is set to use a language other than English (ie. Spanish, French, etc). To resolve this issue, either: Switch your system to English, when using CAVE or Use our Virtual Machine option . This option allows your actual machine to stay in whichever language you choose, while allowing you to run CAVE in an environment set to English. Although we list this installation under the Windows OS, this can also be done on Linux. The VM option has one notable drawback at the moment -- it cannot render RGB satellite products. Products Not Loading Properly \uf0c1 This problem is most commonly seen with the direct Windows installation. It can also manifest in the Mac installation (and is possible on Linux), and the root of the problem is not having Python installed properly for CAVE to use the packages. 
There are derived products which use the local machine to create and render the data. This creation is dependent upon python and its required packages working correctly. The dataset will be available in the menus and product browser, but when loaded, no data is drawn on the editor, but an entry is added to the legend. You may see an error that mentions the python package, jep . Known datasets this can affect (this is not a comprehensive list): Model Winds Metars Winds METAR Station Plot GFS Precip Type Windows \uf0c1 To correct this issue on Windows: Make sure you only have the latest version of CAVE installed (which now fully bundles the proper version of Python with the application) If you have ever previously added PYTHONHOME , PYTHONPATH , or JAVA_HOME to your user or system variables, please delete those. Open the Environment Variables window by typing \"env\" in the start bar Mac \uf0c1 To correct this issue on Mac: Make sure you have the latest version of CAVE installed (which now fully bundles the proper version of Python with the application) Linux \uf0c1 To correct this issue on Linux: When running which python from a terminal, make sure /awips2/python/ is returned, if not, reset that environment variable, or re-run the awips_install.sh script from our installation instructions Windows \uf0c1 CAVE Map Display in Lower Left Quadrant - Windows \uf0c1 If you start up CAVE in Windows and notice the map is showing up only in the bottom left quadrant of your display, you will just need to tweak a few display settings. Try following these steps to fix your issue: Right-click on the no_env.exe, select Properties This is not the batch file (CAVE.bat) that gets installed as the CAVE shortcut on the Desktop, the no_env.exe is located in C:\\Users\\[your_username]\\AppData\\Roaming\\UCAR Unidata\\AWIPS CAVE\\no_env.exe . 
Select the Compatibility tab Click \"Change High DPI Settings\" At the bottom enable \"Override High DPI scaling behavior\" Change the dropdown from Application to System Windows CAVE Start Up Error \uf0c1 This should no longer be an issue for our v20 release of AWIPS. One common error some users are seeing manifests itself just after selecting an EDEX server to connect to. The following error dialogs may show up: Error purging logs Error instantiating workbench: null These errors are actually happening because the Windows machine is using IPv6, which is not compatible with AWIPS at this time. To fix the issue simply follow these steps: These screenshots may vary from your system. These instructions are per connection , so if you use multiple connections or switch between wired and wireless connections, you'll need to do the following for each of those connections so that CAVE will always run properly. 1. Close all error windows and any open windows associated with CAVE. 2. In the Windows search field, search for \"control panel\". 3. Once in the Control Panel, look for \"Network and Sharing Center\". 4. Select the adapter for your current connection (should be either \"Ethernet\" or \"Wi-Fi\"). 5. Click on \"Properties\". 6. Uncheck \"Internet Protocol Version 6 (TCP/IPv6)\" and select OK. You may need to restart your machine for this to take effect 7. Restart CAVE. macOS \uf0c1 Monterey CAVE Warning \uf0c1 If you are running MacOS Monterey, you may see the following message when starting CAVE: Monterey versions 12.3 or newer will not support our production (v18) CAVE. Please download and install our beta v20 CAVE for newer MacOS Versions to avoid this issue. 
White Boxes for Surface Resources \uf0c1 If you do not have an NVIDIA graphics card and driver, you may see \"boxes\" drawn on the editor for some of the products ( METARS Station Plots and Surface Winds are the resources we're aware of), as shown below: You may be able to fix this issue: Check what graphics cards are available on your machine, by going to the Apple menu (far left, upper corner) > About This Mac > Overview tab (default): If you see two entries at the Graphics line, like the image shown above, then you have two graphics cards on your system. Intel graphics cards may be able to render our products properly. In this case, you can \"force\" your computer to use the Intel card by running the following in a terminal: sudo pmset -[a|b|c] gpuswitch 0 Where [a|b|c] is only one of those options, which mean: a: adjust settings for all scenarios b: adjust settings while running off battery c: adjust settings while connected to charger The argument 0 sets the computer to use the dedicated GPU (in our case above the Intel GPU). The two other options for that argument are: 1: automatic graphics switching 2: integrated GPU It may be smart to run pmset -g first, so you can see what the current gpuswitch setting is (likely 1 ), that way you can revert the settings if you want them back to how they were, when not using CAVE. Model Data Not Rendering \uf0c1 This behavior has appeared with MacOS Sonoma (v14) -- model data is no longer loading and you see the following errors on the screen or in the AlertView: ERROR: An internal error occurred during: \"Initializing...\". ERROR: An internal error occurred during: \"Product Loader\". ERROR: An internal error occurred during \"Initializing...\". If you encounter this behavior, please close CAVE, clear caveData as described above , then restart CAVE and try to load the data again. 
If you still experience issues, please let us know at support-awips@unidata.ucar.edu Linux \uf0c1 Troubleshooting Uninstalling EDEX \uf0c1 Sometimes yum can get in a weird state and not know what AWIPS groups have been installed. For example if you are trying to remove AWIPS you may see an error: yum groupremove \"AWIPS EDEX Server\" Loaded plugins: fastestmirror, langpacks Loading mirror speeds from cached hostfile * base: mirror.dal.nexril.net * elrepo: ftp.osuosl.org * epel: mirrors.xmission.com * extras: mirrors.cat.pdx.edu * updates: mirror.mobap.edu No environment named AWIPS EDEX Server exists Maybe run: yum groups mark remove (see man yum) No packages to remove from groups To solve this issue, mark the group you want to remove and then try removing it again: yum groups mark remove \"AWIPS EDEX Server\" yum groupremove \"AWIPS EDEX Server\" Check to make sure your /etc/yum.repos.d/awips2.repo file has enabled=1 .","title":"Common Problems"},{"location":"appendix/common-problems/#common-problems","text":"","title":"Common Problems"},{"location":"appendix/common-problems/#all-operating-systems","text":"","title":"All Operating Systems"},{"location":"appendix/common-problems/#removing-cavedata","text":"Removing caveData (flushing the local cache) should be one of the first troubleshooting steps to take when experiencing weird behavior in CAVE. 
The cache lives in a folder called caveData , hence why this process is also referred to as removing or deleting caveData.","title":"Removing caveData"},{"location":"appendix/common-problems/#linux","text":"For Linux users, the easiest way is to open a new terminal and run the following command: rm -rf ~/caveData","title":"Linux"},{"location":"appendix/common-problems/#windows","text":"For Windows users, simply delete the caveData folder in your home user directory:","title":"Windows"},{"location":"appendix/common-problems/#mac","text":"For Mac users, the easiest way is to open a new terminal and run the following command: rm -rf ~/Library/caveData","title":"Mac"},{"location":"appendix/common-problems/#disappearing-configurations","text":"If you ever notice some of the following settings you've configured/saved disappear from CAVE: Saved Displays or Procedures NSHARP settings (line thickness, etc) Colormap settings StyleRule settings This is not a fully exhaustive list, so if something else has disappeared it might be the same underlying issue still. Then it is likely we have recently changed our production EDEX server. There is a good chance we can recover your settings. To do so, please send a short email to support-awips@unidata.ucar.edu with the topic \"Missing Configurations\", and include the username(s) of the computer(s) you use to run CAVE.","title":"Disappearing Configurations"},{"location":"appendix/common-problems/#remotely-connecting-to-cave","text":"Since the pandemic began, many users have asked if they can use X11 forwarding or ssh tunneling to remotely connect to CAVE machines. This is not recommended or supported , and CAVE crashes in many different ways and expresses strange behavior as well. We highly recommend you download the appropriate CAVE installer on your local machine, if that is an option. If that is not an option, then the only remote access we recommend is using some type of VNC. 
RealVNC and nomachine are two options that are in use with positive outcomes. UltraVNC may be another option, but may have quite a delay. There may also be other free or paid software available that we are not aware of. It is likely that any VNC option you choose will also require some software or configuration to be set on the remote machine, and this will likely require administrative privileges.","title":"Remotely Connecting to CAVE"},{"location":"appendix/common-problems/#cave-spring-start-up-error","text":"If you encounter the error below, please see one of our solution methods for resolving: CAVE's Spring container did not initialize correctly and CAVE must shut down. We have found the reason for this failure is because the host machine is set to use a language other than English (ie. Spanish, French, etc). To resolve this issue, either: Switch your system to English, when using CAVE or Use our Virtual Machine option . This option allows your actual machine to stay in whichever language you choose, while allowing you to run CAVE in an environment set to English. Although we list this installation under the Windows OS, this can also be done on Linux. The VM option has one notable drawback at the moment -- it cannot render RGB satellite products.","title":"CAVE Spring Start Up Error"},{"location":"appendix/common-problems/#products-not-loading-properly","text":"This problem is most commonly seen with the direct Windows installation. It can also manifest in the Mac installation (and is possible on Linux), and the root of the problem is not having Python installed properly for CAVE to use the packages. There are derived products which use the local machine to create and render the data. This creation is dependent upon python and its required packages working correctly. The dataset will be available in the menus and product browser, but when loaded, no data is drawn on the editor, but an entry is added to the legend. 
You may see an error that mentions the python package, jep . Known datasets this can affect (this is not a comprehensive list): Model Winds Metars Winds METAR Station Plot GFS Precip Type","title":"Products Not Loading Properly"},{"location":"appendix/common-problems/#windows_1","text":"To correct this issue on Windows: Make sure you only have the latest version of CAVE installed (which now fully bundles the proper version of Python with the application) If you have ever previously added PYTHONHOME , PYTHONPATH , or JAVA_HOME to your user or system variables, please delete those. Open the Environment Variables window by typing \"env\" in the start bar","title":"Windows"},{"location":"appendix/common-problems/#mac_1","text":"To correct this issue on Mac: Make sure you have the latest version of CAVE installed (which now fully bundles the proper version of Python with the application)","title":"Mac"},{"location":"appendix/common-problems/#linux_1","text":"To correct this issue on Linux: When running which python from a terminal, make sure /awips2/python/ is returned, if not, reset that environment variable, or re-run the awips_install.sh script from our installation instructions","title":"Linux"},{"location":"appendix/common-problems/#windows_2","text":"","title":"Windows"},{"location":"appendix/common-problems/#cave-map-display-in-lower-left-quadrant-windows","text":"If you start up CAVE in Windows and notice the map is showing up only in the bottom left quadrant of your display, you will just need to tweak a few display settings. Try following these steps to fix your issue: Right-click on the no_env.exe, select Properties This is not the batch file (CAVE.bat) that gets installed as the CAVE shortcut on the Desktop, the no_env.exe is located in C:\\Users\\[your_username]\\AppData\\Roaming\\UCAR Unidata\\AWIPS CAVE\\no_env.exe . 
Select the Compatibility tab Click \"Change High DPI Settings\" At the bottom enable \"Override High DPI scaling behavior\" Change the dropdown from Application to System","title":"CAVE Map Display in Lower Left Quadrant - Windows"},{"location":"appendix/common-problems/#windows-cave-start-up-error","text":"This should no longer be an issue for our v20 release of AWIPS. One common error some users are seeing manifests itself just after selecting an EDEX server to connect to. The following error dialogs may show up: Error purging logs Error instantiating workbench: null These errors are actually happening because the Windows machine is using IPv6, which is not compatible with AWIPS at this time. To fix the issue simply follow these steps: These screenshots may vary from your system. These instructions are per connection , so if you use multiple connections or switch between wired and wireless connections, you'll need to do the following for each of those connections so that CAVE will always run properly. 1. Close all error windows and any open windows associated with CAVE. 2. In the Windows search field, search for \"control panel\". 3. Once in the Control Panel, look for \"Network and Sharing Center\". 4. Select the adapter for your current connection (should be either \"Ethernet\" or \"Wi-Fi\"). 5. Click on \"Properties\". 6. Uncheck \"Internet Protocol Version 6 (TCP/IPv6)\" and select OK. You may need to restart your machine for this to take effect 7. Restart CAVE.","title":"Windows CAVE Start Up Error"},{"location":"appendix/common-problems/#macos","text":"","title":"macOS"},{"location":"appendix/common-problems/#monterey-cave-warning","text":"If you are running MacOS Monterey, you may see the following message when starting CAVE: Monterey versions 12.3 or newer will not support our production (v18) CAVE. 
Please download and install our beta v20 CAVE for newer MacOS versions to avoid this issue.","title":"Monterey CAVE Warning"},{"location":"appendix/common-problems/#white-boxes-for-surface-resources","text":"If you do not have an NVIDIA graphics card and driver, you may see \"boxes\" drawn on the editor for some of the products ( METARS Station Plots and Surface Winds are the resources we're aware of), as shown below: You may be able to fix this issue: Check what graphics cards are available on your machine, by going to the Apple menu (far left, upper corner) > About This Mac > Overview tab (default): If you see two entries at the Graphics line, like the image shown above, then you have two graphics cards on your system. Intel graphics cards may be able to render our products properly. In this case, you can \"force\" your computer to use the Intel card by running the following in a terminal: sudo pmset -[a|b|c] gpuswitch 0 Where [a|b|c] is only one of those options, which mean: a: adjust settings for all scenarios b: adjust settings while running off battery c: adjust settings while connected to charger The argument 0 sets the computer to use the integrated GPU (in our case above, the Intel GPU). The two other options for that argument are: 1: dedicated GPU 2: automatic graphics switching It may be smart to run pmset -g first, so you can see what the current gpuswitch setting is (likely 2 ), that way you can revert the settings if you want them back to how they were, when not using CAVE.","title":"White Boxes for Surface Resources"},{"location":"appendix/common-problems/#model-data-not-rendering","text":"This behavior has appeared with MacOS Sonoma (v14) -- model data is no longer loading and you see the following errors on the screen or in the AlertView: ERROR: An internal error occurred during: \"Initializing...\". ERROR: An internal error occurred during: \"Product Loader\". ERROR: An internal error occurred during \"Initializing...\". 
If you encounter this behavior, please close CAVE, clear caveData as described above , then restart CAVE and try to load the data again. If you still experience issues, please let us know at support-awips@unidata.ucar.edu","title":"Model Data Not Rendering"},{"location":"appendix/common-problems/#linux_2","text":"","title":"Linux"},{"location":"appendix/common-problems/#troubleshooting-uninstalling-edex","text":"Sometimes yum can get in a weird state and not know what AWIPS groups have been installed. For example if you are trying to remove AWIPS you may see an error: yum groupremove \"AWIPS EDEX Server\" Loaded plugins: fastestmirror, langpacks Loading mirror speeds from cached hostfile * base: mirror.dal.nexril.net * elrepo: ftp.osuosl.org * epel: mirrors.xmission.com * extras: mirrors.cat.pdx.edu * updates: mirror.mobap.edu No environment named AWIPS EDEX Server exists Maybe run: yum groups mark remove (see man yum) No packages to remove from groups To solve this issue, mark the group you want to remove and then try removing it again: yum groups mark remove \"AWIPS EDEX Server\" yum groupremove \"AWIPS EDEX Server\" Check to make sure your /etc/yum.repos.d/awips2.repo file has enabled=1 .","title":"Troubleshooting Uninstalling EDEX"},{"location":"appendix/educational-resources/","text":"Educational Resources \uf0c1 Here at NSF Unidata, we want to provide as many resources as possible to make our tools and applications easy to use. For AWIPS we currently have a new eLearning course that is specific to CAVE. We also have a suite of Jupyter Notebooks that are meant to provide a detailed overview of many capabilities of python-awips. CAVE eLearning Course \uf0c1 Learn AWIPS CAVE is our online educational course for those interested in learning about CAVE. Access \uf0c1 Please create an account on our eLearning site , then self-enroll in Learn AWIPS CAVE . 
Content \uf0c1 Learn AWIPS CAVE is specifically tailored to content regarding CAVE -- the local graphical application used to view weather data. The following topics and capabilities are covered throughout the course: Launching CAVE Navigating the interface Modifying product appearances Understanding the time match basis Creating publication-quality graphics Exploring various CAVE layouts Saving and loading procedures and displays Using radar displays Using baselines and points Creating time series displays Creating vertical cross section displays Using the NSHARP editor for soundings Viewing model soundings Prerequisites \uf0c1 Required: A supported web browser NSF Unidata's CAVE version 20.3.2 installed on a supported operating system Recommended: A keyboard with a numpad and mouse with a scrollwheel Second monitor Design \uf0c1 Learn AWIPS CAVE is designed for those new to AWIPS or for those seeking to learn best practices. The course is organized into modular sections with supporting lessons, allowing for spaced learning or completion in multiple class or lab sessions. Each section concludes with a quiz to assess learning, and results can be requested by instructors or supervisors for their classes/teams. Below is a snapshot taken from the course. Lessons are tied to relevant learning objectives . Lessons are scaffolded such that each skill builds upon the next. Tutorials, challenges, and assessments are designed to support higher-order thinking skills and learning retention. Support \uf0c1 If you experience any technical issues with our online course, please contact us at: support-elearning@unidata.ucar.edu Python-AWIPS eLearning Course \uf0c1 Learn Python-AWIPS is our online educational course for those interested in learning about Python-AWIPS . Access \uf0c1 Please create an account on our eLearning site , then self-enroll in Learn Python-AWIPS . 
Content \uf0c1 Learn Python-AWIPS is designed for new users of Python-AWIPS who have some background in both Python and CAVE. Through tutorials, challenges, and demonstrations, you will learn the basics for working with EDEX resources through Python. The following topics and capabilities are covered throughout the course: Programmatically explore the resources available on an EDEX server Make a request to an EDEX for data See examples of data manipulation Plot requested data Prerequisites \uf0c1 Required: A supported web browser Python3 Conda Git Python-AWIPS using the Source Code with Examples Install instructions Support \uf0c1 If you experience any technical issues with our online course, please contact us at: support-elearning@unidata.ucar.edu Python-AWIPS Example Notebooks \uf0c1 In addition to CAVE, AWIPS also has a Python package called python-awips which allows access to all data on an EDEX server. We have created a suite of Jupyter Notebooks as examples for how to use various functions of python-awips. Access \uf0c1 All of our Notebooks can be downloaded and accessed locally by following the source code installation instructions found on our python-awips website . Additionally, non-interactive webpage renderings of each of the Notebooks are also available for quick and easy references. Content \uf0c1 Our python-awips Notebooks span a wide range of topics, but generally cover the following: Investigating what data is available on an EDEX server Accessing and filtering desired data based on time and location Plotting and analyzing datasets Specific examples for various data types: satellite imagery, model data, soundings, surface obs, and more YouTube Channel and Playlist \uf0c1 NSF Unidata has a YouTube channel where we publish videos about all of our software packages. Specifically we also have a playlist dedicated to AWIPS videos. Access \uf0c1 All NSF Unidata videos can be accessed here on our channel. 
All AWIPS videos can be found on the AWIPS Playlist . Content \uf0c1 Our AWIPS videos cover a wide range of topics, but include some of the following themes: AWIPS topic overviews Instructional videos (ex. how to install CAVE) In-depth walkthroughs on CAVE functionality Python-AWIPS notebook examples AWIPS Tips Blog Series \uf0c1 AWIPS Tips is a bi-weekly (every two weeks) blog series that is posted on our NSF Unidata blogs page. Entries in the series cover topics relating to CAVE, python-awips, EDEX, and more. Access \uf0c1 View all of the AWIPS Tips blogs here , and easily search for them using the awips-tips tag. Please join our mailing list (awips2-users) to get the notifications of new AWIPS Tips when they come out! Content \uf0c1 A full list of all released blogs can be found below: General \uf0c1 Welcome to AWIPS Tips! AWIPS 18.2.1 Software Release Announcing AWIPS eLearning AWIPS 18.2.1-3 Software Release Access Learn AWIPS CAVE from our eLearning Site AWIPS 18.2.1-5 Software Release GLM DATA IDD/LDM Feed Updates AWIPS 18.2.1-6 Software Release NSF Unidata AWIPS Summer Internship 2022: Rhoen Fiutak Announcing a New eLearning Course: Learn Python-AWIPS Use Case Example: Texas A&M CAVE in the Classroom AWIPS 20.3.2-0.1 Beta CAVE Software Release AWIPS 20.3.2-0.2 Beta CAVE Software Release AWIPS 20.3.2-0.3 Beta CAVE Software Release AWIPS 20.3.2-0.4 Beta Software Release - with EDEX! 
AWIPS 20.3.2-1 Production AWIPS Release Changes Related to v20.3.2 AWIPS Release AMS 2024 Highlight AWIPS 20.3.2-2 AWIPS Software Release Upcoming CentOS7 End of Life CAVE \uf0c1 Visualizing Data in CAVE Display Capabilities in CAVE Time Tips Explore the CAVE Product Browser CAVE's Local Cache: caveData Explore the CAVE Volume Browser: Plan Views Using CAVE's Points and Baselines Tool Explore the CAVE Volume Browser: Cross Section and Time Series Using CAVE Displays and Procedures Getting Started With the NSHARP Display Tool Explore the CAVE Volume Browser: Model Soundings NUCAPS Soundings Import Shapefiles in CAVE Create Objective Analysis Plots Use Warngen to Draw Convective Warnings Using Drawing Properties for WWA Display in CAVE Understanding Graphic vs Image Products in CAVE Getting to Know CAVE's Display Properties Creating a User Override Frames in CAVE Panes in CAVE Image Combination with CAVE Colorized GOES CIRA Products Changing Localizations in CAVE All About Sampling Maps Database Constraints Measuring Up - Distance Tools in CAVE New RAWS Data Customized Contours Using the Text Workstation Saving User Configurations Locally Python-AWIPS \uf0c1 Access Model Output with Python-AWIPS Plot New GOES Products From Our Public EDEX Load Map Resources and Topography using Python-AWIPS Create a Colored Surface Temperature Plot Create Colorized Model Plots View WWA Polygons with Python-AWIPS Creating METAR Station Plots Create Sounding Plots with Model Data Plotting Multiple Datasets from EDEX Open Jupyter Notebooks with our Virtual Machine Visualizing Upper Air Soundings Compare Model Sounding Data in Python Beta Python-AWIPS Release Creating New Products with Python-AWIPS Exploring Satellite Imagery using Python-AWIPS EDEX \uf0c1 Get to Know EDEX EDEX Data Retention Adding ECMWF Data to EDEX Ingesting GOES Satellite Data Localization Levels in EDEX Porting Users CAVE Configurations Creating New Scales/Maps Adding Shapefiles to the Maps Menu with EDEX Removing 
Model Data from EDEX LDM Usage in AWIPS All About EDEX Modes Distributed EDEX Architecture","title":"Educational Resources"},{"location":"appendix/educational-resources/#educational-resources","text":"Here at NSF Unidata, we want to provide as many resources as possible to make our tools and applications easy to use. For AWIPS we currently have a new eLearning course that is specific to CAVE. We also have a suite of Jupyter Notebooks that are meant to provide a detailed overview of many capabilities of python-awips.","title":"Educational Resources"},{"location":"appendix/educational-resources/#cave-elearning-course","text":"Learn AWIPS CAVE is our online educational course for those interested in learning about CAVE.","title":"CAVE eLearning Course"},{"location":"appendix/educational-resources/#access","text":"Please create an account on our eLearning site , then self-enroll in Learn AWIPS CAVE .","title":"Access"},{"location":"appendix/educational-resources/#content","text":"Learn AWIPS CAVE is specifically tailored to content regarding CAVE -- the local graphical application used to view weather data. 
The following topics and capabilities are covered throughout the course: Launching CAVE Navigating the interface Modifying product appearances Understanding the time match basis Creating publication-quality graphics Exploring various CAVE layouts Saving and loading procedures and displays Using radar displays Using baselines and points Creating time series displays Creating vertical cross section displays Using the NSHARP editor for soundings Viewing model soundings","title":"Content"},{"location":"appendix/educational-resources/#prerequisites","text":"Required: A supported web browser NSF Unidata's CAVE version 20.3.2 installed on a supported operating system Recommended: A keyboard with a numpad and mouse with a scrollwheel Second monitor","title":"Prerequisites"},{"location":"appendix/educational-resources/#design","text":"Learn AWIPS CAVE is designed for those new to AWIPS or for those seeking to learn best practices. The course is organized into modular sections with supporting lessons, allowing for spaced learning or completion in multiple class or lab sessions. Each section concludes with a quiz to assess learning, and results can be requested by instructors or supervisors for their classes/teams. Below is a snapshot taken from the course. Lessons are tied to relevant learning objectives . Lessons are scaffolded such that each skill builds upon the next. 
Tutorials, challenges, and assessments are designed to support higher-order thinking skills and learning retention.","title":"Design"},{"location":"appendix/educational-resources/#support","text":"If you experience any technical issues with our online course, please contact us at: support-elearning@unidata.ucar.edu","title":"Support"},{"location":"appendix/educational-resources/#python-awips-elearning-course","text":"Learn Python-AWIPS is our online educational course for those interested in learning about Python-AWIPS .","title":"Python-AWIPS eLearning Course"},{"location":"appendix/educational-resources/#access_1","text":"Please create an account on our eLearning site , then self-enroll in Learn Python-AWIPS .","title":"Access"},{"location":"appendix/educational-resources/#content_1","text":"Learn Python-AWIPS is designed for new users of Python-AWIPS who have some background in both Python and CAVE. Through tutorials, challenges, and demonstrations, you will learn the basics for working with EDEX resources through Python. The following topics and capabilities are covered throughout the course: Programmatically explore the resources available on an EDEX server Make a request to an EDEX for data See examples of data manipulation Plot requested data","title":"Content"},{"location":"appendix/educational-resources/#prerequisites_1","text":"Required: A supported web browser Python3 Conda Git Python-AWIPS using the Source Code with Examples Install instructions","title":"Prerequisites"},{"location":"appendix/educational-resources/#support_1","text":"If you experience any technical issues with our online course, please contact us at: support-elearning@unidata.ucar.edu","title":"Support"},{"location":"appendix/educational-resources/#python-awips-example-notebooks","text":"In addition to CAVE, AWIPS also has a Python package called python-awips which allows access to all data on an EDEX server. 
We have created a suite of Jupyter Notebooks as examples for how to use various functions of python-awips.","title":"Python-AWIPS Example Notebooks"},{"location":"appendix/educational-resources/#access_2","text":"All of our Notebooks can be downloaded and accessed locally by following the source code installation instructions found on our python-awips website . Additionally, non-interactive webpage renderings of each of the Notebooks are also available for quick and easy references.","title":"Access"},{"location":"appendix/educational-resources/#content_2","text":"Our python-awips Notebooks span a wide range of topics, but generally cover the following: Investigating what data is available on an EDEX server Accessing and filtering desired data based on time and location Plotting and analyzing datasets Specific examples for various data types: satellite imagery, model data, soundings, surface obs, and more","title":"Content"},{"location":"appendix/educational-resources/#youtube-channel-and-playlist","text":"NSF Unidata has a YouTube channel where we publish videos about all of our software packages. Specifically we also have a playlist dedicated to AWIPS videos.","title":"YouTube Channel and Playlist"},{"location":"appendix/educational-resources/#access_3","text":"All NSF Unidata videos can be accessed here on our channel. All AWIPS videos can be found on the AWIPS Playlist .","title":"Access"},{"location":"appendix/educational-resources/#content_3","text":"Our AWIPS videos cover a wide range of topics, but include some of the following themes: AWIPS topic overviews Instructional videos (ex. how to install CAVE) In-depth walkthroughs on CAVE functionality Python-AWIPS notebook examples","title":"Content"},{"location":"appendix/educational-resources/#awips-tips-blog-series","text":"AWIPS Tips is a bi-weekly (every two weeks) blog series that is posted on our NSF Unidata blogs page. 
Entries in the series cover topics relating to CAVE, python-awips, EDEX, and more.","title":"AWIPS Tips Blog Series"},{"location":"appendix/educational-resources/#access_4","text":"View all of the AWIPS Tips blogs here , and easily search for them using the awips-tips tag. Please join our mailing list (awips2-users) to get the notifications of new AWIPS Tips when they come out!","title":"Access"},{"location":"appendix/educational-resources/#content_4","text":"A full list of all released blogs can be found below:","title":"Content"},{"location":"appendix/educational-resources/#general","text":"Welcome to AWIPS Tips! AWIPS 18.2.1 Software Release Announcing AWIPS eLearning AWIPS 18.2.1-3 Software Release Access Learn AWIPS CAVE from our eLearning Site AWIPS 18.2.1-5 Software Release GLM DATA IDD/LDM Feed Updates AWIPS 18.2.1-6 Software Release NSF Unidata AWIPS Summer Internship 2022: Rhoen Fiutak Announcing a New eLearning Course: Learn Python-AWIPS Use Case Example: Texas A&M CAVE in the Classroom AWIPS 20.3.2-0.1 Beta CAVE Software Release AWIPS 20.3.2-0.2 Beta CAVE Software Release AWIPS 20.3.2-0.3 Beta CAVE Software Release AWIPS 20.3.2-0.4 Beta Software Release - with EDEX! 
AWIPS 20.3.2-1 Production AWIPS Release Changes Related to v20.3.2 AWIPS Release AMS 2024 Highlight AWIPS 20.3.2-2 AWIPS Software Release Upcoming CentOS7 End of Life","title":"General"},{"location":"appendix/educational-resources/#cave","text":"Visualizing Data in CAVE Display Capabilities in CAVE Time Tips Explore the CAVE Product Browser CAVE's Local Cache: caveData Explore the CAVE Volume Browser: Plan Views Using CAVE's Points and Baselines Tool Explore the CAVE Volume Browser: Cross Section and Time Series Using CAVE Displays and Procedures Getting Started With the NSHARP Display Tool Explore the CAVE Volume Browser: Model Soundings NUCAPS Soundings Import Shapefiles in CAVE Create Objective Analysis Plots Use Warngen to Draw Convective Warnings Using Drawing Properties for WWA Display in CAVE Understanding Graphic vs Image Products in CAVE Getting to Know CAVE's Display Properties Creating a User Override Frames in CAVE Panes in CAVE Image Combination with CAVE Colorized GOES CIRA Products Changing Localizations in CAVE All About Sampling Maps Database Constraints Measuring Up - Distance Tools in CAVE New RAWS Data Customized Contours Using the Text Workstation Saving User Configurations Locally","title":"CAVE"},{"location":"appendix/educational-resources/#python-awips","text":"Access Model Output with Python-AWIPS Plot New GOES Products From Our Public EDEX Load Map Resources and Topography using Python-AWIPS Create a Colored Surface Temperature Plot Create Colorized Model Plots View WWA Polygons with Python-AWIPS Creating METAR Station Plots Create Sounding Plots with Model Data Plotting Multiple Datasets from EDEX Open Jupyter Notebooks with our Virtual Machine Visualizing Upper Air Soundings Compare Model Sounding Data in Python Beta Python-AWIPS Release Creating New Products with Python-AWIPS Exploring Satellite Imagery using Python-AWIPS","title":"Python-AWIPS"},{"location":"appendix/educational-resources/#edex","text":"Get to Know EDEX EDEX Data 
Retention Adding ECMWF Data to EDEX Ingesting GOES Satellite Data Localization Levels in EDEX Porting Users CAVE Configurations Creating New Scales/Maps Adding Shapefiles to the Maps Menu with EDEX Removing Model Data from EDEX LDM Usage in AWIPS All About EDEX Modes Distributed EDEX Architecture","title":"EDEX"},{"location":"appendix/maps-database/","text":"mapdata.airport \uf0c1 Column Type arpt_id character varying(4) name character varying(42) city character varying(40) state character varying(2) siteno character varying(9) site_type character varying(1) fac_use character varying(2) owner_type character varying(2) elv integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326) ok mapdata.allrivers \uf0c1 Column Type ihabbsrf_i double precision rr character varying(11) huc integer type character varying(1) pmile double precision pname character varying(30) owname character varying(30) pnmcd character varying(11) ownmcd character varying(11) dsrr double precision dshuc integer usdir character varying(1) lev smallint j smallint termid integer trmblv smallint k smallint the_geom geometry(MultiLineString,4326) mapdata.artcc \uf0c1 Column Type artcc character varying(4) alt character varying(1) name character varying(30) type character varying(5) city character varying(40) id double precision the_geom geometry(MultiPolygon,4326) mapdata.basins \uf0c1 Column Type rfc character varying(7) cwa character varying(5) id character varying(8) name character varying(64) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) the_geom_0 geometry(MultiPolygon,4326) the_geom_0_064 geometry(MultiPolygon,4326) the_geom_0_016 geometry(MultiPolygon,4326) the_geom_0_004 geometry(MultiPolygon,4326) the_geom_0_001 geometry(MultiPolygon,4326) mapdata.canada \uf0c1 Column Type f_code character varying(5) name_en character varying(25) nom_fr character varying(25) country character 
varying(3) cgns_fid character varying(32) the_geom geometry(MultiPolygon,4326) mapdata.city \uf0c1 Column Type st_fips character varying(4) sfips character varying(2) county_fip character varying(4) cfips character varying(4) pl_fips character varying(7) id character varying(20) name character varying(39) elevation character varying(60) pop_1990 numeric population character varying(30) st character varying(6) warngenlev character varying(16) warngentyp character varying(16) watch_warn character varying(3) zwatch_war double precision prog_disc integer zprog_disc double precision comboflag double precision land_water character varying(16) recnum double precision lon double precision lat double precision f3 double precision f4 character varying(254) f6 double precision state character varying(25) the_geom geometry(Point,4326) mapdata.county \uf0c1 Column Type state character varying(2) cwa character varying(9) countyname character varying(24) fips character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.customlocations \uf0c1 Column Type bullet character varying(16) name character varying(64) cwa character varying(12) rfc character varying(8) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.cwa \uf0c1 Column Type cwa character varying(9) wfo character varying(3) lon numeric lat numeric region character varying(2) fullstaid character varying(4) citystate character varying(50) city character varying(50) state character varying(50) st character varying(2) the_geom geometry(MultiPolygon,4326) mapdata.firewxaor \uf0c1 Column Type cwa character varying(3) wfo character varying(3) the_geom geometry(MultiPolygon,4326) mapdata.firewxzones \uf0c1 Column Type state character varying(2) zone character varying(3) cwa character varying(3) name character varying(254) state_zone character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric 
lat numeric the_geom geometry(MultiPolygon,4326) mapdata.fix \uf0c1 Column Type id character varying(30) type character varying(2) use character varying(5) state character varying(2) min_alt integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326) mapdata.highaltitude \uf0c1 Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326) mapdata.highsea \uf0c1 Column Type wfo character varying(3) name character varying(250) lat numeric lon numeric id character varying(5) the_geom geometry(MultiPolygon,4326) mapdata.highway \uf0c1 Column Type prefix character varying(2) pretype character varying(6) name character varying(30) type character varying(6) suffix character varying(2) class character varying(1) class_rte character varying(1) hwy_type character varying(1) hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326) mapdata.hsa \uf0c1 Column Type wfo character varying(3) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) mapdata.interstate \uf0c1 Column Type prefix character varying(2) pretype Ushy Hwy Ave Cord Rt Loop I Sthy name character varying(30) type character varying(6) suffix character varying(2) hwy_type I U S hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326) mapdata.isc \uf0c1 Column Type wfo character varying(3) cwa character varying(3) the_geom geometry(MultiPolygon,4326) mapdata.lake \uf0c1 Column Type name character varying(40) feature character varying(40) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) mapdata.latlon10 \uf0c1 Column Type the_geom geometry(MultiLineString,4326) mapdata.lowaltitude \uf0c1 Column Type awy_des character varying(2) awy_id character varying(12) awy_type 
character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326) mapdata.majorrivers \uf0c1 Column Type rf1_150_id double precision huc integer seg smallint milept double precision seqno double precision rflag character varying(1) owflag character varying(1) tflag character varying(1) sflag character varying(1) type character varying(1) segl double precision lev smallint j smallint k smallint pmile double precision arbsum double precision usdir character varying(1) termid integer trmblv smallint pname character varying(30) pnmcd character varying(11) owname character varying(30) ownmcd character varying(11) dshuc integer dsseg smallint dsmlpt double precision editrf1_ double precision demand double precision ftimped double precision tfimped double precision dir double precision rescode double precision center double precision erf1__ double precision reservoir_ double precision pname_res character varying(30) pnmcd_res character varying(11) meanq double precision lowq double precision meanv double precision lowv double precision worka double precision gagecode double precision strahler double precision rr character varying(11) dsrr double precision huc2 smallint huc4 smallint huc6 integer the_geom geometry(MultiLineString,4326) mapdata.marinesites \uf0c1 Column Type st character varying(3) name character varying(50) prog_disc bigint warngenlev character varying(14) the_geom geometry(Point,4326) mapdata.marinezones \uf0c1 Column Type id character varying(6) wfo character varying(3) gl_wfo character varying(3) name character varying(254) ajoin0 character varying(6) ajoin1 character varying(6) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.mexico \uf0c1 Column Type area double precision perimeter double precision st_mx_ double precision st_mx_id double precision name character varying(66) country character varying(127) continent character varying(127) the_geom geometry(MultiPolygon,4326) 
mapdata.navaid \uf0c1 Column Type id character varying(30) clscode character varying(11) city character varying(40) elv integer freq double precision name character varying(30) status character varying(30) type character varying(25) oprhours character varying(11) oprname character varying(50) latdms character varying(16) londms character varying(16) airway character varying(254) sym smallint lon double precision lat double precision the_geom geometry(Point,4326) mapdata.offshore \uf0c1 Column Type id character varying(50) wfo character varying(10) lon numeric lat numeric location character varying(70) name character varying(90) the_geom geometry(MultiPolygon,4326) mapdata.railroad \uf0c1 Column Type fnode_ double precision tnode_ double precision lpoly_ double precision rpoly_ double precision length numeric railrdl021 double precision railrdl020 double precision feature character varying(18) name character varying(43) state character varying(2) state_fips character varying(2) the_geom geometry(MultiLineString,4326) mapdata.rfc \uf0c1 Column Type site_id character varying(3) state character varying(2) rfc_name character varying(18) rfc_city character varying(25) basin_id character varying(5) the_geom geometry(MultiPolygon,4326) mapdata.specialuse \uf0c1 Column Type name character varying(32) code character varying(16) yn smallint alt_desc character varying(128) artcc character varying(4) ctr_agen character varying(128) sch_agen character varying(128) state character varying(2) the_geom geometry(MultiPolygon,4326) mapdata.states \uf0c1 Column Type state character varying(2) name character varying(24) fips character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326) mapdata.timezones \uf0c1 Column Type name character varying(50) time_zone character varying(1) standard character varying(9) advanced character varying(10) unix_time character varying(19) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) mapdata.warngenloc 
\uf0c1 Column Type name character varying(254) st character varying(3) state character varying(20) population integer warngenlev integer cwa character varying(4) goodness double precision lat numeric lon numeric usedirs numeric(10,0) supdirs character varying(20) landwater character varying(3) recnum integer the_geom geometry(MultiPolygon,4326) mapdata.world \uf0c1 Column Type name character varying(30) count double precision first_coun character varying(2) first_regi character varying(1) the_geom geometry(MultiPolygon,4326) mapdata.zone \uf0c1 Column Type state character varying(2) cwa character varying(9) time_zone character varying(2) fe_area character varying(2) zone character varying(3) name character varying(254) state_zone character varying(5) lon numeric lat numeric shortname character varying(32) the_geom geometry(MultiPolygon,4326)","title":"Maps Database"},{"location":"appendix/maps-database/#mapdataairport","text":"Column Type arpt_id character varying(4) name character varying(42) city character varying(40) state character varying(2) siteno character varying(9) site_type character varying(1) fac_use character varying(2) owner_type character varying(2) elv integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326) ok","title":"mapdata.airport"},{"location":"appendix/maps-database/#mapdataallrivers","text":"Column Type ihabbsrf_i double precision rr character varying(11) huc integer type character varying(1) pmile double precision pname character varying(30) owname character varying(30) pnmcd character varying(11) ownmcd character varying(11) dsrr double precision dshuc integer usdir character varying(1) lev smallint j smallint termid integer trmblv smallint k smallint the_geom geometry(MultiLineString,4326)","title":"mapdata.allrivers"},{"location":"appendix/maps-database/#mapdataartcc","text":"Column Type artcc character varying(4) alt character varying(1) name 
character varying(30) type character varying(5) city character varying(40) id double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.artcc"},{"location":"appendix/maps-database/#mapdatabasins","text":"Column Type rfc character varying(7) cwa character varying(5) id character varying(8) name character varying(64) lon double precision lat double precision the_geom geometry(MultiPolygon,4326) the_geom_0 geometry(MultiPolygon,4326) the_geom_0_064 geometry(MultiPolygon,4326) the_geom_0_016 geometry(MultiPolygon,4326) the_geom_0_004 geometry(MultiPolygon,4326) the_geom_0_001 geometry(MultiPolygon,4326)","title":"mapdata.basins"},{"location":"appendix/maps-database/#mapdatacanada","text":"Column Type f_code character varying(5) name_en character varying(25) nom_fr character varying(25) country character varying(3) cgns_fid character varying(32) the_geom geometry(MultiPolygon,4326)","title":"mapdata.canada"},{"location":"appendix/maps-database/#mapdatacity","text":"Column Type st_fips character varying(4) sfips character varying(2) county_fip character varying(4) cfips character varying(4) pl_fips character varying(7) id character varying(20) name character varying(39) elevation character varying(60) pop_1990 numeric population character varying(30) st character varying(6) warngenlev character varying(16) warngentyp character varying(16) watch_warn character varying(3) zwatch_war double precision prog_disc integer zprog_disc double precision comboflag double precision land_water character varying(16) recnum double precision lon double precision lat double precision f3 double precision f4 character varying(254) f6 double precision state character varying(25) the_geom geometry(Point,4326)","title":"mapdata.city"},{"location":"appendix/maps-database/#mapdatacounty","text":"Column Type state character varying(2) cwa character varying(9) countyname character varying(24) fips character varying(5) time_zone character varying(2) fe_area character varying(2) lon 
numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.county"},{"location":"appendix/maps-database/#mapdatacustomlocations","text":"Column Type bullet character varying(16) name character varying(64) cwa character varying(12) rfc character varying(8) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.customlocations"},{"location":"appendix/maps-database/#mapdatacwa","text":"Column Type cwa character varying(9) wfo character varying(3) lon numeric lat numeric region character varying(2) fullstaid character varying(4) citystate character varying(50) city character varying(50) state character varying(50) st character varying(2) the_geom geometry(MultiPolygon,4326)","title":"mapdata.cwa"},{"location":"appendix/maps-database/#mapdatafirewxaor","text":"Column Type cwa character varying(3) wfo character varying(3) the_geom geometry(MultiPolygon,4326)","title":"mapdata.firewxaor"},{"location":"appendix/maps-database/#mapdatafirewxzones","text":"Column Type state character varying(2) zone character varying(3) cwa character varying(3) name character varying(254) state_zone character varying(5) time_zone character varying(2) fe_area character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.firewxzones"},{"location":"appendix/maps-database/#mapdatafix","text":"Column Type id character varying(30) type character varying(2) use character varying(5) state character varying(2) min_alt integer latitude character varying(16) longitude character varying(16) lon double precision lat double precision the_geom geometry(Point,4326)","title":"mapdata.fix"},{"location":"appendix/maps-database/#mapdatahighaltitude","text":"Column Type awy_des character varying(2) awy_id character varying(12) awy_type character varying(1) airway character varying(16) newfield1 double precision the_geom 
geometry(MultiLineString,4326)","title":"mapdata.highaltitude"},{"location":"appendix/maps-database/#mapdatahighsea","text":"Column Type wfo character varying(3) name character varying(250) lat numeric lon numeric id character varying(5) the_geom geometry(MultiPolygon,4326)","title":"mapdata.highsea"},{"location":"appendix/maps-database/#mapdatahighway","text":"Column Type prefix character varying(2) pretype character varying(6) name character varying(30) type character varying(6) suffix character varying(2) class character varying(1) class_rte character varying(1) hwy_type character varying(1) hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326)","title":"mapdata.highway"},{"location":"appendix/maps-database/#mapdatahsa","text":"Column Type wfo character varying(3) lon double precision lat double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.hsa"},{"location":"appendix/maps-database/#mapdatainterstate","text":"Column Type prefix character varying(2) pretype Ushy Hwy Ave Cord Rt Loop I Sthy name character varying(30) type character varying(6) suffix character varying(2) hwy_type I U S hwy_symbol character varying(20) route character varying(25) the_geom geometry(MultiLineString,4326)","title":"mapdata.interstate"},{"location":"appendix/maps-database/#mapdataisc","text":"Column Type wfo character varying(3) cwa character varying(3) the_geom geometry(MultiPolygon,4326)","title":"mapdata.isc"},{"location":"appendix/maps-database/#mapdatalake","text":"Column Type name character varying(40) feature character varying(40) lon double precision lat double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.lake"},{"location":"appendix/maps-database/#mapdatalatlon10","text":"Column Type the_geom geometry(MultiLineString,4326)","title":"mapdata.latlon10"},{"location":"appendix/maps-database/#mapdatalowaltitude","text":"Column Type awy_des character varying(2) awy_id character varying(12) awy_type 
character varying(1) airway character varying(16) newfield1 double precision the_geom geometry(MultiLineString,4326)","title":"mapdata.lowaltitude"},{"location":"appendix/maps-database/#mapdatamajorrivers","text":"Column Type rf1_150_id double precision huc integer seg smallint milept double precision seqno double precision rflag character varying(1) owflag character varying(1) tflag character varying(1) sflag character varying(1) type character varying(1) segl double precision lev smallint j smallint k smallint pmile double precision arbsum double precision usdir character varying(1) termid integer trmblv smallint pname character varying(30) pnmcd character varying(11) owname character varying(30) ownmcd character varying(11) dshuc integer dsseg smallint dsmlpt double precision editrf1_ double precision demand double precision ftimped double precision tfimped double precision dir double precision rescode double precision center double precision erf1__ double precision reservoir_ double precision pname_res character varying(30) pnmcd_res character varying(11) meanq double precision lowq double precision meanv double precision lowv double precision worka double precision gagecode double precision strahler double precision rr character varying(11) dsrr double precision huc2 smallint huc4 smallint huc6 integer the_geom geometry(MultiLineString,4326)","title":"mapdata.majorrivers"},{"location":"appendix/maps-database/#mapdatamarinesites","text":"Column Type st character varying(3) name character varying(50) prog_disc bigint warngenlev character varying(14) the_geom geometry(Point,4326)","title":"mapdata.marinesites"},{"location":"appendix/maps-database/#mapdatamarinezones","text":"Column Type id character varying(6) wfo character varying(3) gl_wfo character varying(3) name character varying(254) ajoin0 character varying(6) ajoin1 character varying(6) lon numeric lat numeric the_geom 
geometry(MultiPolygon,4326)","title":"mapdata.marinezones"},{"location":"appendix/maps-database/#mapdatamexico","text":"Column Type area double precision perimeter double precision st_mx_ double precision st_mx_id double precision name character varying(66) country character varying(127) continent character varying(127) the_geom geometry(MultiPolygon,4326)","title":"mapdata.mexico"},{"location":"appendix/maps-database/#mapdatanavaid","text":"Column Type id character varying(30) clscode character varying(11) city character varying(40) elv integer freq double precision name character varying(30) status character varying(30) type character varying(25) oprhours character varying(11) oprname character varying(50) latdms character varying(16) londms character varying(16) airway character varying(254) sym smallint lon double precision lat double precision the_geom geometry(Point,4326)","title":"mapdata.navaid"},{"location":"appendix/maps-database/#mapdataoffshore","text":"Column Type id character varying(50) wfo character varying(10) lon numeric lat numeric location character varying(70) name character varying(90) the_geom geometry(MultiPolygon,4326)","title":"mapdata.offshore"},{"location":"appendix/maps-database/#mapdatarailroad","text":"Column Type fnode_ double precision tnode_ double precision lpoly_ double precision rpoly_ double precision length numeric railrdl021 double precision railrdl020 double precision feature character varying(18) name character varying(43) state character varying(2) state_fips character varying(2) the_geom geometry(MultiLineString,4326)","title":"mapdata.railroad"},{"location":"appendix/maps-database/#mapdatarfc","text":"Column Type site_id character varying(3) state character varying(2) rfc_name character varying(18) rfc_city character varying(25) basin_id character varying(5) the_geom geometry(MultiPolygon,4326)","title":"mapdata.rfc"},{"location":"appendix/maps-database/#mapdataspecialuse","text":"Column Type name character varying(32) 
code character varying(16) yn smallint alt_desc character varying(128) artcc character varying(4) ctr_agen character varying(128) sch_agen character varying(128) state character varying(2) the_geom geometry(MultiPolygon,4326)","title":"mapdata.specialuse"},{"location":"appendix/maps-database/#mapdatastates","text":"Column Type state character varying(2) name character varying(24) fips character varying(2) lon numeric lat numeric the_geom geometry(MultiPolygon,4326)","title":"mapdata.states"},{"location":"appendix/maps-database/#mapdatatimezones","text":"Column Type name character varying(50) time_zone character varying(1) standard character varying(9) advanced character varying(10) unix_time character varying(19) lon double precision lat double precision the_geom geometry(MultiPolygon,4326)","title":"mapdata.timezones"},{"location":"appendix/maps-database/#mapdatawarngenloc","text":"Column Type name character varying(254) st character varying(3) state character varying(20) population integer warngenlev integer cwa character varying(4) goodness double precision lat numeric lon numeric usedirs numeric(10,0) supdirs character varying(20) landwater character varying(3) recnum integer the_geom geometry(MultiPolygon,4326)","title":"mapdata.warngenloc"},{"location":"appendix/maps-database/#mapdataworld","text":"Column Type name character varying(30) count double precision first_coun character varying(2) first_regi character varying(1) the_geom geometry(MultiPolygon,4326)","title":"mapdata.world"},{"location":"appendix/maps-database/#mapdatazone","text":"Column Type state character varying(2) cwa character varying(9) time_zone character varying(2) fe_area character varying(2) zone character varying(3) name character varying(254) state_zone character varying(5) lon numeric lat numeric shortname character varying(32) the_geom geometry(MultiPolygon,4326)","title":"mapdata.zone"},{"location":"cave/bundles-and-procedures/","text":"Displays and Procedures \uf0c1 AWIPS contains 
two methods for saving and loading data resources: Displays are an all-encompassing way to save loaded resources and current view configurations either onto the connected EDEX server, or a local file for access in future CAVE sessions. Procedures are similar to Displays, but can be thought of as groups of procedure items which allow the user to save/load only parts of the procedure they desire, and allow the user to manage saved resources with more control. Displays \uf0c1 File > Load Display \uf0c1 Load a previously-saved display from within the AWIPS system. The pop-up dialog allows you to select your own saved displays as well as those saved by other users. When loading a display, all existing tabs will be closed and replaced with the contents from the saved display. Displays will load as many Map Editor tabs as existed when the display was originally saved. Load Display from Local Disk \uf0c1 To load a previously-saved display from a path within the file directory locally, select File > Load Display and then select the File button on the right to browse your local directories. File > Save Display \uf0c1 Save a product display within the AWIPS system. This saves the display to the EDEX server for your specific user. File > Save Display Locally \uf0c1 To save a product display to a path within the file directory locally, select File > Save Display Locally and then select the File button on the right. File > Delete Displays \uf0c1 Select and remove a saved display under File > Delete Displays , this will open a pop-up dialog. Select the file name and click OK and then confirm deletion to remove the saved file permanently. Procedures \uf0c1 New Procedure \uf0c1 Select the menu File > Procedures > New... Select Copy Into to add all loaded resources from your current map to the Procedure Stack Select Save (or Save As ) and then enter a name for the Procedure before clicking OK to save. 
Open Procedure \uf0c1 Similar to creating a new Procedure, select File > Procedures > Open... , select the saved resources and click Load to load them to the current Map Editor tab. If multiple procedure items are wanted for loading, create a new tab for each procedure item and then load that item into the tab. This process is shown in the video below. Delete Procedure \uf0c1 From the menu File > Procedures > Delete... you can delete existing Procedure files in a way similar to deleting saved display files.","title":"Displays and Procedures"},{"location":"cave/bundles-and-procedures/#displays-and-procedures","text":"AWIPS contains two methods for saving and loading data resources: Displays are an all-encompassing way to save loaded resources and current view configurations either onto the connected EDEX server, or a local file for access in future CAVE sessions. Procedures are similar to Displays, but can be thought of as groups of procedure items which allow the user to save/load only parts of the procedure they desire, and allow the user to manage saved resources with more control.","title":"Displays and Procedures"},{"location":"cave/bundles-and-procedures/#displays","text":"","title":"Displays"},{"location":"cave/bundles-and-procedures/#file-load-display","text":"Load a previously-saved display from within the AWIPS system. The pop-up dialog allows you to select your own saved displays as well as those saved by other users. When loading a display, all existing tabs will be closed and replaced with the contents from the saved display. 
Displays will load as many Map Editor tabs as existed when the display was originally saved.","title":"File &gt; Load Display"},{"location":"cave/bundles-and-procedures/#load-display-from-local-disk","text":"To load a previously-saved display from a path within the file directory locally, select File > Load Display and then select the File button on the right to browse your local directories.","title":"Load Display from Local Disk"},{"location":"cave/bundles-and-procedures/#file-save-display","text":"Save a product display within the AWIPS system. This saves the display to the EDEX server for your specific user.","title":"File &gt; Save Display"},{"location":"cave/bundles-and-procedures/#file-save-display-locally","text":"To save a product display to a path within the file directory locally, select File > Save Display Locally and then select the File button on the right.","title":"File &gt; Save Display Locally"},{"location":"cave/bundles-and-procedures/#file-delete-displays","text":"Select and remove a saved display under File > Delete Displays , this will open a pop-up dialog. Select the file name and click OK and then confirm deletion to remove the saved file permanently.","title":"File &gt; Delete Displays"},{"location":"cave/bundles-and-procedures/#procedures","text":"","title":"Procedures"},{"location":"cave/bundles-and-procedures/#new-procedure","text":"Select the menu File > Procedures > New... Select Copy Into to add all loaded resources from your current map to the Procedure Stack Select Save (or Save As ) and then enter a name for the Procedure before clicking OK to save.","title":"New Procedure"},{"location":"cave/bundles-and-procedures/#open-procedure","text":"Similar to creating a new Procedure, select File > Procedures > Open... , select the saved resources and click Load to load them to the current Map Editor tab. If multiple procedure items are wanted for loading, create a new tab for each procedure item and then load that item into the tab. 
This process is shown in the video below.","title":"Open Procedure"},{"location":"cave/bundles-and-procedures/#delete-procedure","text":"From the menu File > Procedures > Delete... you can delete existing Procedure files in a way similar to deleting saved display files.","title":"Delete Procedure"},{"location":"cave/cave-keyboard-shortcuts/","text":"Keyboard Shortcuts \uf0c1 D2D Menu Shortcuts \uf0c1 Action Command Open New Map Ctrl + T Open a Display Ctrl + Shift + L Save Display Ctrl + S Save Display Locally Ctrl + Shift + S Save KML Ctrl + K New Procedure Ctrl + N Open Procedure Ctrl + O Delete Procedure Ctrl + D Copy Display to Procedure(s) Ctrl + B History List Ctrl + Shift + H Clear Data Ctrl + Shift + C First Frame Ctrl + \u2190 Last Frame Ctrl + \u2192 Step Back \u2190 Step Forward \u2192 Increase Loop Speed Page Up Decrease Loop Speed Page Down Toggle Time Options Ctrl + Shift + T Toggle Image Combination Ctrl + Shift + I Open Loop Properties Ctrl + L Open Image Properties Ctrl + I Show Print Dialog Ctrl + P Locate Cursor F12 D2D All Tilts Shortcuts \uf0c1 Requires all tilts product in main display panel Action Command Step Back 1 Volume \u2190 Step Forward 1 Volume \u2192 Step up 1 Elevation Angle \u2191 Step down 1 Elevation Angle \u2193 Jump to First Frame Ctrl + \u2190 Jump to Last Frame Ctrl + \u2192 Jump to Highest Elevation Angle Ctrl + \u2191 Jump to Lowest Elevation Angle Ctrl + \u2193 D2D Numeric Keypad Shortcuts \uf0c1 Num Lock must be enabled for these keystrokes to work Action Command Increase Brightness of Image 1, Decrease Image 2 [Numpad] + Decrease Brightness of Image 1, Increase Image 2 [Numpad] - Toggle Image Product in Main Map On/Off [Numpad] 0 Toggle First 9 Graphic Products On/Off [Numpad] 1-9 Toggle Next 10 Graphic Products On/Off Shift + [Numpad] 0-9 Toggle Between Images 1 and 2 at Full Brightness [Numpad] . 
Toggle Legend [Numpad] Enter Panel Combo Rotate (PCR) Shortcuts \uf0c1 These numbers refer to the ones at the top of the Keyboard Action Command Cycle Through PCR Products Delete Return to 4 Panel View End Cycle Back Through PCR Products Backspace Display Corresponding Product 1-8 Text Editor Shortcuts \uf0c1 Action Command Extend Selection to Start of Line Shift + Home Extend Selection to End of Line Shift + End Extend Selection to Start of Document Ctrl + Shift + Home Extend Selection to End of Document Ctrl + Shift + End Extend Selection Up 1 Screen Shift + Page Up Extend Selection Down 1 Screen Shift + Page Down Extend Selection to Previous Character Shift + \u2190 Extend Selection by Previous Word Ctrl + Shift + \u2190 Extend Selection to Next Character Shift + \u2192 Extend Selection by Next Word Ctrl + Shift + \u2192 Extend Selection Up 1 Line Shift + \u2191 Extend Selection Down 1 Line Shift + \u2193 Delete Previous Word Ctrl + Backspace Delete Next Word Ctrl + Delete Close the Window Ctrl + Shift + F4 Undo Ctrl + Z Copy Ctrl + C Paste Ctrl + V Cut Ctrl + X","title":"Keyboard Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#keyboard-shortcuts","text":"","title":"Keyboard Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#d2d-menu-shortcuts","text":"Action Command Open New Map Ctrl + T Open a Display Ctrl + Shift + L Save Display Ctrl + S Save Display Locally Ctrl + Shift + S Save KML Ctrl + K New Procedure Ctrl + N Open Procedure Ctrl + O Delete Procedure Ctrl + D Copy Display to Procedure(s) Ctrl + B History List Ctrl + Shift + H Clear Data Ctrl + Shift + C First Frame Ctrl + \u2190 Last Frame Ctrl + \u2192 Step Back \u2190 Step Forward \u2192 Increase Loop Speed Page Up Decrease Loop Speed Page Down Toggle Time Options Ctrl + Shift + T Toggle Image Combination Ctrl + Shift + I Open Loop Properties Ctrl + L Open Image Properties Ctrl + I Show Print Dialog Ctrl + P Locate Cursor F12","title":"D2D Menu 
Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#d2d-all-tilts-shortcuts","text":"Requires all tilts product in main display panel Action Command Step Back 1 Volume \u2190 Step Forward 1 Volume \u2192 Step up 1 Elevation Angle \u2191 Step down 1 Elevation Angle \u2193 Jump to First Frame Ctrl + \u2190 Jump to Last Frame Ctrl + \u2192 Jump to Highest Elevation Angle Ctrl + \u2191 Jump to Lowest Elevation Angle Ctrl + \u2193","title":"D2D All Tilts Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#d2d-numeric-keypad-shortcuts","text":"Num Lock must be enabled for these keystrokes to work Action Command Increase Brightness of Image 1, Decrease Image 2 [Numpad] + Decrease Brightness of Image 1, Increase Image 2 [Numpad] - Toggle Image Product in Main Map On/Off [Numpad] 0 Toggle First 9 Graphic Products On/Off [Numpad] 1-9 Toggle Next 10 Graphic Products On/Off Shift + [Numpad] 0-9 Toggle Between Images 1 and 2 at Full Brightness [Numpad] . Toggle Legend [Numpad] Enter","title":"D2D Numeric Keypad Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#panel-combo-rotate-pcr-shortcuts","text":"These numbers refer to the ones at the top of the Keyboard Action Command Cycle Through PCR Products Delete Return to 4 Panel View End Cycle Back Through PCR Products Backspace Display Corresponding Product 1-8","title":"Panel Combo Rotate (PCR) Shortcuts"},{"location":"cave/cave-keyboard-shortcuts/#text-editor-shortcuts","text":"Action Command Extend Selection to Start of Line Shift + Home Extend Selection to End of Line Shift + End Extend Selection to Start of Document Ctrl + Shift + Home Extend Selection to End of Document Ctrl + Shift + End Extend Selection Up 1 Screen Shift + Page Up Extend Selection Down 1 Screen Shift + Page Down Extend Selection to Previous Character Shift + \u2190 Extend Selection by Previous Word Ctrl + Shift + \u2190 Extend Selection to Next Character Shift + \u2192 Extend Selection by Next Word Ctrl + Shift + \u2192 Extend Selection Up 1 
Line Shift + \u2191 Extend Selection Down 1 Line Shift + \u2193 Delete Previous Word Ctrl + Backspace Delete Next Word Ctrl + Delete Close the Window Ctrl + Shift + F4 Undo Ctrl + Z Copy Ctrl + C Paste Ctrl + V Cut Ctrl + X","title":"Text Editor Shortcuts"},{"location":"cave/cave-localization/","text":"Change Localization \uf0c1 Localization Preferences \uf0c1 The default localization site for NSF Unidata AWIPS is OAX (Omaha, Nebraska, where the Raytheon team is located). When you are prompted to connect to an EDEX server, you can change the WFO ID as well. This window also has the option to Prompt for settings on startup , which if checked, would ask for the EDEX Server and Site location every time CAVE is started (this can be useful if you are used to switching between servers and/or sites). Change the site (example shows TBW Tampa Bay) and click Apply or OK and confirm the popup dialog, which informs you that you must restart CAVE for the changes to take effect.","title":"Change Localization"},{"location":"cave/cave-localization/#change-localization","text":"","title":"Change Localization"},{"location":"cave/cave-localization/#localization-preferences","text":"The default localization site for NSF Unidata AWIPS is OAX (Omaha, Nebraska, where the Raytheon team is located). When you are prompted to connect to an EDEX server, you can change the WFO ID as well. This window also has the option to Prompt for settings on startup , which if checked, would ask for the EDEX Server and Site location every time CAVE is started (this can be useful if you are used to switching between servers and/or sites). Change the site (example shows TBW Tampa Bay) and click Apply or OK and confirm the popup dialog, which informs you that you must restart CAVE for the changes to take effect.","title":"Localization Preferences"},{"location":"cave/d2d-all-tools/","text":"There are two main subsets of types of tools in CAVE: Display Tools and Radar Tools . 
Display Tools \uf0c1 The display tools are a subset of the tools available in CAVE. These programs are accessible through the Tools dropdown menu. Many of the tools listed under the Tools menu can be placed into an editable state . Do not enable the \"Hide Legends\" feature if you want to place a tool in an editable state, because access to editability is done by clicking the center mouse button, or right-clicking over the Product Legend . Az/Ran Overlay \uf0c1 This tool displays a movable azimuth/range radar map overlay. The overlay is in the \"editable\" state when displayed, and can be relocated by clicking the right mouse button. Baselines \uf0c1 Selecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be constructed from within the Volume Browser. Baselines come up editable. \"Snapping\" an Interactive Baseline: If you are zoomed in over an area when you load Interactive Baselines and no Baselines appear, press the right mouse button to \"snap\" a Baseline to where the mouse cursor is. The system chooses a Baseline that has not been recently used. If you are working with a Baseline, a second click with the right mouse button will return you to the original Baseline, even if you modified another Baseline in the meantime. Choose By ID \uf0c1 Choose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place. You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations. You can also access the Choose By ID tool from the Tools menu on the Volume Browser. 
Distance Bearing \uf0c1 Selecting this tool displays six editable lines, each of which shows the azimuth and range of the labeled end of the line relative to the unlabeled end of the line. You can make the lines editable by clicking the center mouse button over the legend at the lower right of the display. Once in edit mode, a line can be moved as a unit and/or either of its end points can be adjusted. Distance Speed \uf0c1 This tool can be used to determine the speed and direction of a storm or any other meteorological feature of interest. Selecting Distance Speed displays a Centroid Marker to move to the location of the storm or feature of interest in any two or more frames of displayed imagery (e.g., a satellite or radar loop). The system then displays a storm track with the direction (degrees) and speed (knots) of movement. When you select the Distance Speed option, the Distance Speed dialog box opens. Mode : You have the following selections from this option. Point : A radio button that allows you to set the Centroid Marker as a single point. Polyline : A radio button that allows you to set the Centroid Marker as a polyline. Legend : You have the following selections from this option. Time : A radio button that allows you to display time with the Centroid Marker. Speed : A radio button that allows you to display speed with the Centroid Marker. Distance Scale \uf0c1 Enabling this feature adds a scalebar to the bottom right hand of the main D2D display. This tool can be used to determine the size of a storm or any other meteorological feature of interest. Feature Following Zoom \uf0c1 When you zoom in over a small area to be able to view a feature in detail, animation will often cause the feature to move into and then out of the field of view. This tool allows you to follow a feature of interest even when zoomed in to a small area. 
To use this feature, first, you need to identify the location and motion of the feature, using Distance Speed or the WarnGen tracker. Once satisfied that the tracking icon is following the feature of interest, load this tool, and the center of the zoom area will track with the Distance Speed icon. Toggling the overlay off will resume the standard zooming behavior, and toggling it back on will reinvoke the feature following zoom. Home \uf0c1 Selecting the Home option displays a marker, which is an \"X\" with the word \"Home\" next to it. Clicking on the Home Location Legend with the center mouse button makes the marker editable; drag the \"X\" or click with the right mouse button to change its location. When the Home Marker is displayed, use the Sample feature (click and hold to access the menu to turn on sampling) to display the range in miles and azimuth (in degrees) of the pointer location relative to the Home location. Points \uf0c1 The Points option initially displays a circular 10-point pattern, labeled A through J on the Map display. Points are used to generate model soundings, time-height cross-sections, time series, and variable vs. height plots using the Volume Browser. As with the Baselines, the locations of these Points can be edited in the following manner: \"Snapping\" an Interactive Point : If you are zoomed in over an area when you load Interactive Points and no Points appear, click the right mouse button to \"snap\" a Point to where the mouse cursor is positioned. The system chooses a Point that has not been recently used. If you are currently working with a Point, then a second right mouse button click will place another Point at the location of your cursor. Dynamic Reference Map : When you generate a model sounding, a time-height cross-section, a time series, or a variable vs. height plot, a small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the Main Display Pane. 
Points may be created, deleted, hidden, and manipulated (location, name, font, and color). Points are not limited in terms of number, location, or designation. Points may also be assigned to different groups to facilitate their use. Once the Points tools have been loaded, the addition, deletion, or manipulation of Points can be accomplished in three ways: Create Point Dialog Box : The Create Point dialog box is opened by clicking and holding the right mouse button on the map (but not on any existing Point) and selecting the \"New Point...\" option. The Create Point dialog box opens with the Lat and Lon text boxes populated with the latitude and longitude values at the point where you had clicked the right mouse button. The latitude and longitude values can be viewed in \"Degrees : Minutes : Seconds,\" \"Degrees : Minutes,\" or \"Degrees Only\" (N and S refer to North and South; W and E refer to West and East). In the Create Point dialog box, you must : Enter the Point's name And may do any of the following: Modify the latitude and longitude values Assign the Point's color and font use Assign the Point to a group Select whether the Point is movable or hidden By default, individual Points do not have an assigned color. They inherit the color of the Interactive Points layer reflected in the Interactive Points product legend. You can change the color of the Interactive Points layer by right clicking on the Interactive Points product legend and selecting a color from the dropdown list. The selected color then changes all points not having an assigned color to the new color. Points can be assigned to \" <No Group> \" which will organize them in the root location containing the group names when accessed by the Edit Points dialog box (see below). Edit Point Dialog Box : The Edit Point dialog box is opened by clicking and holding the right mouse button on a Point on the map and selecting the \"Edit Point...\" option. 
The latitude and longitude values can be viewed in \"Degrees : Minutes : Seconds,\" \"Degrees : Minutes,\" or \"Degrees Only\" (N and S refer to North and South; W and E refer to West and East). Besides the option of selecting the Edit Points dialog box, you also have the option of selecting \"Hide Point,\" \"Delete Point,\" or \"Move Point.\" Once hidden, the Point can be unhidden using the Points List dialog box, where you would uncheck the checkbox under the \"Hidden\" column adjacent to the Point that was hidden (see below). If \"Delete Point\" is selected, a pop-up opens to confirm whether you want to delete the Point. Selecting the \"Move Point\" option moves the Point to wherever you place the cursor on the map. Points List Dialog Box : The Points List dialog box is opened by clicking and holding the right mouse button on the Interactive Points product legend and selecting the \"Edit Points...\" option. The Points List dialog box lists all the available groups and Points. Groups can be expanded to review the list of Points assigned to that group by clicking the arrow next to the group name. Initially, the default set of Points (A-J) are listed in the D2D Group, as shown above. In the Points List dialog box, Points and groups may be dragged into and out of other groups to create or disassemble subgroups. The Points List dialog box also includes three columns. Point Name : Lists the group name and designated Points. Movable : Checking the checkbox adjacent to the Point disables the Point from being moved. Hidden : Checking the checkbox adjacent to the Point hides the Point on the map. Put home cursor \uf0c1 The Put home cursor tool provides an easy way to locate a METAR observation station, a city and state, or a latitude/longitude coordinate. For Canada and Mexico, only the METAR observation stations and latitude/longitude coordinates are accessible. 
When you select Put home cursor from the Tools dropdown menu, the Home marker X is displayed and the Put Home Cursor dialog box opens. You can use the Home marker, as previously described in the Home Tool, and the new Home location (station, city/state, or latitude/longitude) is identified in the Put Home Cursor dialog box. Another way to use this tool is to type in the station, city and state, or latitude and longitude, and select Go, or hit Enter on the keypad, to move the Home marker to the specified location. The new location's nearest METAR site, city and state, and latitude and longitude appear in the Put Home Cursor dialog box. The Put Home Cursor dialog box contains the following options. Location Selection : There are three ways to find a desired location. Once you choose the Station, City/State, or Lat/Lon radio button, an Entry Box is activated next to the respective label within the Put Home Cursor dialog box. Enter the desired location information. Go : This menu button initiates the search for the desired station, city/state, or latitude/longitude. The Home marker jumps to the newly specified location. Range Rings \uf0c1 The Range Rings Tool displays adjustable range rings around locations of interest to your local office. When you select Range Rings from the Tools dropdown menu, the Range Rings legend appears in the Main Display Pane. The tool comes up editable, and the rangeRing dialog box opens. (Clicking the middle mouse button over the legend toggles tool editability and closes/opens the rangeRing dialog box.) Within this dialog box, you can toggle on/off any of the target locations using the square selectors. Adjust the size of the radii (in nautical miles) by typing a new value in the entry boxes associated with each location and pressing the Apply button. You can also add labels at the center of the range ring and/or at any of the radial distances using the Labels Options menu associated with each location. 
Using the Movable Rings, you can add a new location at a specific point by using the Interactive Points Tool, or by typing in latitude/longitude coordinates. There is no practical limit on the number of new locations you can add to the display. The list of locations is pre-set but can be customized at a field site. Sunset/Sunrise \uf0c1 By typing a date, as well as the latitude and longitude of a location into the Sunrise/Sunset Tool dialog box, you can obtain the time (for any time zone) of sunrise and sunset, as well as the total length of daylight for that date. Additional features include the ability to calculate the sunrise/sunset in a different hemisphere, and the azimuthal angles, relative to true north, of the sunrise and sunset. Text Window \uf0c1 Selecting this option brings up a Text Display window that behaves in the same way as a window on the Text Workstation , except that the scripts menu is disabled. Time of Arrival / Lead Time \uf0c1 Selecting the Time Of Arrival / Lead Time option displays a tracking line from a feature's initial starting point in a past frame to its final position in the current frame. Once the final position is set, an Arrival Point is displayed. You can drag this point anywhere along the line to get the Time Of Arrival / Lead Time and Distance. You can also change the Mode from Point to Circular Front or Polyline anywhere along the line to better represent the feature(s). Units Calculator \uf0c1 This tool converts the units of the first column into differing units of the second column. The units are grouped into temperature, speed, distance, time, and atmospheric pressure. First, simply type the number and select the units of the value you wish to convert in the first column entry box. Then in the second column, select the desired units to which you want the original value converted. The new value will appear in the second column entry box. Text Workstation \uf0c1 By selecting one of the \"Text\" buttons, a text window opens up. 
In National Weather Service operations, the text workstation is used to edit new warning text as well as look up past warnings, METARs, and TAFs. This functionality is disabled in the NSF Unidata AWIPS version. Radar Tools \uf0c1 The radar tools are a subset of the tools available in CAVE. These programs are accessible through the Tools dropdown menu, and in individual site radar menus. Estimated Actual Velocity (EAV) \uf0c1 A velocity (V) display from the radar shows only the radial component of the wind, so the indicated speed depends on the direction of the wind and the azimuth (direction) from the radar. Consider, for example, a north wind. Straight north of the radar, the full speed of the wind will be seen on the V product. As one moves around to the east of the radar, the radial component gets smaller, eventually reaching zero straight east of the radar. If the wind direction is known, then the actual wind speed can be computed by dividing the observed radial speed by the cosine of the angle between the radar radial and the actual direction. The EAV tool allows you to provide that angle and use the sampling function of the display to show the actual wind speed. Radar Display Controls \uf0c1 The Radar Display Controls dialog box is derived from the Radar Tools submenu and provides options that control the appearance of the Storm Track Information (STI), the Hail Index (HI), the Tornado Vortex Signature (TVS), the Digital Mesocyclone Display (DMD) products, the Microburst Alert (MBA) products, the Storm Relative Motion (SRM), and the SAILS products. The Radar Display Controls dialog box options are described below. Note : Our version of CAVE may not have all the products that these options are applicable to. The Radar Display Controls dialog box is divided into eight sections: STI , HI , TVS , DMD/MD/TVS , DMD , MBA , SRM , and SAILS . 
Each section has the following options: STI (Storm Track Information) \uf0c1 This section has options to adjust the appearance of the STI graphic product. Number of storms to show : This slider bar lets you choose the maximum number of storms (0 to 100) you wish to display on the STI product. The default value is 20 storms. Type of track to show : This options menu allows you to choose the type of storm track that you want displayed. HI (Hail Index) \uf0c1 This portion of the Radar Display Controls dialog box contains options that alter the appearance of the HI radar graphic product. You can set the low and high algorithm thresholds of the Probability of Hail (POH) and the Probability of Severe Hail (POSH). Storms that meet the low POH threshold are indicated by small open triangles, while small solid triangles mark those that meet the high POH threshold. Similarly, large open triangles or solid triangles are plotted for the POSH low and high thresholds, respectively. Low hail probability (POH) : The storms that meet or exceed the threshold are indicated by small open triangles. The default setting is 30. Low severe hail probability (POSH) : The storms that meet or exceed the threshold are indicated by large open triangles. The default setting is 30. High hail probability : The storms that meet or exceed the threshold are indicated by small solid triangles. The default setting is 50. High severe hail probability : The storms that meet or exceed the threshold are indicated by small solid triangles. The default setting is 50. TVS (Tornado Vortex Signature) \uf0c1 There is one option in this section of the Radar Display Controls dialog box. Show elevated TVS : This toggle button lets you control the appearance of the elevated TVS radar graphic product. DMD, MD, TVS \uf0c1 There is one option in this section of the Radar Display Controls dialog box. 
Show extrapolated features : With this option, you can choose whether to show the time-extrapolated features using DMD, MD, or TVS. DMD (Digital Mesocyclone Display) \uf0c1 Minimum feature strength : A mesocyclone clutter filter which specifies the minimum 3D strength rank used to display a mesocyclone (default is 5). Show overlapping Mesos : Toggles whether to show overlapping mesocyclones. Type of track to show : This dropdown has options available for whether to display past and/or forecast tracks. MBA (Microburst Alert) \uf0c1 Show Wind Shear : This option allows you to choose whether to display wind shear associated with microburst alerts. SRM (Storm Relative Motion) \uf0c1 The first three options in the SRM section allow you to choose where you want to derive the storm motion from. Storm Motion from WarnGen Track : Selecting this option will display the storm motion from a WarnGen Track. Average Storm Motion from STI : Selecting this option will display the average storm motion from the storm track information (STI). Custom Storm Motion : Selecting this option allows you to specify a custom storm motion with the selections below. Direction : This slider allows you to choose the direction (in degrees??) of the storm motion. Speed : This slider allows you to specify the speed (in mph??) of the storm motion. SAILS (Supplemental Adaptive Intra-Volume Low Level Scan) \uf0c1 Enable SAILS Frame Coordinator : Enabled (default) : keyboard shortcuts change where tilting up from 0.5 degree SAILS tilt will step to the next higher tilt (similar to GR2 Analyst) and Ctrl right arrow will step to the most recent tilt available for any elevation angle. Disabled : keyboard shortcuts change where tilting up from 0.5 degree SAILS tilt will not go anywhere (old confusing behavior) and Ctrl right arrow will step to the most recent time of the current tilt. 
VR - Shear \uf0c1 This tool is used in conjunction with Doppler velocity data to calculate the velocity difference (or \"shear\") of the data directly under the end points. As with the Baselines, this feature comes up editable and the end points can be dragged to specific gates of velocity data. When in place, the speed difference (kts), distance between end points (nautical miles), shear (s-1), and distance from radar (Nmi) are automatically plotted next to the end points and in the upper left corner of the Main Display Pane. A positive shear value indicates cyclonic shear, while a negative value indicates anticyclonic shear. If either end point is not directly over velocity data, the phrase \"no data\" is reported for the shear value. This tool is also useful in determining gate-to-gate shear. Simply place the two end points directly over adjacent gates of velocity data. \"Snapping\" VR Shear : If you are zoomed in over an area when you load VR - Shear, and the VR - Shear Baseline does not appear, click the right mouse button to \"snap\" the Baseline to where the mouse cursor is located. VR - Shear in 4 Panel : You can use the VR - Shear Tool when the large display is in 4 panel mode. The VR - Shear overlay is loaded in different colors for each panel. There are actually four copies of the program running, and each behaves independently. This means that you can get accurate readings in any one of the four panels \u2014 one VR - Shear panel is editable at a time. To activate, click the center mouse button on the VR - Shear legend in the desired panel and position the query line to the echoes of interest.","title":"Tools"},{"location":"cave/d2d-all-tools/#display-tools","text":"The display tools are a subset of the tools available in CAVE. These programs are accessible through the Tools dropdown menu. Many of the tools listed under the Tools menu can be placed into an editable state . 
Do not enable the \"Hide Legends\" feature if you want to place a tool in an editable state, because access to editability is done by clicking the center mouse button, or right-clicking over the Product Legend .","title":"Display Tools"},{"location":"cave/d2d-all-tools/#azran-overlay","text":"This tool displays a movable azimuth/range radar map overlay. The overlay is in the \"editable\" state when displayed, and can be relocated by clicking the right mouse button.","title":"Az/Ran Overlay"},{"location":"cave/d2d-all-tools/#baselines","text":"Selecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be constructed from within the Volume Browser. Baselines come up editable. \"Snapping\" an Interactive Baseline: If you are zoomed in over an area when you load Interactive Baselines and no Baselines appear, press the right mouse button to \"snap\" a Baseline to where the mouse cursor is. The system chooses a Baseline that has not been recently used. If you are working with a Baseline, a second click with the right mouse button will return you to the original Baseline, even if you modified another Baseline in the meantime.","title":"Baselines"},{"location":"cave/d2d-all-tools/#choose-by-id","text":"Choose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place. You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations. 
You can also access the Choose By ID tool from the Tools menu on the Volume Browser.","title":"Choose By ID"},{"location":"cave/d2d-all-tools/#distance-bearing","text":"Selecting this tool displays six editable lines, each of which shows the azimuth and range of the labeled end of the line relative to the unlabeled end of the line. You can make the lines editable by clicking the center mouse button over the legend at the lower right of the display. Once in edit mode, a line can be moved as a unit and/or either of its end points can be adjusted.","title":"Distance Bearing"},{"location":"cave/d2d-all-tools/#distance-speed","text":"This tool can be used to determine the speed and direction of a storm or any other meteorological feature of interest. Selecting Distance Speed displays a Centroid Marker to move to the location of the storm or feature of interest in any two or more frames of displayed imagery (e.g., a satellite or radar loop). The system then displays a storm track with the direction (degrees) and speed (knots) of movement. When you select the Distance Speed option, the Distance Speed dialog box opens. Mode : You have the following selections from this option. Point : A radio button that allows you to set the Centroid Marker as a single point. Polyline : A radio button that allows you to set the Centroid Marker as a polyline. Legend : You have the following selections from this option. Time : A radio button that allows you to display time with the Centroid Marker. Speed : A radio button that allows you to display speed with the Centroid Marker.","title":"Distance Speed"},{"location":"cave/d2d-all-tools/#distance-scale","text":"Enabling this feature adds a scalebar to the bottom right hand of the main D2D display. 
This tool can be used to determine the size of a storm or any other meteorological feature of interest.","title":"Distance Scale"},{"location":"cave/d2d-all-tools/#feature-following-zoom","text":"When you zoom in over a small area to be able to view a feature in detail, animation will often cause the feature to move into and then out of the field of view. This tool allows you to follow a feature of interest even when zoomed in to a small area. To use this feature, first, you need to identify the location and motion of the feature, using Distance Speed or the WarnGen tracker. Once satisfied that the tracking icon is following the feature of interest, load this tool, and the center of the zoom area will track with the Distance Speed icon. Toggling the overlay off will resume the standard zooming behavior, and toggling it back on will reinvoke the feature following zoom.","title":"Feature Following Zoom"},{"location":"cave/d2d-all-tools/#home","text":"Selecting the Home option displays a marker, which is an \"X\" with the word \"Home\" next to it. Clicking on the Home Location Legend with the center mouse button makes the marker editable; drag the \"X\" or click with the right mouse button to change its location. When the Home Marker is displayed, use the Sample feature (click and hold to access the menu to turn on sampling) to display the range in miles and azimuth (in degrees) of the pointer location relative to the Home location.","title":"Home"},{"location":"cave/d2d-all-tools/#points","text":"The Points option initially displays a circular 10-point pattern, labeled A through J on the Map display. Points are used to generate model soundings, time-height cross-sections, time series, and variable vs. height plots using the Volume Browser. 
As with the Baselines, the locations of these Points can be edited in the following manner: \"Snapping\" an Interactive Point : If you are zoomed in over an area when you load Interactive Points and no Points appear, click the right mouse button to \"snap\" a Point to where the mouse cursor is positioned. The system chooses a Point that has not been recently used. If you are currently working with a Point, then a second right mouse button click will place another Point at the location of your cursor. Dynamic Reference Map : When you generate a model sounding, a time-height cross-section, a time series, or a variable vs. height plot, a small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the Main Display Pane. Points may be created, deleted, hidden, and manipulated (location, name, font, and color). Points are not limited in terms of number, location, or designation. Points may also be assigned to different groups to facilitate their use. Once the Points tools have been loaded, the addition, deletion, or manipulation of Points can be accomplished in three ways: Create Point Dialog Box : The Create Point dialog box is opened by clicking and holding the right mouse button on the map (but not on any existing Point) and selecting the \"New Point...\" option. The Create Point dialog box opens with the Lat and Lon text boxes populated with the latitude and longitude values at the point where you clicked the right mouse button. The latitude and longitude values can be viewed in \"Degrees : Minutes : Seconds,\" \"Degrees : Minutes,\" or \"Degrees Only\" (N and S refer to North and South; W and E refer to West and East). 
In the Create Point dialog box, you must : Enter the Point's name And may do any of the following: Modify the latitude and longitude values Assign the Point's color and font use Assign the Point to a group Select whether the Point is movable or hidden By default, individual Points do not have an assigned color. They inherit the color of the Interactive Points layer reflected in the Interactive Points product legend. You can change the color of the Interactive Points layer by right clicking on the Interactive Points product legend and selecting a color from the dropdown list. The selected color then changes all points not having an assigned color to the new color. Points can be assigned to \" <No Group> \" which will organize them in the root location containing the group names when accessed by the Edit Points dialog box (see below). Edit Point Dialog Box : The Edit Point dialog box is opened by clicking and holding the right mouse button on a Point on the map and selecting the \"Edit Point...\" option. The latitude and longitude values can be viewed in \"Degrees : Minutes : Seconds,\" \"Degrees : Minutes,\" or \"Degrees Only\" (N and S refer to North and South; W and E refer to West and East). Besides the option of selecting the Edit Points dialog box, you also have the option of selecting \"Hide Point,\" \"Delete Point,\" or \"Move Point.\" Once hidden, the Point can be unhidden using the Points List dialog box, where you would uncheck the checkbox under the \"Hidden\" column adjacent to the Point that was hidden (see below). If \"Delete Point\" is selected, a pop-up opens to confirm whether you want to delete the Point. Selecting the \"Move Point\" option moves the Point to wherever you place the cursor on the map. Points List Dialog Box : The Points List dialog box is opened by clicking and holding the right mouse button on the Interactive Points product legend and selecting the \"Edit Points...\" option. 
The Points List dialog box lists all the available groups and Points. Groups can be expanded to review the list of Points assigned to that group by clicking the arrow next to the group name. Initially, the default set of Points (A-J) are listed in the D2D Group, as shown above. In the Points List dialog box, Points and groups may be dragged into and out of other groups to create or disassemble subgroups. The Points List dialog box also includes three columns. Point Name : Lists the group name and designated Points. Movable : Checking the checkbox adjacent to the Point disables the Point from being moved. Hidden : Checking the checkbox adjacent to the Point hides the Point on the map.","title":"Points"},{"location":"cave/d2d-all-tools/#put-home-cursor","text":"The Put home cursor tool provides an easy way to locate a METAR observation station, a city and state, or a latitude/longitude coordinate. For Canada and Mexico, only the METAR observation stations and latitude/longitude coordinates are accessible. When you select Put home cursor from the Tools dropdown menu, the Home marker X is displayed and the Put Home Cursor dialog box opens. You can use the Home marker, as previously described in the Home Tool, and the new Home location (station, city/state, or latitude/longitude) is identified in the Put Home Cursor dialog box. Another way to use this tool is to type in the station, city and state, or latitude and longitude, and select Go, or hit Enter on the keypad, to move the Home marker to the specified location. The new location's nearest METAR site, city and state, and latitude and longitude appear in the Put Home Cursor dialog box. The Put Home Cursor dialog box contains the following options. Location Selection : There are three ways to find a desired location. Once you choose the Station, City/State, or Lat/Lon radio button, an Entry Box is activated next to the respective label within the Put Home Cursor dialog box. Enter the desired location information. 
Go : This menu button initiates the search for the desired station, city/state, or latitude/longitude. The Home marker jumps to the newly specified location.","title":"Put home cursor"},{"location":"cave/d2d-all-tools/#range-rings","text":"The Range Rings Tool displays adjustable range rings around locations of interest to your local office. When you select Range Rings from the Tools dropdown menu, the Range Rings legend appears in the Main Display Pane. The tool comes up editable, and the rangeRing dialog box opens. (Clicking the middle mouse button over the legend toggles tool editability and closes/opens the rangeRing dialog box.) Within this dialog box, you can toggle on/off any of the target locations using the square selectors. Adjust the size of the radii (in nautical miles) by typing a new value in the entry boxes associated with each location and pressing the Apply button. You can also add labels at the center of the range ring and/or at any of the radial distances using the Labels Options menu associated with each location. Using the Movable Rings, you can add a new location at a specific point by using the Interactive Points Tool, or by typing in latitude/longitude coordinates. There is no practical limit on the number of new locations you can add to the display. The list of locations is pre-set but can be customized at a field site.","title":"Range Rings"},{"location":"cave/d2d-all-tools/#sunsetsunrise","text":"By typing a date, as well as the latitude and longitude of a location into the Sunrise/Sunset Tool dialog box, you can obtain the time (for any time zone) of sunrise and sunset, as well as the total length of daylight for that date. 
Additional features include the ability to calculate the sunrise/sunset in a different hemisphere, and the azimuthal angles, relative to true north, of the sunrise and sunset.","title":"Sunset/Sunrise"},{"location":"cave/d2d-all-tools/#text-window","text":"Selecting this option brings up a Text Display window that behaves in the same way as a window on the Text Workstation , except that the scripts menu is disabled.","title":"Text Window"},{"location":"cave/d2d-all-tools/#time-of-arrival-lead-time","text":"Selecting the Time Of Arrival / Lead Time option displays a tracking line from a feature's initial starting point in a past frame to its final position in the current frame. Once the final position is set, an Arrival Point is displayed. You can drag this point anywhere along the line to get the Time Of Arrival / Lead Time and Distance. You can also change the Mode from Point to Circular Front or Polyline anywhere along the line to better represent the feature(s).","title":"Time of Arrival / Lead Time"},{"location":"cave/d2d-all-tools/#units-calculator","text":"This tool converts the units of the first column into differing units of the second column. The units are grouped into temperature, speed, distance, time, and atmospheric pressure. First, simply type the number and select the units of the value you wish to convert in the first column entry box. Then in the second column, select the desired units to which you want the original value converted. The new value will appear in the second column entry box.","title":"Units Calculator"},{"location":"cave/d2d-all-tools/#text-workstation","text":"By selecting one of the \"Text\" buttons, a text window opens up. In National Weather Service operations, the text workstation is used to edit new warning text as well as look up past warnings, METARs, and TAFs. 
This functionality is disabled in the NSF Unidata AWIPS version.","title":"Text Workstation"},{"location":"cave/d2d-all-tools/#radar-tools","text":"The radar tools are a subset of the tools available in CAVE. These programs are accessible through the Tools dropdown menu, and in individual site radar menus.","title":"Radar Tools"},{"location":"cave/d2d-all-tools/#estimated-actual-velocity-eav","text":"A velocity (V) display from the radar shows only the radial component of the wind, so the indicated speed depends on the direction of the wind and the azimuth (direction) from the radar. Consider, for example, a north wind. Straight north of the radar, the full speed of the wind will be seen on the V product. As one moves around to the east of the radar, the radial component gets smaller, eventually reaching zero straight east of the radar. If the wind direction is known, then the actual wind speed can be computed by dividing the observed radial speed by the cosine of the angle between the radar radial and the actual direction. The EAV tool allows you to provide that angle and use the sampling function of the display to show the actual wind speed.","title":"Estimated Actual Velocity (EAV)"},{"location":"cave/d2d-all-tools/#radar-display-controls","text":"The Radar Display Controls dialog box is derived from the Radar Tools submenu and provides options that control the appearance of the Storm Track Information (STI), the Hail Index (HI), the Tornado Vortex Signature (TVS), the Digital Mesocyclone Display (DMD) products, the Microburst Alert (MBA) products, the Storm Relative Motion (SRM), and the SAILS products. The Radar Display Controls dialog box options are described below. Note : Our version of CAVE may not have all the products that these options are applicable to. The Radar Display Controls dialog box is divided into eight sections: STI , HI , TVS , DMD/MD/TVS , DMD , MBA , SRM , and SAILS . 
Each section has the following options:","title":"Radar Display Controls"},{"location":"cave/d2d-all-tools/#sti-storm-track-information","text":"This section has options to adjust the appearance of the STI graphic product. Number of storms to show : This slider bar lets you choose the maximum number of storms (0 to 100) you wish to display on the STI product. The default value is 20 storms. Type of track to show : This options menu allows you to choose the type of storm track that you want displayed.","title":"STI (Storm Track Information)"},{"location":"cave/d2d-all-tools/#hi-hail-index","text":"This portion of the Radar Display Controls dialog box contains options that alter the appearance of the HI radar graphic product. You can set the low and high algorithm thresholds of the Probability of Hail (POH) and the Probability of Severe Hail (POSH). Storms that meet the low POH threshold are indicated by small open triangles, while small solid triangles mark those that meet the high POH threshold. Similarly, large open triangles or solid triangles are plotted for the POSH low and high thresholds, respectively. Low hail probability (POH) : The storms that meet or exceed the threshold are indicated by small open triangles. The default setting is 30. Low severe hail probability (POSH) : The storms that meet or exceed the threshold are indicated by large open triangles. The default setting is 30. High hail probability : The storms that meet or exceed the threshold are indicated by small solid triangles. The default setting is 50. High severe hail probability : The storms that meet or exceed the threshold are indicated by small solid triangles. The default setting is 50.","title":"HI (Hail Index)"},{"location":"cave/d2d-all-tools/#tvs-tornado-vortex-signature","text":"There is one option in this section of the Radar Display Controls dialog box. 
Show elevated TVS : This toggle button lets you control the appearance of the elevated TVS radar graphic product.","title":"TVS (Tornado Vortex Signature)"},{"location":"cave/d2d-all-tools/#dmd-md-tvs","text":"There is one option in this section of the Radar Display Controls dialog box. Show extrapolated features : With this option, you can choose whether to show the time-extrapolated features using DMD, MD, or TVS.","title":"DMD, MD, TVS"},{"location":"cave/d2d-all-tools/#dmd-digital-mesocyclone-display","text":"Minimum feature strength : A mesocyclone clutter filter which specifies the minimum 3D strength rank used to display a mesocyclone (default is 5). Show overlapping Mesos : Toggles whether to show overlapping mesocyclones. Type of track to show : This dropdown has options available for whether to display past and/or forecast tracks.","title":"DMD (Digital Mesocyclone Display)"},{"location":"cave/d2d-all-tools/#mba-microburst-alert","text":"Show Wind Shear : This option allows you to choose whether to display wind shear associated with microburst alerts.","title":"MBA (Microburst Alert)"},{"location":"cave/d2d-all-tools/#srm-storm-relative-motion","text":"The first three options in the SRM section allow you to choose where you want to derive the storm motion from. Storm Motion from WarnGen Track : Selecting this option will display the storm motion from a WarnGen Track. Average Storm Motion from STI : Selecting this option will display the average storm motion from the storm track information (STI). Custom Storm Motion : Selecting this option allows you to specify a custom storm motion with the selections below. Direction : This slider allows you to choose the direction (in degrees??) of the storm motion. Speed : This slider allows you to specify the speed (in mph??) 
of the storm motion.","title":"SRM (Storm Relative Motion)"},{"location":"cave/d2d-all-tools/#sails-supplemental-adaptive-intra-volume-low-level-scan","text":"Enable SAILS Frame Coordinator : Enabled (default) : changes the keyboard shortcuts so that tilting up from the 0.5 degree SAILS tilt steps to the next higher tilt (similar to GR2 Analyst), and Ctrl right arrow steps to the most recent tilt available for any elevation angle. Disabled : changes the keyboard shortcuts so that tilting up from the 0.5 degree SAILS tilt does not go anywhere (the old, confusing behavior), and Ctrl right arrow steps to the most recent time of the current tilt.","title":"SAILS (Supplemental Adaptive Intra-Volume Low Level Scan)"},{"location":"cave/d2d-all-tools/#vr-shear","text":"This tool is used in conjunction with Doppler velocity data to calculate the velocity difference (or \"shear\") of the data directly under the end points. As with the Baselines, this feature comes up editable and the end points can be dragged to specific gates of velocity data. When in place, the speed difference (kts), distance between end points (nautical miles), shear (s-1), and distance from radar (Nmi) are automatically plotted next to the end points and in the upper left corner of the Main Display Pane. A positive shear value indicates cyclonic shear, while a negative value indicates anticyclonic shear. If either end point is not directly over velocity data, the phrase \"no data\" is reported for the shear value. This tool is also useful in determining gate-to-gate shear. Simply place the two end points directly over adjacent gates of velocity data. \"Snapping\" VR Shear : If you are zoomed in over an area when you load VR - Shear, and the VR - Shear Baseline does not appear, click the right mouse button to \"snap\" the Baseline to where the mouse cursor is located. VR - Shear in 4 Panel : You can use the VR - Shear Tool when the large display is in 4 panel mode.
The VR - Shear overlay is loaded in different colors for each panel. There are actually four copies of the program running, and each behaves independently. This means that you can get accurate readings in any one of the four panels \u2014 one VR - Shear panel is editable at a time. To activate, click the center mouse button on the VR - Shear legend in the desired panel and position the query line over the echoes of interest.","title":"VR - Shear"},{"location":"cave/d2d-edit-menus/","text":"Editing Menus \uf0c1 Any of the menus in the menubar can be customized in the Localization Perspective . Modifying Menus \uf0c1 Once in the Localization Perspective , menus can be modified by going to the D2D > Menus directory in the File Browser. Here there are submenus for different data types and menu structures. Usually the index.xml file found in these submenus is the master file that the actual menu is based on. This file can reference other xml files and you may have to modify these child xml files to get the results you are looking for. In order to modify any file, you must right-click on it and select Copy To > USER (my-username) . Then you may open this copy and begin to modify it. Once this process has been completed and a change has been made and saved, CAVE will need to be restarted and opened in the D2D perspective to see the change. This example covers how to add a new menu entry to an existing menu. Switch to the Localization Perspective Find the grid folder under D2D > Menus Double-click to expand index.xml Right-click BASE (common_static) and select Copy To...
, then select USER level Double-click USER to open the editor and copy an existing include tag, and update the modelName (this must match an existing product found in the Product Browser) and the menuName (this can be anything) <include installTo=\"menu:models\" fileName=\"menus/grid/allFamilies.xml\"> <substitute key=\"modelName\" value=\"GEFS\" /> <substitute key=\"menuName\" value=\"GEFS\" /> <substitute key=\"frameCount\" value=\"41\" /> <substitute key=\"TP\" value=\"TP\"/> </include> Once this is completed, save the file and restart CAVE Navigate to the Models menu and you should see a new entry with GEFS Removing Menus \uf0c1 This example covers how to remove a menu (in this case MRMS ) from D2D: Switch to the Localization Perspective Find the mrms folder under D2D > Menus Double-click to expand index.xml Right-click BASE and select Copy To... , then select USER level Right-click the mrms entry and select Refresh Double-click USER to open the editor and change <menuContributionFile> <include installTo=\"menu:mrms?after=MRMS_MENU_START\" fileName=\"menus/mrms/mrms.xml\"/> </menuContributionFile> to <menuContributionFile> </menuContributionFile> With this completed, you can now restart CAVE and will not see the MRMS menu anymore. Repeat this example for other product menus, such as radar , upperair , tools , etc., to further customize D2D data menus for any level of localization.","title":"Editing Menus"},{"location":"cave/d2d-edit-menus/#editing-menus","text":"Any of the menus in the menubar can be customized in the Localization Perspective .","title":"Editing Menus"},{"location":"cave/d2d-edit-menus/#modifying-menus","text":"Once in the Localization Perspective , menus can be modified by going to the D2D > Menus directory in the File Browser. Here there are submenus for different data types and menu structures. Usually the index.xml file found in these submenus is the master file that the actual menu is based on.
This file can reference other xml files and you may have to modify these child xml files to get the results you are looking for. In order to modify any file, you must right-click on it and select Copy To > USER (my-username) . Then you may open this copy and begin to modify it. Once this process has been completed and a change has been made and saved, CAVE will need to be restarted and opened in the D2D perspective to see the change. This example covers how to add a new menu entry to an existing menu. Switch to the Localization Perspective Find the grid folder under D2D > Menus Double-click to expand index.xml Right-click BASE (common_static) and select Copy To... , then select USER level Double-click USER to open the editor and copy an existing include tag, and update the modelName (this must match an existing product found in the Product Browser) and the menuName (this can be anything) <include installTo=\"menu:models\" fileName=\"menus/grid/allFamilies.xml\"> <substitute key=\"modelName\" value=\"GEFS\" /> <substitute key=\"menuName\" value=\"GEFS\" /> <substitute key=\"frameCount\" value=\"41\" /> <substitute key=\"TP\" value=\"TP\"/> </include> Once this is completed, save the file and restart CAVE Navigate to the Models menu and you should see a new entry with GEFS","title":"Modifying Menus"},{"location":"cave/d2d-edit-menus/#removing-menus","text":"This example covers how to remove a menu (in this case MRMS ) from D2D: Switch to the Localization Perspective Find the mrms folder under D2D > Menus Double-click to expand index.xml Right-click BASE and select Copy To... , then select USER level Right-click the mrms entry and select Refresh Double-click USER to open the editor and change <menuContributionFile> <include installTo=\"menu:mrms?after=MRMS_MENU_START\" fileName=\"menus/mrms/mrms.xml\"/> </menuContributionFile> to <menuContributionFile> </menuContributionFile> With this completed, you can now restart CAVE and will not see the MRMS menu anymore.
Repeat this example for other product menus, such as radar , upperair , tools , etc., to further customize D2D data menus for any level of localization.","title":"Removing Menus"},{"location":"cave/d2d-gis-shapefiles/","text":"GIS Import \uf0c1 The Geographic Information System (GIS) Import menu entry enables users to import geospatial data from varying GIS data sources for display in CAVE. CAVE currently only supports shape data in WGS84 unprojected latitude/longitude. This section describes how to: Load GIS Data in CAVE Modify the GIS Data Preferences Customize the Attributes Label GIS Data Display GIS Data \uf0c1 Importing a GIS shapefile is accessed through File > Import > GIS Data . The GIS DataStore Parameters dialog is comprised of four sections: DataStore Type : You can select a file type from the dropdown list. The only option is GIS File . Connection Parameters : Click the Browse button and navigate to the directory where your shapefiles are. Pressing Connect will populate the available shapefiles. Load As : Shapefiles can be loaded as a Map or as a Product. Map : The selected shapefile displays as a map, similar to loading a map from the Maps menu. Product : When this radio button is selected, you will also need to select the start and end date/time the data is valid for. The selected shapefile displays as a product with a shaded (color-filled) image. When plotting with additional products, if the display time falls within the start/end time range selected, the shapefile will display. When the valid time falls outside the start/end time, the map product image does not display. Table : This section lists all of the shapefiles that are available for display. GIS Data Preferences \uf0c1 Updating GIS display preferences is accessed through CAVE > Preferences > GIS Viewer . You are able to alter the highlight color, style, width, and opacity of the product in the Main Display here.
Customizing the GIS Attribute Dialog \uf0c1 You have the ability to highlight or hide specific areas of the displayed map. These functionalities are available by right-clicking and holding on the Map Product ID in the Legend area and selecting Display Attributes . The pop-up window is commonly referred to as the \"Attributes Table\". For each row of information there is an associated map/map product image displayed on the Main Display Pane. Highlighting \uf0c1 Highlighting Selected Areas \uf0c1 To highlight a selected area(s) of the GIS image, highlight the corresponding row(s) in the Attributes Table, right-click and hold on one of the selected rows, and check the Highlighted checkbox. Active highlighted rows will be yellow in the table and the corresponding area in the map display will be pink. Unhighlighting Selected Areas \uf0c1 You can unhighlight by selecting the row, holding the right mouse button, and unchecking the Highlighted checkbox. Unhighlighting All Areas \uf0c1 To remove all highlights, select Annotation > Clear Highlights . If you are interested in a particular area in the Main Display Pane, but don't know where it is in the Attributes Table, left double-click on the area of interest and the corresponding row will be highlighted. Controlling Visibility of Image Areas \uf0c1 Hiding Selected Areas \uf0c1 To hide a selected area(s) of the GIS image, highlight the corresponding row(s) in the Attributes Table, right-click and hold on one of the selected rows, and uncheck the Visible checkbox. Hidden rows will be gray in the table and the corresponding area in the map display will disappear. Unhiding Selected Areas \uf0c1 You can make these images visible by selecting the row, holding the right mouse button, and checking the Visible checkbox. Unhiding All Areas \uf0c1 To make all images visible, select Annotation > Make All Visible . Configuring Attributes Table \uf0c1 In the Attributes Table, you have the option to sort by columns and select which columns are displayed.
Selecting Columns to Display \uf0c1 By default, all available columns are displayed. The Select Columns dialog will pop up if you select Data > Select Columns... . You can highlight the columns and use the arrows to move them into the Available or Displayed columns. Clicking OK will update your table. Sorting Column Information \uf0c1 The Sort Order dialog will pop up if you select Data > Sort... . You can use the drop down menu to choose the column to sort by and then sort by Ascending or Descending. You can sort by additional columns. Clicking OK will update your table. Labeling GIS Data \uf0c1 You can select which attribute you want to use to label the objects on the Main Display. To open the Label submenu, right-click and hold on the Map Product ID in the Legend area to open a pop-up menu and select Label and choose which attribute you want as the label.","title":"GIS and Shapefiles"},{"location":"cave/d2d-gis-shapefiles/#gis-import","text":"The Geographic Information System (GIS) Import menu entry enables users to import geospatial data from varying GIS data sources for display in CAVE. CAVE currently only supports shape data in WGS84 unprojected latitude/longitude. This section describes how to: Load GIS Data in CAVE Modify the GIS Data Preferences Customize the Attributes Label GIS Data","title":"GIS Import"},{"location":"cave/d2d-gis-shapefiles/#display-gis-data","text":"Importing a GIS shapefile is accessed through File > Import > GIS Data . The GIS DataStore Parameters dialog is comprised of four sections: DataStore Type : You can select a file type from the dropdown list. The only option is GIS File . Connection Parameters : Click the Browse button and navigate to the directory where your shapefiles are. Pressing Connect will populate the available shapefiles. Load As : Shapefiles can be loaded as a Map or as a Product. Map : The selected shapefile displays as a map, similar to loading a map from the Maps menu.
Product : When this radio button is selected, you will also need to select the start and end date/time the data is valid for. The selected shapefile displays as a product with a shaded (color-filled) image. When plotting with additional products, if the display time falls within the start/end time range selected, the shapefile will display. When the valid time falls outside the start/end time, the map product image does not display. Table : This section lists all of the shapefiles that are available for display.","title":"Display GIS Data"},{"location":"cave/d2d-gis-shapefiles/#gis-data-preferences","text":"Updating GIS display preferences is accessed through CAVE > Preferences > GIS Viewer . You are able to alter the highlight color, style, width, and opacity of the product in the Main Display here.","title":"GIS Data Preferences"},{"location":"cave/d2d-gis-shapefiles/#customizing-the-gis-attribute-dialog","text":"You have the ability to highlight or hide specific areas of the displayed map. These functionalities are available by right-clicking and holding on the Map Product ID in the Legend area and selecting Display Attributes . The pop-up window is commonly referred to as the \"Attributes Table\". For each row of information there is an associated map/map product image displayed on the Main Display Pane.","title":"Customizing the GIS Attribute Dialog"},{"location":"cave/d2d-gis-shapefiles/#highlighting","text":"","title":"Highlighting"},{"location":"cave/d2d-gis-shapefiles/#highlighting-selected-areas","text":"To highlight a selected area(s) of the GIS image, highlight the corresponding row(s) in the Attributes Table, right-click and hold on one of the selected rows, and check the Highlighted checkbox.
Active highlighted rows will be yellow in the table and the corresponding area in the map display will be pink.","title":"Highlighting Selected Areas"},{"location":"cave/d2d-gis-shapefiles/#unhighlighting-selected-areas","text":"You can unhighlight by selecting the row, holding the right mouse button, and unchecking the Highlighted checkbox.","title":"Unhighlighting Selected Areas"},{"location":"cave/d2d-gis-shapefiles/#unhighlighting-all-areas","text":"To remove all highlights, select Annotation > Clear Highlights . If you are interested in a particular area in the Main Display Pane, but don't know where it is in the Attributes Table, left double-click on the area of interest and the corresponding row will be highlighted.","title":"Unhighlighting All Areas"},{"location":"cave/d2d-gis-shapefiles/#controlling-visibility-of-image-areas","text":"","title":"Controlling Visibility of Image Areas"},{"location":"cave/d2d-gis-shapefiles/#hiding-selected-areas","text":"To hide a selected area(s) of the GIS image, highlight the corresponding row(s) in the Attributes Table, right-click and hold on one of the selected rows, and uncheck the Visible checkbox.
Hidden rows will be gray in the table and the corresponding area in the map display will disappear.","title":"Hiding Selected Areas"},{"location":"cave/d2d-gis-shapefiles/#unhiding-selected-areas","text":"You can make these images visible by selecting the row, holding the right mouse button, and checking the Visible checkbox.","title":"Unhiding Selected Areas"},{"location":"cave/d2d-gis-shapefiles/#unhiding-all-areas","text":"To make all images visible, select Annotation > Make All Visible .","title":"Unhiding All Areas"},{"location":"cave/d2d-gis-shapefiles/#configuring-attributes-table","text":"In the Attributes Table, you have the option to sort by columns and select which columns are displayed.","title":"Configuring Attributes Table"},{"location":"cave/d2d-gis-shapefiles/#selecting-columns-to-display","text":"By default, all available columns are displayed. The Select Columns dialog will pop up if you select Data > Select Columns... . You can highlight the columns and use the arrows to move them into the Available or Displayed columns. Clicking OK will update your table.","title":"Selecting Columns to Display"},{"location":"cave/d2d-gis-shapefiles/#sorting-column-information","text":"The Sort Order dialog will pop up if you select Data > Sort... . You can use the drop down menu to choose the column to sort by and then sort by Ascending or Descending. You can sort by additional columns. Clicking OK will update your table.","title":"Sorting Column Information"},{"location":"cave/d2d-gis-shapefiles/#labeling-gis-data","text":"You can select which attribute you want to use to label the objects on the Main Display.
To open the Label submenu, right-click and hold on the Map Product ID in the Legend area to open a pop-up menu and select Label and choose which attribute you want as the label.","title":"Labeling GIS Data"},{"location":"cave/d2d-gridded-models/","text":"Volume Browser \uf0c1 The Volume Browser provides access to numerical models, sounding data, and selected point data sources, such as RAOB, METAR, and Profiler. Through the Browser interface, you can choose the data source(s), field(s), plane(s), and point(s), and generate a customized list of model graphics or images for display. The Volume Browser can be accessed from either the Tools (alphabetically organized) or Models (first option) menus. Visual Overview \uf0c1 The Volume Browser window is divided into four areas: The Menu Bar along the top The Data Selection Menus The Product Selection List The Load Buttons ( Diff and Load ) to load items from the Product Selection List Each area is then subdivided into menu components. The menu bar along the top of the Volume Browser window has dropdown lists that contain options for controlling all the various menu choices of the Volume Browser. Volume Browser Menu Bar \uf0c1 The dropdown menus in the Volume Browser menu bar contain options for controlling and manipulating the Volume Browser or the products chosen through the Volume Browser. File Clone Exit Edit Clear All Clear Sources Clear Fields Clear Panes Select None Select All Find (Ctrl+F) Tools Display Types Loop Types VB Tools \uf0c1 Baselines \uf0c1 Selecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be constructed from within the Volume Browser. These baseline resources are editable . If you are zoomed in over an area when you load baselines and none appear, press the right mouse button (B3) to \"snap\" a baseline to where the mouse cursor is. The system chooses a baseline that has not been recently used.
If you are working with a baseline, a second click with B3 will return you to the original baseline, even if you modified another baseline. Points \uf0c1 Points are used to generate model soundings, time-height cross-sections, time series, and variable vs. height plots using the Volume Browser. As with the Baselines, the locations of these Points can be edited in the following manner: \"Snapping\" an Interactive Point : If you are zoomed in over an area when you load Interactive Points and no Points appear, click B3 to \"snap\" a Point to where the mouse cursor is positioned. The system chooses a Point that has not been recently used. If you are currently working with a Point, then a second B3 click will place another Point at the location of your cursor. Dynamic Reference Map : When you generate a model sounding, a time-height cross-section, a time series, or a variable vs. height plot, a small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the Main Display Pane. Points may be created, deleted, hidden, and manipulated (location, name, font, and color). Points are not limited in terms of number, location, or designation. Points may also be assigned to different groups to facilitate their use. Choose By ID \uf0c1 Choose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place. You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations. Display Types \uf0c1 Plan View (default) \uf0c1 This is the default option for the Volume Browser. 
From the Plan-view perspective, data are plotted onto horizontal surfaces. The additional options menu that appears in the Volume Browser menu bar allows you to choose whether you want the Plan view data to Animate in Time or Animate in Space. Cross Section \uf0c1 Allows you to view gridded data as vertical slices along specific baselines. You need to use either the Interactive Baseline Tool or the predefined latitude/longitude baselines to specify the slice you wish to see. One of the additional options menus that appear in the Volume Browser menu bar allows you to choose whether you want the cross-section data to animate in time or space, while the other options menu allows you to adjust the vertical resolution. Descriptions of these options follow. (Note that the Fields and Planes submenu labels have changed after selecting \"Cross section.\") Time Height \uf0c1 Used in conjunction with the Interactive Points Tool to enable you to view a time height cross section of a full run of gridded model data for a specific location. Additional options menus in the Volume Browser menu bar allow you to choose the direction in which you want the data to be plotted, and to adjust the vertical resolution. Var vs Hgt \uf0c1 Enables you to view a profile of a meteorological model field as it changes through height, which is displayed in millibars. By using the Interactive Points Tool, you can select one or more locations from which to plot the data. Sounding \uf0c1 Works in conjunction with the Interactive Points Tool to enable you to generate a Skew-T chart for a specific location; no additional menus appear in the Volume Browser when the Soundings setting is chosen. Time Series \uf0c1 Used in conjunction with the Interactive Points Tool to enable you to plot gridded data on a time versus data value graph for a specified point. Loop Types \uf0c1 Time \uf0c1 The default option for the Volume Browser.
It allows you to view model data through time. Space \uf0c1 Allows you to loop through a series of predefined latitude or longitude cross-sectional slices at a fixed time.","title":"Volume Browser"},{"location":"cave/d2d-gridded-models/#volume-browser","text":"The Volume Browser provides access to numerical models, sounding data, and selected point data sources, such as RAOB, METAR, and Profiler. Through the Browser interface, you can choose the data source(s), field(s), plane(s), and point(s), and generate a customized list of model graphics or images for display. The Volume Browser can be accessed from either the Tools (alphabetically organized) or Models (first option) menus.","title":"Volume Browser"},{"location":"cave/d2d-gridded-models/#visual-overview","text":"The Volume Browser window is divided into four areas: The Menu Bar along the top The Data Selection Menus The Product Selection List The Load Buttons ( Diff and Load ) to load items from the Product Selection List Each area is then subdivided into menu components. The menu bar along the top of the Volume Browser window has dropdown lists that contain options for controlling all the various menu choices of the Volume Browser.","title":"Visual Overview"},{"location":"cave/d2d-gridded-models/#volume-browser-menu-bar","text":"The dropdown menus in the Volume Browser menu bar contain options for controlling and manipulating the Volume Browser or the products chosen through the Volume Browser. File Clone Exit Edit Clear All Clear Sources Clear Fields Clear Panes Select None Select All Find (Ctrl+F) Tools Display Types Loop Types","title":"Volume Browser Menu Bar"},{"location":"cave/d2d-gridded-models/#vb-tools","text":"","title":"VB Tools"},{"location":"cave/d2d-gridded-models/#baselines","text":"Selecting Baselines displays 10 lines, labeled A-A' to J-J', along which cross-sections can be constructed from within the Volume Browser. These baseline resources are editable .
If you are zoomed in over an area when you load baselines and none appear, press the right mouse button (B3) to \"snap\" a baseline to where the mouse cursor is. The system chooses a baseline that has not been recently used. If you are working with a baseline, a second click with B3 will return you to the original baseline, even if you modified another baseline.","title":"Baselines"},{"location":"cave/d2d-gridded-models/#points","text":"Points are used to generate model soundings, time-height cross-sections, time series, and variable vs. height plots using the Volume Browser. As with the Baselines, the locations of these Points can be edited in the following manner: \"Snapping\" an Interactive Point : If you are zoomed in over an area when you load Interactive Points and no Points appear, click B3 to \"snap\" a Point to where the mouse cursor is positioned. The system chooses a Point that has not been recently used. If you are currently working with a Point, then a second B3 click will place another Point at the location of your cursor. Dynamic Reference Map : When you generate a model sounding, a time-height cross-section, a time series, or a variable vs. height plot, a small reference map indicating the location(s) of the plotted sounding(s) is provided in the upper left corner of the Main Display Pane. Points may be created, deleted, hidden, and manipulated (location, name, font, and color). Points are not limited in terms of number, location, or designation. Points may also be assigned to different groups to facilitate their use.","title":"Points"},{"location":"cave/d2d-gridded-models/#choose-by-id","text":"Choose By ID, which is a function of DMD (Digital Mesocyclone Display), is a method of selecting feature locations. The tool is used to monitor the same feature at a certain location. Without the Choose By ID tool, a monitored feature (over a period of time) could move away from its monitored location and another feature could move in its place.
You can use Choose By ID to set points, baselines, and \"Home\" for conventional locations like METARs and RAOBs (Radiosonde Observations), but its primary use is for the WSR-88D-identified mesocyclone locations.","title":"Choose By ID"},{"location":"cave/d2d-gridded-models/#display-types","text":"","title":"Display Types"},{"location":"cave/d2d-gridded-models/#plan-view-default","text":"This is the default option for the Volume Browser. From the Plan-view perspective, data are plotted onto horizontal surfaces. The additional options menu that appears in the Volume Browser menu bar allows you to choose whether you want the Plan view data to Animate in Time or Animate in Space.","title":"Plan View (default)"},{"location":"cave/d2d-gridded-models/#cross-section","text":"Allows you to view gridded data as vertical slices along specific baselines. You need to use either the Interactive Baseline Tool or the predefined latitude/longitude baselines to specify the slice you wish to see. One of the additional options menus that appear in the Volume Browser menu bar allows you to choose whether you want the cross-section data to animate in time or space, while the other options menu allows you to adjust the vertical resolution. Descriptions of these options follow. (Note that the Fields and Planes submenu labels have changed after selecting \"Cross section.\")","title":"Cross Section"},{"location":"cave/d2d-gridded-models/#time-height","text":"Used in conjunction with the Interactive Points Tool to enable you to view a time height cross section of a full run of gridded model data for a specific location.
Additional options menus in the Volume Browser menu bar allow you to choose the direction in which you want the data to be plotted, and to adjust the vertical resolution.","title":"Time Height"},{"location":"cave/d2d-gridded-models/#var-vs-hgt","text":"Enables you to view a profile of a meteorological model field as it changes through height, which is displayed in millibars. By using the Interactive Points Tool, you can select one or more locations from which to plot the data.","title":"Var vs Hgt"},{"location":"cave/d2d-gridded-models/#sounding","text":"Works in conjunction with the Interactive Points Tool to enable you to generate a Skew-T chart for a specific location; no additional menus appear in the Volume Browser when the Soundings setting is chosen.","title":"Sounding"},{"location":"cave/d2d-gridded-models/#time-series","text":"Used in conjunction with the Interactive Points Tool to enable you to plot gridded data on a time versus data value graph for a specified point.","title":"Time Series"},{"location":"cave/d2d-gridded-models/#loop-types","text":"","title":"Loop Types"},{"location":"cave/d2d-gridded-models/#time","text":"The default option for the Volume Browser. It allows you to view model data through time.","title":"Time"},{"location":"cave/d2d-gridded-models/#space","text":"Allows you to loop through a series of predefined latitude or longitude cross-sectional slices at a fixed time.","title":"Space"},{"location":"cave/d2d-perspective/","text":"D2D Perspective \uf0c1 D2D (Display 2-Dimensions) is the default AWIPS CAVE perspective, designed to mimic the look and feel of the legacy AWIPS I system. System menus include CAVE , File , View , Options , and Tools . Data menus include Models , Surface , NCEP/Hydro , Upper Air , Satellite , Local Radar Stations , Radar , MRMS , and Maps . Map projection, image properties, frame control, and a few featured applications ( Warngen , Nsharp , and Browser ) make up the primary D2D toolbar.
Note : Depending on which Operating System version of CAVE, there may be other application options ( PGEN , GEMPAK ). Resource Stack \uf0c1 At the bottom-right of the map window is the Resource Stack, which displays all loaded resources and map overlays, and allows for interaction with and customization of each resource via a right-click menu . There are three available views of the Resource Stack; the default shows all Product Resources. The other two views are the Simple view, which shows the time, and the Map Resources. To switch between views, see the Right-Click Functionality . It's important to understand that Product Resources and Map Resources are handled differently given the time-based nature of Products, compared to the static nature of maps. Selecting the Clear button will remove all Products but not remove any Map Products. Left-Click Resource Name to Hide \uf0c1 A left click on any resource in the stack will hide the resource and turn the label gray. Clicking the name again makes the resource visible. Right-Click Background to Cycle Resource Views \uf0c1 The default display in the resource stack is the Product Resources. Right Click the mouse on the map background (anywhere but on the stack itself) to switch to a Simple View, which just shows the current displayed time if product data is loaded. Right Click again to show all Map Resources. Right Click again to switch back to Product Resources. Hold-Right-Click Resource Name for Menu \uf0c1 Drag the mouse over a loaded resource and hold the right mouse button until a menu appears. The hold-right-click menu allows you to control individual resource Image Properties , Change Colormaps , change resource color, width, density, and magnification, move resources up and down in the stack, as well as configure custom options with other interactive resources. This menu also gives you the option to unload this specific product , as opposed to removing all data products.
Simply select the Unload option at the bottom of the resource's hold-right-click menu. Unload a Single Resource \uf0c1 To remove or \"unload\" a single resource, use the Resource Menu. Open the Resource Menu on the resource you wish to remove and select Unload . Display Menu \uf0c1 The display menu has many options which can alter the functionality in CAVE. Hold-Right-Click Background for Display Menu \uf0c1 Holding down the right mouse button anywhere in the map view will open a right-click menu. Show Map Legends \uf0c1 From the above menu select Show Map Legends and watch the Resource Stack show only map resources which are loaded to the view. Sample Loaded Resources \uf0c1 Most data types have a right-click menu option for reading out the pixel value, displayed as multi-line text for multiple resources. This can be toggled on and off by selecting the Sample option in the Display Menu. Toggle 2 or 4-Panel Layout \uf0c1 Right-click hold in the view and select Two Panel Layout or Four Panel Layout to create duplicates of the current view. Notice the readout is at the same position in both panels. Any mouse movement made on one panel will be made on the other. By default, loading any data will load that data onto both panels. However, there is the option to specify which panel you would like to load data into, which can be useful if you want to have different data in each of the panels. To access this option, simply hold-right-click to pull up the Display menu and choose Load to This Panel as shown below: Now, a yellow L will appear in the lower left hand corner of the panel you selected to load data to. When data is loaded from the menus it will only load to the display designated with the L. Switch back to loading in both panels by using the Load to All Panels option in the Display Menu. 
From this multi-pane display, hold-right-click again and you will see the Single Panel Layout option to switch back to a standard view (defaulting to the left of two, and top-left of four). Unload Data \uf0c1 Select Unload All Products to remove all loaded graphic and image products from the display and start fresh. Select Unload Graphics to remove all but the image products. Product Browser \uf0c1 The Product Browser allows users to browse a complete data inventory in a side window, organized by data type. To open the Product Browser, either select the icon in the toolbar ( ), or go to the menu: CAVE > Data Browsers > Product Browser . Selections for Grid , Lightning , Maps , Radar , Redbook , and Satellite are available. All products loaded with the Product Browser are given default settings. Note : The Linux and Mac versions also have a selection for GFE available. Options Menu \uf0c1 There are several toggle options and options dialogs that are available under the Options menu found at the top of the application. Time Options (Ctrl + T) \uf0c1 This check button enables/disables the ability to select the time interval between frames of real-time or model data. This feature has the added benefit of allowing you to view extended amounts of data (temporally) but stay within the limits of 64 frames. For example, METAR surface plots, which typically display every hour, can be set to display every three hours via the Select Valid Time and Time Resolution Dialog Box. When the Time Options check button is selected, the next product you choose to display in the Main Display Pane launches either the Select Valid Time and Time Resolution dialog box or the Select Offset and Tolerance dialog box. When you are loading data to an empty display and the Time Options check button is enabled, the Select Valid Time and Time Resolution dialog box opens. Valid Time: In this column of dates/times, you may choose the one that will be the first frame loaded onto the Large Display Pane. 
The Default option is the most recent data. Time Resolution: This column contains various time increments in which the data can be displayed. Once you make a selection, the Valid Time Column indents the exact times that will be displayed. The Default resolution displays the most recent frames available. With the Time Options check button enabled for a display that already contains data, when you choose the data to be overlaid in the Main Display Pane, the Select Offset and Tolerance dialog box appears, providing the following options: Offset : This column contains various time increments at intervals before, at, or after the time you selected for the first product that is displayed in the Main Display Pane. Tolerance : The options in this column refer to how strict the time matching is. \"None\" means an exact match, while \"Infinite\" will put the closest match in each frame, regardless of how far off it is. Image Combination (Insert) \uf0c1 This check button enables/disables the ability to display two images at once. Combined-image displays have been improved by removing the valid time for non-forecast products and removing the date string (time is kept) from the left side of the legend. In particular, this makes All-Tilts radar legends more usable. Display Properties \uf0c1 This menu option opens the Display Properties dialog box. Most of the options available in this dialog box are also available on the Toolbar , while the rest are available in the individual resource menus if that resource uses these properties. Loop Properties (Ctrl + L) \uf0c1 Loop Properties is another dialog box that can be opened from the Options menu or from the Loop Properties iconified button on the D2D Toolbar, or by using the Ctrl + L keyboard shortcut. The dialog allows you to adjust the forward and backward speeds, with 0 = off and 10 = maximum speed. You can set the duration of the first and last frame dwell times to between zero and 2.5 seconds. 
You can turn looping on or off by checking the Looping check button. There is also a Looping button located on the Toolbar that enables/disables the animation in the large display pane. Finally, you can turn looping on and increase/decrease forward speed by pressing Page Up/Page Down on your keyboard, and turn looping off with the Left or Right Arrow keys. On the toolbar, you can use the button to start/stop looping. Image Properties (Ctrl + I) \uf0c1 The Image Properties dialog box can be opened here (in the Options menu) or by using the Image Properties iconified button on the D2D Toolbar ( ), or using the Ctrl + I keyboard shortcut. This dialog box provides options that allow you to change the color table; adjust the brightness, contrast, and alpha of either a single image or combined images; fade between combined images; and/or interpolate the displayed data. Set Time \uf0c1 This option allows you to set the CAVE clock, located on the bottom of the screen, to an earlier time for reviewing archived data. Set Background Color \uf0c1 You can now set the background display color on your workstation. You can also set the background display color for a single pane via mouse Button 3 (B3). Switching Perspectives \uf0c1 Switching perspectives in CAVE can be found in the CAVE > Perspective menu. D2D is one of many available CAVE perspectives. By selecting the CAVE > Perspective menu you can switch into the GFE or Localization perspective. The GFE perspective is not currently working on the direct Windows or MacOS installations of CAVE. The GFE perspective is only enabled for the OAX site. CAVE Preferences \uf0c1 Preferences and settings for the CAVE client can be found in the CAVE > Preferences menu. Set the Localization Site and server for the workstation; configure mouse operations, change performance levels, font magnification, and text workstation hostname. 
Load Mode \uf0c1 Within the Display Properties dialog is the Load Mode option, which provides different ways to display data by manipulating previous model runs and inventories of data sets. The selected load mode is shown on the toolbar when the Load Mode menu is closed, and can also be changed by using this toolbar option. A description of the Load Mode options follows. Latest : Displays forecast data only from the latest model run, but also backfills at the beginning of the loop with available frames from previous runs to satisfy the requested number of frames. Valid time seq : Displays the most recent data and fills empty frames with previous data. For models, it provides the product from the latest possible run for every available valid time. No Backfill : Displays model data only from the most recent model run time with no backfilling to fill out a loop. Using this Load Mode prevents the mixing of old and new data. Previous run : Displays the previous model run, backfilling with frames from previous runs at the beginning of the loop to satisfy the requested number of frames. Prev valid time seq : Displays the previous model run and fills empty frames with previous model data or analyses. Prognosis loop : Shows a sequence of n-hour forecasts from successive model runs. Analysis loop : Loads a sequence of model analyses but no forecasts. dProg/dt : Selects forecasts from different model runs that all have the same valid times. This load mode is available only when there are no other products loaded in the large display pane. Forced : Puts the latest version of a selected product in all frames without time-matching. Forecast match : Overlays a model product only when its forecast times match those of an initially loaded product. This load mode is available only when another product is already loaded in the large display pane. 
Inventory : Selecting a product when the load mode is set to Inventory brings up a Dialog Box with the available forecast and inventory times from which you can select the product you want. Inventory loads into the currently displayed frame. Slot : Puts the latest version of a selected product in the currently displayed frame.","title":"D2D Perspective"},{"location":"cave/d2d-perspective/#d2d-perspective","text":"D2D (Display 2-Dimensions) is the default AWIPS CAVE perspective, designed to mimic the look and feel of the legacy AWIPS I system. System menus include CAVE , File , View , Options , and Tools . Data menus include Models , Surface , NCEP/Hydro , Upper Air , Satellite , Local Radar Stations , Radar , MRMS , and Maps . Map projection, image properties, frame control, and a few featured applications ( Warngen , Nsharp , and Browser ) make up the primary D2D toolbar. Note : Depending on which Operating System version of CAVE, there may be other application options ( PGEN , GEMPAK ).","title":"D2D Perspective"},{"location":"cave/d2d-perspective/#resource-stack","text":"At bottom-right of the map window is the Resource Stack, which displays all loaded resources and map overlays, and allows for interaction and customization with the resource via a right-click menu . There are three available views of the Resource Stack; the default will show all Product Resources. The other two views are the Simple view, which shows the time, and the Map Resources. To switch between views, see the Right-Click Functionality . It's important to understand that Product Resources and Map Resources are handled differently given the time-based nature of Products, compared to the static nature of maps. Selecting the Clear button will remove all Products but not remove any Map Products.","title":"Resource Stack"},{"location":"cave/d2d-perspective/#left-click-resource-name-to-hide","text":"A left click on any resource in the stack will hide the resource and turn the label gray. 
Clicking the name again makes the resource visible.","title":"Left-Click Resource Name to Hide"},{"location":"cave/d2d-perspective/#right-click-background-to-cycle-resource-views","text":"The default display in the resource stack is the Product Resources. Right Click the mouse on the map background (anywhere but on the stack itself) to switch to a Simple View, which just shows the current displayed time if product data is loaded. Right Click again to show all Map Resources. Right Click again to switch back to Product Resources.","title":"Right-Click Background to Cycle Resource Views"},{"location":"cave/d2d-perspective/#hold-right-click-resource-name-for-menu","text":"Drag the mouse over a loaded resource and hold the right mouse button until a menu appears. The hold-right-click menu allows you to control individual resource Image Properties , Change Colormaps , change resource color, width, density, and magnification, move resources up and down in the stack, as well as configure custom options with other interactive resources. This menu also gives you the option to unload this specific product , as opposed to removing all data products. Simply select the Unload option at the bottom of the resource's hold-right-click menu.","title":"Hold-Right-Click Resource Name for Menu"},{"location":"cave/d2d-perspective/#unload-a-single-resource","text":"To remove or \"unload\" a single resource, use the Resource Menu. 
Open the Resource Menu on the resource you wish to remove and select Unload .","title":"Unload a Single Resource"},{"location":"cave/d2d-perspective/#display-menu","text":"The display menu has many options which can alter the functionality in CAVE.","title":"Display Menu"},{"location":"cave/d2d-perspective/#hold-right-click-background-for-display-menu","text":"Holding down the right mouse button anywhere in the map view will open a right-click menu.","title":"Hold-Right-Click Background for Display Menu"},{"location":"cave/d2d-perspective/#show-map-legends","text":"From the above menu select Show Map Legends and watch the Resource Stack show only map resources which are loaded to the view.","title":"Show Map Legends"},{"location":"cave/d2d-perspective/#sample-loaded-resources","text":"Most data types have a right-click menu option for reading out the pixel value, displayed as multi-line text for multiple resources. This can be toggled on and off by selecting the Sample option in the Display Menu.","title":"Sample Loaded Resources"},{"location":"cave/d2d-perspective/#toggle-2-or-4-panel-layout","text":"Right-click hold in the view and select Two Panel Layout or Four Panel Layout to create duplicates of the current view. Notice the readout is at the same position in both panels. Any mouse movement made on one panel will be made on the other. By default, loading any data will load that data onto both panels. However, there is the option to specify which panel you would like to load data into, which can be useful if you want to have different data in each of the panels. To access this option, simply hold-right-click to pull up the Display menu and choose Load to This Panel as shown below: Now, a yellow L will appear in the lower left hand corner of the panel you selected to load data to. When data is loaded from the menus it will only load to the display designated with the L. 
Switch back to loading in both panels by using the Load to All Panels option in the Display Menu. From this multi-pane display, hold-right-click again and you will see the Single Panel Layout option to switch back to a standard view (defaulting to the left of two, and top-left of four).","title":"Toggle 2 or 4-Panel Layout"},{"location":"cave/d2d-perspective/#unload-data","text":"Select Unload All Products to remove all loaded graphic and image products from the display and start fresh. Select Unload Graphics to remove all but the image products.","title":"Unload Data"},{"location":"cave/d2d-perspective/#product-browser","text":"The Product Browser allows users to browse a complete data inventory in a side window, organized by data type. To open the Product Browser, either select the icon in the toolbar ( ), or go to the menu: CAVE > Data Browsers > Product Browser . Selections for Grid , Lightning , Maps , Radar , Redbook , and Satellite are available. All products loaded with the Product Browser are given default settings. Note : The Linux and Mac versions also have a selection for GFE available.","title":"Product Browser"},{"location":"cave/d2d-perspective/#options-menu","text":"There are several toggle options and options dialogs that are available under the Options menu found at the top of the application.","title":"Options Menu"},{"location":"cave/d2d-perspective/#time-options-ctrl-t","text":"This check button enables/disables the ability to select the time interval between frames of real-time or model data. This feature has the added benefit of allowing you to view extended amounts of data (temporally) but stay within the limits of 64 frames. For example, METAR surface plots, which typically display every hour, can be set to display every three hours via the Select Valid Time and Time Resolution Dialog Box. 
When the Time Options check button is selected, the next product you choose to display in the Main Display Pane launches either the Select Valid Time and Time Resolution dialog box or the Select Offset and Tolerance dialog box. When you are loading data to an empty display and the Time Options check button is enabled, the Select Valid Time and Time Resolution dialog box opens. Valid Time: In this column of dates/times, you may choose the one that will be the first frame loaded onto the Large Display Pane. The Default option is the most recent data. Time Resolution: This column contains various time increments in which the data can be displayed. Once you make a selection, the Valid Time Column indents the exact times that will be displayed. The Default resolution displays the most recent frames available. With the Time Options check button enabled for a display that already contains data, when you choose the data to be overlaid in the Main Display Pane, the Select Offset and Tolerance dialog box appears, providing the following options: Offset : This column contains various time increments at intervals before, at, or after the time you selected for the first product that is displayed in the Main Display Pane. Tolerance : The options in this column refer to how strict the time matching is. \"None\" means an exact match, while \"Infinite\" will put the closest match in each frame, regardless of how far off it is.","title":"Time Options (Ctrl + T)"},{"location":"cave/d2d-perspective/#image-combination-insert","text":"This check button enables/disables the ability to display two images at once. Combined-image displays have been improved by removing the valid time for non-forecast products and removing the date string (time is kept) from the left side of the legend. 
In particular, this makes All-Tilts radar legends more usable.","title":"Image Combination (Insert)"},{"location":"cave/d2d-perspective/#display-properties","text":"This menu option opens the Display Properties dialog box. Most of the options available in this dialog box are also available on the Toolbar , while the rest are available in the individual resource menus if that resource uses these properties.","title":"Display Properties"},{"location":"cave/d2d-perspective/#loop-properties-ctrl-l","text":"Loop Properties is another dialog box that can be opened from the Options menu or from the Loop Properties iconified button on the D2D Toolbar, or by using the Ctrl + L keyboard shortcut. The dialog allows you to adjust the forward and backward speeds, with 0 = off and 10 = maximum speed. You can set the duration of the first and last frame dwell times to between zero and 2.5 seconds. You can turn looping on or off by checking the Looping check button. There is also a Looping button located on the Toolbar that enables/disables the animation in the large display pane. Finally, you can turn looping on and increase/decrease forward speed by pressing Page Up/Page Down on your keyboard, and turn looping off with the Left or Right Arrow keys. On the toolbar, you can use the button to start/stop looping.","title":"Loop Properties (Ctrl + L)"},{"location":"cave/d2d-perspective/#image-properties-ctrl-i","text":"The Image Properties dialog box can be opened here (in the Options menu) or by using the Image Properties iconified button on the D2D Toolbar ( ), or using the Ctrl + I keyboard shortcut. 
This dialog box provides options that allow you to change the color table; adjust the brightness, contrast, and alpha of either a single image or combined images; fade between combined images; and/or interpolate the displayed data.","title":"Image Properties (Ctrl + I)"},{"location":"cave/d2d-perspective/#set-time","text":"This option allows you to set the CAVE clock, located on the bottom of the screen, to an earlier time for reviewing archived data.","title":"Set Time"},{"location":"cave/d2d-perspective/#set-background-color","text":"You can now set the background display color on your workstation. You can also set the background display color for a single pane via mouse Button 3 (B3).","title":"Set Background Color"},{"location":"cave/d2d-perspective/#switching-perspectives","text":"Switching perspectives in CAVE can be found in the CAVE > Perspective menu. D2D is one of many available CAVE perspectives. By selecting the CAVE > Perspective menu you can switch into the GFE or Localization perspective. The GFE perspective is not currently working on the direct Windows or MacOS installations of CAVE. The GFE perspective is only enabled for the OAX site.","title":"Switching Perspectives"},{"location":"cave/d2d-perspective/#cave-preferences","text":"Preferences and settings for the CAVE client can be found in the CAVE > Preferences menu. Set the Localization Site and server for the workstation; configure mouse operations, change performance levels, font magnification, and text workstation hostname.","title":"CAVE Preferences"},{"location":"cave/d2d-perspective/#load-mode","text":"Within the Display Properties dialog is the Load Mode option, which provides different ways to display data by manipulating previous model runs and inventories of data sets. The selected load mode is shown on the toolbar when the Load Mode menu is closed, and can also be changed by using this toolbar option. A description of the Load Mode options follows. 
Latest : Displays forecast data only from the latest model run, but also backfills at the beginning of the loop with available frames from previous runs to satisfy the requested number of frames. Valid time seq : Displays the most recent data and fills empty frames with previous data. For models, it provides the product from the latest possible run for every available valid time. No Backfill : Displays model data only from the most recent model run time with no backfilling to fill out a loop. Using this Load Mode prevents the mixing of old and new data. Previous run : Displays the previous model run, backfilling with frames from previous runs at the beginning of the loop to satisfy the requested number of frames. Prev valid time seq : Displays the previous model run and fills empty frames with previous model data or analyses. Prognosis loop : Shows a sequence of n-hour forecasts from successive model runs. Analysis loop : Loads a sequence of model analyses but no forecasts. dProg/dt : Selects forecasts from different model runs that all have the same valid times. This load mode is available only when there are no other products loaded in the large display pane. Forced : Puts the latest version of a selected product in all frames without time-matching. Forecast match : Overlays a model product only when its forecast times match those of an initially loaded product. This load mode is available only when another product is already loaded in the large display pane. Inventory : Selecting a product when the load mode is set to Inventory brings up a Dialog Box with the available forecast and inventory times from which you can select the product you want. Inventory loads into the currently displayed frame. 
Slot : Puts the latest version of a selected product in the currently displayed frame.","title":"Load Mode"},{"location":"cave/goes-16-17-satellite/","text":"GOES 16/17 \uf0c1 The goesr EDEX decoder supports the ingest of GOES products coming over NOAAPort and Unidata's IDD. These include single channel imagery , derived products (Level 2b netCDF files), gridded Geostationary Lightning Mapper (GLM) products (produced by Eric Bruning at Texas Tech), CIRA created RGB specific products, and vertical temperature/moisture profiles . Using derived parameters, additional RGB and channel difference products can be loaded. The dmw EDEX decoder supports the ingest of GOES derived motion winds . GOES East and West products are accessible in the Satellite menu. The menu is broken into sections starting with common CONUS GOES East/West Combo products. There are submenus for each of the separate geospatial sectors: East Full Disk East CONUS East Mesoscale Sectors (x2) West Full Disk West CONUS West Mesoscale Sectors (x2) Hawaii Alaska Puerto Rico Each sector submenu has products for individual channels and vertical profiles, as well as submenus for derived products, channel differences, RGB Composites, GLM data, and derived motion winds. GLM data can also be found with its own submenu option a little lower down the menu and under the Surface menu. The RGB products are not available on MacOS or in a Virtual Machine running CAVE. LDM Pattern Actions \uf0c1 The Unidata IDD redistributes both the NOAAPort/SBN GOES tiled products as well as stitched together GOES products. While AWIPS can decode and ingest both, it's important to only be requesting from one or the other so you aren't creating duplicate processing. The entries that should be used for GOES data are shown below, and are found in the LDM's pqact.conf file, located in /awips2/ldm/etc . (For the full list of pqact entries, you can view this file). 
# GOES 16/17 Single Channel (ABI) via Unidata IDD NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/CloudAndMoistureImagery/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CMI-IDD/\\5\\6\\7\\8.nc4 # GOES 16/17 derived products + derived motion wind via SBN HDS ^(IXT.[8-9]9) (KNES) (..)(..)(..) FILE -close -edex /awips2/data_store/GOES/(\\3:yyyy)(\\3:mm)\\3/\\4/derivedProducts-SBN/\\1_KNES_\\2\\3\\4\\5-(seq) NOTHER ^(IXT[WXY]01) (KNES) (..)(..)(..) FILE -close -edex /awips2/data_store/GOES/(\\3:yyyy)(\\3:mm)\\3/\\4/derivedProducts-SBN/\\1_KNES_\\2\\3\\4\\5-(seq) # GOES 16 GLM Gridded Products via Texas Tech-->Unidata IDD NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/GeostationaryLightningMapper/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\3/\\6/GLM-IDD/\\4\\5\\6\\7.nc4 # GOES CIRA derived products NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/GeoColor/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/GeoColor/\\5\\6\\7\\8.nc4 NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/DebraDust/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/DebraDust/\\5\\6\\7\\8.nc4 NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/CloudSnow/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/CloudSnow/\\5\\6\\7\\8.nc4 Individual Channels \uf0c1 All geospatial sectors have 16 individual channel products that can be viewed. Below are samples of Channel 14 (11.20\u03bcm) for each of the sectors. 
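As a check on the CMI-IDD entry above, this minimal Python sketch shows how that pattern's numbered capture groups assemble the EDEX destination path used in the FILE action. The product ID below is hypothetical and simplified (real CMI filenames also carry start/end scan times); it exists only to illustrate the group substitution.

```python
import re

# Pattern from the CMI-IDD pqact.conf entry above: group 4 is the date
# directory, group 7 the two-digit hour from the creation time, and
# groups 5-8 rebuild the filename written under /awips2/data_store.
cmi = re.compile(
    r'^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/'
    r'CloudAndMoistureImagery/([^/]*)/([^/]*)/([0-9]{8})/'
    r'([^/]*)(c[0-9]{7})(..)(.....).nc')

# Hypothetical, simplified IDD product ID for illustration only
pid = ('/data/ldm/pub/native/satellite/GOES/East/Products/'
       'CloudAndMoistureImagery/CONUS/Channel02/20240509/'
       'OR_ABI-L2-CMIPC-M6C02_G16_c20241301234567.nc')

m = cmi.match(pid)
dest = '/awips2/data_store/GOES/{}/{}/CMI-IDD/{}{}{}{}.nc4'.format(
    m.group(4), m.group(7), m.group(5), m.group(6), m.group(7), m.group(8))
```

LDM performs the equivalent substitution itself via the numbered references in the FILE action; the sketch only illustrates how the groups line up.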
East CONUS 1km \uf0c1 East Full Disk 6km \uf0c1 East Mesoscale Sectors (EMESO-1, EMESO-2) \uf0c1 Two floating mesoscale sectors (location will vary day to day from image shown) West CONUS 1km \uf0c1 West Full Disk \uf0c1 West Mesoscale Sectors (WMESO-1, WMESO-2) \uf0c1 Two floating mesoscale sectors (location will vary day to day from image shown) Alaska \uf0c1 Hawaii \uf0c1 Puerto Rico (PRREGI) \uf0c1 RGB Composites \uf0c1 RGB Composites are made by combining 3 channels and are available for each sector. Quite a few new RGB products have been added in Unidata's 18.2.1 release. These products are generated on the fly in AWIPS using the existing channel products from EDEX. GOES RGB Imagery is NOT SUPPORTED on macOS or within a Virtual Machine. OpenGL Shading Language limitations prevent multi-channel imagery from displaying correctly on Mac or in a Virtual Machine. Please use the Linux or Windows installs to view RGB products. Day Cloud Phase \uf0c1 Fire Temperature \uf0c1 Day Land Cloud \uf0c1 Day Cloud Convection \uf0c1 Day Land Cloud Fires \uf0c1 VIS/IR Sandwich \uf0c1 Simple Water Vapor \uf0c1 Air Mass \uf0c1 Ash \uf0c1 Day Convection \uf0c1 Day Snow Fog \uf0c1 Differential Water Vapor \uf0c1 Dust \uf0c1 CIMSS Natural Color \uf0c1 Nighttime Microphysics \uf0c1 SO2 \uf0c1 CIRA Geocolor \uf0c1 CIRA Debra Dust \uf0c1 CIRA Cloud Snow \uf0c1 Daytime Composite 1 \uf0c1 Daytime Composite 5 \uf0c1 Channel Differences \uf0c1 Channel differences are the result of subtracting one channel from another to produce a new product. These products are generated on the fly in AWIPS using the existing channel products from EDEX. 
There are currently 10 channel differences that are offered in CAVE: Split Window (10.3 - 12.3 \u03bcm) Split Cloud Top Phase (11.2 - 8.4 \u03bcm) Night Fog (10.3 - 2.9 \u03bcm) Day Fog (3.9 - 10.3 \u03bcm) Split Fire (2.2 - 1.6 \u03bcm) Split Ozone (9.6 - 10.3 \u03bcm) Split Water Vapor (6.19 - 7.3 \u03bcm) Split Snow (1.6 - 0.64 \u03bcm) Vegetation (0.64 - 0.87 \u03bcm) Upper Level Info (11.2 - 6.19 \u03bcm) The rendering of these products uses the Jep package in Python, which has specific install instructions for Windows. Derived Products \uf0c1 Derived products are also known as Level 2+ products. Currently there are only derived products from GOES East available in AWIPS. Each sector has a different set of products available. To find out more information on some of the products, please see the Quick Guides compiled by CIRA. These may not all be available for each sector. The current products offered in CAVE are listed below and to the right is which GOES East sector they are available for (F=Full Disk, C=CONUS, M=Mesoscale): Aerosol Detection - F,C,M Aerosol Optical Depth - F,C Clear Sky Mask - F,C,M Cloud Optical Depth - F,C Cloud Particle Size - F,C,M Cloud Top Height - F,C,M Cloud Top Phase - F,C,M Cloud Top Pressure - F,C Cloud Top Temperature - F,M Derived CAPE - F,C,M Derived K-Index - F,C,M Derived Lifted Index - F,C,M Derived Showalter Index - F,C,M Derived Total Totals - F,C,M Fire Area - F,C Fire Power - F,C Fire Temperature - F,C Instrument Flight Rule (IFR) Probability - C Low IFR Probability - C Marginal Visual Flight Rules (MVFR) Probability - C Cloud Thickness - C Land Skin Temperature - F,C,M RR/QPE - F Sea Surface Temperature - F Total Precip Water - F,C,M Geostationary Lightning Mapper (GLM) \uf0c1 Dr. Eric Bruning at Texas Tech has taken the raw GLM data and coded up some new gridded products that can be ingested and displayed in AWIPS. 
Minimum Flash Area Average Flash Area Flash Extent Density Group Extent Density Total Optical Energy GLM data are located in the menu structure: Satellite > [SECTOR] > GLM Products . You can also access the data from Surface > GLM - Geostationary Lightning Mapper submenus. Derived Motion Winds \uf0c1 Derived Motion Wind Vectors are produced using sequential ABI images and can provide information about winds at different levels. The wind vectors are computed using both visible and infrared imagery. Winds can be plotted by different pressure layers or individual channels. More information can be found here . Below is an image of the winds at different pressure layers. Vertical Temperature and Moisture Profile \uf0c1 Vertical Temperature and Moisture profiles are available in AWIPS. Similar to NUCAPS, when loaded in CAVE, a circle is displayed for each location that has a vertical profile available. When clicking on the circle, NSHARP will open with the vertical temperature and moisture profile. These profiles are GFS data that have been adjusted based on the satellite observations. More information can be found here . 
HDF5 Data Store \uf0c1 Decoded GOES satellite data are stored in /awips2/edex/data/hdf5/satellite/ under sector subdirectories: drwxr-xr-x awips fxalpha 4096 AKREGI drwxr-xr-x awips fxalpha 4096 Antarctic drwxr-xr-x awips fxalpha 4096 Arctic drwxr-xr-x awips fxalpha 4096 AREA0600 drwxr-xr-x awips fxalpha 4096 AREA0700 drwxr-xr-x awips fxalpha 4096 AREA3100 drwxr-xr-x awips fxalpha 4096 AREA3101 drwxr-xr-x awips fxalpha 12288 ECONUS drwxr-xr-x awips fxalpha 4096 EFD drwxr-xr-x awips fxalpha 4096 EMESO-1 drwxr-xr-x awips fxalpha 4096 EMESO-2 drwxr-xr-x awips fxalpha 4096 HIREGI drwxr-xr-x awips fxalpha 4096 NEXRCOMP drwxr-xr-x awips fxalpha 4096 PRREGI drwxr-xr-x awips fxalpha 4096 WCONUS drwxr-xr-x awips fxalpha 4096 WFD drwxr-xr-x awips fxalpha 4096 WMESO-1 drwxr-xr-x awips fxalpha 4096 WMESO-2","title":"GOES 16/17"},{"location":"cave/goes-16-17-satellite/#goes-1617","text":"The goesr EDEX decoder supports the ingest of GOES products coming over NOAAPort and Unidata's IDD. These include single channel imagery , derived products (Level 2b netCDF files), gridded Geostationary Lightning Mapper (GLM) products (produced by Eric Bruning at Texas Tech), CIRA created RGB specific products, and vertical temperature/moisture profiles . Using derived parameters, additional RGB and channel difference products can be loaded. The dmw EDEX decoder supports the ingest of GOES derived motion winds . GOES East and West products are accessible in the Satellite menu. The menu is broken into sections starting with common CONUS GOES East/West Combo products. There are submenus for each of the separate geospatial sectors: East Full Disk East CONUS East Mesoscale Sectors (x2) West Full Disk West CONUS West Mesoscale Sectors (x2) Hawaii Alaska Puerto Rico Each sector submenu has products for individual channels and vertical profiles, as well as submenus for derived products, channel differences, RGB Composites, GLM data, and derived motion winds. 
GLM data can also be found in its own submenu option a little lower down the menu, and under the Surface menu. The RGB products are not available on MacOS or in a Virtual Machine running CAVE.","title":"GOES 16/17"},{"location":"cave/goes-16-17-satellite/#ldm-pattern-actions","text":"The Unidata IDD redistributes both the NOAAPort/SBN GOES tiled products as well as stitched together GOES products. While AWIPS can decode and ingest both, it's important to only be requesting from one or the other so you aren't creating duplicate processing. The entries that should be used for GOES data are shown below; they are found in the LDM's pqact.conf file, located in /awips2/ldm/etc . (For the full list of pqact entries, you can view this file). # GOES 16/17 Single Channel (ABI) via Unidata IDD NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/CloudAndMoistureImagery/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CMI-IDD/\\5\\6\\7\\8.nc4 # GOES 16/17 derived products + derived motion wind via SBN HDS ^(IXT.[8-9]9) (KNES) (..)(..)(..) FILE -close -edex /awips2/data_store/GOES/(\\3:yyyy)(\\3:mm)\\3/\\4/derivedProducts-SBN/\\1_KNES_\\2\\3\\4\\5-(seq) NOTHER ^(IXT[WXY]01) (KNES) (..)(..)(..) 
FILE -close -edex /awips2/data_store/GOES/(\\3:yyyy)(\\3:mm)\\3/\\4/derivedProducts-SBN/\\1_KNES_\\2\\3\\4\\5-(seq) # GOES 16 GLM Gridded Products via Texas Tech-->Unidata IDD NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/GeostationaryLightningMapper/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\3/\\6/GLM-IDD/\\4\\5\\6\\7.nc4 # GOES CIRA derived products NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/GeoColor/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/GeoColor/\\5\\6\\7\\8.nc4 NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/DebraDust/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/DebraDust/\\5\\6\\7\\8.nc4 NIMAGE ^/data/ldm/pub/native/satellite/GOES/([^/]*)/Products/CloudSnow/([^/]*)/([^/]*)/([0-9]{8})/([^/]*)(c[0-9]{7})(..)(.....).nc FILE -close -edex /awips2/data_store/GOES/\\4/\\7/CIRA/CloudSnow/\\5\\6\\7\\8.nc4","title":"LDM Pattern Actions"},{"location":"cave/goes-16-17-satellite/#individual-channels","text":"All geospatial sectors have 16 individual channel products that can be viewed. 
Below are samples of Channel 14 (11.20\u03bcm) for each of the sectors.","title":"Individual Channels"},{"location":"cave/goes-16-17-satellite/#east-conus-1km","text":"","title":"East CONUS 1km"},{"location":"cave/goes-16-17-satellite/#east-full-disk-6km","text":"","title":"East Full Disk 6km"},{"location":"cave/goes-16-17-satellite/#east-mesoscale-sectors-emeso-1-emeso-2","text":"Two floating mesoscale sectors (location will vary day to day from image shown)","title":"East Mesoscale Sectors (EMESO-1, EMESO-2)"},{"location":"cave/goes-16-17-satellite/#west-conus-1km","text":"","title":"West CONUS 1km"},{"location":"cave/goes-16-17-satellite/#west-full-disk","text":"","title":"West Full Disk"},{"location":"cave/goes-16-17-satellite/#west-mesoscale-sectors-wmeso-1-wmeso-2","text":"Two floating mesoscale sectors (location will vary day to day from image shown)","title":"West Mesoscale Sectors (WMESO-1, WMESO-2)"},{"location":"cave/goes-16-17-satellite/#alaska","text":"","title":"Alaska"},{"location":"cave/goes-16-17-satellite/#hawaii","text":"","title":"Hawaii"},{"location":"cave/goes-16-17-satellite/#puerto-rico-prregi","text":"","title":"Puerto Rico (PRREGI)"},{"location":"cave/goes-16-17-satellite/#rgb-composites","text":"RGB Composites are made by combining 3 channels and are available for each sector. Quite a few new RGB products have been added in Unidata's 18.2.1 release. These products are generated on the fly in AWIPS using the existing channel products from EDEX. GOES RGB Imagery is NOT SUPPORTED on macOS or within a Virtual Machine OpenGL Shading Language limitations prevent multi-channel imagery from displaying correctly on Mac or in a Virtual Machine. 
Please use the Linux or Windows installs to view RGB products.","title":"RGB Composites"},{"location":"cave/goes-16-17-satellite/#day-cloud-phase","text":"","title":"Day Cloud Phase"},{"location":"cave/goes-16-17-satellite/#fire-temperature","text":"","title":"Fire Temperature"},{"location":"cave/goes-16-17-satellite/#day-land-cloud","text":"","title":"Day Land Cloud"},{"location":"cave/goes-16-17-satellite/#day-cloud-convection","text":"","title":"Day Cloud Convection"},{"location":"cave/goes-16-17-satellite/#day-land-cloud-fires","text":"","title":"Day Land Cloud Fires"},{"location":"cave/goes-16-17-satellite/#visir-sandwich","text":"","title":"VIS/IR Sandwich"},{"location":"cave/goes-16-17-satellite/#simple-water-vapor","text":"","title":"Simple Water Vapor"},{"location":"cave/goes-16-17-satellite/#air-mass","text":"","title":"Air Mass"},{"location":"cave/goes-16-17-satellite/#ash","text":"","title":"Ash"},{"location":"cave/goes-16-17-satellite/#day-convection","text":"","title":"Day Convection"},{"location":"cave/goes-16-17-satellite/#day-snow-fog","text":"","title":"Day Snow Fog"},{"location":"cave/goes-16-17-satellite/#differential-water-vapor","text":"","title":"Differential Water Vapor"},{"location":"cave/goes-16-17-satellite/#dust","text":"","title":"Dust"},{"location":"cave/goes-16-17-satellite/#cimss-natural-color","text":"","title":"CIMSS Natural Color"},{"location":"cave/goes-16-17-satellite/#nighttime-microphysics","text":"","title":"Nighttime Microphysics"},{"location":"cave/goes-16-17-satellite/#so2","text":"","title":"SO2"},{"location":"cave/goes-16-17-satellite/#cira-geocolor","text":"","title":"CIRA Geocolor"},{"location":"cave/goes-16-17-satellite/#cira-debra-dust","text":"","title":"CIRA Debra Dust"},{"location":"cave/goes-16-17-satellite/#cira-cloud-snow","text":"","title":"CIRA Cloud Snow"},{"location":"cave/goes-16-17-satellite/#daytime-composite-1","text":"","title":"Daytime Composite 
1"},{"location":"cave/goes-16-17-satellite/#daytime-composite-5","text":"","title":"Daytime Composite 5"},{"location":"cave/goes-16-17-satellite/#channel-differences","text":"Channel differences are the result of subtracting one channel from another to produce a new product. These products are generated on the fly in AWIPS using the existing channel products from EDEX. There are currently 10 channel differences that are offered in CAVE: Split Window (10.3 - 12.3 \u03bcm) Split Cloud Top Phase (11.2 - 8.4 \u03bcm) Night Fog (10.3 - 2.9 \u03bcm) Day Fog (3.9 - 10.3 \u03bcm) Split Fire (2.2 - 1.6 \u03bcm) Split Ozone (9.6 - 10.3 \u03bcm) Split Water Vapor (6.19 - 7.3 \u03bcm) Split Snow (1.6 - 0.64 \u03bcm) Vegetation (0.64 - 0.87 \u03bcm) Upper Level Info (11.2 - 6.19 \u03bcm) The rendering of these products uses the Jep package in Python, which has specific install instructions for Windows.","title":"Channel Differences"},{"location":"cave/goes-16-17-satellite/#derived-products","text":"Derived products are also known as Level 2+ products. Currently there are only derived products from GOES East available in AWIPS. Each sector has a different set of products available. To find more information on some of the products, please see the Quick Guides compiled by CIRA. These may not all be available for each sector. 
The current products offered in CAVE are listed below, along with the GOES East sectors they are available for (F=Full Disk, C=CONUS, M=Mesoscale): Aerosol Detection - F,C,M Aerosol Optical Depth - F,C Clear Sky Mask - F,C,M Cloud Optical Depth - F,C Cloud Particle Size - F,C,M Cloud Top Height - F,C,M Cloud Top Phase - F,C,M Cloud Top Pressure - F,C Cloud Top Temperature - F,M Derived CAPE - F,C,M Derived K-Index - F,C,M Derived Lifted Index - F,C,M Derived Showalter Index - F,C,M Derived Total Totals - F,C,M Fire Area - F,C Fire Power - F,C Fire Temperature - F,C Instrument Flight Rule (IFR) Probability - C Low IFR Probability - C Marginal Visual Flight Rules (MVFR) Probability - C Cloud Thickness - C Land Skin Temperature - F,C,M RR/QPE - F Sea Surface Temperature - F Total Precip Water - F,C,M","title":"Derived Products"},{"location":"cave/goes-16-17-satellite/#geostationary-lightning-mapper-glm","text":"Dr. Eric Bruning at Texas Tech has taken the raw GLM data and coded up some new gridded products that can be ingested and displayed in AWIPS. Minimum Flash Area Average Flash Area Flash Extent Density Group Extent Density Total Optical Energy GLM data are located in the menu structure: Satellite > [SECTOR] > GLM Products . You can also access the data from Surface > GLM - Geostationary Lightning Mapper submenus.","title":"Geostationary Lightning Mapper (GLM)"},{"location":"cave/goes-16-17-satellite/#derived-motion-winds","text":"Derived Motion Wind Vectors are produced using sequential ABI images and can provide information about winds at different levels. The wind vectors are computed using both visible and infrared imagery. Winds can be plotted by different pressure layers or individual channels. More information can be found here . 
Below is an image of the winds at different pressure layers.","title":"Derived Motion Winds"},{"location":"cave/goes-16-17-satellite/#vertical-temperature-and-moisture-profile","text":"Vertical Temperature and Moisture profiles are available in AWIPS. Similar to NUCAPS, when loaded in CAVE, a circle is displayed for each location that has a vertical profile available. When clicking on the circle, NSHARP will open with the vertical temperature and moisture profile. These profiles are GFS data that have been adjusted based on the satellite observations. More information can be found here .","title":"Vertical Temperature and Moisture Profile"},{"location":"cave/goes-16-17-satellite/#hdf5-data-store","text":"Decoded GOES satellite data are stored in /awips2/edex/data/hdf5/satellite/ under sector subdirectories: drwxr-xr-x awips fxalpha 4096 AKREGI drwxr-xr-x awips fxalpha 4096 Antarctic drwxr-xr-x awips fxalpha 4096 Arctic drwxr-xr-x awips fxalpha 4096 AREA0600 drwxr-xr-x awips fxalpha 4096 AREA0700 drwxr-xr-x awips fxalpha 4096 AREA3100 drwxr-xr-x awips fxalpha 4096 AREA3101 drwxr-xr-x awips fxalpha 12288 ECONUS drwxr-xr-x awips fxalpha 4096 EFD drwxr-xr-x awips fxalpha 4096 EMESO-1 drwxr-xr-x awips fxalpha 4096 EMESO-2 drwxr-xr-x awips fxalpha 4096 HIREGI drwxr-xr-x awips fxalpha 4096 NEXRCOMP drwxr-xr-x awips fxalpha 4096 PRREGI drwxr-xr-x awips fxalpha 4096 WCONUS drwxr-xr-x awips fxalpha 4096 WFD drwxr-xr-x awips fxalpha 4096 WMESO-1 drwxr-xr-x awips fxalpha 4096 WMESO-2","title":"HDF5 Data Store"},{"location":"cave/import-export/","text":"Import/Export \uf0c1 Export Images/GIFs \uf0c1 The D2D screen can be exported as a PNG image as well as an animated GIF using the File > Export > Image menu option. This captures the current state of the screen, and allows you to set animation options (frame number, dwell time, etc) for exporting GIFs. 
If you choose to animate, you will either need to rename the destination file to have the .gif extension, or CAVE will pop up a dialog when you go to save, asking you to confirm that you want to output a GIF. Export KML \uf0c1 The Export submenu also includes a KML option ( File > Export > KML ), which allows users to save D2D displays or GFE grids in the KML (Keyhole Markup Language) file format. When zipped (compressed), the KML file format forms a KMZ file, which can be used in applications such as Google Earth. The KML dialog box includes options to select frames to export. This includes exporting all frames, the current/displayed frame, a range of frames, and, in GFE, the selected time range as highlighted in the Grid Manager. Additional options are available for selection under the \"Other Options\" section: Export Hidden : When selected, all displayed and hidden products listed in the Product Legend section of the Main Display Pane will be exported. Export Maps : When selected, all enabled maps displayed within the Main Display Pane will be exported. Shade Earth : When selected, a shaded background is applied to the exported product. If loaded in Google Earth, the earth will be overlaid with a black backdrop, and data will be displayed as it would in D2D with a black background. Show Background Tiles : When selected, data (such as plot data) will display on top of black tiles when loaded in Google Earth. CAVE Import Formats \uf0c1 CAVE supports the following geo-referenced data file formats, which can be imported through the File > Import menu. Background... Image... BCD File GeoTIFF LPI File SPI File Displays CAVE Export Formats \uf0c1 CAVE can export to the following through the File > Export menu. Image Print Screen KML Editor Display... 
Perspective Displays...","title":"Import/Export"},{"location":"cave/import-export/#importexport","text":"","title":"Import/Export"},{"location":"cave/import-export/#export-imagesgifs","text":"The D2D screen can be exported as a PNG image as well as an animated GIF using the File > Export > Image menu option. This captures the current state of the screen, and allows you to set animation options (frame number, dwell time, etc) for exporting GIFs. If you choose to animate, you will either need to rename the destination file to have the .gif extension, or CAVE will pop up a dialog when you go to save, asking you to confirm that you want to output a GIF.","title":"Export Images/GIFs"},{"location":"cave/import-export/#export-kml","text":"The Export submenu also includes a KML option ( File > Export > KML ), which allows users to save D2D displays or GFE grids in the KML (Keyhole Markup Language) file format. When zipped (compressed), the KML file format forms a KMZ file, which can be used in applications such as Google Earth. The KML dialog box includes options to select frames to export. This includes exporting all frames, the current/displayed frame, a range of frames, and, in GFE, the selected time range as highlighted in the Grid Manager. Additional options are available for selection under the \"Other Options\" section: Export Hidden : When selected, all displayed and hidden products listed in the Product Legend section of the Main Display Pane will be exported. Export Maps : When selected, all enabled maps displayed within the Main Display Pane will be exported. Shade Earth : When selected, a shaded background is applied to the exported product. If loaded in Google Earth, the earth will be overlaid with a black backdrop, and data will be displayed as it would in D2D with a black background. 
Show Background Tiles : When selected, data (such as plot data) will display on top of black tiles when loaded in Google Earth.","title":"Export KML"},{"location":"cave/import-export/#cave-import-formats","text":"CAVE supports the following geo-referenced data file formats, which can be imported through the File > Import menu. Background... Image... BCD File GeoTIFF LPI File SPI File Displays","title":"CAVE Import Formats"},{"location":"cave/import-export/#cave-export-formats","text":"CAVE can export to the following through the File > Export menu. Image Print Screen KML Editor Display... Perspective Displays...","title":"CAVE Export Formats"},{"location":"cave/localization-perspective/","text":"Localization perspective \uf0c1 Localization Levels \uf0c1 AWIPS uses a hierarchical system known as Localization to configure many aspects of EDEX and CAVE, such as available menu items, color maps, and derived parameters. This system allows a user to override existing configurations and customize CAVE. For example, a User -level localization file will supersede any similar file in a higher level (such as Site ). There are three levels of localization , starting with the default BASE BASE - default SITE - 3-letter WFO ID (required) overrides base USER - user-level localization overrides site and base Localization Editor \uf0c1 The Localization Perspective acts as a file editor for the XML, Python, and text files which customize the look and feel of CAVE. This perspective is available in the menu CAVE > Perspective > Localization . Users may copy and add files to available directories at their own User localization version. Examples of things that can be accessed through the perspective include (this list is not all-inclusive): NCP Predefined Areas, Color Maps and Style Rules D2D Volume Browser Controls D2D Bundles - Scales (WFO, State(s), etc.) 
CAVE Map Overlays, Color Maps and Style Rules GFE Tools and Utilities The left panel contains a directory hierarchy of CAVE files for D2D, GFE, and NCP, which can be copied and edited as user localization files. There may be several versions of each file including BASE , CONFIGURED (GFE only), SITE , and USER . Each file version is listed separately under the actual file name. The File Editor view opens the selected configuration file in an appropriate editor. For example, a Python file is opened in a Python editor, and an XML file is opened in an XML editor. Customizing CAVE Menus \uf0c1 Navigate to D2D > Menus and select a submenu (e.g. satellite ). This directory lists all of the menu file contributions made by this data plugin. Most data menu directories will have an index.xml file from which you can investigate the menu structure and make needed changes. Selecting a file such as index.xml (by double clicking, or expanding) will show a sub-menu with a default localization level (typically BASE or CONFIGURED ). Double-click this file to open in the file editor (you may need to click Source at the bottom of the view to see the raw XML). Right-click this file and select Copy To > User ( username ) and you will see the file localization versions update with the new copy. Select this file to edit, and override, the existing version.","title":"Localization Perspective"},{"location":"cave/localization-perspective/#localization-perspective","text":"","title":"Localization perspective"},{"location":"cave/localization-perspective/#localization-levels","text":"AWIPS uses a hierarchical system known as Localization to configure many aspects of EDEX and CAVE, such as available menu items, color maps, and derived parameters. This system allows a user to override existing configurations and customize CAVE. For example, a User -level localization file will supersede any similar file in a higher level (such as Site ). 
There are three levels of localization , starting with the default BASE BASE - default SITE - 3-letter WFO ID (required) overrides base USER - user-level localization overrides site and base","title":"Localization Levels"},{"location":"cave/localization-perspective/#localization-editor","text":"The Localization Perspective acts as a file editor for the XML, Python, and text files which customize the look and feel of CAVE. This perspective is available in the menu CAVE > Perspective > Localization . Users may copy and add files to available directories at their own User localization version. Examples of things that can be accessed through the perspective include (this list is not all-inclusive): NCP Predefined Areas, Color Maps and Style Rules D2D Volume Browser Controls D2D Bundles - Scales (WFO, State(s), etc.) CAVE Map Overlays, Color Maps and Style Rules GFE Tools and Utilities The left panel contains a directory hierarchy of CAVE files for D2D, GFE, and NCP, which can be copied and edited as user localization files. There may be several versions of each file including BASE , CONFIGURED (GFE only), SITE , and USER . Each file version is listed separately under the actual file name. The File Editor view opens the selected configuration file in an appropriate editor. For example, a Python file is opened in a Python editor, and an XML file is opened in an XML editor.","title":"Localization Editor"},{"location":"cave/localization-perspective/#customizing-cave-menus","text":"Navigate to D2D > Menus and select a submenu (e.g. satellite ). This directory lists all of the menu file contributions made by this data plugin. Most data menu directories will have an index.xml file from which you can investigate the menu structure and make needed changes. Selecting a file such as index.xml (by double clicking, or expanding) will show a sub-menu with a default localization level (typically BASE or CONFIGURED ). 
Double-click this file to open in the file editor (you may need to click Source at the bottom of the view to see the raw XML). Right-click this file and select Copy To > User ( username ) and you will see the file localization versions update with the new copy. Select this file to edit, and override, the existing version.","title":"Customizing CAVE Menus"},{"location":"cave/maps-views-projections/","text":"Maps, Views, Projections \uf0c1 Default Map Scales \uf0c1 The first toolbar menu item is a dropdown menu for different geographic areas and map projections. The default view is always CONUS , which is a North Polar Stereographic projection centered on the Continental United States. Default projections and areas available in the menu CONUS N. Hemisphere (North Polar Stereographic) Regional (for the selected localization site) WFO (for the selected localization site) World - Mercator World - CED World - Mollweide GOES East Full Disk (Geostationary) GOES West Full Disk (Geostationary) Regional Mercator projections for Africa Alaska Antarctica Arctic Australia,New Zealand Europe Hawaii Japan Pacific Ocean Puerto Rico South America WFO (Has a submenu which contains a map scale for every NWS localization site) New Map Editor / View \uf0c1 Adding a New Map Editor \uf0c1 This can be done in two ways: using the file menu and right clicking on the tab bar. Using the file menu, simply go to: File > New Map . This opens a new map editor tab with the default projection (CONUS Polar Stereographic). To use the tab bar, right-click on or next to any tab and select New Editor Renaming Map Editor \uf0c1 Any of the map editor tabs can be renamed. This can be particularly helpful if you have multiple tabs, with a different focus on each (i.e. different geographic region, different types of data, etc). 
New Projection \uf0c1 A new map projection can be created using the file menu: File > New Projection .","title":"Maps, Views, Projections"},{"location":"cave/maps-views-projections/#maps-views-projections","text":"","title":"Maps, Views, Projections"},{"location":"cave/maps-views-projections/#default-map-scales","text":"The first toolbar menu item is a dropdown menu for different geographic areas and map projections. The default view is always CONUS , which is a North Polar Stereographic projection centered on the Continental United States. Default projections and areas available in the menu CONUS N. Hemisphere (North Polar Stereographic) Regional (for the selected localization site) WFO (for the selected localization site) World - Mercator World - CED World - Mollweide GOES East Full Disk (Geostationary) GOES West Full Disk (Geostationary) Regional Mercator projections for Africa Alaska Antarctica Arctic Australia,New Zealand Europe Hawaii Japan Pacific Ocean Puerto Rico South America WFO (Has a submenu which contains a map scale for every NWS localization site)","title":"Default Map Scales"},{"location":"cave/maps-views-projections/#new-map-editor-view","text":"","title":"New Map Editor / View"},{"location":"cave/maps-views-projections/#adding-a-new-map-editor","text":"This can be done in two ways: using the file menu and right clicking on the tab bar. Using the file menu, simply go to: File > New Map . This opens a new map editor tab with the default projection (CONUS Polar Stereographic). To use the tab bar, right-click on or next to any tab and select New Editor","title":"Adding a New Map Editor"},{"location":"cave/maps-views-projections/#renaming-map-editor","text":"Any of the map editor tabs can be renamed. This can be particularly helpful if you have multiple tabs, with a different focus on each (i.e. 
different geographic region, different types of data, etc).","title":"Renaming Map Editor"},{"location":"cave/maps-views-projections/#new-projection","text":"A new map projection can be created using the file menu: File > New Projection .","title":"New Projection"},{"location":"cave/nsharp/","text":"NSHARP \uf0c1 NSHARP, which stands for the National Center Sounding and Hodograph Analysis and Research Program, is an AWIPS plugin originally based on NAWIPS NSHAREP, SPC's BigSHARP sounding display tool, and the Python package SHARpy . NSHARP is available in a number of ways in CAVE: From the D2D toolbar select the NSHARP icon: From the Upper Air menu select NSHARP Soundings From the Upper Air menu select a station from the RAOB menus From the Upper Air menu select NUCAPS Soundings From the Models or Tools menu select Volume Browser Make sure Sounding is selected from the menu at the top Select a source from the Volume menu Make sure it has data, signified by a green box to the right Select Soundings from the Fields menu Select any point from the Planes menu and an option will load in the table To create a new point, go to Tools > Points and use the right-click-hold menu to create a new point anywhere on the map Use the Load button to load data and open the NSharp display NSHARP Configurations \uf0c1 NSHARP has four configurations for use in different operational settings: SPC Wide - more insets and graphs at the expense of timeline/station inventory. D2D Skewt Standard - default for WFOs, larger SkewT with inventory, no Wind/Height, temperature advection, insets, or graphs. D2D Lite - Skew-T, table, and inventory only. OPC - Ocean Prediction Center display. 
To change the NSHARP configuration: Open the NSHARP(D2D) controls tab by clicking on the Nsharp toolbar ( ) icon again Click the Configure button Click Display Pane Configuration (third from the bottom) Use the dropdown to choose a configuration, apply, save, close If you would like to interactively explore the different graphical areas in NSHARP on the Web , see the NSHARP Interactive Overview . Skew-T Display \uf0c1 The Skew-T display renders a vertical profile of temperature, dew point, and wind for RAOBs and model point soundings using a Skew-T Log-P diagram. There are many different display configurations available. The one shown in these next few sections is the SPC Wide Screen Configuration . The box in the upper-left of the main display is linked to the cursor readout when over the SkewT chart. It reports the temperature, dewpoint, wind direction and speed, pressure, height AGL, and relative humidity of the trace. Skew-T is the default upper air chart in AWIPS, and can be changed to a turbulence display ( T ) or an icing display ( I ). These options are available as buttons at the bottom of the NSHARP(D2D) controls tab (mentioned in NSHARP Configurations ). Use the AWIPS-2 NSHARP Interactive Overview page for more information about the Skew-T display. Windspeed vs Height and Inferred Temperature Advection \uf0c1 The windspeed vs height and inferred temperature advection with height plot is situated next to the SkewT to show the values at the same heights. Inferred temperature advection is from the thermal wind. Use the AWIPS-2 NSHARP Interactive Overview page for more information. Hodograph Display \uf0c1 This panel contains the hodograph display from the sounding data. The rings in the hodograph represent the wind speed in 20 knot increments. The hodograph trace uses different colors to highlight wind observations in 3 km height increments. 
This display also contains information such as the mean wind, Bunkers Left/Right Moving storm motion, upshear and downshear Corfidi vectors, and a user-defined motion. Use the AWIPS NSHARP Interactive Overview page for more information about the hodograph display. Insets \uf0c1 In the SPC Wide Screen Configuration there are four small insets beneath the hodograph containing storm-relative windspeed versus height, a Storm Slinky, Theta-E vs Pressure, Possible Watch Type, Theta-E vs Height, and storm-relative wind vectors. There are buttons in the NSHARP(D2D) control button tab that toggle the six possible contents in the four boxes. There are also two buttons PvIn and NxIn in the control tab that can be used to cycle through the previous and next inset. Use the AWIPS NSHARP Interactive Overview page for more information. Table Output Displays \uf0c1 The Table Output Displays contains five different pages of parameters ranging from parcel instability to storm relative shear to severe hazards potential. There are two buttons PtDt and NxDt in the controls tab that can be used to cycle through the previous and next tables. Use the AWIPS NSHARP Interactive Overview page for more information on the tables and a list/definition of the parameters available. Graphs/Statistics \uf0c1 In the SPC Wide Screen Configuration there are two graph boxes under the insets, and they can display information on Enhanced Bulk Shear, Significant Tornado Parameter, Significant Hail Parameter (SHIP), Winter Weather, Fire Weather, Hail model (not implemented), and the Sounding Analog Retrieval System (SARS). There are buttons in the NSHARP(D2D) control button tab that toggle the six possible contents in the two boxes. Use the AWIPS NSHARP Interactive Overview page for more information. Sounding Inventory \uf0c1 The display configuration used in this example is the D2D Standard SkewT Screen Configuration . 
This section controls the inventory of the soundings that have been loaded for potential display in NSHARP. The different colors of the text indicate whether a sounding/station is being displayed, available for display, or not available for display. Use the AWIPS NSHARP Interactive Overview page for more information on how to use the sounding inventory and time line.","title":"NSHARP"},{"location":"cave/nsharp/#nsharp","text":"NSHARP, which stands for the National Center Sounding and Hodograph Analysis and Research Program, is an AWIPS plugin originally based on NAWIPS NSHAREP, SPC's BigSHARP sounding display tool, and the Python package SHARpy . NSHARP is available in a number of ways in CAVE: From the D2D toolbar select the NSHARP icon: From the Upper Air menu select NSHARP Soundings From the Upper Air menu select a station from the RAOB menus From the Upper Air menu select NUCAPS Soundings From the Models or Tools menu select Volume Browser Make sure Sounding is selected from the menu at the top Select a source from the Volume menu Make sure it has data, signified by a green box to the right Select Soundings from the Fields menu Select any point from the Planes menu and an option will load in the table To create a new point, go to Tools > Points and use the right-click-hold menu to create a new point anywhere on the map Use the Load button to load data and open the NSharp display","title":"NSHARP"},{"location":"cave/nsharp/#nsharp-configurations","text":"NSHARP has four configurations for use in different operational settings: SPC Wide - more insets and graphs at the expense of timeline/station inventory. D2D Skewt Standard - default for WFOs, larger SkewT with inventory, no Wind/Height, temperature advection, insets, or graphs. D2D Lite - Skew-T, table, and inventory only. OPC - Ocean Prediction Center display. 
To change the NSHARP configuration: Open the NSHARP(D2D) controls tab by clicking on the Nsharp toolbar ( ) icon again Click the Configure button Click Display Pane Configuration (third from the bottom) Use the dropdown to choose a configuration, apply, save, close If you would like to interactively explore the different graphical areas in NSHARP on the Web , see the NSHARP Interactive Overview .","title":"NSHARP Configurations"},{"location":"cave/nsharp/#skew-t-display","text":"The Skew-T display renders a vertical profile of temperature, dew point, and wind for RAOBs and model point soundings using a Skew-T Log-P diagram. There are many different display configurations available. The one shown in these next few sections is the SPC Wide Screen Configuration . The box in the upper-left of the main display is linked to the cursor readout when over the SkewT chart. It reports the temperature, dewpoint, wind direction and speed, pressure, height AGL, and relative humidity of the trace. Skew-T is the default upper air chart in AWIPS, and can be changed to turbulence display ( T ) or an icing display ( I ). These options are available as buttons at the bottom of the NSHARP(D2D) controls tab (mentioned in NSHARP Configurations ). Use the AWIPS-2 NSHARP Interactive Overview page for more information about the Skew-T display.","title":"Skew-T Display"},{"location":"cave/nsharp/#windspeed-vs-height-and-inferred-temperature-advection","text":"The windspeed vs height and inferred temperature advection with height plot is situated next to the SkewT to show the values at the same heights. Inferred temperature advection is from the thermal wind. Use the AWIPS-2 NSHARP Interactive Overview page for more information.","title":"Windspeed vs Height and Inferred Temperature Advection"},{"location":"cave/nsharp/#hodograph-display","text":"This panel contains the hodograph display from the sounding data. The rings in the hodograph represent the wind speed in 20 knot increments. 
The hodograph trace uses different colors to highlight wind observations in 3 km height increments. This display also contains information such as the mean wind, Bunkers Left/Right Moving storm motion, upshear and downshear Corfidi vectors, and a user-defined motion. Use the AWIPS NSHARP Interactive Overview page for more information about the hodograph display.","title":"Hodograph Display"},{"location":"cave/nsharp/#insets","text":"In the SPC Wide Screen Configuration there are four small insets beneath the hodograph containing storm-relative windspeed versus height, a Storm Slinky, Theta-E vs Pressure, Possible Watch Type, Theta-E vs Height, and storm-relative wind vectors. There are buttons in the NSHARP(D2D) control button tab that toggle the six possible contents in the four boxes. There are also two buttons PvIn and NxIn in the control tab that can be used to cycle through the previous and next inset. Use the AWIPS NSHARP Interactive Overview page for more information.","title":"Insets"},{"location":"cave/nsharp/#table-output-displays","text":"The Table Output Displays contains five different pages of parameters ranging from parcel instability to storm relative shear to severe hazards potential. There are two buttons PtDt and NxDt in the controls tab that can be used to cycle through the previous and next tables. Use the AWIPS NSHARP Interactive Overview page for more information on the tables and a list/definition of the parameters available.","title":"Table Output Displays"},{"location":"cave/nsharp/#graphsstatistics","text":"In the SPC Wide Screen Configuration there are two graphs boxes under the insets, and they can display information on Enhanced Bulk Shear, Significant Tornado Parameter, Significant Hail Parameter (SHIP), Winter Weather, Fire Weather, Hail model (not implemented), and the Sounding Analog Retrieval System (SARS). There are buttons in the NSHARP(D2D) control button tab that toggle the six possible contents in the two boxes. 
Use the AWIPS NSHARP Interactive Overview page for more information.","title":"Graphs/Statistics"},{"location":"cave/nsharp/#sounding-inventory","text":"The display configuration used in this example is the D2D Standard SkewT Screen Configuration . This section controls the inventory of the soundings that have been loaded for potential display in NSHARP. The different colors of the text represent variously that a sounding/station is being displayed, available for display, or not available for display. Use the AWIPS NSHARP Interactive Overview page for more information on how to use the sounding inventory and time line.","title":"Sounding Inventory"},{"location":"cave/warngen/","text":"WarnGen Walkthrough \uf0c1 WarnGen is an AWIPS graphics application for creating and issuing warnings as is done by National Weather Service offices. In the NSF Unidata AWIPS release it is a non-operational forecasting tool, meaning it allows users to experiment and simulate with the drawing and text-generation tools, but prevents you from transmitting a generated warning upstream . 
In order to select a feature it must be within your CAVE localization coverage (load Maps > County Warning Areas to see coverages) Quick Steps - Using WarnGen in NSF Unidata AWIPS CAVE \uf0c1 Load NEXRAD Display from the Radar menu Choose a CWA with active severe weather (BUF is used in the video below) Re-localize to this site in the CAVE > Preferences > Localization menu Exit out of CAVE and reload (you should notice the new CWA at the top of CAVE) Load radar data from the local radar menu kbuf > Z + SRM8 Use the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM Click WarnGen toolbar button or load from Tools > WarnGen Drag the storm marker to the center of a storm feature Step through frames back and forth and adjust the marker to match the trajectory of the storm feature Click Track in the Warngen GUI to update the polygon shape and trajectory From the WarnGen dialog select the type of warning to generate, time range, basis of the warning, and any threats (wind, hail, etc) Click Create Text at the bottom of the WarnGen dialog to generate a text warning product in a new window Click Reset at the top of the WarnGen dialog to reset the storm marker at any time Select Line of Storms to enable a two-pointed vector which is to be positioned parallel to a storm line To add another vertex , middle button click along the polygon Video - Using WarnGen in AWIPS \uf0c1 The video below walks through creating a warning polygon and text in AWIPS. More detailed information can be found in the text below the video. Load NEXRAD level 3 display \uf0c1 Select the menu Radar > NEXRAD Display and note coverage areas of current severe weather. We choose a CWA ID that contains some active severe weather (BUF Buffalo, New York, in this example). Select SITE Localization \uf0c1 Open CAVE > Preferences > Localization , select the CWA site ID (BUF) for the coverage area you want to use, followed by Apply and Okay and restart CAVE. 
Once CAVE is restarted, you should notice the new CWA at the top of the CAVE window. Load single radar data from the local radars \uf0c1 Use the specialized site menu for the local radar data products, ex. Click on the local radar kbuf > Z + SRM8 . This menu includes several sections with submenus for all available radar data at that site. Use the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM. Launch WarnGen \uf0c1 Select WarnGen from the D2D Toolbar or from the Tools > WarnGen menu. When started, the storm centroid marker appears and the WarnGen GUI will pop up as a separate window. Generate a Storm Motion Vector \uf0c1 Click and drag Drag Me to Storm to the feature you want to track (WarnGen uses a dot to track a single storm and a line to track a line of storms). Step back 3 to 4 frames. Drag the dot to the previous position of the feature you first marked to create the storm motion vector. Click the Track button in the WarnGen GUI to update the polygon based off the storm motion. Review the product loop and make adjustments to ensure the vector is accurate. The initial polygon may have unhatched areas that will be removed from the warning due to crossing CWAs or not meeting area thresholds in the county for inclusion. The Warned/Hatched Area button allows you to preview the polygon shape that will be issued, so you can make further edits. Moving Vertex Points \uf0c1 Vertices can be moved by clicking and dragging with the mouse. The warning polygon, including stippling, will update automatically. When reshaping your warning polygon in this manner, the philosophy is to include all areas that are at risk of experiencing severe weather covered by that warning type. Effective polygons account for uncertainty over time and typically widen downstream. Add and Remove Vertex Points \uf0c1 There will be some occasions where you will want to add vertices to your warning polygon. 
Most often, these situations will involve line warnings with bowing segments or single storm warnings where you want to account for storm motion uncertainty or multiple threat areas that may have differing storm motions. New vertices are added to the warning polygon two ways. Either by Right Mouse Button \"click and hold\" or a simple Middle Mouse Button click on the warning polygon line segment where you want to add the vertex. Vertex points are removed from the warning polygon using the same context relative menu. Instead of selecting a line segment, you select the vertex you wish to remove and then right mouse button click and hold and select remove vertex . Redrawing a Polygon \uf0c1 Click the Reset button to clear the current polygon and vector and reset the storm centroid marker. Generate a new storm motion by moving the storm markers and select the Track button in the WarnGen GUI to draw the new polygon. Text Window \uf0c1 Once you are satisfied with your polygon and have chosen your selections, click Create Text in the WarnGen GUI. Initially the AWIPS Header Block window appears. Click Enter for the text window to open. This will open the Text Window. Using the customized settings in the WarnGen GUI, WarnGen translates the information into a text product that is displayed in a text window on the Text Display. The auto-generated text contains the storm speed and direction, the counties and cities affected by the warning/advisory, the valid times of the product, the warning/advisory body text (including any optional bullets selected in the GUI), and additional code to help our partners to efficiently process and disseminate the warning/advisory. The locked parts of the text are highlighted in blue and most of your text should not need to be edited if you configured your WarnGen window correctly. The NSF Unidata AWIPS release is non-operational . 
You will be allowed to simulate the drawing and text-generation of warnings, but are prevented from transmitting a generated warning upstream Note: Edits made to product text in the editor window should be limited to items such as forecaster name/initials, call-to-action text, etc. If changes are warranted for items such as storm motion, warned counties, or Latitude/Longitude points, close the editor window and make changes using the D-2D and WarnGen graphical tools, then recreate the polygon and/or the text.","title":"WarnGen Walkthrough"},{"location":"cave/warngen/#warngen-walkthrough","text":"WarnGen is an AWIPS graphics application for creating and issuing warnings as is done by National Weather Service offices. In the NSF Unidata AWIPS release it is a non-operational forecasting tool, meaning it allows users to experiment and simulate with the drawing and text-generation tools, but prevents you from transmitting a generated warning upstream . In order to select a feature it must be within your CAVE localization coverage (load Maps > County Warning Areas to see coverages)","title":"WarnGen Walkthrough"},{"location":"cave/warngen/#quick-steps-using-warngen-in-nsf-unidata-awips-cave","text":"Load NEXRAD Display from the Radar menu Choose a CWA with active severe weather (BUF is used in the video below) Re-localize to this site in the CAVE > Preferences > Localization menu Exit out of CAVE and reload (you should notice the new CWA at the top of CAVE) Load radar data from the local radar menu kbuf > Z + SRM8 Use the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM Click WarnGen toolbar button or load from Tools > WarnGen Drag the storm marker to the center of a storm feature Step through frames back and forth and adjust the marker to match the trajectory of the storm feature Click Track in the Warngen GUI to update the polygon shape and trajectory From the WarnGen dialog select the type of warning to generate, time range, basis of the 
warning, and any threats (wind, hail, etc) Click Create Text at the bottom of the WarnGen dialog to generate a text warning product in a new window Click Reset at the top of the WarnGen dialog to reset the storm marker at any time Select Line of Storms to enable a two-pointed vector which is to be positioned parallel to a storm line To add another vertex , middle button click along the polygon","title":"Quick Steps - Using WarnGen in NSF Unidata AWIPS CAVE"},{"location":"cave/warngen/#video-using-warngen-in-awips","text":"The video below walks through creating a warning polygon and text in AWIPS. More detailed information can be found in the text below the video.","title":"Video - Using WarnGen in AWIPS"},{"location":"cave/warngen/#load-nexrad-level-3-display","text":"Select the menu Radar > NEXRAD Display and note coverage areas of current severe weather. We choose a CWA ID that contains some active severe weather (BUF Buffalo, New York, in this example).","title":"Load NEXRAD level 3 display"},{"location":"cave/warngen/#select-site-localization","text":"Open CAVE > Preferences > Localization , select the CWA site ID (BUF) for the coverage area you want to use, followed by Apply and Okay and restart CAVE. Once CAVE is restarted, you should notice the new CWA at the top of the CAVE window.","title":"Select SITE Localization"},{"location":"cave/warngen/#load-single-radar-data-from-the-local-radars","text":"Use the specialized site menu for the local radar data products, ex. Click on the local radar kbuf > Z + SRM8 . This menu includes several sections with submenus for all available radar data at that site. Use the \"period\" key in the number pad to toggle between the 0.5 Reflectivity and SRM.","title":"Load single radar data from the local radars"},{"location":"cave/warngen/#launch-warngen","text":"Select WarnGen from the D2D Toolbar or from the Tools > WarnGen menu. 
When started, the storm centroid marker appears and the WarnGen GUI will pop up as a separate window.","title":"Launch WarnGen"},{"location":"cave/warngen/#generate-a-storm-motion-vector","text":"Click and drag Drag Me to Storm to the feature you want to track (WarnGen uses a dot to track a single storm and a line to track a line of storms). Step back 3 to 4 frames. Drag the dot to the previous position of the feature you first marked to create the storm motion vector. Click the Track button in the WarnGen GUI to update the polygon based off the storm motion. Review the product loop and make adjustments to ensure the vector is accurate. The initial polygon may have unhatched areas that will be removed from the warning due to crossing CWAs or not meeting area thresholds in the county for inclusion. The Warned/Hatched Area button allows you to preview the polygon shape that will be issued, so you can make further edits.","title":"Generate a Storm Motion Vector"},{"location":"cave/warngen/#moving-vertex-points","text":"Vertices can be moved by clicking and dragging with the mouse. The warning polygon, including stippling, will update automatically. When reshaping your warning polygon in this manner, the philosophy is to include all areas that are at risk of experiencing severe weather covered by that warning type. Effective polygons account for uncertainty over time and typically widen downstream.","title":"Moving Vertex Points"},{"location":"cave/warngen/#add-and-remove-vertex-points","text":"There will be some occasions where you will want to add vertices to your warning polygon. Most often, these situations will involve line warnings with bowing segments or single storm warnings where you want to account for storm motion uncertainty or multiple threat areas that may have differing storm motions. New vertices are added to the warning polygon two ways. 
Either by Right Mouse Button \"click and hold\" or a simple Middle Mouse Button click on the warning polygon line segment where you want to add the vertex. Vertex points are removed from the warning polygon using the same context relative menu. Instead of selecting a line segment, you select the vertex you wish to remove and then right mouse button click and hold and select remove vertex .","title":"Add and Remove Vertex Points"},{"location":"cave/warngen/#redrawing-a-polygon","text":"Click the Reset button to clear the current polygon and vector and reset the storm centroid marker. Generate a new storm motion by moving the storm markers and select the Track button in the WarnGen GUI to draw the new polygon.","title":"Redrawing a Polygon"},{"location":"cave/warngen/#text-window","text":"Once you are satisfied with your polygon and have chosen your selections, click Create Text in the WarnGen GUI. Initially the AWIPS Header Block window appears. Click Enter for the text window to open. This will open the Text Window. Using the customized settings in the WarnGen GUI, WarnGen translates the information into a text product that is displayed in a text window on the Text Display. The auto-generated text contains the storm speed and direction, the counties and cities affected by the warning/advisory, the valid times of the product, the warning/advisory body text (including any optional bullets selected in the GUI), and additional code to help our partners to efficiently process and disseminate the warning/advisory. The locked parts of the text are highlighted in blue and most of your text should not need to be edited if you configured your WarnGen window correctly. The NSF Unidata AWIPS release is non-operational . 
You will be allowed to simulate the drawing and text-generation of warnings, but are prevented from transmitting a generated warning upstream Note: Edits made to product text in the editor window should be limited to items such as forecaster name/initials, call-to-action text, etc. If changes are warranted for items such as storm motion, warned counties, or Latitude/Longitude points, close the editor window and make changes using the D-2D and WarnGen graphical tools, then recreate the polygon and/or the text.","title":"Text Window"},{"location":"dev/awips-development-environment/","text":"AWIPS Development Environment (ADE) \uf0c1 Detailed instructions on how to download the latest source code and run CAVE from Eclipse. It is important to keep in mind these instructions are intended for a system that is specifically used for developing AWIPS. It should not be used in conjunction with installed production versions of AWIPS. The following yum commands listed in these instructions may need to be run as the root user, but the rest of the commands should be run as the local user. 1. Remove AWIPS Instances \uf0c1 First, make sure to remove any instances of AWIPS that are already installed, as this can potentially cause problems when setting up the development environment. Below is an example that had CAVE installed. Uninstall with yum: yum clean all yum groupremove awips2-cave Check to make sure all rpms have been removed: rpm -qa | grep awips2 Remove the awips2 directory: rm -rf /awips2 2. Set Up AWIPS Repo \uf0c1 Create a repo file named /etc/yum.repos.d/awips2.repo , and set the contents to the following: sudo vi /etc/yum.repos.d/awips2.repo [awips2repo] name=AWIPS II Repository baseurl=https://downloads.unidata.ucar.edu/awips2/current/linux/rpms/el7-dev/ enabled=1 protect=0 gpgcheck=0 proxy=_none_ This file may already exist if AWIPS had been previously installed on the machine, so make sure to edit the baseurl. 3. 
Install the ADE \uf0c1 Install the AWIPS Development Environment (ADE) using yum. This will install Eclipse (4.6.1), Java (1.8), Ant (1.9.6), Python 2.7 and its modules (Numpy, Matplotlib, Shapely, Jep, and others). yum clean all yum groupinstall awips2-ade Check the libGLU package is installed by running rpm -qa | grep mesa-libGLU . If nothing is returned, install the package via: yum install mesa-libGLU . 4. Download the Source Code \uf0c1 If it's not already installed, install git: yum install git Next clone all of the required repositories for AWIPS: git clone https://github.com/Unidata/awips2.git git clone https://github.com/Unidata/awips2-cimss.git git clone https://github.com/Unidata/awips2-core.git git clone https://github.com/Unidata/awips2-core-foss.git git clone https://github.com/Unidata/awips2-drawing.git git clone https://github.com/Unidata/awips2-foss.git git clone https://github.com/Unidata/awips2-goesr.git git clone https://github.com/Unidata/awips2-gsd.git git clone https://github.com/Unidata/awips2-ncep.git git clone https://github.com/Unidata/awips2-nws.git Make sure to run git checkout in each repo if you'd wish to develop from a branch different from the default. It's best to do this before importing the repos into eclipse. 5. Configure Eclipse \uf0c1 Open eclipse by running: /awips2/eclipse/eclipse It is fine to choose the default workspace upon starting up. Set Preferences \uf0c1 Verify or make the following changes to set up eclipse for AWIPS development: Window > Preferences > Java > Installed JREs Set to /awips2/java Window > Preferences > PyDev > Interpreters > Python Interpreter Set to /awips2/python/bin/python Add all paths to the SYSTEM pythonpath if prompted There might be some unresolved errors. These should be made to warnings instead. 
Window > Preferences > Java > Compiler > Building > Build path Problems > Circular Dependencies > Change to Warning Window > Preferences > Plug-in Development > API Baselines > Missing API Baseline > Change to Warning Turn off automatic building (you will turn this back on after importing the repos) Project > Uncheck \"Build Automatically\" Importing Git Repos \uf0c1 All of the git repos that were cloned in the previous step will need to be imported into Eclipse. But, be aware the awips2 repo is done last, because it requires different steps. File > Import > Git > Projects from Git > Next Continue with the default selection, Existing local repository > Add.. > add each of the git repos (for example .../awips2-core ) > check the checkbox > Finish Then for each of the repos (except awips2 right now): Select the repo name > Next > Continue with default selection (Working Tree) > Next > Continue with default selections (all choices selected) > Finish Finally, for awips2 repo, follow all the above steps except in the Working Tree, only select: cave > Next > Finish edexOsgi > Next > Finish Final Setup \uf0c1 Project > Clean > OK Use default selections: Clean all projects , Start a build immediately , Build the entire workspace Clean the build and ensure no errors are reported. Turn automatic building back on Project > Check \"Build Automatically\" 6. Run CAVE \uf0c1 CAVE can be run from eclipse by using the com.raytheon.viz.product.awips/developer.product Double-click the developer.product file to open the Project Explorer in Eclipse. Select Overview > Synchronize Use the Project Explorer on the left-hand side of eclipse to run CAVE as a Java application or in Debug mode : Run Application \uf0c1 Select Run As > Eclipse Application Debug Application \uf0c1 Select Debug > Eclipse Application Troubleshooting \uf0c1 If you are getting a lot of errors, try changing your Java Compiler to 1.7, build the project, then change back to 1.8 and rebuild. 
Window > Preferences > Java > Compiler > Compiler compliance level setting","title":"Development"},{"location":"dev/awips-development-environment/#awips-development-environment-ade","text":"Detailed instructions on how to download the latest source code and run CAVE from Eclipse. It is important to keep in mind these instructions are intended for a system that is specifically used for developing AWIPS. It should not be used in conjunction with installed production versions of AWIPS. The following yum commands listed in these instructions may need to be run as the root user, but the rest of the commands should be run as the local user.","title":"AWIPS Development Environment (ADE)"},{"location":"dev/awips-development-environment/#1-remove-awips-instances","text":"First, make sure to remove any instances of AWIPS that are already installed, as this can potentially cause problems when setting up the development environment. Below is an example that had CAVE installed. Uninstall with yum: yum clean all yum groupremove awips2-cave Check to make sure all rpms have been removed: rpm -qa | grep awips2 Remove the awips2 directory: rm -rf /awips2","title":"1. Remove AWIPS Instances"},{"location":"dev/awips-development-environment/#2-set-up-awips-repo","text":"Create a repo file named /etc/yum.repos.d/awips2.repo , and set the contents to the following: sudo vi /etc/yum.repos.d/awips2.repo [awips2repo] name=AWIPS II Repository baseurl=https://downloads.unidata.ucar.edu/awips2/current/linux/rpms/el7-dev/ enabled=1 protect=0 gpgcheck=0 proxy=_none_ This file may already exist if AWIPS had been previously installed on the machine, so make sure to edit the baseurl.","title":"2. Set Up AWIPS Repo"},{"location":"dev/awips-development-environment/#3-install-the-ade","text":"Install the AWIPS Development Environment (ADE) using yum. This will install Eclipse (4.6.1), Java (1.8), Ant (1.9.6), Python 2.7 and its modules (Numpy, Matplotlib, Shapely, Jep, and others). 
yum clean all yum groupinstall awips2-ade Check the libGLU package is installed by running rpm -qa | grep mesa-libGLU . If nothing is returned, install the package via: yum install mesa-libGLU .","title":"3. Install the ADE"},{"location":"dev/awips-development-environment/#4-download-the-source-code","text":"If it's not already installed, install git: yum install git Next clone all of the required repositories for AWIPS: git clone https://github.com/Unidata/awips2.git git clone https://github.com/Unidata/awips2-cimss.git git clone https://github.com/Unidata/awips2-core.git git clone https://github.com/Unidata/awips2-core-foss.git git clone https://github.com/Unidata/awips2-drawing.git git clone https://github.com/Unidata/awips2-foss.git git clone https://github.com/Unidata/awips2-goesr.git git clone https://github.com/Unidata/awips2-gsd.git git clone https://github.com/Unidata/awips2-ncep.git git clone https://github.com/Unidata/awips2-nws.git Make sure to run git checkout in each repo if you'd wish to develop from a branch different from the default. It's best to do this before importing the repos into eclipse.","title":"4. Download the Source Code"},{"location":"dev/awips-development-environment/#5-configure-eclipse","text":"Open eclipse by running: /awips2/eclipse/eclipse It is fine to choose the default workspace upon starting up.","title":"5. Configure Eclipse"},{"location":"dev/awips-development-environment/#set-preferences","text":"Verify or make the following changes to set up eclipse for AWIPS development: Window > Preferences > Java > Installed JREs Set to /awips2/java Window > Preferences > PyDev > Interpreters > Python Interpreter Set to /awips2/python/bin/python Add all paths to the SYSTEM pythonpath if prompted There might be some unresolved errors. These should be made to warnings instead. 
Window > Preferences > Java > Compiler > Building > Build path Problems > Circular Dependencies > Change to Warning Window > Preferences > Plug-in Development > API Baselines > Missing API Baseline > Change to Warning Turn off automatic building (you will turn this back on after importing the repos) Project > Uncheck \"Build Automatically\"","title":"Set Preferences"},{"location":"dev/awips-development-environment/#importing-git-repos","text":"All of the git repos that were cloned in the previous step will need to be imported into Eclipse. But, be aware the awips2 repo is done last, because it requires different steps. File > Import > Git > Projects from Git > Next Continue with the default selection, Existing local repository > Add.. > add each of the git repos (for example .../awips2-core ) > check the checkbox > Finish Then for each of the repos (except awips2 right now): Select the repo name > Next > Continue with default selection (Working Tree) > Next > Continue with default selections (all choices selected) > Finish Finally, for awips2 repo, follow all the above steps except in the Working Tree, only select: cave > Next > Finish edexOsgi > Next > Finish","title":"Importing Git Repos"},{"location":"dev/awips-development-environment/#final-setup","text":"Project > Clean > OK Use default selections: Clean all projects , Start a build immediately , Build the entire workspace Clean the build and ensure no errors are reported. Turn automatic building back on Project > Check \"Build Automatically\"","title":"Final Setup"},{"location":"dev/awips-development-environment/#6-run-cave","text":"CAVE can be run from eclipse by using the com.raytheon.viz.product.awips/developer.product Double-click the developer.product file to open the Project Explorer in Eclipse. Select Overview > Synchronize Use the Project Explorer on the left-hand side of eclipse to run CAVE as a Java application or in Debug mode :","title":"6. 
Run CAVE"},{"location":"dev/awips-development-environment/#run-application","text":"Select Run As > Eclipse Application","title":"Run Application"},{"location":"dev/awips-development-environment/#debug-application","text":"Select Debug > Eclipse Application","title":"Debug Application"},{"location":"dev/awips-development-environment/#troubleshooting","text":"If you are getting a lot of errors, try changing your Java Compiler to 1.7, build the project, then change back to 1.8 and rebuild. Window > Preferences > Java > Compiler > Compiler compliance level setting","title":"Troubleshooting"},{"location":"dev/build-datadelivery/","text":"Data Delivery has been implemented into the AWIPS(II) baseline to provide access to data that is not resident locally at a Weather Forecast Office, River Forecast Center, or National Center. Data Delivery gives users the ability to create queries (One Time Requests) and subscriptions to data sets (provided by OGC / OpenDAP servers such as THREDDS). build.edex/build.xml \uf0c1 <target name=\"main-build\" depends=\"clean\"> <antcall target=\"build\"> <param name=\"feature\" value=\"com.raytheon.uf.common.base.feature\" /> </antcall> <antcall target=\"build\"> <param name=\"feature\" value=\"com.raytheon.uf.edex.base.feature\" /> </antcall> ... <antcall target=\"build\"> <param name=\"feature\" value=\"gov.nasa.msfc.sport.edex.glmdecoder.feature\" /> </antcall> <!-- <antcall target=\"build\"> <param name=\"feature\" value=\"com.raytheon.uf.edex.datadelivery.feature\" /> </antcall> <antcall target=\"build\"> <param name=\"feature\" value=\"com.raytheon.uf.edex.ogc.feature\" /> </antcall> --> </target> Notice the last two commented out, com.raytheon.uf.edex.datadelivery.feature and com.raytheon.uf.edex.ogc.feature . These feature sets do not exist , but could easily be created in the same way as other features (like com.raytheon.uf.common.base.feature , com.raytheon.uf.edex.base.feature , etc.) 
wa-build \uf0c1 The source code comments provide the following guidance: In the work assignment's edexOsgi/build.edex directory, create a file named similar to the following: edexOsgi/build.edex/5-Data_Delivery-wa-build.properties In the file, there should be one line such as: wa.features=feature1,feature2 However, the wa-build Ant target requires that a file features.txt exist. So is it 5-Data_Delivery-wa-build.properties or features.txt ? The delimiter being specified is a line separator (and not a comma \"wa.features=feature1,feature2\" as with versions prior to 16.2.2), so we can infer that a file called features.txt should exist which has one WA feature per line. And what do you know, a similar file exists for the CAVE build in awips2-builds/cave/build/features.txt : cat awips2-builds/cave/build/features.txt com.raytheon.uf.common.base.feature com.raytheon.uf.viz.dataplugin.obs.feature ... <target name=\"wa-build\" depends=\"main-build\" description=\"Builds work assignment specific features after the main build\"> <if> <available file=\"${basedir}/features.txt\" type=\"file\" /> <then> <loadfile property=\"wa.features\" srcfile=\"${basedir}/features.txt\" /> <for param=\"line\" list=\"${wa.features}\" delimiter=\"${line.separator}\"> <sequential> <antcall target=\"build\"> <param name=\"feature\" value=\"@{line}\" /> </antcall> </sequential> </for> </then> </if> <antcall target=\"wa-cleanup\" /> <antcall target=\"clean\" /> </target>","title":"Build datadelivery"},{"location":"dev/build-datadelivery/#buildedexbuildxml","text":"<target name=\"main-build\" depends=\"clean\"> <antcall target=\"build\"> <param name=\"feature\" value=\"com.raytheon.uf.common.base.feature\" /> </antcall> <antcall target=\"build\"> <param name=\"feature\" value=\"com.raytheon.uf.edex.base.feature\" /> </antcall> ... 
<antcall target=\"build\"> <param name=\"feature\" value=\"gov.nasa.msfc.sport.edex.glmdecoder.feature\" /> </antcall> <!-- <antcall target=\"build\"> <param name=\"feature\" value=\"com.raytheon.uf.edex.datadelivery.feature\" /> </antcall> <antcall target=\"build\"> <param name=\"feature\" value=\"com.raytheon.uf.edex.ogc.feature\" /> </antcall> --> </target> Notice the last two commented out, com.raytheon.uf.edex.datadelivery.feature and com.raytheon.uf.edex.ogc.feature . These feature sets do not exist , but could easily be created in the same way as other features (like com.raytheon.uf.common.base.feature , com.raytheon.uf.edex.base.feature , etc.).","title":"build.edex/build.xml"},{"location":"dev/build-datadelivery/#wa-build","text":"The source code comments provide the following guidance: In the work assignment's edexOsgi/build.edex directory, create a file named similar to the following: edexOsgi/build.edex/5-Data_Delivery-wa-build.properties In the file, there should be one line such as: wa.features=feature1,feature2 However, the wa-build Ant target requires that a file named features.txt exist. So which is it: 5-Data_Delivery-wa-build.properties or features.txt ? Because the delimiter being specified is a line separator (and not a comma, as in \"wa.features=feature1,feature2\" with versions prior to 16.2.2), we can infer that a file called features.txt should exist which has one WA feature per line. And what do you know, a similar file exists for the CAVE build in awips2-builds/cave/build/features.txt : cat awips2-builds/cave/build/features.txt com.raytheon.uf.common.base.feature com.raytheon.uf.viz.dataplugin.obs.feature ... 
<target name=\"wa-build\" depends=\"main-build\" description=\"Builds work assignment specific features after the main build\"> <if> <available file=\"${basedir}/features.txt\" type=\"file\" /> <then> <loadfile property=\"wa.features\" srcfile=\"${basedir}/features.txt\" /> <for param=\"line\" list=\"${wa.features}\" delimiter=\"${line.separator}\"> <sequential> <antcall target=\"build\"> <param name=\"feature\" value=\"@{line}\" /> </antcall> </sequential> </for> </then> </if> <antcall target=\"wa-cleanup\" /> <antcall target=\"clean\" /> </target>","title":"wa-build"},{"location":"edex/case-studies/","text":"Case Study Server Configuration \uf0c1 This document covers what is necessary to install and run AWIPS EDEX as an archive and case study server (no purging of processed data). Quick Install \uf0c1 Follow the EDEX Install Instructions including iptables config and an optional SSD mount (for large data volumes). groupadd fxalpha && useradd -G fxalpha awips mkdir -p /awips2/data_store wget -O /etc/yum.repos.d/awips2.repo https://downloads.unidata.ucar.edu/awips2/current/linux/awips2.repo yum clean all yum groupinstall awips2-server -y Disable Data Purging \uf0c1 The easiest way to disable data purging is to add an <exclude>purge.*</exclude> entry in /awips2/edex/conf/modes/modes.xml so that the purge plugin is not loaded when the EDEX ingest JVM is started: vi /awips2/edex/conf/modes/modes.xml <?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"yes\"?> <edexModes> <mode name=\"ingest\"> <exclude>.*request.*</exclude> <exclude>edex-security.xml</exclude> ... <exclude>purge.*</exclude> </mode> ... </edexModes> Start EDEX \uf0c1 Start EDEX without running the LDM, since we do not want current data. 
Run the following command: edex start base Double check everything is running, except the LDM: edex [edex status] postgres :: running :: pid 43644 pypies :: running :: pid 3557 qpid :: running :: pid 43742 EDEXingest :: running :: pid 6564 44301 44597 EDEXgrib :: running :: pid 6565 44302 44598 EDEXrequest :: running :: pid 6566 44303 44599 ldmadmin :: not running Ingest Case Study Data \uf0c1 Raw data files of any type can be copied or moved into /awips2/data_store/ingest/ to be picked up and decoded by EDEX. Most data types are recognized by regular expression matching of the WMO Header or filename. Individual files can be ingested on the command line with the regex header/pattern supplied as the last argument: qpidNotify.py /full/path/to/data.file [regex match] For example: qpidNotify.py /home/awips/uniwisc_U5_132GOES-15_IMG10.7um_4km_20171024_1830.area.png uniwisc qpidNotify.py /awips2/data_store/grid/NAM12/conduit/NAM_CONUS_12km_conduit_20171025_1200Z_F084_TMPK-7.000007.grib2 grib qpidNotify.py /awips2/data_store/radar/FTG_N0Q_20171015_1815 Level3 Viewing Archive Data in CAVE \uf0c1 Because we are installing and configuring a standalone EDEX archive server without real-time LDM data ingest (and with purge disabled), any case study data that is ingested will be the \"latest available\" to CAVE, and you will see CAVE product menu time fill in with the latest of all data ingested. However, to display specific time-based data (in case you ingest more than one case study), there are two options: Set Load Mode to Inventory \uf0c1 In the top-left toolbar change Valid time seq to Inventory . Now any data product selected from the menus or the Product Browser should prompt you to select the exact time. 
Set Data Display Time in CAVE \uf0c1 At the bottom of the CAVE application, double-click the Time: entry to bring up a dialog window where you can set CAVE to a previous time, and choose the option of freezing CAVE at that time or allowing CAVE to \"move forward in time\" from that position as if it were real-time.","title":"Archive Case Studies"},{"location":"edex/case-studies/#case-study-server-configuration","text":"This document covers what is necessary to install and run AWIPS EDEX as an archive and case study server (no purging of processed data).","title":"Case Study Server Configuration"},{"location":"edex/case-studies/#quick-install","text":"Follow the EDEX Install Instructions including iptables config and an optional SSD mount (for large data volumes). groupadd fxalpha && useradd -G fxalpha awips mkdir -p /awips2/data_store wget -O /etc/yum.repos.d/awips2.repo https://downloads.unidata.ucar.edu/awips2/current/linux/awips2.repo yum clean all yum groupinstall awips2-server -y","title":"Quick Install"},{"location":"edex/case-studies/#disable-data-purging","text":"The easiest way to disable data purging is to add an <exclude>purge.*</exclude> entry in /awips2/edex/conf/modes/modes.xml so that the purge plugin is not loaded when the EDEX ingest JVM is started: vi /awips2/edex/conf/modes/modes.xml <?xml version=\"1.0\" encoding=\"UTF-8\" standalone=\"yes\"?> <edexModes> <mode name=\"ingest\"> <exclude>.*request.*</exclude> <exclude>edex-security.xml</exclude> ... <exclude>purge.*</exclude> </mode> ... </edexModes>","title":"Disable Data Purging"},{"location":"edex/case-studies/#start-edex","text":"Start EDEX without running the LDM, since we do not want current data. 
Run the following command: edex start base Double check everything is running, except the LDM: edex [edex status] postgres :: running :: pid 43644 pypies :: running :: pid 3557 qpid :: running :: pid 43742 EDEXingest :: running :: pid 6564 44301 44597 EDEXgrib :: running :: pid 6565 44302 44598 EDEXrequest :: running :: pid 6566 44303 44599 ldmadmin :: not running","title":"Start EDEX"},{"location":"edex/case-studies/#ingest-case-study-data","text":"Raw data files of any type can be copied or moved into /awips2/data_store/ingest/ to be picked up and decoded by EDEX. Most data types are recognized by regular expression matching of the WMO Header or filename. Individual files can be ingested on the command line with the regex header/pattern supplied as the last argument: qpidNotify.py /full/path/to/data.file [regex match] For example: qpidNotify.py /home/awips/uniwisc_U5_132GOES-15_IMG10.7um_4km_20171024_1830.area.png uniwisc qpidNotify.py /awips2/data_store/grid/NAM12/conduit/NAM_CONUS_12km_conduit_20171025_1200Z_F084_TMPK-7.000007.grib2 grib qpidNotify.py /awips2/data_store/radar/FTG_N0Q_20171015_1815 Level3","title":"Ingest Case Study Data"},{"location":"edex/case-studies/#viewing-archive-data-in-cave","text":"Because we are installing and configuring a standalone EDEX archive server without real-time LDM data ingest (and with purge disabled), any case study data that is ingested will be the \"latest available\" to CAVE, and you will see CAVE product menu time fill in with the latest of all data ingested. However, to display specific time-based data (in case you ingest more than one case study), there are two options:","title":"Viewing Archive Data in CAVE"},{"location":"edex/case-studies/#set-load-mode-to-inventory","text":"In the top-left toolbar change Valid time seq to Inventory . 
Now any data product selected from the menus or the Product Browser should prompt you to select the exact time.","title":"Set Load Mode to Inventory"},{"location":"edex/case-studies/#set-data-display-time-in-cave","text":"At the bottom of the CAVE application, double-click the Time: entry to bring up a dialog window where you can set CAVE to a previous time, and choose the option of freezing CAVE at that time or allowing CAVE to \"move forward in time\" from that position as if it were real-time.","title":"Set Data Display Time in CAVE"},{"location":"edex/data-distribution-files/","text":"Data Distribution Files \uf0c1 Overview \uf0c1 EDEX uses distribution files to alert the appropriate decoding plug-in that new data has been received. These files do so by use of XML and regular expressions. If the WMO header, or file name*, matches a regular expression listed in a distribution XML, then EDEX will put a message into the QPID queue for its corresponding decoder to recognize and process. It is worth noting that more than one distribution file can recognize a single piece of data and notify their decoders to act. Sometimes the distribution file will not look at the filename. If this file is coming in through the LDM using a proper FILE action, then it is possible the distribution file will only look at the header and not the filename. If the file is ingested using the manual endpoint (/awips2/data_store/ingest/), then this behavior could be different. If a piece of data does not match any distribution XML, EDEX will: Create an entry in /awips2/edex/logs/edex-ingest-unrecognized-files-yyyymmdd.log Skip processing of the unrecognized file. Distribution files are stored in the common_static branch of the Localization Store (series of directories that exist in /awips2/edex/data/utility/ ), and a list of available files can be found in the base-level directory. The base directory is: /awips2/edex/data/utility/common_static/base/distribution/ . 
For each plug-in, the distribution file is named [data-type].xml . For example, the distribution file for radar data is radar.xml . The distribution files follow the AWIPS base/site localization pattern: [root@edex]# cd /awips2/edex/data/utility/common_static/base/distribution [root@edex distribution]# ls acars.xml goesr.xml poessounding.xml airep.xml goessounding.xml profiler.xml airmet.xml grib.xml radar.xml atcf.xml intlsigmet.xml redbook.xml aww.xml lsr.xml satellite.gini.xml ... Creating a Site Override \uf0c1 Base files are located in /awips2/edex/data/utility/common_static/base/distribution/ Site override distribution files are located in /awips2/edex/data/utility/common_static/ site/XXX /distribution/ , where XXX is the site identifier. Note that site-level files override the base files; as a result, local modifications to distribution files must be made as follows: The base distribution file must be copied from /awips2/edex/data/utility/common_static/base/distribution to /awips2/edex/data/utility/common_static/site/XXX/distribution The local modification must be made to the file in /awips2/edex/data/utility/common_static/site/XXX/distribution The basic structure of the distribution file is: <requestPatterns xmlns:ns2=\"group\"> <regex>[pattern]</regex> <regex>[pattern]</regex> </requestPatterns> In each tag, [pattern] is replaced with a regular expression that will match either the filename or the WMO header of the raw data. Only data that matches a pattern in the distribution file will be processed. The contents of the base version of the radar distribution file: [root@edex]# cd /awips2/edex/data/utility/common_static/base/distribution/ [root@edex]# tail -4 radar.xml <requestPatterns > <regex>^SDUS[234578]. .*</regex> <regex>^Level3.*</regex> </requestPatterns> Looking at the base radar.xml distribution file in this example, there are two regular expressions. The first regular expression matches the standard WMO ID of radar products. 
Using edexBridge, the LDM will place a message in the external.dropbox QPID queue, indicating a radar product has arrived. EDEX will then take the message containing the radar WMO ID (which comes from the file header) and compare it against the regular expressions in radar.xml. If a match is found, EDEX places a message in the QPID queue Ingest.radar. The radar decoder will then consume the message and process the radar data accordingly. Adding a REGEX to the Satellite Data Distribution File \uf0c1 As a quick example, suppose we have a local data source for satellite imagery that does not have a WMO header; also suppose that the data source writes to files whose names start with LOCAL.sat . To add this locally produced satellite data file to the EDEX distribution, perform the following steps. Copy the base version of satellite.gini.xml from the base distribution directory /awips2/edex/data/utility/common_static/base/distribution into the site distribution directory /awips2/edex/data/utility/common_static/site/XXX/distribution Edit the site version of satellite.gini.xml , adding a new <regex> </regex> tag immediately below the existing regular expression ( <regex> </regex> ) tag. The contents of the tag will be ^LOCAL.sat . The final result will be: <requestPatterns xmlns:ns2=\"group\"> <regex>TI[CGT]... ....</regex> <regex>rad/NEXRCOMP</regex> <regex>.\\*.gini.\\*</regex> <regex>^LOCAL.sat.*</regex> </requestPatterns> Save the file and exit the editor. EDEX will automatically pick up the new distribution pattern. Raw files are written to subdirectories in /awips2/data_store/ , and a message is sent via QPID to the EDEX distribution service from the LDM. When a regular expression match is found in a data distribution file, the raw data file is placed in a queue for the matching plugin to decode and process. 
The distribution files are used to match file headers as well as filenames, which is how files dropped into EDEX's manual endpoint ( /awips2/data_store/ingest/ ) are processed. Editing an EDEX Data Distribution File \uf0c1 Because these files are in the common_static directory, they have to be manually edited using a text editor. You should not edit the base files; rather, as stated above, you should copy the base version to your site and then edit the site version . The regular expressions in the distribution files do not necessarily need to correspond with the regular expressions in the LDM pqact.conf file. It is important to note that: The regex in the pqact.conf file applies to the productID that is passed through the LDM. and The regex in the distribution files (.xml) typically applies to the header in the file. It can also apply to the filename, if the file is coming through the manual endpoint, or if the data has no header to begin with. If patterns exist in pqact.conf but there are no corresponding matching regex expressions in any distribution file, then raw data files will be written to /awips2/data_store/ but will not be ingested and processed by EDEX. Entries for these non-ingested files would be written to the unrecognized files log in /awips2/edex/logs . Examples \uf0c1 Surface Obs \uf0c1 Its distribution file is located at: /awips2/edex/data/utility/common_static/base/distribution/obs.xml : <requestPatterns xmlns:ns2=\"group\"> <regex>^S[AP].*</regex> </requestPatterns> It will process any file header that starts with SA or SP , which should match any WMO header that contains METAR data (e.g. SAUS , SPUS , SACN , SAMX ). 
Text Data \uf0c1 Its distribution file is located at /awips2/edex/data/utility/common_static/base/distribution/text.xml : <requestPatterns> <regex>^[ACFNRUW][A-Z][A-Z0-9]{4} [A-Z0-9]{4}</regex> <regex>^S[A-CEG-Z].*</regex> <!-- Only have AFOS mapping for T[BCX] --> <regex>^T[BCX].*</regex> <regex>^SF[A-OQ-TV-Z].*</regex> <regex>^SDUS1.*</regex> <regex>^SDUS4[1-6].*</regex> <regex>^SDUS9[^7].*</regex> <regex>^SFU[^S].*</regex> <regex>^SFUS4[^1].*</regex> <regex>^SFP[^A].*</regex> <regex>^SFPA[^4].*</regex> <regex>^SFPA4[^12].*</regex> <regex>^BMBB91.*</regex> <regex>^N[A-Z][A-Z0-9]{4} [A-Z0-9]{4}</regex> <regex>^F[EHIJKLMQVWX].*</regex> <regex>wcl_decrypted</regex> <regex>ecmwf_mos_decrypted</regex> </requestPatterns> Processes lots of WMO patterns. The second pattern ^S[A-CEG-Z].* matches any header that starts with S except for SD or SF . This is because it matches A through C ( A-C ), E, and G through Z ( G-Z ). So it also matches the SA and SP files that the obs.xml plugin matches. This means that METARs are processed by both plugins simultaneously. 
Grib Data \uf0c1 Its distribution file is located at /awips2/edex/data/utility/common_static/base/distribution/grib.xml : <requestPatterns> <!-- Super Set of all possible WMO grib patterns --> <!-- Is specifically not restricting on CCCC since HPE isn't populating it --> <regex>^[EHLMOYZ][A-Z]{3}\\d{2}</regex> <!-- Exclude Data Delivery specific patterns --> <regexExclude>^LZ[ABC][ABC]9[123] (KWBC|KNCF)</regexExclude> <!-- ECMWF decrypted --> <regex>ecmwf_decrypted</regex> <!-- NWPS pattern --> <regex>\\p{Alpha}{3}_nwps_CG1</regex> <regex>\\p{Alpha}{3}_nwps_CG0_Trkng</regex> <!-- grib files without WMO headers --> <regex>.*grib.*</regex> <regex>.*GRIB.*</regex> <regex>.*grb.*</regex> <regex>^US058.*</regex> <regex>^CMC_reg.*</regex> </requestPatterns> The grib/grid decoder distribution file matches all numerical grids distributed over the IDD NGRID feed by matching WMO header, and from CONDUIT by matching various .grib file extensions. It also includes an example of a regexExclude tag which can be used to single out matching values that aren't to be included. Additional Information \uf0c1 Important notes about regular expressions: Any time a new entry is placed in the pqact.conf file on LDM, it is likely a corresponding entry needs to be added to the appropriate Data Distribution file in the data distribution directory, or the data file will be logged to edex-ingest-unrecognized-files-YYYYMMDD.log . The exception to this rule is if the new data coming from the LDM is a type of data that already exists and EDEX already has a distribution file with a matching regex that will recognize it. If you have written a new regex for a distribution file to match on a filename, and it is not matching, then the file most likely has a header. In this case EDEX will only look at the header to match the regex. 
You must change your regex to something that matches the header, not the filename.","title":"Data Distribution Files"},{"location":"edex/data-distribution-files/#data-distribution-files","text":"","title":"Data Distribution Files"},{"location":"edex/data-distribution-files/#overview","text":"EDEX uses distribution files to alert the appropriate decoding plug-in that new data has been received. These files do so by use of XML and regular expressions. If the WMO header, or file name*, matches a regular expression listed in a distribution XML, then EDEX will put a message into the QPID queue for its corresponding decoder to recognize and process. It is worth noting that more than one distribution file can recognize a single piece of data and notify their decoders to act. Sometimes the distribution file will not look at the filename. If this file is coming in through the LDM using a proper FILE action, then it is possible the distribution file will only look at the header and not the filename. If the file is ingested using the manual endpoint (/awips2/data_store/ingest/), then this behavior could be different. If a piece of data does not match any distribution XML, EDEX will: Create an entry in /awips2/edex/logs/edex-ingest-unrecognized-files-yyyymmdd.log Skip processing of the unrecognized file. Distribution files are stored in the common_static branch of the Localization Store (series of directories that exist in /awips2/edex/data/utility/ ), and a list of available files can be found in the base-level directory. The base directory is: /awips2/edex/data/utility/common_static/base/distribution/ . For each plug-in, the distribution file is named [data-type].xml . For example, the distribution file for radar data is radar.xml . 
The distribution files follow the AWIPS base/site localization pattern: [root@edex]# cd /awips2/edex/data/utility/common_static/base/distribution [root@edex distribution]# ls acars.xml goesr.xml poessounding.xml airep.xml goessounding.xml profiler.xml airmet.xml grib.xml radar.xml atcf.xml intlsigmet.xml redbook.xml aww.xml lsr.xml satellite.gini.xml ...","title":"Overview"},{"location":"edex/data-distribution-files/#creating-a-site-override","text":"Base files are located in /awips2/edex/data/utility/common_static/base/distribution/ Site override distribution files are located in /awips2/edex/data/utility/common_static/ site/XXX /distribution/ , where XXX is the site identifier. Note that site-level files override the base files; as a result, local modifications to distribution files must be made as follows: The base distribution file must be copied from /awips2/edex/data/utility/common_static/base/distribution to /awips2/edex/data/utility/common_static/site/XXX/distribution The local modification must be made to the file in /awips2/edex/data/utility/common_static/site/XXX/distribution The basic structure of the distribution file is: <requestPatterns xmlns:ns2=\"group\"> <regex>[pattern]</regex> <regex>[pattern]</regex> </requestPatterns> In each tag, [pattern] is replaced with a regular expression that will match either the filename or the WMO header of the raw data. Only data that matches a pattern in the distribution file will be processed. The contents of the base version of the radar distribution file: [root@edex]# cd /awips2/edex/data/utility/common_static/base/distribution/ [root@edex]# tail -4 radar.xml <requestPatterns > <regex>^SDUS[234578]. .*</regex> <regex>^Level3.*</regex> </requestPatterns> Looking at the base radar.xml distribution file in this example, there are two regular expressions. The first regular expression matches the standard WMO ID of radar products. 
Using edexBridge, the LDM will place a message in the external.dropbox QPID queue, indicating a radar product has arrived. EDEX will then take the message containing the radar WMO ID (which comes from the file header) and compare it against the regular expressions in radar.xml. If a match is found, EDEX places a message in the QPID queue Ingest.radar. The radar decoder will then consume the message and process the radar data accordingly.","title":"Creating a Site Override"},{"location":"edex/data-distribution-files/#adding-a-regex-to-the-satellite-data-distribution-file","text":"As a quick example, suppose we have a local data source for satellite imagery that does not have a WMO header; also suppose that the data source writes to files whose names start with LOCAL.sat . To add this locally produced satellite data file to the EDEX distribution, perform the following steps. Copy the base version of satellite.gini.xml from the base distribution directory /awips2/edex/data/utility/common_static/base/distribution into the site distribution directory /awips2/edex/data/utility/common_static/site/XXX/distribution Edit the site version of satellite.gini.xml , adding a new <regex> </regex> tag immediately below the existing regular expression ( <regex> </regex> ) tag. The contents of the tag will be ^LOCAL.sat . The final result will be: <requestPatterns xmlns:ns2=\"group\"> <regex>TI[CGT]... ....</regex> <regex>rad/NEXRCOMP</regex> <regex>.\\*.gini.\\*</regex> <regex>^LOCAL.sat.*</regex> </requestPatterns> Save the file and exit the editor. EDEX will automatically pick up the new distribution pattern. Raw files are written to subdirectories in /awips2/data_store/ , and a message is sent via QPID to the EDEX distribution service from the LDM. When a regular expression match is found in a data distribution file, the raw data file is placed in a queue for the matching plugin to decode and process. 
The distribution files are used to match file headers as well as filenames, which is how files dropped into EDEX's manual endpoint ( /awips2/data_store/ingest/ ) are processed.","title":"Adding a REGEX to the Satellite Data Distribution File"},{"location":"edex/data-distribution-files/#editing-an-edex-data-distribution-file","text":"Because these files are in the common_static directory, they have to be manually edited using a text editor. You should not edit the base files; rather, as stated above, you should copy the base version to your site and then edit the site version . The regular expressions in the distribution files do not necessarily need to correspond with the regular expressions in the LDM pqact.conf file. It is important to note that: The regex in the pqact.conf file applies to the productID that is passed through the LDM. and The regex in the distribution files (.xml) typically applies to the header in the file. It can also apply to the filename, if the file is coming through the manual endpoint, or if the data has no header to begin with. If patterns exist in pqact.conf but there are no corresponding matching regex expressions in any distribution file, then raw data files will be written to /awips2/data_store/ but will not be ingested and processed by EDEX. Entries for these non-ingested files would be written to the unrecognized files log in /awips2/edex/logs .","title":"Editing an EDEX Data Distribution File"},{"location":"edex/data-distribution-files/#examples","text":"","title":"Examples"},{"location":"edex/data-distribution-files/#surface-obs","text":"Its distribution file is located at: /awips2/edex/data/utility/common_static/base/distribution/obs.xml : <requestPatterns xmlns:ns2=\"group\"> <regex>^S[AP].*</regex> </requestPatterns> It will process any file header that starts with SA or SP , which should match any WMO header that contains METAR data (e.g. 
SAUS , SPUS , SACN , SAMX ).","title":"Surface Obs"},{"location":"edex/data-distribution-files/#text-data","text":"Its distribution file is located at /awips2/edex/data/utility/common_static/base/distribution/text.xml : <requestPatterns> <regex>^[ACFNRUW][A-Z][A-Z0-9]{4} [A-Z0-9]{4}</regex> <regex>^S[A-CEG-Z].*</regex> <!-- Only have AFOS mapping for T[BCX] --> <regex>^T[BCX].*</regex> <regex>^SF[A-OQ-TV-Z].*</regex> <regex>^SDUS1.*</regex> <regex>^SDUS4[1-6].*</regex> <regex>^SDUS9[^7].*</regex> <regex>^SFU[^S].*</regex> <regex>^SFUS4[^1].*</regex> <regex>^SFP[^A].*</regex> <regex>^SFPA[^4].*</regex> <regex>^SFPA4[^12].*</regex> <regex>^BMBB91.*</regex> <regex>^N[A-Z][A-Z0-9]{4} [A-Z0-9]{4}</regex> <regex>^F[EHIJKLMQVWX].*</regex> <regex>wcl_decrypted</regex> <regex>ecmwf_mos_decrypted</regex> </requestPatterns> Processes lots of WMO patterns. The second pattern ^S[A-CEG-Z].* matches any header that starts with S except for SD or SF . This is because it matches A through C ( A-C ), E, and G through Z ( G-Z ). So it also matches the SA and SP files that the obs.xml plugin matches. 
This means that METARs are processed by both plugins simultaneously.","title":"Text Data"},{"location":"edex/data-distribution-files/#grib-data","text":"Its distribution file is located at /awips2/edex/data/utility/common_static/base/distribution/grib.xml : <requestPatterns> <!-- Super Set of all possible WMO grib patterns --> <!-- Is specifically not restricting on CCCC since HPE isn't populating it --> <regex>^[EHLMOYZ][A-Z]{3}\\d{2}</regex> <!-- Exclude Data Delivery specific patterns --> <regexExclude>^LZ[ABC][ABC]9[123] (KWBC|KNCF)</regexExclude> <!-- ECMWF decrypted --> <regex>ecmwf_decrypted</regex> <!-- NWPS pattern --> <regex>\\p{Alpha}{3}_nwps_CG1</regex> <regex>\\p{Alpha}{3}_nwps_CG0_Trkng</regex> <!-- grib files without WMO headers --> <regex>.*grib.*</regex> <regex>.*GRIB.*</regex> <regex>.*grb.*</regex> <regex>^US058.*</regex> <regex>^CMC_reg.*</regex> </requestPatterns> The grib/grid decoder distribution file matches all numerical grids distributed over the IDD NGRID feed by matching WMO header, and from CONDUIT by matching various .grib file extensions. It also includes an example of a regexExclude message which can be used to single out matching values that aren't to be included.","title":"Grib Data"},{"location":"edex/data-distribution-files/#addtional-information","text":"Important notes about regular expressions: Any time a new entry is placed in the pqact.conf file on LDM, it is likely a corresponding entry needs to be added to the appropriate Data Distribution file in the data distribution directory, or the data file will be logged to edex-ingest-unrecognized-files-YYYYMMDD.log . The exception to this rule is if the new data coming from the LDM is a type of data that already exists and EDEX already has a distribution file with a matching regex that will recognize it. If you have written a new regex for a distribution file to match on a filename, and it is not matching, then the file most likely has a header. 
In this case EDEX will only look at the header to match the regex. You must change your regex to something that matches the header, not the filename.","title":"Additional Information"},{"location":"edex/data-plugins/","text":"AWIPS Plugins and Supported Data Types \uf0c1 NAME DESCRIPTION aqi Air Quality Index data bufrmos Model Output Statistics bufrua Upper air radiosonde data climate-hmdb Climate text products geodata NetCDF JTS Geometry records geomag SWPC Geomagnetic Forecast (RTKP) gfe Graphical Forecast Editor grids ghcd SWPC Generic High Cadence Data gpd NCEP Generic Point Data grid Binary gridded data grib1/grib2 idft Ice Drift Forecasts madis NCEP Meteorological Assimilation Data Ingest System ( MADIS ) manualIngest Manual data ingest plugin metartohmdb Adds metar records to the Verification and Climate database modelsounding Individual grid point soundings from the GFS and NAM models mping Meteorological Phenomena Identification Near the Ground ( mPING ) ncpafm Point/Area Forecast Matrices data nctext NCEP Text decoders ncuair NCEP Upper Air decoder ndm National Dataset Maintenance ingester ntrans NCEP Ntrans Metafiles obs Surface observations from METARs pgen NCEP NAWIPS PGEN decoder redbook Redbook graphics sfcobs Surface observations other than METAR format including buoys solarimage SWPC Solar imagery ssha NCEP Sea Surface Height Anomaly BUFR data text Various Text Products vaa Volcanic ash advisories AWIPS Plugins for Remote Sensing/Lightning \uf0c1 NAME DESCRIPTION binlightning Lightning data from the National Lightning Detection Network bufrascat Advanced Scatterometer wind data bufrhdw GOES High Density Winds bufrmthdw MTSAT (Japanese Multi-Functional Transport Satellite) High Density Winds bufrssmi Special Sensor Microwave/Imager data from DMSP (Defense Meteorological Satellite Program) satellites crimss NPP/NPOESS CrIMSS (Cross Track Infrared and Microwave Sounding Suite) soundings dmw GOES-R Derived Motion 
Winds glm GOES Geostationary Lightning Mapper goesr Plugins to decode and display GOES-R products goessounding GOES Satellite Soundings lma Lightning Mapping Array mcidas NCEP decoder for McIDAS AREA files modis NASA Moderate-resolution Imaging Spectroradiometer ncscat NCEP ASCAT/Quikscat records npp National Polar-Orbiting Partnership Satellites Soundings nucaps Soundings from NOAA Unique CrIS/ATMS Processing System from NPP (National Polar-Orbiting Partnership) Satellites poessounding Polar Operational Environmental Satellite soundings radar WSR-88D and TDWR Level 3 data regionalsat Decoder implementation for netcdf3 files generated by the Alaska Region and GOES-R Proving Ground satellite-gini GINI-formatted satellite imagery (GOES, POES, VIIRS, FNEXRAD) satellite-mcidas McIDAS area files (Raytheon/D2D-developed) viirs NPP Visible Infrared Imaging Radiometer Suite data sgwh NCEP BUFR Significant Wave Height data - SGWH (Jason-1), SGWHA (Altika), SGWHC (CryoSat), SGWHE (Envisat), SGWHG (GFO), SGWH2 (Jason-2), or Jason-3 textlightning Text lightning data AWIPS Plugins for Decision Assistance (Watch/Warn/Hazards/Hydro) \uf0c1 NAME DESCRIPTION atcf Automated Tropical Cyclone Forecast convectprob NOAA/CIMSS Prob Severe Model editedregions Hazard Services Edited Regions editedevents Hazard Services Edited Events cwat County Warning Area Threat produced by SCAN (System for Convection Analysis and Nowcasting). CWAT was formerly called SCAN Convective Threat Index (SCTI). ffg Flash flood guidance metadata (county-based ffg from RFCs) ffmp Flash Flood Monitoring and Prediction data (raw data inputs: radar, gridded flash flood guidance from River Forecast Centers, high-resolution precipitation estimates [HPE] and nowcasts [HPN], QPF from SCAN and gage data from the IHFS [Integrated Hydrologic Forecast System] database. Radar data [with WSR-88D product mnemonics and numbers] needed for FFMP are Digital Hybrid Reflectivity [DHR, 32] and Digital Precipitation Rate [DPR, 176]. 
The raw GRIB files containing RFC Flash Flood Guidance are identified in the tables in Part 2 of this document as NWS_151 or FFG-XXX, where XXX is an RFC identifier such as TUA, KRF, or ALR. The WMO header for the RFC FFG begins with \u201cZEGZ98\u201d. ) fog Fog Monitor . Raw data inputs: METAR, Mesonet, maritime, buoys, MAROBs, and satellite [visible, 3.9 \u00b5m, and 10.7 \u00b5m]) freezingLevel MPE Rapid Refresh Freezing Level scheduled process (MpeRUCFreezingLevel) fssobs Observations for the Fog monitor, SNOW, and SAFESEAS (raw data inputs: METAR, Mesonet, maritime, buoys, MAROBs). lsr Local Storm Reports mpe Multi-sensor Precipitation Estimation preciprate Precipitation Rate from SCAN. Raw data input: radar data [with WSR-88D product mnemonic and number] needed for preciprate are Digital Hybrid Reflectivity [DHR, 32]. qpf Quantitative Precipitation Forecast from SCAN. (raw data inputs: radar and some RAP13 fields. Radar data [with WSR-88D product mnemonics and numbers] needed for SCAN\u2019s QPF are 0.5 degree Base Reflectivity [Z, 19], 4 km Vertically Integrated Liquid [VIL, 57], and Storm Track [STI, 58]. The RAP13 field needed is 700 mb Wind, as defined in the SCANRunSiteConfig.xml file.) satpre Satellite-estimated Precipitation (hydroApps) scan SCAN (System for Convection Analysis and Nowcasting). (Inputs for the SCAN Table include radar, cloud-to-ground lightning from the NLDN, fields from RAP13, and CWAT. Specific radar products [with WSR-88D product mnemonics and numbers] are: 1 km Composite Reflectivity [CZ, 37]; 0.5 degree Base Reflectivity [Z, 19]; 4 km Vertically Integrated Liquid [VIL, 57]; Storm Track [STI, 58]; Mesocyclone Detections [MD, 141]; and Tornadic Vortex Signature [TVS, 61]. shef Standard Hydrometeorological Exchange Format data. 
warning Watches, Warnings, and Advisories wcp SPC Convective Watches svrwx SPC Local Storm Report Summaries tcg Tropical Cyclone Guidance tcm Tropical Cyclone Forecast/Advisory tcs Tropical Cyclone Forecast/Advisory stormtrack NCEP StormTrack Plug-In (Automatic Tropical Cyclone Forecast & Ensemble cyclones) vil Cell-based Vertically Integrated Liquid from SCAN (Input is radar) spc Storm Prediction Center Convective Outlook KML files AWIPS Plugins for Aviation \uf0c1 NAME DESCRIPTION acars Aircraft Communications Addressing and Reporting System (ACARS) observations acarssounding Vertical profiles derived from ACARS data airep Automated Aircraft Reports airmet \u201cAirmen\u2019s Meteorological Information\u201d: aviation weather advisories for potentially hazardous, but non-severe weather asdi FAA Aircraft Situation Data for Industry aww Airport Weather Warning bufrncwf National Convective Weather Forecast for Aviation bufrsigwx Aviation Significant Weather ccfp Aviation Collaborative Convective Forecast Product convsigmet Aviation Significant Meteorological Information for convective weather cwa Aviation Center Weather Advisory, issued by CWSUs (Center Weather Service Units) intlsigmet International Significant Meteorological Information for Aviation nctaf NCEP TAF decoders nonconvsigmet Aviation Significant Meteorological Information for non-convective weather pirep Pilot Reports taf Terminal Aerodrome Forecasts","title":"Data Plugins"},{"location":"edex/data-plugins/#awips-plugins-and-supported-data-types","text":"NAME DESCRIPTION aqi Air Quality Index data bufrmos Model Output Statistics bufrua Upper air radiosonde data climate-hmdb Climate text products geodata NetCDF JTS Geometry records geomag SWPC Geomagnetic Forecast (RTKP) gfe Graphical Forecast Editor grids ghcd SWPC Generic High Cadence Data gpd NCEP Generic Point Data grid Binary gridded data grib1/grib2 idft Ice Drift Forecasts madis NCEP Meteorological Assimilation Data Ingest System ( MADIS ) 
manualIngest Manual data ingest plugin metartohmdb Adds metar records to the Verification and Climate database modelsounding Individual grid point soundings from the GFS and NAM models mping Meteorological Phenomena Identification Near the Ground ( mPING ) ncpafm Point/Area Forecast Matrices data nctext NCEP Text decoders ncuair NCEP Upper Air decoder ndm National Dataset Maintenance ingester ntrans NCEP Ntrans Metafiles obs Surface observations from METARs pgen NCEP NAWIPS PGEN decoder redbook Redbook graphics sfcobs Surface observations other than METAR format including buoys solarimage SWPC Solar imagery ssha NCEP Sea Surface Height Anomaly BUFR data text Various Text Products vaa Volcanic ash advisories","title":"AWIPS Plugins and Supported Data Types"},{"location":"edex/data-plugins/#awips-plugins-for-remote-sensinglightning","text":"NAME DESCRIPTION binlightning Lightning data from the National Lightning Detection Network bufrascat Advanced Scatterometer wind data bufrhdw GOES High Density Winds bufrmthdw MTSAT (Japanese Multi-Functional Transport Satellite) High Density Winds bufrssmi Special Sensor Microwave/Imager data from DMSP (Defense Meteorological Satellite Program) satellites crimss NPP/NPOESS CrIMSS (Cross Track Infrared and Microwave Sounding Suite) soundings dmw GOES-R Derived Motion Winds glm GOES Geostationary Lightning Mapper goesr Plugins to decode and display GOES-R products goessounding GOES Satellite Soundings lma Lightning Mapping Array mcidas NCEP decoder for McIDAS AREA files modis NASA Moderate-resolution Imaging Spectroradiometer ncscat NCEP ASCAT/Quikscat records npp National Polar-Orbiting Partnership Satellites Soundings nucaps Soundings from NOAA Unique CrIS/ATMS Processing System from NPP (National Polar-Orbiting Partnership) Satellites poessounding Polar Operational Environmental Satellite soundings radar WSR-88D and TDWR Level 3 data regionalsat Decoder implementation for netcdf3 files generated by the Alaska Region and GOES-R 
Proving Ground satellite-gini GINI-formatted satellite imagery (GOES, POES, VIIRS, FNEXRAD) satellite-mcidas McIDAS area files (Raytheon/D2D-developed) viirs NPP Visible Infrared Imaging Radiometer Suite data sgwh NCEP BUFR Significant Wave Height data - SGWH (Jason-1), SGWHA (Altika), SGWHC (CryoSat), SGWHE (Envisat), SGWHG (GFO), SGWH2 (Jason-2), or Jason-3 textlightning Text lightning data","title":"AWIPS Plugins for Remote Sensing/Lightning"},{"location":"edex/data-plugins/#awips-plugins-for-decision-assistance-watchwarnhazardshydro","text":"NAME DESCRIPTION atcf Automated Tropical Cyclone Forecast convectprob NOAA/CIMSS Prob Severe Model editedregions Hazard Services Edited Regions editedevents Hazard Services Edited Events cwat County Warning Area Threat produced by SCAN (System for Convection Analysis and Nowcasting). CWAT was formerly called SCAN Convective Threat Index (SCTI). ffg Flash flood guidance metadata (county-based ffg from RFCs) ffmp Flash Flood Monitoring and Prediction data (raw data inputs: radar, gridded flash flood guidance from River Forecast Centers, high-resolution precipitation estimates [HPE] and nowcasts [HPN], QPF from SCAN and gage data from the IHFS [Integrated Hydrologic Forecast System] database. Radar data [with WSR-88D product mnemonics and numbers] needed for FFMP are Digital Hybrid Reflectivity [DHR, 32] and Digital Precipitation Rate [DPR, 176]. The raw GRIB files containing RFC Flash Flood Guidance are identified in the tables in Part 2 of this document as NWS_151 or FFG-XXX, where XXX is an RFC identifier such as TUA, KRF, or ALR. The WMO header for the RFC FFG begins with \u201cZEGZ98\u201d. ) fog Fog Monitor . 
Raw data inputs: METAR, Mesonet, maritime, buoys, MAROBs, and satellite [visible, 3.9 \u00b5m, and 10.7 \u00b5m]) freezingLevel MPE Rapid Refresh Freezing Level scheduled process (MpeRUCFreezingLevel) fssobs Observations for the Fog monitor, SNOW, and SAFESEAS (raw data inputs: METAR, Mesonet, maritime, buoys, MAROBs). lsr Local Storm Reports mpe Multi-sensor Precipitation Estimation preciprate Precipitation Rate from SCAN. Raw data input: radar data [with WSR-88D product mnemonic and number] needed for preciprate are Digital Hybrid Reflectivity [DHR, 32]. qpf Quantitative Precipitation Forecast from SCAN. (raw data inputs: radar and some RAP13 fields. Radar data [with WSR-88D product mnemonics and numbers] needed for SCAN\u2019s QPF are 0.5 degree Base Reflectivity [Z, 19], 4 km Vertically Integrated Liquid [VIL, 57], and Storm Track [STI, 58]. The RAP13 field needed is 700 mb Wind, as defined in the SCANRunSiteConfig.xml file.) satpre Satellite-estimated Precipitation (hydroApps) scan SCAN (System for Convection Analysis and Nowcasting). (Inputs for the SCAN Table include radar, cloud-to-ground lightning from the NLDN, fields from RAP13, and CWAT. Specific radar products [with WSR-88D product mnemonics and numbers] are: 1 km Composite Reflectivity [CZ, 37]; 0.5 degree Base Reflectivity [Z, 19]; 4 km Vertically Integrated Liquid [VIL, 57]; Storm Track [STI, 58]; Mesocyclone Detections [MD, 141]; and Tornadic Vortex Signature [TVS, 61]. shef Standard Hydrometeorological Exchange Format data. 
warning Watches, Warnings, and Advisories wcp SPC Convective Watches svrwx SPC Local Storm Report Summaries tcg Tropical Cyclone Guidance tcm Tropical Cyclone Forecast/Advisory tcs Tropical Cyclone Forecast/Advisory stormtrack NCEP StormTrack Plug-In (Automatic Tropical Cyclone Forecast & Ensemble cyclones) vil Cell-based Vertically Integrated Liquid from SCAN (Input is radar) spc Storm Prediction Center Convective Outlook KML files","title":"AWIPS Plugins for Decision Assistance (Watch/Warn/Hazards/Hydro)"},{"location":"edex/data-plugins/#awips-plugins-for-aviation","text":"NAME DESCRIPTION acars Aircraft Communications Addressing and Reporting System (ACARS) observations acarssounding Vertical profiles derived from ACARS data airep Automated Aircraft Reports airmet \u201cAirmen\u2019s Meteorological Information\u201d: aviation weather advisories for potentially hazardous, but non-severe weather asdi FAA Aircraft Situation Data for Industry aww Airport Weather Warning bufrncwf National Convective Weather Forecast for Aviation bufrsigwx Aviation Significant Weather ccfp Aviation Collaborative Convective Forecast Product convsigmet Aviation Significant Meteorological Information for convective weather cwa Aviation Center Weather Advisory, issued by CWSUs (Center Weather Service Units) intlsigmet International Significant Meteorological Information for Aviation nctaf NCEP TAF decoders nonconvsigmet Aviation Significant Meteorological Information for non-convective weather pirep Pilot Reports taf Terminal Aerodrome Forecasts","title":"AWIPS Plugins for Aviation"},{"location":"edex/data-purge/","text":"Purging and Retention \uf0c1 Purge Types \uf0c1 There are two main forms of data purging in AWIPS. The most often thought of is the purging for processed data . This has to do with how long data is stored for after it has been decoded and processed. The second type of purging has to do with raw data . 
This has to do with how long data is stored for before it has been decoded. Processed Data Purging \uf0c1 AWIPS uses a plugin-based purge strategy for processed HDF5 data . This allows the user to change the purge frequency for each plugin individually, and even set purge rules for specific products for a particular plugin. There is also a default purge rules file for those products which do not have specific rules written. Note : Purging is triggered by a quartz timer event that fires at 30 minutes after each hour. Purging rules are defined in XML files in the Localization Store. On EDEX, most are located in /awips2/edex/data/utility/common_static/base/purge , and follow the base/site localization pattern (e.g. site purge files are in site/XXX/purge rather than base/purge , where XXX is the site identifier). Each data set can have a purge rule defined, and the xml file is named after the data set: ls /awips2/edex/data/utility/common_static/base/purge/ acarsPurgeRules.xml bufruaPurgeRules.xml pirepPurgeRules.xml acarssoundingPurgeRules.xml ccfpPurgeRules.xml poessoundingPurgeRules.xml aggregatePurgeRules.xml convsigmetPurgeRules.xml pointsetPurgeRules.xml airepPurgeRules.xml cwaPurgeRules.xml profilerPurgeRules.xml ... Time-based purge \uf0c1 If a plugin has no XML file, the default rule of 1 day (24 hours) is used, from /awips2/edex/data/utility/common_static/base/purge/defaultPurgeRules.xml : <purgeRuleSet> <defaultRule> <period>01-00:00:00</period> </defaultRule> </purgeRuleSet> Time-based purging is set with the period tag and uses the reference time of the data. The reference time of the data is determined by the decoder. 30-day NEXRAD3 Example \uf0c1 Modify /awips2/edex/data/utility/common_static/base/purge/radarPurgeRules.xml to increase the data retention period from 1 to 31 days: <purgeRuleSet> <defaultRule> <period>31-00:00:00</period> </defaultRule> </purgeRuleSet> Note : you do NOT have to restart EDEX when you change a purge rule! 
Frame-Based Purge \uf0c1 Some plugins use frame-based purging, retaining a certain number of product \"versions\". /awips2/edex/data/utility/common_static/base/purge/gridPurgeRules.xml <defaultRule> <versionsToKeep>2</versionsToKeep> <period>07-00:00:00</period> </defaultRule> <rule> <keyValue>LAPS</keyValue> <versionsToKeep>30</versionsToKeep> </rule> <rule regex=\"true\"> <keyValue>NAM(?:12|20|40)</keyValue> <versionsToKeep>2</versionsToKeep> <modTimeToWait>00-00:15:00</modTimeToWait> </rule> ... In the above example, notice a default rule (2) is specified, as well as specific models with their own rules. The tag modTimeToWait can be used in conjunction with versionsToKeep and will increase the versionsToKeep by 1 if data matching this rule has been stored within modTimeToWait. Purge Logs \uf0c1 Data purge events are logged to the file edex-ingest-purge-[yyyymmdd].log , where [yyyymmdd] is the date stamp. tail -f edex-ingest-purge-20120327.log --------START LOG PURGE--------- INFO 2012-03-27 00:30:00,027 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Skipped file with invalid fileName: afos-trigger.log INFO 2012-03-27 00:30:00,193 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Removed 1 old files INFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Archived 14 files INFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Skipped processing 1 files INFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::---------END LOG PURGE----------- All Purge Rules \uf0c1 To see all purge rule directories (base, site, configured): find /awips2/edex/data/utility -name purge /awips2/edex/data/utility/common_static/base/purge If any overrides have been made, then it's possible that site directories may show up as results from the find command as well. 
Raw Data Purging \uf0c1 Raw data are files that have been brought in by the LDM and recognized by an action in the pqact.conf file. These files are written to subdirectories of /awips2/data_store/ . This data will wait here until it is purged, from the purging rules defined in /awips2/edex/data/utility/common_static/base/archiver/purger/RAW_DATA.xml . If the purge time is too short, and the processing latencies on EDEX are too long, it is possible that EDEX will miss some of this data, and the purge times will need to be adjusted by changing the <defaultRetentionHours> or <selectedRetentionHours> tag on the relevant data sets. Default Retention \uf0c1 The defaultRetentionHours tag is defined at the beginning of the RAW_DATA.xml file. It is the duration that will apply to any piece of data that does not fall under an explicitly defined category . The default value for our EDEX is 1 hour: <archive> <name>Raw</name> <rootDir>/awips2/data_store/</rootDir> <defaultRetentionHours>1</defaultRetentionHours> <category> ... Selected Retention \uf0c1 Data sets are broken up into categories in the RAW_DATA.xml file. These categories are groupings of similar data. Each category has a selectedRetentionHours tag which specifies how long the matching data will be kept. For example, there is a Model category which sets the purge time to 3 hours for all grib, bufrmos, and modelsounding data: ... <category> <name>Model</name> <selectedRetentionHours>3</selectedRetentionHours> <dataSet> <dirPattern>(grib|grib2)/(\\d{4})(\\d{2})(\\d{2})/(\\d{2})/(.*)</dirPattern> <displayLabel>{1} - {6}</displayLabel> <dateGroupIndices>2,3,4,5</dateGroupIndices> </dataSet> <dataSet> <dirPattern>(bufrmos|modelsounding)/(\\d{4})(\\d{2})(\\d{2})/(\\d{2})</dirPattern> <displayLabel>{1}</displayLabel> <dateGroupIndices>2,3,4,5</dateGroupIndices> </dataSet> </category> ... 
Logging \uf0c1 Raw data purging can be seen in the purge logs as well ( /awips2/edex/logs/edex-ingest-purge-[yyyymmdd].log where [yyyymmdd] is the date stamp). [centos@tg-atm160027-edex-dev purge]$ grep -i 'archive' /awips2/edex/logs/edex-ingest-purge-20200728.log INFO 2020-07-28 20:05:23,959 2329 [Purge-Archive] ArchivePurgeManager: EDEX - Start purge of category Raw - Observation, directory \"/awips2/data_store/bufrhdw\". INFO 2020-07-28 20:05:23,960 2330 [Purge-Archive] ArchivePurgeManager: EDEX - End purge of category Raw - Observation, directory \"/awips2/data_store/bufrhdw\", deleted 0 files and directories. INFO 2020-07-28 20:05:23,961 2331 [Purge-Archive] ArchivePurgeManager: EDEX - Unlocked: \"/awips2/data_store/bufrhdw\" INFO 2020-07-28 20:05:23,963 2332 [Purge-Archive] ArchivePurgeManager: EDEX - Locked: \"/awips2/data_store/xml\" INFO 2020-07-28 20:05:23,963 2333 [Purge-Archive] ArchivePurgeManager: EDEX - Start purge of category Raw - Products, directory \"/awips2/data_store/xml\". INFO 2020-07-28 20:05:23,964 2334 [Purge-Archive] ArchivePurgeManager: EDEX - End purge of category Raw - Products, directory \"/awips2/data_store/xml\", deleted 5 files and directories. INFO 2020-07-28 20:05:23,967 2335 [Purge-Archive] ArchivePurgeManager: EDEX - Unlocked: \"/awips2/data_store/xml\" INFO 2020-07-28 20:05:23,967 2336 [Purge-Archive] ArchivePurger: EDEX - Raw::Archive Purged 28387 files in 23.8s. INFO 2020-07-28 20:05:23,979 2337 [Purge-Archive] ArchivePurgeManager: EDEX - Purging directory: \"/awips2/edex/data/archive\". INFO 2020-07-28 20:05:23,992 2338 [Purge-Archive] ArchivePurger: EDEX - Processed::Archive Purged 0 files in 25ms. INFO 2020-07-28 20:05:23,992 2339 [Purge-Archive] ArchivePurger: EDEX - Archive Purge finished. 
Time to run: 23.9s ...","title":"Purging and Retention"},{"location":"edex/data-purge/#purging-and-retention","text":"","title":"Purging and Retention"},{"location":"edex/data-purge/#purge-types","text":"There are two main forms of data purging in AWIPS. The most often thought of is the purging for processed data . This has to do with how long data is stored for after it has been decoded and processed. The second type of purging has to do with raw data . This has to do with how long data is stored for before it has been decoded.","title":"Purge Types"},{"location":"edex/data-purge/#processed-data-purging","text":"AWIPS uses a plugin-based purge strategy for processed HDF5 data . This allows the user to change the purge frequency for each plugin individually, and even set purge rules for specific products for a particular plugin. There is also a default purge rules file for those products which do not have specific rules written. Note : Purging is triggered by a quartz timer event that fires at 30 minutes after each hour. Purging rules are defined in XML files in the Localization Store. On EDEX, most are located in /awips2/edex/data/utility/common_static/base/purge , and follow the base/site localization pattern (e.g. site purge files are in site/XXX/purge rather than base/purge , where XXX is the site identifier). 
Each data set can have a purge rule defined, and the xml file is named after the data set: ls /awips2/edex/data/utility/common_static/base/purge/ acarsPurgeRules.xml bufruaPurgeRules.xml pirepPurgeRules.xml acarssoundingPurgeRules.xml ccfpPurgeRules.xml poessoundingPurgeRules.xml aggregatePurgeRules.xml convsigmetPurgeRules.xml pointsetPurgeRules.xml airepPurgeRules.xml cwaPurgeRules.xml profilerPurgeRules.xml ...","title":"Processed Data Purging"},{"location":"edex/data-purge/#time-based-purge","text":"If a plugin has no XML file, the default rule of 1 day (24 hours) is used, from /awips2/edex/data/utility/common_static/base/purge/defaultPurgeRules.xml : <purgeRuleSet> <defaultRule> <period>01-00:00:00</period> </defaultRule> </purgeRuleSet> Time-based purging is set with the period tag and uses the reference time of the data. The reference time of the data is determined by the decoder.","title":"Time-based purge"},{"location":"edex/data-purge/#30-day-nexrad3-example","text":"Modify /awips2/edex/data/utility/common_static/base/purge/radarPurgeRules.xml to increase the data retention period from 1 to 31 days: <purgeRuleSet> <defaultRule> <period>31-00:00:00</period> </defaultRule> </purgeRuleSet> Note : you do NOT have to restart EDEX when you change a purge rule!","title":"30-day NEXRAD3 Example"},{"location":"edex/data-purge/#frame-based-purge","text":"Some plugins use frame-based purging, retaining a certain number of product \"versions\". /awips2/edex/data/utility/common_static/base/purge/gridPurgeRules.xml <defaultRule> <versionsToKeep>2</versionsToKeep> <period>07-00:00:00</period> </defaultRule> <rule> <keyValue>LAPS</keyValue> <versionsToKeep>30</versionsToKeep> </rule> <rule regex=\"true\"> <keyValue>NAM(?:12|20|40)</keyValue> <versionsToKeep>2</versionsToKeep> <modTimeToWait>00-00:15:00</modTimeToWait> </rule> ... In the above example, notice a default rule (2) is specified, as well as specific models with their own rules. 
The tag modTimeToWait can be used in conjunction with versionsToKeep and will increase the versionsToKeep by 1 if data matching this rule has been stored within modTimeToWait.","title":"Frame-Based Purge"},{"location":"edex/data-purge/#purge-logs","text":"Data purge events are logged to the file edex-ingest-purge-[yyyymmdd].log , where [yyyymmdd] is the date stamp. tail -f edex-ingest-purge-20120327.log --------START LOG PURGE--------- INFO 2012-03-27 00:30:00,027 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Skipped file with invalid fileName: afos-trigger.log INFO 2012-03-27 00:30:00,193 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Removed 1 old files INFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Archived 14 files INFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::Skipped processing 1 files INFO 2012-03-27 00:31:23,155 [DefaultQuartzScheduler_Worker-3] PurgeLogger: EDEX - PURGE LOGS::---------END LOG PURGE-----------","title":"Purge Logs"},{"location":"edex/data-purge/#all-purge-rules","text":"To see all purge rule directories (base, site, configured): find /awips2/edex/data/utility -name purge /awips2/edex/data/utility/common_static/base/purge If any overrides have been made, then it's possible that site directories may show up as results from the find command as well.","title":"All Purge Rules"},{"location":"edex/data-purge/#raw-data-purging","text":"Raw data are files that have been brought in by the LDM and recognized by an action in the pqact.conf file. These files are written to subdirectories of /awips2/data_store/ . This data will wait here until it is purged, from the purging rules defined in /awips2/edex/data/utility/common_static/base/archiver/purger/RAW_DATA.xml . 
If the purge time is too short, and the processing latencies on EDEX are too long, it is possible that EDEX will miss some of this data, and the purge times will need to be adjusted by changing the <defaultRetentionHours> or <selectedRetentionHours> tag on the relevant data sets.","title":"Raw Data Purging"},{"location":"edex/data-purge/#default-retention","text":"The defaultRetentionHours tag is defined at the beginning of the RAW_DATA.xml file. It is the duration that will apply to any piece of data that does not fall under an explicitly defined category . The default value for our EDEX is 1 hour: <archive> <name>Raw</name> <rootDir>/awips2/data_store/</rootDir> <defaultRetentionHours>1</defaultRetentionHours> <category> ...","title":"Default Retention"},{"location":"edex/data-purge/#selected-retention","text":"Data sets are broken up into categories in the RAW_DATA.xml file. These categories are groupings of similar data. Each category has a selectedRetentionHours tag which specifies how long the matching data will be kept. For example, there is a Model category which sets the purge time to 3 hours for all grib, bufrmos, and modelsounding data: ... <category> <name>Model</name> <selectedRetentionHours>3</selectedRetentionHours> <dataSet> <dirPattern>(grib|grib2)/(\\d{4})(\\d{2})(\\d{2})/(\\d{2})/(.*)</dirPattern> <displayLabel>{1} - {6}</displayLabel> <dateGroupIndices>2,3,4,5</dateGroupIndices> </dataSet> <dataSet> <dirPattern>(bufrmos|modelsounding)/(\\d{4})(\\d{2})(\\d{2})/(\\d{2})</dirPattern> <displayLabel>{1}</displayLabel> <dateGroupIndices>2,3,4,5</dateGroupIndices> </dataSet> </category> ...","title":"Selected Retention"},{"location":"edex/data-purge/#logging","text":"Raw data purging can be seen in the purge logs as well ( /awips2/edex/logs/edex-ingest-purge-[yyyymmdd].log where [yyyymmdd] is the date stamp). 
[centos@tg-atm160027-edex-dev purge]$ grep -i 'archive' /awips2/edex/logs/edex-ingest-purge-20200728.log INFO 2020-07-28 20:05:23,959 2329 [Purge-Archive] ArchivePurgeManager: EDEX - Start purge of category Raw - Observation, directory \"/awips2/data_store/bufrhdw\". INFO 2020-07-28 20:05:23,960 2330 [Purge-Archive] ArchivePurgeManager: EDEX - End purge of category Raw - Observation, directory \"/awips2/data_store/bufrhdw\", deleted 0 files and directories. INFO 2020-07-28 20:05:23,961 2331 [Purge-Archive] ArchivePurgeManager: EDEX - Unlocked: \"/awips2/data_store/bufrhdw\" INFO 2020-07-28 20:05:23,963 2332 [Purge-Archive] ArchivePurgeManager: EDEX - Locked: \"/awips2/data_store/xml\" INFO 2020-07-28 20:05:23,963 2333 [Purge-Archive] ArchivePurgeManager: EDEX - Start purge of category Raw - Products, directory \"/awips2/data_store/xml\". INFO 2020-07-28 20:05:23,964 2334 [Purge-Archive] ArchivePurgeManager: EDEX - End purge of category Raw - Products, directory \"/awips2/data_store/xml\", deleted 5 files and directories. INFO 2020-07-28 20:05:23,967 2335 [Purge-Archive] ArchivePurgeManager: EDEX - Unlocked: \"/awips2/data_store/xml\" INFO 2020-07-28 20:05:23,967 2336 [Purge-Archive] ArchivePurger: EDEX - Raw::Archive Purged 28387 files in 23.8s. INFO 2020-07-28 20:05:23,979 2337 [Purge-Archive] ArchivePurgeManager: EDEX - Purging directory: \"/awips2/edex/data/archive\". INFO 2020-07-28 20:05:23,992 2338 [Purge-Archive] ArchivePurger: EDEX - Processed::Archive Purged 0 files in 25ms. INFO 2020-07-28 20:05:23,992 2339 [Purge-Archive] ArchivePurger: EDEX - Archive Purge finished. Time to run: 23.9s ...","title":"Logging"},{"location":"edex/distributed-computing/","text":"Distributed EDEX \uf0c1 AWIPS makes use of service-oriented architecture to request, process, and serve real-time meteorological data. 
While originally developed for use on internal NWS forecast office networks, where operational installations of AWIPS could consist of a dozen servers or more, the early NSF Unidata releases were stripped of operations-specific configurations and plugins, and released as a standalone server. This worked, since (at the time) a single EDEX instance with an attached SSD could handle most of NOAAport. However, with GOES-R(16) coming online in 2017, and more gridded forecast models being created at finer temporal and spatial resolutions, there is now a need to distribute the data decoding across multiple machines to handle this firehose of data. NSF Unidata's Current EDEX Server \uf0c1 Currently, we use a distributed architecture comprised of three machines: 1 main EDEX machine and 2 ancillary EDEX machines. The main EDEX machine decodes and processes the majority of the data, while serving and storing all of the data. Our two ancillary machines -- one for radar data and one for satellite data -- each decode and process a subset of the data and send it back to the main EDEX for storage and requesting. The main EDEX is an instance of our Database and Request Server and more information on our ancillary EDEX machines is below as well. Example Installation \uf0c1 This walkthrough will install different EDEX components on two machines in the XSEDE Jetstream Cloud, the first is used to store and serve while the second is used to ingest and decode data. Database/Request Server \uf0c1 For this example, this server will be referred to by the IP address 10.0.0.9 . 1. Install \uf0c1 wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 awips_install.sh sudo ./awips_install.sh --database 2. IPtables Config \uf0c1 It is required that ports 5432 and 5672 be open for the specific IP addresses of outside EDEX ingest servers. 
It is not recommended that you leave port 5432 open to all connections (since the default awips database password is known, and is not meant as a security measure). Further, it is recommended that you change the default postgres awips user password (which then requires a reconfiguration of every remote EDEX ingest server in order to connect to this database/request server). vi /etc/sysconfig/iptables *filter :INPUT DROP [0:0] :FORWARD DROP [0:0] :OUTPUT ACCEPT [0:0] :EXTERNAL - [0:0] :EDEX - [0:0] -A INPUT -i lo -j ACCEPT -A INPUT -p icmp --icmp-type any -j ACCEPT -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT -A INPUT -s 10.0.0.7 -j EDEX -A INPUT -j EXTERNAL -A EXTERNAL -j REJECT -A EDEX -m state --state NEW -p tcp --dport 5432 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 5672 -j ACCEPT -A EDEX -j REJECT COMMIT Note the line -A INPUT -s 10.0.0.7 -j EDEX as well as the following -A EDEX ... rules for ports 5432 (PostgreSQL) and 5672 (PyPIES/HDF5). The two ports left open to all connections (9581,9582) in addition to default port 22 are for outside CAVE client connections 3. 
Database Config \uf0c1 In the file /awips2/database/data/pg_hba.conf you define remote connections for all postgres databases as <IP address>/32 , after the block of IPv4 local connections, and a generic <IP address>/24 for hostnossl: vi /awips2/database/data/pg_hba.conf # \"local\" is for Unix domain socket connections only local all all trust hostssl all all 162.0.0.0/8 cert clientcert=1 hostssl all all 127.0.0.1/32 cert clientcert=1 hostssl all all 10.0.0.7/32 cert clientcert=1 hostnossl postgres all 10.0.0.0/24 md5 hostnossl fxatext all 10.0.0.0/24 md5 hostnossl metadata all 10.0.0.0/24 md5 # IPv6 local connections: hostssl all all ::1/128 cert clientcert=1 hostnossl all all ::1/128 md5 4. Start EDEX \uf0c1 edex start database This will start PostgreSQL, httpd-pypies, Qpid, and the EDEX Request JVM (and will not start the LDM or the EDEX Ingest and IngestGrib JVMs) 5. Monitor Services \uf0c1 The command edex will show which services are running, and for a Database/Request server, will not include the LDM, EDEXingest, or EDEXgrib: edex [edex status] postgres :: running :: pid 571 pypies :: running :: pid 639 qpid :: running :: pid 674 EDEXingest :: not running EDEXgrib :: not running EDEXrequest :: running :: pid 987 1029 23792 Since this Database/Request server is not running the main edexIngest JVM, we won't see anything from edex log , instead watch the Request Server with the command edex log request Confirm that EDEX Request connects to PostgreSQL! With the above edex log request , ensure that the log progresses past this point : Spring-enabled Plugins: ----------------------- acars-common, acars-common-dataaccess, acarssounding-common, activetable-common, activetable-request, airep-common, airep-common-dataaccess, airmet-common, atcf-common, atcf-request, auth-request, awipstools-request, aww-common... 
JAXB context for PersistencePathKeySet inited in: 5ms INFO 20:21:09,134 5584 [EDEXMain] Reflections: Reflections took 436 ms to scan 258 urls, producing 31 keys and 3637 values Found 499 db classes in 720 ms If the log stops at the Found db classes... line, that means EDEX is not connecting to PostgreSQL - double-check DB_ADDR in /awips2/edex/bin/setup.env Ancillary EDEX Server (Ingest/Decode EDEX Server) \uf0c1 For this example, this server will be referred to by the IP address 10.0.0.7 . The Main EDEX server will be referred to by the IP address 10.0.0.9 . 1. Install \uf0c1 wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 awips_install.sh sudo ./awips_install.sh --ingest 2. EDEX Config \uf0c1 vi /awips2/edex/bin/setup.env Here you should redefine DB_ADDR and PYPIES_SERVER to point to the Main or Database/Request server (10.0.0.9) and the EXT_ADDR to point to the current Ingest server (10.0.0.7) export EXT_ADDR=10.0.0.7 # postgres connection export DB_ADDR=10.0.0.9 export DB_PORT=5432 # pypies hdf5 connection export PYPIES_SERVER=http://10.0.0.9:9582 # qpid connection export BROKER_ADDR=${EXT_ADDR} Notice that EXT_ADDR and BROKER_ADDR (qpid) should remain defined as the localhost IP address (10.0.0.7) 3. Modify the edexServiceList \uf0c1 Most likely if you are running a distributed EDEX setup, you are only processing a subset of data. You can change your edexServiceList to only run the processes you need. You will need to update the /etc/init.d/edexServiceList file. For example replace the services with the associated right column based on the data you're processing: export SERVICES=('') Data Processing: edexServiceList radar ingestRadar satellite ingestGoesR model ingestGrids, ingestGrib 4. Configure your LDM \uf0c1 You'll want to modify your pqact.conf file to store only the data you want processed. There are example files in /awips2/ldm/etc that you can copy over to the main pqact.conf file. 
For example, if you want to process only GOES-R data, you can do the following: cd /awips2/ldm/etc mv pqact.conf pqact.conf.orig cp pqact.goesr pqact.conf You will also want to edit the pqact.conf file on your Main EDEX and comment out any entries you're processing on this EDEX server. 5. Start EDEX \uf0c1 edex start This will start LDM, Qpid and the specified EDEX Ingest JVMs (and not start PostgreSQL, httpd-pypies, or the EDEX Request JVM) 6. Monitor Services \uf0c1 Watch the edex JVM log with the command edex log Confirm that EDEX connects to PostgreSQL! With the above edex log , ensure that the log progresses past this point : Spring-enabled Plugins: ----------------------- acars-common, acars-common-dataaccess, acarssounding-common, activetable-common, activetable-ingest, airep-common, airep-common-dataaccess, airmet-common, atcf-common, atcf-ingest, aww-common... JAXB context for PersistencePathKeySet inited in: 5ms INFO 20:21:09,134 5584 [EDEXMain] Reflections: Reflections took 436 ms to scan 258 urls, producing 31 keys and 3637 values Found 499 db classes in 720 ms If the log stops at the Found db classes... line, that means EDEX is not connecting to the remote PostgreSQL instance - double-check DB_ADDR in /awips2/edex/bin/setup.env You can manually check remote PostgreSQL connectivity on any EDEX Ingest server from the command line: su - awips psql -U awips -h <remote IP address> -p 5432 metadata Where the default password is awips and is defined in files in /awips2/edex/conf/db/hibernateConfig/ Additional Notes \uf0c1 Be mindful of which IP addresses and hostnames are used in /awips2/edex/bin/setup.env and /awips2/database/data/pg_hba.conf , and that they are resolvable from the command line. Consult or edit /etc/hosts as needed. 
You can install multiple awips2-ingest servers, each decoding a different dataset or feed, all pointing to the same Database/Request server ( DB_ADDR and PYPIES_SERVER in /awips2/edex/bin/setup.env ): Every EDEX Ingest IP address must be allowed in both iptables and pg_hba.conf as shown above .","title":"Distributed EDEX"},{"location":"edex/distributed-computing/#distributed-edex","text":"AWIPS makes use of service-oriented architecture to request, process, and serve real-time meteorological data. While originally developed for use on internal NWS forecast office networks, where operational installations of AWIPS could consist of a dozen servers or more, the early NSF Unidata releases were stripped of operations-specific configurations and plugins, and released as a standalone server. This worked, since (at the time) a single EDEX instance with an attached SSD could handle most of NOAAport. However, with GOES-R(16) coming online in 2017, and more gridded forecast models being created at finer temporal and spatial resolutions, there is now a need to distribute the data decoding across multiple machines to handle this firehose of data.","title":"Distributed EDEX"},{"location":"edex/distributed-computing/#nsf-unidatas-current-edex-server","text":"Currently, we use a distributed architecture comprised of three machines: 1 main EDEX machine and 2 ancillary EDEX machines. The main EDEX machine decodes and processes the majority of the data, while serving and storing all of the data. Our two ancillary machines -- one for radar data and one for satellite data -- each decode and process a subset of the data and send it back to the main EDEX for storage and requesting. 
The main EDEX is an instance of our Database and Request Server; more information on our ancillary EDEX machines is provided below.","title":"NSF Unidata's Current EDEX Server"},{"location":"edex/distributed-computing/#example-installation","text":"This walkthrough will install different EDEX components on two machines in the XSEDE Jetstream Cloud; the first is used to store and serve data, while the second is used to ingest and decode data.","title":"Example Installation"},{"location":"edex/distributed-computing/#databaserequest-server","text":"For this example, this server will be referred to by the IP address 10.0.0.9 .","title":"Database/Request Server"},{"location":"edex/distributed-computing/#1-install","text":"wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 awips_install.sh sudo ./awips_install.sh --database","title":"1. Install"},{"location":"edex/distributed-computing/#2-iptables-config","text":"It is required that ports 5432 and 5672 be open for the specific IP addresses of outside EDEX ingest servers. It is not recommended that you leave port 5432 open to all connections (since the default awips database password is known, and is not meant as a security measure). Further, it is recommended that you change the default postgres awips user password (which then requires a reconfiguration of every remote EDEX ingest server in order to connect to this database/request server). 
vi /etc/sysconfig/iptables *filter :INPUT DROP [0:0] :FORWARD DROP [0:0] :OUTPUT ACCEPT [0:0] :EXTERNAL - [0:0] :EDEX - [0:0] -A INPUT -i lo -j ACCEPT -A INPUT -p icmp --icmp-type any -j ACCEPT -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT -A INPUT -s 10.0.0.7 -j EDEX -A INPUT -j EXTERNAL -A EXTERNAL -j REJECT -A EDEX -m state --state NEW -p tcp --dport 5432 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 5672 -j ACCEPT -A EDEX -j REJECT COMMIT Note the line -A INPUT -s 10.0.0.7 -j EDEX as well as the following -A EDEX ... rules for ports 5432 (PostgreSQL) and 5672 (Qpid). The two ports left open to all connections (9581,9582) in addition to default port 22 are for outside CAVE client connections","title":"2. IPtables Config"},{"location":"edex/distributed-computing/#3-database-config","text":"In the file /awips2/database/data/pg_hba.conf you define remote connections for all postgres tables with <IP address>/32 , after the block of IPv4 local connections and generic <IP address/24> for hostnossl: vi /awips2/database/data/pg_hba.conf # \"local\" is for Unix domain socket connections only local all all trust hostssl all all 162.0.0.0/8 cert clientcert=1 hostssl all all 127.0.0.1/32 cert clientcert=1 hostssl all all 10.0.0.7/32 cert clientcert=1 hostnossl postgres all 10.0.0.0/24 md5 hostnossl fxatext all 10.0.0.0/24 md5 hostnossl metadata all 10.0.0.0/24 md5 # IPv6 local connections: hostssl all all ::1/128 cert clientcert=1 hostnossl all all ::1/128 md5","title":"3. Database Config"},{"location":"edex/distributed-computing/#4-start-edex","text":"edex start database This will start PostgreSQL, httpd-pypies, Qpid, and the EDEX Request JVM (and will not start the LDM or the EDEX Ingest and IngestGrib JVMs)","title":"4. 
Start EDEX"},{"location":"edex/distributed-computing/#5-monitor-services","text":"The command edex will show which services are running, and for a Database/Request server, will not include the LDM, EDEXingest, or EDEXgrib: edex [edex status] postgres :: running :: pid 571 pypies :: running :: pid 639 qpid :: running :: pid 674 EDEXingest :: not running EDEXgrib :: not running EDEXrequest :: running :: pid 987 1029 23792 Since this Database/Request server is not running the main edexIngest JVM, we won't see anything from edex log ; instead, watch the Request Server with the command edex log request Confirm that EDEX Request connects to PostgreSQL! With the above edex log request , ensure that the log progresses past this point : Spring-enabled Plugins: ----------------------- acars-common, acars-common-dataaccess, acarssounding-common, activetable-common, activetable-request, airep-common, airep-common-dataaccess, airmet-common, atcf-common, atcf-request, auth-request, awipstools-request, aww-common... JAXB context for PersistencePathKeySet inited in: 5ms INFO 20:21:09,134 5584 [EDEXMain] Reflections: Reflections took 436 ms to scan 258 urls, producing 31 keys and 3637 values Found 499 db classes in 720 ms If the log stops at the Found db classes... line, that means EDEX is not connecting to PostgreSQL - double-check DB_ADDR in /awips2/edex/bin/setup.env","title":"5. Monitor Services"},{"location":"edex/distributed-computing/#ancillary-edex-server-ingestdecode-edex-server","text":"For this example, this server will be referred to by the IP address 10.0.0.7 . The Main EDEX server will be referred to by the IP address 10.0.0.9 .","title":"Ancillary EDEX Server (Ingest/Decode EDEX Server)"},{"location":"edex/distributed-computing/#1-install_1","text":"wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 awips_install.sh sudo ./awips_install.sh --ingest","title":"1. 
Install"},{"location":"edex/distributed-computing/#2-edex-config","text":"vi /awips2/edex/bin/setup.env Here you should redefine DB_ADDR and PYPIES_SERVER to point to the Main or Database/Request server (10.0.0.9) and the EXT_ADDR to point to the current Ingest server (10.0.0.7) export EXT_ADDR=10.0.0.7 # postgres connection export DB_ADDR=10.0.0.9 export DB_PORT=5432 # pypies hdf5 connection export PYPIES_SERVER=http://10.0.0.9:9582 # qpid connection export BROKER_ADDR=${EXT_ADDR} Notice that EXT_ADDR and BROKER_ADDR (qpid) should remain defined as the localhost IP address (10.0.0.7)","title":"2. EDEX Config"},{"location":"edex/distributed-computing/#3-modify-the-edexservicelist","text":"Most likely if you are running a distributed EDEX setup, you are only processing a subset of data. You can change your edexServiceList to only run the processes you need. You will need to update the /etc/init.d/edexServiceList file. For example replace the services with the associated right column based on the data you're processing: export SERVICES=('') Data Processing: edexServiceList radar ingestRadar satellite ingestGoesR model ingestGrids, ingestGrib","title":"3. Modify the edexServiceList"},{"location":"edex/distributed-computing/#4-configure-your-ldm","text":"You'll want to modify your pqact.conf file to store only the data you want processed. There are example files in /awips2/ldm/etc that you can copy over to the main pqact.conf file. For example if you are wanting to process goesr data only, you can do the following steps: cd /awips2/ldm/etc mv pqact.conf pqact.conf.orig cp pqact.goesr pqact.conf You will also want to edit the pqact.conf file on your Main EDEX and comment out any entries you're processing on this EDEX server.","title":"4. Configure your LDM"},{"location":"edex/distributed-computing/#5-start-edex","text":"edex start This will start LDM, Qpid and the specified EDEX Ingest JVMs (and not start PostgreSQL, httpd-pypies, or the EDEX Request JVM)","title":"5. 
Start EDEX"},{"location":"edex/distributed-computing/#4-monitor-services","text":"Watch the edex JVM log with the command edex log Confirm that EDEX connects to PostgreSQL! With the above edex log , ensure that the log progresses past this point : Spring-enabled Plugins: ----------------------- acars-common, acars-common-dataaccess, acarssounding-common, activetable-common, activetable-ingest, airep-common, airep-common-dataaccess, airmet-common, atcf-common, atcf-ingest, aww-common... JAXB context for PersistencePathKeySet inited in: 5ms INFO 20:21:09,134 5584 [EDEXMain] Reflections: Reflections took 436 ms to scan 258 urls, producing 31 keys and 3637 values Found 499 db classes in 720 ms If the log stops at the Found db classes... line, that means EDEX is not connecting to the remote PostgreSQL instance - double-check DB_ADDR in /awips2/edex/bin/setup.env You can manually check remote PostgreSQL connectivity on any EDEX Ingest server from the command line: su - awips psql -U awips -h <remote IP address> -p 5432 metadata Where the default password is awips and is defined in files in /awips2/edex/conf/db/hibernateConfig/","title":"6. Monitor Services"},{"location":"edex/distributed-computing/#additional-notes","text":"Be mindful of which IP addresses and hostnames are used in /awips2/edex/bin/setup.env and /awips2/database/data/pg_hba.conf , and that they are resolvable from the command line. Consult or edit /etc/hosts as needed. 
You can install multiple awips2-ingest servers, each decoding a different dataset or feed, all pointing to the same Database/Request server ( DB_ADDR and PYPIES_SERVER in /awips2/edex/bin/setup.env ): Every EDEX Ingest IP address must be allowed in both iptables and pg_hba.conf as shown above .","title":"Additional Notes"},{"location":"edex/edex-ingest-docker-container/","text":"Docker EDEX \uf0c1 Project home: https://github.com/Unidata/edex-docker EDEX can be run inside a docker container, which allows you to process data into an AWIPS system without needing to access or alter the machine's native CentOS installation and configuration. The EDEX Docker Image is built on CentOS 7 and contains the latest NSF Unidata AWIPS release (18.1.1). This container is an ingest-only install, meaning there is no database or request server . This example requires that a Database/Request server be configured for you to access remotely. See the Distributed EDEX document for more. Download and Install Docker \uf0c1 Download and install Docker and Docker Compose: Docker for CentOS 7 Linux Docker for Mac Docker for Windows docker-compose (it should be bundled with Docker by default on Mac and Windows) Run the EDEX Ingest Container \uf0c1 Clone the source repository: git clone https://github.com/Unidata/edex-docker.git cd edex-docker Run the container with docker-compose: docker-compose up -d edex-ingest Confirm the container is running: docker ps -a Enter the container: docker exec -it edex-ingest bash Stop the container: docker-compose stop Delete the container (keep the image): docker-compose rm -f Run commands inside the container, such as: docker exec edex-ingest edex which should return something like: [edex status] qpid :: running :: pid 22474 EDEXingest :: running :: pid 21860 31513 EDEXgrib :: not running ldmadmin :: running :: pid 22483 edex (status|start|stop|setup|log|purge|qpid|users) To update to the latest version and restart: docker pull unidata/edex-ingest:latest 
docker-compose stop docker-compose up -d edex-ingest Configuration and Customization \uf0c1 The file docker-compose.yml defines files to mount to the container and which ports to open: edex-ingest: image: unidata/edex-ingest:latest container_name: edex-ingest volumes: - ./etc/ldmd.conf:/awips2/ldm/etc/ldmd.conf - ./etc/pqact.conf:/awips2/ldm/etc/pqact.conf - ./bin/setup.env:/awips2/edex/bin/setup.env - ./bin/runedex.sh:/awips2/edex/bin/runedex.sh ports: - \"388:388\" ulimits: nofile: soft: 1024 hard: 1024 Mounted Files \uf0c1 etc/ldmd.conf \uf0c1 Defines which data feeds to receive. By default there is only one active request line ( REQUEST IDS|DDPLUS \".*\" idd.unidata.ucar.edu ) to not overwhelm small EDEX containers ingesting large volumes of radar and gridded data files. Any updates to the file etc/ldmd.conf will be read the next time you restart the container. etc/pqact.conf \uf0c1 Defines how products are processed and where they are written to on the filesystem. This is the full set of pattern actions used in NSF Unidata AWIPS, and generally you do not need to edit this file. Instead control which data feeds are requested in ldmd.conf (above). bin/setup.env \uf0c1 Defines the remote EDEX Database/Request server: ### EDEX localization related variables ### export AW_SITE_IDENTIFIER=OAX export EXT_ADDR=js-157-198.jetstream-cloud.org Note : EXT_ADDR must be set to an allowed EDEX Database/Request Server. In this example we are using a JetStream Cloud instance, which controls our edex-ingest access with IPtables, SSL certificates, and PostgreSQL pg_hba.conf rules. This server will not allow outside connections; you must change this to point to an appropriate server. 
bin/runedex.sh \uf0c1 The default script run when the container is started; it acts as a sort-of service manager for EDEX and the LDM (see ENTRYPOINT [\"/awips2/edex/bin/runedex.sh\"] in Dockerfile.edex ), essentially: /awips2/qpid/bin/qpid-wrapper & /awips2/edex/bin/start.sh -noConsole ingest & ldmadmin mkqueue ldmadmin start","title":"Docker EDEX"},{"location":"edex/edex-ingest-docker-container/#docker-edex","text":"Project home: https://github.com/Unidata/edex-docker EDEX can be run inside a docker container, which allows you to process data into an AWIPS system without needing to access or alter the machine's native CentOS installation and configuration. The EDEX Docker Image is built on CentOS 7 and contains the latest NSF Unidata AWIPS release (18.1.1). This container is an ingest-only install, meaning there is no database or request server . This example requires that a Database/Request server be configured for you to access remotely. See the Distributed EDEX document for more.","title":"Docker EDEX"},{"location":"edex/edex-ingest-docker-container/#download-and-install-docker","text":"Download and install Docker and Docker Compose: Docker for CentOS 7 Linux Docker for Mac Docker for Windows docker-compose (it should be bundled with Docker by default on Mac and Windows)","title":"Download and Install Docker"},{"location":"edex/edex-ingest-docker-container/#run-the-edex-ingest-container","text":"Clone the source repository: git clone https://github.com/Unidata/edex-docker.git cd edex-docker Run the container with docker-compose: docker-compose up -d edex-ingest Confirm the container is running: docker ps -a Enter the container: docker exec -it edex-ingest bash Stop the container: docker-compose stop Delete the container (keep the image): docker-compose rm -f Run commands inside the container, such as: docker exec edex-ingest edex which should return something like: [edex status] qpid :: running :: pid 22474 EDEXingest :: running :: pid 21860 31513 EDEXgrib :: 
not running ldmadmin :: running :: pid 22483 edex (status|start|stop|setup|log|purge|qpid|users) To update to the latest version and restart: docker pull unidata/edex-ingest:latest docker-compose stop docker-compose up -d edex-ingest","title":"Run the EDEX Ingest Container"},{"location":"edex/edex-ingest-docker-container/#configuration-and-customization","text":"The file docker-compose.yml defines files to mount to the container and which ports to open: edex-ingest: image: unidata/edex-ingest:latest container_name: edex-ingest volumes: - ./etc/ldmd.conf:/awips2/ldm/etc/ldmd.conf - ./etc/pqact.conf:/awips2/ldm/etc/pqact.conf - ./bin/setup.env:/awips2/edex/bin/setup.env - ./bin/runedex.sh:/awips2/edex/bin/runedex.sh ports: - \"388:388\" ulimits: nofile: soft: 1024 hard: 1024","title":"Configuration and Customization"},{"location":"edex/edex-ingest-docker-container/#mounted-files","text":"","title":"Mounted Files"},{"location":"edex/edex-ingest-docker-container/#etcldmdconf","text":"Defines which data feeds to receive. By default there is only one active request line ( REQUEST IDS|DDPLUS \".*\" idd.unidata.ucar.edu ) to not overwhelm small EDEX containers ingesting large volumes of radar and gridded data files. Any updates to the file etc/ldmd.conf will be read the next time you restart the container.","title":"etc/ldmd.conf"},{"location":"edex/edex-ingest-docker-container/#etcpqactconf","text":"Defines how products are processed and where they are written to on the filesystem. This is the full set of pattern actions used in NSF Unidata AWIPS, and generally you do not need to edit this file. 
Instead control which data feeds are requested in ldmd.conf (above).","title":"etc/pqact.conf"},{"location":"edex/edex-ingest-docker-container/#binsetupenv","text":"Defines the remote EDEX Database/Request server: ### EDEX localization related variables ### export AW_SITE_IDENTIFIER=OAX export EXT_ADDR=js-157-198.jetstream-cloud.org Note : EXT_ADDR must be set to an allowed EDEX Database/Request Server. In this example we are using a JetStream Cloud instance, which controls our edex-ingest access with IPtables, SSL certificates, and PostgreSQL pg_hba.conf rules. This server will not allow outside connections; you must change this to point to an appropriate server.","title":"bin/setup.env"},{"location":"edex/edex-ingest-docker-container/#binrunedexsh","text":"The default script run when the container is started; it acts as a sort-of service manager for EDEX and the LDM (see ENTRYPOINT [\"/awips2/edex/bin/runedex.sh\"] in Dockerfile.edex ), essentially: /awips2/qpid/bin/qpid-wrapper & /awips2/edex/bin/start.sh -noConsole ingest & ldmadmin mkqueue ldmadmin start","title":"bin/runedex.sh"},{"location":"edex/edex-users/","text":"Monitor Users \uf0c1 To see a list of clients connecting to your EDEX server, use the edex users [YYYYMMDD] command, where [YYYYMMDD] is the optional date string. edex users -- EDEX Users 20160826 -- user@101.253.20.225 user@192.168.1.67 awips@0.0.0.0 awips@sdsmt.edu ... Logging Daily EDEX Users \uf0c1 To get a running log of who has accessed EDEX, you can create a short script. 
The example below is a script that runs once daily at 20 minutes after 00 UTC, appending each day's edex users list to a logfile /home/awips/edex-users.log : vi ~/edexUsers.sh #!/bin/bash /awips2/edex/bin/edex users >> /home/awips/edex-users.log crontab -e 20 0 * * * /home/awips/edexUsers.sh 1>> /dev/null 2>&1","title":"Monitor Users"},{"location":"edex/edex-users/#monitor-users","text":"To see a list of clients connecting to your EDEX server, use the edex users [YYYYMMDD] command, where [YYYYMMDD] is the optional date string. edex users -- EDEX Users 20160826 -- user@101.253.20.225 user@192.168.1.67 awips@0.0.0.0 awips@sdsmt.edu ...","title":"Monitor Users"},{"location":"edex/edex-users/#logging-daily-edex-users","text":"To get a running log of who has accessed EDEX, you can create a short script. The example below is a script that runs once daily at 20 minutes after 00 UTC, appending each day's edex users list to a logfile /home/awips/edex-users.log : vi ~/edexUsers.sh #!/bin/bash /awips2/edex/bin/edex users >> /home/awips/edex-users.log crontab -e 20 0 * * * /home/awips/edexUsers.sh 1>> /dev/null 2>&1","title":"Logging Daily EDEX Users"},{"location":"edex/ldm/","text":"LDM Feeds \uf0c1 Default LDM Feeds for EDEX \uf0c1 Data feeds are defined by the ldmd.conf file in /awips2/ldm/etc/ldmd.conf . The default feeds that come \"turned on\" with our EDEX are the following: REQUEST FNEXRAD \".*\" idd.unidata.ucar.edu # MRMS - NSF Unidata feed via NCEP REQUEST NEXRAD3 \".*\" idd.unidata.ucar.edu # Radar Level3 REQUEST HDS \"^SDUS6.*\" idd.unidata.ucar.edu # Radar Level3 - specific files REQUEST WMO \".*\" idd.unidata.ucar.edu # WMO Feedtype includes HDS|IDS|DDPLUS REQUEST UNIWISC|NIMAGE \".*\" idd.unidata.ucar.edu # AREA/GINI and GOES Products REQUEST DIFAX \"GLM\" idd.unidata.ucar.edu # GOES GLM Gridded Product (Texas Tech-Eric Bruning) REQUEST NOTHER \"^TI[A-W]... 
KNES\" idd.unidata.ucar.edu # VIIRS and GOES CMI via SBN REQUEST NOTHER \"^IXT[WXY]01\" idd.unidata.ucar.edu #Special SBN GOES Derived products-different WMO (COD, CPS, CTP) REQUEST NGRID \".*\" idd.unidata.ucar.edu REQUEST CONDUIT \"nam\" idd.unidata.ucar.edu # NAM12 REQUEST CONDUIT \"pgrb2\" idd.unidata.ucar.edu # GFS Optional LDM Feeds \uf0c1 Some additional feeds are included but commented out using '#'. To activate the feed, simply remove the #, save the file, and restart the LDM . FNMOC and CMC models \uf0c1 REQUEST FNMOC \".*\" idd.unidata.ucar.edu REQUEST CMC \".*\" idd.unidata.ucar.edu Lightning (restricted to educational use with rebroadcasting restricted) \uf0c1 REQUEST LIGHTNING \".*\" striker2.atmos.albany.edu REQUEST LIGHTNING \".*\" idd.unidata.ucar.edu FSL/GSD Experimental HRRR (Sub-hourly) \uf0c1 REQUEST FSL2 \"^GRIB2.FSL.HRRR\" hrrr.unidata.ucar.edu Restart the LDM \uf0c1 Use the following commands to restart the LDM: sudo service edex_ldm restart ldmadmin restart Monitor Incoming Data Feeds \uf0c1 To watch incoming data in real-time: notifyme -vl - To watch for a specific product and feed and time (360 sec = 6 min): notifyme -vl - -h localhost -f NEXRAD3 -p DHR -o 360 To watch the same on a remote queue: notifyme -vl - -h idd.unidata.ucar.edu -f NEXRAD3 -p DHR -o 360 LDM Logging \uf0c1 To open a real-time readout of LDM logging you can use the edex command. To exit, press CTRL+C . edex log ldm [edex] EDEX Log Viewer :: Viewing /awips2/ldm/logs/ldmd.log. Press CTRL+C to exit Aug 26 15:05:10 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_MXUPHL01-21387192.grib2\": 406227 20160826210510.477 NGRID 21387192 YZCG86 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/MXUPHL01/5000-2000 m HGHT Aug 26 15:05:11 edextest edexBridge[5812] NOTE: Sent 2 messages (0 at the end of the queue, 2 normally). 
Aug 26 15:05:11 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_CICEP-21387200.grib2\": 369464 20160826210511.484 NGRID 21387200 YMCG98 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/CICEP/0 - NONE Aug 26 15:05:12 edextest edexBridge[5812] NOTE: Sent 9 messages (0 at the end of the queue, 9 normally). Aug 26 15:05:12 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_LTNG-21387205.grib2\": 482800 20160826210512.254 NGRID 21387205 YZCG98 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/LTNG/0 - EATM Aug 26 15:05:13 edextest edexBridge[5812] NOTE: Sent 1 messages (0 at the end of the queue, 1 normally).","title":"LDM Feeds"},{"location":"edex/ldm/#ldm-feeds","text":"","title":"LDM Feeds"},{"location":"edex/ldm/#default-ldm-feeds-for-edex","text":"Data feeds are defined by the ldmd.conf file in /awips2/ldm/etc/ldmd.conf . The default feeds that come \"turned on\" with our EDEX are the following: REQUEST FNEXRAD \".*\" idd.unidata.ucar.edu # MRMS - NSF Unidata feed via NCEP REQUEST NEXRAD3 \".*\" idd.unidata.ucar.edu # Radar Level3 REQUEST HDS \"^SDUS6.*\" idd.unidata.ucar.edu # Radar Level3 - specific files REQUEST WMO \".*\" idd.unidata.ucar.edu # WMO Feedtype includes HDS|IDS|DDPLUS REQUEST UNIWISC|NIMAGE \".*\" idd.unidata.ucar.edu # AREA/GINI and GOES Products REQUEST DIFAX \"GLM\" idd.unidata.ucar.edu # GOES GLM Gridded Product (Texas Tech-Eric Bruning) REQUEST NOTHER \"^TI[A-W]... KNES\" idd.unidata.ucar.edu # VIIRS and GOES CMI via SBN REQUEST NOTHER \"^IXT[WXY]01\" idd.unidata.ucar.edu #Special SBN GOES Derived products-different WMO (COD, CPS, CTP) REQUEST NGRID \".*\" idd.unidata.ucar.edu REQUEST CONDUIT \"nam\" idd.unidata.ucar.edu # NAM12 REQUEST CONDUIT \"pgrb2\" idd.unidata.ucar.edu # GFS","title":"Default LDM Feeds for EDEX"},{"location":"edex/ldm/#optional-ldm-feeds","text":"Some additional feeds are included but commented out using '#'. 
To activate the feed, simply remove the #, save the file, and restart the LDM .","title":"Optional LDM Feeds"},{"location":"edex/ldm/#fnmoc-and-cmc-models","text":"REQUEST FNMOC \".*\" idd.unidata.ucar.edu REQUEST CMC \".*\" idd.unidata.ucar.edu","title":"FNMOC and CMC models"},{"location":"edex/ldm/#lightning-restricted-to-educational-use-with-rebroadcasting-restricted","text":"REQUEST LIGHTNING \".*\" striker2.atmos.albany.edu REQUEST LIGHTNING \".*\" idd.unidata.ucar.edu","title":"Lightning (restricted to educational use with rebroadcasting restricted)"},{"location":"edex/ldm/#fslgsd-experimental-hrrr-sub-hourly","text":"REQUEST FSL2 \"^GRIB2.FSL.HRRR\" hrrr.unidata.ucar.edu","title":"FSL/GSD Experimental HRRR (Sub-hourly)"},{"location":"edex/ldm/#restart-the-ldm","text":"Use the following commands to restart the LDM: sudo service edex_ldm restart ldmadmin restart","title":"Restart the LDM"},{"location":"edex/ldm/#monitor-incoming-data-feeds","text":"To watch incoming data in real-time: notifyme -vl - To watch for a specific product and feed and time (360 sec = 6 min): notifyme -vl - -h localhost -f NEXRAD3 -p DHR -o 360 To watch the same on a remote queue: notifyme -vl - -h idd.unidata.ucar.edu -f NEXRAD3 -p DHR -o 360","title":"Monitor Incoming Data Feeds"},{"location":"edex/ldm/#ldm-logging","text":"To open a real-time readout of LDM logging you can use the edex command. To exit, press CTRL+C . edex log ldm [edex] EDEX Log Viewer :: Viewing /awips2/ldm/logs/ldmd.log. Press CTRL+C to exit Aug 26 15:05:10 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_MXUPHL01-21387192.grib2\": 406227 20160826210510.477 NGRID 21387192 YZCG86 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/MXUPHL01/5000-2000 m HGHT Aug 26 15:05:11 edextest edexBridge[5812] NOTE: Sent 2 messages (0 at the end of the queue, 2 normally). 
Aug 26 15:05:11 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_CICEP-21387200.grib2\": 369464 20160826210511.484 NGRID 21387200 YMCG98 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/CICEP/0 - NONE Aug 26 15:05:12 edextest edexBridge[5812] NOTE: Sent 9 messages (0 at the end of the queue, 9 normally). Aug 26 15:05:12 edextest pqact[5811] NOTE: Filed in \"/awips2/data_store/grid/HRRR/HRRR_CONUS_2p5km_201608262000_F006_LTNG-21387205.grib2\": 482800 20160826210512.254 NGRID 21387205 YZCG98 KWBY 262000 !grib2/ncep/HRRR/#255/201608262000F006/LTNG/0 - EATM Aug 26 15:05:13 edextest edexBridge[5812] NOTE: Sent 1 messages (0 at the end of the queue, 1 normally).","title":"LDM Logging"},{"location":"edex/new-grid-grib1-old/","text":"Ingest a New Grid Using .grib Files \uf0c1 Unrecognized grids can be decoded by EDEX simply by dropping *.grib or *.grib2 files into /awips2/data_store/ingest/ To add support for a new grid, two edits must be made: Geospatial projection must be defined in a grid navigation file Grid name , center , subcenter , and process ID must be defined in a model definition file If the parameters in the grib file haven't been previously specified, another change may be needed as well: Center , subcenter , discipline , category , and possibly parameter ID information may need to be defined in a table Ingest an Unsupported Grid \uf0c1 Download Test Data \uf0c1 Download an example grib1 file and rename to a *.grib extension, then copy to the manual ingest point /awips2/data_store/ingest/ wget https://downloads.unidata.ucar.edu/awips2/current/files/14102318_nmm_d01.GrbF00600 -O wrf.grib cp wrf.grib /awips2/data_store/ingest/ Remember that the data distribution file ( /awips2/edex/data/utility/common_static/base/distribution/grib.xml ) will match filenames which have the *.grib* extension. 
Check Grib Logs \uf0c1 Confirm that the grib file decodes in the grib log file: edex log grib INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.1200 (sec) Latency: 21.8080 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.1180 (sec) Latency: 21.8140 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.4230 (sec) Latency: 21.8360 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.2240 (sec) Latency: 21.9140 (sec) ... Check HDF5 Data \uf0c1 Check that the hdf5 data directory exists for our unnamed grid ls -latr /awips2/edex/data/hdf5/grid/GribModel:7:0:89 Though the grib file has been decoded, it has been given a generic name with its center, subcenter, and process IDs (7, 0, 89, respectively). Determine Grid Projection \uf0c1 When the grid was ingested a record was added to the grid_coverage table with its navigation information: psql metadata metadata=# select nx,ny,dx,dy,majoraxis,minoraxis,la1,lo1,lov,latin1,latin2 from gridcoverage where id=(select distinct(location_id) from grid_info where datasetid='GribModel:7:0:89'); nx | ny | dx | dy | majoraxis | minoraxis | la1 | lo1 | lov | latin1 | latin2 -----+-----+------------------+------------------+-----------+-----------+------------------+-------------------+-------------------+------------------+------------------ 201 | 155 | 4.29699993133545 | 4.29699993133545 | 6378160 | 6356775 | 42.2830009460449 | -72.3610000610352 | -67.0770034790039 | 45.3680000305176 | 45.3680000305176 (1 row) Compare with the projection info returned by wgrib on the original file (look at the bolded sections below and make sure they match up with the corresponding entries returned from the database above): wgrib -V wrf.grib rec 799:27785754:date 2014102318 ALBDO kpds5=84 kpds6=1 kpds7=0 levels=(0,0) grid=255 sfc 6hr fcst: bitmap: 736 undef ALBDO=Albedo 
[%] timerange 0 P1 6 P2 0 TimeU 1 nx 201 ny 155 GDS grid 3 num_in_ave 0 missing 0 center 7 subcenter 0 process 89 Table 2 scan: WE:SN winds(grid) Lambert Conf: Lat1 42.283000 Lon1 -72.361000 Lov -67.077000 Latin1 45.368000 Latin2 45.368000 LatSP 0.000000 LonSP 0.000000 North Pole (201 x 155) Dx 4.297000 Dy 4.297000 scan 64 mode 8 min/max data 5 21.9 num bits 8 BDS_Ref 50 DecScale 1 BinScale 0 Notice that our grib file has a Lambert Conformal projection. We will need these values for the next step. There is a tolerance of +/- 0.1 degrees to keep in mind when defining your coverage area. Create Grid Projection File \uf0c1 Projection Types \uf0c1 Grid projection files are stored in /awips2/edex/data/utility/common_static/base/grib/grids/ and there are four grid coverage types available: lambertConformalGridCoverage (example: RUCIcing.xml ) <lambertConformalGridCoverage> <name>305</name> <description>Regional - CONUS (Lambert Conformal)</description> <la1>16.322</la1> <lo1>-125.955</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>151</nx> <ny>113</ny> <dx>40.63525</dx> <dy>40.63525</dy> <spacingUnit>km</spacingUnit> <minorAxis>6356775.0</minorAxis> <majorAxis>6378160.0</majorAxis> <lov>-95.0</lov> <latin1>25.0</latin1> <latin2>25.0</latin2> </lambertConformalGridCoverage> polarStereoGridCoverage (example seaice_south1_grid.xml ) <polarStereoGridCoverage> <name>405</name> <description>Sea Ice south 690X710 13km grid</description> <la1>-36.866</la1> <lo1>139.806</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>690</nx> <ny>710</ny> <dx>12.7</dx> <dy>12.7</dy> <spacingUnit>km</spacingUnit> <minorAxis>6371229.0</minorAxis> <majorAxis>6371229.0</majorAxis> <lov>100.0</lov> </polarStereoGridCoverage> latLonGridCoverage (example UkmetHR-SHemisphere.xml ) <latLonGridCoverage> <name>864162002</name> <description>UKMet HiRes combined - Southern Hemisphere Longitude range 71.25E - 70.416E </description> <la1>-89.721</la1> <lo1>71.25</lo1> 
<firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>864</nx> <ny>162</ny> <dx>0.833</dx> <dy>0.556</dy> <spacingUnit>degree</spacingUnit> <la2>-0.278</la2> <lo2>70.416</lo2> </latLonGridCoverage> mercatorGridCoverage (example gridNBM_PR.xml ) <mercatorGridCoverage> <name>NBM_PR</name> <description> National Blend Grid over Puerto Rico - (1.25 km)</description> <la1>16.9775</la1> <lo1>-68.0278</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>339</nx> <ny>225</ny> <dx>1.25</dx> <dy>1.25</dy> <la2>19.3750032477232</la2> <lo2>-63.984399999999994</lo2> <latin>20</latin> <spacingUnit>km</spacingUnit> <minorAxis>6371200</minorAxis> <majorAxis>6371200</majorAxis> </mercatorGridCoverage> Creating a New Projection File \uf0c1 Copy an existing xml file with the same grid projection type (in this case lambertConformalGridCoverage ) to a new file wrf.xml : cd /awips2/edex/data/utility/common_static/base/grib/grids/ cp RUCIcing.xml wrf.xml And edit the new wrf.xml to define the projection values using the output from wgrib or the database (example provided): vi wrf.xml <lambertConformalGridCoverage> <name>201155</name> <description>Regional - CONUS (Lambert Conformal)</description> <la1>42.2830009460449</la1> <lo1>-72.3610000610352</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>201</nx> <ny>155</ny> <dx>4.29699993133545</dx> <dy>4.29699993133545</dy> <spacingUnit>km</spacingUnit> <minorAxis>6356775.0</minorAxis> <majorAxis>6378160.0</majorAxis> <lov>-67.0770034790039</lov> <latin1>45.3680000305176</latin1> <latin2>45.3680000305176</latin2> </lambertConformalGridCoverage> Notice the <name>201155</name> tag was created by using the number of grid points (201 and 155). This name can be anything as long as it is unique and will be used to match against in the model definition. Create Model Definition \uf0c1 Model definition XML files are found in /awips2/edex/data/utility/common_static/base/grib/models/ . 
Since our grib file has a center ID of 7 (NCEP) we will edit the gribModels_NCEP-7.xml file. cd /awips2/edex/data/utility/common_static/base/grib/models/ vi gribModels_NCEP-7.xml In <gribModelSet> add an entry: <model> <name>WRF</name> <center>7</center> <subcenter>0</subcenter> <grid>201155</grid> <process> <id>89</id> </process> </model> Save the file and restart EDEX for the changes to take effect: sudo service edex_camel restart ingestGrib Now copy the wrf.grib file again to /awips2/data_store/ingest/ . If everything is correct we will not see any persistence errors since the grid is now named WRF and not GribModel:7:0:89 . cp wrf.grib /awips2/data_store/ingest/ edex log grib After you have confirmed that the grid was ingested with the given name, you can edit the D2D product menus to display the new grid . Adding a Table \uf0c1 If you ingest a piece of data and the parameter appears as unknown in the metadata database, ensure that the correct parameter tables are in place for the center/subcenter. The tables are located in /awips2/edex/data/utility/common_static/base/grib/tables/ . They are then broken into subdirectories using the following structure: /[Center]/[Subcenter]/4.2.[Discipline].[Category].table . The center and subcenter have been identified previously here , as 7 and 0, respectively. So, the corresponding directory is: /awips2/edex/data/utility/common_static/base/grib/tables/7/0/ To find the discipline of a grib product, you need the process and table values from the grib file.
These are output with the wgrib -V command: wgrib -V wrf.grib rec 799:27785754:date 2014102318 ALBDO kpds5=84 kpds6=1 kpds7=0 levels=(0,0) grid=255 sfc 6hr fcst: bitmap: 736 undef ALBDO=Albedo [%] timerange 0 P1 6 P2 0 TimeU 1 nx 201 ny 155 GDS grid 3 num_in_ave 0 missing 0 center 7 subcenter 0 process 89 Table 2 scan: WE:SN winds(grid) Lambert Conf: Lat1 42.283000 Lon1 -72.361000 Lov -67.077000 Latin1 45.368000 Latin2 45.368000 LatSP 0.000000 LonSP 0.000000 North Pole (201 x 155) Dx 4.297000 Dy 4.297000 scan 64 mode 8 min/max data 5 21.9 num bits 8 BDS_Ref 50 DecScale 1 BinScale 0 For our example, the process is 89 and table is 2 . Next, take a look in: /awips2/edex/data/utility/common_static/base/grid/grib1ParameterConvTable.xml And find the entry that has grib1 data with TableVersion 2 and Value 89: <grib1Parameter> <center>7</center> <grib1TableVersion>2</grib1TableVersion> <grib1Value>89</grib1Value> <grib2discipline>0</grib2discipline> <grib2category>3</grib2category> <grib2Value>10</grib2Value> </grib1Parameter> Here, we can see the discipline and category values (the [Discipline] and [Category] placeholders above) are 0 and 3, respectively.
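Putting those pieces together, the table path can be composed mechanically from the four values just identified; a small shell sketch (directory layout as documented above):

```shell
# Sketch: build the grib1 parameter table path from the center, subcenter,
# discipline, and category values identified above.
center=7 subcenter=0 discipline=0 category=3
echo /awips2/edex/data/utility/common_static/base/grib/tables/$center/$subcenter/4.2.$discipline.$category.table
```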
So, the table needed for our example file is: /awips2/edex/data/utility/common_static/base/grib/tables/7/0/4.2.0.3.table","title":"Ingest a New Grid Using .grib Files"},{"location":"edex/new-grid-grib1-old/#ingest-a-new-grid-using-grib-files","text":"Unrecognized grids can be decoded by EDEX simply by dropping *.grib or *.grib2 files into /awips2/data_store/ingest/ To add support for a new grid, two edits must be made: Geospatial projection must be defined in a grid navigation file Grid name , center , subcenter , and process ID must be defined in a model definition file If the parameters in the grib file haven't been previously specified, another change may be needed as well: Center , subcenter , discipline , category , and possibly parameter ID information may need to be defined in a table","title":"Ingest a New Grid Using .grib Files"},{"location":"edex/new-grid-grib1-old/#ingest-an-unsupported-grid","text":"","title":"Ingest an Unsupported Grid"},{"location":"edex/new-grid-grib1-old/#download-test-data","text":"Download an example grib1 file and rename to a *.grib extension, then copy to the manual ingest point /awips2/data_store/ingest/ wget https://downloads.unidata.ucar.edu/awips2/current/files/14102318_nmm_d01.GrbF00600 -O wrf.grib cp wrf.grib /awips2/data_store/ingest/ Remember that the data distribution file ( /awips2/edex/data/utility/common_static/base/distribution/grib.xml ) will match filenames which have the *.grib* extension.","title":"Download Test Data"},{"location":"edex/new-grid-grib1-old/#check-grib-logs","text":"Confirm that the grib file decodes in the grib log file: edex log grib INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.1200 (sec) Latency: 21.8080 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.1180 (sec) Latency: 21.8140 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.4230 (sec) Latency: 
21.8360 (sec) INFO [Ingest.GribDecode] /awips2/data_store/ingest/grib/20141026/14/wrf.grib processed in: 0.2240 (sec) Latency: 21.9140 (sec) ...","title":"Check Grib Logs"},{"location":"edex/new-grid-grib1-old/#check-hdf5-data","text":"Check that the hdf5 data directory exists for our unnamed grid ls -latr /awips2/edex/data/hdf5/grid/GribModel:7:0:89 Though the grib file has been decoded, it has been given a generic name with its center, subcenter, and process IDs (7, 0, 89, respectively).","title":"Check HDF5 Data"},{"location":"edex/new-grid-grib1-old/#determine-grid-projection","text":"When the grid was ingested a record was added to the grid_coverage table with its navigation information: psql metadata metadata=# select nx,ny,dx,dy,majoraxis,minoraxis,la1,lo1,lov,latin1,latin2 from gridcoverage where id=(select distinct(location_id) from grid_info where datasetid='GribModel:7:0:89'); nx | ny | dx | dy | majoraxis | minoraxis | la1 | lo1 | lov | latin1 | latin2 -----+-----+------------------+------------------+-----------+-----------+------------------+-------------------+-------------------+------------------+------------------ 201 | 155 | 4.29699993133545 | 4.29699993133545 | 6378160 | 6356775 | 42.2830009460449 | -72.3610000610352 | -67.0770034790039 | 45.3680000305176 | 45.3680000305176 (1 row) Compare with the projection info returned by wgrib on the original file (look at the bolded sections below and make sure they match up with the corresponding entries returned from the database above): wgrib -V wrf.grib rec 799:27785754:date 2014102318 ALBDO kpds5=84 kpds6=1 kpds7=0 levels=(0,0) grid=255 sfc 6hr fcst: bitmap: 736 undef ALBDO=Albedo [%] timerange 0 P1 6 P2 0 TimeU 1 nx 201 ny 155 GDS grid 3 num_in_ave 0 missing 0 center 7 subcenter 0 process 89 Table 2 scan: WE:SN winds(grid) Lambert Conf: Lat1 42.283000 Lon1 -72.361000 Lov -67.077000 Latin1 45.368000 Latin2 45.368000 LatSP 0.000000 LonSP 0.000000 North Pole (201 x 155) Dx 4.297000 Dy 4.297000 scan 64 
mode 8 min/max data 5 21.9 num bits 8 BDS_Ref 50 DecScale 1 BinScale 0 Notice that our grib file has a Lambert Conformal projection. We will need these values for the next step. There is a tolerance of +/- 0.1 degrees to keep in mind when defining your coverage area.","title":"Determine Grid Projection"},{"location":"edex/new-grid-grib1-old/#create-grid-projection-file","text":"","title":"Create Grid Projection File"},{"location":"edex/new-grid-grib1-old/#projection-types","text":"Grid projection files are stored in /awips2/edex/data/utility/common_static/base/grib/grids/ and there are four grid coverage types available: lambertConformalGridCoverage (example: RUCIcing.xml ) <lambertConformalGridCoverage> <name>305</name> <description>Regional - CONUS (Lambert Conformal)</description> <la1>16.322</la1> <lo1>-125.955</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>151</nx> <ny>113</ny> <dx>40.63525</dx> <dy>40.63525</dy> <spacingUnit>km</spacingUnit> <minorAxis>6356775.0</minorAxis> <majorAxis>6378160.0</majorAxis> <lov>-95.0</lov> <latin1>25.0</latin1> <latin2>25.0</latin2> </lambertConformalGridCoverage> polarStereoGridCoverage (example seaice_south1_grid.xml ) <polarStereoGridCoverage> <name>405</name> <description>Sea Ice south 690X710 13km grid</description> <la1>-36.866</la1> <lo1>139.806</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>690</nx> <ny>710</ny> <dx>12.7</dx> <dy>12.7</dy> <spacingUnit>km</spacingUnit> <minorAxis>6371229.0</minorAxis> <majorAxis>6371229.0</majorAxis> <lov>100.0</lov> </polarStereoGridCoverage> latLonGridCoverage (example UkmetHR-SHemisphere.xml ) <latLonGridCoverage> <name>864162002</name> <description>UKMet HiRes combined - Southern Hemisphere Longitude range 71.25E - 70.416E </description> <la1>-89.721</la1> <lo1>71.25</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>864</nx> <ny>162</ny> <dx>0.833</dx> <dy>0.556</dy> <spacingUnit>degree</spacingUnit> <la2>-0.278</la2> 
<lo2>70.416</lo2> </latLonGridCoverage> mercatorGridCoverage (example gridNBM_PR.xml ) <mercatorGridCoverage> <name>NBM_PR</name> <description> National Blend Grid over Puerto Rico - (1.25 km)</description> <la1>16.9775</la1> <lo1>-68.0278</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>339</nx> <ny>225</ny> <dx>1.25</dx> <dy>1.25</dy> <la2>19.3750032477232</la2> <lo2>-63.984399999999994</lo2> <latin>20</latin> <spacingUnit>km</spacingUnit> <minorAxis>6371200</minorAxis> <majorAxis>6371200</majorAxis> </mercatorGridCoverage>","title":"Projection Types"},{"location":"edex/new-grid-grib1-old/#creating-a-new-projection-file","text":"Copy an existing xml file with the same grid projection type (in this case lambertConformalGridCoverage ) to a new file wrf.xml : cd /awips2/edex/data/utility/common_static/base/grib/grids/ cp RUCIcing.xml wrf.xml And edit the new wrf.xml to define the projection values using the output from wgrib or the database (example provided): vi wrf.xml <lambertConformalGridCoverage> <name>201155</name> <description>Regional - CONUS (Lambert Conformal)</description> <la1>42.2830009460449</la1> <lo1>-72.3610000610352</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>201</nx> <ny>155</ny> <dx>4.29699993133545</dx> <dy>4.29699993133545</dy> <spacingUnit>km</spacingUnit> <minorAxis>6356775.0</minorAxis> <majorAxis>6378160.0</majorAxis> <lov>-67.0770034790039</lov> <latin1>45.3680000305176</latin1> <latin2>45.3680000305176</latin2> </lambertConformalGridCoverage> Notice the <name>201155</name> tag was created by using the number of grid points (201 and 155). This name can be anything as long as it is unique and will be used to match against in the model definition.","title":"Creating a New Projection File"},{"location":"edex/new-grid-grib1-old/#create-model-definition","text":"Model definition XML files are found in /awips2/edex/data/utility/common_static/base/grib/models/ . 
Since our grib file has a center ID of 7 (NCEP) we will edit the gribModels_NCEP-7.xml file. cd /awips2/edex/data/utility/common_static/base/grib/models/ vi gribModels_NCEP-7.xml In <gribModelSet> add an entry: <model> <name>WRF</name> <center>7</center> <subcenter>0</subcenter> <grid>201155</grid> <process> <id>89</id> </process> </model> Save the file and restart EDEX for the changes to take effect: sudo service edex_camel restart ingestGrib Now copy the wrf.grib file again to /awips2/data_store/ingest/ . If everything is correct we will not see any persistence errors since the grid is now named WRF and not GribModel:7:0:89 . cp wrf.grib /awips2/data_store/ingest/ edex log grib After you have confirmed that the grid was ingested with the given name, you can edit the D2D product menus to display the new grid .","title":"Create Model Definition"},{"location":"edex/new-grid-grib1-old/#adding-a-table","text":"If you ingest a piece of data and the parameter appears as unknown in the metadata database, ensure that the correct parameter tables are in place for the center/subcenter. The tables are located in /awips2/edex/data/utility/common_static/base/grib/tables/ . They are then broken into subdirectories using the following structure: /[Center]/[Subcenter]/4.2.[Discipline].[Category].table . The center and subcenter have been identified previously here , as 7 and 0, respectively. So, the corresponding directory is: /awips2/edex/data/utility/common_static/base/grib/tables/7/0/ To find the discipline of a grib product, you need the process and table values from the grib file.
These are output with the wgrib -V command: wgrib -V wrf.grib rec 799:27785754:date 2014102318 ALBDO kpds5=84 kpds6=1 kpds7=0 levels=(0,0) grid=255 sfc 6hr fcst: bitmap: 736 undef ALBDO=Albedo [%] timerange 0 P1 6 P2 0 TimeU 1 nx 201 ny 155 GDS grid 3 num_in_ave 0 missing 0 center 7 subcenter 0 process 89 Table 2 scan: WE:SN winds(grid) Lambert Conf: Lat1 42.283000 Lon1 -72.361000 Lov -67.077000 Latin1 45.368000 Latin2 45.368000 LatSP 0.000000 LonSP 0.000000 North Pole (201 x 155) Dx 4.297000 Dy 4.297000 scan 64 mode 8 min/max data 5 21.9 num bits 8 BDS_Ref 50 DecScale 1 BinScale 0 For our example, the process is 89 and table is 2 . Next, take a look in: /awips2/edex/data/utility/common_static/base/grid/grib1ParameterConvTable.xml And find the entry that has grib1 data with TableVersion 2 and Value 89: <grib1Parameter> <center>7</center> <grib1TableVersion>2</grib1TableVersion> <grib1Value>89</grib1Value> <grib2discipline>0</grib2discipline> <grib2category>3</grib2category> <grib2Value>10</grib2Value> </grib1Parameter> Here, we can see the discipline and category values (the [Discipline] and [Category] placeholders above) are 0 and 3, respectively. So, the table needed for our example file is: /awips2/edex/data/utility/common_static/base/grib/tables/7/0/4.2.0.3.table","title":"Adding a Table"},{"location":"edex/new-grid/","text":"Ingest a New Grid \uf0c1 Unrecognized grids can be decoded by EDEX simply by dropping *.grib or *.grib2 files into /awips2/edex/data/manual/ This page explains how to ingest .grib2 products. To view information about .grib products, please see this page .
To add support for a new grid, two edits must be made: Geospatial projection must be defined in a grid navigation file Grid name , center , subcenter , and process ID must be defined in a model definition file If the parameters in the grib file haven't been previously specified, another change may be needed as well: Center , subcenter , discipline , category , and possibly parameter ID information may need to be defined in a table Ingest an Unsupported Grid \uf0c1 Download Test Data \uf0c1 Download an example grib2 file (make sure the extension is .grib2 or the EDEX distribution file may not recognize it), and then copy to the manual ingest point /awips2/edex/data/manual/ : wget https://downloads.unidata.ucar.edu/awips2/current/files/CPTI_00.50_20180502-000144.grib2 -O cpti.grib2 cp cpti.grib2 /awips2/edex/data/manual/ Check Grib Logs \uf0c1 Confirm that the grib file decodes in the grib log file. Look in the current log file (/awips2/edex/logs/edex-ingestGrib-[YYYYMMDD].log) for the following: INFO [Ingest.GribDecode] /awips2/edex/data/manual/cpti.grib2 processed in: 0.1200 (sec) Latency: 21.8080 (sec) INFO [Ingest.GribDecode] /awips2/edex/data/manual/cpti.grib2 processed in: 0.1180 (sec) Latency: 21.8140 (sec) INFO [Ingest.GribDecode] /awips2/edex/data/manual/cpti.grib2 processed in: 0.4230 (sec) Latency: 21.8360 (sec) INFO [Ingest.GribDecode] /awips2/edex/data/manual/cpti.grib2 processed in: 0.2240 (sec) Latency: 21.9140 (sec) ... This step will fail for our example because the parameter is not yet defined. 
The error will look like: INFO 2020-07-20 20:34:17,710 2565 [GribPersist-1] GridDao: EDEX - Discarding record due to missing or unknown parameter mapping: /grid/2018-05-02_00:01:44.0_(0)/GribModel:161:0:97/null/null/403/Missing/FH/500.0/-999999.0 INFO 2020-07-20 20:34:17,710 2566 [GribPersist-1] Ingest: EDEX: Ingest - grib2:: /awips2/edex/data/manual/CPTI_00.50_20180502-000144.grib2 processed in: 2.3550 (sec) INFO 2020-07-20 20:34:17,827 2567 [Ingest.GribDecode-6] grib: EDEX - No parameter information for center[161], subcenter[0], tableName[4.2.209.3], parameter value[61] In order to successfully ingest the example file, you must define the appropriate table . Check HDF5 Data \uf0c1 Check that the hdf5 data directory exists for our unnamed grid ls -latr /awips2/edex/data/hdf5/grid/GribModel:161:0:97 Though the grib file has been decoded, it has been given a generic name with its center, subcenter, and process IDs (161, 0, 97, respectively). Determine Grid Projection \uf0c1 When a grid is ingested a record is added to the grid_coverage table with its navigation information: psql metadata metadata=> select nx,ny,dx,dy,majoraxis,minoraxis,la1,lo1,lov,latin1,latin2,spacingunit,lad,la2,latin,lo2,firstgridpointcorner from gridcoverage where id=(select distinct(location_id) from grid_info where datasetid='GribModel:161:0:97'); nx | ny | dx | dy | majoraxis | minoraxis | la1 | lo1 | lov | latin1 | latin2 | spacingunit | lad | la2 | latin | lo2 | firstgridpointcorner -----+-----+-------+-------+-----------+-----------+-----------+-----+-----+--------+--------+-------------+-----+-----+-------+-----+---------------------- 600 | 640 | 0.005 | 0.005 | | | 40.799999 | 261 | | | | degree | | | | | UpperLeft (1 row) Compare with the projection info returned by wgrib2 on the original file (look at the bolded sections below and make sure they match up with the corresponding entries returned from the database above): wgrib2 -grid -nxny cpti.grib2 1:0:grid_template=0:winds(N/S): 
lat-lon grid:(600 x 640) units 1e-06 input WE:NS output WE:SN res 48 lat 40.799999 to 37.599999 by 0.005000 lon 260.999999 to 263.999999 by 0.005000 #points=384000:(600 x 640) ... Notice that our grib2 file has a Lat/lon Grid projection that starts in the UpperLeft corner (as defined by input West to East, North to South). Where: nx is 600 ny is 640 dx is 0.005 dy is 0.005 la1 is 40.799999 lo1 is 261 We will need these values for the next step. There is a tolerance of +/- 0.1 degrees to keep in mind when defining your coverage area (la1 and lo1). Create Grid Projection File \uf0c1 Projection Types \uf0c1 You may not have information for every tag listed; for example, it's not required for the latLonGridCoverage to have spacingUnit, la2, lo2. Grid projection files are stored in /awips2/edex/data/utility/common_static/base/grib/grids/ and there are four grid coverage types available: lambertConformalGridCoverage (example: RUCIcing.xml ) <lambertConformalGridCoverage> <name>305</name> <description>Regional - CONUS (Lambert Conformal)</description> <la1>16.322</la1> <lo1>-125.955</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>151</nx> <ny>113</ny> <dx>40.63525</dx> <dy>40.63525</dy> <spacingUnit>km</spacingUnit> <minorAxis>6356775.0</minorAxis> <majorAxis>6378160.0</majorAxis> <lov>-95.0</lov> <latin1>25.0</latin1> <latin2>25.0</latin2> </lambertConformalGridCoverage> polarStereoGridCoverage (example seaice_south1_grid.xml ) <polarStereoGridCoverage> <name>405</name> <description>Sea Ice south 690X710 13km grid</description> <la1>-36.866</la1> <lo1>139.806</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>690</nx> <ny>710</ny> <dx>12.7</dx> <dy>12.7</dy> <spacingUnit>km</spacingUnit> <minorAxis>6371229.0</minorAxis> <majorAxis>6371229.0</majorAxis> <lov>100.0</lov> </polarStereoGridCoverage> latLonGridCoverage (example UkmetHR-SHemisphere.xml ) <latLonGridCoverage> <name>864162002</name> <description>UKMet HiRes combined - Southern
Hemisphere Longitude range 71.25E - 70.416E </description> <la1>-89.721</la1> <lo1>71.25</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>864</nx> <ny>162</ny> <dx>0.833</dx> <dy>0.556</dy> <spacingUnit>degree</spacingUnit> <la2>-0.278</la2> <lo2>70.416</lo2> </latLonGridCoverage> mercatorGridCoverage (example gridNBM_PR.xml ) <mercatorGridCoverage> <name>NBM_PR</name> <description> National Blend Grid over Puerto Rico - (1.25 km)</description> <la1>16.9775</la1> <lo1>-68.0278</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>339</nx> <ny>225</ny> <dx>1.25</dx> <dy>1.25</dy> <la2>19.3750032477232</la2> <lo2>-63.984399999999994</lo2> <latin>20</latin> <spacingUnit>km</spacingUnit> <minorAxis>6371200</minorAxis> <majorAxis>6371200</majorAxis> </mercatorGridCoverage> Creating a New Projection File \uf0c1 Copy an existing xml file with the same grid projection type (in this case latLonGridCoverage ) to a new file cpti.xml : cd /awips2/edex/data/utility/common_static/base/grib/grids/ cp MRMS-1km-CONUS.xml cpti.xml And edit the new cpti.xml to define the projection values using the output from wgrib2 or the database (example provided): vi cpti.xml <latLonGridCoverage> <name>600640</name> <description>Small domain for CPTI products</description> <la1>40.799999</la1> <lo1>261</lo1> <firstGridPointCorner>UpperLeft</firstGridPointCorner> <nx>600</nx> <ny>640</ny> <dx>0.005</dx> <dy>0.005</dy> <spacingUnit>degree</spacingUnit> </latLonGridCoverage> Notice the <name>600640</name> tag was created by using the number of grid points (600 and 640). This name can be anything as long as it is unique and will be used to match against in the model definition. Create Model Definition \uf0c1 Model definition XML files are found in /awips2/edex/data/utility/common_static/base/grib/models/ . Since our grib2 file has a center of 161 (NOAA) we will edit the gribModels_NOAA-161.xml file. 
cd /awips2/edex/data/utility/common_static/base/grib/models/ vi gribModels_NOAA-161.xml In <gribModelSet> , under the <!-- Subcenter 0 --> comment, add an entry: <model> <name>CPTI</name> <center>161</center> <subcenter>0</subcenter> <grid>600640</grid> <process> <id>97</id> </process> </model> Save the model file and restart EDEX: sudo service edex_camel restart ingestGrib Now if you drop cpti.grib2 into the manual endpoint again, it should ingest without any persistence errors. Adding a Table \uf0c1 If you ingest a piece of data and the parameter appears as unknown in the metadata database, ensure that the correct parameter tables are in place for the center/subcenter. The tables are located in /awips2/edex/data/utility/common_static/base/grib/tables/ . They are then broken into subdirectories using the following structure: /[Center]/[Subcenter]/4.2.[Discipline].[Category].table . There are also default parameters that all grib products may access located in this directory: /awips2/edex/data/utility/common_static/base/grib/tables/-1/-1/ If you are using a grib2 file, then you can use either the log output or the -center , -subcenter , and -full_name options on wgrib2 to get the center, subcenter, discipline, category, and parameter information: The table would be found in the directory structure using this file's center and subcenter. Finding Center \uf0c1 The center can be found by either: Running the following command: wgrib2 -center cpti.grib2 1:0:center=US NOAA Office of Oceanic and Atmospheric Research ... And then looking up the corresponding value for \"US NOAA Office of Oceanic and Atmospheric Research\" at this website , where it happens to be 161 . OR: Running the following command: wgrib2 -varX cpti.grib2 1:0:var209_255_1_ 161 _3_61 ... Where the 4th argument after \"var\" is the center id, in this case 161 . Finding Subcenter \uf0c1 To get the subcenter, simply run: wgrib2 -subcenter cpti.grib2 1:0:subcenter= 0 ... The subcenter of this file is 0 .
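The -varX inventory line shown above can also be split apart with standard tools. As a sketch (the line is copied from the example output, with the emphasis spacing removed):

```shell
# Sketch: the center id is the 4th underscore-separated field of the
# wgrib2 -varX inventory line shown above.
line=1:0:var209_255_1_161_3_61
center=$(echo $line | cut -d_ -f4)
echo center=$center
```

This only restates the field position the text describes; consult the wgrib2 documentation for the full meaning of each field.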
Based on the center and subcenter, the corresponding directory is: /awips2/edex/data/utility/common_static/base/grib/tables/161/0/ Finding Discipline and Category \uf0c1 To find the exact table, we need the discipline and category: wgrib2 -full_name cpti.grib2 1:0:var 209 _ 3 _ 61 .500_m_above_mean_sea_level ... In this case the discipline is 209 and category is 3 , so the corresponding table is: 4.2.209.3.table Corresponding Table \uf0c1 The full path to the corresponding table would be: /awips2/edex/data/utility/common_static/base/grib/tables/161/0/4.2.209.3.table The parameter ID was also listed in that output as 61 . Make sure that specific parameter information is defined in the table: ... 56:56:Reflectivity at -20C:dBZ:ReflectivityM20C 57:57:Reflectivity At Lowest Altitude (RALA):dBZ:ReflectivityAtLowestAltitude 58:58:Merged Reflectivity At Lowest Altitude (RALA):dBZ:MergedReflectivityAtLowestAltitude 59:59:CPTI 80mph+:%:CPTI80mph 61:61:CPTI 110mph+:%:CPTI110mph You will have to restart ingestGrib for the changes to take place: sudo service edex_camel restart ingestGrib Now you can try re-ingesting the grib2 file . Creating Menu Items \uf0c1 After you have confirmed that the grid was ingested with the given name, you can edit the D2D product menus to display the new grid . Implementing a Production Process \uf0c1 The ingest method mentioned earlier is meant to be used only during testing and development of ingesting new grid data. This is because the manual endpoint is very inefficient during ingest. It creates copies of the data file and uses more resources than you'd want in a production process. Once you are satisfied with the data ingest and display in CAVE, then we highly recommend you implement a production process for ingest that does not involve the manual directory ( /awips2/edex/data/manual/ ). The recommended way is to make use of a Python script we distribute with AWIPS (EDEX).
This script is called notifyAWIPS2-unidata.py and located in the /awips2/ldm/dev/ directory. If you are already using a script to manually gather the data, then adding an additional call like the one below should ingest your data to EDEX in an efficient manner: /awips2/ldm/dev/notifyAWIPS2-unidata.py [path-to-new-grib-file] Make sure the Python script is executable. To do this you may have to run chmod +x /awips2/ldm/dev/notifyAWIPS2-unidata.py Using wgrib2 \uf0c1 Mentioned in this page are a few command parameters for wgrib2 such as -grid , -varX , -center , -subcenter , and -full_name . A complete list of all available parameters can be found here . Troubleshooting Grib Ingest \uf0c1 Make sure the latitude and longitude entries in your coverage specification file match those of your ingested raw grib file. There is a tolerance of +/- 0.1 degree to keep in mind when defining your coverage area. If some of the information is unknown, using a grib utility application such as wgrib and wgrib2 can be useful in determining the information that must be added to correctly process a new grib file. If you are experiencing Segmentation fault errors when running wgrib2, it may be best to install the latest version using the following command: yum install wgrib2 And then you may either need to change where wgrib2 points to, or use /bin/wgrib2 to run the recently downloaded version.","title":"Ingest a New Grid"},{"location":"edex/new-grid/#ingest-a-new-grid","text":"Unrecognized grids can be decoded by EDEX simply by dropping *.grib or *.grib2 files into /awips2/edex/data/manual/ This page explains how to ingest .grib2 products. To view information about .grib products, please see this page .
To add support for a new grid, two edits must be made: Geospatial projection must be defined in a grid navigation file Grid name , center , subcenter , and process ID must be defined in a model definition file If the parameters in the grib file haven't been previously specified, another change may be needed as well: Center , subcenter , discipline , category , and possibly parameter ID information may need to be defined in a table","title":"Ingest a New Grid"},{"location":"edex/new-grid/#ingest-an-unsupported-grid","text":"","title":"Ingest an Unsupported Grid"},{"location":"edex/new-grid/#download-test-data","text":"Download an example grib2 file (make sure the extension is .grib2 or the EDEX distribution file may not recognize it), and then copy to the manual ingest point /awips2/edex/data/manual/ : wget https://downloads.unidata.ucar.edu/awips2/current/files/CPTI_00.50_20180502-000144.grib2 -O cpti.grib2 cp cpti.grib2 /awips2/edex/data/manual/","title":"Download Test Data"},{"location":"edex/new-grid/#check-grib-logs","text":"Confirm that the grib file decodes in the grib log file. Look in the current log file (/awips2/edex/logs/edex-ingestGrib-[YYYYMMDD].log) for the following: INFO [Ingest.GribDecode] /awips2/edex/data/manual/cpti.grib2 processed in: 0.1200 (sec) Latency: 21.8080 (sec) INFO [Ingest.GribDecode] /awips2/edex/data/manual/cpti.grib2 processed in: 0.1180 (sec) Latency: 21.8140 (sec) INFO [Ingest.GribDecode] /awips2/edex/data/manual/cpti.grib2 processed in: 0.4230 (sec) Latency: 21.8360 (sec) INFO [Ingest.GribDecode] /awips2/edex/data/manual/cpti.grib2 processed in: 0.2240 (sec) Latency: 21.9140 (sec) ... This step will fail for our example because the parameter is not yet defined. 
The error will look like: INFO 2020-07-20 20:34:17,710 2565 [GribPersist-1] GridDao: EDEX - Discarding record due to missing or unknown parameter mapping: /grid/2018-05-02_00:01:44.0_(0)/GribModel:161:0:97/null/null/403/Missing/FH/500.0/-999999.0 INFO 2020-07-20 20:34:17,710 2566 [GribPersist-1] Ingest: EDEX: Ingest - grib2:: /awips2/edex/data/manual/CPTI_00.50_20180502-000144.grib2 processed in: 2.3550 (sec) INFO 2020-07-20 20:34:17,827 2567 [Ingest.GribDecode-6] grib: EDEX - No parameter information for center[161], subcenter[0], tableName[4.2.209.3], parameter value[61] In order to successfully ingest the example file, you must define the appropriate table .","title":"Check Grib Logs"},{"location":"edex/new-grid/#check-hdf5-data","text":"Check that the hdf5 data directory exists for our unnamed grid ls -latr /awips2/edex/data/hdf5/grid/GribModel:161:0:97 Though the grib file has been decoded, it has been given a generic name with its center, subcenter, and process IDs (161, 0, 97, respectively).","title":"Check HDF5 Data"},{"location":"edex/new-grid/#determine-grid-projection","text":"When a grid is ingested a record is added to the grid_coverage table with its navigation information: psql metadata metadata=> select nx,ny,dx,dy,majoraxis,minoraxis,la1,lo1,lov,latin1,latin2,spacingunit,lad,la2,latin,lo2,firstgridpointcorner from gridcoverage where id=(select distinct(location_id) from grid_info where datasetid='GribModel:161:0:97'); nx | ny | dx | dy | majoraxis | minoraxis | la1 | lo1 | lov | latin1 | latin2 | spacingunit | lad | la2 | latin | lo2 | firstgridpointcorner -----+-----+-------+-------+-----------+-----------+-----------+-----+-----+--------+--------+-------------+-----+-----+-------+-----+---------------------- 600 | 640 | 0.005 | 0.005 | | | 40.799999 | 261 | | | | degree | | | | | UpperLeft (1 row) Compare with the projection info returned by wgrib2 on the original file (look at the bolded sections below and make sure they match up with the 
corresponding entries returned from the database above): wgrib2 -grid -nxny cpti.grib2 1:0:grid_template=0:winds(N/S): lat-lon grid:(600 x 640) units 1e-06 input WE:NS output WE:SN res 48 lat 40.799999 to 37.599999 by 0.005000 lon 260.999999 to 263.999999 by 0.005000 #points=384000:(600 x 640) ... Notice that our grib2 file has a Lat/lon Grid projection that starts in the UpperLeft corner (as defined by input West to East, North to South). Where: nx is 600 ny is 640 dx is 0.005 dy is 0.005 la1 is 40.799999 lo1 is 261 We will need these values for the next step. There is a tolerance of +/- 0.1 degrees to keep in mind when defining your coverage (la1 and lo1) area.","title":"Determine Grid Projection"},{"location":"edex/new-grid/#create-grid-projection-file","text":"","title":"Create Grid Projection File"},{"location":"edex/new-grid/#projection-types","text":"You may not have information for every tag listed; for example, the latLonGridCoverage is not required to have spacingUnit, la2, or lo2. 
Grid projection files are stored in /awips2/edex/data/utility/common_static/base/grib/grids/ and there are four grid coverage types available: lambertConformalGridCoverage (example: RUCIcing.xml ) <lambertConformalGridCoverage> <name>305</name> <description>Regional - CONUS (Lambert Conformal)</description> <la1>16.322</la1> <lo1>-125.955</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>151</nx> <ny>113</ny> <dx>40.63525</dx> <dy>40.63525</dy> <spacingUnit>km</spacingUnit> <minorAxis>6356775.0</minorAxis> <majorAxis>6378160.0</majorAxis> <lov>-95.0</lov> <latin1>25.0</latin1> <latin2>25.0</latin2> </lambertConformalGridCoverage> polarStereoGridCoverage (example seaice_south1_grid.xml ) <polarStereoGridCoverage> <name>405</name> <description>Sea Ice south 690X710 13km grid</description> <la1>-36.866</la1> <lo1>139.806</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>690</nx> <ny>710</ny> <dx>12.7</dx> <dy>12.7</dy> <spacingUnit>km</spacingUnit> <minorAxis>6371229.0</minorAxis> <majorAxis>6371229.0</majorAxis> <lov>100.0</lov> </polarStereoGridCoverage> latLonGridCoverage (example UkmetHR-SHemisphere.xml ) <latLonGridCoverage> <name>864162002</name> <description>UKMet HiRes combined - Southern Hemisphere Longitude range 71.25E - 70.416E </description> <la1>-89.721</la1> <lo1>71.25</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>864</nx> <ny>162</ny> <dx>0.833</dx> <dy>0.556</dy> <spacingUnit>degree</spacingUnit> <la2>-0.278</la2> <lo2>70.416</lo2> </latLonGridCoverage> mercatorGridCoverage (example gridNBM_PR.xml ) <mercatorGridCoverage> <name>NBM_PR</name> <description> National Blend Grid over Puerto Rico - (1.25 km)</description> <la1>16.9775</la1> <lo1>-68.0278</lo1> <firstGridPointCorner>LowerLeft</firstGridPointCorner> <nx>339</nx> <ny>225</ny> <dx>1.25</dx> <dy>1.25</dy> <la2>19.3750032477232</la2> <lo2>-63.984399999999994</lo2> <latin>20</latin> <spacingUnit>km</spacingUnit> <minorAxis>6371200</minorAxis> 
<majorAxis>6371200</majorAxis> </mercatorGridCoverage>","title":"Projection Types"},{"location":"edex/new-grid/#creating-a-new-projection-file","text":"Copy an existing xml file with the same grid projection type (in this case latLonGridCoverage ) to a new file cpti.xml : cd /awips2/edex/data/utility/common_static/base/grib/grids/ cp MRMS-1km-CONUS.xml cpti.xml And edit the new cpti.xml to define the projection values using the output from wgrib2 or the database (example provided): vi cpti.xml <latLonGridCoverage> <name>600640</name> <description>Small domain for CPTI products</description> <la1>40.799999</la1> <lo1>261</lo1> <firstGridPointCorner>UpperLeft</firstGridPointCorner> <nx>600</nx> <ny>640</ny> <dx>0.005</dx> <dy>0.005</dy> <spacingUnit>degree</spacingUnit> </latLonGridCoverage> Notice the <name>600640</name> tag was created by using the number of grid points (600 and 640). This name can be anything as long as it is unique and will be used to match against in the model definition.","title":"Creating a New Projection File"},{"location":"edex/new-grid/#create-model-definition","text":"Model definition XML files are found in /awips2/edex/data/utility/common_static/base/grib/models/ . Since our grib2 file has a center of 161 (NOAA) we will edit the gribModels_NOAA-161.xml file. 
cd /awips2/edex/data/utility/common_static/base/grib/models/ vi gribModels_NOAA-161.xml In <gribModelSet> , under the <-- Subcenter 0 --> comment, add an entry: <model> <name>CPTI</name> <center>161</center> <subcenter>0</subcenter> <grid>600640</grid> <process> <id>97</id> </process> </model> Save the model file and restart edex: sudo service edex_camel restart ingestGrib Now if you drop cpti.grib2 into the manual endpoint again, it should ingest without any persistence errors.","title":"Create Model Definition"},{"location":"edex/new-grid/#adding-a-table","text":"If you ingest a piece of data and the parameter appears as unknown in the metadata database, ensure that the correct parameter tables are in place for the center/subcenter. The tables are located in /awips2/edex/data/utility/common_static/base/grib/tables/ . They are then broken into subdirectories using the following structure: /[Center]/[Subcenter]/4.2.[Discipine].[Category].table . There are also default parameters that all grib products may access located in this directory: /awips2/edex/data/utility/common_static/base/grib/tables/-1/-1/ If you are using a grib2 file, then you can use either the log output or the -center , -subcenter , and -full_name options on wgrib2 to get the center, subcenter, discipline, category, and parameter information: The table would be found in the directory structure using this file's center and subcenter.","title":"Adding a Table"},{"location":"edex/new-grid/#finding-center","text":"The center can be found by either: Running the following command: wgrib2 -center cpti.grib2 1:0:center=US NOAA Office of Oceanic and Atmospheric Research ... And then looking up the corresponding value for \"US NOAA Office of Oceanic and Atmospheric Research\" at this website , where it happens to be 161 . OR: Running the following command: wgrib2 -varX cpti.grib2 1:0:var209_255_1_ 161 _3_61 ... 
Where the 4th argument after \"var\" is the center id, in this case 161 .","title":"Finding Center"},{"location":"edex/new-grid/#finding-subcenter","text":"To get the subcenter, simply run: wgrib2 -subcenter cpti.grib2 1:0:subcenter= 0 ... The subcenter of this file is 0 . Based on the center and subcenter, the corresponding directory is: /awips2/edex/data/utility/common_static/base/grib/tables/161/0/","title":"Finding Subcenter"},{"location":"edex/new-grid/#finding-discipline-and-category","text":"To find the exact table, we need the discipline and category: wgrib2 -full_name cpti.grib2 1:0:var 209 _ 3 _ 61 .500_m_above_mean_sea_level ... In this case the discipline is 209 and category is 3 , so the corresponding table is: 4.2.209.3.table","title":"Finding Discipline and Category"},{"location":"edex/new-grid/#corresponding-table","text":"The full path to the corresponding table would be: /awips2/edex/data/utility/common_static/base/grib/tables/161/0/4.2.209.3.table The parameter ID was also listed in that output as 61 . Make sure that specific parameter information is defined in the table: ... 
56:56:Reflectivity at -20C:dBZ:ReflectivityM20C 57:57:Reflectivity At Lowest Altitude (RALA):dBZ:ReflectivityAtLowestAltitude 58:58:Merged Reflectivity At Lowest Altitude (RALA):dBZ:MergedReflectivityAtLowestAltitude 59:59:CPTI 80mph+:%:CPTI80mph 61:61:CPTI 110mph+:%:CPTI110mph You will have to restart ingestGrib for the changes to take place: sudo service edex_camel restart ingestGrib Now you can try re-ingesting the grib2 file .","title":"Corresponding Table"},{"location":"edex/new-grid/#creating-menu-items","text":"After you have confirmed that the grid was ingested with the given name, you can edit the D2D product menus to display the new grid .","title":"Creating Menu Items"},{"location":"edex/new-grid/#implementing-a-production-process","text":"The ingest method mentioned earlier is meant to be used only during testing and development of new grid data ingest. This is because the manual endpoint is very inefficient: it creates copies of the data file and uses more resources than you'd want in a production process. Once you are satisfied with the data ingest and display in CAVE, we highly recommend you implement a production process for ingest that does not involve the manual directory ( /awips2/edex/data/manual/ ). The recommended way is to make use of a Python script we distribute with AWIPS (EDEX). This script is called notifyAWIPS2-unidata.py and is located in the /awips2/ldm/dev/ directory. If you are already using a script to manually gather the data, then adding an additional call like the one below should ingest your data to EDEX in an efficient manner: /awips2/ldm/dev/notifyAWIPS2-unidata.py [path-to-new-grib-file] Make sure the python script is executable. 
To do this you may have to run chmod +x /awips2/ldm/dev/notifyAWIPS2-unidata.py","title":"Implementing a Production Process"},{"location":"edex/new-grid/#using-wgrib2","text":"Mentioned on this page are a few command parameters for wgrib2 such as -grid , -varX , -center , -subcenter , and -full_name . A complete list of all available parameters can be found here .","title":"Using wgrib2"},{"location":"edex/new-grid/#troubleshooting-grib-ingest","text":"Make sure the latitude and longitude entries in your coverage specification file match those of your ingested raw grib file. There is a tolerance of +/- 0.1 degree to keep in mind when defining your coverage area. If some of the information is unknown, using a grib utility application such as wgrib and wgrib2 can be useful in determining the information that must be added to correctly process a new grib file. If you are experiencing Segmentation fault errors when running wgrib2, it may be best to install the latest version using the following command: yum install wgrib2 And then you may either need to change where wgrib2 points to, or use /bin/wgrib2 to run the newly installed version.","title":"Troubleshooting Grib Ingest"},{"location":"edex/settings/","text":"EDEX Settings \uf0c1 Plugin Configuration \uf0c1 The directory /awips2/edex/conf/resources contains configuration text files for specific plugins, which allow for user-defined values which are read by AWIPS plugins on EDEX start: acarssounding.properties autobldsrv.properties com.raytheon.edex.plugin.gfe.properties com.raytheon.edex.text.properties com.raytheon.uf.common.registry.ebxml.properties com.raytheon.uf.edex.archive.cron.properties com.raytheon.uf.edex.database.properties com.raytheon.uf.edex.registry.ebxml.properties distribution.properties edex-localization-http.properties edex-requestsrv.properties edex-uengine.properties eventBus.properties ftp.properties goesr.properties grib.properties maintenance.properties proxy.properties purge.properties 
quartz.properties radar.properties stats.properties textdbsrv.properties warning.properties Look at purge.properties for example: # Master switch to enable and disable purging purge.enabled=true # Interval at which the purge job kicks off purge.cron=0+0/15+*+*+*+? # Interval at which the outgoing files are purged purge.outgoing.cron=0+30+*+*+*+? # Interval at which the logs are purged purge.logs.cron=0+30+0+*+*+? # Interval at which hdf5 orphans are purged purge.orphan.period=24h # Number of days older than the earliest known data to delete. purge.orphan.buffer=7 ... In grib.properties , goesr.properties , and radar.properties you can adjust the number of decoder threads for each plugin. cat radar.properties # Number threads for radar products ingested from the SBN radar-decode.sbn.threads=5 Ingest Modes \uf0c1 By default, EDEX starts three \"modes\": ingest , ingestGrib , and request (each as its own JVM). The file /awips2/edex/conf/modes/modes.xml contains all available mode definitions, including some specific modes for Hydro Server Applications, ebXML Registries, Data Delivery, and more. EDEX services are registered through spring, and by including or excluding specific spring files (usually by datatype plugin name) we can finely customize EDEX startup. In /awips2/edex/conf/modes/modes.xml there are a number of unused plugin decoders excluded because the data are not available outside of the SBN: ... 
<mode name=\"ingest\"> <exclude>.*request.*</exclude> <exclude>edex-security.xml</exclude> <exclude>ebxml.*\\.xml</exclude> <exclude>grib-decode.xml</exclude> <exclude>grid-staticdata-process.xml</exclude> <exclude>.*(dpa|taf|nctext).*</exclude> <exclude>webservices.xml</exclude> <exclude>.*datadelivery.*</exclude> <exclude>.*bandwidth.*</exclude> <exclude>.*sbn-simulator.*</exclude> <exclude>hydrodualpol-ingest.xml</exclude> <exclude>grid-metadata.xml</exclude> <exclude>.*ogc.*</exclude> <exclude>obs-ingest-metarshef.xml</exclude> <exclude>ffmp-ingest.xml</exclude> <exclude>scan-ingest.xml</exclude> <exclude>cwat-ingest.xml</exclude> <exclude>fog-ingest.xml</exclude> <exclude>vil-ingest.xml</exclude> <exclude>preciprate-ingest.xml</exclude> <exclude>qpf-ingest.xml</exclude> <exclude>fssobs-ingest.xml</exclude> <exclude>cpgsrv-spring.xml</exclude> </mode> ... In this example, request, ebXML, grib plugins, OGC and other plugins are excluded because they are included in their own mode/JVM. Note : TAF and NCTEXT plugins are disabled here due to performance issues. JVM Memory \uf0c1 The directory /awips2/edex/etc/ contains files which define the amount of memory used for each of the three EDEX JVMs (ingest, ingestGrib, request): ls -al /awips2/edex/etc/ -rw-r--r-- 1 awips fxalpha 1287 Jul 24 18:41 centralRegistry.sh -rw-r--r-- 1 awips fxalpha 1155 Jul 24 18:42 default.sh -rw-r--r-- 1 awips fxalpha 1956 Jul 24 18:41 ingestGrib.sh -rw-r--r-- 1 awips fxalpha 337 Jul 24 18:36 ingest.sh -rw-r--r-- 1 awips fxalpha 848 Jul 24 18:42 profiler.sh -rw-r--r-- 1 awips fxalpha 1188 Jul 24 18:41 registry.sh -rw-r--r-- 1 awips fxalpha 601 Jul 24 18:36 request.sh Each file contains the Xmx definition for maximum memory: ... export INIT_MEM=512 # in Meg export MAX_MEM=4096 # in Meg ... 
After editing these files, you must restart : service edex_camel restart .","title":"EDEX Settings"},{"location":"edex/settings/#edex-settings","text":"","title":"EDEX Settings"},{"location":"edex/settings/#plugin-configuration","text":"The directory /awips2/edex/conf/resources contains configuration text files for specific plugins, which allow for user-defined values which are read by AWIPS plugins on EDEX start: acarssounding.properties autobldsrv.properties com.raytheon.edex.plugin.gfe.properties com.raytheon.edex.text.properties com.raytheon.uf.common.registry.ebxml.properties com.raytheon.uf.edex.archive.cron.properties com.raytheon.uf.edex.database.properties com.raytheon.uf.edex.registry.ebxml.properties distribution.properties edex-localization-http.properties edex-requestsrv.properties edex-uengine.properties eventBus.properties ftp.properties goesr.properties grib.properties maintenance.properties proxy.properties purge.properties quartz.properties radar.properties stats.properties textdbsrv.properties warning.properties Look at purge.properties for example: # Master switch to enable and disable purging purge.enabled=true # Interval at which the purge job kicks off purge.cron=0+0/15+*+*+*+? # Interval at which the outgoing files are purged purge.outgoing.cron=0+30+*+*+*+? # Interval at which the logs are purged purge.logs.cron=0+30+0+*+*+? # Interval at which hdf5 orphans are purged purge.orphan.period=24h # Number of days older than the earliest known data to delete. purge.orphan.buffer=7 ... In grib.properties , goesr.properties , and radar.properties you can adjust the number of decoder threads for each plugin. cat radar.properties # Number threads for radar products ingested from the SBN radar-decode.sbn.threads=5","title":"Plugin Configuration"},{"location":"edex/settings/#ingest-modes","text":"By default, EDEX starts three \"modes\": ingest , ingestGrib , and request (each as its own JVM). 
The file /awips2/edex/conf/modes/modes.xml contains all available mode definitions, including some specific modes for Hydro Server Applications, ebXML Registries, Data Delivery, and more. EDEX services are registered through spring, and by including or excluding specific spring files (usually by datatype plugin name) we can finely customize EDEX startup. In /awips2/edex/conf/modes/modes.xml there are a number of unused plugin decoders excluded because the data are not available outside of the SBN: ... <mode name=\"ingest\"> <exclude>.*request.*</exclude> <exclude>edex-security.xml</exclude> <exclude>ebxml.*\\.xml</exclude> <exclude>grib-decode.xml</exclude> <exclude>grid-staticdata-process.xml</exclude> <exclude>.*(dpa|taf|nctext).*</exclude> <exclude>webservices.xml</exclude> <exclude>.*datadelivery.*</exclude> <exclude>.*bandwidth.*</exclude> <exclude>.*sbn-simulator.*</exclude> <exclude>hydrodualpol-ingest.xml</exclude> <exclude>grid-metadata.xml</exclude> <exclude>.*ogc.*</exclude> <exclude>obs-ingest-metarshef.xml</exclude> <exclude>ffmp-ingest.xml</exclude> <exclude>scan-ingest.xml</exclude> <exclude>cwat-ingest.xml</exclude> <exclude>fog-ingest.xml</exclude> <exclude>vil-ingest.xml</exclude> <exclude>preciprate-ingest.xml</exclude> <exclude>qpf-ingest.xml</exclude> <exclude>fssobs-ingest.xml</exclude> <exclude>cpgsrv-spring.xml</exclude> </mode> ... In this example, request, ebXML, grib plugins, OGC and other plugins are excluded because they are included in their own mode/JVM. 
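As a sketch of how these excludes are used in practice: re-enabling a decoder means deleting its exclude line from the mode and restarting EDEX, while keeping another plugin out of the ingest JVM means adding one (the plugin file name below is hypothetical):

```xml
<mode name="ingest">
    ...
    <!-- add a line like this to keep a plugin's spring file out of the ingest JVM -->
    <exclude>someplugin-ingest.xml</exclude>
</mode>
```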
Note : TAF and NCTEXT plugins are disabled here due to performance issues.","title":"Ingest Modes"},{"location":"edex/settings/#jvm-memory","text":"The directory /awips2/edex/etc/ contains files which define the amount of memory used for each of the three EDEX JVMs (ingest, ingestGrib, request): ls -al /awips2/edex/etc/ -rw-r--r-- 1 awips fxalpha 1287 Jul 24 18:41 centralRegistry.sh -rw-r--r-- 1 awips fxalpha 1155 Jul 24 18:42 default.sh -rw-r--r-- 1 awips fxalpha 1956 Jul 24 18:41 ingestGrib.sh -rw-r--r-- 1 awips fxalpha 337 Jul 24 18:36 ingest.sh -rw-r--r-- 1 awips fxalpha 848 Jul 24 18:42 profiler.sh -rw-r--r-- 1 awips fxalpha 1188 Jul 24 18:41 registry.sh -rw-r--r-- 1 awips fxalpha 601 Jul 24 18:36 request.sh Each file contains the Xmx definition for maximum memory: ... export INIT_MEM=512 # in Meg export MAX_MEM=4096 # in Meg ... After editing these files, you must restart : service edex_camel restart .","title":"JVM Memory"},{"location":"install/install-cave/","text":"Install CAVE \uf0c1 CAVE is the C ommon A WIPS V isualization E nvironment that is used for rendering and analyzing data for AWIPS. The installer may require administrator privileges to install and may require other system changes (environment variables, etc) as well. Latest CAVE Versions \uf0c1 Linux: 20.3.2-2 Windows: 20.3.2-2 Mac: 20.3.2-2 View release notes Version 20.* of CAVE is not compatible with Version 18.* EDEX and vice versa, Version 18.* of CAVE is not compatible with Version 20.* EDEX. Functionality/Reporting \uf0c1 If you come across issues/bugs/missing functionality, we also encourage you to report it using this short form . General Requirements \uf0c1 Regardless of what Operating System CAVE is running on, these general requirements are recommended in order for CAVE to perform optimally: Local machine Running CAVE via X11 forwarding or ssh tunneling is not supported. Using a VNC connection is the only remote option , and may result in worse performance than running locally. 
OpenGL 2.0 Compatible Devices At least 4GB RAM At least 2GB Disk Space for Caching NVIDIA Graphics Card Latest NVIDIA Driver While other graphics cards may work, NVIDIA Quadro graphics card is recommended for full visualization capability Linux \uf0c1 Latest Version: 20.3.2-2 System Requirements \uf0c1 64 bit CentOS/Red Hat 7 Bash shell environment While CentOS8 has reached End of Life as of Dec. 31, 2021, CentOS7 End of Life isn't until June 30, 2024. Upgrade Existing Installation \uf0c1 Whether you have CAVE currently installed or not, you can follow the Download and Installation Instructions below. The script will remove the old version of CAVE if needed, and install the latest version. If you would like to completely remove CAVE, please see the uninstall instructions further down this page . Download and Installation Instructions \uf0c1 Download the following installer: awips_install.sh In a terminal, go to the download directory Make the installer an executable by running: chmod 755 awips_install.sh Run the installer: sudo ./awips_install.sh --cave This will install the application in /awips2/cave/ and set the local cache to ~/caveData/ Run CAVE \uf0c1 To run CAVE either: Use the terminal and type the command cave Find the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE Additionally users can choose to run a virtual machine (VM) on Linux. Windows \uf0c1 Latest Version: 20.3.2-2 For Windows, we offer two installation options: a Direct Windows Installation , or a Linux Virtual Machine . The virtual machine option won't render RGB composites of satellite imagery. Method 1: Direct Windows Install \uf0c1 We offer CAVE installers at both the user-level (no administrative permissions needed), and the system-level (useful in a lab setting for instance). If you need the system-level installer, please skip to the System-Level Installation section , otherwise simply proceed with the next sections. 
Upgrade Existing Installation \uf0c1 If you do not currently have CAVE installed, please go directly to the Download and Installation Instructions . If you already have CAVE installed: First remove it by going to the Installed Apps settings dialog. You can access this window by: Start bar > Settings > Apps > Installed Apps. Typing \"remove\" in the start bar should bring you to this screen as well Find AWIPS CAVE, click on it, and click Uninstall. Once the uninstall is finished, simply download and install the latest version as instructed below. Download and Installation Instructions \uf0c1 Download and install: awips-cave.msi Run CAVE \uf0c1 To run CAVE, either: Double click on the CAVE icon on your desktop Type \"cave\" in the start bar and hit enter Find and run CAVE app in the file browser: C:\\Users\\%USER%\\AppData\\Roaming\\UCAR Unidata\\AWIPS CAVE\\CAVE.bat System-Level Installation \uf0c1 If you need a system-level installation of CAVE, please fill out this brief access form for the .msi, and then proceed with installation similar to that described above. Method 2: Linux Virtual Machine \uf0c1 Please note, running CAVE in a Virtual Machine has reduced functionality compared to running CAVE directly on hardware (ex: rendering RGB satellite images). System Requirements \uf0c1 VMWare Workstation Player must be installed (free software): For high definition monitors (4k), you will want to enable the high DPI setting for VMWare Workstation Player Create a desktop shortcut for VMWare Workstation Player Right-click the shortcut and select Properties Open the Compatibility Tab Select the \"Change high DPI settings\" button Check the \"High DPI scaling override\" checkbox and choose \"Application\" in the enabled dropdown Upgrade Existing Installation \uf0c1 If you do not currently have CAVE installed, please go directly to the Download and Installation Instructions . 
If you already have CAVE installed you can either: Download the new Virtual Machine ( as described below ) and you will see the new VM in VMware, similar to this screenshot: Upgrade the version of CAVE within the Virtual Machine by following the Linux instructions Download and Installation Instructions \uf0c1 Download the zipped file containing the virtual machine: CentOS7-Unidata-CAVE-20.3.2-2 Unzip the folder. Open VMWare Player and go to Player > File... > Open and locate the folder that was created from the downloaded zipped file. Select the file called \"CentOS 7 - Unidata CAVE 20.3.2-2.vmx\" . Run this new VM option. If it asks if it's been moved or copied, select \"I Copied It\" . There will be a user in the Linux machine named \"awips\" and the password is \"awips\" The root password is \"unidataAWIPS\" if ever needed Run CAVE \uf0c1 Once inside the VM, to run CAVE either: Use the desktop icon Use the terminal and type the command cave Find the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE macOS \uf0c1 Latest Version: 20.3.2-2 System Requirements \uf0c1 Nvidia Graphics Card (Some Intel Graphics cards seem to work as well) Upgrade Existing Installation \uf0c1 If you do not currently have CAVE installed, please go directly to the Download and Installation Instructions . If you already have CAVE installed: Remove the existing installation by locating it (it may be in your Applications folder), and dragging it to the trash. Clear CAVE's cache by removing caveData ( see these instructions for removal ). Follow the Download and Installation Instructions from below to install the newest version of CAVE. 
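The cache-clearing step above can also be done from a terminal; a minimal sketch, assuming the macOS caveData location documented on this page:

```shell
# macOS caveData location (from this page); removing it resets CAVE's local cache.
# Synced files (bundles, colormaps, etc.) are re-downloaded on the next EDEX connect.
cache="$HOME/Library/caveData"
echo "removing $cache"
rm -rf "$cache"
```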
Download and Installation Instructions \uf0c1 Download and install CAVE: awips-cave.dmg You can click and drag the CAVE icon into the Applications Directory to install at the System Application level -- this may require Administrator Privileges You can drag that icon to any other location (Desktop, local user's Applications directory, etc) to install CAVE at that location -- this will not require Administrator Privileges Run CAVE \uf0c1 To run CAVE either: Use the System Menu Go > Applications > CAVE Type \u2318 + Spacebar and then type \"cave\", the application should appear and you can hit enter to run it The first time CAVE is opened, it will ask you if you are sure you want to run it, because it was downloaded from the internet and not the Apple Store. This is normal, and hit Open. Your message may differ slightly but should look like the image below: EDEX Connection \uf0c1 NSF Unidata and Jetstream2 have partnered to offer an EDEX data server in the cloud, open to the public. Select the server in the Connectivity Preferences dialog, or enter edex-cloud.unidata.ucar.edu . Local Cache \uf0c1 After connecting to an EDEX server, you will have a local directory named caveData which contains files synced from EDEX as well as a client-side cache for data and map resources. You can reset CAVE by removing the caveData directory and reconnecting to an EDEX server. Your local files have been removed, but if you are re-connecting to an EDEX server you have used before, the remote files will sync again to your local ~/caveData (bundles, colormaps, etc.). Linux: /home/<user>/caveData/ macOS: /Users/<user>/Library/caveData/ Windows: C:\\Users\\<user>\\caveData\\ Uninstalling CAVE \uf0c1 Linux \uf0c1 These are instructions to manually uninstall CAVE. However, the awips_install.sh script will do these steps for you if you are installing a newer version of CAVE. 1. 
Make sure you have exited out of any CAVE sessions Check to make sure your /etc/yum.repos.d/awips2.repo file has enabled=1 . 2. Remove currently installed CAVE sudo yum clean all sudo yum groupremove \"AWIPS CAVE\" If you are having trouble removing a group, see the troubleshooting section. 3. Check to make sure all awips rpms have been removed rpm -qa | grep awips2 If you still have rpms installed, remove them sudo yum remove awips2-* 4. Remove the cave directory in /awips2 and caveData from your home directory rm -rf /awips2/cave rm -rf ~/caveData Windows \uf0c1 To completely remove CAVE: Type \"remove\" in the search bar and select Add or remove programs . This will open the Applications settings. From here, find AWIPS CAVE and select \"Uninstall\". macOS \uf0c1 To completely remove CAVE: Find where it is installed (might be the Applications folder) and drag into the trash. Then remove caveData .","title":"Install CAVE"},{"location":"install/install-cave/#install-cave","text":"CAVE is the C ommon A WIPS V isualization E nvironment that is used for rendering and analyzing data for AWIPS. 
The installer may require administrator privileges to install and may require other system changes (environment variables, etc) as well.","title":"Install CAVE"},{"location":"install/install-cave/#latest-cave-versions","text":"Linux: 20.3.2-2 Windows: 20.3.2-2 Mac: 20.3.2-2 View release notes Version 20.* of CAVE is not compatible with Version 18.* EDEX and vice versa, Version 18.* of CAVE is not compatible with Version 20.* EDEX.","title":"Latest CAVE Versions"},{"location":"install/install-cave/#functionalityreporting","text":"If you come across issues/bugs/missing functionality, we also encourage you to report it using this short form .","title":"Functionality/Reporting"},{"location":"install/install-cave/#general-requirements","text":"Regardless of what Operating System CAVE is running on, these general requirements are recommended in order for CAVE to perform optimally: Local machine Running CAVE via X11 forwarding or ssh tunneling is not supported. Using a VNC connection is the only remote option , and may result in worse performance than running locally. OpenGL 2.0 Compatible Devices At least 4GB RAM At least 2GB Disk Space for Caching NVIDIA Graphics Card Latest NVIDIA Driver While other graphics cards may work, NVIDIA Quadro graphics card is recommended for full visualization capability","title":"General Requirements"},{"location":"install/install-cave/#linux","text":"Latest Version: 20.3.2-2","title":"Linux "},{"location":"install/install-cave/#system-requirements","text":"64 bit CentOS/Red Hat 7 Bash shell environment While CentOS8 has reached End of Life as of Dec. 31, 2021, CentOS7 End of Life isn't until June 30, 2024.","title":"System Requirements"},{"location":"install/install-cave/#upgrade-existing-installation","text":"Whether you have CAVE currently installed or not, you can follow the Download and Installation Instructions below. The script will remove the old version of CAVE if needed, and install the latest version. 
If you would like to completely remove CAVE, please see the uninstall instructions further down this page .","title":"Upgrade Existing Installation"},{"location":"install/install-cave/#download-and-installation-instructions","text":"Download the following installer: awips_install.sh In a terminal, go to the download directory Make the installer an executable by running: chmod 755 awips_install.sh Run the installer: sudo ./awips_install.sh --cave This will install the application in /awips2/cave/ and set the local cache to ~/caveData/","title":"Download and Installation Instructions"},{"location":"install/install-cave/#run-cave","text":"To run CAVE either: Use the terminal and type the command cave Find the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE Additionally users can choose to run a virtual machine (VM) on Linux.","title":"Run CAVE"},{"location":"install/install-cave/#windows","text":"Latest Version: 20.3.2-2 For Windows, we offer two installation options: a Direct Windows Installation , or a Linux Virtual Machine . The virtual machine option won't render RGB composites of satellite imagery.","title":"Windows "},{"location":"install/install-cave/#method-1-direct-windows-install","text":"We offer CAVE installers at both the user-level (no administrative permissions needed), and the system-level (useful in a lab setting for instance). If you need the system-level installer, please skip to the System-Level Installation section , otherwise simply proceed with the next sections.","title":"Method 1: Direct Windows Install"},{"location":"install/install-cave/#upgrade-existing-installation_1","text":"If you do not currently have CAVE installed, please go directly to the Download and Installation Instructions . If you already have CAVE installed: First remove it by going to the Installed Apps settings dialog. You can access this window by: Start bar > Settings > Apps > Installed Apps. 
Typing \"remove\" in the start bar should bring you to this screen as well Find AWIPS CAVE, click on it, and click Uninstall. Once the uninstall is finished, simply download and install the latest version as instructed below.","title":"Upgrade Existing Installation"},{"location":"install/install-cave/#download-and-installation-instructions_1","text":"Download and install: awips-cave.msi","title":"Download and Installation Instructions"},{"location":"install/install-cave/#run-cave_1","text":"To run CAVE, either: Double click on the CAVE icon on your desktop Type \"cave\" in the start bar and hit enter Find and run CAVE app in the file browser: C:\\Users\\%USER%\\AppData\\Roaming\\UCAR Unidata\\AWIPS CAVE\\CAVE.bat","title":"Run CAVE"},{"location":"install/install-cave/#system-level-installation","text":"If you need a system-level installation of CAVE, please fill out this brief access form for the .msi, and then proceed with installation similar to that described above.","title":"System-Level Installation"},{"location":"install/install-cave/#method-2-linux-virtual-machine","text":"Please note, running CAVE in a Virtual Machine has reduced functionality compared to running CAVE directly on hardware (ex: rendering RGB satellite images).","title":"Method 2: Linux Virtual Machine"},{"location":"install/install-cave/#system-requirements_1","text":"VMWare Workstation Player must be installed (free software): For high definition monitors (4k), you will want to enable the high DPI setting for VMWare Workstation Player Create a desktop shortcut for VMWare Workstation Player Right-click the shortcut and select Properties Open the Compatibility Tab Select the \"Change high DPI settings\" button Check the \"High DPI scaling override\" checkbox and choose \"Application\" in the enabled dropdown","title":"System Requirements"},{"location":"install/install-cave/#upgrade-existing-installation_2","text":"If you do not currently have CAVE installed, please go directly to the Download
and Installation Instructions . If you already have CAVE installed you can either: Download the new Virtual Machine ( as described below ) and you will see the new VM in VMware, similar to this screenshot: Upgrade the version of CAVE within the Virtual Machine by following the Linux instructions","title":"Upgrade Existing Installation"},{"location":"install/install-cave/#download-and-installation-instructions_2","text":"Download the zipped file containing the virtual machine: CentOS7-Unidata-CAVE-20.3.2-2 Unzip the folder. Open VMWare Player and go to Player > File... > Open and locate the folder that was created from the downloaded zipped file. Select the file called \"CentOS 7 - Unidata CAVE 20.3.2-2.vmx\" . Run this new VM option. If it asks if it's been moved or copied, select \"I Copied It\" . There will be a user in the Linux machine named \"awips\" and the password is \"awips\" The root password is \"unidataAWIPS\" if ever needed","title":"Download and Installation Instructions"},{"location":"install/install-cave/#run-cave_2","text":"Once inside the VM, to run CAVE either: Use the desktop icon Use the terminal and type the command cave Find the application in the Linux Desktop menu: Applications > Internet > AWIPS CAVE","title":"Run CAVE"},{"location":"install/install-cave/#macos","text":"Latest Version: 20.3.2-2","title":"macOS "},{"location":"install/install-cave/#system-requirements_2","text":"Nvidia Graphics Card (Some Intel Graphics cards seem to work as well)","title":"System Requirements"},{"location":"install/install-cave/#upgrade-existing-installation_3","text":"If you do not currently have CAVE installed, please go directly to the Download and Installation Instructions . If you already have CAVE installed: Remove the existing installation by locating it (it may be in your Applications folder), and dragging it to the trash. Clear CAVE's cache by removing caveData ( see these instructions for removal ).
Follow the Download and Installation Instructions from below to install the newest version of CAVE.","title":"Upgrade Existing Installation"},{"location":"install/install-cave/#download-and-installation-instructions_3","text":"Download and install CAVE: awips-cave.dmg You can click and drag the CAVE icon into the Applications Directory to install at the System Application level -- this may require Administrator Privileges You can drag that icon to any other location (Desktop, local user's Applications directory, etc) to install CAVE at that location -- this will not require Administrator Privileges","title":"Download and Installation Instructions"},{"location":"install/install-cave/#run-cave_3","text":"To run CAVE either: Use the System Menu Go > Applications > CAVE Type \u2318 + Spacebar and then type \"cave\", the application should appear and you can hit enter to run it The first time CAVE is opened, it will ask you if you are sure you want to run it, because it was downloaded from the internet and not the App Store. This is normal; click Open. Your message may differ slightly but should look like the image below:","title":"Run CAVE"},{"location":"install/install-cave/#edex-connection","text":"NSF Unidata and Jetstream2 have partnered to offer an EDEX data server in the cloud, open to the public. Select the server in the Connectivity Preferences dialog, or enter edex-cloud.unidata.ucar.edu .","title":"EDEX Connection"},{"location":"install/install-cave/#local-cache","text":"After connecting to an EDEX server, you will have a local directory named caveData which contains files synced from EDEX as well as a client-side cache for data and map resources. You can reset CAVE by removing the caveData directory and reconnecting to an EDEX server. This removes your local files, but if you are re-connecting to an EDEX server you have used before, the remote files will sync again to your local ~/caveData (bundles, colormaps, etc.).
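Resetting this cache can be sketched as a single command (Linux/macOS shell shown; the exact caveData location for each OS is listed below):

```shell
# Clear CAVE's local cache; bundles, colormaps, and other synced files
# are pulled from EDEX again on the next connection.
rm -rf ~/caveData
```

After clearing the cache, restart CAVE and reselect your EDEX server in the Connectivity Preferences dialog.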
Linux: /home/<user>/caveData/ macOS: /Users/<user>/Library/caveData/ Windows: C:\\Users\\<user>\\caveData\\","title":"Local Cache"},{"location":"install/install-cave/#uninstalling-cave","text":"","title":"Uninstalling CAVE"},{"location":"install/install-cave/#linux_1","text":"These are instructions to manually uninstall CAVE. However, the awips_install.sh script will do these steps for you if you are installing a newer version of CAVE. 1. Make sure you have exited out of any CAVE sessions Check to make sure your /etc/yum.repos.d/awips2.repo file has enabled=1 . 2. Remove currently installed CAVE sudo yum clean all sudo yum groupremove \"AWIPS CAVE\" If you are having trouble removing a group, see the troubleshooting section. 3. Check to make sure all awips rpms have been removed rpm -qa | grep awips2 If you still have rpms installed, remove them sudo yum remove awips2-* 4. Remove the cave directory in /awips2 and caveData from your home directory rm -rf /awips2/cave rm -rf ~/caveData","title":"Linux"},{"location":"install/install-cave/#windows_1","text":"To completely remove CAVE: Type \"remove\" in the search bar and select Add or remove programs . This will open the Applications settings. From here, find AWIPS CAVE and select \"Uninstall\".","title":"Windows"},{"location":"install/install-cave/#macos_1","text":"To completely remove CAVE: Find where it is installed (might be the Applications folder) and drag into the trash. Then remove caveData .","title":"macOS"},{"location":"install/install-edex/","text":"Install EDEX \uf0c1 EDEX is the E nvironmental D ata Ex change system that represents the backend server for AWIPS. EDEX is only supported for Linux systems: CentOS and RHEL, and ideally, it should be on its own dedicated machine. It requires administrator privileges to make root-level changes. EDEX can run on a single machine or be spread across multiple machines.
To learn more about that please look at Distributed EDEX, Installing Across Multiple Machines Latest Version \uf0c1 20.3.2-2 View release notes Version 20.* of CAVE is not compatible with Version 18.* EDEX, and vice versa: Version 18.* of CAVE is not compatible with Version 20.* EDEX. Functionality/Reporting \uf0c1 If you come across issues/bugs/missing functionality, we also encourage you to report it using this short form . System requirements \uf0c1 64-bit CentOS/RHEL 7 While CentOS8 has reached End of Life as of Dec. 31, 2021, CentOS7 End of Life isn't until June 30, 2024. Bash shell environment 16+ CPU cores (each CPU core can run a decoder in parallel) 24GB RAM 700GB+ Disk Space gcc-c++ package A Solid State Drive (SSD) is recommended An SSD should be mounted either to /awips2 (to contain the entire EDEX system) or to /awips2/edex/data/hdf5 (to contain the large files in the decoded data store). EDEX can scale to any system by adjusting the incoming LDM data feeds or adjusting the resources (CPU threads) allocated to each data type. EDEX is only supported for 64-bit CentOS and RHEL 7 Operating Systems. EDEX is not supported in Debian, Ubuntu, SUSE, Solaris, macOS, or Windows. You may have luck with Fedora Core 12 to 14 and Scientific Linux, but we will not provide support. Download and Installation Instructions \uf0c1 The first 3 steps should all be run as root 1.
Install EDEX \uf0c1 Download and run the installer: awips_install.sh wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 awips_install.sh sudo ./awips_install.sh --edex awips_install.sh --edex will perform the following steps (it's always a good idea to review downloaded shell scripts): Checks to see if EDEX is currently running and, if so, stops the processes with the edex stop command If EDEX is installed, asks the user if it can be removed and where to back up the data to, and does a yum groupremove awips2-server If the user/group awips:fxalpha does not exist, it gets created Saves the appropriate yum repo file to /etc/yum.repos.d/awips2.repo Increases process and file limits for the awips account in /etc/security/limits.conf Creates /awips2/data_store if it does not exist already Runs yum groupinstall awips2-server If you receive an error relating to yum, then please run sudo su - -c \"[PATH_TO_INSTALL_FILE]/awips_install.sh --edex\" 2. EDEX Setup \uf0c1 The external and localhost addresses need to be specified in /etc/hosts 127.0.0.1 localhost localhost.localdomain XXX.XXX.XXX.XXX edex-cloud edex-cloud.unidata.ucar.edu 3. Configure iptables \uf0c1 This should be a one-time configuration change. Configure iptables to allow TCP connections on ports 9581 and 9582 if you want to serve data publicly to CAVE clients and the Python API. Open Port 9588 \uf0c1 If you are running a Registry (Data Delivery) server, you will also want to open port 9588 .
To open ports to all connections \uf0c1 vi /etc/sysconfig/iptables *filter :INPUT ACCEPT [0:0] :FORWARD ACCEPT [0:0] :OUTPUT ACCEPT [0:0] -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -p icmp -j ACCEPT -A INPUT -i lo -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT #-A INPUT -m state --state NEW -m tcp -p tcp --dport 9588 -j ACCEPT # for registry/dd -A INPUT -j REJECT --reject-with icmp-host-prohibited -A FORWARD -j REJECT --reject-with icmp-host-prohibited COMMIT To open ports to specific IP addresses \uf0c1 In this example, the IP range 128.117.140.0/24 will match all 128.117.140.* addresses, while 128.117.156.0/24 will match 128.117.156.*. vi /etc/sysconfig/iptables *filter :INPUT DROP [0:0] :FORWARD DROP [0:0] :OUTPUT ACCEPT [0:0] :EXTERNAL - [0:0] :EDEX - [0:0] -A INPUT -i lo -j ACCEPT -A INPUT -p icmp --icmp-type any -j ACCEPT -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -s 128.117.140.0/24 -j EDEX -A INPUT -s 128.117.156.0/24 -j EDEX -A INPUT -j EXTERNAL -A EXTERNAL -j REJECT -A EDEX -m state --state NEW -p tcp --dport 22 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 9581 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 9582 -j ACCEPT #-A EDEX -m state --state NEW -p tcp --dport 9588 -j ACCEPT # for registry/dd -A EDEX -j REJECT COMMIT Restart iptables \uf0c1 service iptables restart Troubleshooting \uf0c1 For CentOS 7 error: Redirecting to /bin/systemctl restart iptables.service Failed to restart iptables.service: Unit iptables.service failed to load: No such file or directory. The solution is: yum install iptables-services systemctl enable iptables service iptables restart 4. Start EDEX \uf0c1 These steps should be run as user awips with sudo. Switch to the user by running su - awips . 
edex start To manually start, stop, and restart: service edex_postgres start service httpd-pypies start service qpidd start service edex_camel start The fifth service, edex_ldm , does not run at boot to prevent filling up disk space if EDEX is not running. Start ldm manually: service edex_ldm start To restart EDEX edex restart Additional Notes \uf0c1 Ensure SELinux is Disabled \uf0c1 vi /etc/sysconfig/selinux # This file controls the state of SELinux on the system. # SELINUX= can take one of these three values: # enforcing - SELinux security policy is enforced. # permissive - SELinux prints warnings instead of enforcing. # disabled - No SELinux policy is loaded. SELINUX=disabled # SELINUXTYPE= can take one of these two values: # targeted - Targeted processes are protected, # mls - Multi Level Security protection. SELINUXTYPE=targeted Read more about selinux at redhat.com SSD Mount \uf0c1 Though a Solid State Drive is not required, it is strongly encouraged in order to handle the amount of disk IO for real-time IDD feeds. The simplest configuration would be to mount a 500GB+ SSD to /awips2 to contain both the installed software (approx. 20GB) and the real-time data (approx. 150GB per day). The default purge rules are configured such that the processed data in /awips2 does not exceed 450GB. The raw data is located in /awips2/data_store , and is scoured every hour and should not exceed 50GB. If you want to increase EDEX data retention you should mount a large disk to /awips2/edex/data/hdf5 since this is where the archived processed data and any created case studies will exist. Filesystem Size Used Avail Use% Mounted on /dev/sda1 30G 2.5G 26G 9% / tmpfs 28G 0 28G 0% /dev/shm /dev/sdc1 788G 81G 667G 11% /awips2 /dev/sdb1 788G 41G 708G 10% /awips2/edex/data/hdf5 Configure LDM Feeds \uf0c1 EDEX installs its own version of the LDM to the directory /awips2/ldm .
As with the default LDM configuration, two files are used to control what IDD feeds are ingested: Configuration file: /awips2/ldm/etc/ldmd.conf \uf0c1 This file specifies an upstream LDM server to request data from, and what feeds to request: REQUEST NEXRAD3 \"./p(DHR|DPR|DSP|DTA|DAA|DVL|EET|HHC|N0Q|N0S|N0U|OHA|NVW|NTV|NST).\" idd.unidata.ucar.edu REQUEST FNEXRAD|IDS|DDPLUS|UNIWISC \".*\" idd.unidata.ucar.edu REQUEST NGRID \".*\" idd.unidata.ucar.edu REQUEST NOTHER \"^TIP... KNES.*\" idd.unidata.ucar.edu Read more about ldmd.conf in the LDM User Manual Configuration File: /awips2/ldm/etc/pqact.conf \uf0c1 This file specifies the WMO headers and file pattern actions to request: # Redbook graphics ANY ^([PQ][A-Z0-9]{3,5}) (....) (..)(..)(..) !redbook [^/]*/([^/]*)/([^/]*)/([^/]*)/([0-9]{8}) FILE -overwrite -close -edex /awips2/data_store/redbook/\\8/\\4\\5Z_\\8_\\7_\\6-\\1_\\2_(seq).rb.%Y%m%d%H # NOAAPORT GINI images NIMAGE ^(sat[^/]*)/ch[0-9]/([^/]*)/([^/]*)/([^ ]*) ([^/]*)/([^/]*)/([^/]*)/ (T[^ ]*) ([^ ]*) (..)(..)(..) FILE -overwrite -close -edex /awips2/data_store/sat/\\(11)\\(12)Z_\\3_\\7_\\6-\\8_\\9_(seq).satz.%Y%m%d%H Read more about pqact.conf in the LDM User Manual See available AWIPS LDM feeds Configuration File: /awips2/ldm/etc/registry.xml \uf0c1 This file specifies configuration and runtime parameters. If you are pulling in a lot of data, you may want to consider increasing your LDM queue size: <queue> <path>/awips2/ldm/var/queues/ldm.pq</path> <size>24GB</size> <slots>default</slots> </queue> Read more about registry.xml in the LDM User Manual Directories to Know \uf0c1 /awips2 - Contains all of the installed AWIPS software. /awips2/edex/logs - EDEX logs. /awips2/httpd_pypies/var/log/httpd - httpd-pypies logs. /awips2/database/data/pg_log - PostgreSQL logs. /awips2/qpid/log - Qpid logs. /awips2/edex/data/hdf5 - HDF5 data store. /awips2/edex/data/utility - Localization store and configuration files.
/awips2/ldm/etc - Location of ldmd.conf and pqact.conf /awips2/ldm/logs - LDM logs. /awips2/data_store - Raw data store. /awips2/data_store/ingest - Manual data ingest endpoint. What Version is my EDEX? \uf0c1 rpm -qa | grep awips2-edex Uninstalling EDEX \uf0c1 These are instructions to manually uninstall EDEX. However, the awips_install.sh script will do all of these steps for you if you are installing a newer version of EDEX. 1. Make sure all EDEX processes are stopped sudo edex stop sudo edex status [edex status] postgres :: not running pypies :: not running qpid :: not running EDEXingest :: not running EDEXgrib :: not running EDEXrequest :: not running ldmadmin :: not running 2. Backup any important configuration files that you may want to reference Here are some possible important directories/files to backup: /awips2/database/data/pg_hba.conf /awips2/edex/data/utility/* /awips2/edex/bin/* /awips2/ldm/* /awips2/dev/* /awips2/edex/conf* /awips2/edex/etc/* /awips2/edex/logs/* /usr/bin/edex/* /etc/init.d/edexServiceList 3. 
See what AWIPS yum groups are currently installed In this case the AWIPS EDEX Server group is installed sudo yum grouplist Available Environment Groups: Minimal Install Compute Node Infrastructure Server File and Print Server Cinnamon Desktop MATE Desktop Basic Web Server Virtualization Host Server with GUI GNOME Desktop KDE Plasma Workspaces Development and Creative Workstation Installed Groups: AWIPS EDEX Server Development Tools Available Groups: AWIPS ADE SERVER AWIPS CAVE AWIPS Development AWIPS EDEX DAT Server AWIPS EDEX Database/Request Server AWIPS EDEX Decode/Ingest Node (No Database, PyPIES, GFE) Cinnamon Compatibility Libraries Console Internet Tools Educational Software Electronic Lab Fedora Packager General Purpose Desktop Graphical Administration Tools Haskell LXQt Desktop Legacy UNIX Compatibility MATE Milkymist Scientific Support Security Tools Smart Card Support System Administration Tools System Management TurboGears application framework Xfce 4. Remove any currently installed AWIPS yum groups sudo yum clean all sudo yum groupremove \"AWIPS EDEX Server\" If you are having trouble removing a group, see the troubleshooting section. 5. Check to make sure all awips rpms have been removed rpm -qa | grep awips2 If you still have rpms installed, remove them sudo yum remove awips2-* 6. Remove everything in the /awips2 directory rm -rf /awips2/*","title":"Install EDEX"},{"location":"install/install-edex/#install-edex","text":"EDEX is the E nvironmental D ata Ex change system that represents the backend server for AWIPS. EDEX is only supported for Linux systems: CentOS and RHEL, and ideally, it should be on its own dedicated machine. It requires administrator privileges to make root-level changes. EDEX can run on a single machine or be spread across multiple machines.
To learn more about that please look at Distributed EDEX, Installing Across Multiple Machines","title":"Install EDEX"},{"location":"install/install-edex/#latest-version","text":"20.3.2-2 View release notes Version 20.* of CAVE is not compatible with Version 18.* EDEX, and vice versa: Version 18.* of CAVE is not compatible with Version 20.* EDEX.","title":"Latest Version"},{"location":"install/install-edex/#functionalityreporting","text":"If you come across issues/bugs/missing functionality, we also encourage you to report it using this short form .","title":"Functionality/Reporting"},{"location":"install/install-edex/#system-requirements","text":"64-bit CentOS/RHEL 7 While CentOS8 has reached End of Life as of Dec. 31, 2021, CentOS7 End of Life isn't until June 30, 2024. Bash shell environment 16+ CPU cores (each CPU core can run a decoder in parallel) 24GB RAM 700GB+ Disk Space gcc-c++ package A Solid State Drive (SSD) is recommended An SSD should be mounted either to /awips2 (to contain the entire EDEX system) or to /awips2/edex/data/hdf5 (to contain the large files in the decoded data store). EDEX can scale to any system by adjusting the incoming LDM data feeds or adjusting the resources (CPU threads) allocated to each data type. EDEX is only supported for 64-bit CentOS and RHEL 7 Operating Systems. EDEX is not supported in Debian, Ubuntu, SUSE, Solaris, macOS, or Windows.
You may have luck with Fedora Core 12 to 14 and Scientific Linux, but we will not provide support.","title":"System requirements"},{"location":"install/install-edex/#download-and-installation-instructions","text":"The first 3 steps should all be run as root","title":"Download and Installation Instructions"},{"location":"install/install-edex/#1-install-edex","text":"Download and run the installer: awips_install.sh wget https://downloads.unidata.ucar.edu/awips2/current/linux/awips_install.sh chmod 755 awips_install.sh sudo ./awips_install.sh --edex awips_install.sh --edex will perform the following steps (it's always a good idea to review downloaded shell scripts): Checks to see if EDEX is currently running and, if so, stops the processes with the edex stop command If EDEX is installed, asks the user if it can be removed and where to back up the data to, and does a yum groupremove awips2-server If the user/group awips:fxalpha does not exist, it gets created Saves the appropriate yum repo file to /etc/yum.repos.d/awips2.repo Increases process and file limits for the awips account in /etc/security/limits.conf Creates /awips2/data_store if it does not exist already Runs yum groupinstall awips2-server If you receive an error relating to yum, then please run sudo su - -c \"[PATH_TO_INSTALL_FILE]/awips_install.sh --edex\"","title":"1. Install EDEX"},{"location":"install/install-edex/#2-edex-setup","text":"The external and localhost addresses need to be specified in /etc/hosts 127.0.0.1 localhost localhost.localdomain XXX.XXX.XXX.XXX edex-cloud edex-cloud.unidata.ucar.edu","title":"2. EDEX Setup"},{"location":"install/install-edex/#3-configure-iptables","text":"This should be a one-time configuration change. Configure iptables to allow TCP connections on ports 9581 and 9582 if you want to serve data publicly to CAVE clients and the Python API.","title":"3.
Configure iptables"},{"location":"install/install-edex/#open-port-9588","text":"If you are running a Registry (Data Delivery) server, you will also want to open port 9588 .","title":"Open Port 9588"},{"location":"install/install-edex/#to-open-ports-to-all-connections","text":"vi /etc/sysconfig/iptables *filter :INPUT ACCEPT [0:0] :FORWARD ACCEPT [0:0] :OUTPUT ACCEPT [0:0] -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -p icmp -j ACCEPT -A INPUT -i lo -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT -A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT #-A INPUT -m state --state NEW -m tcp -p tcp --dport 9588 -j ACCEPT # for registry/dd -A INPUT -j REJECT --reject-with icmp-host-prohibited -A FORWARD -j REJECT --reject-with icmp-host-prohibited COMMIT","title":"To open ports to all connections"},{"location":"install/install-edex/#to-open-ports-to-specific-ip-addresses","text":"In this example, the IP range 128.117.140.0/24 will match all 128.117.140.* addresses, while 128.117.156.0/24 will match 128.117.156.*. 
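As an illustrative aside (plain shell prefix matching, not how iptables itself evaluates CIDR), the two allowed /24 ranges behave like this:

```shell
# Accept only 128.117.140.* and 128.117.156.* source addresses,
# mirroring the EDEX chain in the rules below; reject anything else.
client=128.117.140.57
case $client in
  128.117.140.*|128.117.156.*) echo allowed ;;
  *) echo rejected ;;
esac
```

Swapping in an address outside those ranges (e.g. 10.0.0.1) prints rejected instead.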
vi /etc/sysconfig/iptables *filter :INPUT DROP [0:0] :FORWARD DROP [0:0] :OUTPUT ACCEPT [0:0] :EXTERNAL - [0:0] :EDEX - [0:0] -A INPUT -i lo -j ACCEPT -A INPUT -p icmp --icmp-type any -j ACCEPT -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT -A INPUT -s 128.117.140.0/24 -j EDEX -A INPUT -s 128.117.156.0/24 -j EDEX -A INPUT -j EXTERNAL -A EXTERNAL -j REJECT -A EDEX -m state --state NEW -p tcp --dport 22 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 9581 -j ACCEPT -A EDEX -m state --state NEW -p tcp --dport 9582 -j ACCEPT #-A EDEX -m state --state NEW -p tcp --dport 9588 -j ACCEPT # for registry/dd -A EDEX -j REJECT COMMIT","title":"To open ports to specific IP addresses"},{"location":"install/install-edex/#restart-iptables","text":"service iptables restart","title":"Restart iptables"},{"location":"install/install-edex/#troubleshooting","text":"For CentOS 7 error: Redirecting to /bin/systemctl restart iptables.service Failed to restart iptables.service: Unit iptables.service failed to load: No such file or directory. The solution is: yum install iptables-services systemctl enable iptables service iptables restart","title":"Troubleshooting"},{"location":"install/install-edex/#4-start-edex","text":"These steps should be run as user awips with sudo. Switch to the user by running su - awips . edex start To manually start, stop, and restart: service edex_postgres start service httpd-pypies start service qpidd start service edex_camel start The fifth service, edex_ldm , does not run at boot to prevent filling up disk space if EDEX is not running. Start ldm manually: service edex_ldm start To restart EDEX edex restart","title":"4. Start EDEX"},{"location":"install/install-edex/#additional-notes","text":"","title":"Additional Notes"},{"location":"install/install-edex/#ensure-selinux-is-disabled","text":"vi /etc/sysconfig/selinux # This file controls the state of SELinux on the system. 
# SELINUX= can take one of these three values: # enforcing - SELinux security policy is enforced. # permissive - SELinux prints warnings instead of enforcing. # disabled - No SELinux policy is loaded. SELINUX=disabled # SELINUXTYPE= can take one of these two values: # targeted - Targeted processes are protected, # mls - Multi Level Security protection. SELINUXTYPE=targeted Read more about selinux at redhat.com","title":"Ensure SELinux is Disabled"},{"location":"install/install-edex/#ssd-mount","text":"Though a Solid State Drive is not required, it is strongly encouraged in order to handle the amount of disk IO for real-time IDD feeds. The simplest configuration would be to mount a 500GB+ SSD to /awips2 to contain both the installed software (approx. 20GB) and the real-time data (approx. 150GB per day). The default purge rules are configured such that the processed data in /awips2 does not exceed 450GB. The raw data is located in /awips2/data_store , and is scoured every hour and should not exceed 50GB. If you want to increase EDEX data retention you should mount a large disk to /awips2/edex/data/hdf5 since this is where the archived processed data and any created case studies will exist. Filesystem Size Used Avail Use% Mounted on /dev/sda1 30G 2.5G 26G 9% / tmpfs 28G 0 28G 0% /dev/shm /dev/sdc1 788G 81G 667G 11% /awips2 /dev/sdb1 788G 41G 708G 10% /awips2/edex/data/hdf5","title":"SSD Mount"},{"location":"install/install-edex/#configure-ldm-feeds","text":"EDEX installs its own version of the LDM to the directory /awips2/ldm .
As with the default LDM configuration, two files are used to control what IDD feeds are ingested:","title":"Configure LDM Feeds"},{"location":"install/install-edex/#configuration-file-awips2ldmetcldmdconf","text":"This file specifies an upstream LDM server to request data from, and what feeds to request: REQUEST NEXRAD3 \"./p(DHR|DPR|DSP|DTA|DAA|DVL|EET|HHC|N0Q|N0S|N0U|OHA|NVW|NTV|NST).\" idd.unidata.ucar.edu REQUEST FNEXRAD|IDS|DDPLUS|UNIWISC \".*\" idd.unidata.ucar.edu REQUEST NGRID \".*\" idd.unidata.ucar.edu REQUEST NOTHER \"^TIP... KNES.*\" idd.unidata.ucar.edu Read more about ldmd.conf in the LDM User Manual","title":"Configuration file: /awips2/ldm/etc/ldmd.conf"},{"location":"install/install-edex/#configuration-file-awips2ldmetcpqactconf","text":"This file specifies the WMO headers and file pattern actions to request: # Redbook graphics ANY ^([PQ][A-Z0-9]{3,5}) (....) (..)(..)(..) !redbook [^/]*/([^/]*)/([^/]*)/([^/]*)/([0-9]{8}) FILE -overwrite -close -edex /awips2/data_store/redbook/\\8/\\4\\5Z_\\8_\\7_\\6-\\1_\\2_(seq).rb.%Y%m%d%H # NOAAPORT GINI images NIMAGE ^(sat[^/]*)/ch[0-9]/([^/]*)/([^/]*)/([^ ]*) ([^/]*)/([^/]*)/([^/]*)/ (T[^ ]*) ([^ ]*) (..)(..)(..) FILE -overwrite -close -edex /awips2/data_store/sat/\\(11)\\(12)Z_\\3_\\7_\\6-\\8_\\9_(seq).satz.%Y%m%d%H Read more about pqact.conf in the LDM User Manual See available AWIPS LDM feeds","title":"Configuration File: /awips2/ldm/etc/pqact.conf"},{"location":"install/install-edex/#configuration-file-awips2ldmetcregistryxml","text":"This file specifies configuration and runtime parameters.
If you are pulling in a lot of data, you may want to consider increasing your LDM queue size: <queue> <path>/awips2/ldm/var/queues/ldm.pq</path> <size>24GB</size> <slots>default</slots> </queue> Read more about registry.xml in the LDM User Manual","title":"Configuration File: /awips2/ldm/etc/registry.xml"},{"location":"install/install-edex/#directories-to-know","text":"/awips2 - Contains all of the installed AWIPS software. /awips2/edex/logs - EDEX logs. /awips2/httpd_pypies/var/log/httpd - httpd-pypies logs. /awips2/database/data/pg_log - PostgreSQL logs. /awips2/qpid/log - Qpid logs. /awips2/edex/data/hdf5 - HDF5 data store. /awips2/edex/data/utility - Localization store and configuration files. /awips2/ldm/etc - Location of ldmd.conf and pqact.conf /awips2/ldm/logs - LDM logs. /awips2/data_store - Raw data store. /awips2/data_store/ingest - Manual data ingest endpoint.","title":"Directories to Know"},{"location":"install/install-edex/#what-version-is-my-edex","text":"rpm -qa | grep awips2-edex","title":"What Version is my EDEX?"},{"location":"install/install-edex/#uninstalling-edex","text":"These are instructions to manually uninstall EDEX. However, the awips_install.sh script will do all of these steps for you if you are installing a newer version of EDEX. 1. Make sure all EDEX processes are stopped sudo edex stop sudo edex status [edex status] postgres :: not running pypies :: not running qpid :: not running EDEXingest :: not running EDEXgrib :: not running EDEXrequest :: not running ldmadmin :: not running 2. Backup any important configuration files that you may want to reference Here are some possible important directories/files to backup: /awips2/database/data/pg_hba.conf /awips2/edex/data/utility/* /awips2/edex/bin/* /awips2/ldm/* /awips2/dev/* /awips2/edex/conf* /awips2/edex/etc/* /awips2/edex/logs/* /usr/bin/edex/* /etc/init.d/edexServiceList 3. 
See what AWIPS yum groups are currently installed In this case the AWIPS EDEX Server group is installed sudo yum grouplist Available Environment Groups: Minimal Install Compute Node Infrastructure Server File and Print Server Cinnamon Desktop MATE Desktop Basic Web Server Virtualization Host Server with GUI GNOME Desktop KDE Plasma Workspaces Development and Creative Workstation Installed Groups: AWIPS EDEX Server Development Tools Available Groups: AWIPS ADE SERVER AWIPS CAVE AWIPS Development AWIPS EDEX DAT Server AWIPS EDEX Database/Request Server AWIPS EDEX Decode/Ingest Node (No Database, PyPIES, GFE) Cinnamon Compatibility Libraries Console Internet Tools Educational Software Electronic Lab Fedora Packager General Purpose Desktop Graphical Administration Tools Haskell LXQt Desktop Legacy UNIX Compatibility MATE Milkymist Scientific Support Security Tools Smart Card Support System Administration Tools System Management TurboGears application framework Xfce 4. Remove any currently installed AWIPS yum groups sudo yum clean all sudo yum groupremove \"AWIPS EDEX Server\" If you are having trouble removing a group, see the troubleshooting section. 5. Check to make sure all awips rpms have been removed rpm -qa | grep awips2 If you still have rpms installed, remove them sudo yum remove awips2-* 6. Remove everything in the /awips2 directory rm -rf /awips2/*","title":"Uninstalling EDEX"},{"location":"install/start-edex/","text":"EDEX Basic Commands \uf0c1 These steps should be run as user awips with sudo. Switch to the user by running su - awips . NSF Unidata's EDEX install also comes with a simple edex program that can help execute basic EDEX utilities. 
The most basic of the commands are the following: To start all EDEX services: edex start To stop all EDEX services: edex stop Service and Boot Settings \uf0c1 These commands will start and stop five EDEX service files installed into /etc/init.d/ , four of which are run on boot: service postgres start service httpd-pypies start service qpidd start service edex_camel start The fifth, edex_ldm , does not run at boot to prevent filling up disk space if EDEX is not running: service edex_ldm start All of these services are started and stopped by the single program: edex as mentioned above. LDM Troubleshooting \uf0c1 If the EDEX machine is shut down abruptly, when restarted, it should start up the processes mentioned above . If sudo service edex_ldm start does not start up LDM smoothly, please try these steps: All of the following commands should be run as user awips and the service commands may need to be run with sudo . Run sudo service edex_ldm start or ldmadmin start and receive this message: ldmadmin start start_ldm(): PID-file \"/awips2/ldm/ldmd.pid\" exists. Verify that all is well and then execute \"ldmadmin clean\" to remove the PID-file. Run ldmadmin clean and sudo service edex_ldm start and receive this error: ldmadmin clean sudo service edex_ldm start Checking the product-queue... The writer-counter of the product-queue isn't zero. Either a process has the product-queue open for writing or the queue might be corrupt. Terminate the process and recheck or use pqcat -l- -s -q /awips2/ldm/var/queues/ldm.pq && pqcheck -F -q /awips2/ldm/var/queues/ldm.pq to validate the queue and set the writer-counter to zero. LDM not started To resolve the above, run: pqcat -l- -s -q /awips2/ldm/var/queues/ldm.pq && pqcheck -F -q /awips2/ldm/var/queues/ldm.pq ldmadmin delqueue ldmadmin mkqueue sudo service edex_ldm start EDEX Commands \uf0c1 NSF Unidata's version of EDEX installs with a helpful edex script that can be used for basic EDEX tasks. 
edex start \uf0c1 edex start Starting EDEX PostgreSQL: [ OK ] Starting httpd: [ OK ] Starting QPID [ OK ] Starting EDEX Camel (request): Starting EDEX Camel (ingest): Starting EDEX Camel (ingestGrib): Starting AWIPS LDM:The product-queue is OK. ... edex start base \uf0c1 To start all EDEX services except the LDM: edex start base edex stop \uf0c1 edex stop Stopping EDEX Camel (request): Stopping EDEX Camel (ingest): Stopping EDEX Camel (ingestGrib): Stopping QPID [ OK ] Stopping httpd: [ OK ] Stopping EDEX PostgreSQL: [ OK ] Stopping AWIPS LDM:Stopping the LDM server... ... edex setup \uf0c1 edex setup [edex] EDEX IP and Hostname Setup Checking /awips2/database/data/pg_hba.conf [OK] Checking /awips2/edex/bin/setup.env [OK] [edit] Hostname edex.unidata.ucar.edu added to /awips2/ldm/etc/ldmd.conf [done] This command configures and/or confirms that the EDEX hostname and IP address definitions exist ( edex setup is run by edex start ). Note : If your EDEX server is running but you see the message \"Connectivity Error: Unable to validate localization preferences\" in CAVE, it may mean that the domain name defined in /awips2/edex/bin/setup.env can not be resolved from outside the server. Some machines have different internally-resolved and externally-resolved domain names (cloud-based especially). The name defined in setup.env must be externally-resolvable . edex log \uf0c1 edex log [edex] EDEX Log Viewer :: No log specified - Defaulting to ingest log :: Viewing /awips2/edex/logs/edex-ingest-20151209.log. Press CTRL+C to exit INFO [Ingest.binlightning-1] /awips2/data_store/SFPA42_KWBC_091833_38031177.2015120918 processed in: 0.0050 (sec) Latency: 0.0550 (sec) INFO [Ingest.obs-1] /awips2/data_store/metar/SAIN31_VABB_091830_131392869.2015120918 processed in: 0.0810 (sec) Latency: 0.1800 (sec) More edex logs... 
edex log grib edex log request edex log ldm edex log radar edex log satellite edex log text edex qpid \uf0c1 Shows a list of the Qpid message queues to monitor data ingest (messages in vs messages out, i.e. decoded): [centos@js-156-89 ~]$ edex qpid Queues queue dur excl msg msgIn msgOut bytes bytesIn bytesOut cons bind ================================================================================================ external.dropbox Y Y 11 1.26m 1.26m 621 79.6m 79.6m 5 1 Ingest.Radar Y Y 4 589k 589k 184 27.1m 27.1m 5 1 Ingest.GribDecode Y Y 0 370k 370k 0 103m 103m 11 1 Ingest.GribSplit Y Y 2 361k 361k 201 31.9m 31.9m 5 1 Ingest.modelsounding Y Y 0 100k 100k 0 6.54m 6.54m 1 1 Ingest.Text Y Y 0 97.8k 97.8k 0 5.25m 5.25m 2 1 Ingest.GOESR Y Y 0 83.4k 83.4k 0 6.92m 6.92m 2 1 Ingest.obs Y Y 0 46.2k 46.2k 0 2.40m 2.40m 1 1 Grid.PostProcess Y Y 0 20.2k 20.2k 0 6.68m 6.68m 1 1 Ingest.sfcobs Y Y 0 10.5k 10.5k 0 577k 577k 1 1 Ingest.goessounding Y Y 0 6.68k 6.68k 0 427k 427k 1 1 Ingest.Glm Y Y 0 5.61k 5.61k 0 581k 581k 1 1 Ingest.aww Y Y 0 3.32k 3.32k 0 182k 182k 1 1 edex users \uf0c1 To see a list of clients connecting to your EDEX server, use the edex users [YYYYMMDD] command, where [YYYYMMDD] is the optional date string. edex users -- EDEX Users 20160826 -- user@101.253.20.225 user@192.168.1.67 awips@0.0.0.0 awips@sdsmt.edu edex purge \uf0c1 Views any stuck purge jobs in PostgreSQL (a rare but serious problem that can fill up your disk). The solution to this is to run edex purge reset .","title":"EDEX Basic Commands"},{"location":"install/start-edex/#edex-basic-commands","text":"These steps should be run as user awips with sudo. Switch to the user by running su - awips . NSF Unidata's EDEX install also comes with a simple edex program that can help execute basic EDEX utilities. 
The most basic of the commands are the following: To start all EDEX services: edex start To stop all EDEX services: edex stop","title":"EDEX Basic Commands"},{"location":"install/start-edex/#service-and-boot-settings","text":"These commands will start and stop five EDEX service files installed into /etc/init.d/ , four of which are run on boot: service postgres start service httpd-pypies start service qpidd start service edex_camel start The fifth, edex_ldm , does not run at boot to prevent filling up disk space if EDEX is not running: service edex_ldm start All of these services are started and stopped by the single program: edex as mentioned above.","title":"Service and Boot Settings"},{"location":"install/start-edex/#ldm-troubleshooting","text":"If the EDEX machine is shut down abruptly, when restarted, it should start up the processes mentioned above . If sudo service edex_ldm start does not start up LDM smoothly, please try these steps: All of the following commands should be run as user awips and the service commands may need to be run with sudo . Run sudo service edex_ldm start or ldmadmin start and receive this message: ldmadmin start start_ldm(): PID-file \"/awips2/ldm/ldmd.pid\" exists. Verify that all is well and then execute \"ldmadmin clean\" to remove the PID-file. Run ldmadmin clean and sudo service edex_ldm start and receive this error: ldmadmin clean sudo service edex_ldm start Checking the product-queue... The writer-counter of the product-queue isn't zero. Either a process has the product-queue open for writing or the queue might be corrupt. Terminate the process and recheck or use pqcat -l- -s -q /awips2/ldm/var/queues/ldm.pq && pqcheck -F -q /awips2/ldm/var/queues/ldm.pq to validate the queue and set the writer-counter to zero. 
LDM not started To resolve the above, run: pqcat -l- -s -q /awips2/ldm/var/queues/ldm.pq && pqcheck -F -q /awips2/ldm/var/queues/ldm.pq ldmadmin delqueue ldmadmin mkqueue sudo service edex_ldm start","title":"LDM Troubleshooting"},{"location":"install/start-edex/#edex-commands","text":"NSF Unidata's version of EDEX installs with a helpful edex script that can be used for basic EDEX tasks.","title":"EDEX Commands"},{"location":"install/start-edex/#edex-start","text":"edex start Starting EDEX PostgreSQL: [ OK ] Starting httpd: [ OK ] Starting QPID [ OK ] Starting EDEX Camel (request): Starting EDEX Camel (ingest): Starting EDEX Camel (ingestGrib): Starting AWIPS LDM:The product-queue is OK. ...","title":"edex start"},{"location":"install/start-edex/#edex-start-base","text":"To start all EDEX services except the LDM: edex start base","title":"edex start base"},{"location":"install/start-edex/#edex-stop","text":"edex stop Stopping EDEX Camel (request): Stopping EDEX Camel (ingest): Stopping EDEX Camel (ingestGrib): Stopping QPID [ OK ] Stopping httpd: [ OK ] Stopping EDEX PostgreSQL: [ OK ] Stopping AWIPS LDM:Stopping the LDM server... ...","title":"edex stop"},{"location":"install/start-edex/#edex-setup","text":"edex setup [edex] EDEX IP and Hostname Setup Checking /awips2/database/data/pg_hba.conf [OK] Checking /awips2/edex/bin/setup.env [OK] [edit] Hostname edex.unidata.ucar.edu added to /awips2/ldm/etc/ldmd.conf [done] This command configures and/or confirms that the EDEX hostname and IP address definitions exist ( edex setup is run by edex start ). Note : If your EDEX server is running but you see the message \"Connectivity Error: Unable to validate localization preferences\" in CAVE, it may mean that the domain name defined in /awips2/edex/bin/setup.env can not be resolved from outside the server. Some machines have different internally-resolved and externally-resolved domain names (cloud-based especially). 
The name defined in setup.env must be externally-resolvable .","title":"edex setup"},{"location":"install/start-edex/#edex-log","text":"edex log [edex] EDEX Log Viewer :: No log specified - Defaulting to ingest log :: Viewing /awips2/edex/logs/edex-ingest-20151209.log. Press CTRL+C to exit INFO [Ingest.binlightning-1] /awips2/data_store/SFPA42_KWBC_091833_38031177.2015120918 processed in: 0.0050 (sec) Latency: 0.0550 (sec) INFO [Ingest.obs-1] /awips2/data_store/metar/SAIN31_VABB_091830_131392869.2015120918 processed in: 0.0810 (sec) Latency: 0.1800 (sec) More edex logs... edex log grib edex log request edex log ldm edex log radar edex log satellite edex log text","title":"edex log"},{"location":"install/start-edex/#edex-qpid","text":"Shows a list of the Qpid message queues to monitor data ingest (messages in vs messages out, i.e. decoded): [centos@js-156-89 ~]$ edex qpid Queues queue dur excl msg msgIn msgOut bytes bytesIn bytesOut cons bind ================================================================================================ external.dropbox Y Y 11 1.26m 1.26m 621 79.6m 79.6m 5 1 Ingest.Radar Y Y 4 589k 589k 184 27.1m 27.1m 5 1 Ingest.GribDecode Y Y 0 370k 370k 0 103m 103m 11 1 Ingest.GribSplit Y Y 2 361k 361k 201 31.9m 31.9m 5 1 Ingest.modelsounding Y Y 0 100k 100k 0 6.54m 6.54m 1 1 Ingest.Text Y Y 0 97.8k 97.8k 0 5.25m 5.25m 2 1 Ingest.GOESR Y Y 0 83.4k 83.4k 0 6.92m 6.92m 2 1 Ingest.obs Y Y 0 46.2k 46.2k 0 2.40m 2.40m 1 1 Grid.PostProcess Y Y 0 20.2k 20.2k 0 6.68m 6.68m 1 1 Ingest.sfcobs Y Y 0 10.5k 10.5k 0 577k 577k 1 1 Ingest.goessounding Y Y 0 6.68k 6.68k 0 427k 427k 1 1 Ingest.Glm Y Y 0 5.61k 5.61k 0 581k 581k 1 1 Ingest.aww Y Y 0 3.32k 3.32k 0 182k 182k 1 1","title":"edex qpid"},{"location":"install/start-edex/#edex-users","text":"To see a list of clients connecting to your EDEX server, use the edex users [YYYYMMDD] command, where [YYYYMMDD] is the optional date string. 
edex users -- EDEX Users 20160826 -- user@101.253.20.225 user@192.168.1.67 awips@0.0.0.0 awips@sdsmt.edu","title":"edex users"},{"location":"install/start-edex/#edex-purge","text":"Views any stuck purge jobs in PostgreSQL (a rare but serious problem that can fill up your disk). The solution to this is to run edex purge reset .","title":"edex purge"},{"location":"python/overview/","text":"Python-AWIPS \uf0c1 The python-awips package provides a data access framework for requesting meteorological and related datasets from an EDEX server. As with any python code, python-awips can be used to interact in command line, or scripting form, with the EDEX. This is an alternative to using CAVE to interact with the data. Thorough documentation for python-awips can be found here .","title":"Python-AWIPS"},{"location":"python/overview/#python-awips","text":"The python-awips package provides a data access framework for requesting meteorological and related datasets from an EDEX server. As with any python code, python-awips can be used to interact in command line, or scripting form, with the EDEX. This is an alternative to using CAVE to interact with the data. Thorough documentation for python-awips can be found here .","title":"Python-AWIPS"},{"location":"raytheon/cave_d2d/","text":"Raytheon: CAVE D2D User's Manual (13.4.1) \uf0c1 This manual is from Raytheon, specifically for the NWS AWIPS, some of the content may not apply to NSF Unidata's AWIPS. Also, this manual is for an older version of AWIPS, but it is the most recent version of the manual we have access to. This browser does not support PDFs. Please download the PDF to view it: Download PDF","title":"Raytheon: CAVE User's Manual"},{"location":"raytheon/cave_d2d/#raytheon-cave-d2d-users-manual-1341","text":"This manual is from Raytheon, specifically for the NWS AWIPS, some of the content may not apply to NSF Unidata's AWIPS. 
Also, this manual is for an older version of AWIPS, but it is the most recent version of the manual we have access to. This browser does not support PDFs. Please download the PDF to view it: Download PDF","title":"Raytheon: CAVE D2D User's Manual (13.4.1)"},{"location":"raytheon/smm/","text":"Raytheon: System Manager's Manual (13.4.1) \uf0c1 This manual is from Raytheon, specifically for the NWS AWIPS, some of the content may not apply to NSF Unidata's AWIPS. Also, this manual is for an older version of AWIPS, but it is the most recent version of the manual we have access to. This browser does not support PDFs. Please download the PDF to view it: Download PDF","title":"Raytheon: AWIPS System Manager's Manual"},{"location":"raytheon/smm/#raytheon-system-managers-manual-1341","text":"This manual is from Raytheon, specifically for the NWS AWIPS, some of the content may not apply to NSF Unidata's AWIPS. Also, this manual is for an older version of AWIPS, but it is the most recent version of the manual we have access to. This browser does not support PDFs. Please download the PDF to view it: Download PDF","title":"Raytheon: System Manager's Manual (13.4.1)"}]}