doc update for distributed edex
This commit is contained in: parent 70b9ca97fe, commit 5656a7296d
6 changed files with 159 additions and 27 deletions

@@ -17,7 +17,7 @@ Through a grant provided by [Jetstream](https://jetstream-cloud.org/), Unidata i

# Documentation

# Documentation - http://unidata.github.io/awips2/

* [Unidata AWIPS User Manual](http://unidata.github.io/awips2/)
* [How to Install CAVE](http://unidata.github.io/awips2/install/install-cave)
@@ -99,8 +99,6 @@ Instructions on how to deploy CAVE from Eclipse.

1. Import **awips2/cave** > Select All Projects > Finish
2. Import **awips2/edexOsgi** > Select All Projects > Finish
3. Import **awips2/Radar** > Select All Projects > Finish

> The **Radar** folder contains the EDEX Radar Server plugins. Though the Unidata release does not build or use the radar server, the common libraries are required for other AWIPS radar processing and visualization plugins.

Now import all other repositories fully:
@@ -10,8 +10,6 @@ td:first-child { font-weight: bold }

| acarssounding | Vertical profiles derived from ACARS data |
| airep | Automated Aircraft Reports |
| airmet | “Airmen’s Meteorological Information”: aviation weather advisories for potentially hazardous, but non-severe weather |
| arealffgGenerator | Creates a mosaic of gridded FFG fields generated by RFCs for a WFO's area |
| arealQpeGen | Creates a mosaic of gridded QPE for a WFO's area |
| atcf | Automated Tropical Cyclone Forecast |
| aww | Airport Weather Warning |
| binlightning | Lightning data from the National Lightning Detection Network |
@@ -31,17 +29,11 @@ td:first-child { font-weight: bold }

| convsigmet | Aviation Significant Meteorological Information for convective weather |
| crimss | NPP/NPOESS CrIMSS (Cross Track Infrared and Microwave Sounding Suite) soundings |
| cwa | Aviation Center Weather Advisory, issued by CWSUs (Center Weather Service Units) |
| cwat | County Warning Area Threat produced by SCAN. CWAT was formerly called SCAN Convective Threat Index (SCTI). Raw data inputs include radar, cloud-to-ground lightning from the NLDN, and a few RAP13 fields. Radar data [with WSR-88D product mnemonics and numbers] needed for CWAT are 1 km Composite Reflectivity [CZ, 37]; 4 km Vertically Integrated Liquid [VIL, 57]; Storm Track [STI, 58]; Mesocyclone Detections [MD, 141]; and Tornadic Vortex Signatures [TVS, 61]. RAP13 fields include 700 mb Wind, Freezing Level, 1000-500 mb Thickness, and 500 mb Wind as specified in the SCANRunSiteConfig.xml file. |
| dmw | GOES-R Derived Motion Winds |
| ffg | Flash flood guidance metadata (county-based FFG from RFCs) |
| ffmp | Flash Flood Monitoring and Prediction data. Raw data inputs: radar, gridded flash flood guidance from River Forecast Centers, high-resolution precipitation estimates [HPE] and nowcasts [HPN], QPF from SCAN, and gage data from the IHFS [Integrated Hydrologic Forecast System] database. Radar data [with WSR-88D product mnemonics and numbers] needed for FFMP are Digital Hybrid Reflectivity [DHR, 32] and Digital Precipitation Rate [DPR, 176]. The raw GRIB files containing RFC Flash Flood Guidance are identified in the tables in Part 2 of this document as NWS_151 or FFG-XXX, where XXX is an RFC identifier such as TUA, KRF, or ALR. |
| fog | Fog Monitor. Raw data inputs: METAR, Mesonet, maritime, buoys, MAROBs, and satellite [visible, 3.9 µm, and 10.7 µm] |
| freezingLevel | MPE Rapid Refresh Freezing Level scheduled process (MpeRUCFreezingLevel) |
| fssobs | Observations for the Fog monitor, SNOW, and SAFESEAS. Raw data inputs: METAR, Mesonet, maritime, buoys, MAROBs. |
| gaff | Generate Areal Flash Flood Guidance |
| geomag | NCEP SWPC Geomagnetic Data |
| gfe | Graphical Forecast Editor grids |
| ghcd | NCEP Generic High Cadence Data |
| goesr | Plugins to decode and display GOES-R products |
| goessounding | GOES Satellite Soundings |
| gpd | NCEP Generic Point Data |
@@ -49,7 +41,6 @@ td:first-child { font-weight: bold }

| idft | Ice Drift Forecasts |
| intlsigmet | International Significant Meteorological Information for Aviation |
| lma | Lightning Mapping Array |
| loctables | Location Tables Ingest |
| lsr | Local Storm Reports |
| manualIngest | Manual data ingest plugin |
| mcidas | NCEP decoder for McIDAS AREA files |
@@ -69,18 +60,12 @@ td:first-child { font-weight: bold }

| pgen | NCEP NAWIPS PGEN decoder |
| pirep | Pilot Reports |
| poessounding | Polar Operational Environmental Satellite soundings |
| preciprate | Precipitation Rate from SCAN. Raw data input: radar data [with WSR-88D product mnemonic and number] needed for preciprate is Digital Hybrid Reflectivity [DHR, 32] |
| profiler | Wind Profiler data |
| q2FileProcessor | Q2 Verification System gzipped files |
| qc | QC mesonet data |
| qpf | Quantitative Precipitation Forecast from SCAN (raw data inputs: radar and some RAP13 fields. Radar data [with WSR-88D product mnemonics and numbers] needed for SCAN’s QPF are 0.5 degree Base Reflectivity [Z, 19], 4 km Vertically Integrated Liquid [VIL, 57], and Storm Track [STI, 58]. The RAP13 field needed is 700 mb Wind, as defined in the SCANRunSiteConfig.xml file.) |
| radar | WSR-88D and TDWR data |
| redbook | “Redbook” graphics |
| redbook | Redbook graphics |
| regionalsat | Decoder implementation for netcdf3 files generated by the Alaska Region and GOES-R Proving Ground |
| satellite-gini | GINI-formatted satellite imagery (GOES, POES, VIIRS, FNEXRAD) |
| satellite-mcidas | McIDAS area files (Raytheon/D2D-developed) |
| satpre | Satellite-estimated Precipitation (hydroApps) |
| scan | SCAN (System for Convection Analysis and Nowcasting) (Inputs for the SCAN Table include radar, cloud-to-ground lightning from the NLDN, fields from RAP13, and CWAT. Specific radar products [with WSR-88D product mnemonics and numbers] are: 1 km Composite Reflectivity [CZ, 37]; 0.5 degree Base Reflectivity [Z, 19]; 4 km Vertically Integrated Liquid [VIL, 57]; Storm Track [STI, 58]; Mesocyclone Detections [MD, 141]; and Tornadic Vortex Signature [TVS, 61]. The SCAN Digital Mesocyclone Detection Table uses the WSR-88D DMD product [number 149]. RAP13 fields used for SCAN include CAPE, 0-3 km Storm Relative Helicity, 700 mb Wind, Freezing Level, 1000-500 mb Thickness, and 500 mb Wind.) |
| sfcobs | Surface observations other than METAR format, including buoys |
| sgwh | NCEP BUFR Significant Wave Height data - SGWH (Jason-1), SGWHA (Altika), SGWHC (CryoSat), SGWHE (Envisat), SGWHG (GFO), or SGWH2 (Jason-2) |
| shef | Standard Hydrometeorological Exchange Format data |
@@ -96,6 +81,5 @@ td:first-child { font-weight: bold }

| textlightning | Text lightning data |
| vaa | Volcanic ash advisories |
| viirs | Visible Infrared Imaging Radiometer Suite data |
| vil | Cell-based Vertically Integrated Liquid from SCAN (System for Convection Analysis and Nowcasting) |
| warning | Watches, Warnings, and Advisories |
| wcp | SPC Convective Watches |

146  docs/edex/distributed-computing.md  Normal file

@@ -0,0 +1,146 @@
AWIPS makes use of service-oriented architecture to request, process, and serve real-time meteorological data. AWIPS was originally developed for use on internal NWS forecast office networks, where operational installations can consist of a dozen servers or more. Because the AWIPS source code was hard-coded with the NWS network configuration, the early Unidata releases were stripped of operation-specific configurations and plugins and released specifically for standalone installation. This made sense given that a single EDEX instance with a Solid State Drive could handle most of the entire NOAAport data volume. However, with GOES-R (GOES-16) coming online, and more gridded forecast models being created at finer temporal and spatial resolutions, there is now a need to distribute EDEX data decoding in order to handle this firehose of data.

---

This walkthrough will install different EDEX components on two machines in the XSEDE Jetstream Cloud: the first is used to **ingest and decode**, while the second is used to **store and serve** data.

---

## Database/Request Server

!!! note "Specs"
    * IP address **10.0.0.9**
    * CentOS 6.9
    * m1.medium (CPU: 6, Mem: 16 GB)
    * 1000GB attached storage for `/awips2/edex/data/hdf5`

### 1. Install

    groupadd fxalpha && useradd -G fxalpha awips
    mkdir /awips2
    wget -O /etc/yum.repos.d/awips2.repo http://www.unidata.ucar.edu/software/awips2/doc/awips2.repo
    yum clean all
    yum groupinstall awips2-database
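The specs above call for a 1000GB volume mounted at `/awips2/edex/data/hdf5`. A minimal `/etc/fstab` sketch for that mount, assuming the attached volume shows up as `/dev/sdb` with an xfs filesystem (device name and filesystem are assumptions; both vary by cloud provider and image):

```
/dev/sdb    /awips2/edex/data/hdf5    xfs    defaults,noatime    0 0
```

After adding the entry, `mkdir -p /awips2/edex/data/hdf5 && mount -a` activates the mount.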

### 2. IPtables Config

Ports 5432 and 5672 must be open to the specific IP addresses of outside EDEX ingest servers. It is *not recommended* to leave port 5432 open to all connections, since the default awips database password is known and is not meant as a security measure. It *is recommended* that you change the default postgres awips user password (which then requires reconfiguring every remote EDEX ingest server that connects to this database/request server).

    vi /etc/sysconfig/iptables

    *filter
    :INPUT DROP [0:0]
    :FORWARD DROP [0:0]
    :OUTPUT ACCEPT [0:0]
    :EXTERNAL - [0:0]
    :EDEX - [0:0]
    -A INPUT -i lo -j ACCEPT
    -A INPUT -p icmp --icmp-type any -j ACCEPT
    -A INPUT -m state --state ESTABLISHED,RELATED -j ACCEPT
    -A INPUT -m state --state NEW -m tcp -p tcp --dport 22 -j ACCEPT
    -A INPUT -m state --state NEW -m tcp -p tcp --dport 5672 -j ACCEPT
    -A INPUT -m state --state NEW -m tcp -p tcp --dport 9581 -j ACCEPT
    -A INPUT -m state --state NEW -m tcp -p tcp --dport 9582 -j ACCEPT
    -A INPUT -s 10.0.0.7 -j EDEX
    -A INPUT -j EXTERNAL
    -A EXTERNAL -j REJECT
    -A EDEX -m state --state NEW -p tcp --dport 5432 -j ACCEPT
    -A EDEX -m state --state NEW -p tcp --dport 5672 -j ACCEPT
    -A EDEX -j REJECT
    COMMIT
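If you later admit more ingest servers, each needs its own jump into the **EDEX** chain. A small sketch that prints the extra rules to paste above the `-A INPUT -j EXTERNAL` line (the second IP is hypothetical; this walkthrough itself only admits 10.0.0.7):

```shell
# Hypothetical list of EDEX ingest servers allowed to reach this host
INGEST_IPS="10.0.0.7 10.0.0.8"

# One EDEX-chain jump per ingest IP; the EDEX chain already limits
# these sources to ports 5432 (PostgreSQL) and 5672 (Qpid)
rules=$(for ip in $INGEST_IPS; do
  printf -- '-A INPUT -s %s -j EDEX\n' "$ip"
done)

echo "$rules"
```

Remember to restart iptables after editing `/etc/sysconfig/iptables`.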

Note the line **`-A INPUT -s 10.0.0.7 -j EDEX`** as well as the following **`-A EDEX ...`** rules for ports 5432 (PostgreSQL) and 5672 (Qpid).

!!! note "The three ports left open to all connections (5672, 9581, 9582), in addition to default port 22, are for outside CAVE client connections"

### 3. Database Config

In the file `/awips2/data/pg_hba.conf` you define remote connections for all postgres databases as `<IP address>/32`, after the block of IPv4 local connections:

    vi /awips2/data/pg_hba.conf

    # IPv4 local connections:
    host   fxatext      all           127.0.0.1/32   trust
    host   hd_ob92oax   all           127.0.0.1/32   trust
    host   dc_ob7oax    all           127.0.0.1/32   trust
    host   hmdb         all           127.0.0.1/32   trust
    host   metadata     all           127.0.0.1/32   md5
    host   maps         all           127.0.0.1/32   md5
    host   postgres     all           127.0.0.1/32   md5
    host   ncep         all           127.0.0.1/32   md5
    host   ebxml        all           127.0.0.1/32   trust
    host   replication  replication   127.0.0.1/32   md5
    # Remote connections
    host   fxatext      all           10.0.0.7/32    md5
    host   hd_ob92oax   all           10.0.0.7/32    md5
    host   dc_ob7oax    all           10.0.0.7/32    md5
    host   hmdb         all           10.0.0.7/32    md5
    host   metadata     all           10.0.0.7/32    md5
    host   maps         all           10.0.0.7/32    md5
    host   postgres     all           10.0.0.7/32    md5
    host   ncep         all           10.0.0.7/32    md5
    host   ebxml        all           10.0.0.7/32    md5
    host   replication  replication   10.0.0.7/32    md5
    # IPv6 local connections:
    host   all          all           ::1/128        md5
    host   replication  replication   ::1/128        md5
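Adding another ingest server later means repeating the same ten remote-connection lines with a new IP. A sketch that generates that block (the `10.0.0.8` argument is a hypothetical additional ingest server; the database list mirrors the file above):

```shell
# Emit the pg_hba.conf remote-connection block for one ingest server IP
pg_hba_lines() {
  ip="$1"
  for db in fxatext hd_ob92oax dc_ob7oax hmdb metadata maps postgres ncep ebxml; do
    printf 'host   %-12s all           %s/32   md5\n' "$db" "$ip"
  done
  printf 'host   replication  replication   %s/32   md5\n' "$ip"
}

# Append the output after the existing "# Remote connections" block
pg_hba_lines 10.0.0.8
```

PostgreSQL must reload its configuration before new entries take effect.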

### 4. Start EDEX

    edex start database

This will start PostgreSQL, httpd-pypies, Qpid, and the EDEX Request JVM (and will not start the LDM or the EDEX Ingest and IngestGrib JVMs).

---

## Ingest/Decode Server

!!! note "Specs"
    * IP address **10.0.0.7**
    * CentOS 6.9
    * m1.xxlarge (CPU: 44, Mem: 120 GB)

### 1. Install

    groupadd fxalpha && useradd -G fxalpha awips
    wget -O /etc/yum.repos.d/awips2.repo http://www.unidata.ucar.edu/software/awips2/doc/awips2.repo
    yum clean all
    yum groupinstall awips2-ingest

### 2. EDEX Config

    vi /awips2/edex/bin/setup.env

Here you should redefine `DB_ADDR` and `PYPIES_SERVER` to point to the **Database/Request** server (10.0.0.9):

    export EDEX_SERVER=10.0.0.7

    # postgres connection
    export DB_ADDR=10.0.0.9
    export DB_PORT=5432

    # pypies hdf5 connection
    export PYPIES_SERVER=http://10.0.0.9:9582

    # qpid connection
    export BROKER_ADDR=${EDEX_SERVER}

Notice that `EDEX_SERVER` and `BROKER_ADDR` (qpid) should remain defined as the *localhost* IP address (10.0.0.7).
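A stale `setup.env` silently leaves ingest writing to its own local database, so a quick sanity check helps. The sketch below parses a sample file written to `/tmp` for illustration; on a real ingest server you would set `CFG=/awips2/edex/bin/setup.env` instead:

```shell
# Sample setup.env (values from this walkthrough) written for illustration
CFG=/tmp/setup.env.example
cat > "$CFG" <<'EOF'
export EDEX_SERVER=10.0.0.7
export DB_ADDR=10.0.0.9
export DB_PORT=5432
export PYPIES_SERVER=http://10.0.0.9:9582
export BROKER_ADDR=${EDEX_SERVER}
EOF

# Extract the two addresses and flag the case where DB_ADDR was never
# redefined away from the local EDEX server
db_addr=$(sed -n 's/^export DB_ADDR=//p' "$CFG")
edex_server=$(sed -n 's/^export EDEX_SERVER=//p' "$CFG")

if [ "$db_addr" = "$edex_server" ]; then
  echo "DB_ADDR still matches EDEX_SERVER ($db_addr) - edit setup.env"
else
  echo "DB_ADDR ($db_addr) differs from EDEX_SERVER ($edex_server) - ok"
fi
```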

### 3. Start EDEX

    edex start ingest

This will start Qpid and the EDEX Ingest and IngestGrib JVMs (and will not start PostgreSQL, httpd-pypies, or the EDEX Request JVM).

---

## Additional Notes

* You can install more than one `awips2-ingest` server, all pointing to the same `DB_ADDR` and `PYPIES_SERVER` on the Database/Request server (10.0.0.9 below), with each decoding a different data set:

    ![](http://unidata.github.io/awips2/images/awips2_distributed.png)

* Every EDEX Ingest IP address must be allowed in both **iptables** and **pg_hba.conf** as [shown above](#2-iptables-config).
|
BIN
docs/images/awips2_distributed.png
Normal file
BIN
docs/images/awips2_distributed.png
Normal file
Binary file not shown.
After Width: | Height: | Size: 52 KiB |

@@ -50,10 +50,18 @@ Unidata supports two visualization frameworks for rendering data: [CAVE](install

Unidata AWIPS source code and binaries (RPMs) are considered to be in the public domain, meaning there are no restrictions on any download, modification, or distribution in any form (original or modified). Unidata AWIPS contains no proprietary content and is therefore not subject to export controls as stated in the [Master Rights](https://github.com/Unidata/awips2/blob/unidata_16.2.2/rpms/legal/Master_Rights_File.pdf) licensing file.

<img style="float:right;width:450px;" src="http://www.unidata.ucar.edu/software/awips2/images/awips2_coms.png">
---

## Distributed Computing

AWIPS makes use of service-oriented architecture to request, process, and serve real-time meteorological data. AWIPS was originally developed for use on internal NWS forecast office networks, where operational installations can consist of a dozen servers or more. Because the AWIPS source code was hard-coded with the NWS network configuration, the early Unidata releases were stripped of operation-specific configurations and plugins and released specifically for standalone installation. This made sense given that a single EDEX instance with a Solid State Drive could handle most of the entire NOAAport data volume. However, with GOES-R (GOES-16) coming online, and more gridded forecast models being created at finer temporal and spatial resolutions, there is now a need to distribute EDEX data decoding in order to handle this firehose of data.

* Read More: [Distributed EDEX](edex/distributed-computing)

---

<img style="float:right;width:450px;" src="http://www.unidata.ucar.edu/software/awips2/images/awips2_coms.png">

## Software Components

* [EDEX](#edex)
@@ -65,11 +73,6 @@ Unidata AWIPS source code and binaries (RPMs) are considered to be in the public

* [HDF5](#hdf5)
* [PyPIES](#pypies)

The primary AWIPS application for data ingest, processing, and storage is the Environmental Data EXchange (**EDEX**) server; the primary AWIPS application for visualization/data manipulation is the Common AWIPS Visualization Environment (**CAVE**) client, which is typically installed on a workstation separate from other AWIPS components.

In addition to programs developed specifically for AWIPS, AWIPS uses several commercial off-the-shelf (COTS) and Free or Open Source Software (FOSS) products to assist in its operation. The following components, working together and communicating, compose the entire AWIPS system.

### EDEX

The main server for AWIPS. Qpid sends alerts to EDEX when data stored by the LDM is ready for processing. These Qpid messages include file header information which allows EDEX to determine the appropriate data decoder to use. The default ingest server (simply named ingest) handles all data ingest other than grib messages, which are processed by a separate ingestGrib server. After decoding, EDEX writes metadata to the database via Postgres and saves the processed data in HDF5 via PyPIES. A third EDEX server, request, feeds requested data to CAVE clients. EDEX ingest and request servers are started and stopped with the commands `edex start` and `edex stop`, which run the system script `/etc/rc.d/init.d/edex_camel`.

@@ -116,5 +119,5 @@ edexBridge, invoked in the LDM configuration file `/awips2/ldm/etc/ldmd.conf`, i

**PyPIES**, Python Process Isolated Enhanced Storage, was created for AWIPS to isolate the management of HDF5 Processed Data Storage from the EDEX processes. PyPIES manages access, i.e., reads and writes, of data in the HDF5 files. In a sense, PyPIES provides functionality similar to a DBMS (i.e. PostgreSQL for metadata); all data being written to an HDF5 file is sent to PyPIES, and requests for data stored in HDF5 are processed by PyPIES.

PyPIES is implemented in two parts: 1. The PyPIES manager is a Python application that runs as part of an Apache HTTP server, and handles requests to store and retrieve data. 2. The PyPIES logger is a Python process that coordinates logging. PyPIES is started and stopped by `edex start` and `edex stop`, and is controlled by the system script `/etc/rc.d/init.d/https-pypies`

PyPIES is implemented in two parts: 1. The PyPIES manager is a Python application that runs as part of an Apache HTTP server, and handles requests to store and retrieve data. 2. The PyPIES logger is a Python process that coordinates logging. PyPIES is started and stopped by `edex start` and `edex stop`, and is controlled by the system script `/etc/rc.d/init.d/httpd-pypies`
@@ -46,6 +46,7 @@ pages:

- Editing Menus: cave/d2d-edit-menus.md
- Change Localization: cave/cave-localization.md
- EDEX User Manual:
  - Distributed EDEX: edex/distributed-computing.md
  - EDEX Start and Stop: install/start-edex.md
  - LDM Feeds: edex/ldm.md
  - Data Distribution Files: edex/data-distribution-files.md