
IEM Cow (NWS Storm Based Warning Verification)

IEM Cow API Access

Data presented here is unofficial and should be used for educational purposes only. This application allows you to view warnings and storm reports issued by a Weather Forecast Office (WFO) for a time period of your choice. The application attempts to automatically provide verification numbers for the warnings issued.

This application works for dates after 8 June 2005.

Storm Based Warnings

On the first of October 2007, the National Weather Service began issuing warnings for tornadoes, severe weather, floods, and marine hazards on a storm-by-storm basis. Previously, warnings were issued for an entire county. This application attempts to provide verification statistics for these storm based warnings.

Related links:

Verification Methodology

The map on the left illustrates some of the spatial statistics the Cow produces. The following is a brief description of these values.

API for IEM Cow Data

20 June 2018: The IEM is pleased to announce availability of an API for programmatic access to IEM Cow data and statistics.

6 August 2019: The API should be considered stable now and has been favorably compared against the legacy PHP-based Cow statistics. Please feel free to use it and report back any issues you find.

25 May 2022: The website Cow interface was updated to use this API service to drive the statistics displayed (dogfooding). In doing so, some shortcomings of the API were realized, so the JSON schema returned by the service was improved somewhat, but it should all still be backwards compatible.

6 July 2022: The API returned value for unwarned_reports was updated to not include severe thunderstorm type reports made during a tornado warning.

The JSON emitting service endpoint is:

This endpoint accepts a large number of CGI parameters via HTTP GET. None of the parameters are required.
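As a minimal sketch of the request mechanics, the parameters documented below can be assembled with Python's standard library. The endpoint URL here is a placeholder, since the actual URL is given above; urlencode's doseq flag repeats list-valued parameters, matching the API's one-or-more-times convention:

```python
from urllib.parse import urlencode

# Placeholder only: substitute the actual Cow API endpoint documented above.
COW_ENDPOINT = "https://example.invalid/cow.json"

def build_cow_url(endpoint, **params):
    """Build a Cow API GET URL; list values become repeated parameters."""
    return endpoint + "?" + urlencode(params, doseq=True)

url = build_cow_url(COW_ENDPOINT, wfo=["DMX", "DVN"], hailsize="1.00")
```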

callback=func (default: not used): Supports JSON-P style requests, with the resulting JSON data being encapsulated by a JavaScript function call, e.g. callback=gotdata
wfo=XXX (default: all WFOs considered): Specifies the three-character WFO identifier that you wish to get statistics for. If none is specified, you get the entire NWS. You can provide this parameter one or more times, e.g. wfo=XXX&wfo=YYY
Time Specification Option 1
begints= and endts= (default: the last 4-hour period): The start time and end time set the window to look for NWS Storm Based Warnings and Local Storm Reports. For warnings, the warning must have been issued after the start time and have an expiration prior to the end time. The tricky issue is when warnings cross either the start or end time; this can lead to incomplete statistics (i.e. a storm report was actually covered by a warning, but that warning was outside your time domain). These timestamps are in UTC, e.g. begints=2018-06-18T12:00Z&endts=2018-06-19T12:00Z
Time Specification Option 2
syear=YYYY smonth=MM sday=DD shour=HH24
eyear=YYYY emonth=MM eday=DD ehour=HH24
(default: the last 4-hour period): Same time details as with Option 1, but here you specify each part of the date manually, e.g. syear=2018&smonth=6&sday=18&shour=12&eyear=2018&emonth=6&eday=19&ehour=12
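As a sketch of Option 1's timestamp format, the begints/endts stamps can be produced from timezone-aware Python datetimes; the function name to_cow_ts is my own illustration, not part of the API:

```python
from datetime import datetime, timedelta, timezone

def to_cow_ts(dt):
    """Format a timezone-aware datetime as the UTC stamp Option 1 expects."""
    return dt.astimezone(timezone.utc).strftime("%Y-%m-%dT%H:%MZ")

# Build a 24-hour window ending 2018-06-19 12:00 UTC, matching the example.
end = datetime(2018, 6, 19, 12, 0, tzinfo=timezone.utc)
start = end - timedelta(hours=24)
params = {"begints": to_cow_ts(start), "endts": to_cow_ts(end)}
```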
phenomena=XX (default: all of TO, SV, FF, MA, DS): The VTEC phenomena codes that you want the data and stats for. These are two characters, and the currently supported Storm Based Warning types are Tornado (TO), Severe Thunderstorm (SV), Flash Flood (FF), Marine (MA), and Dust Storm (DS). You can specify more than one phenomena, e.g. phenomena=TO&phenomena=SV
lsrtype=XX (default: all of TO, SV, FF, MA, DS): As with phenomena above, this specifies which categories of Local Storm Reports to consider. This gets a bit difficult to fully explain: in general, the codes select which Local Storm Report types can potentially verify the warning, e.g. lsrtype=TO&lsrtype=SV. The two-letter identifiers are aliases for explicit LSR types. You can also specify the explicit types T: Tornado, G: TStorm Wind Gust, D: TStorm Wind Damage, H: Hail.
hailsize=SIZE_IN_INCHES (default: 1.00 inch): The hail size in inches that should be considered when verifying the warnings. The present-day standard is one inch, but previously it was 0.75 inches. This parameter only accepts one value, e.g. hailsize=1.50
lsrbuffer=DIST_IN_KM (default: 15 km): IEM Cow attempts to provide an areal verification percentage within the polygons; this areal value is computed by buffering out the point LSR reports by the given radius in kilometers. The GIS operation is done in USGS Albers (EPSG:2163), e.g. lsrbuffer=15
warningbuffer=DIST_IN_KM (default: 1 km): This is something of a dirty little secret, but NWS Storm Based Warnings are not necessarily exact in latitude/longitude space. The basic data provides polygon points with two decimal places of precision. Given political boundaries and other limitations, there are places in the country that would never receive a warning if the polygons were not allowed to buffer out slightly. So we default to buffering out the warning 1 km, which is used in verification, but not in size calculations, e.g. warningbuffer=1
wind=SPEED_IN_MPH (default: 58 MPH): For wind Local Storm Reports, the minimum speed that should be considered for verifying a warning. This value is not used in the case of Marine Warnings, e.g. wind=58
windhailtag=N_or_Y (default: No): For Severe Thunderstorm Warnings, the tags used to denote the wind speed and hail size at issuance are used to verify the warning. For example, if a hail tag of 2 inches was used at issuance, any reports below that size would not be considered as verifying the warning.
limitwarns=N_or_Y (default: No): Use the wind and hailsize parameters to filter the warnings considered for verification. For example, if you set wind=70, then any warnings issued with a wind tag below 70 MPH would be ignored.
fcster=string (default: not considered): With this enabled, resulting stats should not be used. This limits the warnings considered to those signed by the exact string provided, which creates a problematic situation: all storm reports are considered, but only a subset of warnings is provided for verification. The verification of individual warnings will be accurate, but the bulk stats will not be correct, e.g. fcster=forecaster10
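Putting the table above together, here is a hedged sketch of a complete parameter set. All parameter names come from the table; the WFO and window values are arbitrary examples:

```python
from urllib.parse import urlencode, parse_qs

# Example parameter set; list values are repeated parameters.
params = {
    "wfo": "DMX",
    "phenomena": ["TO", "SV"],
    "lsrtype": ["TO", "SV"],
    "hailsize": "1.00",
    "wind": "58",
    "begints": "2018-06-18T12:00Z",
    "endts": "2018-06-19T12:00Z",
}
query = urlencode(params, doseq=True)

# Round-trip check: parse_qs returns each parameter as a list of values.
decoded = parse_qs(query)
```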

Resulting JSON Schema.

{"generated_at": "ISO8601",
 "params": {dictionary of how API was called},
 "stats": {dictionary of generated statistics},
   "area_verify%": float # percent of polygon area verified
   "avg_leadtime[min]": int # Average leadtime in minutes
   "avg_size[sq km]": float # Average polygon size in square kilometers
   "CSI[1]": float # Critical Success Index 0-1
   "events_verified": int # Number of warning events verified
   "events_total": int # Number of warning events considered for report
   "FAR[1]": float # False Alarm Rate 0-1
   "max_leadtime[min]": int # Longest leadtime in minutes
   "min_leadtime[min]": int # Shortest leadtime in minutes
   "POD[1]": float # Probability of Detection 0-1
   "reports_total": int # Number of LSRs considered for report
   "shared_border%": float # percent of polygon border coincident with political bounds
   "size_poly_vs_county[%s]" : float # percent of polygon size compared to county size
   "tdq_stormreports": int # number of non-verifying storm reports in a TOR but outside SVR
   "unwarned_reports": int # number of LSRs without a warning

 "events": GeoJSON-style object of warnings,
   "features": []
     "id": str # unique id for this warning
       "ar_ugc": [] # list of ugc codes for this warning
       "ar_ugcname": [] # list of ugc names for this warning
       "areaverify": float # percent of polygon area verified
       "carea": float # area of associated counties/parishes in square kilometers
       "eventid": int # VTEC Event Identifier
       "expire": "ISO8601" # expiration time of warning
       "fcster": str # Forecaster/Product Signature of the warning
       "hailtag": float # IBW Hail Size (inch) at issuance
       "issue": "ISO8601" # issue time of warning
       "lat0": float # latitude of polygon centroid
       "lead0": int # leadtime in minutes of the first verifying LSR
       "lon0": float # longitude of polygon centroid
       "parea": float # sq km area of polygon computed in EPSG:2163
       "perimeter": float # perimeter of polygon computed in EPSG:2163
       "phenomena": str # VTEC Phenomena Two-Letter Code
       "sharedborder": float # perimeter of polygon shared with political border
       "significance": str # VTEC Significance One-Letter Code
       "status": str # VTEC Status of last event product update
       "statuses": str # typo of status, but kept for backwards compatibility
       "stormreports": str # comma separated list of storm report IDs that verified warning
       "stormreports_all": str # comma separated list of all storm reports within space/time bounds
       "verify": bool # true if warning verified
       "wfo": str # WFO that issued warning
       "windtag": float # IBW Wind Speed (mph) at issuance
       "year": int # year of warning
 "stormreports": GeoJSON-style object of LSRs
   "features": []
     "id": int # Sequential LSR identifier used for cross-references
       "city": str # City name of LSR
       "county": str # County name of LSR
       "lat0": float # latitude of LSR
       "leadtime": int # leadtime in minutes to first verifying warning
       "lon0": float # longitude of LSR
       "lsrtype": str # LSR type (SV, TO, FF, MA)
       "magnitude": float # LSR magnitude
       "remark": str # LSR remark
       "source": str # LSR source
       "state": str # State identifier of LSR
       "tdq": bool # Was this LSR covered only by a TOR warning, but not of a
                   # TOR LSR type?  Such events are not counted against verification.
       "type": str # IEM internal LSR type code (1 char)
       "typetext": str # LSR type found in NWS Product Text
       "valid": "ISO8601" # LSR valid time
       "warned": bool # Was this LSR warned for?
       "wfo": str # WFO that issued warning

There is a Python-based example that uses this API to generate shapefiles of the verification data.