
2004 Bulletins

Developments in the National Incident-Based Reporting System (NIBRS)

Note: The excerpts are presented as they were originally published in the UCR State Program Bulletin and therefore will include any additions, deletions, or clarifications released in subsequent bulletins. Readers are urged to read this document in its entirety before making any programming changes.


UCR State Program Bulletin 04-2, May 2004

Bias Motivation Code 99 in Data Element 8A

The national UCR Program had been receiving, from agencies that submit data via NIBRS, a large number of offenses with a bias motivation code of 99 = Unknown in Data Element 8A compared to offenses indicating a specific bias motivation code (codes 11-52) or no bias motivation (88 = None). As a result, national program personnel have been contacting state UCR program managers in states that submit an excessive number of offenses coded 99 = Unknown and asking the state to follow up with the local reporting agency. This has had a positive result: the number of offenses coded with bias motivation 99 = Unknown has dropped by approximately 50 percent within the last 2–3 years. The national program appreciates the commitment of the state program managers and local reporting agencies to accurate hate crime reporting.

Procedural Change for Converting NIBRS Data to Zero Hate Crime Data

Currently, the national UCR Program’s data processing procedure converts an agency’s NIBRS incidents coded as 88 = None and/or 99 = Unknown to zero hate crimes. This conversion program excludes the Zero-Reporting Segment Level and Group B arrest data.

The national program has identified three problems with this procedure:

  • In hate crime statistical analysis, all data have value—including zero data, incident data, and unknown data. The current procedure converts unknown data to zero data and nullifies the value of the unknown data.

  • The conversion program populates the hate crime database with zeros even when an agency has not submitted current-year NIBRS data. This occurs because the NIBRS database is open for current-year and previous-year submissions.

  • When determining if an agency has hate crime zero data, the conversion program excludes the NIBRS zero-reporting record or the Group B Arrest Report, both of which are indicators that the agency had no offense data to report.

Therefore, beginning with the data for Hate Crime Statistics, 2003, the national program is implementing the following data processing changes to the NIBRS conversion process:

  • When a NIBRS agency submits only 99 = Unknown in the bias motivation field, the system will not enter zeros for that agency’s hate crime record. Consequently, agencies reporting all NIBRS data with the bias motivation code 99 = Unknown will not be listed in the publication Hate Crime Statistics, Table 14, “Zero Hate Crime Data Submitted.” The national UCR Program will, however, continue to consider that NIBRS agency as participating in the hate crime data collection program.

  • Current-year zero hate crime data will be based on same-year NIBRS data submissions (e.g., 2003 hate crime data will be based on 2003 NIBRS data submissions).

  • The NIBRS submissions will be converted to zero hate crime data quarterly. An agency must submit at least 1 month of NIBRS data during a quarter for the national program to consider the agency as participating in the hate crime program for that quarter.

  • The Zero-Reporting Segment Level and the Group B Arrest Report will be reviewed when converting NIBRS submissions to zero hate crime data, as both are indicators that the agency had no offense data to report.
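The quarterly participation rule above can be sketched as follows. This is a minimal illustration only; the function and parameter names are hypothetical and do not reflect the national program's actual conversion software.

```python
def participated_in_quarter(months_with_nibrs_data: set, quarter: int) -> bool:
    """Return True if the agency counts as participating in the hate crime
    data collection program for the given quarter, i.e., it submitted at
    least 1 month of NIBRS data (including zero-reporting or Group B
    arrest submissions) during that quarter.

    months_with_nibrs_data holds calendar month numbers (1-12);
    quarter is 1-4.
    """
    quarter_months = {3 * (quarter - 1) + m for m in (1, 2, 3)}
    return bool(quarter_months & months_with_nibrs_data)
```

For example, an agency that submitted data only for April would count as participating in the second quarter but not the first.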

Web Document for NIBRS Vendors Updated

The national program asks state and local agencies to inform their vendors that program staff have recently updated the Web document entitled Developments in the NIBRS. This document was initially published on the Internet in July 2002 to keep vendors informed of NIBRS programming changes. The document, which can be found at www.fbi.gov/ucr/ucr.htm, includes excerpts from UCR State Program Bulletins that provide a historical perspective of the evolution of NIBRS, including procedural changes, reporting clarifications, and policy additions that have occurred from 1999 to April 2004. The national program asks agencies to caution their vendors to read the document in its entirety before making any programming changes. The excerpts are presented as they were originally published in the State Program Bulletin, and subsequent excerpts may have additions, deletions, or clarifications to those from earlier bulletins.


UCR State Program Bulletin 04-3, August 2004

Calculating Trends in National Incident-Based Reporting System Data

The staff of the national UCR Program review and trend data submitted via the National Incident-Based Reporting System (NIBRS) to ensure data quality. The program also publishes trend data in its various crime reports. (For NIBRS data to be published in the Preliminary Semiannual and Annual Uniform Crime Reports, the national program extracts NIBRS data representing each agency’s Part I offenses at 6- and 12-month intervals and converts them to summary data.) When trending NIBRS data submitted by NIBRS-certified state programs and certified agencies of non-program states, the FBI’s Crime Statistics Management Unit (CSMU) NIBRS staff use the following processes at 6-month and 12-month intervals:

  • Initially, the staff review Group A offenses to ensure that the data are reasonable, i.e., likely to have occurred given the offense and the circumstances surrounding the incident. The staff also look for an invalid “none,” i.e., an indication of false reporting that the system generates when an agency actually filed only a Group B arrest report for the period, and for a monthly offense count that is significantly higher or lower than the agency’s typical offense count for the specified time period. In addition, the staff review the number (and type) of premises entered for the agency’s burglary offense total to prevent the volume of burglaries from being falsely elevated when the NIBRS numbers are converted to summary totals for publication. (The staff ensure that the Hotel Rule is applied properly in the conversion process.) If the NIBRS staff find invalid or questionable data, they manually remove those data so that they can perform a more accurate comparison of the valid data from the common months.

  • Next, the staff measure the data for violent crimes (murder and nonnegligent manslaughter; forcible rape; robbery; and assault, both aggravated and simple) and property crimes (burglary, larceny-theft, motor vehicle theft, and arson) from the two reporting periods to ensure that the offense trends fall within acceptable norms.

Percent Change: The staff derive the volume trends for violent crime offenses and property crime offenses by running standardized programs on the UCR database that (1) subtract the previous year’s figure from the current year’s figure, (2) divide the difference by the previous year’s figure, and (3) multiply that number by 100. Typically, a comparison of violent crime from one reporting period to the next yields a volume change ranging from minus 15 percent to plus 15 percent. A comparison of property crimes from one reporting period to the next usually yields a volume change ranging from minus 25 percent to plus 25 percent. Upon receiving the trend printouts, the NIBRS staff flag unusually high fluctuations in violent or property crime, i.e., those percent changes that exceed the standard ranges, to analyze the data further.
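The three-step percent-change calculation described above can be sketched as follows. The function and range names are illustrative assumptions, not the national program's actual standardized programs.

```python
def percent_change(previous: float, current: float) -> float:
    """Steps described in the bulletin: (1) subtract the previous year's
    figure from the current year's, (2) divide the difference by the
    previous year's figure, (3) multiply by 100."""
    return (current - previous) / previous * 100

# Typical ranges cited in the bulletin; comparisons outside them are flagged.
VIOLENT_RANGE = (-15.0, 15.0)
PROPERTY_RANGE = (-25.0, 25.0)

def flag_for_review(previous: float, current: float,
                    low: float, high: float) -> bool:
    """Return True when the volume change exceeds the standard range."""
    change = percent_change(previous, current)
    return not (low <= change <= high)
```

For example, a rise from 200 to 230 violent crime offenses is a +15 percent change, which sits at the edge of the typical violent crime range and would not be flagged.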

Z-score: The staff calculate the z-score for violent crime offenses and property crime offenses by running standardized programs on the UCR database that (1) subtract the previous year’s figure from the current year’s figure and (2) divide the difference by the square root of the previous year’s figure plus the current year’s figure. The critical value of the z-score may fluctuate plus or minus 3 from the mean figure. The NIBRS staff flag scores that fall outside of the acceptable range for further review.
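The z-score calculation above can likewise be sketched in a few lines. Again, the names are hypothetical; only the formula and the plus-or-minus-3 critical value come from the bulletin.

```python
import math

def trend_z_score(previous: float, current: float) -> float:
    """Steps described in the bulletin: (1) subtract the previous year's
    figure from the current year's, (2) divide the difference by the
    square root of the previous year's figure plus the current year's."""
    return (current - previous) / math.sqrt(previous + current)

def out_of_trend(previous: float, current: float,
                 critical: float = 3.0) -> bool:
    """Flag scores falling more than +/-3 from the mean for further review."""
    return abs(trend_z_score(previous, current)) > critical
```

A jump from 100 to 200 offenses, for instance, yields a z-score of about 5.8 and would be flagged, while a change from 100 to 110 yields about 0.7 and would not.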

Independent Analysis: To use the percent change and the z-score as the sole indicators of whether an agency’s data are in trend would be limiting; therefore, the NIBRS staff also consider the population of the agency’s jurisdiction, the type of offense, and the month-to-month variations in reported data before making this determination.

  • If the NIBRS staff note that an agency’s data are out of trend, they send the agency a trend letter and provide a data listing noting the discrepancies. The staff request that the agency either verify the data on file or submit corrections. Though many questioned figures turn out to reflect actual increases or decreases in the number of reported offenses, potential causes of erroneous data include mistakes in data entry, underreporting, data transfer errors that may occur when a state aggregates its local agencies’ data into one state submission, incomplete data, vendor issues, or computer problems.

  • When the state program (on behalf of its agency or agencies) responds to the letter and verifies that the increase or decrease in its data submission was caused by a major fluctuation in criminal activity, the CSMU staff file the response for future reference in case anyone questions the data. If a state program discovers that its data are erroneously out of trend or otherwise incorrect, the state program or the agency must supply the CSMU with electronic adjustments; unlike summary reporting, NIBRS does not allow the FBI to make manual adjustments to the master files after the data have been electronically processed.

  • In addition to sending state UCR Programs a listing of Part I offense trends, the NIBRS staff provide the Comparative Report by Agency. This report furnishes the aggregated crime counts for all NIBRS Group A offenses that the national program maintains for the current and previous years, the percent change in crime levels from one reporting period to the next, and the z-score. These data give the agency an opportunity to identify any inaccurate crime totals as well as to monitor data submissions, corrections, and current statistical counts. The comparative report is also a valuable resource for the agency if anyone questions its data.

State UCR Program managers requiring additional information on these procedures should contact the CSMU and ask to speak with their state’s statistical assistant.


UCR State Program Bulletin 04-4, November 2004

Reporting of Hate Crime Incidents Requires More Follow-Through

The national program’s staff routinely verify hate crime incidents involving the offense types of murder or forcible rape or the bias motivations of anti-physical disability or anti-mental disability. To do this, the staff contact the state program manager or, in some cases, the local agency that submitted reports with these offense types or bias motivations to ensure the accuracy of these reports.

Prior to publishing the 2003 hate crime data, the national program noted 128 reported hate crime incidents involving murder, forcible rape, anti-physical disability bias motivation, or anti-mental disability bias motivation. During the verification process, the national program found that 33.6 percent of the incidents were submitted correctly, 64.8 percent were determined not to be bias motivated, and 1.6 percent were incorrectly classified. The following table shows the disposition of those incidents submitted by both Summary and National Incident-Based Reporting System (NIBRS) contributors.

Findings of Verification Process                 Of 128 Incidents   Disposition
Correct Submissions                              43                 Retained in hate crime database
Found not to be Bias Motivated                   83                 Deleted from hate crime database
  (82 NIBRS; 1 hard copy)
Incorrectly Classified                           2                  Adjusted in hate crime database

The national program urges state and local agencies to be diligent when classifying and verifying incidents as bias motivated. In addition, the program asks that agencies make appropriate corrections and submit them in a timely manner.

Conference Membership Calls for Clarification

During the annual conference of the Association of State UCR Programs, members agreed on the following modifications to NIBRS data values, which are effective immediately.

Expand Property Description Code 27 = Recording—Audio/Visual

In regard to Data Value 27 = Recording—Audio/Visual (Data Element 15, Property Description), the membership agreed to expand the types of media covered to include blank or recorded media:

27 = Recordings—Audio/Visual (phonograph records and blank or recorded compact discs, audio or video tapes, digital video discs, cassettes, etc.)

Provide Examples of Weapon Type Code 15 = Other Firearm

The membership also agreed to add examples in order to clarify Data Value 15 = Other Firearm (Data Element 13, Type Weapon/Force Involved). The revision is as follows:

15 = Other Firearm (e.g., bazookas, stinger missiles, and rocket and grenade launchers)
