***NOTE ABOUT THE UNBUFFERED VALIDATION ACCURACY TABLES BEGINNING IN 2016: The training and validation data used to create and assess the accuracy of the CDL has traditionally been based on ground truth data that is buffered inward 30 meters. This was done 1) because satellite imagery (as well as the polygon reference data) in the past was not georeferenced to the same precision as now (i.e. everything "stacked" less perfectly), 2) to eliminate from training spectrally-mixed pixels at land cover boundaries, and 3) to be spatially conservative during the era when coarser 56 meter AWiFS satellite imagery was incorporated. Ultimately, all of these scenarios created "blurry" edge pixels through the seasonal time series, and it was found that excluding them from training helped improve the quality of the CDL. However, the accuracy assessment portion of the analysis also used buffered data, meaning those same edge pixels were not assessed with the rest of the classification. This would be inconsequential if those edge pixels were similar in nature to the rest of the scene, but they are not; they tend to be more difficult to classify correctly. Thus, the accuracy assessments as previously presented are somewhat inflated. Beginning with the 2016 CDL season we are creating CDL accuracy assessments using unbuffered validation data. These "unbuffered" accuracy metrics now reflect the accuracy of field edges, which have not been represented previously. Beginning with the 2016 CDLs we published both the traditional "buffered" accuracy metrics and the new "unbuffered" accuracy assessments. The purpose of publishing both versions is to provide a benchmark for users interested in comparing the different validation methods. For the 2018 CDL season we are publishing only the unbuffered accuracy assessments within the official metadata files and offer the full "unbuffered" error matrices for download on the FAQs webpage. Both metadata and FAQs are accessible at <https://www.nass.usda.gov/Research_and_Science/Cropland/SARS1a.php>. We plan to continue producing these unbuffered accuracy assessments for future CDLs. However, there are no plans to create unbuffered accuracy assessments for past years. It should be noted that accuracy assessment is challenging, and the CDL group has always strived to provide robust metrics of usability to the land cover community. This admission of modestly inflated accuracy measures does not render past assessments useless. They were all done consistently, so comparison across years and/or states is still valid. Yet providing both scenarios for 2016 gives guidance on the bias. If the following table does not display properly, then please visit <https://www.nass.usda.gov/Research_and_Science/Cropland/metadata/meta.php> to view the original metadata file.
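The 30 meter inward buffer described above amounts to applying a negative (erosion) buffer to the ground truth polygons before sampling training pixels. The following is a minimal sketch only, assuming reference polygons in a projected, meter-based coordinate system and using the open-source geopandas library; the file names and workflow are illustrative assumptions, not the NASS production process.

    import geopandas as gpd

    # Illustrative file name; any polygon ground-truth layer in a projected,
    # meter-based CRS would be handled the same way.
    fields = gpd.read_file("reference_fields.shp")

    # A negative buffer erodes each polygon inward by 30 m, discarding the
    # spectrally mixed edge pixels discussed in the note above.
    fields["geometry"] = fields.geometry.buffer(-30)

    # Small or narrow fields can collapse to empty geometries after erosion.
    buffered = fields[~fields.geometry.is_empty]
    buffered.to_file("reference_fields_buffered.shp")

An unbuffered assessment simply skips the erosion step, so field-edge pixels remain in the validation pool.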
USDA, National Agricultural Statistics Service, 2018 Washington Cropland Data Layer
STATEWIDE AGRICULTURAL ACCURACY REPORT
Crop-specific covers only *Correct Accuracy Error Kappa
------------------------- ------- -------- ------ -----
OVERALL ACCURACY** 456,927 82.7% 17.3% 0.806
Cover Attribute *Correct Producer's Omission User's Commission Cond'l
Type Code Pixels Accuracy Error Kappa Accuracy Error Kappa
---- ---- ------ -------- ----- ----- -------- ----- -----
Corn 1 45,702 87.2% 12.8% 0.865 87.1% 12.9% 0.864
Sorghum 4 - n/a n/a n/a 0.0% 100.0% 0.000
Soybeans 5 - 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Sunflower 6 172 63.0% 37.0% 0.630 73.5% 26.5% 0.735
Sweet Corn 12 6,652 63.9% 36.1% 0.637 90.8% 9.2% 0.906
Mint 14 310 15.7% 84.3% 0.157 84.2% 15.8% 0.842
Barley 21 71 14.7% 85.3% 0.147 33.0% 67.0% 0.330
Spring Wheat 23 123 54.4% 45.6% 0.541 2.1% 97.9% 0.020
Winter Wheat 24 18,562 75.3% 24.7% 0.748 88.5% 11.5% 0.882
Rye 27 - n/a n/a n/a 0.0% 100.0% 0.000
Oats 28 422 41.7% 58.3% 0.417 66.5% 33.5% 0.664
Canola 31 285 62.6% 37.4% 0.626 72.5% 27.5% 0.725
Flaxseed 32 - n/a n/a n/a 0.0% 100.0% 0.000
Safflower 33 - n/a n/a n/a 0.0% 100.0% 0.000
Mustard 35 - 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Alfalfa 36 38,176 86.6% 13.4% 0.859 82.3% 17.7% 0.814
Other Hay/Non Alfalfa 37 25,469 87.5% 12.5% 0.870 78.5% 21.5% 0.779
Buckwheat 39 462 22.4% 77.6% 0.223 92.4% 7.6% 0.924
Sugarbeets 41 - n/a n/a n/a 0.0% 100.0% 0.000
Dry Beans 42 1,163 49.6% 50.4% 0.495 61.4% 38.6% 0.613
Potatoes 43 27,064 91.8% 8.2% 0.915 92.5% 7.5% 0.922
Other Crops 44 4 6.7% 93.3% 0.067 3.9% 96.1% 0.039
Watermelons 48 - n/a n/a n/a 0.0% 100.0% 0.000
Onions 49 3,605 71.7% 28.3% 0.716 90.3% 9.7% 0.903
Chick Peas 51 25 8.9% 91.1% 0.089 75.8% 24.2% 0.758
Lentils 52 - n/a n/a n/a 0.0% 100.0% 0.000
Peas 53 1,652 75.8% 24.2% 0.758 56.8% 43.2% 0.567
Caneberries 55 7,717 80.3% 19.7% 0.801 89.8% 10.2% 0.897
Hops 56 28,833 93.7% 6.3% 0.935 94.3% 5.7% 0.941
Herbs 57 - n/a n/a n/a 0.0% 100.0% 0.000
Clover/Wildflowers 58 - 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Sod/Grass Seed 59 1,328 71.7% 28.3% 0.716 35.8% 64.2% 0.356
Fallow/Idle Cropland 61 18,468 57.3% 42.7% 0.562 79.2% 20.8% 0.785
Cherries 66 23,781 74.9% 25.1% 0.741 83.1% 16.9% 0.825
Peaches 67 936 48.7% 51.3% 0.486 62.3% 37.7% 0.622
Apples 68 124,660 90.5% 9.5% 0.890 91.1% 8.9% 0.896
Grapes 69 53,547 93.1% 6.9% 0.926 90.7% 9.3% 0.901
Christmas Trees 70 4,409 67.8% 32.2% 0.676 90.9% 9.1% 0.908
Other Tree Crops 71 - 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Walnuts 76 7 9.7% 90.3% 0.097 58.3% 41.7% 0.583
Pears 77 9,856 75.3% 24.7% 0.750 83.9% 16.1% 0.837
Triticale 205 355 16.9% 83.1% 0.168 34.2% 65.8% 0.341
Carrots 206 498 52.7% 47.3% 0.527 71.6% 28.4% 0.715
Asparagus 207 1,154 73.3% 26.7% 0.732 91.4% 8.6% 0.914
Garlic 208 - n/a n/a n/a 0.0% 100.0% 0.000
Peppers 216 - n/a n/a n/a 0.0% 100.0% 0.000
Greens 219 100 54.6% 45.4% 0.546 43.1% 56.9% 0.431
Plums 220 1 1.1% 98.9% 0.011 2.9% 97.1% 0.028
Strawberries 221 - 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Squash 222 - 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Apricots 223 180 28.2% 71.8% 0.281 55.9% 44.1% 0.559
Vetch 224 - n/a n/a n/a 0.0% 100.0% 0.000
Dbl Crop WinWht/Corn 225 - n/a n/a n/a 0.0% 100.0% 0.000
Pumpkins 229 88 25.4% 74.6% 0.253 42.5% 57.5% 0.425
Blueberries 242 10,399 77.1% 22.9% 0.769 87.9% 12.1% 0.877
Cabbage 243 2 7.1% 92.9% 0.071 28.6% 71.4% 0.286
Cranberries 250 689 76.9% 23.1% 0.769 97.0% 3.0% 0.970
*Correct Pixels represents the total number of independent validation pixels correctly identified in the error matrix.
**The Overall Accuracy represents only the FSA row crops and annual fruit and vegetables (codes 1-61, 66-80, 92 and 200-255).
FSA-sampled grass and pasture, non-agricultural, and NLCD-sampled categories (codes 62-65, 81-91 and 93-199) are not included in the Overall Accuracy.
The accuracy of the non-agricultural land cover classes within the Cropland Data Layer is entirely dependent upon the USGS, National Land Cover Database (NLCD 2011). Thus, the USDA, NASS recommends that users consider the NLCD for studies involving non-agricultural land cover. For more information on the accuracy of the NLCD please reference <https://www.mrlc.gov/>.
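For users tabulating their own statistics from the error matrices, the code ranges in the footnote above define the pool of crop-specific categories. The snippet below is an illustrative sketch only; the helper name is hypothetical and not part of the official metadata.

    # Attribute codes that count toward the crop-specific Overall Accuracy
    # (codes 1-61, 66-80, 92 and 200-255), per the footnote above.
    CROP_SPECIFIC_CODES = set(range(1, 62)) | set(range(66, 81)) | {92} | set(range(200, 256))

    def counts_toward_overall(code):
        """Return True if a CDL attribute code is in the crop-specific accuracy pool."""
        return code in CROP_SPECIFIC_CODES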
Attribute_Accuracy_Value:
Classification accuracy is generally 85% to 95% correct for the major crop-specific land cover categories. See the 'Attribute Accuracy Report' section of this metadata file for the detailed accuracy report.
Attribute_Accuracy_Explanation:
The strength and emphasis of the CDL is crop-specific land cover categories. The accuracy of the CDL non-agricultural land cover classes is entirely dependent upon the USGS, National Land Cover Database (NLCD 2011). Thus, the USDA, NASS recommends that users consider the NLCD for studies involving non-agricultural land cover.
These definitions of accuracy statistics were derived from the following book: Congalton, Russell G. and Kass Green. Assessing the Accuracy of Remotely Sensed Data: Principles and Practices. Boca Raton, Florida: CRC Press, Inc. 1999. The 'Producer's Accuracy' is calculated for each cover type in the ground truth and indicates the probability that a ground truth pixel will be correctly mapped (across all cover types); it measures 'errors of omission'. An 'Omission Error' occurs when a pixel is excluded from the category to which it belongs in the validation dataset. The 'User's Accuracy' indicates the probability that a pixel from the CDL classification actually matches the ground truth data and measures 'errors of commission'. A 'Commission Error' occurs when a pixel is included in an incorrect category according to the validation data. It is important to consider both errors of omission and errors of commission. For example, if you classify every pixel in a scene as 'wheat', then you have 100% Producer's Accuracy for the wheat category and 0% Omission Error. However, you would also have a very high error of commission because all other crop types would be included in the incorrect category. The 'Kappa' is a measure of agreement based on the difference between the actual agreement in the error matrix (i.e., the agreement between the remotely sensed classification and the reference data as indicated by the major diagonal) and the chance agreement indicated by the row and column totals. The 'Conditional Kappa Coefficient' is the agreement for an individual category within the entire error matrix.
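As a minimal sketch of how the statistics defined above relate to an error matrix, the following illustrative Python (not the NASS production code) computes them from a square confusion matrix whose rows are the CDL classification and whose columns are the reference (validation) data, following the standard Congalton and Green formulas.

    import numpy as np

    def accuracy_stats(cm):
        """cm: square error matrix, rows = classification, columns = reference."""
        cm = np.asarray(cm, dtype=float)
        n = cm.sum()
        diag = np.diag(cm)
        row = cm.sum(axis=1)   # classification (map) totals
        col = cm.sum(axis=0)   # reference (ground truth) totals

        overall = diag.sum() / n
        producers = diag / col      # per category, 1 - omission error
        users = diag / row          # per category, 1 - commission error

        # Overall kappa: observed agreement vs. chance agreement from the marginals.
        chance = (row * col).sum()
        kappa = (n * diag.sum() - chance) / (n ** 2 - chance)

        # Conditional kappa per category, from the producer's (reference) and
        # user's (map) perspectives respectively.
        cond_kappa_prod = (n * diag - row * col) / (n * col - row * col)
        cond_kappa_user = (n * diag - row * col) / (n * row - row * col)

        return overall, producers, users, kappa, cond_kappa_prod, cond_kappa_user

Categories with no mapped or no reference pixels (the 'n/a' and 0.0% rows in the table above) produce divisions by zero in such a calculation and are typically excluded or reported as not applicable.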