***NOTE ABOUT THE UNBUFFERED VALIDATION ACCURACY TABLES BEGINNING IN 2016: The training and validation data used to create and assess the accuracy of the CDL has traditionally been based on ground truth data that is buffered inward 30 meters. This was done 1) because satellite imagery (as well as the polygon reference data) in the past was not georeferenced to the same precision as now (i.e. everything "stacked" less perfectly), 2) to eliminate spectrally-mixed pixels at land cover boundaries from the training data, and 3) to be spatially conservative during the era when coarser 56 meter AWiFS satellite imagery was incorporated. Ultimately, all of these scenarios created "blurry" edge pixels through the seasonal time series, and ignoring those pixels during training was found to improve the quality of the CDL. However, the accuracy assessment portion of the analysis also used buffered data, meaning those same edge pixels were not assessed along with the rest of the classification. This would be inconsequential if those edge pixels were similar in nature to the rest of the scene, but they are not; they tend to be more difficult to classify correctly. Thus, the accuracy assessments as previously presented are somewhat inflated. Beginning with the 2016 CDL season we are creating CDL accuracy assessments using unbuffered validation data. These "unbuffered" accuracy metrics now reflect the accuracy of field edges, which have not been represented previously. Beginning with the 2016 CDLs we published both the traditional "buffered" accuracy metrics and the new "unbuffered" accuracy assessments. The purpose of publishing both versions is to provide a benchmark for users interested in comparing the different validation methods. For the 2019 CDL season we are publishing only the unbuffered accuracy assessments within the official metadata files and offer the full "unbuffered" error matrices for download on the FAQs webpage. Both the metadata and FAQs are accessible at <https://www.nass.usda.gov/Research_and_Science/Cropland/SARS1a.php>. We plan to continue producing these unbuffered accuracy assessments for future CDLs; however, there are no plans to create them for past years. It should be noted that accuracy assessment is challenging, and the CDL group has always strived to provide robust metrics of usability to the land cover community. This admission of modestly inflated accuracy measures does not render past assessments useless: they were all done consistently, so comparisons across years and/or states remain valid, and providing both scenarios for 2016 gives guidance on the bias. If the following table does not display properly, please visit <https://www.nass.usda.gov/Research_and_Science/Cropland/metadata/meta.php> to view the original metadata file.
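The inward buffering described above can be illustrated with a short sketch. This is a minimal, hypothetical example and not the NASS production workflow; it assumes the geopandas library and a meter-based projected reference shapefile named "fields.shp" (an assumed file name). A negative buffer distance shrinks each reference polygon away from its edges, so pixels sampled inside the shrunken polygons exclude the mixed edge pixels, while sampling from the original polygons corresponds to the unbuffered case.

    # Minimal sketch of inward (negative) buffering of ground truth polygons.
    # "fields.shp" is a hypothetical reference file in a meter-based projection.
    import geopandas as gpd

    BUFFER_METERS = -30  # negative distance erodes each polygon inward 30 m

    fields = gpd.read_file("fields.shp")      # reference (ground truth) polygons
    buffered = fields.copy()
    buffered["geometry"] = fields.geometry.buffer(BUFFER_METERS)

    # Small fields can collapse to empty geometries after shrinking; drop them
    # so they do not contribute zero-area training regions.
    buffered = buffered[~buffered.geometry.is_empty]

    # Training/validation pixels drawn from "buffered" avoid spectrally mixed
    # field edges; drawing from "fields" instead yields the unbuffered case.
    buffered.to_file("fields_buffered_30m.shp")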
USDA, National Agricultural Statistics Service, 2019 Idaho Cropland Data Layer
STATEWIDE AGRICULTURAL ACCURACY REPORT
Crop-specific covers only *Correct Accuracy Error Kappa
------------------------- ------- -------- ------ -----
OVERALL ACCURACY** 423,376 85.2% 14.8% 0.832
Cover Attribute *Correct Producer's Omission User's Commission Cond'l
Type Code Pixels Accuracy Error Kappa Accuracy Error Kappa
---- ---- ------ -------- ----- ----- -------- ----- -----
Corn 1 35,506 92.1% 7.9% 0.917 91.1% 8.9% 0.907
Sorghum 4 0 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Sunflower 6 174 68.5% 31.5% 0.685 88.3% 11.7% 0.883
Sweet Corn 12 210 48.8% 51.2% 0.488 64.4% 35.6% 0.644
Pop or Orn Corn 13 2 2.7% 97.3% 0.027 40.0% 60.0% 0.400
Mint 14 1,454 72.5% 27.5% 0.724 91.6% 8.4% 0.916
Barley 21 54,365 86.2% 13.8% 0.853 88.0% 12.0% 0.871
Durum Wheat 22 133 33.6% 66.4% 0.336 74.7% 25.3% 0.747
Spring Wheat 23 43,110 83.0% 17.0% 0.820 84.9% 15.1% 0.840
Winter Wheat 24 73,301 91.2% 8.8% 0.904 91.6% 8.4% 0.908
Other Small Grains 25 0 n/a n/a n/a 0.0% 100.0% 0.000
Rye 27 12 10.2% 89.8% 0.102 32.4% 67.6% 0.324
Oats 28 1,014 30.4% 69.6% 0.303 67.6% 32.4% 0.674
Millet 29 35 28.9% 71.1% 0.289 97.2% 2.8% 0.972
Canola 31 3,508 81.3% 18.7% 0.812 92.2% 7.8% 0.921
Flaxseed 32 63 29.3% 70.7% 0.293 96.9% 3.1% 0.969
Safflower 33 2,038 78.3% 21.7% 0.783 85.3% 14.7% 0.853
Rape Seed 34 0 n/a n/a n/a 0.0% 100.0% 0.000
Mustard 35 1,409 77.8% 22.2% 0.778 89.7% 10.3% 0.897
Alfalfa 36 96,844 90.7% 9.3% 0.895 87.0% 13.0% 0.854
Other Hay/Non Alfalfa 37 12,279 57.4% 42.6% 0.566 74.3% 25.7% 0.736
Camelina 38 0 0.0% 100.0% 0.000 n/a n/a n/a
Buckwheat 39 24 55.8% 44.2% 0.558 45.3% 54.7% 0.453
Sugarbeets 41 19,480 92.2% 7.8% 0.921 97.0% 3.0% 0.970
Dry Beans 42 4,788 85.0% 15.0% 0.849 82.8% 17.2% 0.826
Potatoes 43 34,870 94.0% 6.0% 0.937 95.9% 4.1% 0.958
Other Crops 44 273 50.3% 49.7% 0.503 82.0% 18.0% 0.820
Misc Vegs & Fruits 47 0 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Watermelons 48 0 0.0% 100.0% 0.000 n/a n/a n/a
Onions 49 786 83.7% 16.3% 0.837 81.0% 19.0% 0.810
Chick Peas 51 8,134 90.8% 9.2% 0.907 91.5% 8.5% 0.914
Lentils 52 2,106 77.9% 22.1% 0.778 78.6% 21.4% 0.785
Peas 53 1,680 59.3% 40.7% 0.592 82.5% 17.5% 0.825
Hops 56 543 80.7% 19.3% 0.807 92.2% 7.8% 0.922
Herbs 57 0 n/a n/a n/a 0.0% 100.0% 0.000
Clover/Wildflowers 58 22 25.9% 74.1% 0.259 66.7% 33.3% 0.667
Sod/Grass Seed 59 5,399 77.6% 22.4% 0.774 87.7% 12.3% 0.876
Fallow/Idle Cropland 61 18,239 78.3% 21.7% 0.778 85.8% 14.2% 0.854
Cherries 66 35 62.5% 37.5% 0.625 60.3% 39.7% 0.603
Peaches 67 0 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Apples 68 4 11.8% 88.2% 0.118 28.6% 71.4% 0.286
Grapes 69 0 0.0% 100.0% 0.000 n/a n/a n/a
Christmas Trees 70 0 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Walnuts 76 0 0.0% 100.0% 0.000 n/a n/a n/a
Pears 77 0 0.0% 100.0% 0.000 n/a n/a n/a
Triticale 205 737 29.1% 70.9% 0.290 56.8% 43.2% 0.567
Carrots 206 72 34.4% 65.6% 0.344 52.9% 47.1% 0.529
Peppers 216 1 2.0% 98.0% 0.020 14.3% 85.7% 0.143
Nectarines 218 0 0.0% 100.0% 0.000 n/a n/a n/a
Greens 219 0 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Plums 220 6 42.9% 57.1% 0.429 31.6% 68.4% 0.316
Squash 222 0 0.0% 100.0% 0.000 n/a n/a n/a
Dbl Crop WinWht/Corn 225 20 12.6% 87.4% 0.126 90.9% 9.1% 0.909
Lettuce 227 11 29.7% 70.3% 0.297 91.7% 8.3% 0.917
Dbl Crop Triticale/Corn 228 671 36.2% 63.8% 0.361 78.4% 21.6% 0.783
Dbl Crop WinWht/Sorghum 236 0 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Blueberries 242 0 0.0% 100.0% 0.000 n/a n/a n/a
Radishes 246 0 0.0% 100.0% 0.000 0.0% 100.0% 0.000
Turnips 247 18 14.3% 85.7% 0.143 78.3% 21.7% 0.783
*Correct Pixels represents the total number of independent validation pixels correctly identified in the error matrix.
**The Overall Accuracy represents only the FSA row crops and annual fruit and vegetables (codes 1-61, 66-80, 92 and 200-255). FSA-sampled grass and pasture, non-agricultural, and NLCD-sampled categories (codes 62-65, 81-91 and 93-199) are not included in the Overall Accuracy.
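For readers who wish to reproduce the scope of the Overall Accuracy figure, the sketch below shows one possible way to limit a pixel-by-pixel comparison to the crop-specific code ranges listed in the footnote above. This is an illustrative example only (it assumes numpy and hypothetical arrays of class codes named "cdl" and "truth"), not a NASS procedure.

    # Minimal sketch: restrict an accuracy comparison to the crop-specific codes
    # counted in the Overall Accuracy (1-61, 66-80, 92, 200-255), excluding
    # codes 62-65, 81-91 and 93-199. "cdl" and "truth" are hypothetical arrays
    # of CDL class codes and validation class codes at the same pixels.
    import numpy as np

    def crop_specific_mask(codes):
        """True where a code falls in the ranges used for Overall Accuracy."""
        return (((codes >= 1) & (codes <= 61))
                | ((codes >= 66) & (codes <= 80))
                | (codes == 92)
                | ((codes >= 200) & (codes <= 255)))

    cdl = np.array([1, 24, 63, 82, 92, 205])    # example CDL pixel codes
    truth = np.array([1, 23, 63, 81, 92, 205])  # example validation codes

    keep = crop_specific_mask(truth)            # scope by the validation labels
    overall_accuracy = np.mean(cdl[keep] == truth[keep])
    print(f"Overall accuracy over crop-specific pixels: {overall_accuracy:.1%}")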
The accuracy of the non-agricultural land cover classes within the Cropland Data Layer is entirely dependent upon the USGS, National Land Cover Database (NLCD 2016). Thus, the USDA, NASS recommends that users consider the NLCD for studies involving non-agricultural land cover. For more information on the accuracy of the NLCD please reference <https://www.mrlc.gov/>.
Attribute_Accuracy_Value:
Classification accuracy is generally 85% to 95% correct for the major crop-specific land cover categories. See the 'Attribute Accuracy Report' section of this metadata file for the detailed accuracy report.
Attribute_Accuracy_Explanation:
The strength and emphasis of the CDL is crop-specific land cover categories. The accuracy of the CDL non-agricultural land cover classes is entirely dependent upon the USGS, National Land Cover Database (NLCD 2016). Thus, the USDA, NASS recommends that users consider the NLCD for studies involving non-agricultural land cover.
These definitions of accuracy statistics were derived from the following book: Congalton, Russell G. and Kass Green. Assessing the Accuracy of Remotely Sensed Data: Principles and Practices. Boca Raton, Florida: CRC Press, Inc. 1999. The 'Producer's Accuracy' is calculated for each cover type in the ground truth and indicates the probability that a ground truth pixel will be correctly mapped (across all cover types); it measures 'errors of omission'. An 'Omission Error' occurs when a pixel is excluded from the category to which it belongs in the validation dataset. The 'User's Accuracy' indicates the probability that a pixel from the CDL classification actually matches the ground truth data; it measures 'errors of commission'. A 'Commission Error' occurs when a pixel is included in an incorrect category according to the validation data. It is important to consider both errors of omission and commission. For example, if you classify every pixel in a scene as 'wheat', then you have 100% Producer's Accuracy for the wheat category and 0% Omission Error. However, you would also have a very high error of commission, as all other crop types would be included in the incorrect category. The 'Kappa' is a measure of agreement based on the difference between the actual agreement in the error matrix (i.e., the agreement between the remotely sensed classification and the reference data as indicated by the major diagonal) and the chance agreement, which is indicated by the row and column totals. The 'Conditional Kappa Coefficient' is the agreement for an individual category within the entire error matrix.
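As a worked illustration of these definitions (a hypothetical sketch, not NASS code), the following computes overall accuracy, producer's and user's accuracies, omission and commission errors, kappa, and conditional kappa from a small example error matrix whose rows are the classified (CDL) categories and whose columns are the reference (validation) categories.

    # Illustrative computation of the accuracy statistics defined above from a
    # hypothetical 3x3 error matrix. Rows = classified (CDL) categories,
    # columns = reference categories; the diagonal holds correctly mapped pixels.
    import numpy as np

    m = np.array([[50,  3,  2],
                  [ 4, 60,  6],
                  [ 1,  2, 40]], dtype=float)

    n = m.sum()            # total validation pixels
    row = m.sum(axis=1)    # pixels mapped to each class (classification totals)
    col = m.sum(axis=0)    # pixels of each class in the reference data
    diag = np.diag(m)      # correctly classified pixels per class

    overall = diag.sum() / n        # overall accuracy
    producers = diag / col          # producer's accuracy; omission error = 1 - producers
    users = diag / row              # user's accuracy; commission error = 1 - users

    chance = (row * col).sum() / n**2             # chance agreement from marginals
    kappa = (overall - chance) / (1 - chance)     # overall kappa

    # Conditional kappa for each category, using that category's row/column totals.
    cond_kappa = (n * diag - row * col) / (n * row - row * col)

    print(overall, kappa)
    print(producers, 1 - producers)   # producer's accuracy and omission error
    print(users, 1 - users)           # user's accuracy and commission error
    print(cond_kappa)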