[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Missing the information I need","missingTheInformationINeed","thumb-down"],["Too complicated / too many steps","tooComplicatedTooManySteps","thumb-down"],["Out of date","outOfDate","thumb-down"],["Samples / code issue","samplesCodeIssue","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2023-10-06 UTC."],[[["\u003cp\u003e\u003ccode\u003eConfusionMatrix.accuracy()\u003c/code\u003e computes the overall accuracy of a confusion matrix, which is defined as the ratio of correct predictions to the total number of predictions.\u003c/p\u003e\n"],["\u003cp\u003eIt takes a \u003ccode\u003eConfusionMatrix\u003c/code\u003e object as input and returns the accuracy as a float.\u003c/p\u003e\n"],["\u003cp\u003eThis function is useful for evaluating the performance of classification models by providing a single metric summarizing the overall correctness of predictions.\u003c/p\u003e\n"],["\u003cp\u003eExample code snippets demonstrate how to create a confusion matrix and calculate its overall accuracy using the Earth Engine API in both JavaScript and Python.\u003c/p\u003e\n"]]],["The content details the computation of a confusion matrix's overall accuracy, calculated as correct predictions divided by the total. It demonstrates how to construct a `ConfusionMatrix` object from an array, representing actual vs. predicted values. The `accuracy()` method returns a float representing this overall accuracy. Other methods shown include calculating consumer's and producer's accuracy, and the kappa statistic using a `ConfusionMatrix`. Both JavaScript and Python examples are provided.\n"],null,[]]