Screening systems and the base rate fallacy

Kellan Elliott-McCrea has a great post about the high cost of false positives when building software that detects fraud, spam, abuse, or whatever. That cost comes down to the base rate fallacy, which the BBC explains very well. Here's a snippet:
If 3,000 people are tested, and the test is 90% accurate, it is also 10% wrong. So it will probably identify 301 terrorists – about 300 by mistake and 1 correctly. You won’t know from the test which is the real terrorist. So the chance that our man in the mac is the real thing is 1 in 301.
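It's worth working through the arithmetic behind that snippet. A minimal sketch in Python, assuming a crowd of 3,000 containing exactly 1 real terrorist and a test that is 90% accurate in both directions (90% sensitivity, 90% specificity):

```python
population = 3_000
actual_positives = 1          # one real terrorist in the crowd
sensitivity = 0.90            # P(test flags | terrorist)
specificity = 0.90            # P(test clears | not a terrorist)

actual_negatives = population - actual_positives

true_positives = actual_positives * sensitivity          # ~1, "probably" identified
false_positives = actual_negatives * (1 - specificity)   # ~300 flagged by mistake

flagged = true_positives + false_positives
p_real_given_flag = true_positives / flagged

print(f"people flagged: {flagged:.0f}")                            # ~301
print(f"P(real terrorist | flagged): {p_real_given_flag:.2%}")     # ~0.3%, i.e. 1 in 301
```

Even with a "90% accurate" test, a flagged person is almost certainly innocent, because the real positives are so rare relative to the pool being screened.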
Anybody who wants to talk about screening systems without an understanding of the base rate fallacy needs to do more homework.