A new Bayesian model, DISC-LM, that adapts the naive Bayes assumption of class-conditional feature independence to efficiently learn new categories

JanaJarecki/Naive-and-robust-class-conditional-independence-in-human-classification-learning

Naïve and Robust: Class-Conditional Independence in Human Classification Learning

Contributing Authors

Jana B. Jarecki, Björn Meder, Jonathan D. Nelson

Dates

Data collected in 2013. Paper(s) published in 2013 and 2017.

Abstract

Humans excel in categorization. Yet from a computational standpoint, learning a novel probabilistic classification task involves severe computational challenges. The present paper investigates one way to address these challenges: assuming class-conditional independence of features. This feature independence assumption simplifies the inference problem, allows for informed inferences about novel feature combinations, and performs robustly across different statistical environments. We designed a new Bayesian classification learning model (the dependence-independence structure and category learning model, DISC-LM) that incorporates varying degrees of prior belief in class-conditional independence, learns whether or not independence holds, and adapts its behavior accordingly. Theoretical results from two simulation studies demonstrate that classification behavior can appear to start simple, yet adapt effectively to unexpected task structures. Two experiments—designed using optimal experimental design principles—were conducted with human learners. Classification decisions of the majority of participants were best accounted for by a version of the model with very high initial prior belief in class-conditional independence, before adapting to the true environmental structure. Class-conditional independence may be a strong and useful default assumption in category learning tasks.
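The core idea of the abstract — a learner that starts with a strong prior belief in class-conditional independence, then updates that belief from data — can be sketched as a Bayesian mixture over two structural hypotheses. The following is a minimal illustrative sketch, not the authors' DISC-LM implementation: the class name, the two-binary-feature setting, and the Dirichlet pseudo-count `alpha` are all assumptions made here for compactness.

```python
import numpy as np

class IndependenceMixtureLearner:
    """Hedged sketch of a learner with a prior belief in class-conditional
    independence (NOT the published DISC-LM code). For two binary features
    (x1, x2) and a binary class c, it tracks two hypotheses:
      H_indep: P(x1, x2 | c) = P(x1 | c) * P(x2 | c)   (naive Bayes)
      H_dep:   P(x1, x2 | c) as a full 2x2 table per class
    and updates the log-odds of H_indep from each observation's
    one-step-ahead predictive probability under the two hypotheses."""

    def __init__(self, prior_indep=0.99, alpha=1.0):
        # Log-odds encoding of the initial belief in independence.
        self.log_odds = np.log(prior_indep) - np.log1p(-prior_indep)
        # Dirichlet pseudo-counts (assumed symmetric prior over outcomes).
        self.marg = np.full((2, 2, 2), alpha)   # [class, feature, value]
        self.joint = np.full((2, 2, 2), alpha)  # [class, x1, x2]

    def _pred_indep(self, c, x1, x2):
        # Predictive probability under class-conditional independence.
        p1 = self.marg[c, 0, x1] / self.marg[c, 0].sum()
        p2 = self.marg[c, 1, x2] / self.marg[c, 1].sum()
        return p1 * p2

    def _pred_dep(self, c, x1, x2):
        # Predictive probability under the full joint (dependent) structure.
        return self.joint[c, x1, x2] / self.joint[c].sum()

    def update(self, c, x1, x2):
        # Sequential Bayesian update of the structure belief, then the counts.
        self.log_odds += (np.log(self._pred_indep(c, x1, x2))
                          - np.log(self._pred_dep(c, x1, x2)))
        self.marg[c, 0, x1] += 1
        self.marg[c, 1, x2] += 1
        self.joint[c, x1, x2] += 1

    def p_independence(self):
        # Posterior probability that class-conditional independence holds.
        return 1.0 / (1.0 + np.exp(-self.log_odds))
```

Fed feature pairs that are strongly correlated within each class, such a learner's belief in independence erodes and its predictions shift toward the dependent structure — the "start simple, then adapt to unexpected task structure" behavior described above.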

Publications

  • Jarecki, J. B., Meder, B., & Nelson, J. D. (2017). Naïve and Robust: Class-Conditional Independence in Human Classification Learning. Cognitive Science, 42(1), 4–42. https://doi.org/10.1111/cogs.12496
  • Jarecki, J. B., Meder, B., & Nelson, J. D. (2013). The Assumption of Class-Conditional Independence in Category Learning. In Proceedings of the 35th Annual Conference of the Cognitive Science Society, 2650–2655. https://mindmodeling.org/cogsci2013/papers/0478/index.html

Open Data

Data are accessible in the data folder: 2 behavioral and 4 simulation datasets. The data dictionary is in the codebook folder.

Machine-Learning Models

One new model, DISC-LM, based on naive Bayes.

Funding

This work was supported by the Max Planck Institute for Human Development, Berlin, and in part by grants NE 1713/1-2 (to JDN) and ME 3717/2-2 (to BM) from the Deutsche Forschungsgemeinschaft (DFG) as part of the priority program “New Frameworks of Rationality” (SPP 1516).

Notes

None.
