Article

Interactive topic modeling

Journal

MACHINE LEARNING
Volume 95, Issue 3, Pages 423-469

Publisher

SPRINGER
DOI: 10.1007/s10994-013-5413-0

Keywords

Topic models; Latent Dirichlet Allocation; Feedback; Interactive topic modeling; Online learning; Gibbs sampling

Funding

  1. National Science Foundation [0705832, 1018625]
  2. Army Research Laboratory [W911NF-09-2-0072]
  3. Division of Computing and Communication Foundations
  4. Directorate for Computer & Information Science & Engineering [1018625] (National Science Foundation)
  5. Division of Information & Intelligent Systems
  6. Directorate for Computer & Information Science & Engineering [0705832] (National Science Foundation)

Abstract

Topic models are a useful and ubiquitous tool for understanding large corpora. However, topic models are not perfect, and for many users in computational social science, digital humanities, and information studies, who are not machine learning experts, existing models and frameworks are often a take-it-or-leave-it proposition. This paper presents a mechanism for giving users a voice by encoding their feedback as correlations between words, which are then incorporated into the topic model. This framework, interactive topic modeling (itm), allows untrained users to encode their feedback easily and iteratively into topic models. Because latency in interactive systems is crucial, we develop more efficient inference algorithms for tree-based topic models. We validate the framework with both simulated and real users.
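
A minimal sketch of the interactive loop described above is given below, assuming a plain collapsed-Gibbs LDA sampler in Python rather than the paper's tree-based model. User feedback arrives as groups of words asserted to belong together; as a crude stand-in for the paper's tree-structured prior, each group is merged into a single token before re-fitting. The function names (gibbs_lda, merge_correlated), the toy corpus, and the merging trick are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def gibbs_lda(docs, vocab_size, n_topics, alpha=0.1, beta=0.01, n_iters=200, seed=0):
        """Collapsed Gibbs sampling for plain LDA (not the paper's tree-based model)."""
        rng = np.random.default_rng(seed)
        n_kw = np.zeros((n_topics, vocab_size))   # topic-word counts
        n_dk = np.zeros((len(docs), n_topics))    # document-topic counts
        n_k = np.zeros(n_topics)                  # topic totals
        assignments = []
        for d, doc in enumerate(docs):            # random initialization
            zs = rng.integers(n_topics, size=len(doc))
            assignments.append(zs)
            for w, z in zip(doc, zs):
                n_kw[z, w] += 1; n_dk[d, z] += 1; n_k[z] += 1
        for _ in range(n_iters):
            for d, doc in enumerate(docs):
                for i, w in enumerate(doc):
                    z = assignments[d][i]          # remove the current assignment
                    n_kw[z, w] -= 1; n_dk[d, z] -= 1; n_k[z] -= 1
                    # Conditional topic distribution for this token
                    p = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + beta * vocab_size)
                    z = rng.choice(n_topics, p=p / p.sum())
                    assignments[d][i] = z
                    n_kw[z, w] += 1; n_dk[d, z] += 1; n_k[z] += 1
        return n_kw                                # topic-word counts define the topics

    def merge_correlated(docs, vocab, groups):
        """Crude stand-in for a positive correlation: map each word group to one token."""
        remap = {w: group[0] for group in groups for w in group}
        new_vocab = sorted({remap.get(w, w) for w in vocab})
        index = {w: i for i, w in enumerate(new_vocab)}
        new_docs = [[index[remap.get(vocab[w], vocab[w])] for w in doc] for doc in docs]
        return new_docs, new_vocab

    # Toy corpus: documents are lists of word indices into `vocab`.
    vocab = ["game", "team", "score", "bass", "guitar", "drums"]
    docs = [[0, 1, 2, 0], [3, 4, 5, 4], [0, 2, 1], [4, 5, 3]]

    topic_word = gibbs_lda(docs, len(vocab), n_topics=2)
    # A user inspects the topics and asserts that "bass" and "guitar" belong together;
    # the corpus is re-encoded and the model re-fit with that feedback.
    new_docs, new_vocab = merge_correlated(docs, vocab, groups=[["bass", "guitar"]])
    topic_word = gibbs_lda(new_docs, len(new_vocab), n_topics=2)

In the actual itm framework, correlations are encoded as a tree prior over the vocabulary, and the paper develops faster Gibbs sampling for such tree-based models, so feedback can be incorporated without the lossy token merging used in this sketch.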
