Journal
MACHINE LEARNING
Volume 95, Issue 3, Pages 423-469
Publisher
SPRINGER
DOI: 10.1007/s10994-013-5413-0
Keywords
Topic models; Latent Dirichlet Allocation; Feedback; Interactive topic modeling; Online learning; Gibbs sampling
Funding
- National Science Foundation [0705832, 1018625]
- Army Research Laboratory [W911NF-09-2-0072]
- NSF Directorate for Computer & Information Science & Engineering, Division of Computing and Communication Foundations [1018625]
- NSF Directorate for Computer & Information Science & Engineering, Division of Information & Intelligent Systems [0705832]
Abstract
Topic models are a useful and ubiquitous tool for understanding large corpora. However, topic models are not perfect, and for many users in computational social science, digital humanities, and information studies who are not machine learning experts, existing models and frameworks are often a take-it-or-leave-it proposition. This paper presents a mechanism for giving users a voice: users' feedback is encoded as correlations between words and incorporated into a topic model. This framework, interactive topic modeling (ITM), allows untrained users to encode their feedback easily and iteratively into topic models. Because latency in interactive systems is crucial, we develop more efficient inference algorithms for tree-based topic models. We validate the framework with both simulated and real users.
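The paper's actual method encodes word correlations as tree-based priors and derives efficient Gibbs samplers for them. As a rough illustration only, the self-contained Python sketch below approximates a user's "these words belong together" feedback far more crudely: correlated words are merged into a single token type before running a plain collapsed Gibbs sampler for LDA. The function name `gibbs_lda` and the `correlations` parameter are illustrative inventions, not the paper's API.

```python
import random


def gibbs_lda(docs, n_topics, correlations=(), n_iter=200,
              alpha=0.1, beta=0.01, seed=0):
    """Collapsed Gibbs sampler for LDA on a toy tokenized corpus.

    `correlations` is a list of word groups the user wants treated as
    related; each group is merged into one token type before sampling,
    a crude stand-in for the paper's tree-based correlation priors.
    Returns (vocab, topic_word_counts).
    """
    rng = random.Random(seed)
    # Merge each correlated group into its first word.
    canon = {}
    for group in correlations:
        for w in group:
            canon[w] = group[0]
    docs = [[canon.get(w, w) for w in d] for d in docs]

    vocab = sorted({w for d in docs for w in d})
    w2id = {w: i for i, w in enumerate(vocab)}
    V = len(vocab)

    # Count matrices and random initial topic assignments.
    ndk = [[0] * n_topics for _ in docs]       # doc-topic counts
    nkw = [[0] * V for _ in range(n_topics)]   # topic-word counts
    nk = [0] * n_topics                        # topic totals
    z = []
    for d, doc in enumerate(docs):
        zd = []
        for w in doc:
            k = rng.randrange(n_topics)
            zd.append(k)
            ndk[d][k] += 1; nkw[k][w2id[w]] += 1; nk[k] += 1
        z.append(zd)

    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k, wi = z[d][i], w2id[w]
                # Remove the token, then resample its topic.
                ndk[d][k] -= 1; nkw[k][wi] -= 1; nk[k] -= 1
                weights = [(ndk[d][t] + alpha) * (nkw[t][wi] + beta)
                           / (nk[t] + V * beta) for t in range(n_topics)]
                r, acc = rng.random() * sum(weights), 0.0
                for t, wt in enumerate(weights):
                    acc += wt
                    if r <= acc:
                        k = t
                        break
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][wi] += 1; nk[k] += 1
    return vocab, nkw


# Toy corpus with two themes; feedback links "cat" and "dog".
docs = [["cat", "dog", "pet"], ["stock", "market", "trade"],
        ["cat", "pet", "dog"], ["market", "stock", "trade"]]
vocab, topic_word = gibbs_lda(docs, n_topics=2,
                              correlations=[("cat", "dog")])
```

Merging token types forces the correlated words to share counts (and hence a topic), which captures the spirit of the feedback but loses the per-word distinctions that the paper's tree priors preserve.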