Evaluating the crowd quality for subjective questions based on a Spark computing environment

Authors
Keywords
Crowdsourcing, Subjective questions, Personnel quality evaluation, Confidence interval
Publisher
Elsevier BV
Online
2020-01-21
DOI
10.1016/j.future.2020.01.010
