Working with researchers from LTU and MTSU, Derek Diamond developed a novel algorithm that can evaluate data annotators in crowdsourcing projects. Crowdsourcing is a common tool in industry and academia for analyzing large, complex databases. Projects such as Zooniverse and services such as Amazon Mechanical Turk rely on crowdsourcing to provide industry and academia with solutions to problems that require human analysis of large databases.
Some tasks that are simple for humans remain very difficult for computing machines. Crowdsourcing uses a large number of users, normally through the internet, to annotate and analyze pieces of information that cannot be analyzed by existing computer methodology.
Derek Diamond and his collaborators used the observation that inconsistent annotations lead to weaker automatic classification when a machine learning algorithm is trained on them. This can be used to rank data annotators automatically by their consistency, which in turn allows human resources to be allocated more efficiently among samples.
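The idea described above can be illustrated with a small sketch. This is not the published algorithm, only a minimal, hypothetical illustration of the underlying intuition: simulated annotators apply different amounts of label noise to the same data, and the cross-validated accuracy of a classifier trained on each annotator's labels serves as a consistency score, since noisier labels are harder to learn.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic ground truth: 2-D points, label determined by the first feature.
X = rng.normal(size=(300, 2))
true_y = (X[:, 0] > 0).astype(int)

def noisy_labels(y, flip_prob, rng):
    """Simulate an annotator who flips each label with probability flip_prob."""
    flips = rng.random(len(y)) < flip_prob
    return np.where(flips, 1 - y, y)

# Three hypothetical annotators with different consistency levels.
annotators = {
    "careful": noisy_labels(true_y, 0.05, rng),
    "average": noisy_labels(true_y, 0.20, rng),
    "sloppy":  noisy_labels(true_y, 0.45, rng),
}

def consistency_score(X, labels):
    """Cross-validated accuracy of a classifier trained on one annotator's
    labels; inconsistent annotations yield weaker classification."""
    return cross_val_score(LogisticRegression(), X, labels, cv=5).mean()

# Rank annotators from most to least consistent.
ranking = sorted(annotators, key=lambda a: consistency_score(X, annotators[a]),
                 reverse=True)
print(ranking)
```

Such a ranking could then guide resource allocation, for example by routing samples labeled by low-scoring annotators to additional reviewers.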
The study was peer-reviewed and published in IEEE Transactions on Human-Machine Systems – one of the world’s leading and most competitive scientific journals in computer science.