A Generic Framework for Constraint-Driven Data Selection in Mobile Crowd Photographing

Huihui Chen, Bin Guo, Zhiwen Yu, Liming (Luke) Chen, Xiaojuan Ma

Research output: Contribution to journal › Article › peer-review

45 Citations (Scopus)
105 Downloads (Pure)

Abstract

Mobile crowd photographing (MCP) is an emerging area of interest for researchers, as the built-in cameras of mobile devices have become one of the most common visual logging approaches in daily life. To meet diverse MCP application requirements and the constraints of sensing targets, a multifaceted task model should be defined for a generic MCP data collection framework. Furthermore, MCP collects pictures in a distributed way, in which a large number of contributors upload pictures whenever and wherever suitable, inevitably producing evolving picture streams. This paper investigates the multiconstraint-driven data selection problem in MCP picture aggregation and proposes a pyramid-tree (PTree) model that can efficiently select an optimal subset from the evolving picture streams based on the varied coverage needs of MCP tasks. By utilizing the PTree model in a generic MCP data collection framework, called CrowdPic, we test and evaluate the effectiveness, efficiency, and flexibility of the proposed framework through crowdsourcing-based and simulation-based experiments. Both the theoretical analysis and simulation results indicate that the PTree-based framework can effectively select a subset with high utility coverage and a low redundancy ratio from the streaming data. The overall framework is also shown to be flexible and applicable to a wide range of MCP task scenarios.
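The abstract describes selecting a low-redundancy, high-coverage subset from an evolving picture stream under multiple task constraints. The paper's actual PTree algorithm is not detailed here, so the following is only an illustrative sketch of the general idea, assuming a hypothetical discretization of pictures into (location cell, shooting direction) coverage units and a simple greedy online filter:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Picture:
    pic_id: int
    cell: int        # hypothetical discretized location cell
    direction: str   # hypothetical coarse shooting direction, e.g. "N", "E"

def select_stream(pictures, required_cells, required_directions):
    """Greedy online selection (illustrative only, not the paper's PTree):
    keep an arriving picture only if it covers a still-uncovered
    (cell, direction) unit required by the task; drop redundant uploads."""
    required = {(c, d) for c in required_cells for d in required_directions}
    covered = set()
    selected = []
    for pic in pictures:  # pictures arrive one by one as a stream
        unit = (pic.cell, pic.direction)
        if unit in required and unit not in covered:
            covered.add(unit)
            selected.append(pic)
    # Return the chosen subset and the achieved coverage ratio.
    return selected, len(covered) / len(required)

# Example: a 3-picture stream against a task requiring cells {0, 1}
# photographed from directions {"N", "E"} (4 coverage units in total).
stream = [Picture(1, 0, "N"), Picture(2, 0, "N"), Picture(3, 1, "E")]
subset, coverage = select_stream(stream, {0, 1}, {"N", "E"})
```

Here the second picture duplicates an already-covered unit and is discarded, which is the "low redundancy ratio" behavior the abstract refers to; the real PTree model additionally organizes constraints hierarchically to make this selection efficient.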
Original language: English
Pages (from-to): 284 - 296
Number of pages: 12
Journal: IEEE Internet of Things Journal
Volume: 4
Issue number: 1
Early online date: 5 Jan 2017
Publication status: Published (in print/issue) - Feb 2017
