An investigation of using mobile and situated crowdsourcing to collect annotated travel activity data in real-world settings. Issue 102 (June 2017)
- Record Type:
- Journal Article
- Title:
- An investigation of using mobile and situated crowdsourcing to collect annotated travel activity data in real-world settings. Issue 102 (June 2017)
- Main Title:
- An investigation of using mobile and situated crowdsourcing to collect annotated travel activity data in real-world settings
- Authors:
- Chang, Yung-Ju
Paruthi, Gaurav
Wu, Hsin-Ying
Lin, Hsin-Yu
Newman, Mark W.
- Abstract:
- Collecting annotated activity data is vital to many forms of context-aware system development. Leveraging a crowd of smartphone users to collect annotated activity data in the wild is a promising direction because the data being collected are realistic and diverse. However, current research lacks a systematic analysis comparing different approaches for collecting such data and investigating how users use these approaches to collect activity data in real-world settings. In this paper, we report results from a field study investigating the use of mobile crowdsourcing to collect annotated travel activity data through three approaches: Participatory, Context-Triggered In Situ, and Context-Triggered Post Hoc. In particular, we conducted two phases of analysis. In Phase One, we analyzed and compared the resulting data collected via the three approaches and the user experience. In Phase Two, we analyzed users' recording and annotation behavior, as well as the annotation content, in using each approach in the field. Our results suggested that although the Context-Triggered approaches produced a larger number of recordings, they did not necessarily lead to a larger quantity of data than the Participatory approach, because many of the recordings were unlabeled, incomplete, and/or fragmented due to imperfect context detection. In addition, recordings collected by the Participatory approach tended to be more complete and to contain less noise. In terms of user experience, while users appreciated automated recording and reminders for their convenience, they highly valued the control over what and when to record and annotate that the Participatory approach provided. Finally, we showed that activity type (Driver, Riding as Passenger, Walking) influenced users' behavior in recording and annotating their activity data.
It influenced the timing of recording and annotating using the Participatory approach, users' receptivity using the Context-Triggered In Situ approach, and the characteristics of the content of annotations. Based on these findings, we provide design and methodological recommendations for future work that aims to leverage mobile crowdsourcing to collect annotated activity data.
Highlights: The Participatory approach produced high-quality annotated activity data.
User burden and control are two crucial aspects for sustaining user compliance.
Activity affects recording and annotation timing and the characteristics of annotations.
Activity affects users' receptivity when using the Context-Triggered approach.
We offer suggestions on the approaches, tools, and instructions to collect activity data.
- Is Part Of:
- International journal of human-computer studies. Issue 102(2017)
- Journal:
- International journal of human-computer studies
- Issue:
- Issue 102(2017)
- Issue Display:
- Volume 102, Issue 102 (2017)
- Year:
- 2017
- Volume:
- 102
- Issue:
- 102
- Issue Sort Value:
- 2017-0102-0102-0000
- Page Start:
- 81
- Page End:
- 102
- Publication Date:
- 2017-06
- Subjects:
- Mobile crowdsourcing -- Crowdsensing -- Wearable camera -- Annotated activity data collection -- Travel activity
Human-machine systems -- Periodicals
Systems engineering -- Periodicals
Human engineering -- Periodicals
Human engineering
Human-machine systems
Systems engineering
Periodicals
Electronic journals
004.019
- Journal URLs:
- http://www.sciencedirect.com/science/journal/10715819
http://www.elsevier.com/journals
- DOI:
- 10.1016/j.ijhcs.2016.11.001
- Languages:
- English
- ISSNs:
- 1071-5819
- Deposit Type:
- Legal deposit
- View Content:
- Available online (eLD content is only available in our Reading Rooms)
- Physical Locations:
- British Library DSC - 4542.288100
British Library DSC - BLDSS-3PM
British Library HMNTS - ELD Digital store
- Ingest File:
- 2369.xml