Abstract:
Many software users give feedback online about the applications they use, on channels such
as app stores, product forums, and social media. However, not all software users give online
feedback. If the demographics of a user-base aren’t fairly represented in this feedback, there
is a danger that the needs of less vocal users won’t be considered in development.
In this work, we directly survey software users about their feedback habits, as well as
what motivates and dissuades them from providing online feedback. We identify significant differences in the demographics of users who give feedback, and key differences in
what motivates their engagement. Our findings give meaningful context for requirements
sourced from online feedback, identifying demographic groups who are underrepresented,
and suggesting approaches to elicit more representative feedback.
An implication of our user study is that, to obtain the most representative requirements
information from a user-base, a range of feedback channels should be mined. However,
while much recent research has investigated how to automatically extract requirements
insights from app stores and Twitter, product forums have received limited attention.
To address this gap, we perform two empirical studies of product forums, showing that:
1) forum feedback is a rich source of requirements information that can be leveraged by
development teams; and 2) requirements information in product forums is often manually
identified and matched with related issue tracker and product documentation entries. We
propose automatic analysis techniques to help identify requirements in forum feedback,
and then match requirements between platforms, to elicit the most salient information for
development teams.