Katia Passerini, Ph.D., Seton Hall provost and executive vice president, co-authored a study published in Computers in Human Behavior, a scholarly journal [impact factor 6.829 (Clarivate Analytics, 2019)] dedicated to examining the use of computers from a psychological perspective, with a focus on socio-technical issues and interactions between humans and technology.
The study, titled "Diminishing Returns of Information Quality: Untangling the Determinants of Best Answer Selection," was co-authored with colleagues Dr. Babajide Osatuyi, assistant professor of management information systems at Penn State University, and Dr. Ofir Turel, professor of information systems and decision sciences at California State University.
“I am delighted that this article has finally reached the online press. I owe this to the persistence of my co-authors,” noted Dr. Passerini. “We worked to support each other as sound thinking and execution partners through the various revisions.”
Passerini earned both an M.B.A. and a Ph.D. in management information systems from George Washington University and a master's degree in economics (equivalent) from the University of Rome II-Tor Vergata. While serving as Seton Hall provost, she maintains her commitment to service to the academy by volunteering as co-managing editor of the Journal of Small Business Management, a scholarly journal in entrepreneurship with a 5-year impact factor of 6.799 (2020). She has published over 100 peer-reviewed journal and proceedings articles and has received numerous teaching, research and service recognitions.
A summary of the study is below:
When using social media platforms to find the best answers to questions, have you ever wondered what makes some answers the best answers? Do people select answers because of the quality of the reply or because of the reputation of the person providing the reply (i.e., a social media influencer)? Or both?
As scholars in information systems (IS) increasingly call for studies that examine actual use behaviors rather than behavioral intentions, this research focuses on understanding how a question and answer (Q&A) knowledge repository can be used to foster knowledge exchange and use. The study extends beyond traditional behavioral intentions analyses, which are often based on self-reported preferences signaling an intended future behavior, by measuring the extent to which answers provided by knowledge contributors are used by knowledge seekers and are subsequently rated as useful in addressing a seeker's problem in an online community.
Actual knowledge use is based on the interplay between the quality of the information exchanged in the community (i.e., the quality of questions and answers) and the level of expertise of the community members (seekers and contributors). StackOverflow.com, the platform used in this study to collect data, offers an ideal setting for investigating actual use: participants in the knowledge exchange must use the answer (i.e., snippets of programming code) to validate its correctness and rate its ability to best solve a coding problem. The platform uses gamification to create incentives for sustained participation in the repository by assigning points to programmers who contribute answers, and ask new questions, within the community. Such participation allows community members to accumulate points, to be ranked on a "leaderboard," and to provide feedback on the quality of the answers in the repository, including selecting the best answer to a known coding problem.
The study proposes and validates a non-linear model of best answer selection, providing empirical support for the hypothesized relationships. Multiple robustness tests confirm the validity of the results and move us forward in understanding the complex relationship between knowledge availability and the dynamic factors that determine its actual use. We find that the relationship between information quality and its use exhibits diminishing returns, a finding that has significant implications for information systems development and management strategy. For example, this non-linear relation suggests that as an employee's reputation (i.e., expertise or experience) in a company increases, the perceived quality of the employee's answers may be inflated (the so-called Matthew effect) and may reach a maximum utility point.
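The general shape of such a diminishing-returns relationship can be sketched with a simple concave (inverted-U) curve. The function and coefficients below are purely illustrative assumptions, not the study's fitted model or estimates:

```python
# Illustrative sketch (not the study's fitted model): a concave,
# inverted-U relationship between a contributor's reputation and a
# score for how likely their answer is to be selected as best.
# Coefficients b1 and b2 are made-up values for demonstration only.

def selection_score(reputation: float, b1: float = 0.8, b2: float = -0.04) -> float:
    """Quadratic (concave) score: gains flatten and then decline."""
    return b1 * reputation + b2 * reputation ** 2

# The marginal benefit of extra reputation shrinks as reputation grows:
early_gain = selection_score(2) - selection_score(1)    # gain at low reputation
late_gain = selection_score(11) - selection_score(10)   # gain at high reputation
print(early_gain > late_gain)  # prints True: early gains exceed later ones
```

With these illustrative coefficients the score peaks at reputation = -b1 / (2 * b2) = 10; beyond that point additional reputation no longer improves the score, mirroring the "maximum utility point" described above.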
Beyond a certain point, the higher reputation of a contributor might fail to improve the selection of his/her answer, and a novice's answer might be selected as the best answer, either because newer solutions are more relevant or because they are more accurate. Hence, enterprise knowledge exchange repositories should enable and encourage broader participation and sustained contribution of solutions to known organizational problems from both novices and experts. This will help to minimize diminishing returns in information quality while continuously expanding the knowledge sources and contributors to the knowledge base.
In highly dynamic environments, such as programming fields, seniority and reputation offer a significant initial advantage, until they don’t anymore. In the end, it is the answer that best solves the problem that will win.