For Me Page: User-Centric Content Curation
© 2024 by IJCTT Journal
Volume-72 Issue-1
Year of Publication : 2024
Authors : Saurav Bhattacharya
DOI : 10.14445/22312803/IJCTT-V72I1P104
How to Cite?
Saurav Bhattacharya, "For Me Page: User-Centric Content Curation," International Journal of Computer Trends and Technology, vol. 72, no. 1, pp. 19-26, 2024. Crossref, https://doi.org/10.14445/22312803/IJCTT-V72I1P104
Abstract
This article examines the emerging paradigm of user-customizable algorithms in digital platforms, advocating for a shift towards user empowerment in content curation. By analyzing the limitations of current algorithmic curation, including echo chambers, misinformation proliferation, and lack of transparency, it introduces the concept of "write your own algo" as a solution. The paper reviews the literature on the discontents of algorithmic personalization, user empowerment, and bot detection, highlighting the need for more transparent and user-driven approaches. It presents a theoretical framework integrating algorithmic transparency, participatory design, user autonomy, and information ecology. The article proposes that user involvement in algorithm customization leads to increased satisfaction, engagement, trust, and content diversity while improving bot detection and content integrity. It discusses potential challenges such as cognitive overload and the technical complexities of user involvement. The conclusion emphasizes the importance of ethical, user-centric system design in digital platforms and calls for future research to empirically test and refine the propositions made.
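The paper presents "write your own algo" conceptually and includes no implementation; the sketch below is a purely illustrative, hypothetical Python example of the idea (the names Post, my_algo, curate, and prefs are assumptions, not the paper's API). The user supplies an explicit scoring rule and preference weights, the platform merely applies the rule and ranks by it, and the per-item score stays visible so the resulting ordering remains explainable.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical content item; the fields are illustrative assumptions, not the paper's data model.
@dataclass
class Post:
    author: str
    topic: str
    recency_hours: float
    source_verified: bool

# A user-written scoring rule: the user decides what matters and by how much.
def my_algo(post: Post, prefs: dict) -> float:
    score = prefs.get(post.topic, 0.0)                 # topic interest set by the user
    score += 1.0 / (1.0 + post.recency_hours)          # mild, user-visible recency boost
    if not post.source_verified:
        score -= prefs.get("unverified_penalty", 0.5)  # user-tunable penalty for unverified sources
    return score

def curate(feed: list[Post], algo: Callable[[Post, dict], float], prefs: dict) -> list[tuple[Post, float]]:
    """Rank the feed with the user's own algorithm, returning scores alongside posts
    so the ordering stays transparent and explainable."""
    scored = [(post, algo(post, prefs)) for post in feed]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

if __name__ == "__main__":
    feed = [
        Post("alice", "climate", recency_hours=2, source_verified=True),
        Post("bot_42", "climate", recency_hours=1, source_verified=False),
        Post("bob", "sports", recency_hours=5, source_verified=True),
    ]
    prefs = {"climate": 2.0, "sports": 0.5, "unverified_penalty": 1.5}
    for post, score in curate(feed, my_algo, prefs):
        print(f"{post.author:<8} {post.topic:<8} score={score:.2f}")
```

Keeping the user's rule as ordinary, inspectable code is one way to operationalize the transparency and autonomy themes raised in the abstract; the user-set penalty for unverified sources likewise hints at how customizable rules could complement bot detection.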
Keywords
User-Customizable Algorithms, Content Curation, Algorithmic Transparency, User Engagement, Ethical Technology.
Reference
[1] ACRL-Association of College & Research Libraries, Framework for Information Literacy for Higher Education, 2016. [Online]. Available: https://www.ala.org/acrl/standards/ilframework
[2] Eytan Bakshy, Solomon Messing, and Lada A. Adamic, “Exposure to Ideologically Diverse News and Opinion on Facebook,” Science, vol. 348, no. 6239, pp. 1130-1132, 2015.
[3] Javier A. Bargas-Avila, and Kasper Hornbæk, “Old Wine in New Bottles or Novel Challenges: A Critical Analysis of Empirical Studies of User Experience,” Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2689-2698, 2011.
[4] Fred D. Davis, “Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology,” MIS Quarterly, vol. 13, no. 3, pp. 319-340, 1989.
[5] Edward L. Deci, and Richard M. Ryan, Intrinsic Motivation and Self-Determination in Human Behavior, Plenum, 1985.
[6] Nicholas Diakopoulos, “Algorithmic Accountability: Journalistic Investigation of Computational Power Structures,” Digital Journalism, vol. 3, no. 3, pp. 398-415, 2015.
[7] Martin J. Eppler, and Jeanne Mengis, “The Concept of Information Overload: A Review of Literature from Organization Science, Accounting, Marketing, MIS, and Related Disciplines,” The Information Society: An International Journal, vol. 20, no. 5, pp. 271-305, 2004.
[8] Emilio Ferrara et al., “The Rise of Social Bots,” Communications of the ACM, vol. 59, no. 7, pp. 96-104, 2016.
[9] Gerhard Fischer, and Thomas Herrmann, “Socio-Technical Systems: A Meta-Design Perspective,” International Journal of Sociotechnology and Knowledge Development, vol. 3, no. 1, pp. 1-33, 2011.
[10] Tarleton Gillespie, Pablo J. Boczkowski, and Kirsten A. Foot, “The Relevance of Algorithms,” Media Technologies: Essays on Communication, Materiality, and Society, MIT Press, pp. 167-194, 2014.
[11] Natali Helberger, Kari Karppinen, and Lucia D’Acunto, “Exposure Diversity as a Design Principle for Recommender Systems,” Information, Communication & Society, vol. 21, no. 2, pp. 191-207, 2018.
[12] Matthew K.O. Lee, and Efraim Turban, “A Trust Model for Consumer Internet Shopping,” International Journal of Electronic Commerce, vol. 6, no. 1, pp. 75-91, 2001.
[13] Kristina Lerman, Xiaoran Yan, and Xin-Zeng Wu, “The ‘Majority Illusion’ in Social Networks,” PLOS ONE, vol. 11, no. 2, 2016.
[14] Roger C. Mayer, James H. Davis, and F. David Schoorman, “An Integrative Model of Organizational Trust,” Academy of Management Review, vol. 20, no. 3, pp. 709-734, 1995.
[15] D. Harrison McKnight et al., “Trust in a Specific Technology: An Investigation of Its Components and Measures,” ACM Transactions on Management Information Systems, vol. 2, no. 2, pp. 1-25, 2011.
[16] Miriam J. Metzger, and Andrew J. Flanagin, “Credibility and Trust of Information in Online Environments: The Use of Cognitive Heuristics,” Journal of Pragmatics, vol. 59, pp. 210-220, 2013.
[17] Bonnie A. Nardi, and Vicki O’Day, Information Ecologies: Using Technology with Heart, MIT Press, 1999.
[18] Heather L. O’Brien, and Elaine G. Toms, “What is User Engagement? A Conceptual Framework for Defining User Engagement with Technology,” Journal of the American Society for Information Science and Technology, vol. 59, no. 6, pp. 938-955, 2008.
[19] Eli Pariser, The Filter Bubble: What the Internet is Hiding from You, Penguin Press, 2011.
[20] Frank Pasquale, The Black Box Society: The Secret Algorithms That Control Money and Information, Harvard University Press, 2015.
[21] Emilee Rader, Kelley Cotter, and Janghee Cho, “Explanations as Mechanisms for Supporting Algorithmic Transparency,” Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, pp. 1-13, 2018.
[22] Brad R. Rawlins, “Measuring the Relationship between Organizational Transparency and Employee Trust,” Public Relations Journal, vol. 2, no. 2, pp. 1-21, 2008.
[23] Richard M. Ryan, and Edward L. Deci, “Self-Determination Theory and the Facilitation of Intrinsic Motivation, Social Development, and Well-Being,” American Psychologist, vol. 55, no. 1, pp. 68-78, 2000.
[24] Douglas Schuler, and Aki Namioka, Participatory Design: Principles and Practices, Lawrence Erlbaum Associates, 1993.
[25] Jesper Simonsen, and Toni Robertson, Routledge International Handbook of Participatory Design, Routledge, 2013.
[26] Cass R. Sunstein, #Republic: Divided Democracy in the Age of Social Media, Princeton University Press, 2017.
[27] Matteo Turilli, and Luciano Floridi, “The Ethics of Information Transparency,” Ethics and Information Technology, vol. 11, no. 2, pp. 105-112, 2009.
[28] Sam Wineburg, and Sarah McGrew, “Lateral Reading: Reading Less and Learning More When Evaluating Digital Information,” SSRN, 2017.
[29] Samuel C. Woolley, and Philip N. Howard, “Political Communication, Computational Propaganda, and Autonomous Agents,” International Journal of Communication, vol. 10, pp. 4882-4890, 2016.
[30] Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power, PublicAffairs, 2019.