Question:
What, if any, duty of care do online platforms owe to their content moderators? Should it make a difference if someone is an employee or a freelance ‘crowdsourced’ worker?
From an HRM perspective, what actions could be taken to protect the psychological safety and well-being of content moderators?

The Internet facilitates faster communication, greater collaboration, the generation and distribution of knowledge, and education and self-improvement, to name just a few of the many positive outcomes associated with its development. Nonetheless, we would be remiss not to acknowledge the darker side of the Internet. This is often associated with the
‘dark’ or ‘invisible’ web, where a range of nefarious activities take place. However, mainstream search engines, such as Google, as well as popular platforms including Facebook, Instagram and YouTube, must also attempt to identify, evaluate and, where necessary, remove inappropriate content on the ‘visible’ web. This poses a significant challenge, both pragmatically and philosophically.
Step by Step Answer: