Digital platforms today play an undeniably influential role in inspiring creativity, enabling freedom of expression, and building strong communities. While the internet gives us many opportunities to exchange ideas freely and connect with others, the principle of freedom of expression is under intense scrutiny as platforms work to remain inclusive and safe spaces for their users.
The industry faces an increasing responsibility to ensure that the right voices and content are spread and heard, and it’s not a responsibility that should fall on content platforms alone.
Leading digital platforms have begun creating third-party councils to develop forward-looking policies that not only address today’s challenges but also anticipate the next set of issues the industry will face.
TikTok, for instance, has created an APAC Safety Advisory Council, on which I sit alongside other leading legal, regulatory, and academic experts, to advise on content moderation policies and on trust and safety issues specific to the APAC region. The Council provides subject matter expertise on TikTok’s content moderation policies and practices to help shape regional and global guidelines.
Policies related to free speech and censorship
While today’s leading digital platforms take different approaches to democratising content, allowing it to be developed, shared, and consumed more easily, not all online content is appropriate or safe. For this reason, platforms must establish clear community guidelines and create forward-looking policies that mitigate the spread of harmful content.
Most platforms agree that dangerous individuals and organisations should not be allowed to spread hateful ideologies or promote illegal activities, and that violent and graphic content, content related to self-harm and dangerous acts, hate speech, harassment, and sexually explicit or misleading content have no place on their services. However, addressing these existing and emerging issues can be difficult, as platforms face scrutiny over their moderation guidelines.
To provide more transparency into how they keep users safe through moderation practices, platforms such as TikTok have begun publishing Transparency Reports that offer insight into how they respond responsibly to data requests and protect intellectual property. The Council’s mission going forward is to help shape TikTok’s approach to policies that protect the safety of its community members across the APAC region, while maintaining full transparency with its users.
As a diverse group of legal, regulatory, and academic experts, we believe one of the best ways a platform can keep its users safe is by empowering the community with tools and education.
Policies related to online safety
The most important commitment the industry faces is to keep its community members safe. This is a challenging but critically important area for the industry to get right, and platforms should look to approach the protection and safety of their users through policies, product, people, and partners.
From a policy perspective, platforms should be steadfast in their commitment to immediately remove content, terminate accounts, and report harmful cases to law enforcement as appropriate. They should also build strong safety controls, invest heavily in both human and machine-based moderation tools, and work with third parties to identify and remove hateful content.
As external Council members, our primary focus is to identify and address challenges related to child safety, digital literacy, mental health, and human rights. We are a diverse group of experts with backgrounds in IT, digital safety and literacy, intellectual property and internet law, and advocacy for children, women, and other marginalised groups, all committed to addressing these challenges.
A community effort to make digital platforms a safe space for all
If we put the onus exclusively on platforms to keep communities safe, we will fail. Policymakers, regulators, platforms, and their users all have a stake in making digital platforms a safe space for everyone. Though we come from different cultural and professional backgrounds and may hold differing opinions on how to keep the community safe, we will work together to spot gaps in content moderation policies and advise on the best path forward.
The road ahead will not be an easy one, but it will be worthwhile as we work together to tackle industry-wide issues. Each member of our Council has long been committed to serving the online community in their own individual capacity.
Now, we’re looking forward to uniting in this endeavour to make the internet safer for users all across the APAC region, taking diverse cultural, religious, and other social nuances into account.