The Home Office, along with the Department for Digital, Culture, Media and Sport (DCMS), has announced that the government will shortly introduce new online safety laws and stringent new standards for social media firms to safeguard the online safety and privacy of citizens.
The new online safety laws will impose heavy fines on social media firms and other online platforms if they fail to abide by their “duty of care” to keep their users safe and to tackle illegal and harmful activity on their services. Aside from imposing fines, regulators will also be empowered to block access to non-compliant sites and platforms and to impose liability on individual members of senior management.
“For too long these companies have not done enough to protect users, especially children and young people, from harmful content. That is not good enough, and it is time to do things differently.
“We have listened to campaigners and parents, and are putting a legal duty of care on internet companies to keep people safe. Online companies must start taking responsibility for their platforms, and help restore public trust in this technology,” said Prime Minister Theresa May.
“The tech giants and social media companies have a moral duty to protect the young people they profit from. Despite our repeated calls to action, harmful and illegal content – including child abuse and terrorism – is still too readily available online.
“That is why we are forcing these firms to clean up their act once and for all. I made it my mission to protect our young people – and we are now delivering on that promise,” said Home Secretary Sajid Javid.
New independent regulator to enforce online safety laws
The new online safety laws will apply to companies of all sizes, including social media platforms, file hosting sites, public discussion forums, messaging services, and search engines. Basically, any company that allows users to share or discover user-generated content or interact with each other online will come within the ambit of the new online safety laws.
The government has also published the Online Harms White Paper which, among other things, advocates the setting up of a new regulator to ensure companies meet their responsibilities. The regulator will have the power to force social media platforms and others to publish annual transparency reports on the amount of harmful content on their platforms and what they are doing to address this.
The regulator will also be able to hold online platforms and social media firms answerable for the use of their platforms for online harms such as incitement to violence, violent content, encouragement of suicide, disinformation, cyberbullying, and children accessing inappropriate material.
Commenting on the government’s new Online Harms White Paper, Information Commissioner Elizabeth Denham said that the white paper proposals reflect people’s growing mistrust of social media and online services.
“People want to use these services, they appreciate the value of them, but they’re increasingly questioning how much control they have of what they see, and how their information is used. That relationship needs repairing, and regulation can help that. If we get this right, we can protect people online while embracing the opportunities of digital innovation.
“While this important debate unfolds, we will continue to take action. We have powers, provided under data protection law, to act decisively where people’s information is being misused online, and we have specific powers to ensure firms are accountable to the people whose data they use.
“We’ve already taken action against online services, we acted when people’s data was misused in relation to political campaigning, and we will be consulting shortly on a statutory code to protect children online. We see the current focus on online harms as complementary to our work, and look forward to participating in discussions regarding the White Paper,” she added.