
Navigating the Digital Divide: Bill C-63 and the Evolving Tapestry of Online Liberty

Sarah Bensetiti

Secretary


Via News Canada


On February 26, 2024, Bill C-63, otherwise known as the Online Harms Act, began its legislative journey in the House of Commons of Canada. The bill marks the start of an effort to build a regulatory framework tailored to the digital domain, particularly social media platforms and associated services. Currently at second reading, it has sparked a flurry of discussion, with various stakeholders expressing concerns about its eventual implementation.


The primary objective of Bill C-63, as it stands, is to address the proliferation of harmful content on online platforms, particularly within the realm of social media. It seeks to establish mechanisms to hold both platforms and users accountable for the content shared, with the overarching goal of creating a safer online environment. The bill targets categories of harmful content such as hate speech, content that incites violence or terrorism, content that sexually victimizes children, and non-consensual intimate imagery, with a focus on protecting vulnerable groups.


At its core, the bill aims to implement stringent content moderation measures, requiring social media platforms to actively monitor and remove content that incites violence or victimizes users. It pays special attention to protecting minors and survivors of sexual trauma, including by addressing non-consensual, digitally altered intimate imagery such as deepfakes. The legislation also emphasizes the responsibility of social media operators to safeguard users and limit the spread of harmful content. To that end, companies will be required to provide users with reporting tools and to adopt effective measures to mitigate the presence of harmful content.


Moreover, Bill C-63 provides for these safety measures to be scrutinized by the Digital Safety Commission of Canada, a regulatory entity charged with supervising their implementation. The legislation mandates the establishment of three new regulatory bodies: the Digital Safety Commission of Canada, tasked with monitoring the accountability and transparency of social media operators; the Digital Safety Ombudsperson, dedicated to addressing systemic concerns about online content moderation and advocating for the public interest in online safety by serving as a resource for users; and the Digital Safety Office of Canada, designed to support the other two bodies in fulfilling their mandates. The Digital Safety Commission is empowered to receive and manage user complaints about a platform's failure to meet its duties, to order the removal of certain categories of harmful content, and to develop new safety standards and educational resources to curb the spread of harmful content online. Fundamentally, the bill represents a significant stride toward greater accountability and transparency in the digital realm, with the ultimate goal of cultivating a safer online environment for all users.


The Bill also redefines hate crime and hatred to clarify who can be held liable for such offences online. It stipulates that any offence under the Criminal Code qualifies as a hate crime when motivated by hatred based on race, national or ethnic origin, language, colour, religion, sex, age, mental or physical disability, sexual orientation, or gender identity or expression. Depending on the severity, such offences could carry penalties of up to life imprisonment and could be subject to international prosecution and charges. The Bill also raises the penalties for hate propaganda offences.


Furthermore, the Bill addresses the enforcement of existing laws on crimes involving the sexual exploitation of children by conferring new regulatory powers to curtail the proliferation of such content online. It also establishes a regime of administrative monetary penalties to be levied on social media operators that fail to meet their obligations. Depending on the nature and magnitude of the violation and the operator's history of compliance with the legislation, penalties may reach 6% of the gross global revenue of the offending party or $10 million, whichever is greater.


Although the Bill is still at an early stage, the government has accorded it significant importance, signalling its probable enactment into law by year's end. However, despite its stated aim of enhancing online user protection, the bill has met with a lukewarm reception from the Canadian public.


The Canadian Constitution Foundation has underscored what it sees as the Bill's deficiencies and injustices, asserting that individuals could face life imprisonment merely for expressing words online, and that preemptive measures, such as ankle monitors, could be imposed on individuals deemed likely to commit hate crimes in the future on the basis of "reasonable grounds" that may in practice be anything but reasonable. Moreover, individuals could be held accountable and penalized for online content flagged by others as harmful. The Foundation contends that these measures amount to censorship and encroach upon freedom of expression because of their vagueness and draconian penalties.

In summary, Bill C-63 aims to enhance online safety through new regulatory bodies and stricter penalties for hate crimes and child exploitation, but concerns over potential censorship and vague provisions have sparked debate. As the bill progresses, striking a balance between security and freedom of expression remains crucial, and Canadians will need to watch its implementation closely.
