Attorney General Todd Rokita joined a bipartisan coalition of 52 attorneys general expressing strong support for the hearings being conducted by the U.S. Senate Committee addressing the protection and safety of kids and teens using social media.
The negative impact of social media on youth has caught the attention of attorneys general across the country. Concerns have grown following recent revelations from Facebook’s own internal research showing that social media is inflicting harm—in the form of increased mental distress, bullying, suicide, and other self-harm—on a significant number of kids.
“A top priority of mine is to protect Hoosiers from harm, and that includes mental distress,” said Attorney General Todd Rokita. “All of us need to work together to put a stop to Facebook using algorithms which exploit younger audiences, leading to increased suicidal ideation, self-harm, and bullying. I am not alone in believing the well-being of our youth is at stake,” Attorney General Rokita continued. “This issue impacts our children and future generational leaders.”
The letter, which will be entered into the Congressional record, recognizes that the hearings will uncover critical information about the business practices social media companies are using to gain the attention of more young people on their platforms. Last week, in advance of the Congressional hearings, Facebook announced its intent to “pause” the project. The attorneys general believe the project should be abandoned altogether.
The attorneys general write: “More engagement by the user equals more data to leverage for advertising, which equals greater profit. This prompts social media companies to design their algorithms and other features to psychologically manipulate young users into a state of addiction to their cell phone screens.”
In April, Attorney General Todd Rokita announced he was investigating whether Facebook, along with four other Big Tech companies, has potentially harmed Indiana consumers through business practices that are abusive, deceptive, and/or unfair.