Sextortion against children on Facebook and Instagram: Meta rolls out new tools

September 27, 2023
Meta unveils privacy and messaging changes for teenagers

Meta has introduced new prevention tools after it emerged that incidents of online child sexual exploitation, including sextortion, have risen by 265 percent in recent years. The company plans to enable these safety tools by default on Facebook for younger users (under 18 in some countries, under 16 in others).

Meta is encouraging teen Facebook users to turn on new privacy settings covering who can see their friends list, who can view posts they are tagged in, who can comment on their public posts, and the pages and content they follow. Instagram, also owned by Meta, introduced similar changes about a year ago. The new tools are aimed at protecting young users from the ploys of predatory adults.

∙ The data is shocking

According to data from America’s National Center for Missing and Exploited Children, the number of children who were victims of online exploitation rose from 12,070 in 2018 to 44,155 last year, an increase of about 265 percent. The data was compiled in collaboration with public-sector organizations and private internet service providers.

∙ What is sextortion?

The FBI defines sextortion as a crime in which someone demands nude photos or videos, sexual favors, or money in exchange for not sharing your private images or other personal information, and it treats this as a serious offense. Meta already restricts unknown adults from messaging teenagers and tries not to surface teenagers’ accounts to such adults in the People You May Know feature. The upcoming tools go further: they aim to block messaging between teens and adults who show suspicious behavior, and those adults will no longer appear in teenagers’ People You May Know suggestions.


∙ Who counts as a suspicious adult?

Any adult account that a child has reported will be treated as suspicious, and adults who have been blocked by children fall into the same category. As a further line of defense, the company is also testing removing the message button on teenagers’ Instagram accounts when those accounts are viewed by suspicious adults.

∙ Efforts to make Facebook usage more comfortable

Facebook has started prompting teenagers to report accounts that make them uncomfortable, which the company believes will make the platform safer and more comfortable for children to use. It is also working to make its reporting tools easier to find. Facebook considers this an early success: in the first quarter of 2022, such reports by minors on Messenger and Instagram rose by 70 percent compared with the same period the previous year, according to the company.

When a teen blocks someone, Meta immediately sends them child-safety notices. According to the company, such notices reached 100 million people on Messenger in a single month last year. Meta has also built a range of tools to help children flag disturbing content on Facebook. Antigone Davis, Meta’s global head of safety, says the company has taken several steps to encourage users to report such issues, and Facebook credits these steps for the 70 percent rise in reports from children.


∙ Parents should know what sextortion is

Meta is taking many safety measures, but the company says parents and guardians should also understand what constitutes sextortion and be able to discuss it openly with their teenage children, so that children feel they can raise the subject without fear of upsetting anyone.

∙ Experts say Horizon Worlds is a bigger worry

Meta’s apps are popular, but cybersecurity experts say the situation in the company’s virtual reality app, Horizon Worlds, is more troubling. Horizon Worlds is meant for users aged 18 and over, yet its reviews make one thing clear: children who get into the app end up interacting with adults there. Sarah Gardner, a vice president at Thorn, an organization that works to protect children online, told the Washington Post that when apps like Horizon Worlds launch, children may be among the first to jump on board.

Predators will try to take advantage of children entering an environment with little protection, and the lack of an adequate reporting system in such apps works in their favor. They will see it as a good place to groom children. Grooming is the process of building trust with a child in order to manipulate and exploit them.

∙ Meta was fined 400 million dollars

In September 2022, Ireland’s data protection regulator fined Meta roughly $400 million for failing to protect children’s data on Instagram. The case related in part to children’s accounts being set to public by default.