In a bid to “limit abuse,” the Facebook-owned messaging app WhatsApp introduced a new security feature Wednesday allowing users to control who is able to add them to groups.
In a statement on its website, the company said the feature would be rolled out worldwide in the coming weeks for users running the latest version of the app. Rumors can spread quickly when users find themselves added to group chats without their express consent.
The group chat restrictions are not the only move the company has made to fight misinformation in recent days. In India, where national elections begin on April 11, the company just launched a fact-checking service.
The “checkpoint tipline” lets users submit messages, photos and videos to have their veracity checked, according to the BBC.
In India, home to more than 200 million WhatsApp users, social media has been used to spread false information and inflame sectarian tensions, sometimes with deadly results. With elections looming, political candidates have created hundreds of thousands of WhatsApp group chats to spread political messages and memes.
The latest changes are not the first steps WhatsApp has taken to fight misuse of its platform. After rumors circulating in India’s Assam state led to mob violence in June, WhatsApp introduced forwarding limits, which cap the number of chats to which a user can forward a single message.
Other social media platforms have taken similar measures. Facebook recently announced that it would work to reduce the spread of false information on its platform, blocking fake accounts and partnering with outside fact-checking organizations to help moderate content.