WhatsApp has blocked more than 130,000 accounts in just ten days for sharing inappropriate content, such as child pornography, in private groups. Facebook-owned WhatsApp removed these accounts using AI-based tools, which flagged them as likely being involved in illegal activity.
Additionally, WhatsApp has said it shares whatever information it can with those investigating child pornography.
Measures taken by WhatsApp
As we all know, messages on WhatsApp are end-to-end encrypted, which means WhatsApp itself cannot read what users share. But with the help of AI-based tools, WhatsApp can examine unencrypted information, such as profile photos and group information, to flag potential abusers. The company also uses a technique called PhotoDNA, which Facebook has used to identify pornographic and abusive images. Now WhatsApp is using the same technique to identify inappropriate images shared by users.
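PhotoDNA itself is proprietary, so the sketch below only illustrates the general idea behind this kind of image matching: reduce an image to a compact "perceptual hash" that survives resizing and minor edits, then compare it against hashes of known abusive images. This is a minimal sketch using a simple average hash with the Pillow library; the function names, the threshold, and the flag_account() handler are illustrative assumptions, not WhatsApp's actual implementation.

```python
from PIL import Image

def average_hash(path, hash_size=8):
    """Compute a simple perceptual (average) hash of an image.

    Illustrative stand-in only; PhotoDNA's actual algorithm
    is proprietary and far more robust.
    """
    # Shrink and grayscale so the hash ignores size and color detail.
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    # Each bit records whether a pixel is brighter than the average.
    bits = [1 if p > avg else 0 for p in pixels]
    return sum(bit << i for i, bit in enumerate(bits))

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance means similar images."""
    return bin(h1 ^ h2).count("1")

# Hypothetical usage: compare an uploaded profile photo against
# hashes of known abusive images.
# if hamming_distance(average_hash("upload.jpg"), known_hash) < 5:
#     flag_account()  # hypothetical handler
```

Because profile photos and group information are not end-to-end encrypted, checks like this can run server-side without breaking the encryption of the messages themselves.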
WhatsApp said it has a zero-tolerance policy around child sexual abuse, and that it therefore deploys its most advanced technology, including AI, to scan profile photos and actively ban accounts suspected of sharing such content.
WhatsApp also responded to reports that users were finding child pornography through third-party applications that let them search for WhatsApp groups. WhatsApp says it does not offer any such group-search feature itself, and that it is taking measures to limit this abuse and act against it.
That’s all for today. Stay Tuned To TechBurner For More News.
Nikhil
January 14, 2019 at 8:41 pm
If this is true, then good.
Sanu Sinha
January 14, 2019 at 8:47 pm
Yes Nikhil, it is true.