Microsoft's Bing search engine readily displays child pornography images and even suggests related keywords and images, an investigation by TechCrunch found.
What led to this?
Following an anonymous tip, the news outlet teamed up with the online safety startup AntiToxin to investigate Bing's problem from Dec. 30 to Jan. 7. What they found was alarming.
The report comes with this disclaimer:
[WARNING: Do not search for the terms discussed in this article on Bing or elsewhere as you could be committing a crime. AntiToxin is closely supervised by legal counsel and works in conjunction with Israeli authorities to perform this research and properly hand its findings to law enforcement. No illegal imagery is contained in this article, and it has been redacted with red boxes here and inside AntiToxin's report.]
Searches for terms such as "porn kids," "porn CP" (a known abbreviation for "child pornography") and "nude family kids" all surfaced illegal child exploitation imagery, the report states.
AntiToxin also found that Bing caused people to stumble upon the disturbing and illegal images even if they were not searching for them.
One of the searches was "Omegle Kids," referring to an app popular with teenagers. When the term was typed into the search engine, auto-complete suggestions included "Omegle Kids Girls 13," which led to a collection of child pornography, researchers found. If the images were clicked, more images were populated in the "similar images" feature. Additional child abuse pictures popped up under a search for "Omegle for 12 years old." That term led to a suggestion to search for "Kids On Omegle Showing."
Researchers concluded that Bing is actively "assisting pedophiles," more so than Google. The searches also suggest that juveniles are being led to the images.
The report blasts the tech company, stating: "Microsoft Bing must invest more in combating this kind of abuse through both scalable technology solutions and human moderators. There's no excuse for a company like Microsoft, which earned $8.8 billion in profit last quarter, to be underfunding safety measures."
What was Microsoft's response?
A Microsoft spokesperson told the news outlet that the company has fixed the problem and is blocking any similar searches on Bing. A follow-up check by AntiToxin found that while some search terms had been cleaned up, others still led to the illegal content.
"We index everything, as does Google, and we do the best job we can of screening it," the spokesperson said. "We use a combination of PhotoDNA and human moderation but that doesn't get us to perfect every time. We're committed to getting better all the time."
Microsoft did not say whether more human moderators will be employed.
The spokesperson also reportedly said:
"I sort of get the sense that you're saying we totally screwed up here and we've always been bad, and that's clearly not the case in the historic context."

TechCrunch countered that Microsoft did screw up here, and that the fact that it pioneered PhotoDNA, the illegal-imagery detection technology used by other tech companies, doesn't change that.
TechCrunch has also reported on groups that were trading the child exploitation images on WhatsApp, along with third-party Google Play apps that make the groups "easy to find." The reports led to WhatsApp banning groups and their members.
Also, Google removed the WhatsApp group-discovery apps from Google Play, and Google and Facebook blocked the apps from displaying their ads. Facebook agreed to issue refunds to advertisers.
"Speaking as a parent, we should expect responsible technology companies to double, and even triple-down to ensure they are not adding toxicity to an already perilous online environment for children," AntiToxin CEO Zohar Levkovitz told TechCrunch. "And as the CEO of AntiToxin Technologies, I want to make it clear that we will be on the beck and call to help any company that makes this its priority."