The pedophile problem of the Internet companies (Update)
In March 2019, YouTuber Matt Watson hit the headlines when he showed in a YouTube video how pedophiles exploit perfectly legal videos for their own gratification. As a result, several corporations stopped or reduced their advertising on the platform. YouTube itself deleted over 400 channels and millions of comments.
But that doesn’t solve the problem; only part of it has been addressed. Let’s step away from YouTube and look at other services, because access to children’s data is freely available elsewhere as well. After the Watson video, I wanted to know whether there was more. And there is. You need neither the Tor browser nor any other dark corner of the internet.
And to say it in advance: the data in question has usually been published willingly. But the dangers are underestimated.
How do you find pictures that certain groups of people could use for their own desires, or perhaps worse?
That’s easy. You simply open the websites of the major search engines. You all know them: Google, Bing, Startpage or DuckDuckGo. Our favourite search engines. Either we select the Images tab beforehand, or we click on it after running the search. The result is then more than clear: a paradise for people with an interest in small children.
GOOGLE (direct to image search)
DUCKDUCKGO (search engine; after searching, select the Images tab)
BING from Microsoft (direct to image search)
STARTPAGE (direct to image search)
Now let’s play through a relatively harmless example and enter the following words into the search box:
girl small young (German: Mädchen, Klein, Jung)
As a parent, you don’t want to see your own children among the results, because the whole world can see them there.
Of course, the photos come from different categories: models, celebrities and private pictures, all mixed together. The pictures of the people shown are not protected against abuse: explicit poses, sexy outfits, snapshots; catalogue photos from alibaba.com, private websites or social networks. Clicking on a photo also takes you to a website, including family sites and social networks, and thus to real people, with their names, origin, place of residence and perhaps even more data. That’s the internet, also because many users are simply careless with their own data. Limiting the search to other countries brings up new results.
On Google, word combinations that directly describe sexual acts have apparently been blocked. Perhaps they have learned a little from the past. In the test, the results were also more limited than on the other search engines, though no less troubling. Not so with DuckDuckGo, Bing or Startpage: more pictures, more choice. A paradise for inclinations that society rightly refuses to tolerate. Yet many people feed these interests without realising it.
If we’re upset about pedophile comments on YouTube, then we should also be upset about the countless images to which we all have unprotected access. Unless we, as a society, accept that this trove of data exists and only intervene when someone visibly takes pleasure in it (e.g. a comment on YouTube).
We should answer that question.
Anyone who can activate safe search on their firewall or router, via special software, or who has a parental-control filter switched on, can hope that no user on their own network is able to run such searches. So the problem is not unknown.
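The router-level approach mentioned above can be sketched as DNS overrides: the local resolver answers queries for the search engines with their "forced SafeSearch" hostnames. Google documents this mechanism as SafeSearch VIP; treat the Bing and DuckDuckGo hostnames below as assumptions to verify against the providers’ current documentation:

```
# Local DNS resolver: answer queries for the search engines
# with their forced-SafeSearch aliases instead of the normal hosts
www.google.com    CNAME  forcesafesearch.google.com
www.bing.com      CNAME  strict.bing.com
duckduckgo.com    CNAME  safe.duckduckgo.com
```

Clients on the network then receive only filtered results, at least as long as they actually use the network’s DNS server.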
But all users have an obligation not to post private pictures online that can be abused. Commercial websites such as catalogues or shops must decide for themselves whether to expose their images to image search, but it is neither sensible nor necessary. For private individuals it is out of the question either way. Or would you show every person on the street pictures of your family in swimwear? Or even naked? Certainly not.
Website operators can, for example, add the following to the robots.txt file in the root directory of their site:

User-agent: Googlebot-Image
Disallow: /

Google would then no longer include the site’s images in its image search.
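Alternatively, individual pages can ask crawlers to leave their embedded images out of image search with a robots meta tag. A minimal sketch; `noimageindex` is a directive documented by Google, and other search engines may ignore it:

```html
<!-- in the page's <head>: do not index images embedded on this page -->
<meta name="robots" content="noimageindex">
```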
Basically, such pictures do not belong on the internet at all.
The search engines also offer video search.
Other word combinations, ones from which any normal person would recoil in horror, can likewise produce results with which the operators of the search engines may already be breaking the law. And strictly speaking, the image search in these engines may only be used from the age of 18 anyway.