After sustained criticism, Facebook has decided to take concrete measures to better protect children and young people on its Instagram subsidiary. As the social network explains, the main goal is to make it harder for potentially malicious contacts to find underage users. In addition, advertisers will find it more difficult in the future to automatically filter and target groups of young people.
“Where we can, we want to protect young people from being approached by adults they don’t know or don’t want to hear from,” says the official Facebook blog. To that end, a number of changes and new features have been announced for Instagram. They will roll out soon, but initially only in Australia, France, Japan and the United States.
The most significant change concerns the creation of user accounts. If a user is under the age of 16, a special account will automatically be set up for them that is harder for potentially suspicious contacts, as well as advertisers, to find. “We believe that private accounts are the right choice for young people. But we also know that some want to have public accounts in order to build a following. We have to find the right balance,” Instagram confirms.
Among young people and data protection activists, the announced changes to the photo and video sharing app were met with mixed reactions. While some speak of “real improvements”, others criticize that the new features have not been vetted by any official auditing body and that the platform as a whole is still doing too little to protect its underage members from malicious communications and content. (pte/cbe)