The app blocks adults from sending direct messages to teens who don’t follow them, and displays safety notices to teens when they message adults who have shown suspicious behavior.
These safety notices give teen users the option to report or block the adults who message them.
The prompts remind younger users not to feel pressured to respond to messages and to be careful when sharing photos, videos, or information with someone they don’t know.
The notices appear when Instagram’s moderation systems detect suspicious behavior from adult users.
The company does not share details about how these systems work, but says such suspicious behavior could include sending a large number of friend requests or messages to people under the age of 18.
Instagram, which is owned by Facebook, says the feature will be available in some countries this month and will roll out globally soon.
The photo-sharing platform said it is also developing artificial intelligence and machine learning technology to try to determine a person’s age when they sign up for an account.
The app officially requires users to be 13 or older, but it is easy for users to lie about their age.
The company said it wants to do more to prevent this from happening, without going into any details about how the new machine learning systems can help solve the problem.
New teen users signing up for Instagram are also now encouraged to make their accounts private.
If they choose to create a public account instead, Instagram later sends them a notification highlighting the benefits of a private account and reminding them to check their settings.
As Instagram moves to end-to-end encryption, the company says it is investing in features that protect privacy and keep people safe without accessing the content of direct messages.