Instagram will start notifying users that their comments may be offensive before they are posted, in an effort to curb cyberbullying. The company said Monday that it began rolling out the artificial intelligence feature in the past few days.
In an example included in the company release, Instagram shows a user trying to comment “You are so ugly and stupid.” Instagram follows up with a message asking the user “Are you sure you want to post this?” with an “undo” button.
“From early tests of this feature, we have found that it encourages some people to undo their comment and share something less hurtful once they have had a chance to reflect,” Instagram said. To further help protect users from unwanted interactions, Instagram said it will start testing a new “restrict” feature.
Restricting someone will make that person’s comments on your posts visible only to them; the account holder can then choose whether to make a restricted person’s comments visible to others by approving them.
Restricted users also will not be able to see when the account that restricted them is active on Instagram or when it has read their direct messages.
The goal is to give users an option other than blocking or unfollowing accounts, which young users said could escalate situations, according to Instagram.