The following post was written and/or published as a collaboration between Benzinga’s in-house sponsored content team and a financial partner of Benzinga.
“Mom, Dad, I’m on social media now!” Those few words are enough to send a chill down a parent’s spine. Research shows that social media ranks among parents’ top fears.
Social media networks host and spread a wide range of content, much of it inappropriate for children, including explicit and violent material and cyberbullying. Effective oversight of what children are seeing is essential; unfortunately, popular social media platforms like Facebook and Instagram, both owned by Facebook Inc. FB, offer no such tools to parents or guardians.
But what if parents or guardians could track what their children are doing on social media and filter out the harmful content? Well, that’s exactly why Grom Social was developed.
What is Grom Social?
Grom Social is a social network owned by Grom Social Enterprises Inc. GROM. The platform provides entertainment for children younger than 13 while giving parents or guardians the ability to monitor their kids’ activity.
Grom Social delivers its content through mobile and desktop environments that entertain kids, letting them interact with friends, read news and play games while teaching them to be good digital citizens. A group of moderators called Grom Helpers monitors the platform, filtering out inappropriate behavior or content.
What Special Features Does the Platform Offer?
According to the company, Grom’s No. 1 priority is children’s safety, which it considers vital to protecting kids from the dangers social media poses to their mental health and development. The platform offers four levels of safety:
Content filtering: All user-generated content (text, photos) goes through a filtering process at WebPurify, whose human moderators flag any unsuitable content and make final decisions based on Grom’s standards.
Moderation: After content passes through WebPurify’s filtering system, trained staff at Grom give it a second round of checks for anything inappropriate or overly personal.
User reporting: Any content shared on the app can be reported by users, alerting staff to double-check whether it violates the company’s policies. If it does, the content is quickly removed.
Parental monitoring and control: Parents can monitor and control their child’s Grom account using an app called MamaBear.
Before children can join Grom and start using any features that require submission of personally identifiable information (PII), they must have a parent’s approval. The platform adds an extra layer of protection by requesting a one-time $0.99 payment via the parent’s credit card to verify the parent’s identity.
On Oct. 22, Grom introduced new safety features available exclusively in the company’s parent companion app. These include visibility controls that let parents monitor their child’s video posts, comments, chats and friend requests, as well as settings to manage a child’s PII.
The preceding post was written and/or published as a collaboration between Benzinga’s in-house sponsored content team and a financial partner of Benzinga. Although the piece is not and should not be construed as editorial content, the sponsored content team works to ensure that any and all information contained within is true and accurate to the best of their knowledge and research. This content is for informational purposes only and not intended to be investing advice.