Opinion
LAW, ICT, AND HUMAN RIGHTS

Is Facebook Messenger Kids safe?


Schools rely on a variety of messaging and video conferencing applications to communicate online, especially in the context of online learning. Among the most popular are Zoom and Google Meet. However, none of these apps are made specifically for children. Facebook knew this and did something about it.

This September, Facebook launched Facebook (FB) Messenger Kids, a messaging and video calling app designed for children aged 6 to 12. It aims to help them keep in touch with family and friends, and to participate in synchronous learning in a parent-controlled setting. The app boasts a number of key features that allow parents or guardians to monitor and control their ward’s use of the app through the Parent Dashboard:

• Sleep Mode - lets one set predetermined downtimes for the app.
• Recent contacts and chat history - lets one see how often the child has chatted or video-called another user over the past month.
• Log of images in chats - lets one view sent and received photos and videos, as well as remove and/or report any media deemed inappropriate.
• Reported and blocked contact history - gives one access to a list of reported and/or blocked users or content, and the reasons for such actions. It allows one to get notified via FB Messenger if the child reports or blocks a user.
• Remote device logout - lets one remotely log the child’s account in or out of his or her device.
• Download your child’s information - allows one to request a copy of the child’s information, including contacts, and the messages and media he or she sent and received. The child is notified of the request through the app.

It also has opt-in friending features, such as supervised friending, connecting with other kids through groups, and making a kid’s name and profile photo visible to friends and their parents (and even to the children of those parents’ FB friends). That said, not all features have been rolled out worldwide as of this writing.

For parents of minors, FB Messenger Kids is a gem. It gives them relief to know who their child communicates with, while keeping him or her away from harm online. However, there’s more to the app than the perks it advertises. Like any other app out there, it is not spared from issues, both actual and potential, most of which are not immediately apparent to parents and guardians. Consider the following:

• Profiling. The app’s privacy policy indicates that the child’s personal data will be processed in much the same way FB processes adult users’ data. There may be less advertising, but the perils associated with profiling remain. FB still gets to suggest friends to add, and sticker packs to download, based on the child’s activities on the app. When the company processes a child’s data from an early age and continues to do so for many years, it builds a profile of that child that may not be accurate. Remember that it is during childhood that we humans figure a lot of things out for ourselves. And yet, based on a possibly inaccurate profile, FB will still try to influence a child’s behavior and perspective.

• Diminishing space for self-expression and intrusion of privacy. We know the limitations of a child when it comes to making good, informed decisions. This, however, does not eliminate children’s right to privacy and their evolving capacity to decide for themselves as they grow older. Persistent and excessive parental monitoring of children’s online activities—which is definitely enabled and enhanced by the app—may diminish their safe space to express themselves, and could affect how their self-identity eventually develops.

• Vulnerability to hacking. FB accounts get hacked or compromised all the time. Since the Parent Dashboard is accessible through the parent’s FB account, the controls to the connected FB Messenger Kids (including the child’s personal data) will be equally at risk and accessible to hackers and other criminals. It’s still early to say, but the app itself may later turn out to be susceptible to the same threat.

• Co-opting consent as the lawful basis for processing children’s data. Stricter rules apply when it comes to processing children’s personal data. This is why, for most companies, securing the consent of parents or guardians is critical. FB, through the app, has found a way to do it by getting parents to buy into the idea that they actually need the app.


Unlike before, avoiding internet use these days is extremely difficult. The internet has become such a huge part of our lives, including our children’s. Back in 2017, one in three internet users was a child, according to the United Nations Children’s Fund (UNICEF). With the current pandemic driving many people into online learning, this figure has surely risen, with many children being forced to use platforms not really designed for them as users.

Many people have asked authorities to do something about this, and some already have. Just this 2 September 2020, the United Kingdom’s Information Commissioner’s Office (ICO) brought into force its Age appropriate design: a code of practice for online services, which aims to help developers design online services that put children’s best interests first. It will be fully enforced after a year-long transition period.

Our own National Privacy Commission should also take a more active role in developing a similar code for the privacy rights of children online, one that companies like Facebook, Google, and Zoom should be required to adhere to.

As we all move forward, even beyond this health crisis we find ourselves in, applications like FB Messenger Kids should be designed better, with higher regard for the privacy of children by default. For starters, they should not use nudge techniques that sway a child into providing more data. The companies behind these platforms should also inform child users how parental monitoring works in their products and how they can ask for help when they need it.

Parents stand to face bigger challenges, too. Their supervision will become more important, especially over younger children, with all these new technologies being developed and released every day. They need to adopt a healthy amount of skepticism towards the use of apps and the controls these give them, lest they do more harm than good to the well-being of their children. There has to be a balance between controlling children’s activities and allowing them some degree of autonomy and privacy. To achieve this, parents (and guardians, too) should be able to objectively assess their children’s evolving capacity to make their own decisions.

With each passing day, we grow more convinced that we cannot completely do away with technologies and platforms. They will become more integral to our lives, not less. What their developers, government regulators, and parents can do is to agree that the safety and privacy of children should always be the top priority and work from there. As the ICO has correctly pointed out, our shared goal should be to protect the child within the digital world, not from it.


Maris Miranda is a Certified Information Privacy Manager. A former member of the Privacy Policy Office of the National Privacy Commission, she now serves as a resource speaker and consultant on privacy and data protection.