As noted in our blogs earlier this week, the IICSA investigation into the internet is well under way.
The internet plays such a huge role in all of our lives, not least for Generation Z – the post-millennial generation, born between 1995 and 2010 – most of whom will never have known a time when it did not exist. Worrying stories continually dominate the headlines in which technology seems less of a boon (curiously, "Boon" is also the name of an app that ranks higher in Google's search results than the word itself) in the lives of children and young people.
As the oft-quoted Spider-Man phrase goes, with great power comes great responsibility. With such a wealth of, well, just about everything at our children's and young people's fingertips, we are right to be concerned about children's access to the all-pervading global network that is the internet.
According to the Children's Commissioner's report Who Knows What About Me? of November 2018, 39% of eight- to 11-year-olds have their own smartphone, and 52% have their own tablet. Notwithstanding that the minimum age requirement for most social media platforms (such as Facebook, Instagram, Twitter, and Snapchat) is 13, over half of the UK's 11- to 12-year-olds have a profile on a social networking site. On average, children post to social media 26 times per day – which would equate to nearly 70,000 posts by age 18.
In sharing all this data, children may become vulnerable to identity fraud (particularly if posting publicly), to being groomed, or to being profiled. Most social media platforms collect data such as current location and IP address, as well as the referring web page, pages visited, search terms and cookie information. This information is sometimes shared with advertising partners and affiliates, and vice versa. This is nicely dressed up in privacy policies as helping to make advertising more relevant to the account user, but ultimately a wealth of data is being collected without the account user perhaps being fully aware that this is being done, or why. For such a vulnerable market, the use of targeted adverts is a concern in itself.
Whilst a teenager may have little concern about Instagram retaining data about their latest boohoo.com order, the real concern is what social media platform providers will be able to do, now and in the future, with the harvested data, and with data analytics used to profile the child's anticipated or predicted behaviour. In the wrong hands this could be disastrous; as China's social credit system has shown, such uses may not be far away.
Article 17 of the General Data Protection Regulation (GDPR) provides EU citizens with the right to erasure, or "the right to be forgotten". However, as one social media platform provider points out in its policy, search engines cache search results, so anything deleted from the social media platform might still appear in search engine results. This is a particular concern for children and young people, who might not fully appreciate the consequences of a seemingly innocuous post on social media that could follow them around for the rest of their adult lives. Recital 65 of the GDPR is alive to this risk: it acknowledges that children may give consent without being fully aware of the risks posed by the processing of their data, especially where content is posted on the internet.
Whilst domestic and EU laws and regulations are catching up with the problems posed by the internet, social media, and children's safety whilst using these platforms, technology is developing much more quickly than the law is able to keep pace with.
For example, there are many apps available which can hide items stored on devices that children do not want their parents to see, such as photographs. Secret Calculator is one such app, allowing the user to hide images so that they do not appear in the camera roll. Ironically, children may feel that they have some privacy from their parents, yet may not appreciate that the images will never truly be private if they are stored on a device and in an app's cloud storage.
Recitals 38 and 58 of the GDPR are clear that specific protection should be afforded to children in relation to the processing of their data and transparency about its use, as they may be less aware of the risks, consequences and rights involved. Interestingly, the use of secret apps may conflict with Article 8 of the GDPR, particularly where the app providers are unable to show they have made reasonable efforts to verify that parental consent has been provided.
It will be interesting to learn the findings of IICSA once its investigations have concluded. Empowering and educating children, together with policy change and a tightening of legislation to address specific dangers, are likely to be the main ways of tackling the risks children face when using the internet.
Written by Amanda Munro, paralegal at BLM