
The state of online safety

5 minute read

- Written by Yubo Team

Yubo Safety Board of Advisors


Having grown up with digital technology, young people no longer ‘go online’; they simply are online. Nearly all teenagers in the US (95%) have access to a smartphone, and the majority of children in 19 European countries use their smartphones ‘daily’ or ‘almost all the time’. Almost half of Gen Zs in the US spend more time interacting with others on social media than in the physical world, and 16 to 24-year-olds in the UK typically use nine online communication sites or apps on a regular basis.

The digital world offers young people a way to stay in touch, a place to find information, an opportunity to bond with others, a platform for having fun and somewhere to reach out for help. They are hyper-connected and hyper-engaged but many are also anxious, confused and in need of support. With technology evolving at such a fast pace, what are the potential risks for young people and how can social media companies encourage the responsible and safe use of their platforms?

Here at Yubo, we take a proactive approach to protecting, supporting and educating our younger users, who make up an important part of our user base. We’re guided by seven international safety and wellbeing experts in the digital space who, as members of our Safety Board, provide insights and guidance on the main concerns of the day to ensure that Yubo remains a safe and positive platform for our users.

 

Ahead of a Safety Board meeting at our Paris headquarters this summer, we spoke to each of the board members about the current state of online safety. This is a huge area with a broad spectrum of issues, and we can’t cover everything in a single article, so we’ve focused on three key challenges in this piece and will take a deeper dive into each of them in future articles.

1. Bullying and hate speech

Name calling, false rumours, physical threats… what happens on the playground also happens online. According to Pew Research Center, nearly half of teenagers in the US (46%) have been bullied or harassed online and, in a UK survey by The Diana Award, two fifths of young people said they have been subject to bullying behaviour, with 22% saying it happened on social media.

As Alex Holmes, Deputy CEO, The Diana Award, explains, “Online bullying often starts with bullying at school. When young people were at home during the COVID-19 pandemic, there was less online bullying because the playground drama had stopped. So, it’s important to look at the bigger picture.”

Instances of bullying or harassment based on prejudice against someone’s race, religion, sexual orientation, disability or gender identity (known as hate incidents) are on the increase. Participants in a study by the UK communications regulator Ofcom said that being exposed to online hate is a common feature of their online experience and that the frequency of exposure often increased around particular events, such as the Euro 2020 football tournament.

“Young people might see racist or misogynist comments on social media and not have the skills to handle the situation,” comments Alex. “This is a real challenge, especially if the person who made the post is an influencer and has clout within their peer group. Even if a social media provider bans someone for hate speech, the community members may still share their content.”

In an article about Safer Internet Day 2023, Anne Collier, Founder of Net Family News and Executive Director of The Net Safety Collaborative, quoted a report by Project Rockit and the Young & Resilient Research Center, which says that online safety instruction must be about more than extreme harms, such as cyberbullying and grooming. It should focus on the complexities of “social relationships and daily dramas” too. “Growing young people’s social literacy with social-emotional learning is more important than ever,” says Anne.

She points out that media literacy is also extremely important for reducing harm in the era of misinformation and disinformation. “Teachers play a key role here but it’s a complex area and they have so many other things on their plate. I’d like to see media literacy incorporated into History, Social Studies and other parts of the curriculum and for technology companies to invest in more media and digital literacy programmes.”

Digital literacy and online safety are not just the responsibility of schools, of course. Parents are encouraged to take an interest in what their children do online, help them to develop social skills and create safe spaces for discussing any concerns. 

“It’s not about adults taking control,” comments Anne. “It’s about letting young people know that we trust them and want them to develop their agency and be in control of their digital experiences.”

Dr Richard Graham, Consultant Child and Adolescent Psychiatrist, adds, “It’s also about helping young people to understand and process these experiences so they can learn from them and become more resilient over time.”

Social media providers also play an important role in the development of social skills online and resilience in the face of digital challenges. Real-time interactions and awareness campaigns are powerful tools, for example. 

Alex notes that young people will, of course, make errors of judgment (e.g. posting negative comments based on someone’s appearance) and that it’s important they learn from them. 

“In-app ‘nudges’ by social media providers remind the user to behave more responsibly and also send an important message to the rest of the community,” says Alex, whilst Richard describes these nudges as “helpfully disruptive” because they give young people an opportunity to reflect on what they are about to do and surface the need to consider others.

On Yubo, for example, users receive a pop-up alert if they are about to share personal information in private chat (e.g. if they type a phrase such as “I live in London” or “My phone number is…”) to make them think twice before sharing these details. 
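To illustrate the general idea, here is a minimal sketch of a pattern-based check on a draft message. It is an assumption for illustration only: Yubo’s actual detection rules are not public, and the phrases and patterns below are hypothetical.

```python
import re

# Hypothetical patterns, for illustration only. A production system would use
# far richer rules (or a trained model) and would need to cover many languages.
PERSONAL_INFO_PATTERNS = [
    re.compile(r"\bI live (in|at)\b", re.IGNORECASE),             # location phrases
    re.compile(r"\bmy (phone|mobile) number is\b", re.IGNORECASE),
    re.compile(r"\bmy address is\b", re.IGNORECASE),
    re.compile(r"\b\d{10,11}\b"),                                  # bare phone-like digit runs
]

def should_show_privacy_nudge(draft_message: str) -> bool:
    """Return True if the draft message appears to contain personal details."""
    return any(p.search(draft_message) for p in PERSONAL_INFO_PATTERNS)

print(should_show_privacy_nudge("I live in London"))      # True -> show the pop-up alert
print(should_show_privacy_nudge("See you at the game!"))  # False -> no alert
```

The key design point is that a check like this only nudges: the user still decides whether to send the message, which keeps the emphasis on reflection rather than control.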

2. Health and wellbeing

Research by ySkills, the McKinsey Health Institute and others suggests that negative digital experiences can harm young people’s overall wellbeing.

“Technology companies need to understand how online risks, such as bullying and pressure from influencers and peers, might become harms,” comments Richard. “The intersection of life online and mental health highlights the risks, especially for those struggling with self-harm and eating disorders.”

Richard points out that digital technology used to have a “pull” focus: people would go online to find information and connect with others. Now it is more about “push”, with algorithms recommending content based on previous engagement or what is trending. As a result, young people are much less able to control what they see online and may feel powerless, which can have a negative impact on their mental health. Fortunately, what happens on Yubo is not decided by algorithms, so Yubo users retain considerable agency.

“Growing up in the digital age can be hard,” notes Anne. “It can feel like being in a fishbowl with the minutiae of your daily life available for so many people to see and everyone around you seeming so happy – you need to be resilient.” 

Digital resilience is gained through experience, including collective experiences. Indeed, a recent study by the University of East Anglia argues that it needs to be a “collective endeavour” that involves the young person, their family and their school as well as policymakers, governments and technology companies.

“There are many positives to being free to navigate the digital world on your own but what young people also seem to want is that one person – perhaps a parent or teacher – who can help them make sense of what’s going on,” says Richard. “The algorithms that push what appears in social media feeds are complex so it can be a challenge for families and schools, but the child’s psychological and physical safety is paramount.”

The good news is that many young people understand how social media might affect their mental health. Recent research shows that they choose to take regular breaks from their favourite apps and often seek out online content and services to improve their mood and manage any feelings of anxiety.

Annie Mullins OBE, Independent Safety Advisor, comments, “Yubo and other social media platforms have to understand that teenagers are vulnerable. Of course, they’ll push boundaries and make mistakes, that’s only natural, but there should be digital spaces in which they feel safe, and they should be signposted to wellbeing support if they need it.”

3. Child sexual abuse and exploitation

The volume of child sexual abuse material (CSAM) on the internet has increased significantly in recent years, exacerbated by the COVID-19 pandemic and rise in video content. In 2022, the National Center for Missing & Exploited Children (NCMEC) received more than 32 million CyberTipline reports containing images, videos and other content related to suspected child sexual exploitation. It escalated over 49,000 of these to law enforcement as urgent reports that involved a child in imminent danger.

“This is only the tip of the iceberg,” comments John Shehan, Vice President, NCMEC. “One recent trend we’ve identified is boys being enticed by fake profiles to share indecent images of themselves and then being extorted for money. We also need to keep an eye on the use of new technologies, such as generative AI, that are being used to create child sexual abuse material.”

He adds, “Technology companies must work with NGOs and law enforcement agencies to disrupt this kind of activity – only a handful are doing enough at the moment. Proposed legislation that requires companies to proactively monitor and report CSAM will massively supercharge children’s safety online.”

With research by Thorn revealing that 40% of children online have been approached by someone they thought was attempting to “befriend and manipulate” them, safety by design is more important than ever.

As Travis Bright, who works closely with NCMEC, says, “You can’t bolt on safety and privacy later. It has to be designed into your product at the very beginning, in the same way that anti-spam and malware tools are. One positive development in recent years is the introduction of image classifiers and AI models to detect grooming conversations and CSAM. These are already precise and will improve over time.” 

Thorn and the Tech Coalition recently announced a joint initiative to develop this kind of best-in-class technology, which detects risky online content or behaviour and categorises it into defined “classes” related to grooming (e.g. exposure to sexual material or someone seeking to meet up with a minor in person). On the rare occasion a grooming conversation occurs, the technology alerts a safety specialist who then checks if the young person is safe. The rest of the time, conversations remain private, without any moderator review.
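As a rough sketch of how this kind of pipeline can be structured, the flow from automated classification to human escalation might look like the following. The class names, confidence threshold and stub classifier are assumptions for illustration only; the actual models and taxonomy used by Thorn and the Tech Coalition are not public.

```python
from dataclasses import dataclass

# Illustrative risk classes loosely based on the examples in the article;
# the real taxonomy is an assumption here.
ESCALATION_CLASSES = {"exposure_to_sexual_material", "attempt_to_meet_in_person"}

@dataclass
class Prediction:
    label: str
    confidence: float

def classify(message: str) -> Prediction:
    """Stub for a trained grooming-detection model (placeholder logic only)."""
    if "meet up" in message.lower():
        return Prediction("attempt_to_meet_in_person", 0.92)
    return Prediction("no_risk", 0.99)

def route(message: str, threshold: float = 0.9) -> str:
    """Escalate only high-confidence risky messages to a human safety specialist."""
    prediction = classify(message)
    if prediction.label in ESCALATION_CLASSES and prediction.confidence >= threshold:
        return "escalate: alert a safety specialist to check on the young person"
    return "no action: the conversation stays private, with no moderator review"
```

The confidence threshold reflects the trade-off described above: ordinary conversations stay private, and human review is reserved for the rare cases the classifier flags with high confidence.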

Mick Moran, Former Assistant Director of INTERPOL (the international organisation that facilitates worldwide police cooperation and crime control), also encourages social media companies to provide robust reporting mechanisms and to work closely with law enforcement. “You need to have the right processes in place to identify potentially criminal activity and deal with reports from users. There’s already good collaboration at a national and global level but I’d like to see industry strengthen its relationship with NGOs and law enforcement agencies even further so they can help to prevent more crimes from happening in the first place.”

To help stop the spread of nude, partially nude or sexually explicit images and videos of under-18s, NCMEC has launched a service called Take It Down that uses hash values (where an algorithm assigns a unique ‘digital fingerprint’ to an image). “This is a really exciting, much-needed development,” says Anne, who wrote an article about the new service earlier this year. “Finally there’s hope – much more than hope, actually – for teens to get discreet, tangible help in getting nude photos taken down.” 
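As a simplified illustration of the fingerprinting idea, the sketch below computes a standard cryptographic hash of an image file; matching can then be done on the hash alone, so the image itself does not need to be re-shared. Real services often also use perceptual hashes so that resized or re-encoded copies still match, and the file name here is hypothetical.

```python
import hashlib

def image_fingerprint(path: str) -> str:
    """Compute a SHA-256 hash of an image file, acting as its 'digital fingerprint'."""
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        # Read in chunks so large files do not need to fit in memory.
        for chunk in iter(lambda: f.read(8192), b""):
            sha256.update(chunk)
    return sha256.hexdigest()

# Hypothetical file name, for illustration:
# print(image_fingerprint("photo.jpg"))
```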

Governments are also taking action – for example, the European Commission has proposed new legislation to help detect, report and prevent online child sexual abuse and support victims. With the latest Global Threat Assessment revealing that over 62% of CSAM online is hosted on servers based in the EU, the WeProtect Global Alliance and many others welcome this proposal.

---

Every member of the Yubo Safety Board is a passionate advocate for online safety and works hard to help young people thrive in their digital spaces. 

During our recent conversations, we covered a broad range of topics, including the importance of respecting children’s rights in the digital world, the role of age verification and the impact of Artificial Intelligence (AI). We also discussed how governments are introducing stricter legislation, such as the UK Online Safety Bill and the French government’s new bill to secure and regulate the digital space.

We’ll delve into these issues and take a closer look at specific elements of Yubo’s safety strategy in future articles but, for now, we’ll leave you with a few final comments from our Safety Board.

“There’s only so much that technology companies can do – the whole of society (parents, teachers, government…) has to take the digital world seriously and help young people to feel safe and able to express who they are.” Travis Bright

“I’m always impressed when technology companies put safety first. It shouldn’t simply be a box ticking exercise. They should actively seek out bad behaviour and stop it.” Mick Moran

“The climate emergency, the war in Ukraine, the cost of living… young people already have so much to deal with. The least we can do is help them to navigate their digital spaces with confidence and resilience.” Annie Mullins OBE


Keep an eye on our website for the next in this series of articles.

