Julie Inman Grant, Children's eSafety Commissioner, explains the world of 'hidden' social media messaging and its implications for children.
By Julie Inman Grant, Children’s eSafety Commissioner
Dark social may sound like the grim reaper is having conversations with your kids online (or anyone, for that matter), but thankfully it’s a little less ominous.
The terms ‘clear web’, ‘deep web’, ‘dark web’, ‘dark social’ and ‘disappearing media’ tend to be tossed around interchangeably, so we’ve created a handy little glossary below to help you understand the web lingo.
What is Dark Social?
Dark social refers to the private features of a social networking service. This includes closed groups, direct messages (such as Facebook Messenger or Twitter Direct Messages), messaging apps like WhatsApp and Snapchat, and in-game messaging features found in games like Minecraft and Roblox.
Think of it like this: you see a great-looking restaurant and send a website link about the venue to a friend, using Messenger. Because the link was shared privately, the information you sent can’t be tracked using web analytics (the tools that measure and analyse web traffic). Voilà, you’ve just shared using dark social.
While ‘dark social’ is a term used predominantly in the marketing industry, it has implications for online safety: this type of communication brings both benefits and a number of risks.
Moving Into the Light
The term ‘dark social’ may sound a little menacing, so perhaps a better name for this communication is ‘hidden social’. In reality, it’s simply communication that is hidden from public view.
The good news is that there are many benefits. Most notably, these services:
- are free and provide instant, easy-to-use ways to communicate with friends and family, similar to email and text messaging, and are often used for group chat
- allow links to be shared without being tracked by platform analytics
- offer the ability to share information that is not for public consumption.
What Are the Risks?
While there are many benefits, there are also some real risks.
The hidden nature of these platforms makes it difficult to monitor and regulate the content being shared between users. Dark social can also facilitate conversations with strangers, which could lead to grooming or exposure to content that is inappropriate for younger children. Unfortunately, children and teens who may be suspicious of an anonymous text on their mobile, or a call from an unknown number, might be more receptive to communication with a ‘random’ when using a messaging app or via IM in a game.
In addition, it’s hard for parents and educators to keep an eye on content shared in this way, and there is a higher chance that prohibited content will pass through without the service being able to detect it.
In dark social, cyberbullying and image-based abuse are also potential risks, particularly when the abuse is shared instantaneously with a group of users.
How You Can Help
There are some simple approaches parents can use to help children and teens navigate their communications using hidden (or dark) social.
- Teach them. Remember that when you hand a child a mobile phone or tablet, they need to know the safety rules. Just like you’d teach them road rules, be sure to teach them how to behave online. And, should they run into trouble, support them in working out how to deal with it, including where to find help.
- Help children to ‘read’ messages critically, paying attention to tone and meaning. By being savvy about how language is used they will be better able to recognise when other users are not behaving appropriately.
- Use technology tools. Check the privacy and safety settings for the services and apps they use and learn how to set these to suit you. Many of these services are currently improving their reporting functions for users, making it easier to report abuse.
- Talk. Have open conversations with your children about the devices and platforms they like to use. Join up yourself and learn the ins and outs so you’re well informed and can help guide them when they need it.
What Else You Can Do
Most messaging platforms offer blocking functions and have some reporting capabilities.
Remember, if your child is seriously cyberbullied, you can contact the social media platform and ask them to remove the content. If this is not resolved within 48 hours, follow this up with a complaint to the eSafety Office, including the offending content and communication from the service provider about the report.
Reporting abuse on ‘dark social’ services is not always straightforward, and can involve decrypting messages to address the issue. Find out how to report something that might be upsetting your child on services like WhatsApp, Facebook Messenger and Snapchat at Games, Apps and Social Networking.
If your child sees offensive or illegal content, guide them through it by talking about the sites they visit. Be aware of how they use the internet and discuss the sites and apps that are okay to explore and those that are not. You can teach your child strategies about how to deal with offensive material but remember to be vigilant, especially if your child is prone to taking risks or is emotionally or psychologically vulnerable.
If you are concerned about grooming or unwanted contact, learn about how to protect your children, how it happens, where to find support and how to deal with it.
If your child experiences image-based abuse, help them to report it, show them how to collect evidence, and learn how to look after both yourself and your child.
For more, please visit esafety.gov.au
Glossary Key
Clear Web or ‘Surface Web’: The world wide web we tend to traverse every day. Any site that can be indexed by a standard search engine, like Google or Bing.
Deep Web: Anything a typical search engine cannot find or access.
Dark Web: A small portion of the Deep Web that is intentionally hidden and inaccessible through standard web browsers. Its users rely on anonymity tools like Tor or I2P to hide their IP addresses, and it is the area of the net known for the most illicit activities, such as black market drug sites or child sexual abuse material.
Dark Social*: Not as ominous as it might sound but generally refers to the social sharing of content that occurs outside of what can be measured by web analytics programs.
Disappearing Media: Also known as ‘ephemeral data’ or ‘self-destructing media’, made popular by apps like Snapchat, where texts, images or videos expire after a set time, usually up to 10 seconds. The key here is that no image, message or video ever truly disappears, and content can still be captured by a number of means!
Sources: BrightPlanet Services (adapted by Julie Inman Grant) and *Alexis C. Madrigal, senior editor at The Atlantic.
We thank the Office of the eSafety Commissioner for allowing us to share this blog, which first appeared on the Office’s website. You can read the original here.