Epic Games’ battle royale video game Fortnite has disrupted the gaming industry in a major way, but its new voice reporting feature has raised privacy and ethical concerns among cybersecurity experts.
Since its 2017 launch, Fortnite has amassed over 500 million registered players and now averages around 221 million monthly active players.
Fortnite’s free-to-play and cross-platform release seven years ago made it unprecedentedly accessible for players. Combine that with colourful and bombastic art design and genuinely addictive shooting and building gameplay mechanics, and it has truly kept the world, and specifically young people, in a chokehold.
To date, Fortnite has generated revenue of over $26bn, of which over $6bn came in 2022 alone.
However, like any online video game popular with young people, the onus has fallen on its publisher, Epic Games, to keep its platform as safe a space as possible, a standard that, despite its efforts, has not always been met.
Fortnite’s voice reporting tool
According to an ExpressVPN study in 2023, one in five children in the UK aged between four and 13 have experienced harassment of some form on Fortnite.
Fortnite implemented its voice reporting feature at the end of 2023. Microsoft’s Xbox added similar reactive voice chat moderation in July last year, allowing players to capture and report 60-second audio clips of voice chat messages deemed inappropriate.
Fortnite’s security tool allows players to report another player’s last five minutes of voice chat, which is captured on a rolling basis.
When a user reports a conversation, “the voice chat audio captured from the last five minutes will be uploaded with the report and sent to Epic moderators for review,” Epic says.
Any previous audio over five minutes old is automatically deleted as new audio is captured.
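The mechanism Epic describes is essentially a rolling buffer: audio is retained on-device for a fixed window, older chunks are evicted as new ones arrive, and the buffer is only handed over if a report is filed. A minimal sketch of that idea in Python is below; the class name, chunk format and five-minute window handling are illustrative assumptions, not Epic’s actual implementation.

```python
import time
from collections import deque

BUFFER_SECONDS = 300  # five-minute window, per Epic's description

class RollingVoiceBuffer:
    """Illustrative on-device rolling buffer: keeps only the most recent
    five minutes of audio chunks, discarding older audio as new audio
    is captured. Hypothetical sketch, not Epic's real code."""

    def __init__(self, window=BUFFER_SECONDS):
        self.window = window
        self.chunks = deque()  # (timestamp, audio_bytes) pairs

    def capture(self, audio_bytes, now=None):
        now = time.time() if now is None else now
        self.chunks.append((now, audio_bytes))
        # Auto-delete anything older than the window as new audio arrives
        while self.chunks and now - self.chunks[0][0] > self.window:
            self.chunks.popleft()

    def report(self):
        # Only when a report is submitted is the buffered audio handed over
        return [audio for _, audio in self.chunks]


buf = RollingVoiceBuffer()
buf.capture(b"old chunk", now=0)
buf.capture(b"recent chunk", now=301)  # the chunk from t=0 is now evicted
print(buf.report())  # [b'recent chunk']
```

The key privacy-relevant property is that `report()` can never return more than the window allows, because eviction happens at capture time rather than at report time.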
The reporting feature is automatically enabled for all players under 18; minors who do not wish to have their voice chat audio captured must mute themselves or turn off voice chat completely.
Epic Games claims Fortnite’s voice chat audio is securely captured on a user’s device and not on Epic Games servers to protect privacy.
“Epic has no way of accessing any voice chat audio unless voice reporting is on and a participant submits a voice report,” according to the company.
Is Fortnite’s voice reporting tool an invasion of privacy?
However, the Fortnite-maker has come under fire from gaming and cybersecurity professionals who say the voice recording feature for minors raises myriad privacy issues.
Lauren Hendry Parsons, a privacy advocate at ExpressVPN, told Verdict that voice recording minors, even on a five-minute basis, is not a reasonable solution to deter hate speech.
“Children may not fully comprehend the consequences of sharing personal information online, and taking this a step further, the idea that their ‘private’ conversations with their friends may be saved by a corporation at a moment’s notice with little visibility of how they’re used is very concerning,” Parsons tells Verdict.
“These decisions move gaming companies into incredibly murky territory, especially as many of their customers are under the age of 18,” Parsons says.
Epic Games says it will auto-delete clips after 14 days or “the duration of a sanction”. If a player appeals a decision, the company will retain the clip for a further 14 days while it reviews the case.
Epic Games also said clips will be kept for “as long as legally required” if necessary.
In an emailed statement to Verdict, a spokesperson for Epic Games said: “Enabling players to submit audio evidence when reporting suspected violations of our Community Rules helps us take action against players using voice chat to bully, harass, discriminate, or engage in other inappropriate behaviour.”
“Voice chat audio is securely captured on the player’s device — for example their gaming console or PC — not the Epic Games servers, and Epic has no way of accessing any voice chat audio unless voice reporting is on and a participant submits a voice report,” the spokesperson added.
Epic Games said it was a deliberate design choice to “build a system that does not capture or monitor all voice traffic out of respect for players’ privacy and choices.”
The gaming industry and children’s privacy
Protecting children within the gaming industry has become more challenging with the increasing toxicity in gaming.
A 2023 Unity survey that primarily focused on the UK, South Korea, and the US revealed that the share of players facing toxic behaviour increased by six percentage points to 74% between 2021 and 2023.
“Gaming companies should ensure strict age-based settings to protect minors from this kind of behaviour,” Eren Cicyasvili, analyst at research company GlobalData, tells Verdict.
For instance, Minecraft: Java Edition, Fortnite and Roblox all offer a profanity filter that masks harmful language.
“This filter cannot be turned off in a child account. Roblox also has an age verification process that users must go through before using voice chat,” Cicyasvili added.
In December 2022, Epic Games introduced a feature called Cabined Accounts. The feature requires all players globally to enter their date of birth when they log in. If a player indicates they are under 13, or under their country’s age of digital consent, the account will be a Cabined Account until they receive their parent or guardian’s consent.
Until consent is provided to Epic, underage players will be allowed to play games in a Cabined Account environment which disables features such as voice chat.
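The Cabined Accounts gate described above amounts to a simple decision rule on age and parental consent. The sketch below illustrates that rule; the consent-age table and threshold logic are assumptions made for demonstration, not Epic’s published rules.

```python
from datetime import date

# Example digital-consent ages; values are illustrative assumptions
DIGITAL_CONSENT_AGE = {"UK": 13, "DE": 16, "FR": 15}

def account_state(birth_date, country, today, parental_consent=False):
    """Returns 'cabined' or 'full' based on the player's age and
    parental consent. Hypothetical sketch of the gating rule."""
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    # Gate on 13 or the country's age of digital consent, if higher
    threshold = max(13, DIGITAL_CONSENT_AGE.get(country, 13))
    if age < threshold and not parental_consent:
        return "cabined"  # voice chat and similar features disabled
    return "full"


print(account_state(date(2014, 6, 1), "UK", today=date(2024, 1, 1)))  # "cabined"
print(account_state(date(2005, 6, 1), "UK", today=date(2024, 1, 1)))  # "full"
```

Once consent is recorded, the same check returns a full account, matching the article’s description that restrictions lift when a parent or guardian approves.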
Cicyasvili says that game publishers will be increasingly scrutinised and compelled to comply with regulations in the coming years.
“Their failure to comply could attract hefty penalties and bans and cause them to lose players,” he says.
Lawmakers and regulators worldwide have taken different approaches to ensure the safety of children in video games.
Video games like Grand Theft Auto and PUBG Mobile, which include elements such as self-harm, violence, and vulgarity, have been banned in some countries.
In 2023, the UK gaming industry decided to limit children’s access to in-game loot boxes through a collection of guidelines.