Discord is making some important changes starting in March 2026, and if you’re a regular user, you’ll want to know about them. The platform is introducing age verification that will affect both new sign-ups and existing accounts.
From March, Discord will automatically apply age-appropriate safety settings to all accounts. Sensitive content will be blurred unless you verify that you’re an adult, and messages from people you don’t know will land in a separate requests folder instead of your inbox.

The idea is to keep younger users safe from inappropriate content while allowing older users to access everything they want on the platform.
To verify your age, Discord is offering two methods:
- Facial age estimation, where your device estimates your age from a short selfie video (don’t worry, the video stays private)
- Uploading an identity document for verification through a trusted partner
In some cases, you may be asked to do both for extra confirmation.

Discord isn’t the first platform to make these changes. Roblox has already started age checks for users who want to chat, using facial recognition to place people into age groups and limit direct interaction between adults and children. Clearly, Discord is just one part of a bigger movement.
At the government level, countries around the world are introducing their own measures to better protect minors online. For instance, the UAE recently passed the Child Digital Safety Law to strengthen protections for young internet users.
Stay tuned for more updates on how platforms and governments are making online spaces safer!