Copyright © 2025 Digizenship.
Contact Information
Digizenship Ltd, 86-90 Paul Street, London, England, EC2A 4NE
support@digizenship.com
Child Safety Policy
Age Verification and Parental Consent
Protecting children is one of our highest priorities, and we have implemented comprehensive measures to ensure the safety of users under 18. We require users to provide their age during account registration and use various methods to verify age information, including cross-referencing with publicly available data and analyzing behavioral patterns that may suggest false age reporting.
For users under 13, we require verifiable parental consent before collecting any personal information, in compliance with the Children's Online Privacy Protection Act (COPPA). Parents must provide a valid form of identification and complete a consent process that may include email verification, phone verification, or other methods to ensure the consent is genuine. We regularly audit our age verification processes to ensure they remain effective against evolving deceptive practices.
Special Protections for Minor Users
Minor users receive enhanced privacy protections by default, including more restrictive privacy settings, limited contact from adult users they don't know, and additional safeguards around location sharing and personal information display. Their accounts are not eligible for certain features that may pose safety risks, such as live streaming with unknown viewers or participating in certain community features.
We prohibit targeted advertising to users under 13 and limit the types of advertising shown to users aged 13 to 17. Advertising shown to minors prioritizes educational content and age-appropriate products. We also restrict the collection and use of personal information from minor users, collecting only what is necessary to provide our service safely.
Content Filtering and Safety Tools
Our platform employs sophisticated content filtering technology specifically designed to protect minors from inappropriate content. This includes filtering for sexual content, violence, hate speech, and other harmful material. We maintain separate content standards for minor users that are more restrictive than those applied to adult users.
We provide parents and guardians with comprehensive tools to monitor and control their children's experience on our platform. These tools include activity dashboards, content filtering options, time limits, and the ability to approve or block specific contacts. Parents can also receive notifications about their child's account activity and safety reports.
Reporting and Response to Child Exploitation
We maintain a zero-tolerance policy toward child exploitation and have established specialized teams to address reports of child sexual abuse material (CSAM) and other forms of child exploitation. We use industry-leading technology, including PhotoDNA and other hash-matching systems, to proactively detect and remove such content.
All suspected child exploitation content is immediately reported to the National Center for Missing & Exploited Children (NCMEC) and relevant law enforcement agencies. We preserve evidence as required by law and cooperate fully with investigations. Users who attempt to share, solicit, or distribute child exploitation material are immediately and permanently banned from our platform.
Education and Awareness Programs
We provide extensive educational resources for parents, educators, and young users about online safety, digital citizenship, and privacy protection. These resources include interactive guides, video tutorials, and partnerships with child safety organizations to provide expert advice and support.
We regularly conduct safety awareness campaigns and work with schools and community organizations to educate young people about potential online risks, including cyberbullying, online predators, and inappropriate content. Our educational materials are developed in consultation with child development experts and are regularly updated to address emerging safety concerns.
Crisis Intervention and Support Services
We have established partnerships with crisis intervention services and mental health organizations to provide immediate support to young users who may be in danger or distress. Our platform includes easy access to crisis helplines, mental health resources, and reporting mechanisms specifically designed for young users.
Our content moderation teams are specially trained to recognize signs that a minor may be in danger and to take appropriate action, including contacting law enforcement when necessary. We maintain 24/7 capabilities to respond to urgent safety concerns involving minor users and work closely with families and authorities to ensure appropriate intervention and support.