In Yemen, where cultural and societal norms place strict expectations on women’s behavior and appearance—especially regarding modesty and online presence—a single social media post can pose a serious threat to a woman’s life.
That’s the kind of case the YODET Helpline was recently called to handle.
A fake social media account had been created using a young woman’s name and photos. The impersonator began publishing her personal images, some of which showed her without a hijab—something considered unacceptable in many parts of Yemeni society. The account also shared private information, exposing her to severe social stigma, family backlash, and even potential physical harm.
For women in Yemen, situations like this go beyond online harassment—they can lead to threats of violence, forced isolation, or worse. In some cases, a family's reaction could turn deadly due to the weight of "honor" and societal pressure.
The moment the message reached the YODET Helpline, the team treated it as a high-risk emergency. Their first step was to ensure the woman’s immediate safety, offering psychological support and connecting her with legal and protection resources. At the same time, the team collected digital evidence: screenshots, URLs, and witness statements, all crucial for a proper takedown request.
Thanks to YODET’s official communication channels with major tech companies such as Meta and TikTok, the impersonating account was swiftly reported and permanently removed. This fast action was vital, not only to prevent further harm but potentially to save a life.
After resolving the issue, YODET continued to provide support, ensuring the individual knew her rights, how to protect her digital identity, and where to turn in case of future threats.
This case is a powerful example of how online abuse can become a matter of life and death in conservative societies, and how critical the YODET Helpline is in standing between victims and real-world consequences. In Yemen’s complex social landscape, digital safety is not just a tech issue; it is a human rights imperative.