The Iraqi Digital Media Center (DMC) warned of the spread of fake content in Iraq using artificial intelligence technologies.
The center noted that "deepfake detection tools should not be overly relied upon, as these tools still face significant challenges that could reduce their effectiveness in rapidly evolving digital environments."
In a statement, the center stated that "reliable research and practical experiments have indicated that these tools can be deceived through so-called adversarial attacks, techniques that rely on subtle modifications that are imperceptible to the human eye but sufficient to manipulate test results. Researchers from prestigious universities have been able to achieve deception rates exceeding 99% in some cases."
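The adversarial attacks the statement refers to can be illustrated with a minimal sketch. Everything here is invented for illustration: a toy linear "detector" stands in for a real deepfake classifier, and the perturbation is a fast-gradient-sign-style step against it. The point is only that a per-feature change capped at a tiny `eps` can still flip the detector's decision.

```python
import numpy as np

# Toy linear "deepfake detector": score = w . x, flagged as fake if score > 0.
# (Hypothetical stand-in for a real neural-network detector.)
rng = np.random.default_rng(0)
w = rng.normal(size=1024)            # detector weights
x = rng.normal(size=1024)            # feature vector of a fake sample
x = x + w * (1.0 - w @ x) / (w @ w)  # project so the detector correctly flags it: w @ x = 1

eps = 0.01                           # perturbation budget: imperceptible per feature
x_adv = x - eps * np.sign(w)         # FGSM-style step against the detector's score

print(w @ x)                         # ~1.0: correctly detected as fake
print(w @ x_adv)                     # driven negative: evades the detector
print(np.max(np.abs(x_adv - x)))    # largest per-feature change is only eps
```

The perturbation never exceeds `eps` in any feature, yet it shifts the score by `eps` times the sum of the weight magnitudes, which is easily enough to cross the decision boundary.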
The center expressed its gratitude for the efforts of the Iraqi Communications and Media Commission "in controlling the digital space, especially in combating fake accounts and misleading content."
The Iraqi Digital Media Center stated that "the issue is not merely technical; it also reflects a serious domestic situation, as Iraq has witnessed a wide spread of fake content on platforms aimed at influencing public opinion and distorting facts."
The center pointed out that "fake videos and images have been detected circulating with the aim of misleading the public and influencing political and social events in Iraq."
It explained that "this threat is part of an ongoing technological race between producers of fake content and those seeking to detect it. As detection teams develop new techniques for spotting manipulation, developers of generative systems improve their models to outpace those techniques, in a relentless digital contest."

According to the center, "One of the most significant factors limiting the effectiveness of detection tools is the lack of training data. These tools are often trained on homogeneous data, which leads to their performance deteriorating when faced with situations on which they were not trained. Research has shown that the accuracy of many detection tools does not exceed 69% in real-world scenarios."
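The training-data problem the center describes can be sketched with a toy experiment. All the data, the nearest-centroid "detector," and the numbers below are invented: a classifier trained against only one type of fake performs well on that type and collapses on a fake type it never saw.

```python
import numpy as np

# Toy demo of the "homogeneous training data" failure mode: a detector trained
# on one kind of fake degrades sharply on fakes produced differently.
rng = np.random.default_rng(2)
real  = rng.normal(0.0, 1.0, (500, 8))   # genuine samples
fakeA = rng.normal(2.0, 1.0, (500, 8))   # the only fake type seen during training
fakeB = rng.normal(-2.0, 1.0, (500, 8))  # unseen fake type (a different generator)

centroid_real, centroid_fake = real.mean(0), fakeA.mean(0)

def is_fake(x):
    # Nearest-centroid "detector" fit only on real vs. fakeA.
    return np.linalg.norm(x - centroid_fake, axis=1) < np.linalg.norm(x - centroid_real, axis=1)

print(is_fake(fakeA).mean())  # near 1.0 on in-distribution fakes
print(is_fake(fakeB).mean())  # near 0.0 on the unseen fake type
```

The detector is not wrong about its training distribution; it simply has no signal for fakes generated outside it, which is the failure the quoted accuracy figures describe.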
The center pointed out that "some deception methods require no complex techniques, only simple modifications such as adding noise, playing fake audio through a speaker and re-recording it with a microphone, or stripping metadata from files. These are simple but effective tricks for defeating audio or visual analysis tools."
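One of those simple tricks, adding faint noise, can be shown in a few lines. This is a toy illustration with an invented signal: the added noise is far below audibility (high signal-to-noise ratio), yet any bit-level fingerprint of the file, such as a hash that a naive forensic matcher might rely on, changes completely.

```python
import hashlib
import numpy as np

# Toy illustration: tiny additive noise is perceptually negligible but defeats
# naive bit-level forensic matching of an audio file.
rng = np.random.default_rng(1)
sr = 16000
t = np.arange(sr) / sr
audio = np.sin(2 * np.pi * 440 * t)                        # 1 s tone (stand-in for fake speech)
noisy = audio + rng.normal(scale=1e-3, size=audio.shape)   # barely audible noise

snr_db = 10 * np.log10(np.mean(audio**2) / np.mean((noisy - audio)**2))
print(round(snr_db))  # high SNR: the two signals sound essentially identical

def pcm(x):
    # Quantize to 16-bit PCM bytes, as a WAV file would store them.
    return np.clip(x * 32767, -32768, 32767).astype(np.int16).tobytes()

print(hashlib.sha256(pcm(audio)).hexdigest() == hashlib.sha256(pcm(noisy)).hexdigest())  # False
```

Re-recording through a speaker and microphone works the same way at a larger scale: the content survives, but the low-level artifacts a detector keys on are overwritten.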
It added, "Deepfakes cannot be combated by relying on technology alone. Combating them requires promoting critical thinking, media literacy, and careful verification of sources so that people do not fall victim to this dangerous form of misinformation."
