Online videos may be today’s most efficient information vector. They are the closest thing to a real-life event unfolding before your eyes, as opposed to pictures, texts, or even a combination of both. Nevertheless, the emergence of maliciously altered videos and deepfakes puts us at risk of being dangerously misinformed. To counter this, sound forensic methods, supported by purpose-built signal processing tools and artificial intelligence, are used to detect most falsifications.
Yet misinformation conducted via video often relies not on technical alteration but on false claims that accompany the footage. This can be achieved through a misleading title, a falsely alleged geographical location, or anachronism, which consists of attributing filmed events to a false period. Anachronism also includes the resurfacing of older videos, which not only adds to the spread of fake news but also hampers open source research.
The following investigation shows how an unaltered video of a real event, originally published with the correct date, confused audiences and online investigators by resurfacing under a different title in a significantly more sensitive political context.