After two and a half years of ostracism, during which her children were taken away from her, Sobha Sajju today stands vindicated.
New Delhi: If technology allowed Kerala resident Sobha Sajju’s name to be sullied, it also proved to be her salvation.
When a purported nude video of Sajju went viral two and a half years ago, her husband filed for divorce and took away their three children.
Today, she stands vindicated, with the country’s apex cyber forensics research body ruling that the woman in the video wasn’t Sajju at all.
The Centre for Development of Advanced Computing or C-DAC is said to have deployed advanced software to help Sajju emerge unscathed from the sea of stigma the video had thrust her into.
For her children
The video was first allegedly circulated on an office WhatsApp group by a colleague of her husband. The colleague, who has since been arrested, had alleged the woman was Sajju.
To most people who saw the video, however, it was clear that the woman wasn't her. But her husband, whom she suspects of having a role in its circulation, reportedly insisted during his questioning by the Kerala police cyber cell that the woman in the video was, in fact, Sajju.
Life as she knew it ended right there. He filed for divorce, and Sajju was left with just visitation rights to her children.
Her family believed her, but her husband continued to assert, even in custody hearings, that Sajju posted nude photos of herself online.
So, Sajju made up her mind that she wouldn't rest till she secured conclusive proof that it was a different woman in the video. She wanted her children to know this for sure, she told news portal The News Minute in an interview.
When police took the video to the state forensic laboratory, a poorly framed question scuttled their chances of getting the answer they needed.
“Police had asked the forensics lab to check whether the woman in the video and Sajju were similar,” said Kerala deputy superintendent of police E.S. Bijumon.
After investigating the video twice, the lab concluded that the two women did have similar physical features, without clarifying whether it implied they were the same person.
Bijumon subsequently took the case to C-DAC, making sure the question was framed in a more specific manner.
“When I met Sajju, she was worried and shaken,” he told ThePrint. “She said she was going through this trouble only because she wanted to prove to her children that she was not in the nude video,” he added.
“I took this request to C-DAC because I was touched by what she said,” Bijumon added.
Tech saves the day
C-DAC usually only takes up investigations forwarded by the CBI, NIA and courts, but agreed to look into Sajju’s case when approached by Bijumon.
The Thiruvananthapuram branch of C-DAC based its investigation on a frame-by-frame comparison of the video in question with other footage and photographs featuring Sajju.
According to Bijumon, it was a software developed by the agency itself — CyberCheck Suite — that helped crack Sajju’s case.
CyberCheck Suite analyses the date of file creation and checks whether a given video has been modified. It can also identify whether the video file was sent to other people from a mobile device.
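C-DAC has not published how CyberCheck Suite performs these checks, so the sketch below is only an illustration of the same idea using freely available tools: it reads a video file's container metadata with FFmpeg's ffprobe utility, prints the filesystem modification time, and records a hash so any later tampering can be detected. The file name is a placeholder, not the actual case file.

```python
import hashlib
import json
import os
import subprocess
from datetime import datetime, timezone

VIDEO = "clip.mp4"  # placeholder path, not the actual case file

# Container-level metadata (creation_time, encoder, duration) via ffprobe, part of FFmpeg.
probe = subprocess.run(
    ["ffprobe", "-v", "quiet", "-print_format", "json",
     "-show_format", "-show_streams", VIDEO],
    capture_output=True, text=True, check=True,
)
meta = json.loads(probe.stdout)
print("container tags:", meta["format"].get("tags", {}))

# Filesystem timestamps: a modification time later than the recorded creation
# time is one hint that the file was edited after it was first written.
st = os.stat(VIDEO)
print("modified:", datetime.fromtimestamp(st.st_mtime, tz=timezone.utc))

# A cryptographic hash pins down the exact file that was examined, so any
# subsequent tampering changes the fingerprint.
digest = hashlib.sha256(open(VIDEO, "rb").read()).hexdigest()
print("sha256:", digest)
```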
In addition, Bijumon said, C-DAC used a host of facial recognition tools: proprietary, open source and commercially available.
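The report does not name the facial recognition tools C-DAC combined, but the frame-by-frame approach it describes can be sketched with open-source libraries. The illustration below assumes the freely available OpenCV and face_recognition packages and placeholder file names: it encodes a known photograph of the person and measures how closely faces in sampled video frames match it.

```python
import cv2                      # OpenCV, for reading video frames
import face_recognition         # open-source wrapper around dlib's face embeddings

# Reference photo of the person the video is being compared against (placeholder path).
known = face_recognition.load_image_file("reference_photo.jpg")
known_encoding = face_recognition.face_encodings(known)[0]

video = cv2.VideoCapture("disputed_clip.mp4")   # placeholder path
frame_no, matches, checked = 0, 0, 0

while True:
    ok, frame = video.read()
    if not ok:
        break
    frame_no += 1
    if frame_no % 30:           # sample roughly one frame per second of 30 fps footage
        continue
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)   # face_recognition expects RGB
    for encoding in face_recognition.face_encodings(rgb):
        checked += 1
        # Lower distance means more similar; 0.6 is the library's conventional threshold.
        distance = face_recognition.face_distance([known_encoding], encoding)[0]
        matches += int(distance < 0.6)

video.release()
print(f"{matches}/{checked} detected faces matched the reference photo")
```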
Rising cases of fake videos
Even with the proof in hand, Sajju’s life, for now, remains caught up in the divorce proceedings launched by her husband.
Her ordeal shows how such videos can wreak havoc on lives in India at a time when easy internet access and social media allow content to spread quickly.
While Sajju’s case involved a similar-looking woman, commonly available technology now allows people to morph someone’s face onto a video, making such lies far easier to sell. These videos are called ‘deepfakes’.
“Deepfakes basically involve creating the face mask of an individual and superimposing it on another video,” said Samir Datt, the CEO of ForensicsGuru.com, a digital investigation solutions company.
A high-quality deepfake video requires a sophisticated system priced at Rs 50 lakh or more.
But amateur versions abound, with the rise of downloadable apps such as FakeApp and FaceSwap. A person with deeper knowledge of software can access sources easily available online to churn out convincing videos where porn stars’ faces have been replaced with those of A-list celebrities.
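The exact training code varies from tool to tool, but the architecture most of these apps share is well documented: one encoder learns a compressed representation of both faces, each person gets their own decoder, and a swap is produced by decoding person A's face through person B's decoder. The PyTorch sketch below shows only that skeleton, with made-up layer sizes and random tensors standing in for real face crops; it is an illustration of the principle, not a working deepfake pipeline.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Shared encoder: compresses a 64x64 face crop into a latent vector."""
    def __init__(self, latent=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(3 * 64 * 64, 1024), nn.ReLU(),
            nn.Linear(1024, latent), nn.ReLU(),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """One decoder per identity: reconstructs a face from the shared latent space."""
    def __init__(self, latent=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent, 1024), nn.ReLU(),
            nn.Linear(1024, 3 * 64 * 64), nn.Sigmoid(),
        )
    def forward(self, z):
        return self.net(z).view(-1, 3, 64, 64)

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()   # one decoder per identity
optim = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder_a.parameters()) + list(decoder_b.parameters()),
    lr=1e-4,
)
loss_fn = nn.L1Loss()

# Random tensors stand in for batches of aligned face crops of person A and person B.
faces_a = torch.rand(8, 3, 64, 64)
faces_b = torch.rand(8, 3, 64, 64)

# Training step: each decoder learns to reconstruct its own person from the shared latent.
loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
       loss_fn(decoder_b(encoder(faces_b)), faces_b)
optim.zero_grad()
loss.backward()
optim.step()

# The "swap": encode person A's face, then decode it with person B's decoder.
fake_b = decoder_b(encoder(faces_a))
print(fake_b.shape)   # torch.Size([8, 3, 64, 64])
```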
Rise in such cases
According to the National Crime Records Bureau, cyber crimes registered an increase of 28 per cent between 2014 and 2016.
Cyber crime is a sweeping term that covers everything from card fraud to data theft, but a big part of it is the proliferation of videos: fake, real and surreptitiously recorded.
Many of these are of a pornographic nature, often circulated by jilted lovers to hit back at exes.
“In India, over the past two years, there has been a marked rise in cyber crimes related to the circulation of non-consensual sex videos, thanks to high-speed, cheap internet services,” said Ritesh Bhatia, a Mumbai-based cybercrime consultant who has assisted corporates, individuals and law enforcement agencies over the past 15 years.
If confronted with such videos, experts say, people should look out for the glitches common in fake footage: suspiciously slow blinking, delayed body movements and abrupt changes in skin complexion are some of the telltale signs.
Datt added that even high-resolution deepfake videos might show flickering frames and blurred figures, with double edges to the face or thicker-than-normal eyebrows.
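None of the experts quoted point to a specific tool for spotting these glitches, but the blink-rate cue in particular lends itself to a simple check. The sketch below, again assuming the open-source OpenCV and face_recognition libraries and a placeholder file name, uses the standard eye-aspect-ratio measure on detected facial landmarks to count blinks in a clip; an unusually low count per minute is one warning sign.

```python
import math

import cv2
import face_recognition

def eye_aspect_ratio(eye):
    """Ratio of eye height to width; it drops sharply when the eye closes."""
    dist = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])   # the library returns six points per eye
    horizontal = dist(eye[0], eye[3])
    return vertical / (2.0 * horizontal)

video = cv2.VideoCapture("suspect_clip.mp4")      # placeholder path
fps = video.get(cv2.CAP_PROP_FPS) or 30.0
blinks, eye_closed, frames = 0, False, 0

while True:
    ok, frame = video.read()
    if not ok:
        break
    frames += 1
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # face_recognition expects RGB frames
    for landmarks in face_recognition.face_landmarks(rgb):
        ear = (eye_aspect_ratio(landmarks["left_eye"]) +
               eye_aspect_ratio(landmarks["right_eye"])) / 2.0
        if ear < 0.2:             # illustrative threshold for a closed eye
            eye_closed = True
        elif eye_closed:          # eye has reopened: count one blink
            blinks += 1
            eye_closed = False

video.release()
minutes = frames / fps / 60.0
print(f"{blinks} blinks in {minutes:.1f} minutes of footage "
      "(people typically blink about 15-20 times a minute)")
```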
Meanwhile, as regards genuine videos shared to embarrass someone, law enforcement agencies often get in touch with the service providers to get to the source.
Lawyer Bivas Chatterjee, who recently led the Bengal police cyber cell to victory as it sought to prosecute a 23-year-old who shared explicit images of a former lover, told The Telegraph how Reliance Jio and Google helped them nail the case.
“Reliance Jio provided evidence that their data had been used for the uploads; Google provided evidence that the material had been uploaded from the man’s Gmail account,” Chatterjee explained.