
Fraud at Fingertips

Aarti Aarti
10 Jul 2023

Swiss watches are costly and are believed to carry a mark-up of about ten-fold, partly because their manufacturers not only use celebrity endorsements but also set aside substantial advertising budgets to push their watches. Even in the age of artificial intelligence (AI) and chatbots, celebrities remain highly influential when they promote or endorse a product or brand. Their actions and decisions are not merely watched but often emulated by wide audiences.

So, back home, banking on the celebrity-endorsement marketing strategy, some unscrupulous elements tried their luck at impressing investors, largely from the lower middle class, using morphed pictures of Amitabh Bachchan, Mukesh Ambani, Ratan Tata, Sachin Tendulkar and others. Through their fraudulent money-multiplier (Ponzi) scheme, in which existing investors are paid with funds collected from new investors, they managed to collect a whopping Rs 75 lakh and deposit it in about 14 different bank accounts across India. Following a crackdown by the Economic Offence Wing (EOW) of the Odisha Police on July 5, the bank accounts holding the ill-gotten money are said to have been frozen.

During a series of earlier raids this year, according to media reports, the EOW found that many bogus apps were being promoted by so-called social media influencers, TV stars and minor celebrities. Most of the victims, who hoped to become rich within a short span of time, were from the lower middle class and included several students. Every depositor enrolled in the scheme had to remit money through the app, and the UPI IDs, found linked to several savings and current accounts maintained in the names of shell companies, firms and individuals, kept changing by the minute. Close to Rs 1.22 crore has since been frozen.

Notably, in March this year, while investigating complaints from nearly 800 duped investors in Odisha's Ganjam district, the EOW came across a Ponzi scheme run through an overseas-based website called "18football.com", which turned out to be a "hybrid model of fraud" operated online in the name of a football betting/gaming application. Initially the fraudsters, to gain publicity, took up social work in rural areas and even paid the promised returns for a few days, but as soon as membership swelled, they stopped making payments and shut down the website.

Should we believe what we hear in a telephone conversation? This May, a 34-year-old Delhi doctor was cheated of about Rs 4.5 crore by a gang of fraudsters posing as Mumbai Police officers and sleuths from the Customs and Narcotics divisions. According to a Times of India report, the doctor received a call about the seizure of a courier package, bound from Mumbai to Taiwan, that purportedly contained her passport, two pairs of shoes, banking documents, clothes and narcotics. When the doctor professed ignorance, she was asked to register a complaint. Her call was directed to a "Mumbai Police Inspector" who advised her to download the Skype app and register the complaint online since she was based in Delhi.

Meanwhile, the victim overheard people on the call discussing how her Aadhaar had been misused to open 23 bank accounts in Mumbai. The "Inspector" then told the doctor that she would be investigated for multiple criminal offences and placed under arrest. Following instructions, the victim shared on Skype screenshots of all her bank accounts, with balances down to the last rupee. As advised, she broke her fixed deposits and transferred Rs 4.47 crore by RTGS to an account, on the assurance that the money would be returned after verification.

She later received a complaint on the letterhead of the "Mumbai Police" and a letter from the "Reserve Bank of India". The cheats asked her to delete all the Skype conversations and await a clearance report. After a few days, she received two "clearance certificates" stating that her accounts were in order and that no illegal transactions had been detected; the Rs 4.47 crore transferred for "verification" was never returned.

On July 6, according to media reports, the Delhi Police busted a fake hospital appointment/consultation scam in Delhi-NCR, operated from several states, after receiving a complaint from an additional public prosecutor who lost Rs 1 lakh. The scammers, using fake websites of renowned Delhi-NCR hospitals, targeted people searching online for appointments or consultations. They asked victims to download a mobile app to secure the appointment and then to pay Rs 10 to confirm the booking; before the victims could realise the fraud, close to Rs 1 lakh vanished from their bank accounts.

That scammers are tapping sophisticated AI tools to wreak havoc is a cause for concern. A few weeks ago, AI-generated e-books made their way into the bestseller lists on Amazon's Kindle store. Even though the books were devoid of meaningful content, how they reached the bestseller lists remains a mystery.

Deepfake voice recordings and videos, used to fool relatives and friends into transferring money, are posing serious challenges globally. The first case of deepfake fraud is said to have taken place at a British energy firm four years ago, apparently using commercial voice-generating software. The Chief Executive Officer received an urgent call, seemingly from his superior, the chief executive of the firm's German parent company, asking him to transfer funds to a Hungarian supplier within an hour. The firm lost USD 243,000. Three years ago, the branch manager of a Japanese firm in Hong Kong received a telephone call from a man whose voice sounded like that of a director of the parent firm. The caller wanted USD 35 million transferred as the company was about to make an acquisition. Scammers had used deep-voice technology to clone the director's speech.

Recently, an Arizona mother received an unexpected phone call from what sounded like her daughter, sobbing and saying she had messed up. A man could then be heard directing the girl to put her head down and warning the mother not to approach the police. What appeared to be a kidnapping and ransom call turned out to be a complete scam: the daughter was safe, and the girl's voice on the phone was a clone created by artificial intelligence. Once scammers obtain an audio recording of someone's voice, they can replicate it within minutes; it is said that just three seconds of audio are enough to clone a human voice. To create deepfake videos, a few photos of the target's face, easily taken from social media, would suffice.

Although the contents of the long-awaited data protection law in India, recently cleared by the Union Cabinet, have not been made public, it holds hope for the average citizen. Even so, at an individual level, it is essential to be extremely cautious in cyberspace. It needs to be appreciated that a celebrity endorsement does not mean an investment is legitimate or suitable for all investors; full details of a proposed investment need to be scrutinised before putting money into it. Never click on an unknown email link, as malware could infect your device.

Mobile apps are another area of serious worry. If an app asks for too many permissions, it ought to be viewed with suspicion. Recently, an acquaintance who was expecting a courier was asked to download an app to track the status of the package. The app sought permissions to view her contacts list and photo gallery, as well as to read her SMS messages and emails. She grew jittery and consulted a friend, who advised her against installing it.

Likewise, police have advised that the photo used as a display picture on social media accounts could be used by cybercriminals for morphing. Care ought to be taken while sharing personal documents such as PAN, Aadhaar and Voter ID cards with strangers, especially over cyberspace.

As a first step, Googling for telephone numbers must be avoided because, in the absence of an authentic mobile telephone directory, one never knows who is on the other side. Remember, curiosity killed the cat. In matters of cyberspace, awareness is the best defence. It is better to be safe than sorry afterwards.
