
Inside a romance scam compound, and how people get tricked into being there
www.technologyreview.com
Heading north in the dark, the only way Gavesh could try to track his progress through the Thai countryside was by watching the road signs zip by. The Jeep's three occupants (Gavesh, a driver, and a young Chinese woman) had no languages in common, so they drove for hours in nervous silence as they wove their way out of Bangkok and toward Mae Sot, a city on Thailand's western border with Myanmar.

When they reached the city, the driver pulled off the road toward a small hotel, where another car was waiting. "I had some suspicions, like, why are we changing vehicles?" Gavesh remembers. "But it happened so fast."

They left the highway and drove on until, in total darkness, they parked at what looked like a private house. "We stopped the vehicle. There were people gathered. Maybe 10 of them. They took the luggage and they asked us to come," Gavesh says. "One was going in front, there was another one behind, and everyone said: 'Go, go, go.'"

Gavesh and the Chinese woman were marched through the pitch-black fields by flashlight to a riverside where a boat was moored. By then, it was far too late to back out.

Gavesh's journey had started, seemingly innocently, with a job ad on Facebook promising work he desperately needed. Instead, he found himself trafficked into a business commonly known as "pig butchering": a form of fraud in which scammers form romantic or other close relationships with targets online and extract money from them. The Chinese crime syndicates behind the scams have netted billions of dollars, and they have used violence and coercion to force their workers, many of them people trafficked like Gavesh, to carry out the frauds from large compounds, several of which operate openly in the quasi-lawless borderlands of Myanmar.

We spoke to Gavesh and five other workers from inside the scam industry, as well as anti-trafficking experts and technology specialists. Their testimony reveals how global companies, including American social media and dating apps and international cryptocurrency and messaging platforms, have given the fraud business the means to become industrialized. By the same token, it is Big Tech that may hold the key to breaking up the scam syndicates, if only these companies can be persuaded or compelled to act.

We're identifying Gavesh using a pseudonym to protect his identity. He is from a country in South Asia, one he asked us not to name. He hasn't shared his story much, and he still hasn't told his family. He worries about how they'd handle it.

Until the pandemic, he had held down a job in the tourism industry. But lockdowns had gutted the sector, and two years later he was working as a day laborer to support himself and his father and sister. "I was fed up with my life," he says. "I was trying so hard to find a way to get out."

When he saw the Facebook post in mid-2022, it seemed like a godsend. A company in Thailand was looking for English-speaking customer service and data entry specialists. The monthly salary was $1,500, far more than he could earn at home, with meals, travel costs, a visa, and accommodation included. "I knew if I got this job, my life would turn around. I would be able to give my family a good life," Gavesh says.

What came next was life-changing, but not in the way Gavesh had hoped. The advert was a fraud, and a classic tactic syndicates use to force workers like Gavesh into an economy that operates as something like a dark mirror of the global outsourcing industry.
The true scale of this type of fraud is hard to estimate, but the United Nations reported in 2023 that hundreds of thousands of people had been trafficked to work as online scammers in Southeast Asia. One 2024 study, from the University of Texas, estimates that the criminal syndicates that run these businesses have stolen at least $75 billion since 2020.

These schemes have been going on for more than two decades, but they've started to capture global attention only recently, as the syndicates running them increasingly shift from Chinese targets toward the West. And even as investigators, international organizations, and journalists gradually pull back the curtain on the brutal conditions inside scamming compounds and document their vast scale, what is far less exposed is the pivotal role platforms owned by Big Tech play throughout the industry, from initially coercing individuals to become scammers to, finally, duping scam targets out of their life savings.

As losses mount, governments and law enforcement agencies have looked for ways to disrupt the syndicates, which have become adept at using ungoverned spaces in lawless borderlands and partnering with corrupt regimes. But on the whole, the syndicates have managed to stay a step ahead of law enforcement, in part by relying on services from the world's tech giants.

Apple iPhones are their preferred scamming tools. Meta-owned Facebook and WhatsApp are used to recruit people into forced labor, as is Telegram. Social media and messaging platforms, including Facebook, Instagram, WhatsApp, WeChat, and X, provide spaces for scammers to find and lure targets. So do dating apps, including Tinder. Some of the scam compounds have their own Starlink terminals. And cryptocurrencies like tether and global crypto platforms like Binance have allowed the criminal operations to move money with little or no oversight.

Scam workers sit inside Myanmar's KK Park, a notorious fraud hub near the border with Thailand, following a recent crackdown by law enforcement. REUTERS

"Private-sector corporations are, unfortunately, inadvertently enabling this criminal industry," says Andrew Wasuwongse, the Thailand country director at the anti-trafficking nonprofit International Justice Mission (IJM). "The private sector holds significant tools and responsibility to disrupt and prevent its further growth."

Yet while the tech sector has, slowly, begun to roll out anti-scam tools and policies, experts in human trafficking, platform integrity, and cybercrime tell us that these measures largely focus on the downstream problem: the losses suffered by the victims of the scams. That approach overlooks the other set of victims, often from lower-income countries, at the far end of a fraud supply chain that is built on human misery, and on Big Tech.

Meanwhile, the scams continue on a mass scale. Tech companies could certainly be doing more to crack down, the experts say. Even relatively small interventions, they argue, could start to erode the business model of the scam syndicates; with enough of these, the whole business could start to founder.

"The trick is: How do you make it unprofitable?" says Eric Davis, a platform integrity expert and senior vice president of special projects at the Institute for Security and Technology (IST), a think tank in California. "How do you create enough friction?"
That question is only becoming more urgent as many tech companies pull back on efforts to moderate their platforms, artificial intelligence supercharges scam operations, and the Trump administration signals broad support for deregulation of the tech sector while withdrawing support from organizations that study the scams and support the victims. All these trends may further embolden the syndicates. And even as the human costs keep building, global governments exert ineffectual pressure, if any at all, on the tech sector to turn its vast financial and technical resources against a criminal economy that has thrived in the spaces Silicon Valley built.

Capturing a vulnerable workforce

The roots of pig butchering scams reach back to the offshore gambling industry that emerged from China in the early 2000s. Online casinos had become hugely popular in China, but the government cracked down, forcing the operators to relocate to Cambodia, the Philippines, Laos, and Myanmar. There, they could continue to target Chinese gamblers with relative impunity. Over time, the casinos began to use social media to entice people back home, deploying scam-like tactics that frequently centered on attractive and even nude dealers.

"Often the romance scam was a part of that, building romantic relationships with people that you eventually would aim to hook," says Jason Tower, Myanmar country director at the United States Institute of Peace (USIP), a research and diplomacy organization funded by the US government, who researches the cyber scam industry. (USIP's leadership was recently targeted by the Trump administration and Elon Musk's Department of Government Efficiency task force, leaving the organization's future uncertain; its website, which previously housed its research, is also currently offline.)

By the late 2010s, many of the casinos were big, professional operations. Gradually, says Tower, the business model turned more sinister, with a tactic called "sha zhu pan" in Chinese emerging as a core strategy. Scamming operatives work to fatten up or "cultivate" a target by building a relationship before going in for the "slaughter": persuading them to invest in a supposedly once-in-a-lifetime scheme and then absconding with the money. "That actually ended up being much, much more lucrative than online gambling," Tower says. (The international law enforcement organization Interpol no longer uses the graphic term "pig butchering," citing concerns that it dehumanizes and stigmatizes victims.)

Like other online industries, the romance scamming business was supercharged by the pandemic. There were simply more isolated people to defraud, and more people out of work who might be persuaded to try scamming others, or who were vulnerable to being trafficked into the industry.

Initially, most of the workers carrying out the frauds were Chinese, as were the fraud victims. But after the government in Beijing tightened travel restrictions, making it hard to recruit Chinese laborers, the syndicates went global. They started targeting more Western markets and turning, Tower says, to "much more malign types of approaches to tricking people into scam centers."

Getting recruited

Gavesh was scrolling through Facebook when he saw the ad. He sent his résumé to a Telegram contact number.
A human resources representative replied and had him demonstrate his English and typing skills over video. It all felt very professional. "I didn't have any reason to suspect," he says.

The doubts didn't really start until after he reached Bangkok's Suvarnabhumi Airport. After being met at arrivals by a man who spoke no English, he was left to wait. As time ticked by, it began to occur to Gavesh that he was alone, with no money, no return ticket, and no working SIM card. Finally, the Jeep arrived to pick him up.

Hours later, exhausted, he was on a boat crossing the Moei River from Thailand into Myanmar. On the far bank, a group was waiting. One man was in military uniform and carried a gun. "In my country, if we see an army guy when we are in trouble, we feel safe," Gavesh says. "So my initial thoughts were: Okay, there's nothing to be worried about."

They hiked a kilometer across a sodden paddy field and emerged at the other side caked in mud. There a van was parked, and the driver took them to what he called, in broken English, "the office." They arrived at the gate of a huge compound, surrounded by high walls topped with barbed wire.

While some people are drawn into online scamming directly by friends and relatives, Facebook is, according to IJM's Wasuwongse, the most common entry point for people recruited on social media.

Meta has known for years that its platforms host this kind of content. Back in 2019, the BBC exposed slave markets that were running on Instagram; in 2021, the Wall Street Journal reported, drawing on documents leaked by a whistleblower, that Meta had long struggled to rein in the problem but took meaningful action only after Apple threatened to pull Instagram from its app store. Today, years on, ads like the one that Gavesh responded to are still easy to find on Facebook if you know what to look for.

Examples of fraudulent Facebook ads, shared by International Justice Mission.

They are typically posted in job seekers' groups and usually seem to be advertising legitimate jobs in areas like customer service. They offer attractive wages, especially for people with language skills, usually English or Chinese. The traffickers tend to finish the recruitment process on encrypted or private messaging apps.

In our research, many experts said that Telegram, which is notorious for hosting terrorist content, child sexual abuse material, and other communication related to criminal activity, was particularly problematic. Many spoke with a combination of anger and resignation about its apparent lack of interest in working with them to address the problem; Mina Chiang, founder of Humanity Research Consultancy, an anti-trafficking organization, accuses the app of being "very much complicit in human trafficking and proactively facilitating these scams." (Telegram did not respond to a request for comment.)

But while Telegram users have the option of encrypting their messages end to end, making them almost impossible to monitor, social media companies are of course able to access users' posts. And it's here, at the beginning of the romance scam supply chain, where Big Tech could arguably make its most consequential intervention.

Social media is monitored by a combination of human moderators and AI systems, which help flag users and content (ads, posts, pages) that break the law or violate the companies' own policies. Dangerous content is easiest to police when it follows predictable patterns or is posted by users acting in distinctive and suspicious ways.
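Those patterns are concrete enough to be encoded in software. The sketch below is purely illustrative: the marker phrases, weights, and review threshold are assumptions for the sake of the example, not Meta's or any platform's actual system, and a real moderation stack would combine trained classifiers, account-level signals, and human review.

```python
# Illustrative only: a toy filter for the kind of formulaic scam job ads
# anti-trafficking groups describe. Marker patterns, weights, and the review
# threshold are hypothetical.
import re

SCAM_AD_MARKERS = {
    r"\$1[,.]?\d{3}\s*(per|/)\s*month": 2.0,                        # implausibly high salary
    r"no experience (needed|required)": 1.0,
    r"(visa|accommodation|meals)[^.]*?(included|provided)": 1.5,    # relocation perks bundled in
    r"contact (us )?(on|via) telegram": 2.5,                        # recruitment pushed off-platform
    r"(customer service|data entry)[^.]*?(english|chinese) speakers?": 1.0,
}

def score_job_ad(text: str) -> float:
    """Sum the weights of every marker pattern found in the ad text."""
    lowered = text.lower()
    return sum(weight for pattern, weight in SCAM_AD_MARKERS.items()
               if re.search(pattern, lowered))

def needs_human_review(text: str, threshold: float = 3.0) -> bool:
    """Queue the ad for a moderator once its score crosses the threshold."""
    return score_job_ad(text) >= threshold

if __name__ == "__main__":
    ad = ("Customer service / data entry, English speakers wanted in Thailand. "
          "$1,500 per month, visa and accommodation included. Contact us on Telegram.")
    print(score_job_ad(ad), needs_human_review(ad))  # 7.0 True
```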
Anti-trafficking experts say the scam advertising tends to follow formulaic templates and use common language, and that they routinely report the ads to Meta and point out the markers they have identified. Their hope is that this information will be fed into the data sets that train the content moderation models.

While individual ads may be taken down, even in big waves (last November, Meta said it had purged 2 million accounts connected to scamming syndicates over the previous year), experts say that Facebook still continues to be used in recruiting. And new ads keep appearing.

(In response to a request for comment, a Meta spokesperson shared links to policies about bans on content or advertisements that facilitate human trafficking, as well as company blog posts telling users how to protect themselves from romance scams and sharing details about the company's efforts to disrupt fraud on its platforms, one stating that it is "constantly rolling out new product features to help protect people on [its] apps from known scam tactics at scale." The spokesperson also said that WhatsApp has spam detection technology, and millions of accounts are banned per month.)

Anti-trafficking experts we spoke with say that as recently as last fall, Meta was engaging with them and had told them it was ramping up its capabilities. But Chiang says there still isn't enough urgency from tech companies. "There's a question about speed. They might be able to say, 'That's the goal for the next two years.' No. But that's not fast enough. We need it now," she says. "They have financial resources. You can hire the most talented coding engineers in the world. Why can't you just find people who understand the issue properly?"

Part of the answer comes down to money, according to experts we spoke with. Scaling up content moderation and other processes that could cause users to be kicked off a platform requires not only technological staff but also legal and policy experts, which not everyone sees as worth the cost.

"The vast majority of these companies are doing the minimum or less," says Tower of USIP. "If not properly incentivized, either through regulatory action or through exposure by media or other forms of pressure often, these companies will underinvest in keeping their platforms safe."

Getting set up

Gavesh's new office turned out to be one of the most infamous scamming hubs in Southeast Asia: KK Park, in Myanmar's Myawaddy region. Satellite imagery shows it as a densely packed cluster of buildings, surrounded by fields. Most of it has been built since late 2019.

Inside, it runs like a hybrid of a company campus and a prison. When Gavesh arrived, he handed over his phone and passport and was assigned to a dormitory and an employer. He was allowed his own phone back only for short periods, and his calls were monitored. Security was tight. He had to pass through airport-style metal detectors when he went in or out of the office. Black-uniformed personnel patrolled the buildings, while armed men in combat fatigues watched the perimeter fences from guard posts.

On his first full day, he was put in front of a computer with just four documents on it, which he had to read over and over: guides on how to approach strangers. On his second day, he learned to build fake profiles on social media and dating apps.
The trick was to find real people on Instagram or Facebook who were physically attractive, posted often, and appeared to be wealthy and living a luxurious life, he says, and use their photos to build a new account: "There are so many Instagram models that pretend they have a lot of money."

After Gavesh was trafficked into Myanmar, he was taken to KK Park. Most of the compound has been built since late 2019. LUKE DUGGLEBY/REDUX

Next, he was given a batch of iPhone 8s (most people on his team used between eight and 10 devices each) loaded with local SIM cards and apps that spoofed their location so that they appeared to be in the US. Using male and female aliases, he set up dozens of accounts on Facebook, WhatsApp, Telegram, Instagram, and X, and profiles on several dating platforms, though he can't remember exactly which ones.

Different scamming operations teach different techniques for finding and reaching out to potential victims, several people who worked in the compounds tell us. Some people used direct approaches on dating apps, Facebook, Instagram, or, for those targeting Chinese victims, WeChat. One worker from Myanmar sent out mass messages on WhatsApp, pretending to have accidentally messaged a wrong number, in the hope of striking up a conversation. (Tencent, which owns WeChat, declined to comment.)

Some scamming workers we spoke to were told to target white, middle-aged or older men in Western countries who seemed to be well off. Gavesh says he would pretend to be white men and women, using information found from Google to add verisimilitude to his claims of living in, say, Miami Beach. He would chat with the targets, trying to figure out from their jobs, spending habits, and ambitions whether they'd be worth investing time in.

One South African woman, trafficked to Myanmar in 2022, says she was given a script and told to pose as an Asian woman living in Chicago. She was instructed to study her assigned city and learn quotidian details about life there. "They kept on punishing people all the time for not knowing or for forgetting that they're staying in Chicago," she says, or for forgetting "what's Starbucks or what's [a] latte."

Fake users have, of course, been a problem on social media platforms and dating sites for years. Some platforms, such as X, allow practically anyone to create accounts and even to have them verified for a fee. Others, including Facebook, have periodically conducted sweeps to get rid of fake accounts engaged in what Meta calls "coordinated inauthentic behavior." (X did not respond to requests for comment.)

But scam workers tell us they were advised on simple ways to circumvent detection mechanisms on social media. They were given basic training in how to avoid suspicious behavior, such as adding too many contacts too quickly, which might trigger the company to review whether someone's profile is authentic. The South African woman says she was shown how to manipulate the dates on a Facebook account to seem "as if you opened the account in 2019 or whatever," making it easier to add friends. (Meta's spam filters, which are meant to reduce the spread of unwanted content, include limits on friend requests and bulk messaging.)

Dating apps, whose users generally hope to meet other users in real life, have a particular need to make sure that people are who they say they are.
But Match Group, the parent company of Tinder, ended its partnership with a company doing background checks in 2023. It now encourages users to verify their profile with a selfie and further ID checks, though insiders say these systems are often rudimentary. "They just check a box and [do] what is legally required or what will make the media get off of [their] case," says one tech executive who has worked with multiple dating apps on safety systems, speaking on the condition of anonymity because they were not permitted to speak about their work with certain companies.

Fangzhou Wang, an assistant professor at the University of Texas at Arlington who studies romance scams, ran a test: She set up a Tinder profile with a picture of a dog and a bio that read, "I am a dog." It passed through the platform's verification system without a hitch. "They are not providing enough security measures to filter out fraudulent profiles," Wang says. "Everybody can create anything."

Like recruitment ads, the scam profiles tend to follow patterns that should raise red flags. They use photos copied from existing users or made by artificial intelligence, and the accounts are sometimes set up using phone numbers generated by voice-over-internet-protocol services. Then there's the scammers' behavior: They swipe too fast, or spend too much time logged in. "A normal human doesn't spend eight hours on a dating app a day," the tech executive says. What's more, scammers use the same language over and over again as they reach out to potential targets. "The majority of them are using predesigned scripts," says Wang.

It would be fairly easy for platforms to detect these signs and either stop accounts from being created or make the users go through further checks, experts tell us. Signals of some of these behaviors "can potentially be embedded into a type of machine-learning algorithm," Wang says. She approached Tinder a few years ago with her research into the language that scammers use on the platforms, and offered to help build data sets for its moderation models. She says the company didn't reply.

(In a statement, Yoel Roth, vice president of trust and safety at Match Group, said that the company "invests in proactive tools, advanced detection systems and user education to help prevent harm." He wrote, "We use proprietary AI-powered tools to help identify scammer messaging, and unlike many platforms, we moderate messages, which allows us to detect suspicious patterns early and act quickly," adding that the company has recently worked with Reality Defender, a provider of deepfake detection tools, to strengthen its ability to detect AI-generated content. A company spokesperson reported having no record of Wang's outreach but said that the company "welcome[s] collaboration" and "[is] always open to reviewing research that can help strengthen user safety.")

A recent investigation published in The Markup found that Match Group has long possessed the tools and resources to track sex offenders and other bad actors but has resisted efforts to roll out safety protocols for fear they might slow growth.

This tension, between the desire to keep increasing the number of users and the need to ensure that these users and their online activity are authentic, is often behind safety issues on platforms. While no platform wants to be a haven for fraudsters, identity verification creates friction for users, which stops real people as well as impostors from signing up. And again, cracking down on platform violations costs money.
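To make the behavioral signals described above concrete, here is a minimal, hypothetical sketch of the kind of scoring Wang alludes to: swipe velocity, time logged in, and near-identical scripted openers combined into a single review-priority signal. The field names, thresholds, and weights are assumptions for illustration only, not any platform's real pipeline.

```python
# Illustrative only: a toy scorer for the behavioral signals experts describe
# (swipe velocity, session length, repeated scripted messages). All numbers
# and field names are hypothetical.
from dataclasses import dataclass
from difflib import SequenceMatcher

@dataclass
class AccountActivity:
    swipes_per_hour: float        # how fast the account swipes
    hours_online_per_day: float   # total daily session time
    opening_messages: list[str]   # first messages sent to different matches

def script_similarity(messages: list[str]) -> float:
    """Average pairwise similarity of opening messages; near 1.0 suggests a script."""
    if len(messages) < 2:
        return 0.0
    pairs = [(a, b) for i, a in enumerate(messages) for b in messages[i + 1:]]
    return sum(SequenceMatcher(None, a, b).ratio() for a, b in pairs) / len(pairs)

def risk_score(act: AccountActivity) -> float:
    """Combine behavioral signals into a single review-priority score."""
    score = 0.0
    if act.swipes_per_hour > 200:        # far faster than a typical user
        score += 1.0
    if act.hours_online_per_day > 8:     # "a normal human doesn't spend eight hours a day"
        score += 1.0
    score += 2.0 * script_similarity(act.opening_messages)
    return score

# Example: an account that swipes constantly and sends near-identical openers
suspect = AccountActivity(
    swipes_per_hour=350,
    hours_online_per_day=11,
    opening_messages=["Hi dear, I think I messaged the wrong number!",
                      "Hi dear, I think I message the wrong number!",
                      "Hi dear, I think I messaged a wrong number!"],
)
print(round(risk_score(suspect), 2))  # high score -> queue for manual review or extra ID checks
```

A score like this would not be a verdict on its own; it is the sort of signal that could route an account into the "further checks" the experts mention, which is exactly where the cost and friction trade-offs discussed below come in.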
According to Josh Kim, an economist who works in Big Tech, it would be costly for tech companies to build out the legal, policy, and operational teams for content moderation tools that could get users kicked off a platform, and the expense is one companies may find hard to justify in the current business climate.

"The shift toward profitability means that you have to be very selective in where you invest the resources that you have," he says. "My intuition here is that unless there are fines or pressure from governments or regulatory agencies or the public themselves," he adds, "the current atmosphere in the tech ecosystem is to focus on building a product that is profitable and grows fast, and things that don't contribute to those two points are probably being deprioritized."

Getting online, and staying in line

At work, Gavesh wore a blue tag, marking him as belonging to the lowest rank of workers. "On top of us are the ones who are wearing the yellow tags; they call themselves HR or translators, or office guys," he says. "Red tags are team leaders, managers ... And then moving from that, they have black and ash tags. Those are the ones running the office." Most of the latter were Chinese, Gavesh says, as were the really big bosses, who didn't wear tags at all.

Within this hierarchy operated a system of incentives and punishments. Workers who followed orders and proved successful at scamming could rise through the ranks to training or supervisory positions, and gain access to perks like restaurants and nightclubs. Those who failed to meet the targets or broke the rules faced violence and humiliation. Gavesh says he was once beaten because he broke an unwritten rule that it was forbidden to cross your legs at work. Yawning was banned, and bathroom breaks were limited to two minutes at a time.

KATHERINE LAM

Beatings were usually conducted in the open, though the most severe punishments at Gavesh's company happened in a room called the "water jail." One day a coworker was there alongside the others, and the next day he was not, Gavesh recalls. When the colleague was brought back to the office, he had been so badly beaten he couldn't walk or speak. "They took him to the front, and they said: If you do not listen to us, this is what will happen to you."

Gavesh was desperate to leave but felt there was no chance of escaping. The armed guards seemed ready to shoot, and there were rumors in the compound that some people who jumped the fence had been found drowned in the river.

This kind of physical and psychological abuse is routine across the industry. Gavesh and others we spoke to describe working 12 hours or more a day, without days off. They faced strict quotas for the number of scam targets they had to have on the hook. If they failed to reach them, they were punished. The UN has documented cases of torture, arbitrary detention, and sexual violence in the compounds. We h