The Sun Has Set on the British Empire

“Sometimes I wonder whether the world is being run by smart people who are putting us on, or by imbeciles who really mean it.” — attributed to Mark Twain

Some ex-politicians aren’t worth the cost of the bullet it would take to assassinate them. I can’t name this particular one because of the Chatham House Rule, but if you’re in Atherton, California and hear a British accent, jolly good.

An American technology company hired a former British politician to discuss artificial intelligence in Singapore, which sounds like globalization perfected but is the opposite. Once you realize AI is a global race, one more easily won without much oversight, alliances between “Big Tech” and politicians make sense for them but not for us.

“The majority of targets in Gaza are bombed as a result of artificial intelligence targeting.” — Julian Assange

Raised in Silicon Valley, I’ve seen this marketing blitz before, and yesterday’s laissez-faire has created today’s trillion-dollar internet winners. The cost so far? Our privacy and competent journalists. Tech companies know they’re easy targets, one reason they ensure algorithms prioritize stories about Chinese spyware and Russian hackers. As a result, you can’t find any useful English articles online about Chinese technology unless BYD, which works with American Berkshire Hathaway, is involved.

In the interview, the Big Tech representative advanced the following points: 1) we are not asking for private data; 2) AI models use public data; 3) AI models in training try to identify patterns, not people; 4) openness remains key; and 5) AI has democratized information collection.

Lovely, but misleading. First, what is “private” data? As the recent 23andMe fiasco shows, even your genetic information can be used by a third party if certain superseding events occur. Meanwhile, if you own a credit card, data brokers know almost everything about you, and lawyers can access, for a few dollars, your date of birth, your addresses over the past ten years, and your criminal and bankruptcy histories.

“Americans have always been concerned about the government gathering personal information about us to exert control. The real hoover is not [J.] Edgar Hoover but Big Data vacuuming up the information we voluntarily share.” — Tom Simon, former FBI Special Agent

The Big Tech representative also praised his company’s willingness to share, without fees, foundational AI models. Such foundational models are expensive to build and require not only data but proper data analysis for training, so only a small number of Chinese and American corporations own the most advanced ones. “Free” sounds enticing, but models require our participation to be relevant, and as the saying goes, “If something is free, you’re the product.”

Incredibly, the representative didn’t realize he had admitted Big Tech has a monopoly on advanced AI models, one that can only be challenged by foreign companies, thus incentivizing trade restrictions against Chinese competitors. On the plus side, we heard AI has proven useful in pursuing bad actors online who create deepfakes and other misleading content. Yet Big Tech’s tentacles are so wide-ranging that the only real defense is misinformation, i.e., clogging up public data with false information to retain a measure of privacy.

When corporate and political leaders rail against disinformation, oftentimes, they are advocating a green field for Big Tech’s tools, one where anyone who doesn’t fall in line is considered a weed. Most distressingly, nowhere did the representative argue in favor of due process. What if AI wrongly identifies a satirist as a bot, or a political idealist as a potential terrorist? Consider a recent headline from Bloomberg: “AI Detectors Falsely Accuse Students of Cheating—With Big Consequences. About two-thirds of teachers report regularly using tools for detecting AI-generated content. At that scale, even tiny error rates can add up quickly.”

2% isn’t a “tiny” error rate by my definition, and it ought to cause more outrage that a teenager’s future can be destroyed by a well-meaning teacher using incompetent AI, in a country with no incentive to reform the AI and no consequences for the teacher. According to the representative, however, AI has only aided and abetted problems we are already aware of.
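The arithmetic behind that outrage is easy to check. A minimal back-of-envelope sketch, where the ~2% false-positive rate echoes the figure discussed above and every other number (essays per student, students screened) is a purely hypothetical assumption:

```python
# Back-of-envelope: how a "tiny" false-positive rate scales with volume.
# All inputs except the ~2% rate are illustrative assumptions, not data.
false_positive_rate = 0.02       # ~2% of honest essays wrongly flagged
essays_per_student = 10          # hypothetical essays checked per student per year
students_screened = 1_000_000    # hypothetical number of students screened

total_checks = students_screened * essays_per_student
expected_false_flags = total_checks * false_positive_rate

# Chance an honest student is flagged at least once during the year,
# assuming each check is independent:
p_flagged_at_least_once = 1 - (1 - false_positive_rate) ** essays_per_student

print(f"{expected_false_flags:,.0f} expected false accusations "
      f"across {total_checks:,} checks")
print(f"{p_flagged_at_least_once:.0%} chance an honest student "
      f"is flagged at least once")
```

Under these assumed numbers, a 2% error rate produces hundreds of thousands of false accusations per year, and roughly one in five honest students gets flagged at least once.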

Regarding problems, it’s no secret Western politics has degenerated since the advent of social media’s context destruction machines. Does the representative accept any blame? Absolutely not. He said political de-globalization has created more complexity, and voters have responded to less personal control by pressuring domestic politicians. Moreover, “only” 3% of content on the app is political, and “users go to X.com to yell at each other about politics.” As for the destruction of journalism, the representative returned to the mantra of “free,” saying Big Tech makes certain AI models entirely free for content generation. (I had to look around to see if a Santa Claus suit was nearby for prop purposes.) In an attempt to minimize social media’s theft of public information, the representative then said, “You don’t need that much content from publishers for AI models.” In other words, social media’s AI models have reduced the need for paid or reputable journalists and now rely on a small pool of prompts and LLMs considered reliable. I immediately saw every future Mike Royko, Dave Barry, and Molly Ivins disappear from existence.

The representative moved onward, saying social media merely “changed journalism by changing the economics of advertising.” That’s like saying the atomic bomb didn’t end war; it just changed warfare by altering the risk-reward ratio of military spending. Accurate, perhaps, but with no consideration of what might have been. Interestingly, the representative argued corporations don’t decide whether to go to war, whom to arrest, or what people’s taxes are, so they are less powerful and consequential than governments. Without context, the statement appears true, until one examines defense contractor lobbying, corporate lobbying, and private security firms’ relationships with police departments. The session ended shortly after we were told governments were too slow to respond to social media’s rise, and now the sentiment is to move fast. Not fast enough, if you ask me.

© Matthew Rafat (October 22, 2024 from Singapore) 
