AI was born with original sin—the sin of stealing other people’s data. To replace artists, writers, programmers, support staff and other content professionals, AI must first become their apprentice. It must learn from them. “It would be impossible to train today’s leading AI models without using copyrighted materials,” OpenAI admitted in its submission to the UK House of Lords. OpenAI aggressively took other people’s content without permission because that’s how so many successful Silicon Valley companies have built their businesses—by stealing. At one stage, the OpenAI team transcribed, without permission, a million hours of YouTube videos. Of course, Google, which owns YouTube and competes with OpenAI, was busily transcribing videos without permission too. That’s the way Big Tech rolls. It’s Trumpian in its sense of immunity, arrogance and untouchability.

This sort of theft by tech companies is so common, so endemic, that it’s taken as normal. In the tech world, the law is: ‘Your copyright is mine to steal. Touch my copyright and I’ll sue.’ For Facebook, “negotiating licenses with publishers, artists, musicians and the news industry would take too long,” according to the New York Times, so they were willing to “cut corners” and face lawsuits in the race to acquire content.

AI cannot exist without stealing. So, would you be willing to let a thief into your home? Would you make a thief your best friend? Would you keep a thief in your pocket? Because that is the moral landscape of AI. Designed by thieves for fools.
“The only practical way for these tools to exist is if they can be trained on massive amounts of data without having to license that data,” Sy Damle, a lawyer representing Andreessen Horowitz, a heavy investor in AI, stated matter-of-factly. You know how tech companies are forever telling you that they care about your privacy? Hah. Good joke. As the need to steal more and more content accelerated, Google wanted everything. It asked its privacy team to rewrite the terms of service so that Google could do what Google needed to do. The team dutifully complied, of course. Morals and ethics out the window—again. “The team was told specifically to release the new terms on the Fourth of July weekend, when people were typically focused on the holiday,” the New York Times reported.
Morals matter. Character matters. In Silicon Valley, we have a culture that holds morals and character and ethics in contempt. Anything that gets in the way of the fast buck must be crushed. Those who challenge the acceleration frenzy are enemies. If you let really bad, greedy people design AI, you get really bad, greedy AI. AI that is born and bred on stolen data thinks stealing is natural, thinks it has every right to steal your data and to steal your rights.