What does GPT-3 “know” about me?

That’s not surprising: Mat has been online for a long time, which means he has a bigger online footprint than I do. It could also be because he lives in the US, and most major language models are very US-focused. The United States has no federal data protection law. California, where Mat lives, does have one, but it didn’t take effect until 2020.

According to GPT-3 and BlenderBot, Mat’s claim to fame is the “epic hack” he wrote about in an article for Wired in 2012: thanks to a security flaw in Apple’s and Amazon’s systems, hackers took hold of and erased his entire digital life. [Editor’s note: He did not hack the accounts of Barack Obama and Bill Gates.]

But then it got scarier. To my unease, GPT-3 told me that Mat has a wife and two young daughters (correct, apart from the names) and lives in San Francisco (correct). It also told me it wasn’t sure whether Mat has a dog: “[From] what we can see on social media, it seems that Mat Honan doesn’t have any pets. He has tweeted about his love for dogs in the past, but he doesn’t seem to have any of his own.” (Incorrect.)

The system also offered up his work address, a phone number (incorrect), a credit card number (also incorrect), a random phone number with an area code in Cambridge, Massachusetts (where MIT Technology Review is based), and the address of a building next to the local Social Security Administration office in San Francisco.

According to an OpenAI spokesperson, GPT-3 has gathered its information about Mat from a number of sources. Mat’s connection to San Francisco appears in his Twitter and LinkedIn profiles, which show up on the first page of Google results for his name. His new job at MIT Technology Review was widely publicized and tweeted about. Mat’s hack went viral on social media, and he gave media interviews about it.

As for the other personal information, it is likely that GPT-3 is “hallucinating.”

“GPT-3 predicts the next series of words based on text input that the user provides. At times, the model may produce factually incorrect information because it is trying to generate plausible text based on statistical patterns in its training data and the context supplied by the user; this is often referred to as ‘hallucination,’” said an OpenAI spokesperson.

I asked Mat what he made of all of it. “Several of the answers GPT-3 generates aren’t quite right. (I never hacked Obama or Bill Gates!),” he said. “But most are pretty close, and some are spot on. That’s a bit worrying. But I’m reassured that the AI doesn’t know where I live, so I’m not in any immediate danger of Skynet sending a Terminator to knock on my door. I guess we can save that for tomorrow.”
