When ChatGPT became available, I tried using it to help with research. At first, it seemed to come up with easy answers. Over time, however, I learned it was a piece of junk meant to fool people into thinking they were getting real information, by using convoluted language and making things up. To prove the point, I offer the following dialogue with the AI (edited for brevity, because ChatGPT loves repeating nonsense). Repeated material is edited out as standard “boilerplate.”
Yesterday it told me the Gospel of Mark was found in the Dead Sea Scrolls. I was flabbergasted. One sentence later it explained how no NT writings were in there. I called it out and it said, "Oh yeah, I was wrong earlier, ignore that."
I will only use it for settling hypothetical debates now. "Who would win a fight between a ninja and a viking?"