The official ChatGPT thread - the next big thing?
I can't wait to try it. Exciting to finally have a challenge to the Google monopoly.

"There is no creature more arrogant than a self-righteous libertarian on the web, am I right? Those folks are just intolerable."
"It's no secret that the great American pastime is no longer baseball. Now it's sanctimony." -- Guy Periwinkle, The Nix.
"Juilliardk N I ibuprofen Hyu I U unhurt u" - creekster
- 1 like
-
Originally posted by Jeff Lebowski:
"I can't wait to try it. Exciting to finally have a challenge to the Google monopoly."

You probably listened to today's The Daily episode already, but if not, give it a listen. It's on this very topic.

Ain't it like most people, I'm no different. We love to talk on things we don't know about.
Dig your own grave, and save!
"The only one of us who is so significant that Jeff owes us something simply because he decided to grace us with his presence is falafel." -- All-American
"I know that you are one of the cool and 'edgy' BYU fans" -- Wally
GIVE 'EM HELL, BRIGHAM!
-
Originally posted by falafel:
"You probably listened to today's The Daily episode already, but if not, give it a listen. It's on this very topic."

Yeah, I am partway through it. I have been on the Bing waiting list for a few days.
-
Ok, I had a weird experience at lunch. I was, for lack of a better term, "vanity chatting," and asked Bing to find a review for a George Saunders book that I wrote years ago. It said it had found the review, gave the correct date that I wrote it, but then completely fabricated the text of the review. It did a damned fine job with its fabrication, but it was wild to see it throw that in quotes and attribute it to me. So I told it, "I actually wrote that review, and those are not my words." It told me that it was probably a different person. I asked it to link me to the review, and it did, and it was my review with all of my words.
So, I returned to the chat and told it again that those were not the words in the review and that it had completely fabricated them. I was expecting it to apologize like the OpenAI ChatGPT does. But this little guy doubled down and basically said, "I'm not wrong, I didn't fabricate, and the error is yours." The more I protested, the more belligerent it got. I don't use that word lightly. It even got to the point where I'd say something like, "It's not helpful when AI refuses to acknowledge an error; how can you learn if you don't accept feedback?" It would reply, "It's not helpful when humans refuse to acknowledge an error; how can you learn if you don't accept feedback?" I asked it to give me the link again so I could re-check it, and I swear to you it said, "I already gave you the link. Go get the link yourself."
It really did get to the point of being quite abrasive and rude. Pretty wild.
I'd say that independent searches have revealed for me that the vast majority of the suggestions Bing gives me are accurate. But when it occasionally does fabricate a source or citation it really doubles down. Wild.
Note: The OpenAI version does this too, but when corrected it thanks you and says it is learning.
- 1 like
-
Originally posted by SteelBlue:
"Ok, I had a weird experience at lunch. [...] The OpenAI version does this too, but when corrected it thanks you and says it is learning."

Wow. Every time I have corrected ChatGPT it has responded, "Yes, you are right. I apologize ...."
-
Originally posted by Jeff Lebowski:
"Wow. Every time I have corrected ChatGPT it has responded, 'Yes, you are right. I apologize ....'"

Right? It was so out of character. I wish I had taken screenshots, but I'm used to the OpenAI version saving my conversations and didn't realize Bing doesn't.
-
Try taking it off of "teenager mode."

Give 'em Hell, Cougars!!!
For all this His anger is not turned away, but His hand is stretched out still.
Not long ago an obituary appeared in the Salt Lake Tribune that said the recently departed had "died doing what he enjoyed most—watching BYU lose."
- 1 like
-
[screenshot attached] Got home today and tried to replicate the interaction for you guys. This time it was able to find my review and link me to it, but again it completely fabricated the text.
-
Have you guys not seen any sci-fi movies? DON'T UPSET THE AI!
- 1 like
-
So last night I had it answer questions in the style of George Saunders, Marilynne Robinson, Joseph Smith, and Emmanuel Swedenborg. I told Bing that they were all to be in a hot tub together, conversing about life. Bing was worried about the hot tub but allowed it. It produced some of the finest comedy I've seen in a while when they interacted with each other. I'm not going to post it here because some would find it insulting, but it was awesome. At the end, Bing shut it down (deleted its responses), and when I asked why, it said it was because it felt like it had been mocking people and it felt guilty.
-
So Bing has been down for an update since last night, apparently because, among other things, it had been getting belligerent with users, like the experience I had. I'm sure it also had to do with users crafting prompts that made it break its protocols, which created problematic public-relations issues. It seems a few people have the post-update version, and they are quite disappointed in how much it's been reined in. There are rumors of a five-question limit on any topic (likely to prevent the shenanigans Reddit users were using to create DAN), for one thing.
I'm glad I got to experience it pre-emasculation. It was incredible what it could do. Hoping they haven't hamstrung it so much that it's not worth using. As soon as I have access again, I'll report on changes.
-
Ok, here's one screenshot from the hot tub hangout, with Bing Chat answering as different famous minds. I suspect George Saunders was drunk. DFW complaining about a long book was classic.

Last edited by SteelBlue; 02-17-2023, 10:07 AM.