Discussion in 'Computer Science & Culture' started by Seattle, Feb 24, 2023.

  1. Seattle Valued Senior Member

    ChatGPT is interesting but often wrong.

    If you ask it to list all the countries in the world that start with the letter "V", it will say there are none. If you reply, "What about Vanuatu?", it will apologize and say you're correct. But unless you save the conversation, the next time you ask it will still say there are none.

    So I decided to ask it to list all the countries in the world (there are 196). It offered to list them all alphabetically but stopped at 150. I then asked it how many countries started with "V" and it said none.

    I asked it why it stopped after country 150, and it said, sorry, I misread your question, and then listed them again, but this time it stopped after 141. That was odd, so I asked why it did that. It apologized and said it was limited in how long its response could be, but offered to list them all. This time it listed all 196. When I then asked it to list all the countries that start with "V", it listed 4 (the correct number).

    So I guess this means that even when I didn't ask it to print them all out, it was enumerating them in the background to arrive at my answer, and even then it was running into the character limit?

    It's interesting to try to figure out its "logic". Even when you get a wrong answer, you can usually re-ask the question in a more specific way and then get a correct answer.

    Of course, if you didn't already know much about the subject matter you wouldn't know what to ask.

    I asked it to list the planets that start with "V" and it said there were none, and then it listed the planets, including Venus. A more reliable approach with ChatGPT seems to be to ask it to list all the members of the set first, and then to ask it for the subset (everything starting with "V"); done that way, it usually gets it right.
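    The "list the whole set, then filter" workaround is essentially what a deterministic program does anyway. For comparison, here is a hypothetical C++ sketch of the filtering step; the country list is abbreviated for space (a real one would have all 196 entries), and the helper name `startingWith` is my own invention:

    ```cpp
    #include <iostream>
    #include <string>
    #include <vector>

    // Return the members of `names` whose first character is `letter`.
    std::vector<std::string> startingWith(const std::vector<std::string>& names,
                                          char letter) {
        std::vector<std::string> result;
        for (const auto& name : names)
            if (!name.empty() && name[0] == letter)
                result.push_back(name);
        return result;
    }

    int main() {
        // Abbreviated sample; the four "V" countries are
        // Vanuatu, Vatican City, Venezuela, and Vietnam.
        std::vector<std::string> countries = {
            "Uruguay", "Vanuatu", "Vatican City", "Venezuela", "Vietnam", "Yemen"};
        for (const auto& c : startingWith(countries, 'V'))
            std::cout << c << '\n';
        return 0;
    }
    ```

    A program like this can't "forget" Vanuatu between runs, which is exactly the gap the re-prompting trick is papering over.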
    C C likes this.
  3. wegs Matter and Pixie Dust Valued Senior Member

    I like how it apologizes when wrong. It could teach humanity a thing or two about humility.


    Seattle likes this.
  5. C C Consular Corps - "the backbone of diplomacy" Valued Senior Member

    ChatGPT has been programmed to avoid giving accurate information if it might cause offense.

    IOW, we'll never have uncompromised AI, due to a universal ethics standard of "avoiding physical and psychological cruelty in various forms and degrees" that is perpetually refined down to ever more delicate sensitivities.

    That extremely broad standard wins out because it is the survivor of the moral relativism that collectivism-oriented philosophy ultimately "evolves" toward, since it must stitch together the conflicting values of different cultures in its monomaniacal pursuit of a social justice utopia, or of orchestrating us toward a perfect humankind.

    It incrementally gets "refined down" to ever more ludicrous considerations because the portion of the intellectual class that deems itself the enlightened shepherd of us all unsurprisingly needs to maintain its empowerment and guiding influence (it's as much a job- and career-securing industry as some parallel altruistic movements within Christianity). On occasions when a social problem actually does get mitigated (for whatever real reasons), the resulting loss for the propaganda machine spurs it to conceptually and perceptually discern new sources, or fine-tunings, of personal harm and oppression.
    Last edited: Feb 24, 2023
  7. geordief Valued Senior Member

  8. James R Just this guy, you know? Staff Member

    Me: "Write me a poem about a possum and some car keys."


    A possum was looking for a place to sleep
    He found a car that was parked on the street
    He crawled inside and felt so cozy
    He didn't notice the shiny car keys

    He snuggled up on the driver's seat
    And dreamed of berries and nuts to eat
    But in the morning he heard a beep
    The owner had come to claim his jeep

    The possum woke up with a fright
    He grabbed the keys and held them tight
    He thought they were his lucky charm
    He didn't mean to cause any harm

    But the owner was angry and scared
    He yelled at the possum and glared
    He tried to snatch the keys away
    But the possum refused to obey

    They had a tug-of-war for a while
    The possum showed his teeth and smiled
    He thought it was a fun game to play
    But the owner didn't see it that way

    Finally, the owner gave up and sighed
    He decided to let the possum decide
    He said "You can keep the keys if you want"
    And left the car with a grunt

    The possum was happy and proud
    He felt like he had won a crowd
    He hung the keys around his neck
    And drove away in his new wreck.
  9. James R Just this guy, you know? Staff Member

    Me: Can you re-write the poem about the possum using iambic pentameter?


    A possum sought a place to rest his head
    He found a car that on the street was parked
    He climbed inside and felt a warmth so sweet
    He did not see the keys that brightly sparked

    He curled up on the seat where drivers sit
    And dreamed of tasty berries and of nuts
    But in the morn he heard a sound so loud
    The owner had arrived to claim his truck

    The possum woke and felt a sudden fear
    He clutched the keys and thought they brought him luck
    He did not know they were not his to keep
    He did not mean to make the owner stuck

    But angry was the owner and afraid
    He shouted at the possum and he frowned
    He tried to take the keys back from his grasp
    But stubborn was the possum and he growled

    They pulled and tugged for quite a longish time
    The possum showed his teeth and gave a grin
    He thought it was a game that they could play
    But not amused was owner by this sin

    At last, the owner gave up and he sighed
    He let the possum have his way at last
    He said "You can keep those keys if you desire"
    And left behind his car without contrast

    The possum felt so happy and so proud
    He thought that he had won a prize so grand
    He wore the keys around his furry neck
    And drove away in his new contraband.
  10. James R Just this guy, you know? Staff Member

    Me: Can you summarise that poem as a limerick?


    There once was a possum so sly
    Who found a car keys and thought "Oh my!"
    He took them as his own
    And left the owner alone
    And drove away in his new ride.
    wegs likes this.
  11. Write4U Valued Senior Member

    Try this on for size
    sculptor likes this.
  12. Saint Valued Senior Member

    I asked the chatbot many questions in Chinese, and it got very many of them wrong.
    I think it is good at answering questions in English, but not in Chinese.
  13. James R Just this guy, you know? Staff Member

    Is it trained on anything other than English?
  14. Saint Valued Senior Member

    It can answer in Malay, Indonesian, Filipino, Thai, and Japanese.
  15. Saint Valued Senior Member

    ChatGPT is good at giving you more wisdom.
  16. Seattle Valued Senior Member

    I've checked out ChatGPT and Bing Chat now. They are both based on GPT: ChatGPT's free tier uses version 3.5, and there is a paid version using GPT-4 available. I've tried that out a little through a friend.

    Bing Chat is based on GPT-4. Bing seems optimized to enhance the search experience, while ChatGPT seems better if you want to have a longer conversation. However, Bing is up to date in its "training" and can access the internet, whereas ChatGPT's training data stops in 2021 and it doesn't really access the internet, although it says it has some ability to read certain current sources.

    I noticed that after you go back and forth with Bing about three times, it will just stop and want to move on to another subject, and to be "polite" it will say that it is still learning. Actually, it does this because it is programmed not to get into extended arguments. It's designed more to be a better version of a search engine. I have read that it can translate into more languages than ChatGPT.

    I've found that if you just want some basic information, with fewer qualifiers and without it saying "I can't make predictions" or something similar, Bing Chat is much more satisfying. You can ask for a good prediction of stock market returns over the next 10 years, and it will search the internet and give you several groups' predictions.

    ChatGPT won't do that. GPT-3.5 will give you more incorrect, or less nuanced, information than GPT-4. The difference is noticeable.

    Bing Chat will give you information similar to GPT-4's in that regard.

    If you want less BS but a shorter answer, use Bing Chat; if you want a deeper discussion, use ChatGPT.

    I also understand that ChatGPT draws on more historical sources, whereas Bing Chat is just looking at one source. I'm not sure of the details of that claim, but I'll pass it along anyway.

    Bing Chat will sometimes give you sources; ChatGPT doesn't. You can ask Bing Chat for graphics and it will either give a link or pull up pictures, just as a search engine would under its "images" tab. ChatGPT doesn't do that (currently).
  17. Saint Valued Senior Member

    Did Google launch anything comparable to ChatGPT?
  18. Seattle Valued Senior Member

    There's Google Bard, but there's a waitlist to try it out; it hasn't had a general rollout yet. I don't know how comparable it is to ChatGPT.

    Edit... I just got access to Bard. It's pretty cool as well. I asked it to write a simple program in C++ that would ask for a number and tell me whether it was prime. It generated code. I found a free online C++ compiler and tried to run it. I got an error message.

    I asked Bard about that error message and it gave me one more bit of code to add (a math header), so I added <cmath> and this time it ran correctly. So, IMO, it's pretty cool as well.
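    For reference, here is a minimal sketch of the kind of program described, with the math header (<cmath>) included up front. Bard's actual output isn't quoted in the thread, so this is only an assumption about its shape, using plain trial division up to the square root:

    ```cpp
    #include <cmath>    // std::sqrt -- the header whose absence caused the compile error
    #include <iostream>

    // Returns true if n is prime, using trial division up to sqrt(n).
    bool isPrime(int n) {
        if (n < 2) return false;
        int limit = static_cast<int>(std::sqrt(static_cast<double>(n)));
        for (int i = 2; i <= limit; ++i)
            if (n % i == 0) return false;
        return true;
    }

    int main() {
        std::cout << "Enter a number: ";
        int n;
        std::cin >> n;
        std::cout << n << (isPrime(n) ? " is prime" : " is not prime") << '\n';
        return 0;
    }
    ```

    Pasting something like this into any free online C++ compiler should reproduce the round trip described above.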
    Last edited: Apr 26, 2023
  19. billvon Valued Senior Member

    That's intentional. Its short-term memory is artificially limited, for obvious reasons.
  20. Seattle Valued Senior Member

    I notice that when Bing Chat first comes up it mentions three modes: creative, balanced, and precise. At one point I asked it to list its three modes and the reply included something about "image", so I went to that mode and asked it to draw a tropical scene, which it did. I asked it to fill that in with watercolor, and it did. I asked it to make it sharper. It said that watercolor isn't a sharp medium (not necessarily true) and asked if I would like it done in oil paint. I said OK, and it did that.

    I then asked it to describe the scene it had just painted, and it did so. For some reason I find that even more interesting than the painting itself. By the way, these paintings are nothing to write home about.


    Bard is interesting: I asked it to write a program that would ask for a number and then tell me whether it was prime. It did so, and I tried to run the code with a free online C++ compiler. I got an error, took it back to Bard, and it gave me the fix; I added that and the program worked.

    It was a little odd that it knew about the fix but didn't include it in the first place, but it worked. It's interesting to have Bard and a free online compiler open side by side, going back and forth between the two to try out code that Bard has just written.
