Report - ChatGPT/OpenAI is training its AI technology to replace some software engineers

Discussion in 'Off-Topic Discussions' started by Lerner, Jan 28, 2023.

  1. Lerner

    Lerner Well-Known Member

    "OpenAI has quietly hired hundreds of international contractors to train its artificial intelligence in software engineering, according to a report from Semafor.

    Some contractors, hired in the last 6 months from places like Latin America and Eastern Europe, have reportedly been tasked with creating data to coach AI to learn simple software engineering tasks.

    While OpenAI already has a product called Codex, which can convert natural language into working code, the company's hiring spree indicates that it's looking to advance that technology, potentially creating a working replacement for some human coders.

    Semafor spoke to one engineer in South America who interviewed for one of OpenAI's contractor roles. As part of the process, he was tasked with finding bugs in AI code and providing explanations for how to fix its mistakes. The engineer told Semafor he thinks the company wants to feed the training data into its AI technology. "
  2. datby98

    datby98 Active Member

    I heard that ChatGPT is so powerful it can even generate an impeccable college essay.
    Just out of curiosity, will ChatGPT beat Google Translate at translation?
  3. Johann

    Johann Well-Known Member

    I don't imagine that's too hard. The bar is set pretty low. :( In particular, Google's Latin is mostly incorrect and sometimes unintelligible. Google had a Q&A once and someone asked, "Why is Google's Latin so bad?"

    The bafflegab answer was that Latin's grammar system is SO complicated. BS. It's no more complex than Russian or Polish. Latin has a couple more noun cases than German (but still one fewer than Russian, IIRC), but the two verb systems are (I'd say) of similar complexity.

    Google Translate does a WAY better job with Chinese than it does with Latin. I don't know what the answer is, but it's definitely NOT what the Google guy said it was.
    Last edited: Jan 28, 2023
  4. Rachel83az

    Rachel83az Well-Known Member

    Indeed. Google Translate is pretty awful at times. ChatGPT is already better, IMO. And it has more languages available, to varying degrees of fluency.

    ChatGPT is still pretty terrible at getting facts right, though. When it's not translating, it gets things very wrong. If it doesn't know something, it makes things up: stuff that sounds good on the surface but really isn't. This guy used an AI to write an essay about Doctor Who. I can't remember if he used NovelAI (which I believe uses the ChatGPT API) or ChatGPT directly, but they're both roughly the same. I think NovelAI is supposed to hold a coherent plot better than ChatGPT, but I don't think it does.

  5. Rich Douglas

    Rich Douglas Well-Known Member

    Not from what I've seen. But the public version is very limited. It's hard to say what else might be behind the curtain.
  6. Lerner

    Lerner Well-Known Member

  7. LevelUP

    LevelUP Active Member

    ChatGPT is pretty powerful, though it isn't enough to replace a programmer on a 1-to-1 basis. It can assist with programming to speed up tasks, so if you had, say, 100 programmers, maybe you could reduce staff somewhat if they became more productive using ChatGPT.

    I saw a video where a guy with zero experience in Flutter was able to use ChatGPT to help him write a simple to-do list app in one hour.

    ChatGPT is great for assisting in writing new functions and short programs, but it's not going to be able to understand large applications.
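    For illustration (my own sketch, not something from the video): the kind of short, self-contained code an AI assistant tends to get right is roughly this scale. A minimal in-memory to-do list, written here in Python rather than Flutter/Dart:

    ```python
    # Hypothetical example of a small, self-contained task an AI coding
    # assistant can generate reliably: a minimal in-memory to-do list.
    from dataclasses import dataclass, field

    @dataclass
    class TodoList:
        items: list = field(default_factory=list)

        def add(self, task: str) -> None:
            """Append a new, not-yet-done task."""
            self.items.append({"task": task, "done": False})

        def complete(self, task: str) -> bool:
            """Mark the first matching open task done; return True if found."""
            for item in self.items:
                if item["task"] == task and not item["done"]:
                    item["done"] = True
                    return True
            return False

        def pending(self) -> list:
            """Return the tasks still outstanding, in insertion order."""
            return [i["task"] for i in self.items if not i["done"]]
    ```

    A whole class like this fits in one prompt and one response, which is exactly why it works; a 100k-line codebase doesn't.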
    Rachel83az likes this.
  8. Lerner

    Lerner Well-Known Member

    "ChatGPT can write computer code to program applications and software. It can check human coders' language for errors and convert ideas from plain English into programming language.

    "In terms of jobs, I think it's primarily an enhancer than full replacement of jobs," Columbia Business School professor Oded Netzer told CBS MoneyWatch. "Coding and programming is a good example of that. It actually can write code quite well."

    That could mean performing basic programming work currently done by humans.

    "If you are writing a code where really all you do is convert an idea to a code, the machine can do that. To the extent we would need fewer programmers, it could take away jobs. But it would also help those who program to find mistakes in codes and write code more efficiently," Netzer said."
  9. Rich Douglas

    Rich Douglas Well-Known Member

    Hasn't this been going on since the invention of computer languages? No one codes in binary, right? They code in higher languages and compile. This seems like an extension of that.
    Rachel83az likes this.
  10. rhodamine

    rhodamine New Member

    To quote an old Director of IT of mine - "The two of you [developers] should start coding while I go get the requirements." I heard that early in my career and still find it true decades afterward. It's easy to pit IT and Business against each other when a project doesn't end well. All too often, the business cannot provide the requirements at a sufficiently detailed level. An AI code developer will either infuriate the business because of continued vague requirements, or the business folks will finally figure out how to create a comprehensive set of requirements detailed enough for the AI to be successful. Human IT workers can fill in the missing requirements far better than ChatGPT. I'm not in fear of losing my job anytime soon. If anything, the AI will help me produce code faster by allowing me to focus more on the business and less on debugging and googling for syntax.
    Rachel83az and Rich Douglas like this.
  11. Johann

    Johann Well-Known Member

    Yes, to all. More than 30 years ago, before I retired, our company's IT "shop" had a then-new "code generator." It never replaced anyone, though some feared it might. It wasn't designed to. All the devices that write code, then and now, seem to do one thing: spare developers the monotony. They shoulder some (maybe most, nowadays) of the dull, repetitive grunt work, so that well-qualified humans can use their superior minds and skills to create really good software that can't be produced by any other method. That's why there's "real" intelligence - and "artificial."
    Rachel83az and Rich Douglas like this.
  12. Lerner

    Lerner Well-Known Member

    The popularity of ChatGPT, the online chatbot built by OpenAI, has led many to question the survival of search engines such as Google. Paul Buchheit, the creator of Gmail, has also weighed in on the matter: he tweeted that he thinks Google's business will last a maximum of two more years.
  13. SteveFoerster

    SteveFoerster Resident Gadfly Staff Member

    I'll go ahead and take that bet: Google search will still be in use on Groundhog Day 2025. (Or Imbolc, or Candlemas, or Ayn Rand's Birthday, or whichever observance one prefers, since it turns out February 2nd punches above its weight!)
  14. JBjunior

    JBjunior Active Member

    I concur. I think the interface is familiar and is what many people value using. How long did Yahoo last as a search engine once something exponentially better came along? More importantly, I think Google can pair competing technology with a familiar interface to deliver a product that meets most people's use cases. Another wild card will be people's distrust of AI and the potential desire for a perceived less intrusive (less advanced) option that is more likely to give them an ad related to what they search for instead of using their deepest desires to enslave them. We live in exciting (scary? wonderful? It doesn't matter, it is what we get) times!
