• Lev@europe.pub · 85 points · 1 day ago

    Daily reminder that Codeberg is always a good alternative to corporate bastards like this idiot

  • rimjob_rainer@discuss.tchncs.de · 96 points (1 down) · 1 day ago (edited)

    I don’t get it. AI is a tool. My CEO didn’t care what tools I used, as long as I got the job done. Why do they suddenly think they have to force us to use a certain tool to get the job done? They are clueless, yet they think they know what we need.

    • buddascrayon@lemmy.world · 24 points · 23 hours ago

      Because unlike with the other tools you use, the CEO of your company is investing millions of dollars into AI, and they want a big return on that investment.

    • bless@lemmy.ml · 58 points · 1 day ago

      GitHub is owned by Microsoft, and Microsoft is forcing AI on all the employees

      • ksh@aussie.zone · 3 points · 16 hours ago

        They all need to be sued for unethical “Embrace, Extend and Extinguish” practices again

    • sobchak@programming.dev · 16 points · 1 day ago

      I think part of it is because they think they can train models off developers, then replace them with models. The other is that the company is heavily invested in coding LLMs and the tooling for them, so they are trying to hype them up.

    • Jhex@lemmy.world · 18 points · 1 day ago

      Why do they suddenly think they have to force us to use a certain tool to get the job done?

      Not just that… why do they have to threaten and push people into using a tool that is allegedly fantastic and makes everything better and faster? The answer is that it does not work, but they need to pump the numbers to keep the bubble going.

    • MajorasMaskForever@lemmy.world · 14 points · 1 day ago

      It’s not about individual contributors using the right tools to get the job done. It’s about needing fewer individual contributors in the first place.

      If AI actually accomplishes what it’s being sold as, a company can maintain or even increase its productivity with a fraction of its current spending on labor. Labor is one of the largest chunks of spending a company has, if not the largest, so reducing it greatly reduces spending, which means that for the same or higher company income, net profit goes up, and as always, the line must go up.

      tl;dr Modern Capitalism is why they care

    • 0x0@lemmy.zip · 14 points · 1 day ago

      They are clueless, yet they think they know what we need.

      Accurate description of most managers I’ve encountered.

    • CeeBee_Eh@lemmy.world · 2 points (1 down) · 1 day ago

      They are clueless, yet they think they know what we need.

      AI make money line go up. It’s not cluelessness; he’s trying to sell a kind of snake oil (ok, not quite “snake oil”, I don’t think AI is entirely bad).

  • Jocker@sh.itjust.works · 18 points · 1 day ago (edited)

    Contrary to the title, this message is not aimed at developers; developers don’t care what the GitHub CEO thinks, and he should know it. It’s more likely aimed at the management of other companies, to get them to allow or even mandate AI usage.

  • redlemace@lemmy.world · 25 points · 1 day ago (edited)

    such an easy choice …

    (edit: I followed through and got out. This too is now self-hosted, with Codeberg when needed)

  • Fedditor385@lemmy.world · 35 points (1 down) · 1 day ago (edited)

    AI can only deliver answers based on training code that developers manually wrote, so how do they expect to train AI in the future if there are no more developers writing code by themselves? You train AI on AI-generated code? Sounds like expected enshittification down the line. Inbreeding, basically.

    Also, a small fact: they have invested so much money into AI that they can’t allow it to fail. Such comments never come from people who don’t depend on AI adoption.

    • Showroom7561@lemmy.ca · 7 points · 1 day ago

      It’s like all those companies that fast-tracked their way to profits by ignoring the catastrophic effects they were having on the environment… down the road.

      Later is someone else’s problem. Now is when AI-pushers want to make money.

      I hate where things have been heading.

    • WhyJiffie@sh.itjust.works · 6 points · 1 day ago

      same as how it goes on the stock market? they don’t care about the long term, only the short term. what happens in the long term is somebody else’s problem; you just have to squeeze out everything and know when to exit.

      they are gambling with our lives, but not with theirs. that’s (one of) the problems: they don’t fear for their own lives.

  • ipkpjersi@lemmy.ml · 13 points (1 down) · 17 hours ago (edited)

    Threatening remarks like that are why I learned PHPUnit and XDebug, and yeah, it made me a better developer, but oftentimes these are just empty statements.

    AI is just another tool in my toolbox, but it’s not everything.

  • ZILtoid1991@lemmy.world · 39 points · 1 day ago

    Expectation: High quality code done quickly by AI.

    Reality: Low-quality AI-generated bug reports being spammed in the hope that the spammers can collect bug bounties for fixing them, with AI of course.

  • medem@lemmy.wtf · 37 points · 1 day ago

    “Managing agents to achieve outcomes may sound unfulfilling to many”

    No shit, man.

  • zarkanian@sh.itjust.works · 233 points · 2 days ago

    This part really stuck out for me:

    This is the latest example of a strange marketing strategy by AI companies. Instead of selling products based on helpful features and letting users decide, executives often deploy scare tactics that essentially warn people they will become obsolete if they don’t get on the AI bandwagon.

    If hype doesn’t work, try threats!

    • A_Random_Idiot@lemmy.world · 58 points · 2 days ago

      Which is how you know they have a good product that they have full faith in.

      when they have to blackmail, threaten, coerce, and force people to accept their product.

    • vacuumflower@lemmy.sdf.org · 7 points · 1 day ago

      Threats work well for scams. People who couldn’t be bothered to move by promises of something new and better can be motivated by fear of losing what they already have.

      It’s really unfortunate that psychology is looked down upon and psychologists are viewed as a “soft” profession. Zuck is a psychology major. It’s been 2 decades, and most of the radical changes in that time were radical in nothing other than their approach to human psychology.

      BTW, I’ve learned recently that in their first few years the Khmer Rouge were not known as a communist organization even to many of their own members. Just an “organization”. Their rhetoric was agrarian (of course peasants are hard-working, virtuous people, all wisdom comes from working the earth, and those corrupt and immoral people in the cities should be made to work for their food), Buddhist (of course the monk-feudal system of obedience, work and asceticism is the virtuous way to live, though of course we are having a rebirth now, so we are even wiser), monarchist (they invoked Sihanouk’s authority almost to the end), and anti-Vietnamese (the Vietnamese were the evil, much like the Jews for the German Nazis). And even after taking power, for some time they still didn’t communicate anything communist. They didn’t even introduce their leadership; nobody knew who made the decisions in that “organization” or how it was structured. It didn’t have a face. They only officially made themselves visible as Democratic Kampuchea, with communism and actual leaders, when the Chinese pressured them. They didn’t need to, because they were obeyed through the threat (and plenty of the reality) of violence anyway.

      This is important in the sense that when you have the power, you don’t need to officially tell the people over whom you have it that you rule them.

      So, in these 2 decades it has also come into fashion to deliberately and stubbornly ignore the fact that psychology works on the masses. And everybody acts as if, when there are no technical means to make people do something, it’s not likely or even possible.

    • Echolynx@lemmy.zip · 12 points · 1 day ago

      For some odd reason, this calls to mind an emotionally immature parent trying to get their toddler to eat vegetables… no reason at all…

      • uzay@infosec.pub · 6 points · 1 day ago

        Just that the vegetables in this case are actually fast food and gummy bears.

  • aliser@lemmy.world · 28 points · 1 day ago

    does “embracing AI” mean replacing all these execs with it? or is that “too far”?

    • Soup@lemmy.world · 9 points · 1 day ago

      No, they’re all super special and have an “instinct” that a robot could never have. Of course the same does not go for artists or anyone who does the actual work for these “titans of industry”.

      *by “instinct” we, of course, mean survivorship bias based on what is essentially gambling, exploitation, and being too big to fail.

  • Blackmist@feddit.uk · 20 points (1 down) · 1 day ago

    I asked an AI to generate me some code yesterday. A simple interface to a REST API with about 6 endpoints.

    And the code it made almost worked. A few fixes here and there to methods it pulled out of its arse, but they were close enough to real ones to be an easy fix.

    But the REST API it made code for wasn’t the one I gave it. Bore no resemblance to it in fact.

    People need to realise that MS isn’t forcing its devs to write all code with AI because they want better code. It’s because they desperately need training data so they can sell their slop generators to gullible CEOs.

  • alvyn@discuss.tchncs.de · 30 points · 1 day ago

    Is his message: “let us scrape your code or go away, and we’re gonna scrape it anyway”? (note: scrape = steal)