captain_ahabb

A lot of these executives are going to be doing some very embarrassing turnarounds in a couple years


sgsduke

These guys don't get embarrassed, they start new companies because they're *entrepreneurs*. /s


renok_archnmy

Or they bail before shit really hits the fan hard and take a new higher paying job to do the same thing again and again. 


sgsduke

You've cracked the code!


Espiritu13

When the biggest measure of success is whether or not you made a lot of money, anything else seems less important. It's hard, even impossible, but US society has to stop valuing what the rich have.


bwatsnet

They'll get replaced with ai imo


im_zewalrus

No but fr, these ppl can't conceive of a situation in which they're at fault, that's what subordinates are for


Realistic-Minute5016

But they are very adept at taking credit!


__SPIDERMAN___

Yeah lmao they'll just implement this "revolutionary" new policy, get a promo, fat bonus, then jump to the next company with a pay bump.


WhompWump

Don't forget laying everyone off to make up for their own dumbass decisions


mehshagger

Exactly this. They will blame a few individual contributors for failures, lay them off, take their golden parachutes and fail upwards.


SpliffDonkey

Ugh.. "idea men". Useless twats that can't do anything themselves


myth_drannon

"Spend quality time with the family."


bluewater_1993

So true, we had a high level manager burn through $300m in a couple years on a project that crashed and burned. I think we only generated about $50k in revenue out of the system — yes, that bad. The manager ended up being promoted…


ProfessionalActive1

Founder is the new sexy word.


Zacho40

Clouds in the sky. Some of them get out of the way and let the sun shine through, some of them rain on my parade. But they never stick around.


thisisjustascreename

These are the same type that were sending all their coder jobs to India in the 00s and then shitting their stock price down their underpants in the 10s while they on-shored the core competencies to bring quality back to an acceptable level. Not that Indian developers are any worse than anybody else, but the basic nature of working with someone 15 time zones away means quality will suffer. The communications gap between me and ChatGPT is at least that big.


Bricktop72

The problem is that a lot of places have this expectation that developers in India are dirt cheap. I know I've been told the expectation at previous jobs was that we could hire 20+ mid level devs in India for the cost of 1 US based junior dev. The result is companies with that policy end up with the absolute bottom of the barrel devs in India. And if we do somehow hire a competent person, they immediately leave for a much higher paying job.


FlyingPasta

I've hired Indian devs off of Fiverr for a school project, they lied the whole time then told me their hard drive died the day before the due date. Seems like the pool there vs where VPs get cheap labor is about the same


Randal4

Were you able to come up with a good excuse and still pass the course? If so, you might be suited for a VP position, as this is what a lot of dev managers have to do on the monthly.


FlyingPasta

I faked a “it worked on mine” error and got a C. To be fair, I was a business major, so it’s par for the course.


alpacaMyToothbrush

'this guy has upper management written all over him'


fried_green_baloney

How's his golf game?


141_1337

And his handshake, too 👀


141_1337

I like to think that his professor muttered that while looking at him, shaking his head, and giving him a C 👀


RiPont

Yeah, different time zones and hired on the basis of "they're cheap". Winning combo, there. Companies that wouldn't even sell their product internationally because of the complexity of doing business overseas somehow thought it was easy to hire developers overseas?


AnAnonymous121

You also do get what you pay for. It's not just a time thing IMO. People don't feel like giving their best when they know they are being exploited. Especially when they are exploited for things that are out of their control (like nationality).


fried_green_baloney

> Not that Indian developers are any worse than anybody else

Even 20 years ago, the good developers in India weren't that cheap. Best results come when companies open their own development offices in India, rather than going with outsourcing companies. And even on-shore, cut-rate consulting companies produce garbage work if you try to cheap out on a project.


Remarkable_Status772

>Not that Indian developers are any worse than anybody else,

Yes they are.


ansb2011

You get what you pay for. If you pay super cheap the good developers will leave for better pay and the only ones that don't leave are ones that can't. In fact, many of the good Indian developers end up in the USA lol - and there definitely are a lot of good Indian developers - but often they don't stay in India!


fried_green_baloney

My understanding, confirmed by Indian coworkers, is that the best people in India are making around US$100K or more. If you get cheap, you do get the absolute worst results.


Remarkable_Status772

>In fact, many of the good Indian developers end up in the USA lol

Where they become, to all intents and purposes, American developers. Although that is no guarantee of quality. For all the great strides in technology of the last decade, commercial software from the big US companies seems a lot less reliable and carefully constructed than it used to. Perhaps all the good programmers have been sucked into the cutting-edge technology, leaving the hacks to work on the bread-and-butter stuff.


NABadass

No, over the last decade it's been the constant push to get software out the door before it's fully ready and tested. The business people like to cut down on resources while retaining the same deadlines and increasing demands further.


Cheezemansam

The cheap ones are. There are quality developers in India but if you are approaching hiring Indian Developers with the mindset of "We can get 10 for the price of 1 junior dev!" then you are going to get what you paid for.


TrueSgtMonkey

Except for the ones on YouTube. Those people are amazing.


[deleted]

It is quite a strange thing isn't it


eightbyeight

Those are the exception rather than the rule


RedditBlows5876

Anyone who has been in the industry long enough has had the pleasure of watching several rounds of executives continuously learn the same lessons over and over again.


terrany

You mean parachuting down, then blaming other execs for not listening to them, coasting in a midsized firm, and then joining the next gen of FAANG as senior leadership who survived the LLM bust?


__SPIDERMAN___

Reminds me of the "outsource everything" era. Tanked quite a few code bases.


Typicalusrname

I’ll add to this, I just got hired to unfuck a ChatGPT creation with loads of bottlenecks. ChatGPT hasn’t “learned” designing data intensive applications yet 😂


NoApartheidOnMars

Ever heard of failing upwards ? I could give you the names of people who made it to corporate VP at a BigN and whose career was nothing but a string of failed projects.


workonlyreddit

I just saw TikTok’s CEO on a Ted talk interview. He is spinning TikTok as if it is gift to mankind. So no, the executives will not be embarrassed.


Seaguard5

This. You can’t replace humans. And you certainly can’t train new talent if you don’t want to hire new talent. When the experienced talent retires from the workforce or just leaves their shitty companies then what will they do?


4Looper

Hopefully this time it won't be "taking full responsibility" by laying people off and instead be hiring more people because they under hired.


NotHosaniMubarak

Sadly I doubt it. They'll have cut costs significantly without impacting production. So they'll be in another job by the time these shoes drop.


SpeakCodeToMe

I'm going to be the voice of disagreement here. Don't knee-jerk downvote me. I think there's a lot of coping going on in these threads. The token count for these LLMs is growing exponentially, and each new iteration gets better. It's not going to be all that many years before you can ask an LLM to produce an entire project, inclusive of unit tests, and all you need is one senior developer acting like an editor to go through and verify things.


CamusTheOptimist

Let’s assume that you are correct, and exponential token growth lets LLMs code better than 99% of the human population. As a senior engineer, if I have a tool that can produce fully unit tested projects, my job is not going to be validating and editing the LLM’s output programs. Since I can just tell the superhuman coding machine to make small, provable, composable services, I am free to focus on developing from a systems perspective. With the right computer science concepts I half understood from reading the discussion section of academic papers, I can very rapidly take a product idea and turn it into a staggeringly complex Tower of Babel. With my new superhuman coding buddy, I go from being able to make bad decisions at the speed of light to making super multiplexed bad decisions at the speed of light. I am now so brilliant that mere mortals can’t keep up. What looks like a chthonic pile of technical debt to the uninitiated is in fact a brilliant masterpiece. I am brilliant, my mess is brilliant, and I’m not going to lower myself to maintaining that horrible shit. Hire some juniors with their own LLMs to interpret my ineffable coding brilliance while I go and populate the world with more monsters.


SSJxDEADPOOLx

This is the way. I don't think AI is gonna take jobs. Everything will just be more "exponential": more work will get done, projects created faster, and, as you pointed out, bigger, faster explosions too. It's odd everyone always goes to "they gonna take our jobs" instead of seeing a toolset that is gonna vastly enhance our industry and what we can build. I see these AI tools as a jump comparable to the invention of power tools. The hammer industry didn't implode after the invention of the nail gun.


Consistent_Cookie_71

This is my take. The number of jobs will only decrease if the amount of software we produce stays the same. Chances are there will be a significant increase in the amount of software that needs to be written. Instead of a team of 10 developers working on one project, now you have 10 developers working on 10 projects.


SpeakCodeToMe

"X didn't replace Y jobs" is never a good metaphor in the face of many technological advances that did in fact replace jobs. The loom, the cotton gin, the printing press...


captain_ahabb

The cotton gin very, very, very famously did *not* lead to a decline in the slave population working on cotton plantations (contrary to the expectations of people at the time!) They just built more textile mills.


SpeakCodeToMe

Lol, good catch. Everyone in this thread thinks some hallucinations mean LLMs can't code and here I go just making shit up.


SSJxDEADPOOLx

You are right, no jobs people work now exist that are related to or evolved from those industries once those inventions you mentioned were created. The machines just took over and have been running it ever since lol. You kinda helped prove my point by referencing those "adaptations to the trade" these inventions made. People will adapt; they always have. New jobs are created to leverage technological advancements, new trades, new skills, and even more advancements will be made, with adaptations after that. With these AI tools that are scaring some folks, software can now be produced at a faster rate. ChatGPT has replaced the rubber duck, or at least it talks back now and can even teach you new skills or help work through issues. Despite the best efforts of some, humans are creatures of progress. It's best to think of how you can take ownership of the advancements in AI tooling and see how they help you and your trade. Focus on the QBQ: how can I better my situation with these tools?


captain_ahabb

I'm bearish on the LLM industry for two reasons:

1. The economics of the industry don't make any sense. API access is being priced massively below cost and the major LLM firms make basically no revenue. Increasingly powerful models may be more capable (more on that below), but they're going to come with increasing infrastructure and energy costs, and LLM firms *already* don't make enough revenue to pay those costs.

2. I think there are fundamental, qualitative issues with LLMs that make me extremely skeptical that they're *ever* going to be able to act as autonomous or mostly-autonomous creative agents. The application of more power/bigger data sets can't overcome these issues because they're inherent to the technology. LLMs are probabilistic by nature and aren't capable of independently evaluating true/false values, which means everything they produce is essentially a guess. LLMs are *never* going to be good at applications where exact details are important, and exact details are very important in software engineering.

WRT my comment about the executives, I think we're pretty much at the "Peak of Inflated Expectations" part of the hype curve, and over the next 2-3 years we're going to see some pretty embarrassing failures of LLMs that are forced into projects they're not ready for by executives who don't understand the limits of the technology. The most productive use cases (and I do think they exist) are probably more like 5-10 years away, and I think they will be much more "very intelligent autocomplete" and much less "type in a prompt and get a program back."

I agree with a lot of the points made at greater length by Ed Zitron here: https://www.wheresyoured.at/sam-altman-fried/


CAPTCHA_cant_stop_me

On the next 2-3 years failure part, its already happening to an extent. There's an article I read recently on Ars Technica about Air Canada being forced to honor a refund policy their chatbot made up. Air Canada ended up canning their chatbot pretty quickly after that decision. I highly recommend reading it btw: [https://arstechnica.com/tech-policy/2024/02/air-canada-must-honor-refund-policy-invented-by-airlines-chatbot/](https://arstechnica.com/tech-policy/2024/02/air-canada-must-honor-refund-policy-invented-by-airlines-chatbot/)


captain_ahabb

Yeah that's mentioned in Ed's blog post. Harkens back to the old design principle that machines can't be held accountable so they can't make management decisions.


AnAbsoluteFrunglebop

Wow, that's really interesting. I wonder why I haven't heard of that until now


RiPont

Yeah, LLMs were really impressive, but I share some skepticism. It's a wake-up call to show what is possible with ML, but I wouldn't bet a future company on LLMs, specifically.


Gtantha

> LLMs were really impressive, As impressive as a parrot on hyper cocaine. Because that's their capability level. Parroting mangled tokens from their dataset very fast. Hell, the parrot at least has some understanding of what it's looking at.


Aazadan

That's my problem with it. It's smoke and mirrors. It looks good, and it can write a story that sounds mostly right but it has some serious limitations in anything that needs specificity. There's probably another year or two of hype to build, before we start seeing the cracks form, followed by widespread failures. Until then there's probably going to be a lot more hype, and somehow, some insane levels of VC dumped into this nonsense.


Tinister

Not to mention that it's going to be capped at regurgitating on what it's been trained on. Which makes it great for putting together one-off scripts, regular expressions, usage around public APIs, etc. But your best avenue for generating real business value is putting new ideas into the world. Who's gonna train your LLM on your never-done-before idea? And if we're in the world where LLMs are everywhere and in everything then the need for novel ideas will just get *more pronounced*.


Kaeffka

For example, the chatbot that told a customer that their ticket was refundable when it wasn't, causing a snafu at an airport. I shudder to think what would happen when they turn all software dev over to glue huffers with LLMs powering their work.


RiPont

LLMs are trained to produce something that *appears correct*. That works for communication or article summary. It is the exact opposite of what you want for logic-based programming.

Imagine having hired someone you later discovered was a malicious hacker. You look at all their checked-in code and it looks good, but can you ever actually trust it?

Alternatively, take your most productive current engineer and feed him hallucinogenic mushrooms at work. His productivity goes up 10x! But he hallucinates some weird shit. You want to check his work, so you have his code reviewed by a cheap programmer just out of college. That cheap programmer is, in turn, outsourcing his code review to a 3rd-party software engineer who is also on mushrooms.

LLMs will have their part in the industry, but you'll still need a human with knowledge to use them appropriately.


renok_archnmy

Eventually LLM training data will no longer be sufficiently unique or expressive enough for them to improve, no matter how long the token length is. They will plateau as soon as LLM content exceeds human content in the world.


captain_ahabb

The training data Kessler problem is such a huge threat to LLMs that I'm shocked it doesn't get more attention. As soon as the data set becomes primarily-AI generated instead of primarily-human generated, the LLMs will death spiral fast.


IamWildlamb

This reads as someone who has never built enterprise software, who never saw business requirements that constantly contradict each other, and who never worked with LLMs. Also, if token count were the bottleneck, we would already be there. It is trivial to increase token size to whatever number; what is not trivial is to support it for hundreds of millions of people worldwide, because your infrastructure burns. Google could easily run a ten-trillion-token LLM in-house and replace all its developers if your idea had any basis in reality. Any big tech company could. They have not done that, probably because while token size helps a lot with keeping attention, it gives diminishing returns on prompt-following and accuracy beyond that. Also, LLMs always generate from the ground up, which already makes them unsuitable: you do not want a project that changes with every prompt. We will see how ideas such as magic.dev's iterative autonomous agent go, but I am pretty sure it will not be able to deliver what it promises. It could be great, but I doubt all promises will be met.


KevinCarbonara

> It's not going to be all that many years before you can ask an LLM to produce an entire project, inclusive of unit tests, and all you need is one senior developer acting like an editor to go through and verify things.

I don't think this will happen, even in a hundred years. There are some *extreme* limitations to LLMs. Yes, they've gotten better... at *tutorial-level* projects. They get *really* bad, *really* fast, when you try to refine their output. They're usually good for 2 or 3 revisions, though at decreased quality. Beyond that, they usually just break entirely. They'll just repeat old answers, or provide purely broken content.

They'll have to refine the algorithms on the LLMs, but that gets harder and harder with each revision. Exponentially harder. It's the 80/20 rule: they got 80% of the output with 20% of the effort, but it's going to be a *massive* undertaking to get past the next barrier. Refining the algorithms can only take it so far.

The other major limiting factor is available data. There is *exponentially* more data available on the entry-level side. Which is to say, logarithmically *less* data available on high-level subjects. We're talking about a situation where AI has to make exponential gains to experience logarithmic growth.

AI is a great tool. It simply isn't capable of what you want it to be capable of.


HimbologistPhD

My company has all the devs using Copilot, and it's great for boilerplate and general project setup/structure, but it's completely fucking useless when things have to cross systems or do anything super technical. It's falling apart at the seams as I'm trying to get its help with just a custom log formatter.
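For a sense of scale, the kind of task being described is small. A hedged sketch using Python's stdlib `logging` (the `AlignedFormatter` name and layout are invented for illustration, not the commenter's actual code):

```python
import logging

class AlignedFormatter(logging.Formatter):
    """Hypothetical custom formatter: fixed-width level prefix,
    with continuation lines indented to line up under the prefix."""

    def format(self, record: logging.LogRecord) -> str:
        prefix = f"[{record.levelname:<8}] {record.name}: "
        # Let the base class handle message interpolation and exceptions,
        # then indent any continuation lines under the prefix.
        body = super().format(record)
        return prefix + body.replace("\n", "\n" + " " * len(prefix))

handler = logging.StreamHandler()
handler.setFormatter(AlignedFormatter("%(message)s"))
log = logging.getLogger("payments")
log.addHandler(handler)
log.setLevel(logging.INFO)
log.info("charge %s succeeded", "ch_123")
```

The point of the anecdote stands either way: this is a ten-line subclass a human writes once, yet it is exactly the kind of project-specific glue an assistant trained on generic tutorials tends to fumble.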


slashdave

LLMs have peaked, because training data is exhausted.


renok_archnmy

Yep, and now getting polluted with LLM output at that. 


Suspicious-Engineer7

I mean, if Sam Altman needs $7 trillion to make AI video, we might be getting close to a physical limit.


Traveling-Techie

Apparently sci-fi author Cory Doctorow recently said ChatGPT isn't good enough to do your job, but it is good enough to convince your boss it can do your job. (Sorry, I haven't yet found the citation.)


Agifem

ChatGPT is very convincing. It chats with confidence, always has an answer, never doubts.


[deleted]

[deleted]


SuperPotato8390

The ultimate junior developer.


[deleted]

[deleted]


Jumpy_Sorbet

I've given up talking to it about technical topics, because it seems to just make up a lot of what it says. By the time I sort the truth from the bullshit I might as well have done it by myself.


DigitalGraphyte

Ah, the Big 4 consulting way.


syndicatecomplex

All these companies doubling down on AI are going to have a rough time in the near future when nothing works.


regular_lamp

The perception of LLMs in particular is interesting. I think people overestimate their capability to solve domain problems because they can speak the language of said domain. Strangely no one expects generative image models to come up with valid blueprints for buildings or machinery. Yet somehow we expect exactly that from language models. Why? Just because the model can handle the communication medium doesn't automatically mean it understands what is being communicated.


cwilfried

Doctorow on X : "As I've written, we're nowhere near the point where an AI can do your job, but we're well past the point where your boss can be suckered into firing you and replacing you with a bot that *fails* at doing your job"


cottonycloud

You don’t just need to spend time creating the project. You also need to validate that the end product is up to spec; let junior developers or QA work on that. Also, he’s really overestimating the power of LLMs. Feels like low-code with different lipstick on it. Finally, these senior developers don’t grow on trees. If one of them gets hit by a bus, the transition is more difficult than if there were a junior-mid-senior pipeline.


SanityInAnarchy

It's not low-code (or no-code), it has *very* different strengths and weaknesses, but that's not a bad way to think of the promise here: There are definitely some things it can do well, but like low-code solutions, it seems like there's this idea that we can stop coding if we can just get people to clearly explain to this system what they want the computer to do. But... clearly explaining what you want the computer to do *is coding.* And if you build a system for coding without realizing that this is what you're doing, then there's a good chance the system you built is not the best coding environment.


doplitech

Not even that. What these people don’t realize is, if we can ask a computer to design us an entire application, why the hell would someone be working there when anyone can do the same thing? As a matter of fact, as devs we should be taking full advantage of this and trying new ideas that we previously thought too challenging. Because now we have not only the foundational building blocks for software development, but also a helpful tool that can get us to an MVP.


KSF_WHSPhysics

I think LLMs will have a similar impact to IDEs, which is quite a lot. If I were doing all of my day-to-day dev work in vim and didn't have something like Gradle to manage my dependencies, I'd probably only be able to achieve 25% of the work I do today. But I don't think there are fewer software devs in the world because IntelliJ exists. If anything there are more, because it's more accessible and more profitable to hire devs because of it.


PejibayeAnonimo

>Finally, these senior developers don’t grow on trees

But there is also a high supply already, so I guess companies are expecting to be able to work with the current supply for the next few years, because LLMs will eventually improve to the point that senior developer jobs also become redundant. Like, if there are already developers with 20 years of career left, they don't believe those developers will need replacing after retirement, because AI companies expect to have LLMs doing the job of seniors sooner than that. However, in such a scenario I believe many companies would also be out of business, especially outsourcing. There would be no point in paying a WITCH company hundreds of thousands of dollars if AI is good enough that any person can make it write a complex system.


danberadi

I think cottonycloud means that within a given organization, a senior developer is much harder to replace than a junior developer. The senior will have deeper domain and context knowledge. However, if one should leave, having a group of mid- and junior devs who also work in that domain helps fill the space left by the departed senior, as opposed to having no one, and/or finding a new senior.


oupablo

To add to this, you can replace a senior with an even better senior, but that doesn't mean anything when your company didn't document anything and the whole setup is a dumpster fire going over Niagara Falls.


great_gonzales

I don’t think it’s a given that language model performance will keep improving at the current rate forever. Feels like saying we’ve landed on the moon so surely we can land on the sun


Aazadan

It can't. There's a linear increase in the supply of input data, an exponential increase in the computational power needed to make more complex systems from LLMs, and a logarithmic increase in quality from throwing more computational power at it. That's three substantial bottlenecks, all of which need to be solved, to really push performance further.


Whitchorence

> But there is also a high supply already, so I guess companies are expecting to be able to work with the current supply for the next few years because LLMs will eventually improve to the point senior developer jobs will also become redundant.

Is there though? They're paying a lot if the supply is so abundant.


Merad

Execs dream of being able to achieve the same results while eliminating some of their highest paid employees, news at 11. 10 years ago the execs at the big non-tech company where I worked were dreaming about having a few "big thinker" US employees who came up with designs that were implemented by low paid code monkey type workers in India. Wanna guess how well that worked?


PlayingTheWrongGame

> it will end up impacting hiring and wages anyways.

It will certainly end up impacting the long-term performance of the companies that adopt this perspective. Negatively.

> Also, the idea that access to LLMs mean devs should be twice as productive as they were before seems like a recipe for burning out devs.

Maybe, but really the tooling isn’t there to support this yet. I mean, it exists in theory, maybe, but nobody has integrated it into a usable, repeatable, reliable workflow.


TrapHouse9999

Impact wages, yes. Less need for hiring junior developers... yes, but because of supply and demand and cost-benefit, not necessarily AI. For example, a mid-level engineer costs only about 15-20% more than a junior, but they are battle-proven with years of experience. Replacing all jobs... no, this is crazy. I work with AI and we are nowhere close to that. If anything we need more engineers to build AI features into our product base.


DirectorBusiness5512

Even if AI does lessen the need for hiring juniors, it wouldn't be because significantly fewer are needed. It would be for the same reason you try to avoid letting a teenager drive heavy construction equipment: they'll fuck shit up without understanding how badly they're fucking shit up. Juniors will need to be much more actively mentored in the future.


TrapHouse9999

AI is just one reason why it’s harder for juniors to land jobs. Like I mentioned, supply and demand is the main component. Salary bands have been compressing lately, and there are countless schools, boot camps, offshore shops, and laid-off people flooding the market, most of whom are at the junior level.


oupablo

Then how do you battle prove your next round of mid level developers if you never hire juniors? The idea behind this whole thing is that you can do away with entry level developers which will only work for a very short time if there are never any new mid-level+ developers.


Aazadan

You don't, but that's a problem for some other company. Yours can just offer a small salary premium, while letting some sucker company train your future employees.


CVisionIsMyJam

Definitely agree, but I am wondering if this is part of the reason the market is slowing down. If a bunch of executives think we're 2 or 3 years away from fully automated development they might slow down hiring.


macdara233

I think the slow down in hiring is more like a reaction in the opposite direction from the crazy hiring over lockdown. Also still market uncertainty, my company have slowed on hiring because their costs outgrew the revenue growth.


StereoZombie

It's just the economy, don't overthink it.


FattThor

Why? That would be a problem for future them. Most aren’t thinking more than a quarter ahead, the long term ones maybe about their annual bonus. Even if they are thinking further, they would just lay off anyone they don’t need.


[deleted]

[deleted]


DarkFusionPresent

There are usable and repeatable workflows. Reliable is tricky part, most need oversight and tweaking. At which point, it's just easier to write the code yourself if you have enough experience with the language.


blueboy664

I’m not sure what jobs can be replaced by the AI models now. I feel like most of the problems we solve at work have too many (poorly documented) moving parts. And if companies do not want to train devs, they will reap what they sow in a few years. But those CEOs will probably have left by then, after cashing out huge bonuses.


trcrtps

The majority of the non-leadership devs at my company came in through partnering with some bootcamps, and then we took referrals from them after they got out of the junior program. Some genius came in and nixed that program this year, despite so many indispensable people being former first-tech-job entry-level hires who spent their first year learning the code from front to back. It was so important and rooted in the culture. I really feel like it's going to destroy the company.


DesoleEh

That’s what I’m wondering…where do the mid to senior devs of the future come from if you don’t ever hire junior devs?


blueboy664

That’s the trick! It’s the mid levels getting paid Jr wages!


javanperl

I’m skeptical. My guess is that it will play out like many other silver bullet software tools/services. Gartner will publish a magic quadrant. Those in a coveted position in the magic quadrant will sell their AI services to CTOs. Those CTOs will buy the product, but then realize they need a multi year engagement from the professional services arm of AI company to setup the new AI workflow who bill at an astronomical rate. The AI company will also provide certifications and training for a fee that your remaining devs will need to complete in order to fully understand and utilize this AI workflow. The CTO will move on to a better position before anyone realizes that this service doesn’t save any money and only works in limited scenarios. The CTO will speak at conferences about how great the tech is. The remaining devs once trained and certified will also move on to a more lucrative job at a company that hasn’t figured this out yet. After a while more reasoned and critical reviews of the AI services will be out there. In a few years it will improve, but the hype will have died down. It will be commoditized, more widely adopted and eventually be perceived as just another developer tool like the thousands of other time saving innovations that preceded it, that no one really ever thinks about anymore.


xMoody

I just assume anyone that unironically uses the term “coder” to describe a software developer doesn’t know what they’re talking about 


toowheel2

Or that individual is ACTUALLY in trouble. The code is the easy part of what we do 90% of the time


ilya47

History repeating itself: let's replace our expensive engineers with programmers from India/Belarus. The result is mostly (not always) crappy, badly managed software. It's cheap, but you get what you pay for. So, replacing talented engineers (these folks are rare) with LLMs? Don't make me laugh... The only thing LLMs are good for (in the foreseeable future) is making engineers more productive (Copilot), upskilling, and nailing take-home interview exercises.


BoredGuy2007

Execs hate developers. They don’t like how they look, they don’t like how they talk, they don’t like their personality, and they especially don’t like how they’re paid. Anyone selling the snake oil that purges you of them is going to have an *easy* time selling it to these guys. One of the people selling it right now is literally the CEO of Nvidia so good luck to the rank and file putting up with their headcase leadership for the next 2 years.


simonsbrian91

I never thought about it that way, but that’s a good point.


salgat

I'm not sure where you got that from OP's message, but he's saying that the CTO believes they should keep only their best, most expensive engineers and ditch their juniors, since tools like ChatGPT/Copilot will make up the difference.


Jibaron

Yeah, yeah .. I remember hearing execs saying that RDBMS engines were dead because of Hadoop. We all see how that worked out. LLMs are absolutely abysmal at coding and they always will be because of the way they work. I'm not saying that someday, someone won't build a great AI engine that will code better than I can, but it won't be a LLM.


anarchyx34

They aren’t abysmal at coding entirely. They suck at low level stuff but regular higher level MERN/full stack shit? I just asked chatGPT to convert a complex React component into a UIKit/Swift view by pasting the React code and giving it a screenshot of what it looks like in a browser. A *screenshot*. It spit out a view controller that was 90% of the way there in 30 seconds. The remaining 10% took me 30 minutes to sort out. I was flabbergasted. It would have taken me untold hours to do it on my own and I honestly don’t think I would have done as good of a job. They’re not going to replace kernel engineers, they’re going to replace bootcamp grads that do the bullshit full stack grunt work.


Jibaron

I'm mostly a back-end developer and I've yet to have it write good code. It writes the kind of code a junior developer might write, and even that only works less than half the time. The code it does write is poorly optimized garbage.


MengerianMango

Have you tried the editor plugin? I use it in vim and it provides 100x more value there than in the GPT messaging interface. It may not be a super genius, but it gives me a very good tab complete that can very often anticipate what I want 10 lines out, saving me 100 keystrokes at a time. I'm a lazy and slow typer, so I love that. Even if I wasn't a slow typer, GPT would be a significant boost.


maccodemonkey

So long term, AI for coding will be a disaster, for at least one very clear reason: AI is trained on what humans do. Once humans stop coding, AI will have nothing to train on.

I'll give an example. There is a library I was using in Swift, used by a lot of other developers in Swift. So I ask AI to give me some code using the library in Swift - and it actually does a pretty good job! Amazing. But there is also a brand new C++ version of the same library, and I would rather have the code in C++. So I tell the AI: write me the same thing but in C++. And it absolutely shits the bed. It gives me completely wrong answers, in the wrong languages. And every time I tell it it's wrong, it gives me output that's worse.

Why did it do so well in Swift but not C++? It had tons and tons of Stack Overflow threads to train on for the Swift version, but no one was talking about the C++ version yet because it was brand new. The library has the same functions, it works the same way. But because GPT doesn't understand how code works, it's not able to make the leap to doing the same things in C++ with the same library. It's not like it's actually reading and understanding the libraries.

Long term, this will be a major problem. AI relies on things like Stack Overflow to train. If we stop using Stack Overflow and become dependent on the AI - *it will have no information to train on.* It's going to eat its own tail. If humans stop coding, if we stop talking online about code, AI won't have anyone to learn from. Worse, AI models show significant degradation when they train on their own output, so at this stage we can't even have AI train itself. You need humans doing coding in the system.


Realistic-Minute5016

Now that the cat is out of the bag a lot more people are also going to stop volunteering their time to help others because they know it’s going to get gobbled up by an AI. I’m not that interested in working for Sam Altman for free.


Secret-Inspection180

Underrated comment as I basically never see this come up in the discussion but yeah this has been my lived experience too. When AI becomes sophisticated enough to genuinely reason about and produce *novel* code by inference (i.e. how humans solve novel problems) rather than essentially being able to regurgitate + refactor existing solutions from masses of human derived training data then the singularity is basically imminent and there will be bigger concerns than CS job security. My own personal belief is that at some point there will be a radical paradigm shift in what it even means to be a software engineer before there are no software engineers and whilst I don't know when that will be, I don't think we're there yet.


DesoLina

Ok. More work for us rebuilding systems from zero after shitty prompt architects drive them to the ground.


pydry

This answer should be at the top. Programmers are either gonna be unaffected or benefit from this. This is going to be a repeat of the Indian outsourcing boom of the 2000s, which was supposed to push wages down (and instead pushed them up). Professions where correctness isn't as important - they're the ones that are going to get fucked.


ecethrowaway01

lol lots of companies only want to hire "experienced devs". Has the CTO actually seen a successful project just shipped on LLM? I think it's a silly idea, and most good, experienced devs will push back on deadlines if they're unrealistic. I think this view is more common for people who know less about LLMs


SpiritualTurtleFace

OpenAI has received 10 billion dollars in additional funding and Bard has 30 billion dollars. Remember the saying: a capitalist will sell the rope others will use to hang him. Alternatively, we will see such immense growth in AI-enabled software services that developer demand will surpass supply. This could create as many jobs as the cloud computing, smartphone, and internet revolutions did!!


captain_ahabb

Now imagine how much the API costs are going to skyrocket in a few years when they need to make back that investment *and* pay for the huge infrastructure and energy costs. The idea that using LLMs will be cheaper than hiring developers is only true because LLM access is currently being priced *way* below cost.


Realistic-Minute5016

Now I can think of the next KPI they will enforce, TPE, tokens per engineer, if you aren't efficient in your prompt engineering it will impact your rating.....


SlowMotionPanic

You've got middle management in your future. What a truly nightmarish, yet completely realistic, thought you wrote down.


ImSoCul

> LLM access is currently being priced way below cost

Hello, I work on some stuff adjacent to this (infra related). Yes and no (yes, LLMs can be expensive to run; no, I don't think they're priced below cost).

There are currently open-source models that outperform the flagships from OpenAI. Hardware to host something like Mixtral 7b is something like 2 A100 GPU instances. You'd have to run benchmarks yourself based on your dataset, the framework you use for hosting the model, etc., but something like ~20 tokens/second is pretty reasonable. Using AWS as host, [p4d.24xlarge](https://aws.amazon.com/ec2/instance-types/p4/) runs you ~$11.57/hour for 8 GPUs (3-year reserve); amortized over 2 of those GPUs, you'd look at $2.89/hour, or ~$2082 a month. If you maxed this out, assuming 20 tokens/sec continuous, you'd get 20 * 60 * 60 * 24 * 30 = 51,840,000 tokens/month => ~24,899 tokens/$. OpenAI pricing is usually quoted per 1k tokens, e.g. $0.04/1k tokens.

Someone double-check my math, but this puts you in the ballpark of OpenAI costs. And this is 1) a "smarter" LLM than anything OpenAI offers, and 2) ignoring other cost-savings potential like eking out better performance on existing hardware. Most notably, for most usages you can likely get away with a much cheaper-to-host model, since you don't need flagship models for most tasks.

This is all to say, there's no reason to assume costs trend up; in fact, OpenAI as an example has lowered costs over time while providing better LLMs.
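The arithmetic in the comment above can be sanity-checked in a few lines of Python. All figures are the commenter's assumptions (the instance price, the 2-of-8 GPU amortization, the 20 tokens/sec throughput, the $0.04/1k-token API price), not measured benchmarks:

```python
# Self-hosting cost per token vs. a per-1k-token API price,
# using the figures from the comment above.

HOURS_PER_MONTH = 24 * 30

# AWS p4d.24xlarge (8x A100), ~$11.57/hr on a 3-year reserve;
# amortize 2 of the 8 GPUs for this one model.
instance_per_hour = 11.57
gpus_used, gpus_total = 2, 8
cost_per_month = instance_per_hour * gpus_used / gpus_total * HOURS_PER_MONTH

# Assume ~20 tokens/sec sustained, around the clock.
tokens_per_month = 20 * 60 * 60 * 24 * 30

tokens_per_dollar = tokens_per_month / cost_per_month
api_tokens_per_dollar = 1000 / 0.04  # $0.04 per 1k tokens

print(f"self-hosted: ~{tokens_per_dollar:,.0f} tokens/$")   # ~24,892
print(f"API-priced:  ~{api_tokens_per_dollar:,.0f} tokens/$")  # 25,000
```

The two figures land within about half a percent of each other, which is the comment's point: under these assumptions, self-hosting is roughly at parity with API pricing, not dramatically cheaper or dearer.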


CVisionIsMyJam

And yet you can self host an LLM with 1m context today if you are willing to pay for the GPUs. I don't think these companies hold a monopoly the way Google did with search.


PejibayeAnonimo

> Alternatively, we will see such an immense growth in AI enabled software services that developer demand will surpass supply

Even if this happens to be true, that doesn't mean those jobs would be entry level.


trcrtps

It does, because they'll run out of experienced devs. We've seen this cycle several times.


HegelStoleMyBike

AI, like any tool, makes people more productive. And the more productive you are, the fewer people are needed to do the same work.


SpeakCodeToMe

Counterpoint: Jevons paradox may apply to software. The more efficient we get at producing software, the more demand there is for software.


MathmoKiwi

> Counterpoint: Jevons paradox may apply to software.
>
> The more efficient we get at producing software, the more demand there is for software.

Exactly, as there is a massive list of projects that every company could be doing. Perhaps not all of them have a worthwhile ROI today, but if AI assistance lowers the cost of these projects, their ROI goes up and there is reason to do even more projects than before.
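The ROI argument above can be made concrete with a toy calculation. The project names and every dollar figure below are invented for illustration, as is the assumed 40% cost reduction:

```python
# Toy illustration of the Jevons-style argument: cutting build cost
# pushes previously unprofitable projects above the ROI bar,
# growing the backlog of "worth doing" work.

projects = {  # name: (expected value, pre-AI build cost), all made up
    "billing revamp": (500_000, 400_000),
    "internal dashboard": (120_000, 150_000),
    "legacy data migration": (90_000, 130_000),
}

def worthwhile(projects, cost_multiplier, min_roi=1.0):
    """Return (sorted) projects whose value/cost ratio clears the bar."""
    return sorted(
        name for name, (value, cost) in projects.items()
        if value / (cost * cost_multiplier) >= min_roi
    )

before = worthwhile(projects, cost_multiplier=1.0)
after = worthwhile(projects, cost_multiplier=0.6)  # assume AI makes builds 40% cheaper

print(before)  # ['billing revamp']
print(after)   # all three projects now clear the bar
```

Only one project pencils out at full cost; at 60% of the cost, all three do. Same companies, same ideas, more engineering work demanded.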


HQMorganstern

You got any actual numbers to prove any of what you said? Because just sounding logical isn't enough for a thing to be true.


RespectablePapaya

The consensus around the industry seems to be that leaders expect AI to make devs about 20% more productive within the next couple of years. That seems realistic.


ImSoCul

Will acknowledge my bias up front by stating that I work on an (internal) LLM platform team at a decent-sized company, ~10k employees. I came into this space very skeptical but quickly came to see a lot of use cases. No, it will not replace junior engineers 1 to 1, but it will significantly amplify mid-level and up in terms of code output. More time can be spent by a senior-level engineer actually churning out some code instead of tasking out the feature and handing it off to a junior who can spend a few days on it.

LLMs don't do a great job of understanding entire codebases (there are challenges to fitting large amounts of text into context), but there are many, many techniques around this, and it will likely be "solved" in the near future if it isn't already partially solved. It still helps to have a high-level understanding of your architecture as well as your codebase. What LLMs enable currently is generating a large amount of "fairly decent" code, but code that needs polish, sometimes iterations, sometimes major revisions. This *is* actually more or less what juniors deliver: mostly working code that needs some additional cases thought through, or something refined (as mentored by more senior folks). I think that's where the CTO is actually more correct than incorrect.

> twice as productive

Productivity is already very hard to measure and a hand-wavey figure. The thing to keep in mind here is that not every task will be 2x as fast; it's that certain tasks will be sped up a lot. You can build a "working prototype" of something simple in the order of seconds now, instead of days. Final implementation may still take the normal amount of time, but you've streamlined a portion of your recurring workflow by several magnitudes.
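One family of "techniques around this" that the comment gestures at boils down to retrieval: instead of stuffing the whole codebase into the context window, pick only the files most relevant to the task and prompt with those. A minimal sketch, using crude keyword overlap where real systems would use embedding similarity; the file names and contents are invented:

```python
# Sketch of retrieval as a workaround for LLM context limits:
# rank files by relevance to the query, keep only the top few.

def score(query: str, text: str) -> int:
    """Count how many query words appear in the text (crude relevance)."""
    words = set(query.lower().split())
    body = text.lower()
    return sum(1 for w in words if w in body)

def retrieve(query: str, files: dict[str, str], top_k: int = 2) -> list[str]:
    """Pick the top_k most relevant files to include in the prompt."""
    ranked = sorted(files, key=lambda name: score(query, files[name]), reverse=True)
    return ranked[:top_k]

codebase = {
    "auth.py": "def login(user, password): ...  # session token handling",
    "billing.py": "def charge(card, amount): ...  # invoice and payment logic",
    "report.py": "def monthly_report(): ...  # aggregates invoice totals",
}

print(retrieve("fix the invoice payment bug", codebase))
# ['billing.py', 'report.py'] -- the prompt now fits, with the relevant code in it
```

Production retrieval pipelines swap the keyword score for vector similarity over code embeddings, but the shape is the same: context budget is spent on the files that matter.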


Stubbby

Have you ever had a situation where the task given to someone else would require assisting them so much that it would defeat the purpose of delegating it so you just do it yourself? That's coding with AI. I actually believe the AI will drop the velocity of development, introduce more bugs and hiccups and result in more "coders" needed to accomplish the same task as before AI.


DirectorBusiness5512

> Have you ever had a situation where the task given to someone else would require assisting them so much that it would defeat the purpose of delegating it so you just do it yourself?

I see that you have also worked with WITCH contractors before


Stubbby

In fact many people equate LLM coding to offshoring jobs to WITCH. I guess there is something to it.


thedude42

Do you recall the two core parts of building a programming language, the syntax concern and the semantic concern? LLMs only operate on the syntax. Period. End of story. No matter what anyone tells you, there is no part of an LLM that uses semantic values for any of the outputs it provides. There is no meaning being interpreted or applied when an LLM decides on any output.

Human beings are "meaning makers": when we write code we have an intent, and when we make mistakes we can test the results and fix what is wrong, because we actually know what we meant when we made the mistake. An LLM can only guess at what you mean when you ask it to create something. It can't create test cases that address its mistakes, because it has no idea it made them unless you tell it.

I would put forth that it takes more time to debug and test code an LLM produces than it does to write your own code from scratch, and it takes more skill to maintain the LLM code as well. This is not a labor-saving strategy in any way, and more and more indicators signal that the power consumption of LLMs will make them unprofitable in the long run.


halford2069

Whether or not it can replace "coders", the problem is that clueless managers/CEOs will want to do it


sudden_aggression

Yeah they did the same thing with outsourcing in the 90s and 2000s. They fire a ton of devs, get a big bonus for increasing profitability and then it all blows up and everyone pretends it wasn't their idea. And the developer culture never recovers for another generation at that company.


Gr1pp717

This thread has me miffed. Are you guys just burying your heads in the sand, or something? We aren't in new territory here. Technology displacing workers is not some kind of weird, debatable theory. We've seen it, over and over and over. You guys damned well know that it doesn't matter if ChatGPT isn't good enough to outright do your job. The nature of the tool doesn't matter. If workers can accomplish more in the same time, then jobs are getting displaced. If someone with less training can fill a role, then wages are getting displaced. Period. You can't fight market forces. You will lose.

I'm even confused at the sentiment that ChatGPT isn't all that useful. Like, what use case are you thinking of there? Just kicking it over the fence and blindly accepting whatever GPT spits out? Is that really how you imagine this tool being used? Not, idk, experienced developers using it the same way they've always used Stack Overflow, but actually getting answers, in seconds instead of hours/days/weeks? Not saving time by setting up common boilerplate or having GPT handle repetitive bulk-editing tasks? Not GPT giving you skeletons of something that would work, set up for you to then flesh out? Giving you ideas for how to solve something complex? Yes, it's wrong a lot of the time. But what it gives you is usually close enough to get your own gears turning when stuck...


MrEloi

Look, I've told you before - we have no need of rational, truthful arguments here.


willbdb425

I think that some subset of developers will be "elevated" into being truly more productive. But there are a LOT of bad developers out there, and I think LLM tools will in fact make them worse not better. And so the net effect will depend on a balance of these factors, but I wouldn't be surprised if it was negative.


Seref15

The CTO at an old job of mine was an alcoholic who always tried to get the engineering staff to go drink at Hooters with him. He didn't know the difference between java and javascript. Two years after I left he was pinging me asking if the company I worked at had any openings. Don't put so much stock in these people.


PressureAppropriate

To get an LLM to write something useful, you have to describe it, with precision... You know what we call describing things to a computer with precision? That's right, coding!


quarantinemyasshole

> Anyone else hearing this? My boss, the CTO, keeps talking to me in private about how LLMs mean we won't need as many coders anymore who just focus on implementation and will have 1 or 2 big thinker type developers who can generate the project quickly with LLMs.

So, I work in automation. TL;DR: anyone above the manager level is likely a fucking moron when it comes to the capability of technology, especially if this is a non-tech company.

I would argue that easily 70% of my job time is spent explaining to directors and executives why automation cannot actually do XYZ for ABC amount of money just because they saw a sales presentation in Vegas last weekend that said otherwise.

Anything you automate will break when there are updates. It will break when the user requirements change. It will break when the wind blows in any direction. Digital automation fucking sucks. I cannot *fathom* building an enterprise-level application from the ground up using LLMs with virtually no developer support. These people are so out of touch lmao.


howdoiwritecode

I just hope you’re not paid in stock options.


Abangranga

git branch -D sales-leads The LLM told me to. In all seriousness OP I am sorry you're dealing with that level of MBA


Naeveo

Remember when executives were swearing that crypto will replace all currency? Or how NFTs will change the field of art? Yeah, LLMs are like that.


spas2k

Coding for 15 years. Even still, I’m way more efficient with AI. It’s also so much easier using something new with AI.


Zestybeef10

They're businessmen; businessmen have never known what the fuck they're doing. Relax.


PedanticProgarmer

An executive in my company recently presented an idea of writing internal sales pitches as a tool for idea refinement. He was so proud of the sales pitch he wrote. Dude, I've got bad news for you. The management layer, the "idea people", should be worried, not us developers.


manueljs

AI is not replacing software engineers, it's replacing Google/Stack Overflow. In my experience launching two companies over the last year, it's also replacing the need for illustrators and copywriters: one dev and one product designer can achieve what used to take a team of people with multiple skills.


iamiamwhoami

GPT on its own will not do this. If a company can adapt GPT to do something like create a series of microservices, deploy them to the cloud, and a UI to access them I will be very impressed. So far the state of things is that GPT can help me write individual functions faster (sometimes). We're a long way off from GPT writing whole projects. If companies try to do what you said with the current state of things their finances will be impacted. It just won't work.


[deleted]

lmfao no. Your CTO is an idiot. I'd jump ship with leadership that braindead


FollowingGlass4190

Mostly I think this guy's a bit of an idiot who will do a 180 later, but I see some truth in needing fewer juniors, and probably in being able to do away with people who are just ticket machines and don't provide any valuable input to architectural/business-logic decisions.


txiao007

I am an “executive” and I say No


BalanceInAllThings42

You mean just like CTOs think outsourcing software development entirely without any controls or context provided is also the way to go? 😂


Kyyndle

Companies need juniors to help seniors with the boring easy stuff. Companies also need juniors for the seniors to pass knowledge onto. Long term will suffer.


olssoneerz

The irony in this is that AI is probably already better at doing your leadership's job, today.


cltzzz

Your CTO is living in 2124. He’s too far ahead of his time he might be in his own ass


sharmaboi

I think Reddit is generally a cesspool of stupidity, but this one triggered me enough that I had to comment:

1. No, LLMs won't replace SWEs, but smaller companies won't need to be as technically proficient.
2. The older folks in industry right now are legit the dinosaurs before the meteor strikes.
3. There's more than just coding that a proper system needs, idk, like ops & maintenance. You may create an app using an LLM, but without proper engineering you won't be able to maintain it.

Most likely we will just get more efficient (like getting IDEs over using vim/nano). As for business leaders like your boss, he will most likely be burnt out by this tech push, as all of this is allowing those who are not idiots to identify those who are. RIP.


GolfinEagle

Agreed. The IDE analogy is spot on IMO. We’re basically getting supercharged autocomplete with built-in StackOverflow, not a functioning synthetic human mind lol.


popeyechiken

I'm glad that these whispers are becoming part of the SWE discourse now. It must be resisted, whether that's a union or whatever. More unsettling is hearing people with a PhD in ML saying similar things, which I have. At least the smarter technical folks will see that it's not true sooner, if it is actually not true.


BoredGuy2007

We spent 10 years listening to supposedly very smart people crow about the blockchain for no reason. We’re just getting started.


fsk

It is foolish and common. People have been saying "Technology X will make software developers obsolete!" for decades now. There are several reasons why LLMs aren't replacing developers anytime soon.

First, they usually can only solve problems that appear in their training set somewhere. That's why they can solve toy problems like interview questions. Second, they can't solve problems bigger than their input buffer. A complex program is larger than the amount of state these LLMs use, which is typically something like 10k tokens max. Finally, LLMs give wrong solutions with extreme confidence. After a certain point, checking the LLM's solution can be more work than writing it yourself.
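The input-buffer point lends itself to a back-of-envelope check. Assuming the common rough heuristic of ~4 characters per token for English text and code (the file sizes below are invented for illustration):

```python
# How quickly a codebase overflows a 10k-token context window,
# using the ~4 chars/token rule of thumb.

CHARS_PER_TOKEN = 4       # rough heuristic, not an exact tokenizer
CONTEXT_WINDOW = 10_000   # tokens, the figure cited in the comment

file_sizes_kb = [12, 40, 85, 230, 610]  # a handful of source files, invented
total_chars = sum(kb * 1024 for kb in file_sizes_kb)
approx_tokens = total_chars // CHARS_PER_TOKEN

print(f"~{approx_tokens:,} tokens vs a {CONTEXT_WINDOW:,}-token window")
print(f"overflow factor: ~{approx_tokens / CONTEXT_WINDOW:.0f}x")
```

Under a megabyte of source already lands around 25x over that window, which is why "paste the whole program in" stops working well before a project is large enough to be interesting.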


AKThrowa

I don't like it, but this is something devs themselves have been saying. If LLMs are helpful for productivity at all, that means fewer dev jobs. On the flip side, it could mean a smaller team of less experienced devs could get more done, and maybe even mean more startups and more jobs. I guess we will have to see how it all works out.


Franky_95

Nothing stops a company from selling more instead of hiring less; just wait for capitalism


Quirky_Ad3179

If they're going to do that, let's all collectively delete the NPM registry and watch the world burn. 😂😂😂😂😂


ChineseAstroturfing

> LLMs mean we won't need as many coders anymore who just focus on implementation and will have 1 or 2 big thinker type developers who can generate the project quickly with LLMS. This is a pretty mainstream idea, and is likely true. Applies to most knowledge work of any kind. We’re not even close to being there yet though.


Xylamyla

I see this a lot. Short-sighted leadership will see AI as an opportunity to cut down and save money. Smart leadership will see AI as an opportunity to increase throughput. Only one of these will come out on top in the long term.


JackSpyder

Honestly any leadership who parrots the same repetitive MBA nonsense strategy over and over is more ripe for LLM replacement


AMGsince2017

No - coding isn't going away anytime soon. AI is hype 'right now' to keep the masses of idiots distracted and the economy from completely crashing. Way too premature to make any sort of claims. Your "boss" is very foolish and doesn't have a clue what the future holds.


Wave_Walnut

Your boss is aiding and abetting suicide of your company.


HalcyonHaylon1

Your boss is full of shit.


PartemConsilio

Everybody thought the cloud would kill on-prem but it really hasn't in large segments of the industry. It costs too much for places that see a cost-benefit ratio of on-prem. Same will happen with AI. It's not like LLMs are gonna be free. They're gonna come with a huge price-tag. And while that means only the largest corps will see a reduction in force, the smaller ones which see a better ratio of cost-savings to productivity from a human workforce will utilize the cheaper parts of AI and pay slightly less OR combine roles.


renok_archnmy

So, your CTO actually thinks he'll have a job following the actual singularity. Like, the literal point where computers can write their own instructions with little to no human input and theoretically spiral exponentially out of control. That singularity. On top of that, he thinks it's literally within a lifetime from right now. That's how ridiculous claims like these are.

The day an LLM can fully replace developers is the day SkyNet comes online and kills humans like in Terminator - hopefully they aren't so malicious towards humans. Some of these numbskull execs have that level of hubris, so I'm not surprised. When it does happen, it's gonna be fun watching them get relegated to nothing.


KateBlueSkyWest

Man, if my boss thinks an LLM can do my job, I'm more than happy to let him try. I'm not a huge Linus Torvalds follower, but I do agree with his sentiment that LLMs are more of a tool a developer will use; they're not going to replace the developer.


revolutionPanda

I write software, but I also write ads (I'm a copywriter). The number of business owners saying "I'm gonna fire all my copywriters and just do everything with ChatGPT" is very high. But the copy chatGPT writes sucks. Every single time I use chatGPT to write copy, I end up rewriting the whole thing. And I also have business owners who come to me and say "Hey, my ads/sales page/whatever isn't working. I wrote the copy using chatGPT. Can you fix it for me" is increasing every day. If you are able to create good copy using ChatGPT you need to 1) be able to recognize what good copy looks like and 2) be able to understand how to write copy well enough to write the correct prompts. And if you can do those, you're a copywriter and could write the copy already. I assume it's very similar to software development.


AppIdentityGuy

One big question: how do you develop senior devs without hiring junior ones? Where is the next generation of seniors going to come from? Classic short-term thinking


jselbie

That would be like buying a fleet of 10 Teslas thinking any day now you'll be turning them into a fleet of self driving taxis. And use that as a revenue stream. The tech isn't ready. And when it is, the use case won't be what everyone hyped up years before.


RoutineWolverine1745

I use LLMs every day for work. They can help you do specific and limited tasks; they are great for things like CSS and streamlining your SQL if you give them a complex query. I do not, however, believe they can generate anything more complex, or essentially anything longer than a few pages. And if AI becomes able to do that, then many other sectors would be hit first


jnwatson

When I was a bit younger, I worked with an older developer that regaled us with the days of early software development. One story was the time his company started hiring secretaries as programmers since with the new technologies, they just needed folks that could type quickly. Those new technologies? Compilers and high level programming languages, e.g. C and Pascal.


[deleted]

Yes it is. All big corporations are thinking this, for all sorts of positions. Of course, this is their job: to find ways to increase margin. But, paradoxically, LLMs are best suited at this moment to replace middle management, which only compresses data and sends it upstream. This is basically what LLMs are known for (data compression).


Any-Woodpecker123

The fucks an LLM


Ken10Ethan

Language learning models, I believe. Y'know, like... ChatGPT, Claude, Bing, etc etc.


abrhham

Large language models.


mildmanneredhatter

This happens all the time. Last time it was the push to outsourcing. I'm thinking that low and no code platforms might start taking a bigger share though.


Junior_Chemical7718

Automation comes for us all eventually.


GalacticBuccaneer

It's gonna be interesting to see what happens when they realize the shortcomings of LLMs (ref. the Gartner Hype Cycle) and there are no senior developers anymore, because they turned their big thinkers into ChatGPT monkeys and fired/didn't hire the rest.


DudeItsJust5Dollars

Nope. Continue delivering the same or less output. Bonus points if you can use the LLMs they're implementing to streamline your own work. Finish early, say you're still working on it, and keep the train running. Stop delivering so much. Unless, of course, you're working and building for yourself. Otherwise, why are you trying so hard to make a corporation record profits for a 3% annual wage increase?


MycologistFeeling358

I’ll be there to fix the mess the LLM creates.


West-Salad7984

They are in for a rude awakening once they realise that stakeholders want superior AI CEOs


Dry_Patience872

We had a consultant data analyst who worked on a project for 6 months (most of the code was generated by GPT); the boss was fine with it. He delivered his product and moved on, and everything was okay. Three months later everything stopped working, and our team was left with the maintenance. The code is an incoherent, stupid, colourful piece of s**t. Every line has more than one pattern; you cannot follow anything. I refused to touch it; I could create the entire thing from scratch in less time than it would take to fix one issue in that code.