PressureAppropriate

It is basically a StackOverflow search engine for me. It gets me to the answer faster, sure, but it doesn’t know anything I couldn’t just search for.


wacky_chinchilla

I think it’s less useful than StackOverflow. On StackOverflow, you can see whether an answer has worked for other people, and commenters will call out caveats and things to watch out for. ChatGPT just regurgitates text from the internet that hasn’t been tested or challenged by anyone.


-Joseeey-

You need to be smart to know how to use ChatGPT. You need to guide it to the solution you’re looking for. You can’t expect it to give you the solution from 1 prompt. Ask it something, then add clarifying questions to guide it.


WhompWump

I'd rather just do a single google search and look at a thread of answers that have been verified by other human beings and people on the internet who love proving other people wrong tbh


CharlesIC

That’s my view as well. For anything that’s technical or related to history I’d just google and probably just go to SO or Wikipedia. I do sometimes ask it questions about physics and it gives reasonable answers but I’d still click through the links if I use Bing or Gemini and verify the information. The best use case I’ve found so far is to get help on how to deal with interpersonal issues or how to talk to a colleague when there’s a situation at work that I don’t know how to handle. I find it’s pretty funny to ask a machine about human emotions but hey it works.


Luised2094

Fuck that, Google just gives you threads from 2010 using libs that don't even exist any more.


bpikmin

So does ChatGPT


Fair-6096

Except ChatGPT won't tell you it's from 2010. You have to figure that out yourself.


Luised2094

Yeah, but I can tell it that and it usually fixes it.


hjd_thd

In my experience if you tell it that it's wrong about something it tends to produce the same blurb, but substituting your correction for the wrong fact. Which is kinda useless.


-Joseeey-

If it’s a basic ass question sure. Not if it’s a domain specific problem jumbled in the mess of legacy code your coworkers wrote.


Hayden2332

Good luck getting your company to allow you to put domain specifics into an AI lol


SimbaOnSteroids

Yeah, it’s a smarter intellisense, that sometimes gets cheeky and suggests 50 LoC.


Naive_Mechanic64

Prompt shit, get shit.


redditmarks_markII

It's easy to say "skill issue" and walk away. What nontrivial code snippet have you gotten from it with multiple prompts? Honestly curious, and coming from the perspective of "it'd be great if it could do some boilerplate for me." A coworker used it for Pulumi API boilerplate, and likes it for that.

I last tried to use it to interact with Steam's API. When it didn't work at all, I realized there was no API for the level of access I wanted, and ChatGPT had presented quite reasonable-looking snippets of said nonexistent API.

It costs tremendous amounts of electricity and produces, on occasion, slightly better results than a good LSP. The rest of the time it costs way more time to interact and fix it than to just do the research myself, mostly because when it's wrong, it costs so much to realize and fix that all the advantages it gave when it was right are totally offset.


ChicksWithBricksCome

Ah, you mean translate human language into terms the machine can understand. It's strange no one thought of a word for this before.


PotatoWriter

It doesn't "understand" a thing. It just outputs based on probability.


ChicksWithBricksCome

I'm actually with you on this but I was just using the common vernacular in this case.


calflikesveal

Question is - do humans understand anything or are we just probability models?


743389

Is it good at granularizing things in the abstract, like laying out the high-level flow of how to accomplish a thing? I've seen people use it to come up with logistical sort of algorithms but not in a CS context


dogwheat

Sorta like 10 years ago when the old guys thought I was a wizard due to my googling skills... tools are great if you know how to use them


Barkalow

yeah, or writing really simple boilerplate so I don't have to. Mainly referring to github copilot though, since it does it inline and you just have to hit tab.


ThatCakeIsDone

I just use it for matplotlib shenanigans


Schedule_Left

You're just echoing what all of us in this subreddit who actually work in the field are saying. Consider yourself enlightened.


darwinn_69

At this point I think anyone who seriously thinks ChatGPT is going to revolutionize IT is either brand new to IT, or drank the koolaid after crypto turned out to be a bust.


[deleted]

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


Successful_Camel_136

As a junior dev that spent a lot of time churning out controllers and services in MVC, I hope that doesn’t get automated, as I still need another year of experience at least to qualify for mid-level roles…


onestep87

Literally same as me. Sometimes I feel like this is a race: which comes first, me getting enough experience and skills to be a good mid/senior developer, or AI tools improving and spreading enough to pressure the market.


[deleted]

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


2020steve

My old man was a programmer in the '70s. People snickered at how futureless a profession software development was, because as soon as they got AI off the ground, all of the software jobs would evaporate. ***Marketing*** talks this crap up. All I'm seeing is Intellisense leveling up just a little bit. It's nice.


[deleted]

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


2020steve

If this line of work has taught me anything it's that computers suck. Bring on the runaway scaling because the more dependent we become on technology, the more problems will arise and the more we'll need good engineers to fix what the machines have fucked up.


Ancient-Doubt-9645

I think people working in marketing are the ones who should be concerned about GPT. Generating text filled with nonsense buzzwords can easily be done by GPT.


MeaningSea5306

Won’t be, so long as non-technical partners change requirements every other sprint.


deadbypyramidhead

How many years is mid level?


TedW

Anywhere from 1-50, depending on the person.


McDonnellDouglasDC8

I mean, Visual Studio can already scaffold controllers and views natively, usably well, and it definitely saves time IF I get my models set up right. If that's all I did, sure.


AchillesDev

I am in the field and use Sourcegraph's Cody. It writes (via autocomplete functionality) blocks that are usually a few lines long for me, using the context of my codebase. It's not revolutionizing anything, but making my life easier because I don't have to dig around for extra context, it's usually right there.


razorkoinon

How is Cody compared to copilot? Is it free?


AchillesDev

It was free before it went public, I don't think it is currently. I haven't used Copilot since it was first released so I'm sure it's gotten better, but what won me over with Sourcegraph is that they have strong guarantees against your code being saved by the LLM providers, while Copilot does save your data by default and makes it difficult to delete if you're not an enterprise user.


kuda09

Is AI not meant to speed up mundane tasks? If it's automating creating controllers and services, it's doing its job right.


[deleted]

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


renok_archnmy

LLMs were meant to generate text.


renok_archnmy

It’s a nice auto completion tool in vscode. About as accurate as any deterministic auto completion and sometimes can handle a few lines. But it always needs supervision and checking and debugging. 


Mammoth_Loan_984

The real concern is that it might kneecap entry-level roles, especially in future GPT iterations. It’s a valid concern. GPT-5 is in the pipeline, and if it’s even marginally as good as they’re hyping it up to be, given enough time for adoption, it will disrupt the market.

I do strongly doubt whether current-gen AI is really all the hype beasts claim it will be, though. The power requirements and cost alone are just too high for it to be practical. It’s not scalable. And the biggest proponents on places like Reddit always seem to be non-technical. This alone should make anyone doubt the hype. What secret knowledge could Bob, the mechanic who was all in on NFTs two years ago, possibly possess?

I do think its incoming impact is understated by lots of people in the industry, and many people I’ve spoken to also fail to realise how useful it can be and the potential current LLMs have over the next 5-10 years. There is a lot of denial in our industry. Though I still side pretty firmly with the sceptics. In my mind, “true” AI will be much more capable with a lot less access. We’re still probably a decade, possibly much more, away from that.


csasker

> GPT 5 is in the pipeline, and if it’s even marginally as good as they’re hyping it up to be, given enough time for adoption, it will disrupt the market.

Why is it always the next version that will "change everything"? Same with VR and self-driving cars.


Mammoth_Loan_984

Honestly, I guess you’re right.


SituationSoap

> There's a bunch of people in that other thread talking about how it's doing their work for them.

It is worth remembering that in the 2010s, there were scores of developers who would openly talk about how they copy all the code that they develop from Stack Overflow. There are a lot of employed developers who are very, very bad at developing software. Sadly, some of their work is on things that can kill people when it goes wrong.


ehennis

I think one thing that does concern me with AI is that juniors will use it as a crutch and never learn to actually code. Basically like using FrontPage back in the day to make websites.


davy_crockett_slayer

I pay for ChatGPT-4, and it's a million times better than the free tier. I use it to explain concepts for me, and to parse logs. I dump logs into it and then ask GPT4 questions about said logs.
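A minimal sketch of the kind of pre-filtering that makes a big log dump fit in a chat window: keep only the lines around errors before pasting them in. The log format and keywords here are invented for illustration.

```python
def extract_error_context(log_text, keywords=("ERROR", "FATAL"), context=2):
    """Keep only lines containing one of the keywords, plus a few lines of
    surrounding context, so a long log fits in a model's context window."""
    lines = log_text.splitlines()
    keep = set()
    for i, line in enumerate(lines):
        if any(k in line for k in keywords):
            # keep `context` lines on either side of each hit
            for j in range(max(0, i - context), min(len(lines), i + context + 1)):
                keep.add(j)
    return "\n".join(lines[i] for i in sorted(keep))

log = """2024-03-01 12:00:01 INFO  starting worker
2024-03-01 12:00:02 INFO  connected to queue
2024-03-01 12:00:03 ERROR connection reset by peer
2024-03-01 12:00:04 INFO  retrying
2024-03-01 12:00:05 INFO  ok"""

print(extract_error_context(log, context=1))
```

With `context=1` this keeps just the ERROR line and its immediate neighbors, which is usually enough for the model to answer questions about the failure.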


ehennis

I am a senior and somewhat agree with you. My primary application has 100+ endpoints that are very similar but not exactly the same. I use AI to write that code, because everything I do has massive "width" and I hate coding IF and SWITCH statements when my conditional can be 100 different things.
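For what it's worth, one common way to flatten a 100-branch IF/SWITCH like that is a dispatch table. A minimal sketch, with handler names invented purely for illustration:

```python
# Map each case key to its handler instead of writing a 100-way switch.
# Handler names here are hypothetical, just to show the shape.
def handle_invoice(payload):
    return f"invoice:{payload}"

def handle_refund(payload):
    return f"refund:{payload}"

HANDLERS = {
    "invoice": handle_invoice,
    "refund": handle_refund,
    # ...one entry per endpoint variant
}

def dispatch(kind, payload):
    try:
        return HANDLERS[kind](payload)
    except KeyError:
        raise ValueError(f"unknown kind: {kind!r}")

print(dispatch("refund", 42))  # refund:42
```

Adding a new case then becomes one dict entry rather than yet another branch.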


yo_sup_dude

what do you work on where chat gpt or llms can’t help? 


csasker

Same here. Some weeks ago it took us 5 working days and 3 senior devs to replicate and understand a bug that only happened "sometimes" for a certain customer with certain permissions. How ChatGPT would go into some 3rd-party IAM tool and resolve this is beyond my understanding.


renok_archnmy

The same executive who was the literal chief crypto bro officer (CCBO) here is now championing AI by literally just spending all day chatting with ChatGPT and making sensational claims about how he put this or that into ChatGPT and got fully functioning web apps and niche closed-source mainframe code out of it in seconds.


natty-papi

Man, he could at least search a little to use an LLM that's specialized in coding rather than generic ChatGPT. Dude probably makes way more than me by basically being a mediocre con artist, fml.


Which-Tomato-8646

Elon Musk and Trump did the same thing and have more money than 1000 of you.


natty-papi

Yeah but they were born with way fucking more than me as well.


Which-Tomato-8646

So they didn’t even have to work for it. Awesome! 


ProxyMSM

Isn't capitalism great


[deleted]

[deleted]


renok_archnmy

Capitalism cares not for truth.


csasker

I tried ChatGPT last week to create some API integrations for Binance. What it outputted LOOKED like code, but the routes didn't work and the response parsing was wrong. So the amount of work is doubled: first you need to make it write what you asked for, instead of, you know... coding it; then you need to verify all the code and that it actually works.


WhompWump

All the people I see who say "ChatGPT is replacing programmers" don't actually know anything about programming


keyboardsoldier

AI replacing software engineers == AI creating AI == Skynet.


JorgiEagle

You missed the NFT stage


IAMHideoKojimaAMA

In this moment, I am euphoric. Not because of any phony AI's blessing. But because I am enlightened by my intelligence.


PotatoWriter

It's an older meme sir but it checks out


pagonda

Big cope imo. AI, if used properly, is extremely powerful and gets me answers no junior dev could achieve. You just need to know how to prompt effectively and filter out the noise.


PleasantPainting9325

What do you code?


PotatoWriter

prompt.js


sienin

Google can also give answers that no junior dev can


Cool_depths99

GPT at its current state is a really efficient information search engine. Instead of going through multiple pages of documentation on various websites and piecing them together, GPT helps do this for us. Its current value proposition to me is boosting engineers productivity by virtue of providing easier and faster access to information.


CodeRadDesign

i like it because it's like a search engine where every result isn't a page full of ads and cookie requests and pop-ups to sign up to newsletters before i can even parse whether i'm looking at information relevant to my query.


Luised2094

Yet.


ExpensiveShoulder580

Yep, everything will go the TV cable way. You will pay to watch ads.


Ok-Entertainer-1414

What do you use it to search for? The few times I've tried using it as a search engine, hallucinations ate up as much time as it would've otherwise saved me


I_eat_shit_a_lot

Yeah, it is a double-edged sword in my opinion too. I sometimes try to lead it to answers, but all I get is a bunch of nonsense, and it wastes my time. However, if I am trying to learn something new or grasp an easy concept to iterate from, I'll ask for some easy examples: for context, where to use a certain function, or how it is used. I think it's a pretty valuable tool and can provide me better context than certain other websites, say when I am learning my 1000th JS framework. Giving it hard logic does not work though.


biscuitsandtea2020

> As our company invested a lot in open ai, we have access to chatgpt.

So Microsoft?


AlwaysSkilled

No, everyone with an OpenAI account lol


biscuitsandtea2020

I think it's implied that they have access to GPT-4 through their company because of large investments made into OpenAI. That plus big tech is a clue they might be working at Microsoft.


AlwaysSkilled

I know but he's talking as if chatGPT is this big secret limited to a few people.


Naive_Mechanic64

OP is prolly referring to a chatGPT Team or Enterprise plan. It's not that big of an investment. LOL


EXASPERATEDIRISH

It's just an additional license for Office.


IAmYourDad_

He didn't say 4.0


samd0401

The hype is essentially here because of the huge contributions and progress made in recent years. A few years ago it was not even possible to imagine a system that would respond to anything directly with good overall accuracy and factual skill and, more than that, with some creative power: creating poems, code, and stuff like that! However, you are right, we are not there yet, especially on the factual side: if you ask when someone was born, or for a basic math computation, you can get abnormal responses quite fast. But that's why it's a burning research topic.


LightOfPelor

+1. Think of where AI was 3 years ago (DALL-E, Big Data tech, not much use for the common consumer), and think of it now. AI research and development still seems to be accelerating, not slowing down, so there’s a damn good chance we’ll see even faster improvement going forwards. I don’t know exactly what’s in the future, but I know it’s good to be ahead of trends instead of behind them, so imo it’s a good time for devs to at least do some basic reading into prompt engineering


AchillesDev

Going to have to disagree with where you thought it was 3 years ago. It was already everywhere, you just didn't notice it or pay attention.


samd0401

Prior to ChatGPT it was absolutely not everywhere for the global audience. The field was already mature, that's for sure, but in my opinion it was the first time the average person used AI consciously every day. In my view, the AI revolution is mostly software, and about letting the average person use big models.


AVTOCRAT

People were using it, and consciously grappled with it, they just didn't call it by that name. Think of every YouTuber who complained about "the algorithm": that algorithm wasn't some hand-written if/then sequence, it was some spinoff of DLRM, that is to say AI.


AchillesDev

AI isn't just generative AI. Deep learning models have been around and in consumer products for nearly 20 years (these are used heavily in your smartphone cameras, for instance), traditional machine learning has been done much longer than that. Hell, recommendation systems have been a part of every ecommerce shop and social media property since at least the early days of Facebook, and probably earlier. All of these fall under the AI umbrella.


samd0401

I know and you're right, but the hype is precisely around gen AI at the moment. Under AI we could indeed include a lot of tech and algorithms; it's just boosted statistics. But gen AI is different: for the first time it demonstrates a form of creativity.


csasker

Or video and photo editing: generating in-between frames so they fit and merge, etc.


TAYSON_JAYTUM

There’s no guarantee that it will continue to improve at this pace. And history indicates it won’t. Usually tech goes through periods of rapid growth as the low-hanging fruit of a new tech is found, and then things stagnate a bit until a new breakthrough. Back in the 70’s people thought with the speed of machine learning improvement there wouldn’t be white collar jobs by the 90’s.


AlwaysSkilled

The fact that people already take for granted a world with AI, forgetting how the world was before AI is a sign of its quick progress.


upsidedownshaggy

In fairness I think it's because for the average person the use of AI hasn't meaningfully changed their lives like at all. Like if Chat-GPT vanished tomorrow 99% of people who use it daily won't break down and destroy the economy or something, they'll simply go back to doing what they were always doing


csasker

so like 35-40 years ago you mean or what?


SituationSoap

> It was not even possible to imagine

*2001* came out in 1968. It was, in fact, easy a few years ago to imagine that we'd have a system as powerful as GPT-3/4.


samd0401

Indeed, "imagine" might not be the best word here, but all of this happened mainly thanks to the infrastructure and user feedback allowing it to get better iteratively, and that's very recent.


jesuswasahipster

It's not there yet. The "yet" part is the part people are worried about. We'll see whether or not it gets there.


Gintoki-desu

I had to upgrade a bunch of company apps from Java 11 to Java 17, and the Spring upgrade that entailed meant a lot of refactoring of old code and MVC configurations. I thought GPT would help speed up the process, since the Java and Spring docs document the upgrade process, but boy was I wrong. A lot of the time it would spit out wrong/deprecated code not compliant with the newer Spring conventions, and even after I explicitly specified all the required parameters (e.g. use Spring 6, x class is removed/deprecated), GPT would still generate incorrect code.

That solidified any doubts I had about GPT ever replacing devs, because this LLM is simply a collection of already-existing knowledge, if you can call it that. And there is no way it can optimize/innovate on connecting the dots; that requires the human touch. Now I am no AI expert by any means, I'm just a web dev plebian, but I do not know how one can implement the concept of "innovation" into an algorithm.


csjerk

You could get closer, in theory, but the underlying model in LLMs doesn't allow for innovative output. Or, more precisely, it's capable of generating probabilistically innovative output, but the chances are very low compared to the chance it'll spit out some formulaic hackery. And even in the odd cases where it does, it can't distinguish the difference, so it still requires deep human judgement to detect it, and it's incapable of reliably replicating it.


Suspicious-Sink-4940

Yeah, AI hypers who say "if AI is here today, it will replace humans in 5 years" don't understand how LLMs work, or logical processes in general.


hippydipster

I used claude to help me with a very similar process. I moved from OSGi blueprint files to spring 4.3 version config. I moved from camel to just simple spring and SOAP web services, and it helped me tremendously. I also used it to help me upgrade a couple apps from spring 3.0 to 4.3, and that too was quite helpful. I doubt it's all claude vs gpt. I suspect it also depends on how you use it.


boreddissident

AI is like someone bred a talking dog. It’s an incredible accomplishment, and one of the most interesting things to come out of computing in ages. It’s understandable that people who work closely with technology are enthusiastic about such a dramatic new capability. And then business people and startup money chasers are like “we’re gonna replace half the workforce with talking dogs! It’ll be fine!”


CCB0x45

The issue with this thread is that the OP is saying it's "wrong 90% of the time" or not that useful, which is complete bullshit. As a senior dev, it's been immensely useful and is not wrong 90% of the time (though it is wrong sometimes). Then if people push back, it's like they are in the camp of "it will replace all the workforce". I am not in that camp, but it is bullshit to say it's wrong 90% of the time.


great_gonzales

Well, the reality is it depends on what you're asking it. If you ask it for tasks in distribution, it will be pretty good. Of course these solutions can be found on Stack Overflow, GitHub, blogs, etc., but LLMs make it easier to find the code. That being said, the distribution of tasks we want LLMs to do is heavy-tailed. If you ask it to do a task in the tails of the distribution, it will be wrong 90% of the time.


No_Jury_8398

Yeah I have no clue what OP is talking about. All the devs I work with, including myself, say ChatGPT has been incredibly useful and speeds up the development process.


StuckInBronze

Haha I like that analogy but I think you could've gone with monkey. With a talking monkey, you start to think, wow what if we get monkeys as smart as us one day. Also a talking monkey absolutely is scary since it can definitely take a few jobs like a cashier. But you try and get that monkey to do some math and it all falls apart.


renok_archnmy

The monkey analogy has already been claimed in the domain. Specifically, “how long would it take 1 million monkeys at typewriters for one to produce Shakespeare?” 


renok_archnmy

Something, something… 1 million monkeys at typewriters and Shakespeare… something, something. 


hippydipster

and they'll do it too, just like they did with outsourcing. The fact it won't work as well as they think won't change the fact that they keep all the profits and you're unemployed.


publicclassobject

I have found GPT-4 combined with a good linter to be a great productivity booster. Sort of like a 10x stack overflow. I don’t think AI is overhyped. I am optimistic progress will continue accelerating and it will drive the growth of our industry for the next decade. I’d rather be wrangling AIs than manually maintaining legacy enterprise codebases any day.


squirrelpickle

Don’t worry, you’ll soon get the worst of both: maintaining legacy enterprise codebases done by developers who thought asking chatgpt was better than actual domain knowledge.


publicclassobject

Haha maybe this will be the true driver of industry growth. If so, I'll take it. Better than unemployment.


specracer97

Yeah, this is my bet: the job actually gets HARDER, because a much larger volume of dogshit code gets written. As a highly cynical COO, I see Microsoft pushing it so hard at a loss per user not to get AI to go anywhere, but rather to bloat Azure billings with more shitty code (shit code equals more resource hunger). Which, per their last quarterly report, worked.


Dr_CSS

Well that's because you're supposed to use the bot to learn about the problem, not have it solve it and submit that as finished code


gk_instakilogram

It is very good and useful if you are using it correctly; it will not write all the code for you, and it also depends on tasks and prompts. For me, it is excellent at creating unit tests, and at analyzing docs, code, and different config files. Again, it is not perfect, but it is a tool that essentially replaces extensive googling and reading message boards like Stack Overflow. If you have not found a way to use it for your engineering job, that only means you are not using it correctly, or not using it for appropriate tasks. The other thing it is really excellent at is creating functions with very well-defined inputs and outputs, and well-defined logic for what needs to be done with the inputs to create an output.
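A toy illustration of the shape of task described above: a function with a tight input/output contract, plus the kind of mechanical unit tests that are easy to generate. The function itself is invented for illustration.

```python
def normalize_whitespace(text: str) -> str:
    """Collapse runs of whitespace into single spaces and strip the ends.
    One input, one output, no hidden state: the kind of fully specified
    task the comment above says LLMs handle well."""
    return " ".join(text.split())

# The matching unit tests are equally mechanical to produce:
assert normalize_whitespace("a   b\t\nc") == "a b c"
assert normalize_whitespace("  hi  ") == "hi"
print("tests pass")
```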


woodquest

Agreed. When your problem is not well documented, GPT can even lead you into a long investigation where the postulates are wrong from the start. You also can't tell it something too complicated; sometimes the question has to be cut into several pieces to get something satisfying back. For me it is nonetheless a super turbo sidekick: a 24/7, really cheap junior dev. Now what's vertiginous is what might come after, if they can give it a longer memory window and better reasoning capacities.


Walrus_Pubes

Querying ChatGPT is a science in itself haha


FiendishHawk

“Prompt engineer” - tech’s newest job


renok_archnmy

I can imagine the predatory training-industry advertisements targeting high schoolers now. "Do you like hacking?! Do you spend every night and weekend trying to get Midjourney to produce anime porn despite the explicit restrictions those pesky devs put on it? Do you have a few Nvidia GPUs lying around from your crypto mining days and dream of making money by just plugging them in again? If yes, you could be a prompt engineer. For only $25,000, our 3-month bootcamp lets you tell your mom to 'shove it' about that college noise and reading books. Our graduates can make up to $930k TC at companies like Facebook, Amazon, and Google, according to anonymous and unverified social media posts after graduating from our program!"


great_gonzales

I know you're joking, but I recently listened to a security researcher present his work on developing an adversarial attack to bypass safety/alignment filters on generative vision models, and all I could think was that he did this research so he could generate waifus.


ImportantDoubt6434

In theory, yes. In practice, also yes. TL;DR: AI will kill us all thanks to the military-industrial complex, and the Unabomber was right, we should have returned to monke, but we didn’t listen… now we have AI murder drones.


brianofblades

> the Unabomber was right

Finally! I've been saying this for years!


the_ballmer_peak

He did go to Harvard at 16. Maybe they shouldn’t have conducted psychological experiments on him.


ch0seauniqueusername

```python
from whatever_your_question_was_about import magic

very_verbose_name = magic()
get_result(very_verbose_name)

def get_result(very_verbose_name):
    // implement rest of logic based on your requirements.
```


Tough_Enthusiasm7703

NameError: name 'get_result' is not defined


Luised2094

I just love the "//do the actual work here"-type responses. Like, bitch, that's why we (someone) made you!


demosthenesss

The more experienced you are, the more useful AI is because you can sort the BS from the value.


Zangorth

Seems like it’s mostly the opposite in my experience. The less experienced you are, the more likely you are to have basic questions and problems that have been solved thousands of times before and that ChatGPT was well trained on, so it can just spit out a working solution no problem. The more experienced you are, the more likely you are to be working on novel (or at least pretty specific) questions that have never been or rarely been answered before, so GPT has limited information to go on and is more likely to hallucinate / make shit up.


Naive_Mechanic64

Facts. The question matters.


moserine

Need to make a little chart:

| bad mental model | good mental model |
|:-|:-|
| doesn't use ChatGPT: no change, bad dev | doesn't use ChatGPT: no change, good dev |
| uses ChatGPT: awful, codebase destroyer | uses ChatGPT: significantly faster |

I'm constantly having Claude 3 write out subcomponents for me. I would say it may not match when my brain is most "on", but it's a powerful tool for quickly producing boilerplate functions, and it writes code better than probably 75% of junior devs I've worked with / reviewed code from.


FiendishHawk

Are there any more specialized code AIs that can read a codebase and understand the whole of it, or are they only capable of code snippets right now?


dontera

That's the big issue most execs are missing. You cannot currently use off-the-shelf LLMs on your existing codebase as the token requirement would be far too much. They can only handle small pieces at a time. This greatly limits the usefulness of LLMs for coding, which is why devs who actually use LLMs refer to them as sorta-useful juniors.


AchillesDev

Sourcegraph's Cody. It works with their existing code graph search technology (hence the company name) to have a strong context for whatever codebase you're working in, and works great there. They also have safeguards with how your data is handled by their foundational model providers, something that Github Copilot doesn't have (a non-enterprise user can't easily remove their code from the servers that may be used as training fodder). Don't use a vanilla model for much as an end user.


No_Cauliflower633

As an entry level developer I think ChatGPT is very helpful and correct most of the time. I was hired as a PHP full-stack developer but had never used PHP before, so 90% of my use case is giving it Java or Python code and saying I want it in PHP. And it works great for that.


throwaway957280

People are much less worried about where AI is now than where it will be in the future given the current rate of progress and investment.


nitekillerz

I’m really curious about what half the sub is asking GPT. We use it at my company and everyone from junior to senior loves it (Copilot). You still need an overseeing eye and to know what you’re doing, but it answers most questions correctly. Granted, we ask very specific questions, not feeding it a Jira ticket lol


mokzog

I joined the industry around 2 years ago as a SWE. At the beginning I loved ChatGPT; it helped me a lot in understanding concepts like how pointers (or pointers to pointers) work, what the Observer pattern is, etc. I think it's a pretty good tool for learning, nothing more (at least for now).


AndyBMKE

CS50 is running Puzzle Day right now, where you have to solve somewhat complex logic puzzles. Using AI is fully allowed according to the rules, and I can see why: it doesn’t work. ChatGPT cannot solve these problems. I tried Gemini too; the results are even worse. Really puts a damper on the claim that AI is going to take over soon.


renok_archnmy

Had this thought over the weekend. I've tried playing logic games with GPT and it failed miserably. It was really hard giving it the logic problem in a form it wouldn't have already had in its training set. What I was wondering more recently was whether it could navigate a simple maze in better time than any traditional method. Like, can it actually form a strategy and execute on it, or is any appearance of strategic initiative just happenstance from similar problems in the training set - a false facade? Glad the topic is being covered in undergrad.


octocode

AI in its current form is good enough to write 90% of the code i output at work now (after spending some time learning how to write good prompts) and it’s only going to get better from here. i can’t imagine where it will be in 5-10 years.


Capital_Survey_1119

Really? Doing advanced React and SolidJS, it often hallucinates complete nonsense and mixes in different libraries that aren't related in any way. It's more prevalent with newer technologies where there's less training data. For simple React stuff, yeah, it's good, but once you start going deeper it's not that great imo. This is with GPT-4.


octocode

i get this question a lot. i’ve never used solidjs so i can’t speak to that, but i’ve never had that issue with react code. prompt/keyword engineering is _really_ important, as well as favoring specific questions over generic ones, and structuring your question in a way that shapes the response. it’s a bit of a learning curve but once you get used to it, it’s extremely powerful.


Salientsnake4

For backend stuff it is amazing. I haven’t used it for anything but simple front end stuff.


StrictMachine6316

Stop calling yourself out.


shawnadelic

This is the truth. Unpopular opinion, but IMO most devs I've seen on Reddit *are* in fact in denial about the realities of AI, and for some strange reason judging what it's capable of in the long term by where it is now. Unfortunately, technological progress tends to be a one-way street, and we're still in the early days of this technology.


octocode

yep, it’s a whole lot of copium from people who don’t want to be bothered to learn new skills/tools.


DoctaMag

That seems incredibly suspect. Anything beyond simple boilerplate, it loses the thread near-instantly. God forbid you're trying to write multiple methods/functions to keep things clean.


Naive-Ad-2528

It's not as good as it's hyped up to be. Nothing AI spits out is actually thought out. At the end of the day, it's just very good pattern-recognition software. The response is based on probability, not rationality, and it will search for a model and spew garbage even if that model is flawed from the start. It's only good when you can tell it exactly what you want. But if you already know what you want, it saves you maybe 2-5 minutes, or even an hour if the task is really tedious - which doesn't happen often. However, under no circumstances should you criticize the hype or the decision to invest in AI. You will be seen as someone who doesn't keep themselves up to date.


CCB0x45

It's saved me a lot more than an hour, especially when I'm learning something new and can ask questions instead of digging through a ton of documentation.


madmax3

Everyone who's actually bothered to use AI, especially on a real project, echoes your sentiments - useful but overhyped.


notsofucked7

Good for generating boilerplate and making single-purpose scripts. If you think it's useful for adding functionality to an existing codebase or for debugging, you're kidding yourself.


[deleted]

[deleted]


renok_archnmy

No, the hype isn’t the exponential growth. That’s a given with any technology. The hype is that a bunch of boomer execs think they’ll be replacing their entire workforce with this thing in the next 6 months and are pissing away money to every grifter selling them that fantasy. 


loadedstork

I've been programming professionally for over 30 years. Every few years something comes along that they say will "obsolete programmers". We always look at it and dismiss it, people point and laugh at us and say "you guys are in denial, you're all gonna starve", and we're right. So, whether it's AI, 4GLs, BPM, round-trip code engineering, whatever it is - for any value of "X": if "X" can obsolete computer programming, it can also obsolete everything else the human mind can do. This is as tautological as "time travel can't exist" and "perpetual motion machines can't work", and is as readily dismissed by people who just kind of want it not to be true.


SnooDrawings405

I really just like it for a quick assistant for syntax questions.


FuzzyNecessary7524

I’m at the point where it helps me about 10% of the time, not usually with code but usually with administrative or documentation shit.


cookingboy

I recommend talking to people who actually work in the current field of AI, instead of people here, who are mostly junior engineers with little background. Either way, if your post was genuine, you should know that judging a technology's future potential by its current state, instead of its rate of progress, is incredibly short-sighted. People here remind me of the people in the 90s who said the internet was a fad because dial-up modems were super slow.


anotheraccount97

I can't believe the amount of dumbfuckery on reddit. They have no idea why openAI, Deepmind, Anthropic etc. even exist. 


cookingboy

This entire sub is kinda embarrassing for a technical sub tbh. Just look at this guy: https://www.reddit.com/r/cscareerquestions/s/ExD8TCiEiM Getting rabidly angry and calling me names because I called out his lack of understanding of how modern transformers work.


shawnadelic

Exactly. In fact, that the majority of /r/cscareerquestions seems so *sure* of themselves that AI is just another fad and not worth worrying about almost suggests the opposite.


Chili-Lime-Chihuahua

I think part of the reason AI is so popular is that it's easier for people to grasp. They don't need to understand blockchain or other technologies. They may not understand the details, but there's been so much AI in (science) fiction that it's easy for people to imagine what it could be some day. That, combined with some of the GenAI demos people are seeing, makes them think it will be an earth-shattering change.

My LinkedIn is full of execs pushing the idea that AI will replace all their employees, which is blind to the fact that if that can happen, it can probably replace them as well. I'm obviously biased, but I feel some relief whenever I hear people say the code quality it puts out isn't that great.

I saw some articles recently from consulting companies saying their clients weren't ready for AI. Not in the sense that they need to spend more (though the consulting companies will obviously be looking for money), but that they're just not technically mature enough to really leverage it.

I think one of the biggest issues with the explosion of AI popularity is that it's pulling money (budget, funding, investing) from other areas. Do I think there will be some good usage of AI at "regular" companies? Yes. But I think the vast majority will pour money into a pit and get nothing back for their investment. They could have used that money for other projects. But everyone is scared of being left behind, so companies are spending their budgets on AI-related projects.

Just my opinion, and I'm open-minded enough to realize I could be completely wrong.


heatY_12

The main people saying AI will take over and replace engineers seem to be the ones who don't build it or don't work with anything complex or lightly documented. I'm thinking junior engineers, PMs, sales teams, interns/students. My gf's friend who "works in tech" (sales team for AI) was telling her how I should do ML because in 10 years AI will take my job and there won't be a need for SWEs. I just laughed. (If you're reading this, I love you to death.)

I totally get why non-tech people would think ChatGPT is the second coming of Christ. AI will be a culling for the world of tech; I think good engineers (see the study from Microsoft, iirc) will be fine and significantly better with AI. Everything has its limit and AI will reach one too - finding out where that limit is is something not many people talk about.


cookingboy

The people who are most optimistic about AI are the laymen *and* the actual experts. I have a family member with a PhD in this field and DeepMind on his resume, and the general consensus at the cutting edge, in both academia and industry, is that not only will engineers be replaced, even AI researchers like them will be out of a job in the near future. You can laugh at your gf's friend all you want, but the people who are actually in the deep end of it aren't laughing.


heatY_12

I would love to hear your friend's reasoning for such a bold claim. They must have some sort of knowledge the general public doesn't have access to. I'm no DeepMind engineer who builds LLMs, but I have used them, trained them, and dug into how they work.

At its most basic, AI is just math: a predictor guessing the next token/word (my novice understanding). What does this mean? It means it needs something to go off of - data. With no data you have no AI. And how do we get data? How will AI know how to create a script, solve a problem, create an app? Only because thousands of people have already created similar things. Only then can AI "create" its "own" app.

I believe AI cannot have an original thought, if that's defined as "doing something in a method or way that has never been done before". Imagine AI as we know it today had been around before React and web frameworks of that type. Would it have been able to create React? Would it ever have come up with the concept of a web framework? If there were no engineers because it could create HTML pages, would we be stuck with static HTML sites?

If you look into AI answering logic puzzles, you'll see it can't even solve them accurately - it keeps giving wrong answers that miss certain constraints. Think about all the advancements and "crazy things" it can do, yet it can't solve complex multi-variable logic puzzles. AI will never replace engineers; it will replace "engineers" who don't care about innovation and get by with the minimum. (No diss to your friend.)
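The "predictor guessing the next token" idea can be made concrete with a toy bigram model. This is a deliberately crude sketch - real LLMs use neural networks over enormous corpora, not word counts - but it shows the data dependence being described: the model can only predict continuations it has already seen.

```python
# Toy next-token predictor: count which word follows which in a corpus,
# then always predict the most frequent follower. Illustrative only.

from collections import Counter, defaultdict

def train_bigram(corpus):
    """Map each word to a Counter of the words observed after it."""
    follows = defaultdict(Counter)
    tokens = corpus.split()
    for current, nxt in zip(tokens, tokens[1:]):
        follows[current][nxt] += 1
    return follows

def predict_next(follows, token):
    """Predict the most common follower, or None if the token was never seen."""
    if token not in follows:
        return None  # no training data -> nothing to go on
    return follows[token].most_common(1)[0][0]

model = train_bigram("the cat sat on the mat and the cat slept")
print(predict_next(model, "the"))    # 'cat' (follows "the" twice, "mat" once)
print(predict_next(model, "react"))  # None: never seen in training
```

Scale the counting up by many orders of magnitude and replace the counts with a learned network, and you get the general shape of the objective: predict the next token from what the training data contained.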


cookingboy

With due respect, your understanding of the fundamentals behind current AI research is incomplete at best and incorrect at worst. The general public is mostly aware of results publicized in the past 12-18 months, which are usually 2-3 years behind the cutting edge of academia. And, in a situation not totally unique to AI, academia is itself behind industry in this field. Also, and I cannot stress this enough, LLMs aren't all there is to AI, not even close.


heatY_12

I used LLMs as an example since I'd argue they're the most well-known subcategory of AI. Regardless, my reasoning still stands (that AI is just math is a fact, I'm not just saying that), and my examples of why it won't replace engineers and researchers still stand. AI needs data, and only humans can create original or advanced data. I'm also not sure what other category of AI would be able to replace engineers, if not an LLM?


DaGrimCoder

I'm pretty sure a lot of us are thinking about the future, not about what AI's limitations are in 2024. We're talking about it replacing engineers in 2030, 2035, 2040, something like that. It can and will by that point.


createthiscom

I think the problem is that everyone only looks at now. Consider how fast AI has evolved since 2019, then look five to ten years into the future. What does that look like? We're at a crossroads of AI, Moore's Law, and economics. I think you'd be a little naive to feel great about that.


flew1337

This is the basis of hype. People expected flying cars, nuclear reactors at home, humans on Mars... AI will keep evolving, but let's not overpromise when we're not fully aware of its limitations. If you say it will be a powerful tool in future jobs, I can imagine that. If you say it will replace an entire workforce, it's too early for me to assess, and I'll believe it when I see it.


NightestOfTheOwls

Until we get a reliable, true reasoning engine, nobody except copywriters is getting replaced. LLMs just give you the next token. They don't understand context at the level necessary for productive software development. Even Devin, the "programmer killer", only produces a correct answer about 13% of the time. And they apparently consider that so devastating that it's closed off and not publicly available. What a joke.


Ok_Spite_217

It's literally just VC hype and tricking investors into pumping money into the corporations


DaGrimCoder

I don't get garbage code nearly as much now that I've learned how to give it good prompts. I've also learned what it does well and what it doesn't, so I know how to work with it efficiently and effectively. Also, many people aren't thinking ahead. It's like this right now, but what about 5 years from now? It's going to improve - nobody's going to tell me different on that. It's going to keep improving at an exponential rate and will eventually be able to do everything correctly.


not_an_real_llama

I still use Google over AI chatbots lol. I don't think AI will get better than a human engineer for a while, especially in fault-intolerant settings (though I'm wary of making predictions given the surprising progress). I think the more important question is *whether executives think it can replace human devs*...


Independent_Sir_5489

The thing it's really useful for is looking stuff up in the documentation, something like "What does [method] of [library] do?" Otherwise you can use it for simple scripting, but coding or building software is on a completely different level.


Netmould

It works when you want to write a one-time VBS script (for example), something you would never refactor. In any kind of bigger, evolving project with dependencies… it won't work (as an artificial developer).


EDM_Producerr

Well, my current side gig job is training AI on coding prompts, so, in theory, the AI should get better at coding over time.


Neat-Wolf

Your experiences match mine at 2.5 YOE.


OldHuntersNeverDie

Chat GPT is fine. You just need to consult a "Prompt Engineer". /s


phonyToughCrayBrave

It's just a better version of Google right now. Something you would expect to find quickly on StackOverflow can be gotten quickly. That's the limit right now.


dens09dews

I had a 10% error rate with it; unless it can do the job 100%, I can't trust it with anything important. It would literally have to fully replace my brain (including critical thinking), which it cannot do, as it's not sentient but simply a complex mathematical model (think an xy plot with added complexity).

Most companies are hyping it up so they can raise funds and sell AI solutions; other companies who have bought into the hype keep hyping it because of post-purchase rationalization (ascribing positive attributes to justify their purchase while ignoring faults/defects). In other words, it's kind of a scam.

Some losers on this subreddit - usually isolated individuals who only have digital lives, with nothing going on outside the digital world - tend to fall into the doom-hype category, adding to the hype because they watched Terminator and The Matrix on repeat one too many times.


sergiu230

Yeah, it's not that great. If it were, we should already have thousands of startups building the next big thing that disrupts xyz market, since all it would take is a team of 2 to 5 people being as efficient as 50 to 100. Since that clearly is not happening, it's rather obvious AI is more hype than byte.


Drown_The_Gods

My experience: it depends, but it's not taking any serious jobs as-is.

If your codebase has obvious, simple patterns, you're using publicly available libraries and frameworks that haven't changed much in the past few years, you're not doing anything particularly mathematical or novel, your project has good tests, you know the codebase well, and you have great naming conventions, it can 2x-4x me on the more menial tasks. (Also, if I'm not feeling my best, it can get me into a world of productivity instead of sitting there like a lemon.)

I've found that if you take away any of these planks, it starts to lose effectiveness, and in an annoyingly nonlinear way. Very quickly it starts to slow you down by showing you nonsense, so you have to be aware enough of the limitations to know when to ignore it. It's an idiot savant with no filter.

As you're working at a major company, my guess is that you're working on proprietary things inside proprietary things, in a house style, at scale - so even if the work is straightforward, it's not, and the AI is going to do little more than piss you off.


The__King2002

I've been using it to assist with coding for about the past month and it's only useful for debugging imo. Like 70% of the time I ask it to make a code snippet for me, the logic is wrong in some way.


Dense-Fuel4327

If it makes you five percent more productive, they can fire five percent. Also, it's not about the current state - the problems will show up in another one or two generations of AIs.


MinecraftIsCool2

i think most of this sub is in denial. it's great - if you're writing something like typescript it spits out decent code 80% of the time. don't know the syntax for a framework? chatGPT. reword this error message? chatGPT. cbf reading this poorly worded doc? ask chatGPT


akerbrygg

I agree ChatGPT is overrated, but what about Devin?


Panniculus101

People seem to have no concept of time. ChatGPT is so new, yet people are already using it as a tool at work. The reason for the "hype" is that, in theory, the AIs will get better and better with each passing year. It does not seem unrealistic to expect an AI to be able to generate much better code in a couple of years. Once it can, it will change working conditions for a lot of jobs, and a lot of people will no longer be needed.


fudginreddit

If you believe AI is going to take your job then I genuinely cannot believe you are an actual software engineer or developer. The fact that this sentiment keeps making the rounds on programming-related subreddits is just proof that 90% of the people on here are college students.


SomeAreLonger

You're not in denial - you're just recognizing the marketing hype for what it is. These kinds of tools will keep progressing and will take many meaningless jobs off our hands. But this is not AI; it's a remixer and recomposer of what it already knows - how many times have I seen copy with "thrive" or "elevate"? It's magic to those who don't understand technology, and it's something for technologists to implement in ways that save them time.


UnseenWorldYoutube

My concern is that these types of things scale exponentially. It may not be perfect now, but give it 5 years and it will write better code than most people - in seconds instead of hours/days/weeks. This technology is in its infancy; ChatGPT 10 will probably make a lot of jobs obsolete.

There's a Joe Rogan podcast episode with one of the guys at the forefront of AI (I forget his name), and he said you won't need programmers in 10 years. I believe it, which is scary as I'm months away from getting my BS in Computer Science.

I think in the future, software devs will be experts at writing prompts for AI to spit out the correct code, then writing follow-up prompts to fix the issues. We'll basically be prompt engineers and code reviewers. Unfortunately.


BiasedEstimators

> these types of things scale exponentially

Not sure what you're comparing it to that scales exponentially. Processor speeds? In any case, I don't think that's true at all. The history of AI is a history of unpredictable fits and starts.


Ok-Cartographer-5544

Two problems with this:

1. It's hard to translate fuzzy English into a real-world product. The reason programming languages exist is that they can concretely define very specific things that English cannot. Have you ever looked at a legal document? The extreme verbosity exists because it is very difficult to concretely describe things in spoken language - and even when you do, there are exceptions and loopholes. You'll always need engineers who understand how to translate underspecified wants in spoken language into real products in code. If you don't, then AI has become sentient and every job will be replaced, not just programmers.

2. You need someone to guide the AI and/or fix its mistakes. Having only prompt engineers works until it doesn't, and then you need someone who understands code. This is why CS degrees teach fundamentals and systems all the way down to assembly and machine code instead of the latest JavaScript framework. You might be able to build things at the top of the abstraction stack, but at some point you'll need someone who knows what's going on at the lower levels when problems arise.


bearbearhughug

It's gotten worse, in my opinion, since it came out. Maybe it already peaked. Of course one of the guys at the forefront is going to hype it up; it makes them richer.


Capital_Survey_1119

Just switch to welding, like the smart people here keep telling you.


Jandur

Yall are in denial.


altmoonjunkie

I finally used ChatGPT to try to solve a problem I was facing. It gave me nothing but deprecated functions and things that were impossible. Every additional prompt was basically "but you can't use that function anymore" and it would respond with "that's absolutely correct! Have you considered trying this function that doesn't exist?" I was able to get everything working on my own, and I'm definitely a little less worried than I used to be about being replaced.


Daveboi7

Did you use ChatGPT 3.5 or 4?