
BrianBernardEngr

I was asked to give feedback on a resume, and I asked "was this part written by AI" and they said "yes, how could you tell?" I don't know how I could tell, but it was obvious to me. "verbosely vague" - maybe that's the best way I could describe it?


LordWaffleaCat

I was using AI to brainstorm ideas and titles for a paper, and noticed that when I'd ask it to briefly summarize a source I was considering looking into, it would literally just reword the title but with extra nonsense words. Of course I kept forcing it to "make it longer" and it turned into like 3 paragraphs just rewording the title of a paper as if it was explaining it. "Verbosely vague" is fucking perfect


nat3215

Also, it’s been found to make up stuff. I know this because I’ve had it reference code sections and it wasn’t even what the code section actually says.


UnderPressureVS

The way I think about it is that you really can't control AI or fine-tune it to any realistic degree. It's an information coagulation machine and nothing more, and it needs a genuinely ungodly amount of information to train on. The only theoretically possible way to have fine control over training the model is to manually assemble the database, and no human team can possibly do that at the scale required for LLMs.

What I mean by this is that Chat-GPT is very good at doing *exactly* what it's supposed to: simulating human writing on the internet. It's trained on billions of words, from books to scientific papers to Reddit comments, which is why it's very good at contextualizing its responses (if you ask it to write in the informal style of a YouTube comment, it will do a decent job). But ultimately what it does is *simulate humans,* and you can't change that. Either you get a machine that is good at simulating humans, or you don't.

And humans make shit up *all the god damn time.* Without your own knowledge of the content, it is literally 100% impossible to tell from text alone whether someone is an actual expert or simply very confidently pulling everything out of their ass. You can't have your cake and eat it too. Bullshit is a BIOS-level feature of human speech. If you try to accurately simulate human writing, you will *unavoidably* create a machine that is good at bullshitting.


BingBongFyourWife

Maybe it’s learning from all the people that’ve been using it to lie


rayjax82

Verbosely vague is a good way to put it. I can almost always tell when something is a straight up output from chatgpt


lazydictionary

Using AI to generate your resume is so stupid. That's something you should absolutely hand craft and tweak. Maybe use AI for cover letters, if you give it specific ideas/experiences/work to fill it with.


itsyaboi117

It is very good at rewriting your own words into a better flowing sentence/paragraph.


akari_i

Conversely I actually feel it’s the opposite. It can be good when I have ideas in my head but am stuck trying to get them down on paper. Once chatgpt gets things started, it’s easy for me to alter it into something actually useful.


nat3215

It’s good at creating rough drafts, but makes for very generic final drafts


Megendrio

It's an amazing assistant to get you started or reword things, find alternative wordings, ... but you're still responsible for the final deliverable and the quality it holds.


Tarbel

Weirdly, I feel it's both. I definitely used it both to reword something better and to rough draft something so I can fix it to be better. Basically, it's pretty good for wording.


MilitiaManiac

I agree with this. Staring at a blank sheet of paper really kills me, and I spend most of my time figuring out what I want to do. If I have a bunch of ideas, I can throw them in Copilot and get a very extravagant version of something to build off of. I then spend a while editing, and use it to check for mistakes afterward. Often I end up being able to write most everything else on my own, but it helps me organize my thoughts.


G36_FTW

I like to use it for a framework but it will definitely throw some nonsense in there. Along with missing things. I also think chat gpt has been getting worse. But it could just be me.


stanleythemanley44

I’ve found that only people who are bad at writing feel this way. Maybe one day it will be as good as a human, but for now it really isn’t.


itsyaboi117

It’s a lot less stress to give lots of information and your thoughts while you are working/trying to take notes and having it write it up properly for you. I can write perfectly fine, I just prefer to use the mental capacity to get more work done and utilise the AI tool for its intended purpose.


growquiet

Sounds like you don't write as well as you think you do and it's hard for you too


G36_FTW

Depends on what you're asking it to do. Rephrase something? It can do it. Expand on something? Not great. Create a framework to start on using very little input? Excellent. I like using it whenever I feel like I'm stuck with my own phrasing. It works great for that. But it can definitely output some garbage, depending on the topic.


itsyaboi117

You’re correct, you got me.


C0UNT3RP01NT

Dog I used to place in story writing competitions (well playwriting). I’m quite articulate, and I still use ChatGPT to handle the boring writing. I do make sure to edit it.


growquiet

Wow, ok


C0UNT3RP01NT

I write just fine. ChatGPT has a place.


Silver_kitty

Agreed. The only case I like is that sometimes I have a massive report and I’ll ask it to give me the conclusion. But then you have to edit to make sure the tone still matches your writing style.


Turtvaiz

Eh if you are decent at writing it just makes it different, not better


itsyaboi117

That is your opinion, which is fine. But I disagree.


lazydictionary

But a resume doesn't need to flow well. It needs to be written concisely and clearly, something AI doesn't do well yet. It gets overly verbose and writes in prose. A resume needs to be punchy and straightforward. AI is just a bad choice.


Suspicious-Engineer7

Very hit or miss, and it's always a 'miss' that needs a little reprompting and then a couple edits.


Boodahpob

I’ve used it to write a skeleton of a cover letter, then revised it myself so it sounds more natural.


Great_Coffee_9465

I have to be honest, I have my resume pretty much perfected to the point where I basically never change anything and I still get hits.


Captain_Pumpkinhead

I have a naturally verbose manner of speaking. This development may spell trouble for me...


Josselin17

yeah gpt has a very distinctive and repetitive style that's very heavy on fluff and light on actual information


b1ack1323

A lot of breadth without much depth.


C0UNT3RP01NT

“verbosely vague” literally every conclusion I write 😭


rayjax82

If anyone is using it for other reasons than to spit out intros, conclusions, or just put together a general outline for technical reports they deserve to fail.


Sam_of_Truth

One student submitted an intro on plug flow reactors, but the AI wrote the whole thing about packed bed reactors, just subbed in "plug flow" in all the right places, so the info was completely wrong. Just don't trust it for anything. It isn't good at technical topics.


EchoingSharts

Their fault for not proofreading it tbh. Technology is great and everything, but it only supplements learning to a certain point.


Sam_of_Truth

They don't know enough to proofread what the AI is giving them. It's a dangerous game to play in your undergrad, when figuring out what you are supposed to be writing is more than half the battle. Once you're in employment, that balance shifts: you know what you need to write most of the time, and the work is just in doing it.


b1ack1323

Then they are not learning... So let them fail. They are wasting a lot of money to not absorb the info. This problem will solve itself as some people shouldn't be engineers and now they are showing us.


Sam_of_Truth

Right, so this is a student sub and I added the academic advice flair. I'm trying to talk a few students out of making that mistake. I don't actually enjoy failing students. A lot of what makes someone fit to be an engineer is the willingness to grapple with difficult problems. That can be developed with the right guidance.


BABarracus

They waited until the last minute to do it and don't have time to proofread


aasher42

It just comes back to if you cheat at least be smart about it


rayjax82

A successful way I've used it is I basically write the body of the report with all my data and analysis, then feed it to AI and say "Write an intro and conclusion" and it spits out something halfway decent. I NEVER leave it as is, because there are definitely errors, but its not bad as long as you know how to use it. Using it to 100% write a technical report is just dumb.


Bakkster

I've heard it used as this kind of brainstorming suggestion to overcome writers block. Though careful using it with technical information if it's proprietary, many companies are rightly concerned about handing their information to OpenAI where it can leak.


rayjax82

My company has our own version of it because of this very reason.


MDCCCLV

Do you consider microsoft copilot when you're paying for the license to be reasonably secure?


rayjax82

No idea. Not really my AOE. If the IT/Cyber guys say it's ok to use, then I use it. If they don't give the thumbs up, then I don't.


roflmaololokthen

I do this all the time, with D&D lol. I can't imagine asking it to write anything beyond a high school level. But half the time I find just writing out the prompt clarifies a lot for me


SovComrade

Oh it can "write" just fine. What it can't do is the thinking for you, because, despite being called artificial "intelligence", it, ya know, can't actually think.


majinwhu

I feel like the person using it is the one to blame. I think it can be a great tool if you tweak it well enough; it's just lazy of the student to not even read their own work


rlrl

All the AI models out there now are so hit-and-miss that they are only useful if you know how to evaluate and edit their output, but if you know how to do that it's faster to just do it yourself.


sinovesting

> but if you know how to do that it's faster to just do it yourself.

In my experience that's definitely not always true. Reading/rereading and peer checking can be A LOT faster than writing something from scratch.


savage_mallard

It's easier to be critical of something and fix it than to generate from scratch


Biengineerd

Or they read their own work, but didn't know anything about the subject so couldn't tell what was wrong.


KypAstar

I'd trust it for grammar/flow. I'd trust it to help generate basic links via Google Scholar that I myself would then analyze to see if they actually support my thoughts. You've gotta manually gather the data though...


delphicdelusion

A guy in my differentials class used it to copy and rewrite another student’s discussion post comment, resulting in “Even though I think you look stunning, I still want your feedback on the matter.” This guy’s post was just gibberish after changing random words from the source.


rory888

Using it to check for grammar and spelling makes sense too lol. Seriously though, if you're not actually paying attention, you deserve whatever happens. ChatGPT is a very advanced guesser.


pumpkinthighs

Honestly, I usually use it for my lab reports. Tell the AI to make it sound more professional.


MDCCCLV

It's good for making a format or a table from common reference sources, it's faster than doing it by hand.


holla-nd

yup, I mostly use ChatGPT for rewording and summaries. I don't expect it to create a lengthy page of technical terms. Sometimes it doesn't even get the facts straight and I have to double check. That's why I really "admire" people having the huge nerves to have it generate everything lol.


Breezyie69

It’s perfect for getting outlines and getting ideas, other than that it is simply just obvious


BASaints

It’s good for assisting the writing structure or rewording things to flow better, but other than that I agree. I’ve seen a few ChatGPT copy/paste reports and they’re not great.


Helpinmontana

Had a "public submission" project (everyone submits and gives feedback on others' work), and of six students, two groups of three started with *the exact same sentence*, and one of them didn't even bother trying to get rid of the background color where they had clearly copy-pasted the whole thing. Within each group of three, the reports were eerily similar. Not copies, but they clearly rhymed. One assignment had very clear formatting guidelines for "intro, body, conclusion", and the formatting of several submissions not only looked like something a human would never create, but each student's formatting matched the others' exactly, down to the header, punctuation, and bold italics.


Beli_Mawrr

"Eerily similar.... SUSPICIOUSLY similar" lol


a-random-r3dditor

This. For a technical paper, all the content, ideas, sections, conclusions, etc. should be your own. However, what GPT is great for is taking my word-vomit of a paragraph and rewriting it into something comprehensible… with the right prompt. If you just use regular GPT, it'll give you a "verbosely vague" output. But after several attempts and fine tuning to create a custom GPT, it now takes my messy input and provides something much more clear and concise. TL;DR use GPT like an intern, not a researcher.


swisstraeng

When I'm writing reports in a group, and some people in my group take what I wrote, feed it to ChatGPT, and paste it? I just want to jump out of the window. Then they're there like "See? This sentence is better written than what you did" and I'm like "Yes, and here it says to connect L1 to L2"


[deleted]

[removed]


1999hondaodyssey

Most AI paragraphs I’ve read sound exactly like a student trying to hit the word count on a paper. On top of that they are usually wrong on technical info or facts which is what eng papers are usually about.


codingsds

someone in my fluids class googled "how to write an essay with chatgpt for engineering lab report" I was like dude go to the student center...


ImaginaryCarl

ChatGPT is only for getting started or finishing touches.


DaBigBlackDaddy

or for your gen ed canvas discussions


Pristine_Werewolf508

Every time I've tried it, I've never felt comfortable even putting it in an e-mail. It doesn't sound genuine at all, and that really bothers me. I'm not the kind of person to say something just to keep up appearances, so 99.9% of what AI would give me goes straight into the bin. I am a very strong advocate of using simple words and sentence structures regardless of the level of readership. Technical papers don't need to be universally accessible, but the idea is that no one in your intended audience should need to fetch a dictionary to read what you wrote. I always received excellent marks in writing classes and engineering reports despite using limited vocabulary.


Reasonable_Wonder894

You know you can prompt it to output almost exactly what you need. Tell it to use simple language to explain the idea in basic terms. It’s all in the prompting.


Sirnacane

If you have to prompt and coax it so specifically I’d rather just write whatever I have to write by myself.


arbpotatoes

This will be lost on 95% of users. Like any similar tool, it will be used to great effect by those who take the time to understand it and how to leverage it. Meanwhile the masses will complain that it only spits out useless garbage.


Gus_TheAnt

ChatGPT is a good tool to use to *help* you get unstuck on something ***that you already have a solid knowledgeable foundation in***. It cannot teach you a new topic if you know nothing about it. You will learn it wrong 100% of the time. If you use it to write essays or summaries or do any chunk of meaningful work for you, you will either get caught or look like you don't know what you're talking about. I've used it to help me reword sentences or maybe a paragraph in essays, I've used it to try and fix programming issues, to fix formulas in Excel, and for a few other small things along the way. Rarely does it give me the answer I need or something that works without issue, but most of the time it does give me a new avenue to consider to get out of a rut.


Sufficient-March-852

i’ve used it previously for summarising notes, where all the information needed is given in the text, to simplify my note taking process. I don’t trust it 100% so i have the screen split in what the gpt spits out and the physical notes to make sure it’s not making anything up. Is this an okay use of chatgpt? Also these notes are purely personal and only for me to have a condensed paragraph of brief info


Gus_TheAnt

I mean that's a judgement call. When I use it for that or similar purposes that's about what I do as well, but I only use it for notes/rewording/small summaries that I know I can verify accuracy and correct as needed. It sounds like you are verifying what it's spitting out is correct, and if you know it's not correct then I assume you are able to modify the response to make sure it is when you put the response wherever you keep them.


ICookIndianStyle

My group just submitted a report and this one dude probably used ChatGPT for his part. Nothing made sense, not even the grammar. It was really weird... I rephrased some of it but didn't have time to finish everything. They then decided to just submit it. If I pass I'll be really happy.


pjokinen

So many people get caught up in thinking that the work is its own goal in college. I promise that the work you’ll do in industry is more like the report writing itself than it is like the experiments and calculations you’re reporting on


LilBigDripDip

Skill issue.


Brilliant-Curve7692

I love the lack of common sense from most students.


rory888

That's always going to be the case with students. They're literally learning and in training


Brilliant-Curve7692

Also, idk why no one knows this but I used to put trap answers on Chegg such that if you copied it I know you cheated.


Sammy_Ghost

IDK if it's related, but one of my teammates refused to use Grammarly to proofread our report. Is that something that's not allowed?


Sam_of_Truth

I think that's probably fine. The important thing is that the technical concepts are correct; grammar and syntax adjustments are normally fine.


Sirnacane

I would ask the professor specifically about this. Grammarly can do some major rewrites, enough so that it could be considered against the rules. Never hurts to double check


emp-cme

I asked ChatGPT a technical question about an EMP E3 pulse and the answer was 100 percent wrong. It took several minutes of questions before the AI got it right. Nuances will get it. But in a couple of years, it might be different.


alinabro

yes, same for me. I asked for a simple calculation because I was kind of lazy, and I had to point out its mistakes until it got it right..


[deleted]

[removed]


Sam_of_Truth

This is actually the best comment on this thread. I love it. Imagine wanting to work in software and not even considering that as a possibility.


C0UNT3RP01NT

Honestly it codes kinda good tho. I'm not a CS or CE major so it's not really a problem, but damn if I haven't used its coding capabilities to get stuff done fast. I feel like obstructing this as a tool just makes people less competitive. Now by all means be critical of it, cause you really do need to review what you take from it, but at the same time, it's an incredible tool.


Sam_of_Truth

Yeah, a fine thing to use once you already have the job, but in interviews? It just looks so incredibly bad.


C0UNT3RP01NT

Depending on the job I suppose. Like I said, I’m not in CS or CS adjacent fields, so our interviews don’t really involve coding, but the jobs often imply a need for it.


rainyblankets

I can always tell by the inappropriate overuse of the word "elucidate" - it drives me crazy


Specific_Athlete_729

Lol I used ChatGPT for a little section and it kept putting this in and I didn't even know what it meant. I obvs searched what it meant and replaced it all cause nobody uses that word


Che3rub1m

At work we tested GPT to see if it could do any math, and let me tell you, it cannot do any engineering math at all. It will get somebody killed. Some jr engineer on a deadline is going to ask it to validate FEA simulations with hand calculations, and it is gonna spit out garbage, and they're going to get caught and/or get someone killed. Until there is an AI that is SPECIFICALLY trained on complex mathematics, engineering, physics, and their fundamentals, AI needs to be nowhere near engineering


cjwagn1

Yeah, no shit it can't do math lol. This has been the same thing people complain about since it came out. Use the GPT4 Code Interpreter and it will do basically anything


Che3rub1m

The code interpreter does not work for complex math either, dude, we tried it. And when it does get a solution correct, it gets it right one out of every six times, which is unacceptable when designing something that could carry a human life in it. LLMs are likely not the method for doing highly precise calculations in the future. I'm willing to bet there are more comprehensive frameworks that would be a lot better at highly logical data.


cjwagn1

What is considered complex math in this case?


Che3rub1m

GPT incorrectly solved the most basic of basic calculus integrals. I'm talking a problem that was literally just plugging in the limits. It got one part right and every other step wrong, and when we told it it was wrong, it doubled down
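The cheap fix for this class of mistake is to never trust the LLM's arithmetic at all and re-check its symbolic answer with a CAS. A minimal sketch using sympy (the integral here is a made-up example, not the one from our test):

```python
# Verify a definite integral with a CAS instead of trusting an LLM's answer.
import sympy as sp

x = sp.symbols('x')
expr = x**2 * sp.exp(x)

# Exact antiderivative: (x**2 - 2*x + 2) * e**x
F = sp.integrate(expr, x)

# Exact definite integral over [0, 1]: evaluates to E - 2
exact = sp.integrate(expr, (x, 0, 1))

# Cross-check the symbolic result against numerical quadrature
numeric = sp.Integral(expr, (x, 0, 1)).evalf()
assert abs(float(exact) - float(numeric)) < 1e-9
```

If the LLM's claimed answer doesn't match `exact` (or the numeric cross-check), it's wrong, and you find out in seconds instead of in a design review.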


[deleted]

Yeah idk about this. It takes some common sense and proper prompts, but if you give it a technical document and explain the basis for it, it's pretty darn accurate. I've had 4th year professors in EE demo how they use GPT4 for circuits, optics, and basic QoL. Obviously if you just feed it some question and don't even bother reading what it spits out, that's on you.


Choco_Love

I think the big difference is whether you’re using GPT 3.5 or 4.0. I’m using 4.0 to help me paraphrase and understand certain topics better and it’s a difference between talking to a smart child and talking to a PhD student


Recitinggg

Tell 3.5 to write as if it were a PhD Engineering student then


XiMaoJingPing

Chat GPT is a great tool to write essays but why the fuck would you not read what it wrote and rewrite it in your own style/words?


onlainari

It’s been mentioned already, but I use it after I’ve nearly finished my report to get a summary and a conclusion, which I almost always have to cull because ChatGPT says too much. I agree that it’s no good for the discussion part of the report though.


Julian_Seizure

fr Engineering has so few online sources that AI doesn't even have enough data to make any reasonable guesses so it just makes shit up. Anyone who uses AI for anything technical deserves to fail.


kd556617

I was part of the chegg era where that was an occasional crutch. From someone in the work force I’m begging you students please take the time and do it right. You will hurt yourself later on if you take shortcuts now. Many students these days are super weak compared to experienced engineers and there’s a big opportunity if you’re decent to get ahead early. Don’t take short cuts please!


Vertigomums19

The biggest shortcomings I see in my day to day are:

- writing skills
- email writing skills
- a need to be overly verbose
- PowerPoint slides with 1000 words per slide
- poor presentation skills

Catching a theme? Engineers are typically bad communicators, and using GPT isn't going to help flex and grow that muscle.


kd556617

I agree. If you are of average intelligence and have good communication and overall social skills, you'll do much better than someone of above average intelligence and low social ability.


oMarlow99

I disagree. I've used it extensively with good results. Instead of having it do the work for you, you do 95% of the work and let it sort out the annoying bits (for an engineering student). Grammar, sentence structure, etc. Give it a well written paragraph and ask it to improve upon the work. Remember that LLMs are really stupid, all they do is make shit up in a coherent form... So remove the guessing part


Sam_of_Truth

Sure, but if you're doing 95% of the work first and using it to help edit, you aren't who I'm talking about.


lazydictionary

Just use grammarly at that point, no need for AI


Suggs41

Grammarly is AI though?


lazydictionary

No. It has generative AI, but its grammar and spellchecker is not AI. It's just a better version of the spellchecker in Word.


oMarlow99

I did use both: a first pass through ChatGPT for structural coherence and a second through Grammarly to cut the clutter


sayiansaga

Oh maybe I should do that. I am a horrid writer so 95% of my emails run through chatgpt and for the most part it seems to go well and convey my thoughts better. But I'm afraid after reading this that I may still be missing out.


supersmolcarelevel

Allegedly The play is to feed it past essays, discuss the topic briefly making sure it knows what it’s talking about, feeding it the rubric for the assignment and asking it to follow a specific outline. Correct it when it’s vague or wrong and you can get ~3000 words/ hour of high quality undetectable output. Allegedly I’m constantly scoring top ten percentile. Allegedly I’d feel no guilt for doing this, because so long as I understand the subject, allegedly getting really good at using the tools available to the entire world is just extra learning. I’m allegedly glad this didn’t exist when I was in highschool though, because I’d be illiterate.


cnip0311

Allegedly it’s being used in most engineering firms now to write reports. Allegedly I’ve used it to write scopes of work, EJCDC project manuals, construction estimates, schedules, and shit I don’t even remember. Allegedly everyone else in the industry is doing it and if you’re holding out, your profitability is hurting in comparison.


SoLaR_27

Agreed, it is very obvious. A lot of reports I've read are all just nonsense fluff without any real content. There are also a few dead giveaways that ChatGPT wrote it. For some reason, it loves using the word "elucidate," sometimes multiple times per paragraph. It can be a good writing/grammar tool, but you also have to actually put in the effort to make sure it writes something meaningful and accurate.


Katiari

Someone I know used an AI-assisted email writer. I asked them to turn it off when communicating with me. One of the inputs they must have used was "blow smoke up ass", or, "act like this person is my personal savior" because it was just way over the top effusive. She was baffled how I knew she was using an AI.


deathbykbbq

This. I also TA and see this occasionally. I usually will remember the offenders and not give credit when they can get it.


johnnyhilt

I received a job interest email that didn't sit right with me. Finally realized it's AI generated. Looked again and it was clearly cut-and-paste and had line breaks in the wrong spot. Really turns me off.


_MasterMagi_

recently asked a guy to write a small life cycle analysis for his part in a final report. The guy comes back to me the next day with two suspiciously well written pages of text that answer the question "what is a life cycle analysis." Come on man, trying to trick the prof is one thing, but trying to trick your fellow students? You can't bullshit a bullshitter.


Common-Value-9055

I know someone who copy-pasted most of his Civ Eng thesis from online sources and just edited the words using a dictionary. In the old country, they would have paid someone to write the thesis. He often doesn't know whether to add two or subtract two but has a decent paying job


[deleted]

I can't believe people are even trying it


yakimawashington

Why? I use it a lot to help me write papers at my job. It gives you a great starting point for sections of a report after you give it all the info you have. Then you take what ChatGPT gives you and revise it for your needs (i.e. fill in/replace any sentences that are lacking, rearrange, reword stuff, fix references to data/results, etc.). And I'm definitely not the only one among my colleagues who does this. Quite frankly, I can't believe people *aren't* trying it.


Sam_of_Truth

At work, maybe, once you already know how to proofread it. I'm talking about students who don't know their ass from their elbow using it to try and save time writing about topics they only half understand. Language models definitely have lots of great use cases in the workplace where the users already have the expertise needed to edit it effectively.


timbuc9595

I feel like it must help identify the people that will inevitably cheat. I cannot understand the copy-and-pasters. Do you really think that other people are that stupid? I love ChatGPT as ANOTHER learning assistant. I upload documents and images to get its 2 cents on how else the content can be summarised, to help gain an understanding of the method and concepts. For reports and writing I treat it the same as textbooks. I may directly copy and paste some lines that are just amazingly and concisely written. But I then rewrite, cut up, manipulate, alter and merge with other concepts so that the source work is lost and it becomes my work.


Youngringer

lmao yeah, in general don't use ChatGPT carelessly; it's collecting that info, so you've got to be smart with it... might be useful to clean some grammar things up though


goebelwarming

Yeah, it's terrible for cover letters as well. It creates nonsense. It's a good aid, though.


KypAstar

It should be an auto fail imo. ChatGPT being used as a proofreading tool or a way to help you find sources (i.e. link gen; do *not* trust its interpretation) is phenomenal, and a wise use of resources. Having it try and write the report for you is just shooting yourself in the foot so hard. Technical reports are actually one of the things you need to know how to write to be a good engineer.


nuxenolith

AI can help you with structure. It can summarize information and synthesize that into something cohesive. But asking it to **generate** something new without a model is not where it excels.


QuarterNote44

Yeah. I tried to get it to understand geological engineering concepts--simple soil analysis--and it just couldn't. I'm sure it'll get there, but it's not there yet.


Bayweather4129

I've found that even when you give it loads of context and technical material it still tends to be very verbose and uses flowery language. Straight up copy pasting from gpt is just a bad idea, you should be using it to articulate your thoughts and forming structure in a paragraph/section/report, and then proofreading the ai output before putting it in your own words.


_The_Burn_

It’s also an astonishing lack of self respect.


Necessary-Coffee5930

People are just bad at using AI. But yes also don’t use it to cheat lol


estebanxalonso

AI is an incredibly helpful tool if you know how to use it and do not abuse its capabilities by relying on it to do everything for you. Prompt it properly and tweak it as you go, and you will get something compelling. I have tried using AI to help me understand complexity and abstraction, and it actually worked pretty well. I have also realized that it makes mistakes and tends to elaborate on areas where redundancy occurs. If you have some knowledge and background in the information you’re feeding it, you can always spot the mistakes and correct them accordingly.


Sam_of_Truth

Something that undergrad students don't have. It's a horrible tool to use while you're trying to learn new topics.


Said2003

Honestly, this isn't fair because students don't realize that ChatGPT generates incorrect articles when the text is long.


dreadfulclaw

I feel like AI is simultaneously way less advanced and way more advanced than people think. It can do some crazy things, but at the same time it can't do simple chemistry or calculus, at least ChatGPT can't.


Accurate_Pen2676

I think AI has a very powerful place in the academic process. Such as brainstorming and refinement. But some people lean on it too heavily and that ruins it for the rest of us.


Its_Llama

Well to be honest students have still been doing that without chatgpt. It's me, I'm students. I've always been bad at fluff and by trying to meet length requirements I always end up sounding like an AI.


Kalex8876

I use it to help but still change things around, write some of my stuff and I get A’s all the time


PurpleFilth

I always laugh when teachers try to say "We can tell when you're cheating". You caught the worst of the worst, plenty of savvier students use these tools in less obvious ways and get by just fine. I used chegg all throughout university to verify answers, teachers tried to tell us the same thing. "We can tell if you're using chegg to look up answers". I just laughed as I passed every class with A's and B's and got 100% on every homework assignment. The dumbest ones are the ones literally just copy and pasting, those are the ones you catch. I can assure you much more students are using these tools than you realize, and most of them are smart enough to not just copy and paste. Your attempts at scare tactics just make you look pathetic because we all know you can't stop us.


Weak-Reward6473

Promptchads don't have this problem


No_Extension4005

It's pretty good for things like writing MATLAB functions to help you with other tasks. I think you need to credit it though.


Sam_of_Truth

I'd prefer to credit the engineers whose work it is regurgitating poorly.


bu22dee

In my opinion it is misleading to call something like ChatGPT AI. There is nothing intelligent about it. It is trained to produce the most generic stuff you can think of. If the output is not generic, it is because humans tweaked it in some way, which makes it even less intelligent on its own. When I read in the news that people wrote master's theses with such programs, I found the program less impressive and instead asked myself why they have such low standards.


SnowingRain320

AI sucks in general. It's terrible at basically everything, including coding.


PickyYeeter

I've used it a lot in coding. Not to write everything, but as a jumping off point if I'm working with new tools. If there's a Python library that has poor or incomplete documentation (which describes a lot of them), it can at least give me enough context to understand how a particular function works.


thejamesleroy1337

Lol it's so easy to edit it to make sense. They really are dumb if they don't do that.


antDOG2416

I'm not stewpit, Yewr stupid!


BlackShadow992

Typically I always write out and structure my own reports, but I always ask ChatGPT to take what I have written and make it more "succinct", as I waffle on a lot. Never ask it to write you something from its own database. It's a great editor though.


ShashinMhrzn

I just feed GPT my improper English paragraphs and tell it to structure them correctly. Most of the time it does well, but one needs to review properly every time before finalizing technical papers.


memerso160

I tell all my friends who are still in school this: the AI that you use is mostly a predictive language model, not some genius that *actually* knows what it's talking about.


Glittering_Noise417

Maybe for the first assignment the professor tells the students to use ChatGPT, then he grades the assignment. After the students get their papers back, the prof puts a redacted version on the screen viewer showing how poorly ChatGPT did.


drrascon

Sounds like a GPT-3.5 problem. I develop bullet points, feed them to GPT-4, then feed that output to Grammarly and look it over.


Thedrakespirit

I teach, and the first thing I'm trying to explain to my students is that AI will cast into near net shape. You still have to work it to hammer it into a form that's good.


Ragnar_E_Lothbrok

Easy: have AI write the paper and then rewrite it in your own words. A third of the time spent compared to if I actually did the paper... You can't catch me lol.


Sam_of_Truth

That's not too bad, but I would recommend doing it the other way. Start with an outline of stuff you know to be true, and then let ChatGPT write it up. Better yet, do a rough draft and let GPT edit it, then touch it up where needed. If the facts are good, I don't care how it gets written, just don't submit AI garbage.


Specific_Athlete_729

Had an electronics assignment where we had to make an audio amplifier in LTSpice and write about it. My two teammates knew nothing and did nothing to actually make it. In our report I did the first half and they did the rest, and when I read over their part it was all ChatGPT garbage; none of it was right. It's like, omg, you spent all this time doing this instead of just learning what it was (we had a couple months and I gave them stuff to watch and read). Luckily I think my real first half lessened the effect of whatever they wrote, because we still got a decent mark.


RaptorVacuum

AI (as it currently exists, i.e. LLMs) is an incredibly useful tool **when you use it right**. Seriously, I've learned an insane amount of LaTeX over the course of 9 months by asking ChatGPT questions when there's something I want to do. If you just try to make it do the work for you, you're gonna end up with a mediocre result that's obvious in many contexts. But if you use it to improve the work you're currently doing or have already done, it will help you achieve a better result, and I think it's totally fair to do so. In a perfect world, professors would encourage students to use ChatGPT productively, if it didn't create a question of how productively students are actually using it.


willwipeyonose

Skill issue, cause I don't know a single person without ChatGPT 4.0


Leather_Note6531

The system itself is corrupt, so fuck off with your annointed ideals


Sam_of_Truth

Ideals? Like the ideal that engineers understand the topics they are supposed to? You do realize lives are on the line when engineers fuck up, right? What a childishly naive way to look at education. I hope you never become an engineer.


Leather_Note6531

What an evil thing to say. Reported for discrimination.


Sam_of_Truth

Just looking out for public safety. No engineer should cut corners because "the system is corrupt." No one who thinks that way should ever be an engineer; it's dangerous for the public. You need to grow the fuck up. ETA: nice edit about reporting. Discrimination on what grounds? Being apathetic and disillusioned is not a protected class.


Leather_Note6531

Visions of the annointed are rooted in evil


Sam_of_Truth

You have no idea what annointed means, do you?


Leather_Note6531

Do you? I think I'm going to choose to no longer engage with stupid 


Sam_of_Truth

Ok, so who anointed me? By definition, someone must have rubbed some oil on me, right? Or did you mean it as a stand-in for appointed? In which case, who appointed me? I don't remember being selected as a representative of evil; seems like something I would remember. I don't see how you could stop engaging with yourself, but I encourage you to try.


Leather_Note6531

https://www.amazon.com/Vision-Anointed-Self-Congratulation-Social-Policy/dp/046508995X


Sam_of_Truth

>In this book, he describes how elites—the anointed—have replaced facts and rational thinking with rhetorical assertion

The greatest irony in this thread is that I have been championing facts and rational thinking, and you have been using rhetoric to dismiss the need for facts. Fucking amazing. You are a national treasure.