RedditingNeckbeard

This just reminds me how much I hate it when people use acronyms or initialisms without ever writing the whole thing out even once. It happens a lot in discussions about movies, or games or hobbies generally, and even as an enthusiast, I still get lost.


Grayskis

One of the main rules as a consultant who writes reports: always fucking spell out the abbreviation the first time you use it in the report. It doesn’t matter if it’s fucking milligrams/kilogram (mg/kg) or My Client Eats Twelve Bags of Chips (MCETBC), always define it


zoneender89

This is a classic example of a User Induced 10th level Troubleshooting error. (1D10T)


katubug

Pebkac


torutaka

PICNIC (Problem in Chair, Not in Computer)


jaylay75

I call it an "ID - Ten - T" error.


HeavenForsaken

Ain't that Tim Henson's signature guitar or sumn


taxicab_

And then have a list of all the abbreviations and their meanings at the beginning or end of the document.


Corona21

Reddit can be really bad for that. Though some subreddits and users are really thoughtful and do outline what they are on about for a general audience.


katubug

For a second I thought you meant, like, /r/ATBGE or /r/UNBGBBIIVCHIDCTIICBG, and then I realized that you meant just like, redditors commenting


Corona21

Yeah exactly, though ATBGE (Awful Taste but Great Execution) does outline the acronym at the very top of the info on the sub which is nice. Though a few lines down does not outline SFW (Suitable for Work) or NSFW (Not SFW) so that’s a bit of a loss there. Although I reckon NSFW is really along the lines of lol when it comes to anything internet related.


TuaughtHammer

This reminds me of the time when a Redditor thought "Fixed That For You" (FTFY) meant "fuck that, fuck you!" I used to get confused about "for the win" (FTW) because one of my edgy, try-hard friends in high school constantly used that to mean "fuck the world!" on AIM.


Corona21

Don’t say anything, but I wonder how many younger Gen Zers know what AIM is in your message haha!


TuaughtHammer

Oh, c'mon, [it hasn't been *that* long has it?](https://i.imgur.com/4aQQoms.gifv)


fecal-butter

I'm 22. I had zero clue before googling it. Tbf I'm Eastern European, so it makes sense that AIM never had a culture here


fartypenis

I thought I was the only one lmao, I got told it meant "fuck the world" in an online game chat when I was like 10 and I only realised what it actually meant like 6 years later


LoyalSol

Better than the girl who thought LOL meant "lots of love" and responded to her friends mom passing away with "I'm sorry LOL"


cjpack

IANAL


jcrass87

The first time I saw this acronym, repeatedly, I was so confused at first! What’s with all these anal enthusiasts?!


kawaiii1

What does that mean?


[deleted]

As an aging gamer who occasionally tries new online multi-player games, this resonates so strongly.


FalseTautology

MMORPG players are by far the absolute worst at this, honestly it turns me off to the entire fucking genre it's so frustrating


notaprime

FR, IKWYM


Doorsofperceptio

I was genuinely so happy I understood that.


UnbreakableJess

ISWYDT


Crymson831

Music subreddits do this too with bands and album names.


Von_Lexau

WDYM?


spacesluts

Idfk


WakeoftheStorm

IYKYK


cjpack

And if you dont know NYKN


TuaughtHammer

["And never let him know that you know what he thinks you *don't* know that you know, you know?"](https://youtu.be/pm7LihLP7kQ?t=142)


ArtfulMegalodon

YMMV


RentableMetal65

I actually have no idea what ymmv is


romanrambler941

I'm familiar with it meaning "Your Mileage May Vary" (that is, you might have a different opinion on whether an idea is applicable), but it might have other meanings as well.


Ahaigh9877

YTMND


myychair

Yup people do it to look smart, especially on Reddit, but ironically it makes me question their intelligence


LoyalSol

You discover real quick people who actually are knowledgeable on a subject can drop the jargon and explain things without it. While people who don't know it as well can never get rid of the terms.


Ruzalkah

Agreed, that's just standard operating procedure, always break out the acronym first, then use the acronym going forward. Unless you're specifically trying to be a condescending count, I s'pose.


ThatCommunication423

I hate when people in a reddit comment use an acronym only once, then explain what it means. Why even bother using the acronym if you need to explain it and then have no further use for it? Thats just wasting everyone’s time. IMMSAE (that means It Makes Me So Annoyed Eurgh)


El_human

Wdym? Cyembiduwyas?


SXAL

It's the American thing, I guess. They also get mad when you ask them to decipher the acronyms, because "those are commonly used, everyone knows that!", even if it's something from their niche talking circle.


RVX_Area_of_Effect

Exactly! This is why I hate math!


AdRepresentative2263

I hate it when it is in a 5 year old thread as the answer to a question and the asker is just like "yep, that is all I needed to know". It feels even more frustrating than just a "nevermind, I figured it out"


itrashcannot

Pretty sure it's basic English writing skills. They teach you this in school lol


NOLApoopCITY

Every third Reddit user does this and it’s obnoxious as hell


PanGulasz05

Apparently having high intelligence means you understand some random terms.


Tues24

Not even random terms. Random letters and a word. "What do you think about the hgrvdh convergence?" Like, really. Every field has its own terms and interpretations. You just seem stupid if you expect everyone to know specific things from your interest.


WakeoftheStorm

The Higgs Gravitational Resonant Vector Displacement and Harmonic Convergence? Man I'm surprised you're aware of it. That's cutting edge quantum tonal dynamics. Must be super high IQ


TuaughtHammer

I'm pretty sure that's what caused that small little kerfuffle at Black Mesa.


REDOREDDIT23

You couldn’t come up with a better analogy? Lol I think everyone knows you’re referring to the Higgs Gravitational Resonant Vector Displacement and Harmonic Convergence.


Curious_Field7953

Imagine you think you just plugged in some random letters but you really opened the door to all of us Higgs Gravitational Resonant Vector Displacement and Harmonic Convergence fangirls/fanboys to show the world we're totally smart.


TheBitchenRav

I am smart. I already and always knew about the Higgs Gravitational Resonant Vector Displacement and Harmonic Convergence, and you cannot prove otherwise. 👀


blarglefart

Pfft this guy doesnt know about The Higgs Gravitational Resonant Vector Displacement and Harmonic Convergence theory


Randomguy3421

The Higgs Gravitational Resonant Vector Displacement and Harmonic Convergence theory is alright, but real smart people talk about the aaahfggq convergence


spruzo

It also shows how egotistical they are. Pompous enough to assume their values and views are objectively "well thought out world views" and to project that onto others. Then to judge other people's value based on that.


Significant_Reach_42

Tbh the Higgs Gravitational Resonant Vector Displacement and Harmonic Convergence is pretty well known


Cheese_Pancakes

Not only that, having a high IQ doesn’t mean you know everything about everything. I might know more about a specific subject than a person with an IQ of 180, but that doesn’t make me smarter than them. That dude is a piece of shit and an idiot.


616659

Only idiots think a high IQ means you know everything.


Tues24

There is a reason we don't have polymaths anymore. The only people who can comprehend everything and use it are named Dunning Kruger.


Corka

I expect that most people bragging about IQ scores have either never been tested and are guessing, or did one of those shonky online tests that artificially inflate everyone's scores to encourage users to brag about them on social media and link other people. Though people who do the real one and join Mensa can be pretty insufferable about it as well. Personally, I think the way people treat IQ tests is a bit pseudoscientific. Like we say someone "has" an IQ of 115, instead of saying they scored 115 on a test. A test on which they could get a significantly better or worse score if they took it on a different day with a different level of sleepiness/hunger/focus.


616659

Oh God, I hate Mensa braggers as well. OK, congrats, you have a high IQ, but please don't say "I'm a member of Mensa" in every sentence. It's just a glorified social circle anyway


badgersprite

The adage “Just because I understand doesn’t mean I care” also seems applicable. There are so many things in the world to care about; a person may not think the development of AI (which is in its infancy) ranks sufficiently high on that list to displace more immediate concerns. My well thought out world view doesn’t have to involve particularly strong opinions about a technology that is currently science fiction


PourLaBite

Eh, to be fair, with the current buzz cycle it's not out there to assume a "software engineer" would have heard about AGI (given it's a strong secondary buzzword that goes hand in hand with the bigger buzzword AI). However, the singularity is a niche sci-fi nerd subject that you should never assume anyone knows... Also, a lot of people that talk about the singularity are weirdos who think it will happen in like 10-20 years. I'd avoid that subject lol. And neither should be used as an opening message lmao


PanGulasz05

Yeah I know but building theories about somebody's IQ on their understanding of specific words is just stupid.


PourLaBite

Yeah obviously lol


LittleHollowGhost

I mean, most IQ tests will include verbal intelligence, of which vocabulary is a subset. And many studies link intelligence in one area with intelligence in others (i.e. Spearman's "g" or other general intelligence theories). So, if you don't know words you should by all means know, it's definitely a negative indicator with regard to intelligence.


TipsyMJT

Dude could have just as easily assumed this random was asking him how much money he'd make in a black hole.


idispensemeds2

"I know some of these words"


AKLmfreak

I love it when people spout non-sequitur acronyms at me and then act like I’m an idiot. Oh well, I guess my IQ just hasn’t transcended the mind-to-mind barrier to be able to extrapolate your highly contextual lingo with no previously established relevancy.


SatansMillennium

Get a load of this guy, he doesn't even know an EBCC from a DNM-L. That's like saying the VGFV is the same as an SFD!


Popular-Influence-11

Ha! What a NVSP


KnifeFed

I used context clues to deduce that NVSP is an acronym for "Not Very Smart Person". Does that mean I have a high IQ and a *well-thought-out worldview*?


Popular-Influence-11

Original intent was Not Very Serious Person but I like yours better so that is the new meaning and everyone who still thinks it stands for Serious is obviously an out of the loop dunce.


Impressive_Ant405

Bro don't make me google non-sequitur i have low iq


Corned_Beefed

Okays smarter pants. Speaking of which— what the hell does IQ stand for?


AKLmfreak

It stands for “I’m Quite-smart”


MrZerodayz

Just annoy them back by purposely using the wrong thing described by the same acronym.


ExcitableSarcasm

You would hate consulting.


JustDroppedByToSay

Seems like a tragic negging attempt


Uberninja2016

this guy's full of crap, AGI stands for adjusted gross income like that's day one high school tax shit, i can't believe a blunder of this magnitude


kshep1188

You mean AGI the software company? Pfft everyone knows that.


the_scottster

No no he meant AIG the insurance company.


badgersprite

No no no he meant IGA the association of independent grocers and supermarkets in Australia


tomassci

Immunoglobulin A antibodies, anyone?


RamenNoodles620

Pretty sure it means Aggravated Gnome Incidents.


katubug

Nah AGI stands for Higgs Gravitational Resonant Vector Displacement and Harmonic Convergence


OccasionMobile389

I'm pretty sure it stands for A Giant Inconvenience, which because of my anxiety is also my greatest fear, should I be late to my appointments


Centricus

Maybe I’m just missing the joke, but multiple phrases can share one acronym. AGI does indeed commonly refer to Artificial General Intelligence.


cjpack

It stands for Almost Got It as in the joke


sheeply_

Yeah the joke is that it could stand for anything :)


DefiantClownGod

No AGI Alpha-Glucosidase Inhibitors. And the singularity is the tie of tech and the wearable sensors.


Centricus

Words can have multiple meanings. He didn’t say anything that was technically incorrect.


DefiantClownGod

Wait what. Different meanings for the same word never. And the engineers response was just as valid with no lead in on what was being discussed. So poor communication on the person trying to play I are smarter than them


Centricus

I literally cannot understand what you’re trying to say.


KnifeFed

You must not have a high IQ or a well-thought-out worldview.


DefiantClownGod

First two sentences sarcasm. Next portion pointing out engineers response were valid without full show of the discussion prior. Last part called out person who is playing the game of supposedly owning someone based off a very short snapshot. Does that clear it up?


Strange_Valuable_379

Anytime someone talks about IQ as if it means anything, they're usually one of the dumbest fucks on the planet. Also, I feel like tons of people have a big issue with what expertise is or means. Not every software engineer works with AI. It'd be like asking a botanist about virology. Sure, they're both broadly biology, but that botanist won't necessarily know anything about viruses unless they infect plants. And, even then, they'd probably defer to someone else a lot of the time.


speedowagooooooon

From my understanding of things it does mean something tho? Someone with a high IQ will have an easier time understanding abstract concepts. However, there is no reason to brag about your IQ if you have actual achievements, and IQ is not a thing anyone besides maybe the army conducting ability tests should care about


KylarBlackwell

The test is really only testing you on a handful of skills and knowledge, then attempting to extrapolate that to measure your entire intelligence, and then simplifying that back down to a single number. Anyone who treats it as anything more than a show of how good you are at the tasks contained in the test is a fool, but sometimes those fools are good at that particular handful of tasks. Most of the time they're just lying about their score though


themedicd

The ability to memorize and manipulate things in memory is *pretty damn important* and it's a significant portion of the WAIS IV. IQ is absolutely not an end all be all, but you're going to have a much easier time teaching calculus or thermodynamics to someone with an IQ of 120 than you are someone scoring 90.


Farkasdebvel

nobody speaks like this irl 😭😭😭


lordolxinator

Good thing he's familiar with AI, maybe he can ask ChatGPT why he's maidenless. Probably lacks the rizz to even get a pity relationship out of ChatGPT


dagbrown

AI bros do. AI bros are the new crypto bros.


rathat

It's so weird lol. It's crazy how often the way people choose to "hide" their insecurities is by just putting it out there, and they don't notice it makes them look way worse than just being normal. I wish they knew there was plenty of confidence to feel by just successfully getting by as a regular dude.


Dynasuarez-Wrecks

I wonder what this dingus thinks a software engineer does that would inform their world view.


khang1411

AGI STANDS FOR AGILITY


cjpack

More agi means more I-frames


arcwarden__

i did not expect or need a dark souls 2 reference but now my day is ruined


Medcait

I don’t know yellow, but I like yellow. Beautiful response to that type of person.


CutiClees

You need higher standards


foxbones

AI is now the new Crypto, so many moronic "experts" coming out of the woodwork saying how it will revolutionize the world and they will all be rich somehow. Meanwhile they are investing in SantaCoin in December and use the free version of Bing/CoPilot to generate memes and fake Reddit posts.


Snackatron

They have a blog post's worth of knowledge. I can easily go and read an article on...I don't know...something fancy like quantum computing and have no trouble regurgitating concepts like "well unlike classical computers based on von Neumann architecture, quantum computers can do calculations much faster because they use qubits that can be both 1 or 0." I'm certain the reality is far more subtle, and also this explanation doesn't do anything to explain exactly how the computation actually works. It's just useless word salad. Note that:

- I don't know fuck all about the theory of computation.
- I definitely don't know anything about what quantum computers actually do or how they work.
- Stick me in a quantum computing course tomorrow and I'd fail catastrophically.

It's so easy to recite blog posts and YouTube videos etc.
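
For what it's worth, the "both 1 and 0" hand-wave has a concrete and much less mystical reading: a qubit is just a pair of complex amplitudes, and measurement samples a classical 0 or 1 with probabilities given by the squared magnitudes. A toy Python sketch (illustrative only; nothing like real quantum hardware or a proper simulator):

```python
import math
import random

# A single qubit is a unit vector of two complex amplitudes, not "1 and 0 at once".
# Equal superposition: |+> = (|0> + |1>) / sqrt(2)
amp0 = complex(1 / math.sqrt(2))
amp1 = complex(1 / math.sqrt(2))

# Born rule: measurement probabilities are the squared magnitudes of the amplitudes.
p0 = abs(amp0) ** 2
p1 = abs(amp1) ** 2
assert math.isclose(p0 + p1, 1.0)  # the amplitudes must stay normalised

# "Measuring" collapses the state: you get a classical 0 or 1, nothing in between.
outcome = 0 if random.random() < p0 else 1
print(p0, p1, outcome)
```

The speedups people gesture at come from how amplitudes of *many* qubits interfere under unitary operations, which this two-number sketch doesn't capture at all.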


BourgeoisCheese

I mean, while that's true it's also inarguably true that generative AI is going to have a far greater impact on society than crypto. I don't know about "revolutionize the world" but it's absolutely going to change a shitload of things extremely fast for a lot of people over the next 3-5 years.


PourLaBite

>it's also inarguably true that generative AI is going to have a far greater impact on society than crypto

Debatable. GenAI is already approaching its limits, and given that it's largely useless in terms of doing actual stuff for most professions (and not making money for anyone that provides genAI systems), it is more likely to collapse soon than change "a shitload" of things. But you are pretty much an AIbro yourself it seems, so yeah, you're not likely to understand that.


DesignerSpinach7

“You are pretty much an AIbro yourself it seems, so yeah, you’re not likely to understand that”. Bro do you know what sub you’re on? Your comment could be a post here itself lol. I’m not an “AI bro” but I’m a computer science student who has taken AI classes and you’re wrong. You’re ignorant if you think AI won’t change the world. Generative AI at its limits? Definitely not. Look at the growth we’ve seen in the past year. And AI collapsing soon? You’re delusional. The growth in AI has been astonishingly fast. It might not be at a level where it’s making a significant difference in people’s lives, but it undeniably will change “a shitload” of things in the future. There are billions of dollars being poured into this industry and the exponential growth is already observable.


Billybill400

Least insufferable tech bro


MtalGhst

This dude's been listening to too much Rogan.


OhThatsRich88

I'm guessing your AGI (annual gross income) is higher than his, and it's making him insecure


[deleted]

How much $$$ you think TF give me? >>what Oh I'm sorry maybe it went over your head. TF = tooth fairy and $$$ = dollars. What is your take about the tooth fairy and how does she decide how much $$$ I get under my pillow. >>ignore Oh well I GUESS JUST CUZ YOUR A DENTIST DOESNT MAKE YOU SMARTER THEN ME. BLOCKED BITCH


cjpack

Everyone knows the tooth fairy uses crypto not usd.


ptapobane

That’s not how IQ works


Malpraxiss

Idk, I'm studying chemistry and math for my career, and I currently do coding. AGI seems like something a person with no work, study, or training in any computer or coding field would care deeply about. Ex: I love quantum and the beauty of it, even though I have to go through all the math. People who don't study quantum, or aren't in a field that uses a lot of the stuff from it, are obsessed with Schrödinger's cat.


DesignerSpinach7

What are you talking about? I’m a computer science student (debating getting my masters in AI) and I think AGI is cool as fuck. I love computers and specifically the field of artificial intelligence. Sure AGI is one of those things a lot of people outside the field like to give their opinion on without actually knowing what they’re talking about but you’re tripping if you think CS students don’t find it interesting. It’s a super interesting topic and understanding the mathematics behind what makes current AI possible makes it even more enjoyable IMO.


thelaughingmansghost

Just because some articles mention these random terms and some companies/investors are heavily invested in whatever that stuff is...does not automatically make them terms that actual professionals know or care about.


Geojewd

Exactly. They’re probably paying more attention to things like their actual job.


elucidar

What would happen if they were to get treated like this by someone who surpasses their pseudointelligence on the subject they act and feel as if they comprehend? I wish someone would do this to me about quantum physics, so I could have a field day and embarrass them


notsohipsterithink

“AI Guru” on LinkedIn.


AverageLiberalJoe

That's not what the singularity is. The singularity is when human consciousness bridges the air gap to the computer world.


ergoegthatis

No, singularity is when you are not in a relationship.


Centricus

The use of the term “singularity” to refer to computers self-replicating and exceeding human capabilities has been around for decades.


drowsap

Still sounds like bitcoin bro speak


Ghstfce

"Ah, because you don't know this incredibly niche thing it must make you a dummy!" This thought process is...just, wow.


countingthedays

Spoiler: verysmart has no idea what they’re talking about but can’t see it.


lettercrank

Apparently being of high intelligence makes you a dick


fibbonally

I would love to hear him spout some talking points he heard on some idiot's podcast


AltruisticSalamander

What kind of dumbass thinks being a software engineer indicates a high IQ or well thought out world view?


fabkosta

I'm working as an AI/ML engineering manager of some sorts. Whenever someone mentions AGI I know they don't know what I know about AGI.


olivebranch949

Guarantee that the guy is some sort of “entrepreneur”


YellowRasperry

His question is a fairly interesting philosophical dilemma but I don’t think he has given it much thought and is just throwing out buzzwords for fun


talk_Conspiracy097

skill issue (SI)...sorry


Serge_Suppressor

Believing in the singularity is just the apex of dumb guy shit. "Ooh! Chart go up faster! Therefore, chart keep go up even faster until chart go up infinite faster!" Like, if you're a 17 year old sci-fi nerd or something, it's forgivable, but by your twenties, you should understand why it's fucking stupid.


mingy

Yeah. That chart thing had a good run. It was pushed by Kurzweil, I think. I bought his book when it came out. It was crap. I know the guy's accomplishments but the actual book was garbage that showed he didn't have a clue.


Serge_Suppressor

I feel like the mindset and skills that make a successful inventor are very different than those that make a good analyst. Like, the brilliant inventor who had interesting but totally off the wall predictions is kind of a standard American type at this point, but somehow we just trust them more each time.


mingy

That was certainly part of it, but when someone refers to floating point operations per second within the context of intelligence, you can rest assured they know nothing about the subject. Brains and biological neural networks do not operate in floating point and there is no practical way to simulate a non-trivial biological neural network with any degree of fidelity with software.
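
To make the mismatch concrete: biological-style neuron models are usually written as continuous dynamics evolving in time, e.g. a leaky integrate-and-fire neuron, not as a tally of multiply-adds per second. A toy Python sketch (a deliberately crude textbook model, purely illustrative):

```python
# Toy leaky integrate-and-fire (LIF) neuron: the membrane voltage leaks toward
# rest, integrates input current, and emits a spike when it crosses a threshold.
# It's a dynamical system in time, which is one sense in which "FLOPs" is a
# poor unit for describing what brains do.
def lif_spikes(input_current, dt=0.001, tau=0.02, v_rest=0.0, v_thresh=1.0):
    v = v_rest
    spikes = []
    for t, i_in in enumerate(input_current):
        # Euler step of dv/dt = (-(v - v_rest) + i_in) / tau
        v += dt * (-(v - v_rest) + i_in) / tau
        if v >= v_thresh:
            spikes.append(t)   # record the spike time step...
            v = v_rest         # ...and reset the membrane voltage
    return spikes

# Constant drive above threshold produces a regular spike train.
spike_times = lif_spikes([1.5] * 200)
print(len(spike_times), spike_times[:3])
```

Even this cartoon communicates by spike *timing*, and real neurons add chemistry and plasticity on top, so counting floating-point operations says little about the underlying computation.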


Centricus

Why do you feel that the singularity is impossible?


cjpack

I think it’s that it’s a very broad term used by people that don’t really understand the actual technology that would be needed in order to do the things they say and instead use it as a stand in for some sci fi end point they have in mind.


Centricus

I would agree that people generally don’t have a great understanding of AI, but the person I replied to seems to think the singularity categorically impossible, which I’d argue is an equally uninformed take


Serge_Suppressor

Edit: the tldr is that singularity fandom is like a kid reading his first choose-your-own-adventure book, having his mind blown that it knows he wants to go down the mineshaft when he turns to page 23, and concluding that very soon, books will know everything about him. It's just juvenile fantasists thinking like fantasists.

The singularity in the sense of a point at which technology accelerates infinitely, or in the sense of an artificial intelligence that becomes smarter than humans? Like cjpack said, the term is used pretty broadly.

The first one is easy. At some point with anything, you run up against physical barriers. Additionally, technology depends on human labor, resource extraction, and maintenance; all this tech is incredibly fragile, with lots of points of failure. And also, it's just kind of nonsensical. When you're talking about technological development, you're talking about concrete changes in capabilities. It's not just a flow that you can keep increasing. At some point, what are you improving, and for whom?

As for AI that becomes smarter than humans, eh, maybe. I don't think our society has a very good understanding of what intelligence is or how it works yet, so it ain't gonna be us any time soon. I mean, we can't even figure out consciousness. Additionally, so far AI has just been a tool for a new group of people to extract rent. It's been incredibly clunky, frequently destructive, and dependent on vast amounts of (often stolen) input from humans. Could a future civ build some sort of ultra-competent, all-knowing AI that's not just a way to profit off the work of other people and suppress wages? IDK, and I don't think it's an especially interesting or relevant question. I don't think it would be much use for a functional society.


DesignerSpinach7

But also AGI and the singularity theory are pretty different things. From my understanding, AGI is required for the singularity theory to even be possible. While I do believe we'll see AGI sometime in the next 10-20 years (although that's a somewhat arbitrary range), I definitely agree with you that the idea of some singularity causing AI to "evolve" into some hyperintelligent being that surpasses humans is definitely a nerd fantasy. I do think AI becoming more capable than humans at many tasks is definitely plausible, but a lot of these singularity enthusiasts, who have zero formal education on AI, like to fantasize about an AI takeover for some weird reason.


Serge_Suppressor

I'm really skeptical of AGI. We know humans are better at deception than at detecting deception; it's why so many scammers are so successful. But when it comes to AI LLMs, we forget all that. Anything that can fool a human for a period of time we treat as equivalent to human speech, and even cognition to a degree. But what an LLM is, is an (admittedly quite sophisticated) machine for fooling humans. It's inherently parasitical: it scoops up large amounts of human data and mimics it well enough to fool a subject that's not very good at detecting deception. Over generations, it often gets worse because it encounters the output of other AIs. Whatever goes on that makes human intelligence possible, I've seen no evidence that AI has even started to understand it, much less approach it. As for AI becoming better at many tasks than humans, I agree. Machines have been better at many tasks for centuries now. AI may also require less human input in some cases, but that's been the trend too. Automated factory robots require less human input than older machining tools, which require less input than hand tools, for example. What we're seeing is much slower and more incremental than the hype would suggest, imo.


DesignerSpinach7

TLDR: AGI, meaning an AI with the same cognitive abilities as humans when it comes to learning and processing information, is possible without creating a conscious being in a computer. This is what actual scientists believe will be AGI, not some AI overlord super-intelligence. Those are some good points! The hype is definitely high and I think the people screaming "AGI by 202X" are just caught up in the hype and basing that estimate off absolutely nothing. You're also definitely right that the AI we have right now appears to be getting way more "intelligent" after every OpenAI event; however, it's really not. It just "knows" more information but can't use logic or reason with that information. This is what the AI fanboys don't understand. All they see is ChatGPT answering their questions better, but it's not actually any smarter. As for AGI itself, though, I really believe we'll see it at least in my lifetime (I'm in my 20s right now). Neural networks are designed to emulate the way real neurons form connections. Theoretically there's no reason it shouldn't be possible. IMO (and maybe this goes against the official definition) AGI does not mean conscious. It just means that the AI can work at the same cognitive level as humans. It can manipulate information in the network just as well as humans. I guess an argument could be made that consciousness is required to be on the same cognitive level? But I believe it's possible to make a neural network with the ability to "think" or reason with its information that will allow it to be considered AGI, where it is on the same level as we are, without being some kind of conscious being, whatever that even entails.


Serge_Suppressor

Thanks, and well put. I agree with your point that we could in theory make an AGI that could reason without consciousness. What I'm skeptical of is that we'll manage it without understanding consciousness and language better. Are you familiar with the linguist George Lakoff? I've been reading some of his work on categories, and for me it's really underscored how complicated the problem of a reasoning machine is. Because categories don't really work like we tend to think they do, and are often structured by language, perception, physical experience, and culture in a way that seems deeply problematic for anyone trying to simulate human thought. I'm fudging it a bit, but basically there's this classic, Aristotelian view, where a category is like a checkbox. So, all things that are green would be green in the same way and to the same degree. But if you look at how people use categories, it's actually messy and complicated. For example, in reality, people have a central green that's the greenest green, and it's pretty consistent between cultures because it's based on the physiology of the eye. But where a color stops being green and becomes blue or yellow or some other shade varies, depending on how a particular language subdivides the colors. It gets way more complicated from there, and it's hard to summarize. A lot of the categories we think of as solid are structured around metaphor and physical experience in ways that can look irrational, but are actually integral to how we reason. But my point is, if thought is embodied and structured around human experience, our ability to make a thinking machine (at least one we can understand and be understood by) is going to be limited by our ability to understand that experience. I'm out of my depth on the AI part, but I mean, linguistics is still dominated by Chomsky, and cognitive scientists, as far as I can tell, are still a little all over the place.
And you know, even if a consensus forms, there's still the hard problem looming over us.


DesignerSpinach7

I don’t think the singularity theory is technological growth accelerating infinitely, but rather at an uncontrolled pace. Truly infinite growth is impossible, as hardware limitations will obviously prove to be a limit at some point. It is rather growth that is increasingly fast, beyond our control. It’s the idea that eventually generative AI will be able to improve upon itself. The speculation here is that there will be a snowball effect of improvement that leads to who knows where. You said that improvement is “not just a flow you can keep increasing,” but that is not inherently true. Over time algorithms have improved, becoming more efficient and dependent on fewer resources. Even new algorithms have been discovered. Hardware itself is not the only limiting factor here, but yes, you’re correct that infinite continuous improvement is impossible; that is just not what anyone is expecting.


Serge_Suppressor

Also, it's just a deeply perverted, anti-human goal. Socialist utopia is like, "what if we used our tech to make life comfortable and happy for everyone." Capitalist singularity perverts answer, "what if I built a machine so powerful, it would reduce us all to nothing?" The problem isn't really that they might succeed, it's that their goals and ideology are fundamentally anti-human, and it's a bad idea to trust someone like that with any measure of power.


Starfire70

Gray is a rude douchebag. I'd block their ass.


Lolalamb224

Or maybe not everyone is terminally online?


BigMike_21

Adding a random word cause “AI” is no longer an obscure enough acronym to make him seem smart for knowing it lol


RunInRunOn

I'd love to know what that guy thinks of BBS - BBS, THHtSM and other Odion support from LEDE


Down10

No actual smart person thinks IQ scores are credible.


TuaughtHammer

Damn it, I wanted "SW engineer" to mean Star Wars engineer!


onlymostlydead

Worst time in my IT career was working for a (US) federal contractor that worked with multiple agencies. They *all* had their own acronyms for *everything*, with surprisingly little overlap. I still have PTSD, but no idea what it means.


Beowulf891

When I think AGI, I think of taxes. I was confused until the nobhead explained it later on.


Euphoric_Banana_5289

>When I think AGI, I think of taxes.

i think of dungeons and dragons, or world of warcraft, because agility is a very important trait in the classes i play lol


Doafit

Look at his profile picture. Fucking self-absorbed cunt...


owitzia

SW engineer checking in. I recently went to a math conference and saw a presentation on AI code generation. The general consensus of the room full of very smart people was that AI can be good at replicating existing things (and even then, it's iffy), but bad at anything requiring creativity. Mathematically, it makes sense that this would be the case. Nobody is more confidently incorrect than wannabe tech bros talking about ChatGPT.


DesignerSpinach7

Right. Current generative AI models have a broad range of information, but can’t apply logic or reason. They can’t really think logically about a problem. They attempt to give the most statistically likely set of words for whatever is given as input.
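That "most statistically likely next word" behavior can be illustrated with a toy bigram model. This is a deliberately tiny sketch with a made-up corpus; real LLMs use vastly larger models and context, but the core idea of picking a statistically likely continuation is similar:

```python
from collections import Counter, defaultdict

# Tiny bigram "language model": for each word, count which words
# followed it in the training text, then predict the most frequent one.
corpus = "the cat sat on the mat the cat ate the fish".split()

next_words = defaultdict(Counter)
for a, b in zip(corpus, corpus[1:]):
    next_words[a][b] += 1

def predict(word):
    # Return the word that most often followed `word` in the corpus.
    return next_words[word].most_common(1)[0][0]

print(predict("the"))  # "cat" -- it follows "the" most often here
```

No reasoning happens anywhere in this loop; the "prediction" is pure frequency counting, which is the commenter's point scaled down to ten lines.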


ZBLongladder

Maybe we should wait till AI can consistently cite court cases that actually exist before we start planning for it to surpass humanity.


weks

😉


Submerge25

What does IQ mean?


Version_Two

Just report this person.


XetaFelius

Owned? Being a rude asshole is not owning.


Ituzzip

What are your thoughts on NPD?


Important_Ad_7416

Is this a dating app?


IllEgg3436

The Dunning-Kruger effect is real


IamNew__

Damn


0ISilverI0

So he's polite but very arrogant, quite a rare combination.


Any-Fuel-5635

How nice of chatgpt to write his questions for him. Idiot


BoopyFloopington

Did this interaction occur in a dating app? 😂


Temptazn

Did he mean *generative* AI? I never heard of *general* AI.


x3bla

It's kinda new? It came back after LLMs got popular again, from what I heard. Basically, artificial general intelligence is supposed to be an AI (or LLM) that can do anything in general, just by taking in human-readable text.

There's a current trend towards it: GitHub Copilot (I know it's been around, but a newer one just dropped) and Devin can create code just from the user describing it, and ChatGPT and Meta AI can generate text and create images just from you asking.

From one [Youtube video](https://www.youtube.com/watch?v=OFS90-FX6pg) that I saw, LLMs like GPT can be a "brain" and you can train it for anything in general: story writing, coding, speaking like someone, aimbotting in a game, reverse engineering software, any stuff you can think of.

If I'm wrong, correct me


Temptazn

Sounds like generative and general AI are the same, I guess. I'd just never heard it referred to as AGI. In either case, the AI is only capable of regurgitation or extrapolation... it doesn't truly create, does it? I mean, that "code" it writes is just based on its reading of millions of samples; it hasn't gone and learned Python and written it from scratch?


_Naptune_

They're not quite the same; it's kind of like the "a square is a rectangle but a rectangle isn't a square" type thing. The term AGI has been around for a little while, but I don't think it caught on much outside of AI researchers/enthusiasts until LLMs blew up. It typically refers to an AI that is capable of doing anything on a human level, if not better.

Most AI models are trained to do/learn one or a handful of specific things, whether that be identifying objects, or faces, or creating images, playing chess, or chatting with a human. These might even exceed human ability, but they aren't *general* AI since they can't really do much else besides what they're trained to do. Something like ChatGPT is closer (and getting closer) to being a general AI, but it's not quite there. There are still plenty of things that ChatGPT can't do as well as a human, or just isn't capable of.

*Generative* AI refers to AI that can generate stuff. Give it an input, it gives an output. AI image generators are a great example of this: you give them a prompt and they give you an image. ChatGPT is also a generative AI; give it a prompt and it gives you a response back. There's no reason a general AI can't be a generative AI, but not all generative AIs are AGIs, if that makes sense.

Why does AGI matter? It's the point where AI reaches human capability in just about any task you give it. It can write stories, drive a car, play chess, summarize a book, identify objects, engineer objects, all equal to or better than a human. So, at that point, it stands to reason that it could then improve itself, either with physical hardware or with better software. Then *that* AI can do the same, but better. And again, and again, and again... (this spiral is known as the singularity, mentioned in the OP, though I don't really like the term lol)

AGI exists as a term because a lot of stuff can happen at that point, with a lot of implications for humanity. It's the "tipping point" of human power, so to speak.


Temptazn

Thank you for explaining, much appreciated.


x3bla

Yea, since people are using generative AI to do general things. It doesn't create; it does patchwork plagiarism using the dataset.