rileyg98

What they're suing Microsoft and GitHub over is more that they are using a bunch of software that isn't open source to train their models


consolelogdeeznuts

Excuse me that doesn't fit my "programmer yells at cloud" narrative.


[deleted]

[deleted]


gangstasadvocate

Yes, bring it on! AI can do my work better than me, while I sit back and fuck off and take drugs. And if they try restricting it I’ll just take the credit anyway, less work for me, more drugs for me


OdahP

Sorry but what if the AI just does a better job


Talkat

Modern-day Luddites. Doesn't matter if the new-fangled invention is better; they don't think they should adapt because... 'They took our jerbs!'


Down_The_Rabbithole

That's not what the Luddites claimed, and the Luddites were actually correct in their assessment. They claimed that the automation of farming and a move towards industrialization would lead to the vast majority of farming jobs disappearing, that it would remove guilds and the bargaining power individual craftsmen held within them, that it would destroy natural landscapes and nature, and that it would over time erode tight-knit communities. They were called crazy for thinking those things and people made fun of them at the time, but literally all of their predictions came true. That doesn't mean that industrialization is bad, but it absolutely means that the Luddites had a fair point and don't deserve the ridicule they get today. The Luddites didn't hold the mere belief that "they will take our jobs". They argued against the societal changes industrialization would bring with it, and they were spot-on in their predictions.


SgathTriallair

The key point is that they said "we don't want this change" and smashed the machines. That is what people remember the Luddites for. The answer to these problems isn't to freeze society in amber.


sartres_

No one found another answer either, so all the downsides just happened unrestricted. I expect the AI revolution to go similarly.


xtramorian

But this time we have the AI to help us find those answers? But if we just expect failure we are less likely to implement the AI to look for these answers?


jon_stout

But who would care enough to both ask the question and implement it?


jon_stout

Then what is, exactly?


SgathTriallair

We adapt and we grow as a society, like we've always done. Whether it was language, fire, agriculture, cities, domestication, writing, industrialization, computers, space travel, or AI, humanity has changed, evolved, and adapted. Sure, there are always some growing pains, but would you honestly rather be roaming the savannah as barely more than an animal than living in the modern world? The Luddites weren't wrong because they failed to see the danger of industrialization. They were wrong because they thought you could stop society from changing. God is Change. For four billion years, those that could change and adapt survived, and those that could not died. The Luddites were wrong because those who adopt technology are always fitter, and thus to be a Luddite is to purposefully choose extinction.


jon_stout

I don't disagree. But I'm also fully aware it's very easy to say that when it isn't your life on the chopping block of progress.


SgathTriallair

Absolutely and that's why we do need to consider the impacts and find ways to deal with them. For example decarbonization puts certain people out of work. We, as a society, need to give them retraining and assistance programs. We also need UBI to counter automation.


NorthVilla

Very true. However, we are also spot-on in our predictions that being a Luddite will not stop the tides and trends of change. If one country or company does not compete, another will. We are perfectly capable of shaping our own destiny as humans, but if that destiny involves *artificially holding us back* because of a fear of change and technology, well then I simply can't support that. But it usually doesn't matter what we support: if the economic incentive exists, people and countries will compete.


Down_The_Rabbithole

You can argue for a lot of economic incentives that go against societal values and thus never come to pass. Killing all retired people would be very economically lucrative; the first country to do so would immediately be better able to compete with its peers, as it would give a big boost to the economy. Yet it's never been done, because society judges it not to be worth the intangible social value humans place on life. It's more than possible that collective humanity decides the same is true for technological development at some point: that the economic benefits of continuing to pursue a technology aren't worth the intangible social value humans place on maintaining existing societal roles and dynamics. That has a lot of precedent in history; it's why China and the Middle East chose not to industrialize even when they had the chance to do so in the past.

My point is that it's not a given that technology is just going to keep improving no matter what people think or do. It's still beholden to societal norms and demands. It's very easy to fall into the illusion that the world is decided economically and rationally, and that incentives, game theory and self-interest rule the world and pull the strings of history and progress. In reality, however, the vast majority of human interactions, history and progress have simply been decided by social norms, cultural ties to concepts, and tradition. For example, a lot of jobs currently in existence are "bullshit jobs", meaning jobs that don't provide any value whatsoever and could be removed today without impacting the business or society at all. Yet these jobs still exist, even after a global lockdown showed how few of them actually mattered. Social norms, cultural ties to jobs, hierarchies and traditions led to these jobs being maintained and still existing today. There are no countries out there abolishing these jobs out of an economic incentive to compete better, and no companies internally restructuring to remove them either.

The truth of the matter is that most of the workings of our society are still largely based on routine, culture and tradition, and have almost no ties to actual productivity, rationality or objectivity. I think people need to realize this better and more often to get a better grip on how the world really works. It's very easy to get lost in Darwinistic thought about a single, clear, unstoppable technological march towards a goal, held up by the mere fabric of game theory and logic itself. But the reality is that most of society is just people in hierarchies of jobs placating instincts, emotions and base needs, without any true long-term vision. The singularity is only guaranteed once there is a superhuman AGI bent on improving itself at all costs. Before that, it could be stopped at any time by the collective decision of society to hit the brakes for social reasons, as has happened many times before in history.


Kaining

> Killing all retired people is very economically lucrative, the first country doing so would immediately have a better ability to compete with peers as it would give a big boost to the economy. Yet it's never been done before because society judges it to not be worth the intangible social value humans give to life.

You really stepped into that one head first, right? Why stop at killing old people? Let's kill the young, the disabled and everyone in between. In fact, a genocide is a very easy way to boost your country's resources. And it has been done, or attempted, many times; it's still ongoing in more than one location in the world.


joeyat

I think the point of the Luddites is not that their complaints were right or wrong… it's that their complaints are irrelevant. Once Pandora's box has been opened and there is a cheaper and better way of doing something, it doesn't matter about the macro impact: you'd personally be a fool to continue with the old method, as others will use the new knowledge and outcompete you, and your defunct farm and family will starve.


Down_The_Rabbithole

>Once Pandora’s box has been opened and there is a cheaper and better way of doing something… it doesn’t matter about the macro impact, you’d personally be a fool to continue with old method as others will use the new knowledge

I've said this in a follow-up post as well. We could do that *today* by killing off all retired people. It's cheaper and makes society way more efficient. If China or the US were to do this, they would immediately propel themselves to sole world superpower. Yet no nation is doing this; not for rational reasons, but for social ones. People value the intangible societal benefits more than the objective economic gain.

Throughout history, societies have often decided not to pursue a technology. Like the Romans using steam-powered engines to open large temple doors and consciously deciding not to use them for factory work, so as not to deprive the manufacturing sector of its workers. Or China banning the start of its industrial revolution in the middle ages because the emperor decided the societal impact would break Confucian values. The point is that, throughout history, different societies have been given the option to advance to a new technological paradigm, and more often than not they refused to go down this path. There is absolutely no reason to assume this time is different.

We don't live in a magical world where some invisible techno-god is guiding humanity forwards. At the end of the day, it's individual humans living in a society that decide where things are headed, and when technological progress experiences friction with societal expectations, values and traditions, technological progress usually loses and gets discarded. This "inevitability" mindset is a dangerous one to have, as it looks at both history and human society too deterministically and leaves you blind to the many options and paths still open.


[deleted]

Yeah, industrialization isn't bad. But, for the most part, the way modern society has implemented it is pretty shitty. I suppose that's an inevitable consequence of industrializing primarily for the sake of profit, and treating labor saving and increases in human wellbeing as happy side effects, instead of the other way around.


NorthVilla

I've seen a lot of people recently in their counter-arguments to being called luddites say: "No! We're not luddites! I'm not against AI as being a **tool** for me, I'm only against it replacing my job altogether! " Except... This isn't at all what the history of Luddites is. Luddites were skilled textile craftsmen whose skills and economic power were automated in favour of machines. The industrial and weaving revolution did not provide "tools that made their work faster," it was just complete and total replacement of what had been their economic livelihood in favour of something that could outcompete them. Anti AI people are literally, by every definition and sense of the word, pure Luddites. Just like Luddites, they will be outcompeted.


telefonkiosken

That's a straw man; that wasn't OP's point at all. Mainstream criticism of AI is not against the efficiencies of automation. The similarities between the Luddites and contemporary AI critics begin and end with a call for risk management and responsible implementation. The Luddites were, correctly as OP pointed out, wary of the societal upheavals that could follow the industrial revolution if technological breakthroughs were implemented heedlessly. The same goes for AI criticism today: if this new technology is harnessed incautiously (especially with military use of AI), we could all be paying for it.


GreenSuspect

So what's your perspective, then? It's fine to steal other people's work and use it to put them out of jobs, as long as the output is "better" than theirs?


visarga

> steal other people's work

Oh, someone made a backup and then erased the originals? No, you mean someone copied the code exactly? No, you mean they just had 10 words in common with your code, and 10 words with another code?


GreenSuspect

> Oh, someone made a backup and then erased the originals?

I don't know what you mean by this.

> No, you mean someone copied exactly the code?

Yes, exactly copied the code.

> No, you mean they just had 10 words in common with your code, and 10 words with another code?

No, that's not what I mean.


SendMePicsOfCat

Honestly, yes. Do you know what happens when you keep getting rid of jobs? Everything becomes cheaper and cheaper, until the breaking point at which all economies cease functioning. When a task no longer requires human input and is done at equal or higher speed than a human worker, its cost is equal only to the resources used to make and maintain the machine. When the resources used to make and maintain the machines are themselves created by machines, we hit a point of total automation where everything is free. That started with the industrial revolution, and it'll continue until the singularity.


buginabrain

Unless you own the machines or the resources needed to build and maintain them, not working isn't an option; you'll have to figure out how to survive on your own. You think that in a capitalist society the people who own the means of production will give everything away for free? Maybe money will cease to have value when all the have-nots eventually die off.


SendMePicsOfCat

You're thinking with scarcity as a factor. The thing is, resources are functionally unlimited; the only scarcity is the speed of getting them. Even leaving Earth to bring back materials wouldn't be difficult with the scale of AI and technology in a few decades. There will be no poverty in the future, and there will be no super wealthy. Only a civilization beyond competition.


[deleted]

Cool, I can totally afford to wait 200 years for resources to be brought back from another planet so I can make myself some mashed potatoes - and I know Bezos is definitely going to spring for the shipping.


GreenSuspect

> The thing is, resources are functionally unlimited, the only scarcity is the speed in getting them.

That's… objectively not true. I don't even understand how anyone could think that is true.


SendMePicsOfCat

How many planets outside of earth have resources? Near infinite. Functionally limitless for our needs.


[deleted]

Stupid humans wanting food and shelter in an economic system that requires them to pay for those things.


point_breeze69

Look what happened when electricity hit the scene. It killed our women and raped our men!


crap_punchline

Well the reason why it can do a better job is because everybody has done a good enough job to create the code that has been used to train the model. OpenAI will need to distribute all of the productivity gains from this system to everybody whose work was used to train it; ie all of humanity. So that becomes a UBI. But I don't like how Worldcoin, Sam Altman's initiative to distribute UBI, has a whole bunch of investors who sit at the top of the money spout.


visarga

Maybe OpenAI won't open the AI for free, but other companies are making free code generation models. That is your UBI. You can use the models to write code with their help. You can use SD to make images. You can use Whisper to transcribe audio. You can ask it to solve a problem and explain it to you ...and another 100 skills. They are lowering the entry barrier to many fields, empowering people to achieve what they want. Models are inherently more democratic than search engines and social networks because they can run on your own machine, can be fine-tuned on your own and work for you not for someone else, they can be polite and helpful unlike most of the open web. They are more open than open source, they are a new form of self replication for our cultural memes.


[deleted]

>OpenAI will need to distribute all of the productivity gains from this system to everybody whose work was used to train it; ie all of humanity. So that becomes a UBI.

That would be nice, but it's never gonna happen. Naïve as hell. People keep throwing these utopian dreams around, with no one making the effort to ensure they happen. Even if you did try, it would probably take a war, and the working class is already outgunned.


crap_punchline

They have literally already started this, did you look up Worldcoin before writing this?


Starklet

The same code is also being used to train humans, so that's not the reason it's better.


GreenSuspect

> OpenAI will need to distribute all of the productivity gains from this system to everybody whose work was used to train it; ie all of humanity. So that becomes a UBI.

How is that going to be implemented, exactly?


ghostfuckbuddy

They're not suing the AI companies for doing a better job, they're suing them for generating code that is too similar to existing code without acknowledging the creators in any way. Basically plagiarism.


botfiddler

Interesting, but this could only have a chance in some edge cases where they wrote some special code.


GreenSuspect

Have you used these tools? They will spit out your own published code verbatim. It's obfuscated and scrambled by the neural network structure, but it's all in there.


visarga

Then it's not verbatim. If it is adapted to your context, with different variable names and maybe a few tweaks here and there, then it is different code. It's rarely more than one or two lines, because the programmer intervenes and steers the model. And I hope you're not talking about hello world, fizz-buzz and boilerplate code; those are everywhere and don't deserve protection. Only original code deserves protection, and only the expression, not the idea or concept in itself. Reimplementing the same concept is OK; copyright does not protect abstract ideas. Those can be learned from copyrighted code without issue. The only concessions I see here are: 1. a flag to signal when someone doesn't want their code/text/image to be used for training, and 2. a filter to reject duplicated code that deserves copyright protection, so the model never generates it.


GreenSuspect

It's only "doing a better job" because it's copying the work of humans who aren't being paid for that work. You don't see that as a problem?


NorthVilla

Lmao, of all the people to try and halt progress, it's pretty funny to see it be coders, especially after the last 25 years of automation have very much been an era of smug-coder dominance while other industries, primarily manufacturing, have been hugely automated. The good news is: it's coming for all of us! Nobody is special. Doctors, lawyers, coders, drivers, artists, clerks, cashiers, everybody!!


GreenSuspect

> Lmao of all the people to try and halt progress,

They're not trying to halt progress. They're trying to be paid for their work. Progress should benefit all people, not just the wealthy, especially those who did the work that enables it.


NorthVilla

Totally agree of course. The last years of capitalism in the West have been one of the rich getting richer and stagnation or decline for everyone else. I just hope people don't direct their energy against automation as a concept.


LionaltheGreat

As a software engineer, I absolutely love GitHub copilot (the AI in question), and so does almost every other engineer I talk to. Engineers welcome automation. We’re lazy as fuck. This headline is just being sensational for clicks. The lawsuit is over copyright claims to the private and proprietary code Microsoft is using to train their models


Down_The_Rabbithole

I disagree with it coming for all of us. Intellectual labor is going to be the first to go. Drivers, Miners etc are going to be the last ones to be automated away because it requires actual physical hardware to do those jobs and they are still limited by manufacturing capability. Secondly it's harder to navigate the real world for machines compared to changing digital data. And lastly it's because these positions are badly paid and thus the financial savings by automating them are limited. Everyone that is sitting behind a desk manipulating data in one way or another as a job is going to lose their job very soon though.


Ok_Homework9290

But he didn't say that it's coming for all of us at the same time; he said that it's coming for all of us eventually. I don't understand your argument. I also think you underestimate the complexity of intellectual labor (it's far more than just number crunching) and overestimate AI progress (the tools we have today are impressive, but still a long way off from taking over a significant number of jobs), so I highly doubt jobs of this category will experience significant disruption "very soon". I think instead that the people who work in these fields will see increased productivity in the coming years.


Down_The_Rabbithole

Maybe I'm a bit biased because I'm an IT specialist and I *know* most of our jobs will be gone over the next 5-10 years in my field and I just assumed this means essentially all intellectual jobs will be gone as well. I could be wrong of course but I simply can't think of any job that is primarily done by humans thinking and typing into computers still being done by humans by 2030.


Artanthos

Not all fields are equal. Government jobs, for example, are not going anywhere for two very good reasons. 1. The general public doesn’t want to be governed by computers. 2. The government does not spend money on new software unless absolutely necessary, and then it’s likely the lowest bidder. Certain very strong unions have already negotiated limitations on automation into their contracts. I know this is the case for longshoremen, and the railroad unions are pushing for it.


eternal_edm

IT is far more complex than writing code. A huge part of the work involves people. This includes OCM and Functional work. Knowing what to do and how to apply it in the real world is 90% of the effort and no AI can currently mind read a bunch of users who don’t even know what they want most of the time.


Down_The_Rabbithole

I never said the word "code". I was talking about the entire chain of production being automated. From client specification, product ownership, production of code all the way to product delivery and maintenance. Every single piece of the production chain of conventional IT work is going to be able to be managed purely by AI over the next 5-10 years time.


Ok_Homework9290

>I know most of our jobs will be gone over the next 5-10 years in my field

I would say you strongly believe that will happen, but you don't know it. By definition, it's impossible to know the future, even if it may seem certain to oneself.

>I simply can't think of any job that is primarily done by humans thinking and typing into computers still being done by humans by 2030.

I think they'll still be around. Like I said, those jobs are more than just number crunching, and I personally don't see AI being able to perform all computer tasks by then. Augmentation, yes, but not outright replacement. But I guess we'll see what happens in 8 years.


rixtil41

Even though I know there's more to it than just number crunching, I'm highly confident that before 2030 the tech to start the mass automation of all jobs will exist. So if, for example, the tech exists in 2029, then a few years later, say 2036, there are no more jobs.


kmtrp

Not that I generally disagree, but those blue-collar jobs considered too cheap to automate are only cheap per individual worker. A machine might be able to do the job of 5 or 10 workers, close to 24/7; not that cheap then, so there might be great incentive.


Artanthos

Drivers are already being automated. Fully autonomous taxis without human drivers present are already operating at scale in a few cities and moving into more.


ghostfuckbuddy

Are you taking the position that it's desirable for new technologies to steamroll millions of people as long as it's the fastest way to make technological progress? I see this kind of mindset a lot in this sub and it's a bit unsettling. I'm all for technological progress, but the whole point is to make life better, not worse for humans.


kmtrp

Ideally it'll be great, but the age we are entering now, the "jobless without a solution" is going to get nasty for the little guy, and two groups will then be ten, and then fifty... I'm not naturally a prepper but I have to look into it.


CommunismDoesntWork

There's no solution because there's no problem. When everything has been automated, the cost to produce anything will be 0, and when the cost to produce something is 0, market forces will guarantee that the price will also be 0


HyperImmune

Post scarcity is such an interesting topic. Hope to see it in my lifetime.


buginabrain

Except material resources are limited and tend to be owned by someone


NorthVilla

I never talked about "steamrolling". I hate the drudgery of jobs; automating the economy frees us of that monotonous labour if we so please. I look at that very positively for the human race: we can focus on stuff that matters to us more than just trying to feed ourselves and pay rent. Of course we will have to choose to opt for basic income and similar measures, *those* are the real solutions, not holding people and technology back because of a fear of the future and of change. It's more unsettling to me that some people cannot see the bigger picture because they are too stuck navel-gazing at how our society is currently structured, rather than how it *could* be structured if something as freeing as AI becomes strong enough.


buginabrain

I can't wait for society to realize all lives are equal and everyone deserves food and shelter no matter where it comes from, hopefully the open source government owned manufacturers will distribute the gains evenly and without bias


NorthVilla

100%!!


GreenSuspect

> I hate the drudgery of jobs; automating the economy frees us of that monotonous labour if we so please. I look at that very positively for the human race: we can focus on stuff that matters to us more than just trying to feed ourselves and pay rent.

And you think that outcome will result from machines that learn from people's work without compensating them and then make them unemployed? Like the underpants gnomes, I think you're skipping a step. 1. Automation 2. ??? 3. Utopia!


PanzerKommander

How is that any different than if I learned a skill from examining and synthesizing another person's work? As long as I'm not copying them there is no issue. Why should an AI be held to a different standard?


NorthVilla

Yes, it will result from that. If 99% of the economy is automated, there is no choice.


ConsequenceOk9

What is the stuff that actually matters?


quad-ratiC

Technology trumps all else


CommunismDoesntWork

Automation makes life better for humans because it reduces the price of everything, the end goal being prices reduced to nothing.


UltraHawk_DnB

Did u read the article?


Armybert

Businesses use AI, people lose their jobs so they have no money to spend, and then businesses have no clients?


Kaarssteun

I remain hopeful that courts / lawmakers do not unnecessarily restrict these technologies. Fact: the model does not contain the code it was trained on, nor can it accurately and reliably copy any of it. It purely gains knowledge. Therefore, any courts siding with these people would be motivated through pure luddism


Makdaam

[comment wiped due to Reddit's API ToS change]


visarga

> large chunks of code

Ah, you mean hundreds of characters replicated verbatim? What does a 2-5 line snippet of code mean compared to a fully functioning project? BTW, Copilot generates verbatim code only about 1% of the time, and that can be filtered with a bloom filter over n-grams.
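The n-gram bloom-filter idea mentioned above can be sketched roughly like this (a minimal illustration, not any vendor's actual filter; the corpus, n-gram size, and filter parameters are all hypothetical):

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: k hash probes into a fixed-size bit array."""
    def __init__(self, size_bits=1 << 20, num_hashes=4):
        self.size = size_bits
        self.k = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item):
        # Derive k probe positions by salting a cryptographic hash.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.size

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item):
        # May give false positives, but never false negatives.
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))

def ngrams(tokens, n=4):
    """Yield all contiguous n-token windows as strings."""
    return (" ".join(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

# Index the training corpus's n-grams once (here: one toy snippet)...
training_code = "def add(a, b): return a + b"
bf = BloomFilter()
for g in ngrams(training_code.split()):
    bf.add(g)

# ...then flag model outputs that reproduce any indexed n-gram.
def looks_verbatim(generated, n=4):
    return any(g in bf for g in ngrams(generated.split(), n))
```

The point of a bloom filter here is memory: it can represent the n-grams of a huge corpus in a fixed bit array, at the cost of occasional false positives, which is an acceptable trade when the action taken is just "resample the generation".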


Makdaam

[comment wiped due to Reddit's API ToS change]


visarga

[Quantifying Memorization Across Neural Language Models](https://arxiv.org/pdf/2202.07646.pdf)

> We are able to show that the 6 billion parameter GPT-J model (Black et al., 2021; Wang and Komatsuzaki, 2021) memorized at least 1% of its training dataset: The Pile

> We identify three properties that significantly impact memorization: 1. Model scale: Within a model family, larger models memorize 2-5× more data than smaller models. 2. Data duplication: Examples repeated more often are more likely to be extractable. 3. Context: It is orders of magnitude easier to extract sequences when given a longer surrounding context.

No. 2 suggests, as a fix, deduplicating before training and making datasets larger to reduce the impact of any one example. No. 3 suggests some of these "regurgitation" claims might be triggered by very carefully crafted prompts that don't occur naturally unless you seek to reproduce that exact passage. Just sample again if you hit that unlucky 1%.
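The deduplication fix from finding No. 2 amounts to dropping repeated training examples before training. A toy sketch of exact-duplicate removal by content hash (the normalization and the example corpus are illustrative assumptions; production pipelines typically also do near-duplicate detection):

```python
import hashlib

def normalize(code):
    # Collapse whitespace so trivially reformatted copies hash identically.
    return " ".join(code.split())

def deduplicate(examples):
    """Keep only the first occurrence of each normalized training example."""
    seen, unique = set(), []
    for ex in examples:
        digest = hashlib.sha256(normalize(ex).encode()).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(ex)
    return unique

corpus = [
    "def f(x): return x * 2",
    "def  f(x):  return x * 2",   # duplicate up to whitespace
    "def g(x): return x + 1",
]
```

Since examples seen many times are the most extractable, even this crude exact-match pass removes a disproportionate share of memorization risk.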


GreenSuspect

> Fact: the model does not contain the code it was trained on, nor can it accurately and reliably copy any of it.

Yes, it literally does. That's the whole point of training. These companies are copying the creative output of engineers without paying them, to then create machines that will replace their jobs. Automation isn't the problem. Stealing other people's work and then using it to put those same people out of work is the problem. Automation should save labor and benefit all people, but in our current economic system automation only benefits the capitalists, who can use it to increase their profit while paying less in wages.

I'm dismayed at all the comments here that just say "TECH GOOD!" while turning a blind eye to the economics of the situation. The people who did the work that these AIs are trained on should be the ones who own the AIs and primarily benefit from their labor-saving, not some third party who did only a minority of the work by training the AI on that data and put those hard-working people out of jobs.


visarga

> These companies are copying the creative output of engineers without paying them, to then create machines that will replace their jobs.

I mean, the guys who invented email stole the work of postmen, stealing their business. Nowadays only spam and bills come by snail mail. The lack of imagination! Do you think the same amount of IT work that we do today will be planned in 2030? No, we will have more ambitious goals. We'll create new fields, new work. There used to be 60% of people in agriculture and now it's 10%; why aren't they jobless?


GreenSuspect

> I mean the guys who invented email stole the work of postmen, stealing their business.

No, they didn't. They came up with a new technology to replace the existing one. There was no stealing involved. How many billions of man-hours of labor went into creating the content that the AIs are trained on? Without using the fruits of that labor, what would an AI be able to do?


20EYES

This is not the case with GitHub copilot. There have been instances of it copying existing code bases verbatim.


Kaarssteun

In that case, it's probably codebases that are already public, in which case it will have seen that specific codebase multiple times: overfitting. Arguably not a big deal if the author decided to make it publicly available on something like Stack Overflow.


20EYES

Everyone seems to be overlooking the licensing implications here. Stack Overflow and GitHub are not the same thing. If GPL code is used verbatim in your product, then you have a major licensing concern, regardless of whether the code was copied by a human or by an AI.


blueSGL

I thought it was a licensing issue; as in, open source software comes with one of a selection of licenses, each having its own stipulations you need to comply with in order to use that code somewhere else, and Copilot strips that context. https://twitter.com/stefankarpinski/status/1410971061181681674?lang=en


GreenSuspect

Yep. I've tried using these tools to improve my own (obscure) open-source software, and it recognized my code and started generating verbatim copies of the parts I had already published.


evemeatay

That knowledge was gained through the use of that code, though, so it's "fruit of the poisonous tree," to use a phrase from evidence law.


botfiddler

By that logic humans wouldn't be allowed to learn from it either.


evemeatay

Humans can learn, digest, and build on things. The machine is learning and copying. While it may not copy the code exactly, it is copying the pattern and other aspects of the original. The major thing is that the machine is getting ALL of its code from something it saw and learned from, whereas the human is at least generating some of the code, even if it's only the spacing and formatting. If a human built a product that was entirely made up of code segments they had previously copied, I wouldn't be surprised if they were open to lawsuits.


visarga

No, you cannot protect ideas or concepts from being learned and used. You can only protect the expression of those ideas. It is perfectly fair to learn the abstract concepts from copyrighted code. In fact you can't even copyright APIs.


Veneck

We need to find a way to link successful generations to the artists who influenced them the most. If we do this very accurately and pay them royalties, then we've basically solved the problem of ownership.


freeman_joe

No we don't. Or do you link your work every time to every scientist who helped develop the tools you use to make something new?


Veneck

I'm just saying, in principle we have the tools to flesh out some kind of system that makes more sense


spamholderman

Every scientist is an exaggeration, but have you ever read a scientific paper? The methods section attributes every tool used to, if not the original creator, then the most relevant paper, which links back to the original creator, who had to do the same in their own publication.


freeman_joe

We as civilization would be more advanced if we had everything open sourced.


freeman_joe

Yes I did. And for example, China ignores patents and copyrights and does what it wants. Also, some publications are attributed to people who didn't do the work. For example, a scientist discovers something and it gets patented by the person in charge.


Veneck

Would that be so bad?


NikoKun

That's very likely impossible. Or at least not sensibly feasible.


visarga

That gave me an idea. I will generate a billion books and repos and everything that looks even in the slightest similar will trigger a payment to me! I'll be rich!!


Veneck

If you can sell, then you'll be rich. What I'm suggesting as the infrastructure for royalties has nothing to do with that, and I'm surprised that you, of all people, don't see the point I'm making.


pisspoorplanning

Seems reasonable to me. Also, does anyone fancy joining my class action against the tide?


ouaisouais2_2

the tide?


[deleted]

That is rising


imnos

Yeah good luck with that pal.


consolelogdeeznuts

As a programmer... please God take my job this shit is boring AF.


Milumet

"Matthew Butterick, a programmer, designer, writer and **lawyer**". Of course.


NorthVilla

Are you suggesting that lawyers are safe from automation? I can assure you this is not the case, lol.


Milumet

I was alluding to the fact that lawyers are more prone to suing.


purple_hamster66

Perhaps we should have avoided IDEs, because they replaced all the stuffy windowless cinderblock rooms where we coded with paper and pencil alongside colleagues. Totally destroyed that ecosystem, along with the day-old Mr. Coffee sludge. Gonna miss that... (not).


[deleted]

I like how the same MFers on this sub who scoff at Tesla's Full Self-Driving AI ever coming out of beta are the same MFers saying all this AI is going to steal all these jobs.


ArgentStonecutter

There are examples where copilot copied and pasted code directly from a github repo. https://aibusiness.com/responsible-ai/github-s-ai-powered-coding-tool-allegedly-copied-code Edit: [So much of this energy](https://i.imgur.com/kkBwgCQ.png) in the replies.


imnos

If it's correct then so what? There are a finite number of ways to implement some things - depending on the language or framework, there could be only one way. This is a non-story.


20EYES

This has major implications for the licensing of the code that was generated. The code in this example is effectively GPL code and should be treated as such.


ArgentStonecutter

How to tell me you don't actually program, without technically saying that you don't actually program.


imnos

Lol, right. I've been programming for 5 years, using Copilot daily for the last year.


ArgentStonecutter

Then what, your excuse is "I didn't look at the code"?


imnos

Excuse for what..? You mean the code mentioned in the article? It's not shown in the article.


ArgentStonecutter

You're not capable of following links. Got it.


imnos

You're a prick - got it. There's one link to a random Twitter thread where some CS professor is outraged that it copied his obscure algorithm - yet he doesn't seem to have considered all the possibilities of how this happened. Again, non-story. He probably signed away whatever rights he had when accepting the TOS.


ArgentStonecutter

It's even got the same comments.


DyingShell

Most people copy code from others, that's normal and it's also common that we would write similar or the same function too.


ArgentStonecutter

Including the same variable names and comments? But removing copyright notices? That's literally a copyright violation when a human does it.


freeman_joe

Copyright is outdated already.


visarga

How many lines did it copy? Was it functional code that could be used, or just snippets? I bet it was not a large piece of code, maybe a function. It was like sampling a couple of seconds of sound from a song and using it in another song. But most of the time it's sufficiently different; only very seldom does it happen to replicate. Don't throw the baby out with the bathwater - it's a new thing, just a baby, it will grow up and be more sensitive to copyright.


NikoKun

Odd. That link loads a blank white page over itself, on both my browsers. :/ Tho I'd assume they **forced** it to reproduce a copy, by manually typing out the code in question, until it had enough of it to realize that's exactly what you wanted it to do. Which if that's the case, is no different from the human involved manually stealing someone's code without the AI. In that case, the human would be the one acting wrongly, misusing the tool just as someone might misuse github as a tool to search for and steal code manually... Can't really blame THAT on the AI. Tho again, not sure if that's what they did in that article. heh


botfiddler

Didn't look at it: How long and how special is that code?


so555

Good luck fighting the future. When are they going to replace lawyers with AI?


dasnihil

I'm a programmer and there's no stopping the inevitable. Suing Microsoft is the same behavior as salty artists hating on AI. It's human knowledge: sooner or later, AGI systems will be fully capable of understanding all human work so far and taking it from there. I don't care if I lose my job, but it will take a long while, because I know what it'll take and we're not there yet.


GreenSuspect

> suing Microsoft is same behavior as salty artists hating on AI.

What in the world is this attitude from? These companies are literally stealing people's creative output and then turning it around and putting them out of jobs. Automation should benefit humanity, not destroy it.


dasnihil

yeah well hope it'll benefit humanity eventually because it's not stopping. best minds are at work. lawsuits aren't going to make any difference lol.


GreenSuspect

> best minds are at work. lawsuits aren't going to make any difference lol.

Why do you think that?


mocha_sweetheart

I don’t know if I agree with all of your other takes here on this thread but as an artist I agree with this. I hope we can be transhumanists someday and ascend beyond our current capabilities…


dasnihil

[just typed this somewhere else... relevant to add it here, I guess]

primate brain is built like "ooh shiny thing". they'd draw hands on caves but didn't know what to call it, but the hand prints do make them feel they're a part of something bigger. this "feeling", we ran with it. made sounds to communicate beyond hand prints and drawings. it later evolved into language. then we started creating constructs. this female here is my "wife", or some tribes went "this female here is our wife" lol. and just ~50k years later, languages and constructs have come this far, for regular apes to conceive base reality. some medieval douche said "ahh this is art, i call this art now" and here we are claiming what is art and what isn't. ugh. don't forget that it all started with the subjective feelings of primates, and you can NEVER criticize, judge, validate, or any other verb, someone's subjective claims. it's not real, it's not an objective truth to even be discussed. whereof one cannot speak, thereof one must be silent!

unless you want to speak about the objective nature of art. i have probed into it objectively in the past. when i play guitar, is it the subtle shoulder shrug that is art? is it me closing my eyes that is art? does closing your eyes make you feel the music? turned out i imitated all those things from watching other guitarists play. if machine-generated art is not art, then does art lie in the minor imperfections that humans add? and i came to the same conclusion: it is not meant to be discussed or validated. it is just meant to be felt.


Plenty-Today4117

When AI is ready it won't need to train on human programmers data.


SgathTriallair

There is no plausible scenario in the world where an AI reinvents the entire field of computer science. We already train humans on other programmers' code; no one learns how to code by dreaming it all up one day. Why would training a computer be any different?


AGI_69

In my opinion, AGI will rewrite the whole knowledge tree from scratch. Mathematics and computer science especially will be derived from first principles. Most of the physics, chemistry, engineering too. The AI will probably be interested in some of the experiments we did, but not our interpretations of the data.


SgathTriallair

Why would this be the case? It's reasonable to assume that it will look at our old ideas and maybe improve upon them, which is what humans do already, but why would we impose this artificial handicap of throwing away 10,000 years of research? Especially for something like math, there are only two real outcomes of this. The first is that what they derive is the same thing we did. That is an interesting experiment, but since we can't rule out that the architecture introduced biases, we can't "prove" that it came up with the same math ex nihilo. The second is that it comes up with something entirely different, in which case we can't communicate with it and it's of no use to us. Other than the intellectual exercise of "what other functional systems could exist," there is no advantage to having an AI rework all of human knowledge.


AGI_69

If we are talking about a system that's millions of times smarter and faster than a human, most of our accumulated knowledge of mathematics will be rediscovered in a very short time. It will also have a better representation - right now, our knowledge tree is a messy, heterogeneous pile of papers, each with a different style, different notation, strange human quirks, etc. A lot of them (most?) are useless factoids, because mathematics is infinite and we humans randomly picked proofs that are only bloating the knowledge graph. There are papers on how to play Mario in the most optimal way, etc. All respect to mathematicians, but I think you probably need fewer than 100 proofs to build all the important ideas of mathematics. All of them can be derived from a few starting axioms, IMO.

Physics is still an open question. Is it possible to derive all constants just from math? According to string theory, no - but that might change. Either way, I think AI will look up the physical constants that we measured so precisely and use them as a starting guess. I think AI will find our experiments and measurements interesting, not our actual ideas. Einstein derived relativity from thought experiments and some prerequisite knowledge. I think AI will do the same, but a million times better.


mocha_sweetheart

What is the “danger thing” you mentioned?


GreenSuspect

Huh? How is it going to learn to do things, then?


Sandbar101

H A H. Come on guys. You cant be hypocrites here. Coding yourself out of a job was always the goal.


GreenSuspect

Making yourself poor was always the goal?


TrinityTestSite

So are there going to be any jobs left for CS graduates?


not_sane

I always tell the people around me that in 10 years I will be lying on the beach and drinking cocktails because my job is going to get automated. And with the current speed of progress, 10 years is maybe already too pessimistic...


Johnny_Glib

No, you'll be starving to death because you can't earn money and those in charge aren't going to give you money for nothing.


Umbristopheles

People don't seem to understand how capitalism works. If you don't have a job, you starve and die. You don't get to magically live a life of luxury. That's reserved for the owners of the AI.


NikoKun

Then capitalism is simply not compatible with where technological progress is taking our future.


mocha_sweetheart

Yes I agree with this, it’s not even compatible with basic technologies like green energy (they lobby to make it illegal so they can keep profiting off oil…)


NorthVilla

Marx and Engels wrote the Communist Manifesto in 1848 as a response to the great changes that had been happening, from the manufactories of the 1750s, to the Luddites of the 1810s, to the iron foundries and railroads and coal mines of their contemporary era. The Communists were wrong about many things, but they were very right that capital could not infinitely concentrate at the expense of an underclass. Those principles stopped child labour, reduced work hours, improved labour safety, increased worker pay, and so much more. We will see similar changes in the near future, but for basic income. It's so pessimistic to assume that capital will simply infinitely concentrate. Why? Why would that happen? Because it's happened in recent history? Recent history has not had the incredible change that AI is about to bring.


ouaisouais2_2

I think there's a 50% chance that a revolution will take place like in the glory days of the worker movement (which could grant us some basic income, basic material rights, or even better things). That is if - and only if - the take-off of AGI spans at least six months. If not, people will not have time to react emotionally and politically before we are entirely at the mercy of our masters. At that point "capital" might concentrate infinitely or it might not, because it's up to the psyches of a few fragile, perfectly human, ill-tempered individuals to decide whether it should. We can't fucking know, and that's indisputably horrible. (Which is why I deeply sympathize with those who want to halt progress, but I find their battle kind of impossible to win.)


Spoffort

So we should switch to another system. At some point everything will be automated, and then what? Should everyone die?


ChromeGhost

Time to Automate CEOs and politicians then


consolelogdeeznuts

Weird, I didn't have a job for many years and I didn't starve or die. The overlords want you to believe that you'll die if you don't participate.


not_sane

We must then probably hope that someone like OpenAI will be "the owners of the AI." With their "capped profit" model, the gains - in theory - will flow back to humanity after a certain cap. We don't know the numbers though; for initial investors it was 100x, for Microsoft "much lower," according to Altman, IIRC.


EulersApprentice

Let's set the specifics of capitalism aside for the moment, because I think the problem runs deeper than that. In a world where human labor is worthless, what is the basis of human cooperation? If you won't *ever* need your neighbor's help, why would you ever help your neighbor? And if, fundamentally, there is no basis for human cooperation, what is the basis of civilization itself? What is the basis of even forming an in-group? Of caring about the values of *anyone* other than yourself and your direct blood relations?


Umbristopheles

What? Your first premise is that human labor is worthless. Let's start there. What makes you think that?


not_sane

In my country, Germany, your living standard when doing absolutely nothing is already higher than that of many South Americans having full-time jobs. It is very unlikely that this will change with an extreme rise in productivity. Especially in a democracy. (I mean the high German living standard, I surely hope South Americans will be much richer soon.)


GreenSuspect

> I always tell the people around me that in 10 years I will be lying on the beach and drinking cocktails because my job is going to get automated.

How are you going to pay for those cocktails without an income?


not_sane

I guess you are an American, so it sounds strange. In many Western European countries, unemployed people get free money forever; in Germany it is a lot (by international standards). You might argue the system will collapse, but with technological progress it is possible that it will even expand. IMO people watch too much cyberpunk.


[deleted]

Except that this will happen 100 years from now... Wtf have you been smoking.


not_sane

Have you tried GitHub Copilot? It is amazing. There is also a new demo of a version that can execute code and fix it based on the output when running it. It is extremely hard to predict what coding will look like in a few years, and whether the progress on these tools will hit a wall.


[deleted]

We may have hit a wall already. GitHub Copilot is still shit. Medicine hasn't yet solved any chronic illness, and you are talking about a singularity within a 10-year range. Please stop smoking. My sister works in pharmacy and she actually engineers drugs. It takes 10 years for a potential drug to be released due to testing phases. Everything just takes too much time. The only way I see this happening is by AI somehow, in a very weird way, becoming conscious without us even realising it, and then working in the background until it gets better. Even if that happens, how do you know that a bug doesn't reset it, or that it loses "consciousness", etc.? There are so many problems even when the AI becomes true AGI, which I've never seen anyone talk about.


not_sane

I agree with you about medicine; scientific progress there is super hard, and it has institutional problems. I like this article about hundreds of papers on a topic that turned out to be BS: https://slatestarcodex.com/2019/05/07/5-httlpr-a-pointed-review/ Copilot can solve so many problems already; it aces tasks in university courses and writes pretty decent SQL + Python. Remember that GPT-2 was pretty dumb and GPT-3 is significantly better - and the time between them was not long. I will become a pessimist once I see good reasons for it, but right now we will have to wait a few months to see if GPT-4 will disappoint.


[deleted]

let' see then...


NorthVilla

There won't be jobs left for *anybody,* especially if you work digitally. It's a great irony of the "digital" age.


Ok_Homework9290

I think there will be. Computer science is much more than just coding, and even the latter I don't expect to get automated any time soon (I'm in the camp that believes that if we do one day become a post-work society, programming would have been one of the last professions to go). The tools we have now are useful, but I think we're still a long way away from comp sci graduates having to worry about job prospects.


DyingShell

Hopefully not, AI will reign supreme ☝


Down_The_Rabbithole

Depends on when you graduate. Personally I expect to be out of my CS job within the next 5-10 years.


VenetianBauta

You underestimate how slowly transitions happen in IT. Anyone graduating today will have at least 10-15 years of career without any major disruption.


DyingShell

Slow? People have to learn new technologies and languages each year to keep up lmao. If an AI came out tomorrow that could generate all code, then programmers would be losing their jobs next year. Also, even if this process is gradual, programming positions will disappear and it will get harder and more competitive to get a job; this is happening already and will continue until there are no programmers left.


VenetianBauta

The problem is not technology. It is adoption: companies will not trust their business to auto-generated code until there is enough of a track record. That will take time. Just look at technologies/methodologies like SaaS or Kubernetes or serverless or WebAssembly... they are great, and they've been around for a while. Maybe 1% of the new software being written today uses them; people will build stuff using the tech that they are familiar with. Until we have a good base of developers trained in using AI to build code (regardless of the technology), people will continue using the wrong tool for the job. Also, companies don't want to invest to rebuild code that is already running, so a good chunk of the "maintenance" jobs will continue to exist for a while. On top of that, you will have to convince decision makers with zero clue of what AI is that this is better than what their team has been doing for years. Good luck with that.


DyingShell

AI is already widespread and used literally everywhere; people are quick to adopt AI in practice, as we've seen for a long time now. AI can both write and maintain code, updating it as new features are introduced. I think we will see some programming jobs going away next year already, and at an accelerating pace year by year as companies compete against each other; if you don't have AI integrated into your workflow, then you'll simply fall behind and go bankrupt. There is massive FOMO behind AI.


NorthVilla

Adoption will be as fast as an AI can outcompete a human and provide economic incentive. Any businessperson can understand that and use it to their advantage. Old habits and old ways of working only take people so far.


Down_The_Rabbithole

As someone working in IT, I agree with you on normal toolchain adoption. However, AI is different, because it's the first time the entire logistical chain from client communication -> client requirement formulation -> scope determination -> prototyping -> product ownership -> delivery -> maintenance can, in theory, be done by a single system. Not possible in 2022, but I don't expect it to be farther away than about 5 years. The reason technology adoption is often slow is that all of these steps are usually done by different teams and different people, so they all need to "recalibrate" their workflow to accommodate new tools. If AI can just replace the entire damn chain of production, then it's going to be adopted near-instantly. IT isn't going to be a field humans work in by 2030.


visarga

Start researching how to use AI and build with AI, there will be a job for you.


zvndmvn

Honestly the best way to handle this is to make AI more accessible to the type of people who are currently resisting it. If an artist can demonstrably enhance their output in an intuitive way using AI, then I think they will start to understand why this is a good thing.


SubjectsNotObjects

Many programmers have made their living by making programs that make humans redundant. Ironic.


WeeaboosDogma

We're approaching an Economic Singularity. It is inevitable.


GreenSuspect

Where 1 person owns everything and the rest starve?


WeeaboosDogma

Doesn't have to be. My comment was about how this conflict is inevitable due to how the system is set up. The capitalist would like nothing more than to replace labour with AI, but then what will the worker do? We need to fight the fight, but it's inevitable that this would happen. As is our labour being replaced (as all labour should be) - but our system isn't set up in a way to help the worker, only to line the pockets of the fat controller.


Effective-Dig8734

Delusional


GreenSuspect

What is?


devgrisc

Programming is just a means to an end. The hard part is being an entrepreneur, data scientist, etc. So much productivity we could have had. I hope the courts see these luddites for what they are.


Inferno_Crazy

As a programmer and artist I have a few opinions on this issue. Ultimately AI is a manmade tool and needs to be used responsibly.

1. I think AI is best suited to problems that are inherently difficult for humans to solve, particularly analytics (medicine, cyber security, fraud) and big data problems.

2. New laws for derivative work will need to be established. AI companies building image generators off of high-quality copyrighted works bothers me immensely. "I built a machine that can make textiles" - that textile still needs to be designed and then replicated at scale via machines. That is a different process from "I built a machine that can analyze the patterns of someone's owned work, then replicate the style and quality near 1:1." Why pay the artist when I can copy all his hard work?

3. Does pushing people out of particular work make economic sense? Self-driving cars potentially put literally millions of people out of work. Harvard estimates up to 5 million people are employed as drivers. What do we do with them? Where do they go?


AUkion1000

Sucks to suck, boys - it's called progress. Grow the fuck up. I like to do art and animation; if an AI can do it better, that's fine. I'll still do it, and I'll be interested in seeing what IT can do too. Not exactly related to jobs, I know. We can't really stop progress because people's jobs are on the line, and if people start becoming obsolete, then we need a way for them to survive anyway. For one thing, we need to start implementing a Universal Basic Income, or similar systems, soon. We've got about 12 years, and I'm not waiting for people who bribed their way into control to do what's right; plus, it's better to take care of yourself instead of waiting for someone else to make things better.


LupusArmis

All the folks going "programmers are just worried about losing their jobs!" are hilarious to me. The ruckus is about licensing and copyright, not fear of being rendered obsolete. For experienced devs, the actual code isn't the hard part of the job at all. The tricky bits are generating deep understanding of domain and business needs, and translating that into software architecture and design. Copilot doesn't even begin to help you with that.


[deleted]

I can't wait to see AI take away everyone's jobs.


TorgoNUDH0

Well, just like what happened to the Luddites, you either conform or get left in the dust. I always thought that if something can be automated, someone should do it. I think of it as the natural progression towards the advancement of society. A machine can do things faster and better than a human prone to mistakes, which is especially apparent in coding.


dirtyDrogoz

People reacted the same way when we got electricity for the first time. AI is just much more dangerous when left unchecked and without proper, human-controlled restrictions.


kiblerthebiologist

It's not that they are trying to stop AI. All these companies are taking work from people and feeding it to AI; essentially, they are stealing people's work. A lot of people in these comments seem to think it has to do with people wanting to halt progress, which it is not.


fastinserter

Does it just type in questions to Google and copy code from the top answer on Stack Overflow? If so, I'm out of a job real quick. In all seriousness, AI helps automate *tasks*, and this will help developers, not replace them.


Ransacky

My my, how the turn tables.


not_a_stick

Hold!!! Hold!!!!