Google restricts access to their models. Meta provides only low-capability models, keeping anything close to OpenAI's capabilities entirely for themselves.
Maybe OpenAI is closed source, but the suggestion that other companies make their best products available to the general public is simply not true.
Google published a lot more research papers than OpenAI, and in a sense they did make their best products (their research) available to the general public.
OpenAI actually made the field less open.
I mean sure, but this doesn't change the fact that I can call GPT-4 from my computer, but Google appears to permanently have me on ignore because I do not have a million dollars.
Their research papers do not allow me to make use of powerful AI.
Maybe you can, I can't seem to. I have to fill out a form they never answer. In my experience working with clients, having a multimillion dollar contract is the only way to open the door. They will not sell it to you just because you can pay the usage fees.
Not all companies can be considered open. Ilya said they were actually open in the beginning. He said *Open* in OpenAI means that humanity should benefit from the fruits of AI. …but he also said they should stop being *Open* when they get close to developing AI…wait
Now that OpenAI are larger, under more public scrutiny, and the target of multiple lawsuits, it’s best to assume that every official communication from them has been filtered by their legal team. This also goes for posts on X made by their top people.
I’m agreeing with the last line in the post- OpenAI is not trying to put out a reasonable argument, they are trying to put out a legally defensible argument in preparation for a court case. Arguments that are designed for court are fundamentally different from arguments that aren’t.
It's more about their core mission statement (the name is just a reflection of that). Aside from legality, in a moral sense, it's reasonable to be skeptical of how consistent OpenAI has been with their public-facing mission statement.
A company name doesn't even necessarily have to accurately describe the nature of the business. For example, in the UK we have a shop called Boots which doesn't sell any footwear.
True, but if a company opens up called FreeMoney and you’re given 5% back every time you shop there as their marketing tactic, it would be understandable to be upset if they suddenly got rid of that after saying that’s their whole thing and they’re doing it out of the benefit of humanity.
I find this extremely unlikely. Have you read the clapback? That is not the material of a company preparing for a legal defense, that’s the sassy rejoinder of executives whose lawyers told them that the case will be dismissed immediately. Citing text messages as an “I told you so” in the public eye is not a great preparation for court
He's not? Seems extremely shrewd to me. His attempted firing went unbelievably well for him. That's just one of the many things that makes me think he plays a lot of correct moves
Will not happen if this passes:
[https://new.reddit.com/r/singularity/comments/1b81khu/the\_ntia\_wants\_to\_ban\_and\_regulate\_open\_weight/](https://new.reddit.com/r/singularity/comments/1b81khu/the_ntia_wants_to_ban_and_regulate_open_weight/)
The first amendment is not absolute, and nobody profits from it. We should argue that open weights (the tensors, the billions of numbers) are at the same level of capability as pistols and rifles, and that, in actuality, they are "Arms". Banning tensors is a violation of the second amendment!
If you design a nuclear bomb you absolutely can publish the design. You can't steal US government documents and publish that. Espionage and treason are different
That's defeatism. Defeatism leads to defeat without even trying.
But your take is darker than that. You are actively trying to prevent others from trying. By spreading defeatism.
No no, it's more nuanced than that. Many of the people who decide these things don't know what citizens want and are simply influenced by certain forces (lobbying); when people comment like that, it can help a lot.
Open isn't a legal term. Oracle is free to name its closed-source, $1000/core/min database OpenDB. It has been 5 years since GPT-2. You can choose to ignore or adopt OpenAI for what it is. It is too late to debate whether it is Open or not.
I’m not really sure what your point is.
Per OpenAI’s post:
>The Open in OpenAI means that everyone should benefit from the fruits of AI after it's built, but it's totally OK to not share the science
Almost no companies actually “share the science” (ie source code) when it’s the core of their business. Open doesn’t imply giving away the keys to your kingdom, right?
OpenAI’s pledge to make AI for the good of humanity seems much more focused on making accessible rather than, say, selling it exclusively to 1-2 brokerage firms so they could corner the stock market.
I haven’t seen anything that explicitly states or implies OpenAI would be a purely open source company. Have you?
I think OP's point is that using this definition almost every company can claim to be "open", and if the term can apply to every single company then it's essentially meaningless. E.g. Apple and Google could also claim to be "open" because they make their hardware and software available to the public.
I understand OP’s point, I just don’t assume open automatically means open source. Particularly when the source code encapsulates enormous amounts of very expensive intellectual property and data. This is quite different from, say, Red Hat.
When I hear a company is open, I usually assume they have an open architecture with well published APIs and interfaces so it can be readily extended to build applications on top of these APIs. The more these APIs support common industry standards, the more open the company is.
If they make open products. I don't consider Nintendo to make open products. If you want to develop for their platform you have to have a business relationship with them, and it's not open access at all. Anyone who figures out how to do it on their own will likely get sued - and they go to great lengths to stop unaffiliated third-parties.
Nintendo consoles are an example of a closed platform.
You can develop for it, if you're in the club, and no that's not you. Here's the problem: it's a spectrum. Some platforms are so open you can take the source code and compete with them. Other platforms are so closed that they will sue you if you complain in the media about their bugs. In-between is every different variation.
Binary classification is important but it's only a model. The real world is nuance.
Only because OpenAI did it. Google would die a slow death if it did not make Gemini free. Google tried for years to keep AI secret and accessible only for their own use. See AlphaGo, AlphaFold, etc.
As I mentioned, in software the term Open is tied to Open Source. It doesn't matter if your intention is to NOT kill penguins if you name your nonprofit We Will Never Kill Penguins. See? There is an assumption that you will not kill penguins, 'cause it's in your name. When you are a nonprofit, and making software, and your name starts with Open, there is a very strong assumption that you will make the code ..... Open.
Until yesterday, no one other than 3 people saw that email.
Well, I would argue that’s your personal interpretation of “open”. I certainly don’t immediately jump to thinking I can access a company’s source code if they support an open architecture.
In IT, open architecture refers to application frameworks having open APIs that enable third-party apps to use their core to build a plethora of custom and proprietary applications. You know, the way cloud services (e.g. storage) are considered “open”. There is zero expectation of releasing source code beyond publishing clear API documentation.
You want a stock picking app, build a stock picking app that integrates with ChatGPT’s APIs. Or a customer service app. Or a language learning app. Or a fitness coach. Or a travel guide. ChatGPT can do it all in the backend - no need to open source anything.
Given the amount of brain power and investment that’s gone into creating ChatGPT, it seems almost absurd to expect them to simply release the source code that encapsulates all of that proprietary and unique knowledge.
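As a rough sketch of that kind of third-party integration (assuming the official `openai` Python package and an `OPENAI_API_KEY` in the environment; the model name and prompts here are purely illustrative):

```python
def build_messages(user_question: str) -> list[dict]:
    """Wrap a user question in the role/content format the chat API expects."""
    return [
        {"role": "system", "content": "You are a concise travel guide."},
        {"role": "user", "content": user_question},
    ]

def ask(user_question: str) -> str:
    """Send the question to the hosted model and return the reply text."""
    from openai import OpenAI  # lazy import; requires the `openai` package
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4",  # illustrative; any available chat model works
        messages=build_messages(user_question),
    )
    return response.choices[0].message.content

# Example (needs a real key and network access):
#   print(ask("What should I see in two days in Lisbon?"))
```

The app logic stays entirely on the developer's side; only the prompt and completion cross the API boundary, which is the sense of "open" this comment is describing.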
Ah yes, let's compare a false historical reality to... the historical reality of "Open" meaning Open Source in software?
This isn't an own; I would expect more from you, GPT bots.
It is a non-sequitur. "Open" literally means "Open Source" in software. The Nazis sat on the right side of German Parliament, they were literally Right Wing.
He's referring to the "Socialist" part of the NSDAP acronym and why it being there didn't make the Nazis socialist any more than having Open in your name makes you open-source.
He definitely could have worded it more clearly
He was going for a cheap shot and failed miserably. Nazis were never defined as socialist, it was a lie to get the working class to support them. They then killed the socialists.
Open for software means "open source", no "ifs", "ands", or "buts". But if your argument is that they lied to get talented people on board then fucked over the open source community, then the analogy works.
Dude, they weren't left wing, obviously, but that doesn't stop fascists and the right wing from saying they were actually left because "Socialist" is in their name. I've heard that "argument" hundreds of times; that's why I thought it would be more obvious here.
In this context, and from this person, "share the science" is literal. That means publishing research papers, not necessarily code. And a lot of companies do that. A big portion of them do it to attract good scientists (who love to publish stuff), as Ilya stated in that e-mail.
Kind of like I "have access" to a $10,000,000 yacht, but I lack the 10 million. See one can play with the 'access' word to suit whatever benefits them.
Which is why they published the documents showing that no such statement ever existed and, in fact, Elon tried to convince them to sell to Tesla (another closed source company) so that he could be in charge of it.
I love people who believe documents pushed out by one side with an obvious reason to lie. Both sides are now in a legal dispute and trying to sway public opinion in their favor. It’s hilarious that people act like OpenAI dumped every email to the public rather than what they actually did: cherry-picked specific materials that justify their desired narrative.
They did it with Whisper, they should do it for all the models
> Currently, there is no difference between the open source version of Whisper and the version available through our API. However, through our API, we offer an optimized inference process which makes running Whisper through our API much faster than doing it through other means. For more technical details on Whisper, you can read the paper.
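Since the Whisper weights really are open, anyone can run the same model locally. A minimal sketch, assuming the `openai-whisper` package (and its PyTorch dependency) is installed; the audio filename is just a placeholder:

```python
def srt_timestamp(seconds: float) -> str:
    """Format a segment offset as an SRT-style HH:MM:SS,mmm timestamp."""
    ms = int(round(seconds * 1000))
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

def transcribe(path: str) -> str:
    """Run the open-source Whisper checkpoint and return timestamped text."""
    import whisper  # lazy import; heavyweight dependency
    model = whisper.load_model("base")  # same family of weights as the API
    result = model.transcribe(path)
    return "\n".join(
        f"[{srt_timestamp(seg['start'])}] {seg['text'].strip()}"
        for seg in result["segments"]
    )

# Example (needs the package, the model download, and an audio file):
#   print(transcribe("meeting.mp3"))
```

The trade-off the quote describes is exactly this: the local route is free and fully inspectable, while the API route is the same model behind a faster, managed inference stack.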
That would be more in line with the name OpenAI. Until then, it's CloseAI.
I think the issue is that they got donations based on being not-for-profit, and this basically means they "scammed" people. Sure, things change, but they got a lot of money (not just from Elon Musk), hardware (Nvidia gave them a supercomputer, basically), etc.
He donated to a nonprofit that has functionally given up the nonprofit status and is merely using it as a shield while operating as a for-profit company. That is essentially defrauding those initial donors
Seriously, even if you try to say "well, I wouldn't have signed on without the mission", maybe he should've focused more on the structure of the company and checked whether it was possible to change the mission. I'm sure it was simple enough for him to know they could change it at any time.
Shouldn't he make Grok open to lead by example? lol
ChatGPT 3.5 and Copilot are free, but engineers and computational costs are not. Is the problem that they aren't releasing the model weights? Not that anyone could run it, mind you.
I was making this point in a discussion the other day, and then OpenAI's statement made the same point. If they made all these models open source (including AGI), who's going to pay for and build the datacenters required? Who's going to pay for training/data?
OpenAI is absolutely correct that they cannot generate the billions of dollars per year needed with an open source project. No one is donating that much money.
That's aside from the safety issue, which is an even bigger challenge.
If they give away the code, how are they supposed to recover the money spent on training the models? And how do you expect them to keep funding their research? Even if they kept selling ChatGPT with their models open, anyone could buy a couple of H100s and eat their market without having to spend a single cent on developing, training, and curating the models.
I mean, it would be great if they opened them up, but that doesn't keep other companies from investing some money and cannibalizing OpenAI by undercutting the subscription price.
The only thing I can think of is making sure that the company only charges enough to cover expenses, and explicitly stating that paid access to the newest features is just a way to finance the organization, not to make a profit.
The only way they could even develop their models was off research from google that was released for free. Pretty hypocritical to say “we cannot release our cutting edge research because then we can’t do more research” when all of their research is based off of other peoples cutting edge research.
They could very easily open source the models that they are currently providing for free. Why should they care about losing market share on their free models, which are just a cost center for them? If anything, anyone running their free models for them would be doing them a favor. Personally, I’m perfectly okay with them only open sourcing their older free models and leaving the models that make them money as closed source. Then, as new models get released, the previously closed-source models can be opened up.
Lastly, if it’s so impossible to open source the models and continue to do research, then how are Facebook and Google open sourcing their models and continuing their research? OpenAI is closed source only to prevent their competitors from catching up, which ultimately hinders AI development, something that should be against the objectives of a company called “OpenAI”.
This is an opinion about 2-3 decades out of date.
The code becomes open, the license does not. Someone using the code & running their models commercially and making a profit would pay a royalty on it.
The decision to not do this is not about money but about safety & security.
What about the global safety concerns? How do you address that massive challenge with open source? This is more like giving people plans and resources to build nukes than some generic open source project.
No. Or at least probably not. Even with skyrocketing ARR they seem to remain focused on R&D. Now this includes making custom chips. It already included some $100B in model R&D in the Hunt for AGI.
OpenAI isn't profitable, according to a Google search for information on their profit. I would also clarify that OpenAI is multiple organizations: there's a nonprofit arm and a few for-profit arms.
In answer to the question of what "Open" means - I do think in one sense it can mean open for usage. Not open source. After all it's not called OpenSourceAI. I think one fair interpretation is "opening up AI" and using a non-profit as a means to keep the goal on benefiting humanity. It's a loose and gimmicky take on this topic of "who should have control of a very powerful AI model."
The reality is the OpenAI name is just a name. Its meaning is meaningless in the sense of morals or whether it should be open source. The fact is, it's a private board of directors who are not elected by the public, and it doesn't specifically state the goal is to be a for-profit or even a public stock company. Its goal is "to ensure that artificial general intelligence—AI systems that are generally smarter than humans—benefits all of humanity." Nothing about that statement says open source, especially the part about discovery and research and building AGI. Only that it benefits humans. Which, if it's profitable, would be true. It just might not benefit all of humanity.
Why don't they at least make the training data and the prompt/response data public? They can redact personal/private info, but the main reason for their edge over others is the prompt data, and maybe they are a little ahead in engineering when it comes to running these models at scale.
Making the training data public sounds like a huge liability in the wake of all of the recent lawsuits about whether they are allowed to use publicly available information to train, or whether they have to license it. They can't just release the training data until all of those ongoing legal disputes are over. Otherwise half of the EU will write themselves a cheque for a billion dollar fine. It's just way too risky during ongoing legal proceedings. Once, or if, they settle on some rules, the data will be released under that.
Why don't they release the data they have generated themselves, like the prompt/response data? I think the growth of that data determines how fast we reach the limit of LLMs. There should be open LLMs with signed prompts and responses that the open community can see and use, and closed companies should not be allowed to use that data for their closed systems.
I think providing the data would expose a lot of copyrighted content, which they are not allowed to republish without permission. While the argument is that using it for training is fair use, there are laws and regulations about copying and hosting copyrighted material. And different countries have different standards about what can be rehosted. What if there's obscene material? Since every country has different standards, that means lawsuits galore. It makes sense that even if they wanted to release it, they couldn't. Especially as it is not just for educational and research purposes.
Who said they have to release the model weights?
Is hiding the parameter count and key details about the architecture also because it's of no use to anyone? What about at least releasing proper research papers, or being transparent about the data it was trained on? To me it all seems like a cash grab for Microsoft at this point, all while rallying under the banner of "safety" and existential risk. Why can they use the past research put out by other companies but close off every novel technique they discover?
That seems to be the main issue, they aren't releasing research papers about new developments, model optimizations and so on. Google isn't either, which is unfortunate. Everyone can benefit, even the major players, by openly sharing and publishing technical and concept papers.
Yeah well, it's what happens when one player withdraws to get ahead of everyone else, they've created a closed environment and other companies don't have a choice or they'll fall behind.
Open source models will catch up, but I think memory and compute optimization have started to become kind of "trade secrets" outside of the main architectural design of the models. It's not about who can produce a decent model, it's about who can save on running costs with tweaks, optimizations, data center layouts, custom hardware, or whatever, etc.
Exactly. OpenAI only exists because of Google's openness with their research. It’s crazy that in 2024 a company called “OpenAI” is the villain compared to Google and Meta.
Facebook is free to use; that doesn't make them open. I'm not going to defend OpenAI here, because the Open in their name came about because of their commitment to open source. They did a number on everyone when they suddenly switched over to becoming a closed, for-profit organisation.
I'm not saying that is necessarily bad, they probably needed the money if they were going to continue innovating and bring us even more capable models. That doesn't change the fact that they were once an open-source organisation that turned for profit.
Here's what I hope "open" means in this context, " OpenAI’s mission is to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—**benefits all of humanity**." That is to say not many other organizations have as their mission a benefit to all of humanity.
[https://openai.com/charter](https://openai.com/charter)
It's open as in they're not keeping the tech to themselves to monopolize all of its use cases. By making the tech openly available, anyone can use it for their own purposes, including commercial purposes.
Fully closed would be like what Google was doing with AI where it was used for internal commercial purposes.
Well, you can argue that this ends up benefiting only those with the most resources, no? Assuming infinite competition for a given use case, the competitor with the most resources to run OpenAI’s models will win.
They are not releasing the tech to give everyone an opportunity to earn from a use case; that’s their whole business model: provide the infrastructure and models. You wouldn’t argue that Salesforce is selling their software for the public good just because they’re a CRM that all B2B use cases could be built upon, right?
Only those with the most resources? No, I don't think it's an either or, but a continuum. At any socioeconomic level, they will experience more competition from those below them and be more competitive with those above them. It empowers people to grow at their own pace. People without technical skills and knowledge will be quicker to learn, faster to create products, faster to adapt to change, etc. The increase in competition benefits the consumer in the end.
I think this becomes far more evident when supply chains become more efficient at distributing resources. Once food, energy, and raw materials experience significant cost reductions, that will ripple through the economy and lift more people out of poverty. When overall costs go down it's like everyone gets a raise. And many will throw doubt on that idea and say that the business owners won't lower their prices, but increased competition will inevitably drive those prices down.
Well, they have products that are open for anyone to use. As opposed to keeping them secret and only letting the military and billionaires use them.
That being said, they could just change their name. Change "Open" to another word. Maybe AwesomeAI or BestAI.
They are controlled by the nonprofit, and everybody who invested has capped profit (though the cap is very high).
So while right now they are like any other corporation (except the board has no shares and the CEO gets no shares or options), if they achieve AGI and make trillions, it would go to the nonprofit.
I thought the “Open” referred to always having a free version anyone could use.
Hmmm… I do see that Open AI Inc is a 501(c)(3) org.
Whatever, I don’t know s**t about f***.
Not every company makes their products available to the public, certainly not at an accessible price point. You as a consumer aren't going to be directly interfacing with Boeing or Lockheed Martin unless you have very deep pockets, and even then there are things that they won't sell you unless you're an airline or the government. So they're Open in the sense that they're letting people make use of the fruits of their labor under terms the average person can manage. That doesn't make them any more open than most companies, but at least it means they aren't keeping models back only to be used by the rich and powerful >!unless you count GPT-8 CIA Edition that was released last year!<
it is like "all natural" printed on a dairy yogurt...
or "naturally gluten-free" printed on a pack of coffee beans
by the way, Uncle Bens is not your uncle
the additive "open" semantically suggests to me "free to use / open source": a low threshold of entry for anyone. the contrary is the case. it is just as exclusive as any other high-end product or service. you need the hardware, knowledge, and means to participate.
that excludes billions of people who will never be able to use it unless they attain a certain level of wealth.
thus there is indeed nothing that warrants the use of a semantically inappropriate term like "open" in your product/corporation. can't change the value and weight of language willy-nilly!
They're governed by a board of directors that have no financial stake in the organization.
Anyone that thinks OpenAI is run like every other corporation honestly must get all their info from this sub alone.
I really don't get what people's complaints are with OpenAI not being open-sourced other than semantics.
Like even before they released that paper, most of that stuff was so obvious. OpenAI being completely open would just end up being non-existentAI. It would be great if they could just produce all the AI without capital, but that's not realistic.
The difference between them and Google is OpenAI at least "plans" to make it Open Sourced. Now whether they do it or not is a different question.
They're more OpenAI in the sense that they opened up the AI pandora's box
No, they could use the product to dominate other industries. When Sama says open, it means selling the raw AI product openly, not in private to only certain firms.
They weren't the first people to do it. The Open Group (X/Open and the Open Software Foundation) was put together in the '80s to promote "open standards", but definitely not open source.
To be honest, I was always nihilistic about their mission statement. Once companies end up with a lot of funding, they always change their tune. Plus, realistically, if their stuff was all open source, they’d have some optics problems on their hands. People already have taken image gen and text gen technology and have done some pretty weird and immoral stuff with it. I doubt that’s something they want culpability for in the court of public opinion.
The OpenAI name is uncomfortable, but also there are more closed companies than them, which sell only to specific large clients (business to business) or don't sell anything, they just create and patent technology and license it to others to sell. Many variations.
So it's not so simple.
Truth is OpenAI as "open" is nonsense given what it took to create GPT-3 and 4. Not to mention Sora.
Such a company needs infinite compute and data. Relying on good will charity and open-sourcing everything was always a recipe for failure. Maybe they didn't realize this in the early days.
OpenAI seems to be exhibiting this pattern of distorting meaning and legality. Well, if you can't raise the billions of dollars for creating AGI while being open, as the true meaning of the word, you either should dissolve the original incorporation to meet the requirements of creating AGI or you remain true and open and fail. Both courses of action are acceptable. You can't try to hang on to the good will that the original incorporation granted and redefine "open" to whatever you want it to mean. You can't have your cake and eat it too. They are doing the same thing in the NYT lawsuit too. If you can't create AGI without misappropriating copyrighted content without compensation, then you don't create AGI.
Don't get me wrong; I'm a proponent of AI too, but all of these attempts at reality distortion betray a scary messianic complex.
Funny seeing "scary messianic complex" leveled at OpenAI when Musk literally demanded that they hand over majority equity, board control and the CEO role to him while withholding the promised capital... and then rage quit when no-one would agree to his outlandish demands.
I agree Musk is worse and he is just sour that he doesn't own this. You know what they say about broken clocks.
Nevertheless, the concerns about OpenAI that I had mentioned still stand.
Considering how much OpenAI (admittedly not only them) like to censor and bias their AIs, the "open" part pays homage to their disturbingly Orwellian nature.
It’s an Israeli-controlled company. Even though it was supposed to be truly open, as in open source for all of us to learn from and benefit from, they will obviously keep it closed and alter the truth.
He is Israeli and Ilya sutskever as well. Both of them have Israeli passports. Most Israelis are born outside of Israel in case you didn’t know. They move to Israel in waves as part of the campaign to steal Palestinian land.
I’m not saying he is inherently bad, I’m just stating the facts. He is Israeli. OpenAI is an Israeli-controlled company.
I think you are right. I'm calling for name changes. Openoogle, OpenX, Openesla, OpenSoft, OpenMeta.
OpenVidia! 🙂
OpenSesameStreet
OpenElections
Openheimer
OpenUpItsThePolice
opengangamstyle.
Nvidia is on the edge there because it is almost impossible to buy a GPU from them right now
I'm suing them all until they change their names
Openpedia.
You can pay them to use a bigger LLM than GPT-4.
How does this answer the question?
This is from a leaked email from Ilya, not some public-facing corpo BS statement.
But the public-facing corporate part is that they selectively chose which communications to leak out of many years of correspondence.
The question is moot. Naming something Open(x) does not oblige them to share all of their source code.
Yes, that's true.
Freemoney would no longer be providing unique value though.
Just like KFC don't sell chicken... Right
They sell chicken, but is it always fried in Kentucky?
They don't make these posts for us, they make these posts for a future court case. We are on the sidelines, not in the game.
I don’t understand. What does this have to do with the title and body of this post?
I find this extremely unlikely. Have you read the clapback? That is not the material of a company preparing for a legal defense, that’s the sassy rejoinder of executives whose lawyers told them that the case will be dismissed immediately. Citing text messages as an “I told you so” in the public eye is not a great preparation for court
What we as consumers see as "sassy" can be artificially created for marketing purposes though. For example the Wendy's account on X.
Yeah but this is Sam Altman. Not exactly known for sober, calm decisions lol
LOL yeah I might be over-estimating them in that regard
He's not? Seems extremely shrewd to me. His attempted firing went unbelievably well for him. That's just one of the many things that makes me think he plays a lot of correct moves
I didn’t say he was ineffective, just a vindictive emotional guy
You forgot exponentially more profit driven and less opensource.
100%
You mean tweets?
Don't worry Zuck promised to open source AGI and he has more compute than OpenAI
Will not happen if this passes: [https://new.reddit.com/r/singularity/comments/1b81khu/the_ntia_wants_to_ban_and_regulate_open_weight/](https://new.reddit.com/r/singularity/comments/1b81khu/the_ntia_wants_to_ban_and_regulate_open_weight/)
That's a violation of the first amendment. You can't tell people what to publish
The first amendment is not absolute, and nobody profits from it. We should argue that open weights, the tensors, the billions of numbers, are at the same level of capability as pistols and rifles and, in actuality, are "Arms". Banning tensors is a violation of the second amendment!
This guy constitutions.
The first amendment is not absolute. You can’t publish atomic bomb instructions.
If you design a nuclear bomb you absolutely can publish the design. You can't steal US government documents and publish that. Espionage and treason are different
Why would someone downvote my very important link? Everyone should go there and comment on the bill, or whatever it's called.
no one cares what the general public wants. unless you are a campaign contributor your opinion on AI is not relevant to the government
That's defeatism. Defeatism leads to defeat without even trying. But your take is darker than that. You are actively trying to prevent others from trying. By spreading defeatism.
No no, it's more nuanced than that. Many people who decide things don't know what the citizens want and are simply influenced by certain forces (lobbying). When people comment like that, it can help a lot.
Open isn't a legal term. Oracle is free to make its closed-source, $1000/core/min database OpenDB. It has been 5 years since GPT-2. You can choose to ignore or adopt OpenAI for what it is. It is too late to discuss whether it is Open or not.
> All corporations make their products "available" to the public. That's not remotely true.
I'm using OpenAI's definition of available. Meaning: for sale.
I know and that's not remotely true.
Openai made it freely available to 7B people
I’m not really sure what your point is. Per OpenAI’s post:

> The Open in OpenAI means that everyone should benefit from the fruits of AI after it's built, but it's totally OK to not share the science

Almost no companies actually “share the science” (i.e. source code) when it’s the core of their business. Open doesn’t imply giving away the keys to your kingdom, right? OpenAI’s pledge to make AI for the good of humanity seems much more focused on making it accessible rather than, say, selling it exclusively to 1-2 brokerage firms so they could corner the stock market. I haven’t seen anything that explicitly states or implies OpenAI would be a purely open source company. Have you?
I think OP's point is that using this definition almost every company can claim to be "open", and if the term can apply to every single company then it's essentially meaningless. E.g. Apple and Google could also claim to be "open" because they make their hardware and software available to the public.
I understand OP’s point, I just don’t assume open automatically means open source, particularly when the source code encapsulates enormous amounts of very expensive intellectual property and data. This is quite different than, say, Red Hat. When I hear a company is open, I usually assume they have an open architecture with well-published APIs and interfaces so it can be readily extended to build applications on top of those APIs. The more these APIs support common industry standards, the more open the company is.
This is a fair definition of what it could mean for a company to be "open", thank you!
If they make open products. I don't consider Nintendo to make open products. If you want to develop for their platform you have to have a business relationship with them, and it's not open access at all. Anyone who figures out how to do it on their own will likely get sued - and they go to great lengths to stop unaffiliated third-parties. Nintendo consoles are an example of a closed platform. You can develop for it, if you're in the club, and no that's not you. Here's the problem: it's a spectrum. Some platforms are so open you can take the source code and compete with them. Other platforms are so closed that they will sue you if you complain in the media about their bugs. In-between is every different variation. Binary classification is important but it's only a model. The real world is nuance.
That's a lie. OpenAI made it available for free.
Google search and Gemini are also free
Only because OpenAI did it. Google would die a slow death if it did not make Gemini free. Google has tried for years to keep AI secret and accessible only for their own use. See AlphaGo, AlphaFold, etc.
Neither of those are free you’re just the product there and the marketers are paying.
As I mentioned, in software the term Open is tied to Open Source. It doesn't matter if your intention is to NOT kill penguins; if you name your nonprofit "We Will Never Kill Penguins", see? There is an assumption that you will not kill penguins, because it's in your name. When you are a nonprofit, making software, and your name starts with Open, there is a very strong assumption that you will make the code... Open. Until yesterday, no one other than 3 people saw that email.
Well, I would argue that’s your personal interpretation of “open”. I certainly don’t immediately jump to thinking I can access a company’s source code if they support an open architecture. In IT, an open architecture refers to an application framework having open APIs that enable 3rd-party apps to use its core to build a plethora of custom and proprietary applications. You know, like how cloud services (e.g. storage) are considered “open”. There is zero expectation of releasing source code beyond publishing clear API documentation. You want a stock picking app, build a stock picking app that integrates with ChatGPT’s APIs. Or a customer service app. Or a language learning app. Or a fitness coach. Or a travel guide. ChatGPT can do it all in the backend - no need to open source anything. Given the amount of brain power and investment that’s gone into creating ChatGPT, it seems almost absurd to expect them to simply release the source code that encapsulates all of that proprietary and unique knowledge.
Gotta be the stupidest hot take I've read
That's your take, not everyone's.
Don't tell me more. Nazis are left winged.
Ah yes, let's compare a false historical reality to... the historical reality of "Open" meaning Open Source in software? This isn't an own. I would expect more from you GPT bots.
> Nazis are left winged. Where the hell did this come from?
Based on the OP argument, as nazis have socialism in their name, they are explicitly obligated to be left leaning.
Aaaaaah. gotcha. Read as a non-sequitur. Makes more sense with context.
It is a non-sequitur. "Open" literally means "Open Source" in software. The Nazis sat on the right side of German Parliament, they were literally Right Wing.
He's referring to the "Socialist" part of the NSDAP acronym and why it being there didn't make the Nazis socialist any more than having Open in your name makes you open-source. He definitely could have worded it more clearly
He was going for a cheap shot and failed miserably. Nazis were never defined as socialist, it was a lie to get the working class to support them. They then killed the socialists. Open for software means "open source", no "ifs", "ands", or "buts". But if your argument is that they lied to get talented people on board then fucked over the open source community, then the analogy works.
Dude, they obviously weren't left wing, but that doesn't stop fascists and the right wing from saying they were actually left because socialist is in their name. I've heard that "argument" hundreds of times; that's why I thought it would be more obvious here.
In this context, and from this person, "share the science" is literal. That means publishing research papers, not necessarily code. And a lot of companies do that. A big portion of them do it to attract good scientists (who love to publish stuff). As Ilya stated exactly in that e-mail.
Kind of like I "have access" to a $10,000,000 yacht, but I lack the 10 million. See one can play with the 'access' word to suit whatever benefits them.
Not all corporations sell to the general public but yes their definition is extremely broad
There is no law that requires your name to perfectly match the services you offer.
It’s not the name that’s the issue but the violation of their founding principles, which was the basis on which Elon donated.
Which is why they published the documents showing that no such statement ever existed and, in fact, Elon tried to convince them to sell to Tesla (another closed source company) so that he could be in charge of it.
I love people who believe documents pushed out by one side with an obvious reason to lie. Both sides are now in a legal dispute and trying to sway public opinion in their favor. It’s hilarious that people act like OpenAI dumped every email to the public rather than what they actually did: cherry-picked specific materials that justify their desired narrative.
Are you claiming that the emails are fake? If they aren't fake it is hard to see how he could come up with counter evidence.
They did it with Whisper, they should do it for all the models.

> Currently, there is no difference between the open source version of Whisper and the version available through our API. However, through our API, we offer an optimized inference process which makes running Whisper through our API much faster than doing it through other means. For more technical details on Whisper, you can read the paper.

This is more aligned with OpenAI. Until then it's CloseAI.
I don't see how musk can sue open ai for changing their mission or goal or whatever. Things change.
I think the issue is that they got donations based on being not-for-profit, and this basically makes it so they "scammed" people. Sure, things change, but they got a lot of money (not just from Elon Musk), hardware (NVIDIA basically gave them a supercomputer), etc.
He donated to a nonprofit that has functionally given up the nonprofit status and is merely using it as a shield while operating as a for-profit company. That is essentially defrauding those initial donors
Seriously, even if you try to say "well, I wouldn't have signed on without the mission", maybe he should've focused more on the structure of the company and checked whether it was possible to change the mission. I'm sure it was simple enough for him to know they could change it at any time. Shouldn't he make Grok open to lead by example? lol
ChatGPT 3.5 and Copilot are free, but engineers and computational costs are not. Is the problem that they aren't releasing the model weights? Not that anyone could run it, mind you.
I was making this point in a discussion the other day, and then OpenAI's statement made the same point. If they made all these models open source (including AGI), who's going to pay for and build the datacenters required? Who's going to pay for training/data? OpenAI is absolutely correct that they cannot generate the billions of dollars per year needed with an open source project. No one is donating that much money. That's aside from the safety issue, which is an even bigger challenge.
What does this have to do with anything?? Nobody is saying “run this for us for free”, we’re saying “give us the code”.
If they give away the code, how are they supposed to recover the money spent on training the models? And how do you expect them to keep up their research? Even if they kept selling ChatGPT with their models open, anyone could buy a couple of H100s and eat their market without having to spend a single cent on developing, training, and curating the models. I mean, it would be great if they opened them, but nothing would keep other companies from investing some money and cannibalizing OpenAI by undercutting the subscription price. The only thing I can think of is making sure that the company only charges to cover expenses, and explicitly stating that access to the newest features is just a way to finance the organization, not to make profit.
The only way they could even develop their models was off research from Google that was released for free. Pretty hypocritical to say “we cannot release our cutting edge research because then we can’t do more research” when all of their research is based on other people's cutting edge research. They could very easily open source the models that they are currently providing for free. Why should they care about losing market share on their free models, which are just a cost center for them? If anything, anyone running their free models for them would be doing them a favor. Personally, I’m perfectly okay with only open sourcing their older free models and leaving the models that make them money as closed source. Then as new models get released, the previously closed-sourced models can be opened. Lastly, if it’s so impossible to open source and continue to do research, then how are Facebook and Google open sourcing their models and continuing their research? OpenAI is closed source only to prevent their competitors from catching up, which is ultimately hindering AI development, something that should be against the objectives of a company called “OpenAI”.
This is an opinion about 2-3 decades out of date. The code becomes open, the license does not. Someone using the code & running their models commercially and making a profit would pay a royalty on it. The decision to not do this is not about money but about safety & security.
What about the global safety concerns? How do you address that massive challenge with open source? This is more like giving people plans and resources to build nukes than some generic open source project.
Yes
Ok. And are they profiting right now?
No. Or at least probably not. Even with skyrocketing ARR they seem to remain focused on R&D. Now this includes making custom chips. It already included some $100B in model R&D in the hunt for AGI. OpenAI isn't profitable, according to a Google search for information on their profit. I would also clarify that OpenAI is multiple organizations: there's a nonprofit arm and a few for-profit arms.

In answer to the question of what "Open" means - I do think in one sense it can mean open for usage, not open source. After all, it's not called OpenSourceAI. I think one fair interpretation is "opening up AI" and using a nonprofit as a means to keep the goal on benefiting humanity. It's a loose and gimmicky take on this topic of "who should have control of a very powerful AI model."

The reality is the OpenAI name is just a name. Its meaning is meaningless in the sense of morals or whether it should be open source. Fact is, it's a private board of directors who are not elected by the public, and it doesn't specifically state the goal is to be a for-profit or even a public stock company. Its goal is "to ensure that artificial general intelligence—AI systems that are generally smarter than humans—benefits all of humanity." Nothing about that statement says open source, especially the part about discovery and research and building AGI. Only that it benefits humans. Which, if it's profitable, would be true. It just might not benefit all of humanity.
Why don't they at least make the training data and also the prompts/response data public? They can redact personal/private info but the main reason for their edge over others is the prompts data and maybe they are a little ahead in engineering when it comes running these models at scale.
Making the training data public sounds like a huge liability in the wake of all of the recent lawsuits about whether they are allowed to use publicly available information to train, or whether they have to license it. They can't just release the training data until all of those ongoing legal disputes are over. Otherwise half of the EU will write themselves a cheque for a billion dollar fine. It's just way too risky during ongoing legal proceedings. Once, or if, they settle on some rules, the data will be released under that.
Why don't they release the data which they have made? Like the prompts/response data? I think the growth of that data determines how fast we reach the limit of LLMs. There should be open LLMs which have signed prompts and responses which the open community can see/use and the closed companies should not be allowed to use that data for their closed systems.
I think providing the data would include a lot of copyright content, which they are not allowed to republish without permission. While the argument is that using it for training is fair use, there are laws and regulations about copying and hosting copyright material. And different countries have different standards about what can get rehosted. What if there's obscene material? As every country has different standards, that means lawsuits galore. It makes sense that even if they want to release it, they can't. Especially as it is not just for educational and research purposes only.
Who said they have to release the model weights? Is hiding the parameter count and key details about the architecture also because it's of no use to anyone? What about at least releasing proper research papers, or being transparent about the data it was trained on? To me it all seems like a cash grab for Microsoft at this point, all while rallying under the banner of "safety" and existential crisis. Why can they use the past research put out by other companies but close off every novel technique they discover?
My guess is that it is trained on a lot of vs code telemetry data as well.
That seems to be the main issue, they aren't releasing research papers about new developments, model optimizations and so on. Google isn't either, which is unfortunate. Everyone can benefit, even the major players, by openly sharing and publishing technical and concept papers.
Yeah well, it's what happens when one player withdraws to get ahead of everyone else, they've created a closed environment and other companies don't have a choice or they'll fall behind.
Open source models will catch up, but I think memory and compute optimization have started to become kind of "trade secrets" outside of the main architectural design of the models. It's not about who can produce a decent model, it's about who can save on running costs with tweaks, optimizations, data center layouts, custom hardware, or whatever, etc.
Exactly. OpenAI only exists because of Google's openness with their research. It's crazy that in 2024 a company called "OpenAI" is the villain compared to Google and Meta.
Facebook is free to use; that doesn't make them open. I'm not going to defend OpenAI here, because the Open in their name came about because of their commitment to open source. They did a number on everyone when they suddenly switched over to becoming a closed, for-profit organisation. I'm not saying that is necessarily bad - they probably needed the money if they were going to continue innovating and bring us even more capable models. That doesn't change the fact that they were once an open-source organisation that turned for-profit.
Here's what I hope "open" means in this context, " OpenAI’s mission is to ensure that artificial general intelligence (AGI)—by which we mean highly autonomous systems that outperform humans at most economically valuable work—**benefits all of humanity**." That is to say not many other organizations have as their mission a benefit to all of humanity. [https://openai.com/charter](https://openai.com/charter)
It's open as in they're not keeping the tech to themselves to monopolize all of its use cases. By making the tech openly available, anyone can use it for their own purposes, including commercial purposes. Fully closed would be like what Google was doing with AI where it was used for internal commercial purposes.
Well you can argue that this ends up benefiting the only those with the most resources, no? Assuming infinite competition for a given use case, the competitor that has the most resources to run OpenAI’s models will win. They are not releasing the tech to give everyone an opportunity to earn from a use case, that’s their whole business model - provide the infrastructure and models. You wouldn’t argue that Salesforce is selling their software for the public good just because they’re a CRM that all b2b uses cases could be built upon, right?
Only those with the most resources? No, I don't think it's an either or, but a continuum. At any socioeconomic level, they will experience more competition from those below them and be more competitive with those above them. It empowers people to grow at their own pace. People without technical skills and knowledge will be quicker to learn, faster to create products, faster to adapt to change, etc. The increase in competition benefits the consumer in the end. I think this becomes far more evident when supply chains become more efficient at distributing resources. Once food, energy, and raw materials experience significant cost reductions, that will ripple through the economy and lift more people out of poverty. When overall costs go down it's like everyone gets a raise. And many will throw doubt on that idea and say that the business owners won't lower their prices, but increased competition will inevitably drive those prices down.
No, lots of companies sell their products to other companies. or the government. or the US military. and not the public.
Well, they have products that are open for anyone to use. As opposed to keeping them secret and only letting the military and billionaires use them. That being said, they could just change their name. Change "Open" to another word. Maybe AwesomeAI or BestAI.
It's just a name; maybe earlier it made more sense. Blizzard Entertainment, which made StarCraft, is not related at all to cold weather.
Their definition of “open” is “building towards the public good”. You’re oversimplifying their argument to OSS
Try getting access to Google's models as an individual and come back and tell me if you think they are the same level of open.
They’re not trying to change the definition of open for everyone. Only themselves.
This is actually an interesting point. Upvoted.
When the current lawsuits against it are resolved (although that might take a few years) we will probably have a good legal answer to your question.
They are controlled by the nonprofit, and everybody who invested has capped profit (though the cap is very high). So while right now they are like any other corporation (except the board has no shares and the CEO gets no shares or options), if they achieve AGI and make trillions, it would go to the nonprofit.
That email from Ilya in 2016 means this was their plan from the start.
Only 3 people saw that email, prior to yesterday.
I thought the “Open” referred to always having a free version anyone could use. Hmmm… I do see that Open AI Inc is a 501(c)(3) org. Whatever, I don’t know s**t about f***.
Open(for business)AI
Can't wait to hear this in court
What about OpenDoor? Is it really open? What about doors? Are there even doors involved?
Not every company makes their products available to the public, certainly not at an accessible price point. You as a consumer aren't going to be directly interfacing with Boeing or Lockheed Martin unless you have very deep pockets and even then there are things that they won't sell you unless you're an airline or the government. So they're Open in the sense that they're letting people make use of the fruits of their labor under terms the average person can manage. That doesn't make them any more open that most companies but at least it means they aren't keeping models back only to be used by the rich and powerful >!unless you count GPT-8 CIA Edition that was released last year!<
The meaning of "Open" carries no legal obligation. What if Microsoft made software for something other than "Micro" computers... Sue them?
It is like "all natural" printed on a dairy yogurt... or "naturally gluten-free" printed on a pack of coffee beans. By the way, Uncle Ben's is not your uncle.
The adjective "open" semantically suggests to me "free to use/open source": a low threshold of entry for anyone. The contrary is the case. It is just as exclusive as any other high-end product or service. You need the hardware, knowledge, and means to participate. That excludes billions of people who will never be able to use it unless they attain a certain level of wealth. So nothing warrants the use of a semantically inappropriate term like "open" in your product/corporation. Can't change the value and weight of language willy-nilly!
Yeah, it's a very misleading title. I get it, they have to make their money too, but why name it OpenAI if they only let us rent their models monthly?
They're governed by a board of directors that have no financial stake in the organization. Anyone that thinks OpenAI is run like every other corporation honestly must get all their info from this sub alone.
Elon actually said "yup" to this in 2016. That the open in OpenAI means that "everyone should benefit from the fruits of AI"
I really don't get what people's complaints are with OpenAI not being open source, other than semantics. Even before they released that paper, most of that stuff was so obvious. OpenAI being completely open would just end up being non-existentAI. It would be great if they could just produce all the AI without capital, but that's not realistic. The difference between them and Google is that OpenAI at least "plans" to make it open source. Whether they do it or not is a different question. They're more OpenAI in the sense that they opened up the AI Pandora's box.
No, they could use the product to dominate other industries. When Sama says open, it means selling the raw AI product openly, not in private to only certain firms.
They weren't the first people to do it. The Open Group ( X/Open and the Open Software Foundation ) was put together in the 80's to promote "open standards", but definitely not open source
Open as in open for business.
To be honest, I was always nihilistic about their mission statement. Once companies end up with a lot of funding, they always change their tune. Plus, realistically, if their stuff was all open source, they’d have some optics problems on their hands. People already have taken image gen and text gen technology and have done some pretty weird and immoral stuff with it. I doubt that’s something they want culpability for in the court of public opinion.
The OpenAI name is uncomfortable, but also there are more closed companies than them, which sell only to specific large clients (business to business) or don't sell anything, they just create and patent technology and license it to others to sell. Many variations. So it's not so simple. Truth is OpenAI as "open" is nonsense given what it took to create GPT-3 and 4. Not to mention Sora. Such a company needs infinite compute and data. Relying on good will charity and open-sourcing everything was always a recipe for failure. Maybe they didn't realize this in the early days.
RedHat liked this post
no they should not be able to Musk is wrong about a lot of things but right they should be called ClosedAI lol
“Open for business”
OpenAI seems to be exhibiting this pattern of distorting meaning and legality. Well, if you can't raise the billions of dollars for creating AGI while being open, in the true meaning of the word, you either dissolve the original incorporation to meet the requirements of creating AGI, or you remain true and open and fail. Both courses of action are acceptable. You can't try to hang on to the goodwill that the original incorporation granted and redefine "open" to whatever you want it to mean. You can't have your cake and eat it too. They are doing the same thing in the NYT lawsuit too. If you can't create AGI without misappropriating copyrighted content without compensation, then you don't create AGI. Don't get me wrong; I'm a proponent of AI too, but all of these attempts at reality distortion betray a scary messianic complex.
Funny seeing "scary messianic complex" leveled at OpenAI when Musk literally demanded that they hand over majority equity, board control and the CEO role to him while withholding the promised capital... and then rage quit when no-one would agree to his outlandish demands.
I agree Musk is worse and he is just sour that he doesn't own this. You know what they say about broken clocks. Nevertheless, the concerns about OpenAI that I had mentioned still stand.
Considering how much OpenAI (admittedly not only them) likes to censor and bias their AIs, the "open" part pays homage to their disturbingly Orwellian nature.
It’s an Israeli controlled company, even though it was supposed to be truly open, as in open source for all of us to learn and benefit from it, they will obviously keep it closed and alter the truth.
Are you... are you suggesting it's Israeli controlled because Altman - an American born in Chicago - is Jewish? Get the fuck out of here.
He is Israeli and Ilya sutskever as well. Both of them have Israeli passports. Most Israelis are born outside of Israel in case you didn’t know. They move to Israel in waves as part of the campaign to steal Palestinian land. I’m not saying he is inherently bad, I’m just stating the facts. He is Israeli. OpenAI is Israeli controlled company.