Hacker News | thr8976's comments

Aleksander in particular is deeply invested in AI safety as a mission. It's a very confusing departure, since most of the reporting so far indicates that Ilya and the board fired Sam to prioritize safety and non-profit objectives. A huge loss for OpenAI nonetheless.


Perhaps you could argue that he wants to stick with Sam and the others because if they start a company that competes with OpenAI, there’s a real chance they catch up and surpass OpenAI. If you really want to be a voice for safety, it’ll be most effective if you’re on the winning team.


One funny detail is that the OpenAI charter states that, if this happens, they will stop their own work and help the organisation that is closest to achieving OpenAI's stated goal.


But it may be that the regulations they've gotten in place will make it harder for any new upstarts to approach them.


Maybe Sam wants to build something for profit?


really?


https://openai.com/charter

Second paragraph of the "Long-term safety" section.


Depends how much research is driven by Ilya…


> If you really want to be a voice for safety, it’ll be most effective if you’re on the winning team.

If an AI said that, we'd be calling it "capability gain" and think it's a huge risk.


I dunno, the moat Sam tried to build might make it hard to make a competitor.


We are about to find out if the moats are indeed that strong.

xAI recently showed that training a decent-ish model is now a multi-month effort. Granted, GPT-4 is still further along than the others, but I'm curious how many months and how much in resources that gap amounts to when you have the team that built it in the first place.

But also, starting another LLM company might be too obvious a thing to do. Maybe Sam has another trick up his sleeve? Though I suspect he's sticking with AI one way or another.


> most of the reporting so far indicates that Ilya and the board fired Sam to prioritize safety and non-profit objectives

Maybe Ilya discovered something as head of AI safety research, something bad, and they had to act on it. From the outside it looks as if they are desperately trying to gain control. Maybe he got confirmation that LLMs are a little bit conscious, LOL. No, I am not making this up: https://twitter.com/ilyasut/status/1491554478243258368


lol, sorry if this is obviously a joke, but who cares if it's a little bit conscious? So are fucking pigeons.


it would be funny if Ilya joined the ranks of Blake Lemoine and went off the deep end over AI consciousness


Maybe the way Sam and Greg were fired led him to lose faith in the company, so he quit?


Important detail: Only Sam was fired, Greg was removed from the board and then later quit. Source: https://twitter.com/gdb/status/1725667410387378559


More like the guy who engineered this situation is an asshole and they don't want to work for him.


Who's the situation-engineer for some of us duller but curious folks?


It's been confirmed to be Ilya.


Who was that? How are they an asshole?


> since most of the reporting so far indicates that Ilya and the board fired Sam to prioritize safety and non-profit objectives

With evidence, or is this the kind of pure speculation that media indulges in when they have no information and have to appear knowledgeable?


Twitter rumors from “insiders”


No. Statements from Ilya himself.


A rudder only works as long as you are moving faster than the current. I can imagine (some) people concerned with safety also feeling a sense of urgency, because their ability to steer the AI toward the good is limited by their organization's engine of progress.




Greg was a critically important IC and the primary author of the distributed training stack.

