Hacker News | astrange's comments

> I would probably score about the same, does this prove I also rely on training data memorization rather than genuine programming reasoning?

It doesn't even prove the models do that. The RLVR environments being mostly Python isn't "training data memorization". That's just the kind of dumb thing people say to sound savvy.


BF involves a lot of repeated symbols, which is hard for tokenized models. Same problem as r's in strawberry.
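To make the tokenization point concrete, here's a toy sketch (not any real tokenizer; the greedy longest-match rule and the hand-picked merge vocabulary are invented for illustration) showing how a BPE-style vocabulary that has merged common character runs hides the individual symbols from the model:

```python
# Toy illustration: a BPE-style vocabulary where common runs have been merged.
# Real tokenizers (e.g. GPT-style BPE) are far more complex; this is only a sketch.
def toy_tokenize(text, vocab):
    """Greedy longest-match tokenization against a fixed vocabulary."""
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest vocabulary entry that matches at position i;
        # single characters always match as a fallback.
        for length in range(min(len(text) - i, 4), 0, -1):
            piece = text[i:i + length]
            if piece in vocab or length == 1:
                tokens.append(piece)
                i += length
                break
    return tokens

# Hypothetical merges: runs of '+' and the digraph 'rr' have become single tokens.
vocab = {"++++", "+++", "++", "rr", "st", "aw", "be"}

print(toy_tokenize("+++++", vocab))       # ['++++', '+'] -- the run's length is hidden
print(toy_tokenize("strawberry", vocab))  # ['st', 'r', 'aw', 'be', 'rr', 'y']
```

Counting the r's in `strawberry` now requires knowing that the `rr` token contains two of them, which is exactly the information the model never sees. A BF program, being mostly long runs of `+`, `-`, `<`, and `>`, hits this constantly.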

Interesting. So why do the models seem to handle deeply nested Lisp expressions just fine?

Probably because there's a ton of code that deals with nested parentheses across languages in the training data, and models have learned how to work around tokenization limitations when it comes to parentheses.

It's because the models wouldn't work for coding if they couldn't do nested scopes, so people don't release models unless they work.

They can only do it in a limited form though, because transformer models only have limited "memory". I don't think they can fully implement parsing.
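The "limited form" claim can be sketched in a few lines (an illustrative analogy, not a claim about any particular model's internals): recognizing balanced parentheses needs an unbounded counter, and if you cap that counter at some fixed size k, as a stand-in for finite memory, the check silently fails past depth k.

```python
def balanced(s):
    """Unbounded counter: handles any nesting depth."""
    depth = 0
    for ch in s:
        if ch == "(":
            depth += 1
        elif ch == ")":
            depth -= 1
            if depth < 0:
                return False
    return depth == 0

def balanced_bounded(s, k):
    """Same check, but the counter saturates at k -- a stand-in for finite memory."""
    depth = 0
    for ch in s:
        if ch == "(":
            depth = min(depth + 1, k)  # information is lost beyond depth k
        elif ch == ")":
            depth -= 1
            if depth < 0:
                return False
    return depth == 0

deep = "(" * 5 + ")" * 5
print(balanced(deep))             # True
print(balanced_bounded(deep, 3))  # False: the count saturated, so it miscounts
```

Up to depth k the bounded version is indistinguishable from the real thing, which is why it "works for coding" in practice while still not fully implementing parsing.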


Kind of like saying that scaling the language area in a human brain won't lead to a human brain.

True, but just don't do that then.


AI has been consistently defined as "anything we can't make a computer do yet" since 1970.

https://quoteinvestigator.com/2024/06/20/not-ai/


Run `IOAccelMemory` in the terminal to see what's causing that (dirty and wired columns). It's probably an app with a lot of windows.

Or for general memory `footprint` is good.


Thank you so much for this! At least now I can see the worst offenders. How did you even find this tool? The internet has almost no records of it. Amazing.

Still doesn't explain why this balloons over time. I.e., if I restart my Mac right now and reopen the exact same apps with the exact same windows, WindowServer will take 80% less memory.


Well, now you get to file a bug with Feedback Assistant…

If you send me the number it will get looked at.


Having assets worth X dollars is not the same thing as having X dollars. It means investors are willing to trade you X dollars for the assets.

Those investors don't want to give /you/ money, they want to give the billionaire money.


Apple's R&D budget for a single product is larger than the entire audiophile industry, and they also actually know how to manufacture things.

Unfortunately ANC only works on some sounds. In the suburbs you have leaf blowers and cars driving past you on the sidewalk, and ANC removes everything /else/, which makes them stand out even more.

Leaf blowers and cars driving past are exactly the kind of thing that ANC works well on: fairly constant noise, mostly in the bass range. It's other kinds of sounds that it doesn't block out well. At least it always seems to go that way for me, so I can't relate to your comment at all; my experience is exactly the opposite, and I have trouble imagining it being different for anyone else.

>Leaf blowers and cars driving past are exactly the kind of thing that ANC works well on, a fairly constant noise.

I can tell you haven't heard a leaf blower in a while, if ever. The revving the operators inevitably do causes it to bounce up and down over the spectrum at completely random-seeming intervals. Punches right through ANC, windows, doors, walls.

As if that's not bad enough, they pollute more than a gigantic SUV because of how much oil they burn as two-strokes.


> If you keep in mind that AirPods are calibrated to your ears with your iPhone's FaceID camera, they provide nice, tailored sound.

That's only for spatial audio.


That’s what Apple states, yes, but I suspect that it’s also used for calibrating the inner microphones of newer AirPods, which are used for the “live EQ” that works by listening to the feedback inside the ear.

In my experience, Apple can sometimes “forget” to mention things.


Modern Apple gives you control over everything by hiding it in Accessibility settings. You can control almost everything about AirPods and give them custom EQ there. But it doesn't have that.

> But it doesn't have that.

Adaptive EQ? Apple says otherwise [0].

What I was talking about was the "realtime" adaptive EQ tuned by the pods' inward-facing microphones.

What I argued was that this adaptive EQ also uses the calibration data from personalized spatial audio.

[0]: https://support.apple.com/en-us/111863


Carriers definitely care about the OS if it's a major OS, because bugs can take them down.
