> I would probably score about the same, does this prove I also rely on training data memorization rather than genuine programming reasoning?
It doesn't even prove the models do that. The RLVR environments being mostly Python isn't "training data memorization". That's just the kind of dumb thing people say to sound savvy.
Probably because the training data contains a ton of code that deals with nested parentheses across languages, and models have learned to work around tokenization limitations when it comes to parentheses.
Thank you so much for this! At least now I can see the worst offenders. How did you even find this tool? The internet has almost no records of it. Amazing.
Still doesn't explain why this balloons over time. I.e., if I restart my Mac right now and reopen the exact same apps with the exact same windows, WindowServer will take 80% less memory.
Unfortunately ANC only works on some sounds. In the suburbs you have leaf blowers and cars driving past you on the sidewalk, and ANC removes everything /else/, which makes them stand out even more.
Leaf blowers and cars driving past are exactly the kind of thing that ANC works well on: fairly constant noise. It's other kinds of sounds it doesn't block out well. At least that's how it always goes for me, so I can't relate to your comment at all. My experience is exactly the opposite, and I have trouble imagining it differently for anyone else, because ANC works well against constant noise in roughly the bass range.
>Leaf blowers and cars driving past are exactly the kind of thing that ANC works well on, a fairly constant noise.
I can tell you haven't heard a leaf blower in a while, if ever. The revving the operators inevitably do causes it to bounce up and down over the spectrum at completely random-seeming intervals. Punches right through ANC, windows, doors, walls.
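The disagreement above comes down to how adaptive cancellation behaves on stationary vs. non-stationary noise. Here's a toy sketch (not how real ANC hardware works — actual headphones use feedforward/feedback microphones and fixed-plus-adaptive filters) using a simple LMS predictor: it locks onto a constant 120 Hz hum easily, but leaves a much larger residual when the "engine" frequency bounces around like a revving leaf blower. All names and parameters here are made up for illustration.

```python
import numpy as np

def lms_cancel(noise, mu=0.01, taps=32):
    """Toy LMS canceller: predict the next noise sample from its own
    recent history and subtract the prediction, returning the residual."""
    w = np.zeros(taps)
    out = np.zeros_like(noise)
    for n in range(taps, len(noise)):
        x = noise[n - taps:n][::-1]  # most recent reference samples
        y = w @ x                    # predicted noise sample
        e = noise[n] - y             # residual after "cancellation"
        w += mu * e * x              # LMS weight update
        out[n] = e
    return out

fs = 8000
t = np.arange(2 * fs) / fs
# constant 120 Hz hum (idling engine, steady drone)
steady = np.sin(2 * np.pi * 120 * t)
# "revving": the frequency itself swings 80 Hz up and down, 3 times a second
revving = np.sin(2 * np.pi * (120 + 80 * np.sin(2 * np.pi * 3 * t)) * t)

# residual power after the filter has had a second to converge
p = lambda x: np.mean(x[fs:] ** 2)
print("steady residual:", p(lms_cancel(steady)))
print("revving residual:", p(lms_cancel(revving)))
```

The steady tone's residual collapses toward zero once the weights converge, while the frequency-swept signal keeps outrunning the predictor — consistent with both experiences in the thread: constant drones cancel well, revving punches through.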
As if that's not bad enough, they pollute more than a gigantic SUV because of how much oil they burn, being two-strokes.
That’s what Apple states, yes, but I suspect it’s also used for calibrating the inner microphones of newer AirPods, which are used for the “live EQ” that works by listening to the feedback inside the ear.
In my experience, Apple can sometimes “forget” to mention things.
Modern Apple gives you control over everything by hiding it in Accessibility settings. You can control almost everything about AirPods there, including a custom EQ. But even those settings don't expose this.