Hacker News | joshu's comments

sgi? evans and sutherland? gl came from sgi originally!

this is just slop


brackets aren’t really the best way to figure this out, are they? they make a lot of assumptions about human preferences that probably don’t actually hold. something like Elo might be better.
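for reference, a minimal Elo-style pairwise update looks something like this. unlike a single-elimination bracket, every comparison updates both items' ratings, so information about "losers" isn't thrown away. the K step size and the 400 scale factor below are conventional defaults, not anything prescribed in the thread:

```python
def expected_score(r_a, r_b):
    """Probability that A beats B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / 400))

def elo_update(r_a, r_b, a_won, k=32):
    """Return updated (r_a, r_b) after one pairwise comparison."""
    e_a = expected_score(r_a, r_b)      # A's expected score in [0, 1]
    score_a = 1.0 if a_won else 0.0     # A's actual score
    r_a_new = r_a + k * (score_a - e_a)
    r_b_new = r_b + k * ((1.0 - score_a) - (1.0 - e_a))
    return r_a_new, r_b_new

# two items start equal; the winner gains half of K
print(elo_update(1000, 1000, True))  # (1016.0, 984.0)
```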

How did they get memory protection on 68000?


The AM-100/L we used at work until about 1988 did not have any memory protection. It was a pain in the rear trying to develop on it while it was being used in production, as you could easily lock up the entire machine and shut everybody down for five minutes while it rebooted. But our current ERP system is still derived from a Windows port of a DOS port of the original system I built in Pascal on that AM.

We've still got the old machine sitting in a dusty storage room. Last time I tried to fire it up, which was probably more than twenty years ago, it wouldn't boot due to bad RAM. I called the company to see if I could get any documentation on it, as ours was long gone, but they told me they had no interest in helping.


If only we had rust back then


They didn't (not for AMOS, at least; the UNIMOS-capable machines had an external MMU).

"AMOS is also a strict real-memory operating system, which is to say there's no MMU, and programs were expected to be fully position-independent and run wherever the monitor ended up loading them. This makes it fast, but also makes it possible for jobs to stomp on other jobs, and it was not uncommon for busy systems to crash on a regular basis."


A 68451, or a custom SUN-like (SRAM-based, kind of like a PDP-11) MMU. There was a guy who went around Silicon Valley in the mid 80s designing SUN-like MMUs for companies; they were all different, and some were broken (couldn't protect user space from kernel space).

The 68000 had a problem, though: it couldn't return correctly from a page (MMU) fault (the 68010 fixed that), which mattered for a pre-VM (pre-BSD or SVR2) UNIX world. However, you could get around this with a few smarts.


I think someone worked around it by running two 68000s in lock-step, or one-step-behind, or something like that.


yeah, that's rather a pain though, and it effectively leaves one 68k frozen while the other services the page fault. it means you can't run another user process while the page is being read in (because it too might cause a page fault)


how is it that logitech software is such awful trash


Because people keep buying their generic hardware, and random youtubers keep recommending their stuff.

How about we just stop buying anything logitech. What other peripheral company has squandered their resources as much as they have, completely refusing to innovate?


i've made cultured butter. you want to use the correct strain. i buy a buttermilk culture specifically


all the insane and/or speculative projects that i never did because they would require heavy lift but with vague outcomes are now in progress. it's glorious.


this is cool. i built something similar a while back using wavelets and matching pursuit, but with a different goal: i wanted to make an image compressor that had different visual effects when the file format was glitched. here are some examples of it moving variance from the original (above) to the compressed image (below): https://youtube.com/shorts/f2pZyZNXY0Q?si=HXf14pOs9DaAk7MZ https://youtube.com/shorts/-LIALRpU63o?si=p_MiFnT8MMX0C0b4
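for anyone unfamiliar with matching pursuit: the idea is to greedily pick the dictionary atom most correlated with the residual, subtract its contribution, and repeat. here's a toy 1-D version in pure python — the commenter's compressor used wavelet atoms on images; the trivial standard-basis dictionary here is just to show the mechanics:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(signal, atoms, n_iter):
    """Greedy sparse approximation of `signal` over `atoms`.

    `atoms` must be unit-norm vectors the same length as `signal`.
    Returns ([(atom_index, coefficient), ...], residual).
    """
    residual = list(signal)
    picks = []
    for _ in range(n_iter):
        # pick the atom most correlated with what's left of the signal
        best = max(range(len(atoms)),
                   key=lambda i: abs(dot(residual, atoms[i])))
        coef = dot(residual, atoms[best])
        # remove that atom's contribution from the residual
        residual = [r - coef * a for r, a in zip(residual, atoms[best])]
        picks.append((best, coef))
    return picks, residual

# standard basis for length-3 signals: recovers the signal exactly
atoms = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
picks, residual = matching_pursuit([3.0, -1.0, 0.5], atoms, 3)
print(picks)     # [(0, 3.0), (1, -1.0), (2, 0.5)]
```

truncating `picks` early is the "compression" step — with a wavelet dictionary the dropped atoms are what produce the glitch aesthetics when the file is corrupted.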


bean soup theory in action


Sure. Or I am just making a commentary on how Micron has forgone the consumer market for this. Either way is fine as an interpretation though.


i guess i should have written up my claude/plotting workflow already. i didn’t bother actually plotting them. https://x.com/joshu/status/2018205910204915939


Let’s connect if you’re interested. marc at harmonique.one


it was a dreadful, useless computer, even then


Unlike the PS3, of which the US Air Force bought 1,760 and clustered into the 33rd most powerful supercomputer** at the time.

(**Distributed computing is very cheat-y compared to a "real" supercomputer which has insane RDMA capabilities)


We had clusters of them in university too.

If all you needed to do was vector math, a dedicated vector processor with eight cores capable of running as fast as the extremely wide bus could feed them with data was the way to do it. You couldn't buy anything close to its capabilities (for that specific task) for the money.

I remember the course we used them in being hard as hell, and the professor didn't have any projects prepared that would really push the system.


From what I understand (may be wrong) this is exactly the reason that they stopped allowing Linux installs on PS3s.

People were buying them just for this purpose. However, the consoles were sold at a discount because Sony expected users to buy games, controllers, etc. If someone bought a PS3 alone, without anything else, then Sony lost money.
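the loss-leader arithmetic is simple to sketch. all numbers below are made up for illustration — actual Sony costs and royalties aren't in this thread — but the shape of the model is the point: a console sold below cost only pays off if attach-rate revenue covers the per-unit subsidy:

```python
# hypothetical figures, purely illustrative
HW_COST = 450.0         # assumed manufacturing cost per console
HW_PRICE = 400.0        # assumed retail price (sold at a loss)
MARGIN_PER_GAME = 10.0  # assumed platform royalty per game sold

def net_per_console(games_bought):
    """Hypothetical net to the platform holder for one console."""
    return (HW_PRICE - HW_COST) + MARGIN_PER_GAME * games_bought

print(net_per_console(0))  # -50.0: a cluster buyer who buys zero games
print(net_per_console(8))  # 30.0: a typical gamer makes the unit profitable
```

under these assumed numbers, every Linux-cluster buyer costs the platform the full hardware subsidy, which is the incentive the parent comment describes.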


It coincidentally happened around the time this came out.

https://web.archive.org/web/20110106074158/http://psx-scene....

Then they sued him. There's a bunch of archived links on his Wikipedia page.


"it was a dreadful, useless computer, even then"

So you don't dispute the thesis that the hypothetical general-purpose machine described in the comment would have needed to be better than the PS2?

