
Given how cheap drives are I think upstream bandwidth for that storage will quickly eclipse the storage itself.


I disagree: the hardest files to deploy in a decentralized web are "long tail" files, for which there is usually very little demand on any given day (by definition) but which are extremely important to establishing a new web platform.


I think I remember that Napster and various other file-sharing services held quite a lot of the long tail, but I might be wrong (I was never a heavy user).

IIRC the problem was more that there was an awful lot of malware + the RIAA and MAFIAA.


Napster had a tail some 40,000 files long; for the decentralized web we'll need a tail 1,000,000,000 documents long, or something on that order.


> Given how cheap drives are I think upstream bandwidth for that storage will quickly eclipse the storage itself.

Which suggests a solution -- keep several copies of everything (or copy-equivalents via erasure coding) and then download from the one(s) with the cheapest/free bandwidth.

People care if you run down their battery or use up their cellular data. Not so much when the device is plugged into AC power and connected to unlimited broadband/wifi. At any given time there should be enough such devices that you don't have to make requests of any of the expensive ones.

Add to that the ability of people who don't want to worry about it (and don't want anyone they know to worry about it) to lay out a few hundred bucks once for a 10TB NAS and leave it connected to their broadband connection, and it feels like an actual solution.
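To make the "cheapest bandwidth" idea concrete: with k-of-n erasure coding, any k of the n stored shards suffice to reconstruct a file, so a client is free to fetch from whichever k shard-holders are cheapest to bother. A minimal sketch, assuming a hypothetical `Peer` record and a made-up cost model (free when on AC power and unmetered broadband, expensive on battery or cellular):

```python
from dataclasses import dataclass

@dataclass
class Peer:
    # Hypothetical peer record; names and fields are illustrative only.
    name: str
    has_shard: bool
    on_ac_power: bool
    on_metered_network: bool

    def cost(self) -> float:
        # Assumed cost model: requests are free when the device is
        # plugged in on unmetered broadband, and progressively more
        # expensive when on battery or cellular data.
        c = 0.0
        if not self.on_ac_power:
            c += 10.0
        if self.on_metered_network:
            c += 100.0
        return c

def choose_sources(peers: list[Peer], k: int) -> list[Peer]:
    """Pick the k cheapest peers holding a shard. With k-of-n
    erasure coding, any k shards reconstruct the file, so the
    expensive peers never need to be asked at all."""
    holders = [p for p in peers if p.has_shard]
    if len(holders) < k:
        raise ValueError("not enough shard holders to reconstruct")
    return sorted(holders, key=Peer.cost)[:k]
```

For example, a plugged-in 10TB NAS on home broadband would score cost 0 and get asked first, while a phone on cellular data would effectively never be selected as long as cheaper holders exist.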



