
Should theoretical research on data structures and algorithms have been capped at 1GB in 1980, because that was the biggest single hard drive available back then and you couldn't store, say, a 2GB dataset on a disk?


Not at all, but I'll still call out fantastic claims when I see them.


Google has definitely indexed over a trillion pages.


Do you have any sources for this claim?

As far as I am aware Google doesn't publish any statistics about the size of its index, which no doubt varies.



Well, what do you know: they contradict the claim made above.


Sorry, they've crawled trillions of pages and narrowed that down to an index of hundreds of billions. Conveniently, the link also answers your question of "can you have PB-sized indices?", to which we can clearly say yes.
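
For scale, a rough back-of-the-envelope sketch in Python (the per-page figure is my own assumption, not something from the link):

    # Is a PB-scale index plausible? Illustrative numbers only.
    pages = 400e9          # "hundreds of billions" of indexed pages
    bytes_per_page = 10e3  # assume ~10 KB of index data per page
    index_bytes = pages * bytes_per_page
    print(index_bytes / 1e15, "PB")  # -> 4.0 PB

Even with conservative per-page assumptions, hundreds of billions of pages put you well into petabyte territory.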



