I was just thinking about that. In my opinion, this find is sorta useless if these aren't digitized and shared publicly.
To my knowledge, digitization can be expensive, because they need hardware for high-quality scans, and they have to be careful not to damage these books any further. I guess it all depends on the situation.
Apropos of my username: having taken a crack at pulling relevant information out of scanned documents, I agree that scan quality is very important (and achieving it is often lengthy and expensive), especially if someone is trying to derive meaningful information from a digital copy without the physical copy to compare against.
And from the look of the picture those books are massive and probably very delicate.
EDIT: to add a bit to the expense part of this: it's hard even with the willingness and resources to get it done, and unfortunately even convincing someone to dedicate those resources is a hurdle in itself.
Ah, baloney. If you can open the book, you can photograph it with your iPhone. You'll find the result answers your concerns. Try it with any of your books.
That reminds me, I have an out-of-copyright book by a namesake where I took the photos years ago — before I had a smartphone let alone one with built-in OCR — and still have not gotten around to transferring the text to wiki… source? wikibooks? One of them.
One fun use of emacs even if it isn't your main editor is to use M-x describe-char to see what a character is. Can detect and describe obscure unicode, emoji, math symbols, zero width spaces, right to left text symbols, etc. I often use it if some text looks a bit off and I suspect something is hiding in it, or if someone posts an emoji in IRC and I can't tell what it's supposed to be (in part due to small size). There is also M-x insert-char to search up a character by name and then write it.
I learned of this by watching a Xah Lee stream once where he used it. There is a similar thing with the vim-characterize plugin, but I like that the emacs way is built-in and works everywhere.
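Outside an editor, the same kind of check can be scripted. A minimal Python sketch (my own illustration, not related to Emacs itself) using the standard `unicodedata` module to play the role of `describe-char` and `insert-char`:

```python
import unicodedata

def describe_chars(text: str) -> None:
    """Rough analogue of M-x describe-char: print each character's
    code point and official Unicode name."""
    for ch in text:
        name = unicodedata.name(ch, "<no name>")
        print(f"U+{ord(ch):04X}  {name}")

# A zero-width space is hiding between the two letters:
describe_chars("a\u200bb")
# U+0061  LATIN SMALL LETTER A
# U+200B  ZERO WIDTH SPACE
# U+0062  LATIN SMALL LETTER B

# Rough analogue of M-x insert-char: look a character up by name.
zwsp = unicodedata.lookup("ZERO WIDTH SPACE")
```

Handy for exactly the "this text looks a bit off" case: anything with `<no name>` or a suspicious name like ZERO WIDTH SPACE jumps right out.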
Right. People have been killing each other since the beginning of time, no guns needed. I think it boils down to a societal ill that makes people want to kill each other, weapons or not.
I'd love an experiment to prove this out. Take a room roughly equivalent in size to an average classroom and containing chairs, desks, wall hangings etc as would be found in most classrooms. Put a target dummy at one end of the room that can measure impact forces to various parts of the dummy. Define "death" conditions for the dummy - not just raw force, we want to allow for things like a knife to the throat.
Have people attack the dummy, starting from the other end of the room. Some participants will be given weapons like guns or knives, others will have to improvise with what's in the room. Measure the time it takes and use health monitoring devices to estimate the level of physical effort the participant applied.
Without doing this experiment I'm already pretty confident that a plot of time to kill will show the gun group far ahead of the knife group or the improv group. After all, they only have to aim and pull the trigger while the other groups have to move across the room. Likewise, the gun group would show far lower physical effort than the others, who have to cross the room and then directly apply human-sourced force. The knife group has to swing / stab / slice with enough force to meet the death conditions. The improv group potentially has to disassemble or rip apart something to get a sharp or blunt instrument, and then still apply their own force like the knife group. Naturally this might change somewhat if we go nuts with the guns, like big heavy rifles or machine guns; those would require some substantial physical effort. But handguns or the average AR-15? Not really.
Guns are a fast, low effort way to apply potentially lethal damage to a human target. You can absolutely apply lethal damage to a human in an uncountable number of other ways, but they are almost universally slower and higher effort.
Thus, we see frequent mass killings using guns, but knife incidents with high death totals are pretty rare.
I would argue that Notepad was one of the few apps that wasn't bloated. Although I kind of think I heard that they rewrote it, so it probably is now...
I had a similar idea, at least in terms of resilience. It was, basically, to encode each piece of content so that the encoded version would be theoretically uncensorable.
This was done with DeCSS back in the day [1]: Phil Carmody found a prime number encoding a compressed copy of the DeCSS source (written by Jon Lech Johansen and others), which gave it a sort of "untouchable" quality.
Obviously, doing this for much larger content (i.e. movies and general videos) would be a challenge, and this technique might not be the best choice. Still an interesting concept, though.
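For the curious, here's a toy sketch of the trick (my own illustration under stated assumptions, not Carmody's exact procedure): compress the data, read the bytes as a big integer, and append padding bytes until the result is prime. Because a zlib stream is self-terminating, the padding is simply ignored on decompression.

```python
import random
import zlib

SMALL_PRIMES = (2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37)

def is_probably_prime(n: int, rounds: int = 40) -> bool:
    """Miller-Rabin probabilistic primality test."""
    if n < 2:
        return False
    for p in SMALL_PRIMES:
        if n % p == 0:
            return n == p
    d, r = n - 1, 0
    while d % 2 == 0:
        d //= 2
        r += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(r - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def primify(data: bytes) -> int:
    """Compress data, then pad with trailing bytes until the whole
    thing, read as a big-endian integer, is (probably) prime."""
    n = int.from_bytes(zlib.compress(data), "big")
    while True:
        n <<= 8  # make room for one more padding byte
        for k in range(256):
            if is_probably_prime(n + k):
                return n + k

def unprimify(n: int) -> bytes:
    """Recover the original data; the zlib stream is self-terminating,
    so the trailing padding lands in unused_data and is ignored."""
    raw = n.to_bytes((n.bit_length() + 7) // 8, "big")
    return zlib.decompressobj().decompress(raw)

n = primify(b"hello, world")
assert unprimify(n) == b"hello, world"
```

The prime is just an ordinary container for the data; the point is that "publishing a prime number" is harder to frame as distributing a circumvention tool.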
What you guys seem to be describing is "Freenet", but that project has been around forever and has remained relatively obscure. These days, unfortunately, I wouldn't touch it with a 10ft pole due to safety concerns.
"Freenet is a peer-to-peer platform for censorship-resistant, anonymous communication. It uses a decentralized distributed data store to keep and deliver information"
From what I remember, it basically takes content and distributes "anonymous" and encrypted chunks of it across various members on the network. Content stays active by being replicated, and replication happens proportionally based on popularity. So content that never gets used basically disappears.
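That retention behavior can be modeled with a toy least-recently-requested cache. This is my own illustration of the general idea, not Freenet's actual algorithm: a bounded, content-addressed store where requesting a chunk keeps it alive and unrequested chunks get evicted.

```python
from collections import OrderedDict
from hashlib import sha256

class ToyDataStore:
    """Toy model of popularity-based retention: bounded capacity,
    content-addressed keys, least-recently-requested chunk evicted
    first -- so content nobody asks for eventually disappears."""

    def __init__(self, capacity: int = 2):
        self.capacity = capacity
        self._chunks = OrderedDict()  # key -> chunk, oldest first

    def insert(self, chunk: bytes) -> str:
        key = sha256(chunk).hexdigest()  # content-addressed key
        self._chunks[key] = chunk
        self._chunks.move_to_end(key)
        while len(self._chunks) > self.capacity:
            self._chunks.popitem(last=False)  # evict least recently used
        return key

    def request(self, key: str):
        chunk = self._chunks.get(key)
        if chunk is not None:
            self._chunks.move_to_end(key)  # popularity keeps it alive
        return chunk

store = ToyDataStore(capacity=2)
popular = store.insert(b"popular chunk")
ignored = store.insert(b"ignored chunk")
store.request(popular)        # keeps the popular chunk fresh
store.insert(b"new chunk")    # store is full: the ignored chunk is evicted
assert store.request(popular) is not None
assert store.request(ignored) is None
```

The real network adds encryption, anonymous routing, and replication across many nodes, but the "use it or lose it" dynamic is the same.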
While true, the actual reason it's untouchable is that the companies cannot realistically sue everyone - it's too costly, and they cannot recover the funds.
I always find this a more entertaining way to learn about these topics, because they're questions we might have had before but never thought much about.
Apparently so, considering that this is the same person who got a hold of the No-Fly List a while back, and, you guessed it, they found it through Jenkins somehow.