chandlerswift's comments (Hacker News)

Am I reading all of these backwards? You say

> He predicted at least $250 million in damages from Black Lives Matter protests.

He says

> 5. At least $250 million in damage from BLM protests this year: 30%

which, by my reading means he assigns it greater-than-even odds that _less_ than $250 million dollars in damages happened (I have no understanding of whether or not this result is the case, but my reading of your post suggests that you believe that this was indeed the outcome).

You say

> He gave a 70% chance to Vitamin D being generally recognized as a good COVID treatment

while he says

> Vitamin D is _not_ generally recognized (eg NICE, UpToDate) as effective COVID treatment: 70%

(emphasis mine)



I'm interested in hearing more about the history behind Debian and zlib! I did some searching and the closest thing I could find was a nod to the same incident in the Upstream Guide[0]. Do you know of a place where I could read more about it?

[0]: https://wiki.debian.org/UpstreamGuide#No_inclusion_of_third_...


It was so long ago that it's hard to find all the discussions I had seen back then. I recall that it was after a long stretch without any zlib release, so looking at the zlib history, I think it was this one, fixed in zlib 1.1.4 from 11 March 2002: http://www.zlib.org/advisory-2002-03-11.txt

Looking at the debian-devel archives around that date, I found a Debian developer complaining about the vendored zlib copies (https://lists.debian.org/debian-devel/2002/03/msg00716.html), but not the full discussion about getting rid of vendored libraries, so it must have happened elsewhere.


Thanks for taking the time to dig these references up for me!


I got the same message, also from Xfinity. It does sort of make me wonder: If the FCC declared broadband to be 100/50, would I have 50 Mbps upload speeds this morning?

To me, this seems to be a clear positive outcome of the FCC's change: Xfinity had this capacity, the FCC raised their standards, and now all(?) Xfinity customers have increased upload speeds for zero additional cost. Seems like pretty much a best-case result of this metric increase.

Edit: Formerly 200/10, now 300/20. Email arrived 10am today.


I didn't get a message from Xfinity and a speed test showed no increase, but checking my plan on their website it said 300 mbps whereas it used to be 200 mbps. A modem reboot later and I'm getting 355 down/24 up on a speed test site, and 373 down/25 up on "what my ethernet card sees" test [1].

When the plan was 200 mbps I'd get around 240 down on speed tests. Xfinity has always been 10-20% faster for me than whatever the sticker on the plan says.

Unless they actually say it is, I'm not sure this is in response to the FCC. In the almost 20 years I've had Xfinity, I've had numerous speed bumps like this, much more often than the FCC bumps the speed in its definition. My plan went from 100 mbps to 200 mbps sometime in the last few months, for example.

[1] Run netstat once a second to get the byte counts in and out per second, multiply by 8 to get bits.
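(For anyone curious, a rough sketch of that kind of measurement on Linux — reading the sysfs byte counters rather than parsing netstat output, which is a bit easier to script. The interface name is an assumption; substitute your own from `ip link`.)

```shell
# Sample the interface's receive byte counter once a second and convert
# each per-second delta to bits. IFACE is a placeholder; "lo" is only a
# safe default, not a useful one for measuring your WAN link.
IFACE=${IFACE:-lo}
prev=$(cat "/sys/class/net/$IFACE/statistics/rx_bytes")
for _ in 1 2 3; do
  sleep 1
  cur=$(cat "/sys/class/net/$IFACE/statistics/rx_bytes")
  echo "down: $(( (cur - prev) * 8 )) bits/s"
  prev=$cur
done
```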


What's different about this speed bump is that upload and download are suddenly closer together than was ever possible before. When we moved in a few years ago there were no options that provided >10 Mbps up but <=300 Mbps down. Now all of a sudden I'm on a 300 Mbps down plan with 20 up.

I'm skeptical that they just coincidentally matched the new minimum upload speed after years of insisting that residential plans could make do with 6.


In my experience, Xfinity will often do speed increases and market them, but leave me on a "grandfathered" plan, which leaves me with the same speed as before, and usually some creeping "service fees". Eventually, as I get annoyed with the increasing service fees, I log in, look at their new plans, and end up enjoying a nice speed increase and a lower monthly bill.


I'd attempted to configure this some time back, but never gotten it working, and this was the kick in the pants I needed to finally get it working!

In case anyone is stuck in the same way that I was: the trailing slash at the end (which I had previously omitted, not realizing it mattered) is necessary for this to work. The docs[0] mention this, but I'd managed to repeatedly miss it:

> If the pattern ends with /, * will be automatically added. For example, the pattern foo/ becomes foo/*. In other words, it matches "foo" and everything inside, recursively.

[0]: https://git-scm.com/docs/git-config#Documentation/git-config...
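(For reference, here's the shape of a conditional-include setup where that trailing slash matters — paths and file names here are hypothetical, adjust to taste:)

```
# ~/.gitconfig -- the trailing slash on the gitdir pattern is what makes
# it expand to "gitdir:~/work/*", matching every repo under ~/work,
# recursively. Without it, only a repo at exactly ~/work would match.
[includeIf "gitdir:~/work/"]
    path = ~/.gitconfig-work
```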


Would you please add an OpenStreetMap attribution[0]? It looks like you're using OSM data via OpenRailwayMap (which also requires its own attribution[1]) and Carto basemaps (which I'm not terribly familiar with, but at first glance appear to be based on OSM data[2])---each of which detail their respective attribution requirements.

Leaflet makes this incredibly simple; just add the suggested text to the attribution field when you initialize the layers:

        L.tileLayer('https://{s}.basemaps.cartocdn.com/light_all/{z}/{x}/{y}{r}.png', {
            maxZoom: 19,
            // double-check exact wording against [0] and Carto's requirements
            attribution: '&copy; <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors &copy; <a href="https://carto.com/attributions">CARTO</a>'
        }).addTo(map);
        var railwayOverlay = L.tileLayer('https://{s}.tiles.openrailwaymap.org/standard/{z}/{x}/{y}.png', {
            // double-check exact wording against [1]
            attribution: 'Data &copy; <a href="https://www.openstreetmap.org/copyright">OpenStreetMap</a> contributors, style: <a href="https://creativecommons.org/licenses/by-sa/2.0/">CC-BY-SA 2.0</a> <a href="https://www.openrailwaymap.org/">OpenRailwayMap</a>'
        }).addTo(map);
[0]: https://www.openstreetmap.org/copyright

[1]: https://wiki.openstreetmap.org/wiki/OpenRailwayMap/API

[2]: https://drive.google.com/file/d/1P7bhSE-N9iegI398QYDjKeVhnbS... via https://carto.com/legal


I'm interested in what went into this number.

I checked the six most recently published crates on crates.io (blablabla, nutp, tord, g2d, testpublishtesttest, and hellochi, at the time of writing). Three of those (blablabla, testpublishtesttest, and hellochi) printed some variation of `hello world`. g2d seems to be an interesting graphics library. tord provides a data structure for transitive relations, which is also neat. No crate contained over 250 lines of code. Unsurprisingly, none of them contained unsafe code.

Elsewhere in this thread, it's been pointed out that as a consequence of crates.io having a global namespace, plus lax enforcement of an anti-squatting policy, there are a lot of namesquatting packages. Those presumably contain no unsafe code.

tokio contains unsafe code. rand contains unsafe code. regex contains unsafe code. time contains unsafe code. (Method: a smattering of packages chosen from blessed.rs; result: every one that I checked except serde contained unsafe code; epistemic status: eh -- I grepped the codebases, ignoring things that were pretty clearly tests, but might have accidentally included some example code or something that's not part of the core library? Please let me know if I've misattributed unsafe usage to one of these projects, or if I've managed to select a biased sample!)
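(The grep was roughly of this shape -- a crude sketch, not a real audit: the crate path is made up, and `unsafe` appearing in comments or strings would be miscounted. cargo-geiger would do this properly.)

```shell
# Count lines mentioning `unsafe` in a crate's sources, skipping anything
# under a tests/ directory. "some-crate" is a hypothetical checkout path.
grep -rn --include='*.rs' 'unsafe' some-crate/src 2>/dev/null \
  | grep -v '/tests/' \
  | wc -l
```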

I'd certainly believe a straightforward reading of the claim "80% of crates have no unsafe code"...but that seems almost meaningless, given that a not-insignificant portion of crates contain basically no code at all. I'd be much more interested in a percentage weighted by downloads: I'd be wildly impressed if 80% of crate _downloads_ contained no unsafe code, and would be somewhat unsurprised if the number was well below 50% -- crates with more functionality would be more useful and therefore more downloaded, but also more likely to use unsafe code, I'd imagine.

Edit: I just noticed crates.io has a most-downloaded list[0] -- I might end up running some numbers on top packages there tomorrow morning, for some more solid data.

[0]: https://crates.io/crates?sort=downloads


What is the fact that many foundational ecosystem crates contain unsafe code supposed to prove? That's the entire point of the language: someone writes a really good regex crate once, and then the rest of us don't have to write unsafe to use it. It seems like you have a fundamental misunderstanding about the goals and purpose of Rust.


Libraries safely wrapping unsafe code in safe interfaces, and everyone reusing those safe interfaces is like, the whole point of…reusable libraries???

Also, you’re replying to someone who I’m fairly sure is on one of the core Rust teams, or at least closely involved; I’m somewhat more inclined to trust _them_ when they say 80% of libs don’t contain unsafe (given that it cleanly meshes with my own experience of Rust libraries).


Instead of looking at the crates themselves, you might want to check your (or others') Rust application with https://github.com/rust-secure-code/cargo-geiger to get a sense of effective prevalence. I also dispute that the presence of unsafe somewhere in the dependency tree is an issue in itself, but that's a different discussion that many more had in other sub-threads.


Don't forget to include transitive dependencies as well.


This is really neat! Thanks for sharing.

I notice that the map uses OpenStreetMap (via Mapbox, it looks like) for its base data, but doesn't display the required attribution[0]. For fixing this, their Attribution Guidelines[1] are pretty informative. Mapbox also has some helpful docs[2], and may have some additional requirements. Thanks!

Edit: After a bit of digging, I'm a bit unimpressed: it looks like the OSM and Mapbox attributions are deliberately hidden? From your compiled index.css:

    .mapboxgl-ctrl-logo,
    .mapboxgl-ctrl-attrib-inner {
      display: none !important;
    }

[0]: https://www.openstreetmap.org/copyright

[1]: https://osmfoundation.org/wiki/Licence/Attribution_Guideline...

[2]: https://docs.mapbox.com/help/getting-started/attribution/


fixed!


Looks great! Thanks for doing that so quickly :)


Looks like the theme might be a lightly modified version of the GitHub Pages Hacker theme[0]?

[0]: https://github.com/pages-themes/hacker



It looks like course details are linked in the sidebar to the right: https://opensecuritytraining.info/AdvancedX86-VTX.html (perhaps this URL would have made more sense for submission, per "Please submit the original source"?)

