Hacker News | legobmw99's comments

There are some interesting experiments going on in the OCaml world that involve what they call 'modes', essentially a second type system for how a value is used, separate from what it is. One goal of modes is to solve this problem. It ends up looking a bit like opting in to a Rust-style borrow checker for the relevant functions

https://oxcaml.org/documentation/modes/intro/


Yeah, I think people find C appealing because it feels like you can fit the whole language in your head. I’ve yet to meet someone who has actually accomplished this in practice

If they’re DDR4 (or even DDR3), it has no value to e.g. OpenAI so it shouldn’t really matter

But it's a cascading effect: OpenAI gobbled up all of DDR5 production, to the point that consumers are choosing to upgrade their older DDR4 systems instead of paying even more to upgrade to a new system that uses DDR5. As a result, DDR4 RAM is at a new all-time high - https://pcpartpicker.com/trends/price/memory/

DDR4 prices are up 2-6x in the last couple months depending on frequency. High end, high speed modules (e.g. 128GB 3200MHz LRDIMM) are super expensive.

Isn’t that due to different reasons (like the end of production for older standards)? I recall the same thing happening shortly after manufacturing for DDR3 ceased, before demand eventually went to essentially zero

Even RDIMM / LRDIMM prices have recently started going up. And I thought that those would be safe, because neither "big AI" nor regular consumers need them.

Demand spills over to substitutes.

The price of DDR4 is also going up!

Similar things on the OCaml forums. He has a post titled “Dune: concurrent builds are here!” but really it’s a +967 -100 slop PR that had to be closed


(Neo)Forge primarily use either mojmaps or Parchment, which are the Mojang mappings with some extra goodies like docstrings and parameter names


We should call it UnTypedScript


SelfishScript. JavaScript credits Self as inspiration, but misses all the important things about Self.

JavaScript copied:

The name "Java", cynically chosen for marketing misdirection, not technical truth.

The word "prototype" from Self, but turned it into a quirky pseudo-class system. Instead of living objects delegating naturally, with multiple inheritance dynamically changeable at runtime, JavaScript glued on a weird constructor-function pattern that always confuses people, with constructors you have to call with new but can also uselessly call as normal functional foot-guns.

JavaScript missed:

The fluid, live object experience (JavaScript dev environments were never designed around exploration like Self’s Morphic).

The elegance of uniformity (JavaScript bolted on primitives, type coercions, and special cases everywhere).

The idea that the environment mattered as much as the language. Netscape didn’t ship with the kind of rich, reflective tools that made Self shine.

And most important of all: Self's simplicity! The original Self paper (Ungar & Smith, 1987, “Self: The Power of Simplicity”) was all about stripping away everything unnecessary until only a uniform, minimal object model remained. The title wasn’t ornamental, it was the thesis.

Simplicity. Uniformity. Minimal semantics. A clean consistent model you can hold in your head. Less semantic baggage frustrating JIT compiler optimization. Dynamic de-optimization (or pessimization as I like to call it).

Self proved that expressive power comes from radical simplicity.

JavaScript showed that market dominance comes from compromise (worse is better, the selfish gene).

JavaScript should be called SelfishScript because it claimed Self’s legacy but betrayed its central insight: that simplicity is not just aesthetic, it’s the whole design philosophy.


Yes, indeed! It's a design philosophy, and one that the market does not always reward. I suspect that for many, it is either not salient, or unimportant. Design is subjective, and multi-dimensional.

Thank you, Don for seeing and writing about this dimension.


What about SloppyScript? It has a nice ring to it.


Besides the logistical problems (shared ownership of MLB and minor league teams, the fact that almost all [good] players in the minors are really ultimately contracted with a major league team, etc), I truly believe that even the worst major league team would absolutely destroy AAA, and the best AAA team would probably set the single season loss record with ease in the bigs


The fact that the Colorado Rockies would destroy, the, um, whoever the best AAA team is right now is a direct consequence of those logistical problems - if you didn't have the current system of minor league affiliates there's no reason for there to be a huge gap between the 30th and 31st best baseball teams in North America.


The current system exists partly because there's more money in a big city with a regional draw like Denver, Colorado than say Spokane, WA (the Rockies' "High-A" affiliate three leagues down). It's tough to balance the money without balancing the cities.

One of the ways that Promotion/Relegation systems work is that regional draw is a lot less of a factor because there aren't big "regional draw teams". The big cities get complex webs of "neighborhood teams" instead; for instance, London alone has dozens of football clubs and the distance between many of them is only a couple of miles. The "regional draw" teams such as national teams are what American sports would refer to as "All-Star" teams that exist briefly, are drafted from local teams, and generally only last for the duration of a tournament.

The call for Promotion/Relegation in MLB is a bit of asking for an entire rebuild of the current landscape of baseball. What if Denver, Colorado had say five or six smaller teams instead of just the Rockies? They'd all use stadiums smaller than Coors Field. Coors Field would not be filled as often/regularly. There's less money in concessions/merch for any specific team. People in other corners of Colorado are going to feel less like any of those teams represent "the whole state" and are less likely to make trips to Denver to catch games, or even be eyeballs for TV Nielsen ratings for Denver team sports. There would be fewer dividends to split to big investors, fewer sponsorships to sell in any one stadium, etc.

All of which is to say, the big money gaps exist as much because the hierarchy of the system generates more regular, centralized revenue. Team owners also want a hierarchy of ROI in team investments. The bigger stadiums exist in places with more "regional money", which raises more revenue, which keeps them in a cycle of encouraging bigger stadiums in that area (or moving to an area they think might have more money). It's an unvirtuous cycle of money, and breaking that system would require either a lot more money to buy out existing owners, or a lot of team owners agreeing that greed is bad and that smaller teams would be a lot more chaotic and so more fun for the sport itself.

(ETA: All of which is relevant today in watching the direct struggle play out as we speak between the two big American Men's Soccer organizations. Major League Soccer [MLS] is trying to reproduce the MLB system, including its hierarchy of centralized revenue, and USL is trying to build the promotion/relegation system of relatively balanced tiers, with the catch that USL started by trying to be the "down-level" leagues below MLS and so has started on the left foot of following similar stadium and revenue and ownership hierarchies, some of the owners of which don't really want relegation/promotion even if they say they might, and also that USL adding a top tier to directly compete with MLS also has to compete with MLS' first-mover advantage and revenue structure. The relationship in women's soccer between NWSL and USL is similar but less chaotic, with slightly fewer MLB aspirations from NWSL, given that women's sports in general rarely get to MLB [or NBA or NFL] levels of ROI.)


Maybe drawing 1000 fans/game because they've been relegated to a lower division is just what some billionaires need to figure out they don't belong in the majors. Ownership should be as competitive as the athletes on the field.


Maybe, yeah.

Relegations would also mean more opportunities for teams to be fan-owned. It's reasonably common in relegation/promotion leagues, especially after major upsetting relegations (though "Welcome to Wrexham" is an interesting story of the reverse, a fan-owned team selling to new money for the hopes of resources for promotion). In the US, we mostly just have the Packers as a fan-owned team, and that situation is truly an exception and almost impossible to replicate.

It does feel strange that so many US "major" teams need a billionaire or two or three to run. Why aren't more of the teams themselves public companies? Why are they all as much private toys for the super-rich as anything else? We've already seen that show up as a lack of concern for/nonalignment with local/regional/fan interests on long enough timescales (all the "team moves", for instance).


My elevator pitch also includes the Relegation Series, a best of seven for keeping your spot in the majors. Could even draw more viewers than the WS!


The truly American way to do this would be:
- the top half of the league (or most of it - currently it's 12 teams out of 30) play a postseason like we currently have
- a mirror-image postseason for the bottom half of the league, but the loser moves on instead of the winner


Publicly owned teams are sometimes banned.

If you took the totality of all baseball salaries, even though it would be “unfair” to whoever is signing the billion-dollar contracts these days (have they gotten that high?), it’d be much nicer if it were spread over many, MANY more teams and players, even if the lowest teams are almost “volunteers”.


Does this generalize to higher dimensions? I’m realizing my mathematics education never really addressed alternative coordinate systems outside of 2/3 dimensions


The last method mentioned at that wolfram.com link should work for any dimension (i.e. choosing random Cartesian coordinates with a normal distribution, then normalizing the vector obtained thus to get a point on the sphere).

The method presented in the parent article is derived exactly from this method of Marsaglia, and it should also work for any dimension.
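A minimal NumPy sketch of that normalize-a-Gaussian approach (the function name is mine): because the multivariate standard normal is rotationally symmetric, normalizing the sampled vectors gives points uniform on the sphere in any dimension.

```python
import numpy as np

def random_on_sphere(n_points, dim, seed=None):
    """Sample points uniformly on the unit (dim-1)-sphere by drawing
    Gaussian coordinates and normalizing each vector."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((n_points, dim))
    # Rotational symmetry of the Gaussian makes the directions uniform.
    return x / np.linalg.norm(x, axis=1, keepdims=True)

pts = random_on_sphere(5, 4, seed=0)
print(np.linalg.norm(pts, axis=1))  # each norm ~1.0
```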


There should conceptually be something similar for higher dimensions, but I'm not sure that it would only involve reasonable functions, where by reasonable functions I mean functions you are likely to find in the math library of your programming language.

Here's an outline of where the θ = 2πu, φ = acos(2v-1) where u, v are uniform random values from 0 to 1 comes from.

If you just picked a uniform random latitude and longitude for each point the distribution would not be uniform, because lines of latitude vary in length. They are smaller the farther they are away from the equator. You need to pick the longer lines of latitude more often than you pick the shorter lines of latitude. The probability density function (PDF) for picking latitude needs to be proportional to latitude length.

If you have a non-uniform distribution and you need to pick an x from it but only have a uniform random number generator there is a trick. Figure out the cumulative density function (CDF) for your PDF. Then to pick a random value use your uniform random number generator to pick a value y from 0 to 1, and find the x such that CDF(x) = y. That's your x. I.e., pick x = CDF^(-1)(y).
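As a concrete illustration of that inverse-CDF trick (using the exponential distribution, not the latitude case, just because its inverse CDF is tidy):

```python
import math
import random

def sample_exponential(rate, u=None):
    """Inverse-transform sampling for the exponential distribution.
    PDF: rate * exp(-rate * x); CDF: F(x) = 1 - exp(-rate * x);
    so F^-1(y) = -ln(1 - y) / rate."""
    if u is None:
        u = random.random()  # uniform on [0, 1)
    return -math.log(1.0 - u) / rate

# Feeding in u = 0.5 returns the median, ln(2)/rate.
print(sample_exponential(2.0, u=0.5))  # ln(2)/2, about 0.3466
```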

Latitude length is proportional to sin(φ) or cos(φ) of latitude, depending on whether you are using 0 for the equator or 0 for one of the poles (I think the Mathworld article is using 0 for one of the poles), which makes the PDF proportional to sin(φ) or cos(φ). The CDF is the integral of the PDF so ends up proportional to cos(φ) or sin(φ). Using the inverse of that on your uniform random numbers then gives the right distribution for your latitudes. Thus we have the acos() in the formula for φ. The argument is 2v-1 rather than v to cover both hemispheres.
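Putting the pieces together, a small sketch of the θ = 2πu, φ = acos(2v−1) recipe in Python (function name is mine):

```python
import math
import random

def random_point_on_unit_sphere():
    """Uniform point on the unit 2-sphere via the inverse-CDF method:
    longitude is uniform, polar angle uses acos(2v - 1)."""
    u = random.random()
    v = random.random()
    theta = 2.0 * math.pi * u        # longitude, uniform
    phi = math.acos(2.0 * v - 1.0)   # polar angle, inverse CDF
    return (math.sin(phi) * math.cos(theta),
            math.sin(phi) * math.sin(theta),
            math.cos(phi))

x, y, z = random_point_on_unit_sphere()
print(x * x + y * y + z * z)  # ~1.0, point lies on the sphere
```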

You could do the same thing in higher dimensions. For a 4D sphere, you could slice it into parallel "lines" of "hyper-latitude". Those would be 2D instead of the 1D of the lines of latitude on a 3D sphere. Then to pick a random point you would choose a line of hyper-latitude at random, and then randomly place a point on that line of hyper-latitude.

Like with the 3D sphere, you would need to weight the lines of hyper-latitude differently depending on where they are on the 4D sphere's surface. Do something similar with the PDF and CDF, using the CDF^(-1) to pick one of the lines of hyper-latitude with your uniform random number generator.

You then need to place a point uniformly on that slice of hyper-latitude. That's a 2D thing in this case rather than the 1D we had to deal with before, so we will need two more random numbers to place our point. I have no idea what the hell it looks like, but I'd guess it is not going to be "square", so we can't simply use two uniform random numbers directly. I suspect the 2D thing would also have to be sliced up into parallel lines of "sub-latitude", and we'd have to do the whole PDF, CDF, and inverse-CDF thing there too.

I think in general, for an N-dimensional sphere, placing your random point would involve picking one coordinate directly with your uniform random number generator, while the other N-2 would involve inverse CDFs to get the right distributions from the uniform random number generator.

I have no idea whatsoever if those CDFs would all be simple trig functions like we have in the 3D case, or would be more complicated and possibly not have a reasonably efficient way to compute their inverses.


Since you mentioned Stan, feels relevant to mention https://stan-playground.flatironinstitute.org/, which lets you run Stan in WASM and analyze the results using WebR


Yes, the opam repository has recently been working on an archival policy to reduce the size of the checkout and hopefully ease pressure on the dependency resolver by pruning 'obviously' wrong choices. However, the heuristic they chose seems to have mainly assumed that the things in the repository are libraries, and used dependencies for tracking usages. For executables like dune, this is obviously the wrong idea, and I think they're still deciding how to proceed while adding back most versions

