Hacker News | knappa's comments

I don't know, but I've never had trouble with mud daubers.


I'm a mathematician. It's kind of a strange statement since, if we are talking about a matrix, it has two indices not one. Even if we do flatten the matrix to a vector, rows then columns are an almost universal ordering of those two indices and the natural lexicographic ordering would stride down the rows.


Yes. I think what all mathematicians can agree on is that the layout (and the starting index! :-) is like this:

  A[1,1] … A[1,n]
   …        …
  A[m,1] … A[m,n]
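
In code, the two standard ways of flattening that layout to a single index look like this (a quick Python illustration, not anything from the thread):

```python
# The layout above, flattened to one index in the two standard orders.
A = [[11, 12, 13],
     [21, 22, 23]]  # entry "ij" is written as the two digits i, j

# Row-major (lexicographic on (row, col)): consecutive entries walk along a row.
row_major = [A[i][j] for i in range(2) for j in range(3)]
print(row_major)  # [11, 12, 13, 21, 22, 23]

# Column-major (Fortran/BLAS order): consecutive entries walk down a column.
col_major = [A[i][j] for j in range(3) for i in range(2)]
print(col_major)  # [11, 21, 12, 22, 13, 23]
```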


I assume they're talking about how, when multiplying a matrix by a vector, it's more natural to think of the matrix as a collection of columns than as a collection of rows, and to think of the vector as a column vector.

That layout is a nearly universal convention in applied practice such as statistics. Readers would be very very confused if you flipped it the other way.

The irony is that "programmers" are much more divided on this than statisticians are.


It depends on which side you are doing the multiplication. Most linear algebra textbooks work matrix-vector, where the vector is a column vector. In that arrangement, the resulting vector is formed by dot products of the rows of the matrix with the vector.

On the other hand, you see vector-matrix multiplication a lot in other places, for example, the Markov chain literature. There, the vector is a row vector and the resulting vector is formed by dot products of the columns of the matrix with the original vector.
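
A quick Python sketch of both conventions (illustrative only; the transition matrix here is made up):

```python
def mat_vec(A, x):
    # Textbook convention: x is a column vector; result[i] = (row i of A) . x
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

def vec_mat(x, A):
    # Markov-chain convention: x is a row vector; result[j] = x . (column j of A)
    return [sum(x[i] * A[i][j] for i in range(len(x))) for j in range(len(A[0]))]

# A made-up transition matrix: P[i][j] = probability of moving from state i to j.
P = [[0.9, 0.1],
     [0.5, 0.5]]
x = [1.0, 0.0]        # start in state 0
print(vec_mat(x, P))  # [0.9, 0.1] -- distribution after one step (row convention)
print(mat_vec(P, x))  # [0.9, 0.5] -- dot products with the rows instead
```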


> Institutional administrators confiscate 50, 60, 70% of every research grant.

Not even the Trump admin is alleging levels of indirect costs that high. See e.g.

https://grants.nih.gov/grants/guide/notice-files/NOT-OD-25-0...

"Yet the average indirect cost rate reported by NIH has averaged between 27% and 28% over time."

and a lot of that is simply because nobody wants to do the detailed accounting for things like: lab electricity usage, janitorial services, misc supplies.

> The result? 90%+ of academic science is fraud.

This is dramatic nonsense; a simple made up number.


> Not even the Trump admin is alleging levels of indirect costs that high.

The 70% "indirect cost" number had latched into my brain. I was willing to concede this point, but it looks like 50, 60, 70% are accurate as of 2025 [0].

While there exist institutions with only a 30% indirect cost rate, every single not-especially-prestigious university in my region is retaining 60% or more.

[0] https://www.congress.gov/crs-product/R48540

> As of May 2025, indirect cost reimbursements for [institutions of higher education (IHE)] are typically determined by an indirect cost rate that is pre-negotiated with the federal government and varies by IHE—ranging from 30% to 70%.


You could definitely replace "Tuesday" with something like that, and part of the pedagogical purpose of the problem is for people to question this. The actual effect comes from not distinguishing the boys. That increases the likelihood that at least one of them will be born on any particular day, upweighting the likelihood that there are larger numbers of boys. I.e., you just get, on average, better coverage of boys-born-on-Tuesday when there are more boys.


Well, "born on a day" would not convey any information unless it means "during daytime". If that has probability 1/2, the answer would be 3/7. With Tuesday (or, indeed, any other weekday, with probability 1/7), it is 13/27.


But correctly formatting references is pretty much a solved task through reference managers, possibly plus BibTeX. It's a well-defined task, after all, and well suited to traditional software techniques. [1] If someone used an LLM to format the references, you would still have to go back through them.

If there is any use for LLMs in paper writing, I would think that it is for tedious but not well-defined tasks. For example, asking if an already written paper conforms to a journal's guidelines and style. I don't know about you, but I spend a meaningful amount of time [2] getting my papers into journal page limits. That involves rephrasing to trim overhangs, etc. "Rephrase the following paragraph to reduce the number of words by at least 2" is the kind of thing that LLMs really do seem to be able to do reliably.

1: As usual, the input data can be wrong, but that would be a problem for LLMs too. 2: I don't actually know how much time. It probably isn't all that long, but it's tedious and sure does feel like a long time while I'm doing it.


Re-phrasing to fit within word or character limits is certainly something I would pay for!

I have often spent more time doing this than writing the original draft, especially for grant applications...


That's not what was being said. They are saying that when your body does something different, in this case producing fever, it is (often) because of a change in the level of some number of proteins. You get new copies of proteins through gene transcription, and these changes (both increases and decreases) can be detected through changes in the levels of the messenger RNA corresponding to those proteins. Look up differential expression analysis for more information.


I feel like the main thing this will accomplish is to show that the quality of peer review is quite spotty. Usually only 1 reviewer in 3 has truly read and understood the paper enough to critique it from a place of knowledge. Entirely miscategorizing a paper is not uncommon.


Having lived within a block or so of the place being described, 125th is a pretty accurate placement of Harlem's southern border west of Broadway. (Maybe a bit further south, at St. Clair Place.) Further east is different. I can't speak to the situation in the 80s.


> This is why, Prof. Wildberger says he "doesn't believe in irrational numbers."

Oh boy, I hope that they missed a joke or misquoted.


Here's a possible good faith interpretation: Irrational numbers exist as a limiting process, not as a static atomic unit. That is, irrational numbers are a verb, not a noun.

We use irrational numbers as nouns, when convenient, but this is an abuse, in some sense. When we want some digit of sqrt(2), say, we need to interrogate an algorithm to get it. We talk about how much time it takes to extract the amount of precision we want. At best, sqrt(2) can be thought of as an abstract symbol that, when multiplied by itself, gives 2. That is, an algebraic manipulation that we can reduce to an integer under certain circumstances, but it doesn't "exist" the same way that an integer or a rational exists.
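
For instance, "interrogating an algorithm" for the digits of sqrt(2) might look like this sketch (one way to do it, using exact integer square roots; my illustration, not anyone's canonical definition):

```python
from itertools import islice
from math import isqrt

def sqrt2_digits():
    """Yield the decimal digits of sqrt(2) one at a time, forever.

    Digit k is floor(sqrt(2)*10^k) - 10*floor(sqrt(2)*10^(k-1)),
    computed with exact integer square roots -- no floats involved."""
    prev, pow100 = 0, 1
    while True:
        cur = isqrt(2 * pow100)  # floor(sqrt(2) * 10^k)
        yield cur - 10 * prev
        prev, pow100 = cur, pow100 * 100

# The generator never halts on its own; you just stop asking for digits.
print(list(islice(sqrt2_digits(), 10)))  # [1, 4, 1, 4, 2, 1, 3, 5, 6, 2]
```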


> That is, an algebraic manipulation that we can reduce to an integer under certain circumstances, but it doesn't "exist" the same way that an integer or a rational exists.

This depends on your interpretation: some view the reals as completions of that process, in which those “verbs” are “nouns”.

But you can construct a coherent theory in which this is not the case — and nobody is much fussed, because mathematics is full of weird theories and interpretations.

And both integers and rationals are defined by their relations, eg, integers are equivalence classes of pairs of naturals and rationals as equivalence classes of pairs of integers — where the class obeys some algebraic manipulation properties. If you feel there’s some great difference in sequences (and where you find that difference, eg, allowing only constructibles) is a matter of perspective.


Put another way: irrational numbers are functions, not values. You can run the function as long as you want and get more digits but the function won't halt.


That would make more sense if you use the word computation instead of function.


I presume he is the same one who authored "Divine Proportions: Rational Trigonometry to Universal Geometry" (https://www.goodreads.com/book/show/791022.Divine_Proportion...) so I suspect he is not misquoted.

I actually quite liked "Divine Proportions". As far as I know Wildberger is eccentric, but not exactly a crackpot.


He is a sort of strong finitist [0]. Roughly, this means that many usual mathematical objects that rely on notions of infinity, e.g., irrational numbers, do not exist.

I came across his YouTube channel [1] years ago as an undergrad and became really confused about some ideas in logic as a result.

[0] https://en.wikipedia.org/wiki/Ultrafinitism

[1] https://www.youtube.com/@njwildberger/playlists


Wildberger has always been this way. Way back in 2007, Mark Chu-Carroll's "Good Math, Bad Math" highlighted Wildberger: "This isn't the typical wankish crackpottery, but rather a deep and interesting bit of crackpottery." In brief, Wildberger is clearly educated, but also clearly rejects axioms that mathematicians accepted a long time ago (infinite sets in this case):

"Dirty Rotten Infinite Sets and the Foundations of Math" http://www.goodmath.org/blog/2007/10/15/dirty-rotten-infinit...

Wildberger also wrote a book on geometry with nothing allowed but rationals. (Or something like that.)


In mathematics there are fringe philosophies but you still just need proofs that are kosher.


Never show him a square whose side is 1, or he wouldn't believe in its diagonal :D


If you watch it closely, you can see the hypotenuse converging.


No, he really doesn't believe in them.


I suspect that in reality it's transcendental numbers he doesn't believe in. Usually such folks believe in irrational but algebraic numbers.


All models are wrong; some models are useful. I'm not sure I'd say I "believe in" the number 7, either. But the number 7 is very useful. It's reasonable to debate how useful irrational numbers are. That is: there are models of math where irrational numbers are "outlawed" in some sense, and it sounds like Prof. Wildberger studies those models. It's a "fringe" model though; most of math is better off simply accepting irrational numbers as they are. Clearly some of the most widely useful math involves exp(x), sin(x), and pi, all of which are either completely outlawed or require laborious workarounds if you're not allowing true irrational numbers. Still, it's a good idea to have some people studying these fringe models to get a better understanding of math as a whole.

In this case, I find the argument "but you can't calculate it!" unconvincing, since every computer will have rational numbers they can't exactly calculate as well. Our computers can't calculate the exact value of 1/3 either; so what? If we're worried about computing things, we should consider whether we can calculate things to arbitrary precision or not within reasonable time. In that sense, pi behaves no worse than 1/3.
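
To make "pi behaves no worse than 1/3" concrete, here's a sketch using Machin's formula with integer arithmetic only (an illustration, certainly not the only or fastest way):

```python
def arctan_inv(k, scale):
    # arctan(1/k) * scale via the alternating series, integers only;
    # the floor divisions cost a handful of units in the last place.
    total, x, n, sign = 0, scale // k, 1, 1
    while x:
        total += sign * (x // n)
        x //= k * k
        n += 2
        sign = -sign
    return total

def pi_scaled(n):
    # floor(pi * 10^n) via Machin: pi = 16*arctan(1/5) - 4*arctan(1/239)
    scale = 10 ** (n + 10)  # ten guard digits absorb the accumulated error
    return (16 * arctan_inv(5, scale) - 4 * arctan_inv(239, scale)) // 10 ** 10

print(pi_scaled(20))  # 314159265358979323846
print(10 ** 20 // 3)  # 33333333333333333333 -- 1/3 to 20 digits, same game
```

Either way, you name a precision, do a finite amount of integer work, and get the digits.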


> since every computer will have rational numbers they can't exactly calculate as well

It might be better worded as "can't calculate a decimal version of every rational number". One can work quite easily nowadays with exact representations of rational numbers on a computer. With bigint support, it is easy to have very large (for human purposes) numerators and denominators. To what extent practical calculations could be done with exact rational arithmetic, I am not sure, though I suspect it is largely not an issue, as the precision of the inputs is presumably the limiting factor.

Wildberger has specific objections to the usual definitions of real numbers, and they vary based on the definition. For decimals, it is the idea that doing arithmetic with an infinite decimal is difficult, even with a simple example such as 1/9 * 1/9, which is multiplying .111... times itself, leading to sums of 1s that carry over and create a repeating pattern that is not self-evident from the decimal itself.
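
For what it's worth, the exact-rational side of this is easy to demonstrate with Python's stdlib (my illustration; the digit-extraction loop is just long division):

```python
from fractions import Fraction

x = Fraction(1, 9)
print(x * x)  # 1/81 -- exact, no infinite carrying required

# The repeating decimal only shows up when you ask for digits (long division):
num, den, digits = (x * x).numerator, (x * x).denominator, []
for _ in range(12):
    num *= 10
    digits.append(str(num // den))
    num %= den
print("0." + "".join(digits))  # 0.012345679012
```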

For Cauchy sequences, he objects to the absurd lack of uniqueness, particularly that given any finite sequence, one can prepend that sequence to the start of any Cauchy sequence. So a Cauchy sequence for pi could start with a trillion elements of a sequence converging to square root 2. This can be fixed up with tighter notions of a Cauchy sequence though that makes the arithmetic much more cumbersome.

For Dedekind cuts, his issue seems mostly with a lack of explicit examples beyond roots. I think that is the weakest critique.

Inspired by his objections, I came up with a version of real numbers using intervals. Usually such approaches use a family of overlapping, notionally shrinking intervals. I maximized it to include all intervals that include the real number and came up with axioms for it that allow one to skirt around the issue that this is defining the real number. My work on this is hosted on GitHub: https://github.com/jostylr/Reals-as-Oracles


> One can work quite easily nowadays with exact representations of rational numbers on computer

One can also work with exact representations of Pi and sqrt(2): use a symbolic system like MATLAB or Wolfram Alpha. Yes, if you create dedicated data structures for those exact representations, you can work around the limitations of both 1/3 and Pi -- this is my point: the line is not "rational vs. irrational", it's "exact vs. computable to arbitrary precision vs. uncomputable". That is to say: a mathematical model that permits the rationals but outlaws the irrationals is much less likely to be at all useful than a model that permits computable numbers but outlaws/ignores non-computable numbers. I contend most objections to irrational numbers boil down to their general incomputability -- that is, almost all irrationals (a set of full measure, "100%" of them) are not computable, and that makes people anxious. There is a coherent computation-focused model that keeps all computable irrationals and disallows the rest that would quell almost everyone's objections to the irrationals. For example, the set of rationals plus computable irrationals is countably infinite, and every polynomial with computable coefficients still has its real roots.

> For decimals, it is the idea that doing arithmetic with an infinite decimal is difficult even with a simple example such as 1/9*1/9 which is multiplying .111... times itself, leading to sums of 1s that carryover and create a repeating pattern that is not self-evident from the decimal itself.

Right, but this is another example where an objection to irrational numbers can also be levied against 1/9, showing that computability is actually what we care about. And Pi and e and sqrt(2) are all computable, and not in any qualitatively more "difficult" way than the rationals.

> For Dedekind cuts, his issue seems mostly with a lack of explicit examples beyond roots. I think that is the weakest critique.

Yes, that is a weak critique indeed. Any computable real can be turned into a Dedekind cut that you can query in finite time.

> I came up with a version of real numbers using intervals

I haven't dug into your axioms, but it seems to follow that if you gave me a Dedekind cut (A, B) then I could produce an Interval Oracle by taking [x, y] => x ∈ A && y ∈ B. Similarly, if you gave me an Oracle I could query it to determine inclusion in A and B for any points -- immediately if you allow infinity in the query. That is, Oracle(x, inf) <=> x ∈ A and Oracle(-inf, x) <=> x ∈ B. So at first glance these appear to be equivalent, unless you disallow infinity to the Oracle, in which case I might need O(log(n)) steps to establish inclusion in the Dedekind cut. So it might be a very slight difference. Is that where the power comes from?


As I recall, most of the complaints about big pharma were about price-gouging. This is entirely different.

