Syntax is about the least of people's worries when they worry about 'mathematical rigor'.
Math is made for people to read and write, and they have different 'domain specific languages' for different parts of math.
Now what would be useful is a tool, perhaps something like a large language model, to automatically translate from the notation used in one area of math into another.
That can't be a fully mechanical procedure (hence the need for something flexible like an LLM), because it's part and parcel of human mathematics to abuse notation here and there in the name of ergonomics.
My personal pet peeve is defining a function like f(x) := x + 3, and then treating f(x) as the name of the function, instead of just f. But really, it's just a harmless abuse of notation when done by one human for another.
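To put the distinction in programming terms (a minimal Python sketch, not from the thread itself): in most programming languages the difference between the function and its value at a point is enforced, not merely conventional.

```python
# `f` names the function itself; `f(x)` is the value of f at the point x.

def f(x):
    return x + 3

value = f(4)        # a number: the value of f at 4
print(value)        # -> 7

# Higher-order use: we pass the function f itself, not a value f(x).
shifted = list(map(f, [1, 2, 3]))
print(shifted)      # -> [4, 5, 6]
```

Here writing `map(f(x), ...)` instead of `map(f, ...)` is simply an error, which is why the distinction matters more to programmers than to mathematicians working on paper.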
While I agree with the point you make, I think that syntax has a huge impact. Maybe not on mathematicians themselves, but I am certain that mathematical syntax has done more to scare people away from the field than the actual mathematical problems themselves.
For example, the convention of using the Greek alphabet for certain things. This is totally arbitrary: you could just as well have used emoticons (had they existed). But what it means in practice is that the pupil, before tackling the meat of the mathematical problem, has to accept a weird-looking letter they have never seen, for no real reason whatsoever.
And I say that as someone who can fluently read the Greek alphabet.
It is not arbitrary, it is heritage. Apart from the fact that Greek was the de facto scientific language of the west (which is no longer the case), I think we can agree that the characters of a single alphabet are not enough for notation, especially given the fact that it is very useful to be able to discern different types of entities, e.g. constants from variables or vectors from matrices.
If we changed symbols now, it would create an even bigger mess: people who learned the new symbols couldn't read any textbook published before 0 A.D. (Anno Discombobuli).
So you read my comment and thought: "That guy who can read the Greek alphabet doesn't know there are historical roots for Greek letters in mathematical notation".
Sure, back when everybody who learned trigonometry had a classical education including Ancient Greek, picking Greek letters when the Latin alphabet wouldn't do was a rational decision. It just hasn't aged well.
I'm just saying that changing the symbols will make things worse, not better.
Virtually all the textbooks use those symbols. Do you have a viable and better alternative to suggest or are you just complaining? And I didn't assume that you know or don't know something, I just wrote it down for the sake of the argument. We are not having a private conversation, we are contributing to a public discussion.
That's not what they said: they made the claim that a single alphabet is insufficient, and thus Greek is as good a choice as any, since it was used historically too (and allows easy differentiation).
Math also makes heavy use of made-up symbols specific to math (integrals, summations, arrows, operators...).
If you want to claim that Greek is the problem, how would you solve the problem of insufficient symbols?
There are simple practical reasons, though; we don't do it to be fancy. There are simply not enough letters in the Latin alphabet to avoid collisions in writing. We like to use the same letters for objects of the same type (like x, y for coordinates or i, j, k, l for indices) because that increases readability significantly, but it does mean you run out quite quickly.
Adding another alphabet alleviates those issues, but even with Greek letters added we still run into them fairly commonly.
> There are simply not enough letters in the Latin alphabet to avoid collisions in writing.
Agreed. And yet... some of the approaches taken to deal with that can be wildly annoying. Actually, using Greek letters is probably the best of the lot, since they are a completely different set of characters with known pronunciations.
OTOH, sometimes you'll see people use both upper-case and lower-case Latin letters in the same problem, forcing you to read it in stilted language like "The derivative of big X with respect to y, plus the integral of little x..." Aaarrgggh.[1]
And then you get the "stylized" letters, which are (mostly) just Latin letters, but have no obvious unique pronunciation or verbalization without going through contortions. I mean, what do you say for "𝔑" especially if there is also a "n" on the page? And who's even going to recognize these monstrosities unless you're already a mathematician: 𝔖, 𝔚, 𝖄? Aaarrggggghhh.
[1]: to be fair, you could have the same problem with mixed case of Greek letters, but I haven't seen that as a common problem. But maybe that's only because I'm not a mathematician. shrug
It's a trade-off between brevity and verbosity; some mathematicians do that. Often longer variable names are in ALL-CAPS. Since `ab` means `a*b`, it's important to distinguish variable names from multiplication.
Yes, it is. I just brought it up because it's an important factor when talking about the need for additional symbols.
Btw, even mathematicians don't mind writing out `sin` or `cos` or `ln` in their formulas. So they are certainly not completely averse to multiple letters.
I do not think mixing up 'f(x)' and 'f' is harmless, given mathematics is all about clarity. In any quality text, 'f(x)' denotes the value of 'f' at 'x', and 'f' denotes the function 'f' itself. Also, speaking of notation, I wonder why you used ':=' instead of '=' to define 'f'. There is no computation going on, right?
Mixing up 'f' and 'f(x)' is mostly harmless in practice. The underlying principles are still clear enough. (And I say that as someone who would _really_ like to make the argument that people who mess this up are somehow unclear in their thinking. No, they are mostly fine.
Getting 'f' vs 'f(x)' right is really important mostly for programmers, who deal with higher-order functions all the time. Most mathematicians don't fall into that category.
You could say calculus deals with higher-order functions, like the derivative, and that's a valid way to look at it. But most people get by just fine using special-purpose notation for the derivative and not thinking about it as a function just like 'f'.)
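For what it's worth, the "derivative as a higher-order function" view is easy to make concrete. A minimal Python sketch using a central-difference approximation (the names here are mine, nothing standard):

```python
# A numerical derivative as a higher-order function:
# it takes a function f and returns a new function approximating f'.

def derivative(f, h=1e-6):
    def f_prime(x):
        # Central-difference approximation of the derivative at x.
        return (f(x + h) - f(x - h)) / (2 * h)
    return f_prime

def g(x):
    return x * x

g_prime = derivative(g)   # g_prime is itself a function, just like g
print(g_prime(3.0))       # close to 6.0, since (x^2)' = 2x
```

Note that `derivative` consumes a function (`g`, not `g(x)`) and produces a function, which is exactly the f vs f(x) distinction in action.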
I used := to emphasize that I am defining 'f' here, not just writing down any old equation (e.g. like in the example "Find all functions f such that f(x + 1) = x * f(x)").
Though if you wanted to be pedantic about notation, I could have written it with the x on the other side of the :=, like f := \x -> x + 3 (Haskell-inspired notation) or f := (x |-> x + 3), where |-> stands for the little arrow I draw by hand to denote a mapping when writing math on a chalkboard or a piece of paper.
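For the record, in LaTeX that handwritten arrow is `\mapsto`, and the mathtools package provides `\coloneqq` for the definition sign, so the same definition can be typeset as, e.g.:

```latex
% Assumes \usepackage{mathtools} for \coloneqq.
% Anonymous-function style: the argument appears only on the right.
f \coloneqq (x \mapsto x + 3)

% Or with explicit domain and codomain:
f \colon \mathbb{R} \to \mathbb{R}, \qquad x \mapsto x + 3
```

The second form is the one you'll usually see in print, since it also records where f lives.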
I'm not sure why := would denote a computation? At most you might want to use it to denote an assignment in a mutable context?
> Getting 'f' vs 'f(x)' right mostly is important for programmers who deal with higher order functions in general all the time. Most mathematicians don't fall into that category
Mathematicians deal with higher order functions all the time, e.g. in functional analysis.
The amount of "cheating" (as in, notation/language abuse) in functional analysis is much worse than that. People routinely call points in L^2[0,1] "functions"... OTOH, I don't think it leads to serious problems.
OTOH, the lack of rigor is definitely one of the problems of contemporary math. Many years ago, when I was a student, I studied one paper coauthored by two people - call them X and Y. X was a very established mathematician, Y was a relative newcomer. There was one (set-theoretical) argument I couldn't understand, so I asked Y (he was my advisor's friend) about it. He told me "yeah, X asked me this, too, and I told him to use Zorn's lemma, and after a moment of thinking, he said, «yeah, that would work»". I'm not a set theorist myself, but it smelled suspicious to me, so I asked another friend, who knew much more about set theory than me. He smiled and said "of course it's wrong, it's a very common mistake".
Had X and Y written out the argument more rigorously, we'd have one less published result with no correct proof...
And I have quite a few other anecdotes like this, unfortunately.
One professor at my former faculty once told me how he approaches refereeing papers: "For the first 30 minutes, I try to prove the main result myself. If I don't succeed, I spend the next 30 minutes trying to find a counterexample. This way I write most reviews in half an hour."
A few years ago I coauthored a book about non-linear analysis, covering quite a few interesting topics. One of the coauthors insisted on writing out proofs in detail and rigorously, and now we joke that our book is the first one where some (quite established and well-known in this field) theorems are proved correctly for the first time. (And that includes proofs with gaps/mistakes in both research papers and monographs, btw.)
> Getting 'f' vs 'f(x)' right mostly is really important for programmers who deal with higher order functions in general all the time. Most mathematicians don't fall into that category.
> Also, speaking of notation, I wonder why you used ':=' instead of '=' to define 'f'. There is no computation going on, right?
In math, := is typically used to denote a definition. Using equality (=) only makes sense if both sides of the equality sign already have a definition.
Yeah, I think it has the same origin, though I'm not sure whether programming languages or maths came first. For programming languages it apparently first appeared in ALGOL in 1958[1].
Edit: On math.SE[2] someone claims that it's notation borrowed from programming. Someone else claims it was introduced by Bourbaki, which might predate programming, as Bourbaki started publishing in the 1930s. However, I couldn't find any evidence of this from skimming a few Bourbaki PDFs.