The library you refer to has not been in use for a long time. The document you pointed at is from 2006 (you can check the creation date).
Since then a lot has changed, and it is now all based on cling ( https://root.cern/cling/ ), which originates from clang and LLVM. cling is responsible for generating the serialization / reflection of the classes needed within the ROOT framework.
Good catch: I was confusing Reflex with the cling code that came later. All the issues I mentioned are still present in (or caused by) cling, though. Either way, standardization of reflection would help.
I concur with your statement. After going through random sites for an hour, it got me thinking back 25 years, to hopping from webring to webring, looking at the sites, reading the interesting ones, and finally bookmarking them to be able to come back to them.
A completely different way to present a personal topic and/or interest than the current one. Right now it is blogs trying to gather an audience for whatever purpose, commercial websites, social media, etc. Compare that to websites that gathered personal interests, or tried to be self-contained around one specific topic.
I don't know, maybe it is the nostalgia, or how I first interacted with the web, that brings back those contrasts.
I think people need to put a bit of perspective on this:
The FCC is a project for the next 70 years. It is supposed to be operational after 2040 at the earliest, if I recall correctly (20 years from now, more or less), and to run for around 40 years.
So if you do a very simplistic calculation on the first 20 years: 100 B euros over 20 years over 20 countries (there are 23 member states in CERN), it is 250 M euros / year / country. And this is not even spread over the overall project time, nor over the different steps that will be happening: first the FCC as an electron-positron collider, and much later the FCC as a hadron-hadron collider.
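The back-of-the-envelope arithmetic above can be written out explicitly (a trivial sketch; the 100 B euros, 20 years, and 20 countries are the rough figures from this comment, not official numbers):

```python
# Rough cost split assumed above: 100 B euros, ~20 years of
# construction, ~20 contributing countries (CERN actually has
# 23 member states; 20 just keeps the numbers round).
total_cost_eur = 100e9
years = 20
countries = 20

per_country_per_year = total_cost_eur / years / countries
print(f"{per_country_per_year / 1e6:.0f} M euros / year / country")  # 250
```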
And if you are really interested in the foreseen outcomes (scientific and associated new technologies):
You can take a look at the European research communities symposium on the future of research topics in Europe that just took place recently. There is a lot of information about the FCC and so on.
Just to let people know: this is actually the common way to calibrate detectors, especially a cylindrical detector system with a solenoid magnet.
So nothing special in this.
I don't know if I can plug in here some of the material I prepared some years ago. It might help bridge the use of Monte Carlo techniques and their relation to integration:
Also, just to mention something else that is very interesting related to MC: "quasi-Monte Carlo techniques" (small example in https://github.com/ChristopheRappold/MCandPython/blob/master... ). For people interested in MC, just take a look at those keywords.
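To make the quasi-Monte Carlo idea concrete (this is an independent sketch, not the code from the linked notebook): low-discrepancy sequences such as van der Corput cover the unit interval much more evenly than pseudo-random draws, which usually makes the integral estimate converge faster than the ~1/sqrt(n) rate of plain MC.

```python
import math
import random

def van_der_corput(n, base=2):
    """First n points of the base-`base` van der Corput low-discrepancy sequence."""
    seq = []
    for i in range(1, n + 1):
        x, denom, k = 0.0, 1.0, i
        while k > 0:
            denom *= base
            x += (k % base) / denom
            k //= base
        seq.append(x)
    return seq

def f(x):
    # Quarter circle: the integral over [0, 1] is pi/4
    return math.sqrt(1.0 - x * x)

n = 4096
random.seed(0)

# Plain Monte Carlo with pseudo-random uniform samples
mc_est = sum(f(random.random()) for _ in range(n)) / n

# Quasi-Monte Carlo with the evenly spread van der Corput points
qmc_est = sum(f(x) for x in van_der_corput(n)) / n

print(mc_est, qmc_est)  # both should be close to pi/4 ~ 0.7854
```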
I hope to provide some perspective:
The Standard Model (SM) is based on a specific number of free parameters, such as the masses of the elementary particles, the number of generations of families (quarks, leptons/neutrinos), etc. Those parameters are not fixed by the theoretical model; in the SM they are free (I hope that is clear enough). They are inferred from experimental data, which is how we have pinned down what they are today.
The number of neutrino families was deduced for the first time in the LEP experiments (in the 90s, at the predecessor of the LHC at CERN): it was not known before whether there were 2, 3, or 4 families of neutrinos.
If you want to learn more, http://pdg.lbl.gov/2020/reviews/rpp2020-rev-light-neutrino-t... is a review of this subject. In summary, the combined result from the LEP experiments is N_neutrinos = 2.984 +- 0.008.
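For context, N_nu essentially comes out of the Z lineshape fit as a ratio of partial widths. The sketch below uses approximate numbers of the kind quoted in the LEP electroweak combination; treat the exact values as illustrative, not authoritative:

```python
# Number of light neutrino families from the Z lineshape:
# the measured ratio Gamma(Z -> invisible) / Gamma(Z -> l+ l-),
# divided by the SM prediction of that ratio for ONE neutrino
# family, counts how many light neutrinos the Z decays into.
gamma_inv_over_ll_measured = 5.943  # approximate LEP combined value
gamma_nunu_over_ll_sm = 1.991       # approximate SM ratio per neutrino family

n_nu = gamma_inv_over_ll_measured / gamma_nunu_over_ll_sm
print(f"N_nu = {n_nu:.2f}")  # ~2.98, consistent with the quoted 2.984 +- 0.008
```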
If you are interested, you can see the experimental plot that shows the fit and the difference between a SM with 2, 3, and 4 neutrinos: https://arxiv.org/pdf/hep-ex/0509008.pdf Page 36, Figure 1.13. The data points are in red with their error bars (it is extremely important to pay attention to them and their size!), and the curves are the SM predictions for 2, 3, and 4 neutrinos.
In my opinion, this is a very nice plot that shows how different the SM with 4 neutrinos is. This is why the SM has 3 neutrinos and not otherwise: because of the experimental data that were used to infer all the free parameters of the SM. The last ones were related to the Higgs boson, and now everything is fixed.
To accommodate a 4th neutrino, we would need to go beyond the SM.
This seems to be assuming that the neutrinos are massless, or at least have a mass quite a bit smaller than the Z particle. It could be that there is a fourth generation where both lepton and neutrino are heavier than the Z particle, no?
In the Standard Model as written in the 1970s, the neutrinos had to be exactly massless. We now have evidence that neutrinos aren't exactly massless, but they're very VERY light. Like, absurdly light. We still don't know the absolute scale of the neutrino masses; we know from mixing that they're damn small, and from cosmology that the sum of all three masses is less than a millionth of the electron mass.
It is logically possible that the neutrino of the fourth generation is very heavy. In that case
- the constraint on the number of neutrinos from collider experiments is relaxed, because the ultra-heavy neutrino's contribution to the observable would be extremely suppressed (the collision energy would be too low to be sensitive to it).
- the cosmological constraints are relaxed because the heavy neutrinos are already frozen out by the time of the electroweak phase transition in the early universe.
- the mixing constraints... well, right now it seems that the mixing matrix between the generations is unitary---there's no "leak" into a fourth generation. But our experimental precision is mediocre because precisely measuring the mixing is difficult (though there are experiments under way). It is also logically possible for there to be a fourth generation but that the neutrino doesn't mix at all---its mixing with the lighter neutrinos is precisely 0. While it's perfectly possible logically, we physicists do not like this kind of "fine tuning" without some explanation of how it could happen. In the SM the neutrino masses/mixings are input parameters, not things determined dynamically---they are axioms, so to speak. So any explanation of the mixing being really small would need to invoke more beyond-the-Standard-Model physics than "it's the same but there's a fourth generation".
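The unitarity point can be made concrete: in the standard 3x3 parametrization, the squared magnitudes in each row of the mixing matrix sum to exactly 1, and a "leak" into a fourth generation would show up as row sums below 1. A small sketch with indicative (not best-fit) mixing angles, and the CP phase set to zero for simplicity:

```python
import numpy as np

# Indicative mixing angles in radians (NOT best-fit values);
# CP-violating phase set to zero to keep the matrix real.
t12, t23, t13 = 0.59, 0.84, 0.15

s12, c12 = np.sin(t12), np.cos(t12)
s23, c23 = np.sin(t23), np.cos(t23)
s13, c13 = np.sin(t13), np.cos(t13)

# Standard PMNS-like parametrization U = U23 * U13 * U12 (phase = 0)
U = np.array([
    [c12 * c13,                    s12 * c13,                    s13],
    [-s12 * c23 - c12 * s23 * s13, c12 * c23 - s12 * s23 * s13,  s23 * c13],
    [s12 * s23 - c12 * c23 * s13,  -c12 * s23 - s12 * c23 * s13, c23 * c13],
])

# A leak into a fourth generation would make these sums fall below 1.
row_sums = (np.abs(U) ** 2).sum(axis=1)
print(row_sums)  # each entry ~1.0 for an exactly unitary 3x3 matrix
```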