The big deal about these events is that, if you reasonably assume it's a proton, then it violates a bound called the Greisen–Zatsepin–Kuzmin limit. Anything going that fast should be energetic enough to interact with the cosmic microwave background, which is hugely blueshifted in the particle's frame of reference. The GZK limit is a kind of cosmic speed limit over long distances.
So anything this energetic would need to have a nearby source (in astrophysical terms) so it doesn't have time to slow down. But when we trace these things back, we see bupkus in the direction it came from.
This means there is either (almost certainly) interesting new astrophysics, or (with tiny probability) new particle physics involved. Whatever is giving individual protons the energy of a thrown baseball is probably something worth studying.
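As a rough check on the GZK numbers: the threshold for photopion production off the CMB can be sketched for a head-on collision with a mean CMB photon. This is illustrative only; the real cutoff near 5e19 eV comes from the high-energy tail of the blackbody spectrum and the interaction rates.

```python
# Back-of-the-envelope GZK threshold: head-on p + gamma_CMB -> Delta(1232).
# (Illustrative numbers; the real cutoff ~5e19 eV involves the full
# blackbody spectrum and interaction rates.)
m_p = 938.272e6      # proton rest energy, eV
m_delta = 1232e6     # Delta(1232) resonance rest energy, eV
kT = 2.348e-4        # CMB temperature (2.725 K) in eV
E_gamma = 2.7 * kT   # mean CMB photon energy, eV

E_threshold = (m_delta**2 - m_p**2) / (4 * E_gamma)
print(f"{E_threshold:.2e} eV")   # ~2.5e20 eV for a mean CMB photon
```

Within a factor of a few of the quoted cutoff, which is about as good as a one-line estimate gets.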
I wonder if it could be a background source that we don't understand. Maybe some freak coincidence where particles come in in exactly the right angles to fake a hyperenergetic photon. Or somebody messing with laser pointers. Or birds or whatever.
I used to work in particle physics, and never shared the confidence of my colleagues in rare events. If you just have 3-4 signal events, considering the expected number of events, you might statistically have a discovery. But you can't be sure how your detector is going to behave for those extreme events, because you have no benchmark. You have to assume a proton is a proton, an electron is an electron, and no weird things happen at high/low energies and angles (or that you understand how things change).
It is even worse if you are trying to disprove some specific model. You see no events of a certain kind. Does that mean you disproved the model, or that your setup (detector, triggers, event selection ...) is just blind in this very narrow part of the parameter space?
If you go out in space and look at what extrasolar stuff zaps you, you'll find a very standard power law distribution of charged particle energies, with no known upper limit. This is a better position to be in (from a detector confidence standpoint) than seeing a completely bananas event every few years. Here it's bananas (and progressively less bananas) all the way down.
It would certainly be convenient for me if it is a proton, given the experiments I work on (protons at the highest energies -> more ultrahigh energy neutrinos -> maybe my experiments will see something!), but, unfortunately for me, I don't think we know that. Nuclear species composition can typically only be determined statistically, and it looks like this event wasn't even measured by the fluorescence detector, which would give the best measurement of the penetration depth of the particle in the atmosphere (the best handle on particle ID). In the journal article, which admittedly I've just skimmed, they give direction reconstructions under different species assumptions (and Milky Way magnetic field models, which matter for heavier nuclei) and say that while they can exclude a photon (which... would be weird!), without data from the fluorescence detector they can't tell the difference between a proton and a heavier nucleus.
Though given how TA mass-composition measurements turn light at the highest energies, they may perhaps privately argue it's likely a proton. But, alas, Pierre Auger Observatory would argue otherwise. I'll ask Toshihiro what he thinks next time I see him...
It is intriguing that the smaller TA has seen more high-energy events than the bigger PAO. Maybe there really are big differences between the northern and southern hemisphere at play here...
Probably an integrated in-ice Cherenkov + radio neutrino detector with a giant cosmic ray detector (scintillators + water Cherenkov tanks + fluorescence detectors + radio) on top, perhaps with some fluorescence/radio detectors deployed on tethered aerostats to look for upgoing air showers from taus. Sort of like IceCube Gen2 Radio on steroids, mishmashed with Auger and ANITA/PUEO/EUSO-SPB. Both in Antarctica and in Greenland to get both hemispheres. But there are arguments for other types of detectors too, and perhaps I'm not thinking big enough, since 100 billion is several orders of magnitude above what is probably reasonable :)
> either interesting new astrophysics or new particle physics
Or measurement error which historically has been the most frequent explanation of these rule-breaking observations. Someone else will look at the data and notice an anomaly which once accounted for will make everything fit within expected models.
In this case though they've been observed over the past 30 years at at least 4 different detectors, which provides independent confirmation across multiple different research groups. It is very unlikely to be a "oops we didn't plug the cable in firmly" kind of situation. It would need to be something fundamental about the design of these detectors, which has been replicated four independent times, which has completely escaped the notice of physicists for >3 decades (which includes 3 decades worth of grad students incentivized to take a look at the problem with fresh eyes and make a name for themselves by explaining it all away).
Several of them use multiple as-independent-as-possible detection methods. The most common are ground stations in a grid that measure high energy particles passing through and correlate them in time, followed by ultra-fast (in terms of data acquisition, e.g. 20M samples/s) telescopes pointed at the atmosphere (not the sky), which can see the slight trace of fluorescence light left behind by the air shower from up to tens of kilometers away.
What's common to all of the detection methods I know of is the atmosphere. Back at Pierre Auger Observatory, we used a combination of LIDAR scans and weather balloons to constantly monitor it. The atmosphere is basically a calorimeter for these detectors.
The fluorescence detectors (FD) are generally considered the most direct measurement of primary particle energy because the fluorescence mechanism is relatively simple (somebody is bound to criticize me for saying that), with a proportionality whose constant can be measured in the lab[1]. But there are still models and simulations that go into it. And these fluorescence telescopes can only be operated on moonless nights, so they have a duty cycle of only about 10%.
E.g. Auger combines FD with surface detectors (SD), using simultaneously measured events to calibrate the more indirect energy measurement of the SD and thus make use of their near-100% duty cycle.
[1] Other methods have been tried to measure this fluorescence yield. It's primarily done in the lab: https://arxiv.org/abs/1210.1319
For something completely different: my own masters thesis from a long time ago was an attempt to determine this yield indirectly, because we know from detailed bottom-up simulations that air shower shape is near universal given primary and energy. If we knew the event geometry (direction) well enough, we could use the ratio of Cherenkov to fluorescence light along the recorded track and fit our longitudinal shower size model using the fluorescence yield as the free parameter. In hindsight that was a lot of fun. I would enjoy that type of work a great deal more today, now that I feel I have less to prove. :)
We've had many more events within an order of magnitude of these events' measured/reconstructed primary energy. The spectrum is a power law. While theoretically something completely different could be happening in the atmosphere if you add 2-10x the energy, there's really no indication thereof. Auger also has high-energy events detected by multiple detection methods, not just the "standard" water-Cherenkov or scintillator ground stations.
> It would need to be something fundamental about the design of these detectors, which has been replicated four independent times, which has completely escaped the notice of physicists for >3 decades
Your parent wasn't suggesting you hadn't considered it, they were asking for more information about the designs. To what extent is / isn't common design mistake the probable cause?
I assume that limit accounts for the time dilation, too. At the speed the particle was traveling it wouldn't have had much time in its reference frame to interact with the background radiation, even blue-shifted as it would be, right?
Yeah, the model does account for it, and the time dilations are nuts. For the Oh-my-god particle (a similar early detection), I remember that it would have crossed a billion light years in what it perceived as a day.
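The gamma factor behind that claim is quick to check, assuming a proton at the OMG particle's roughly 3.2e20 eV:

```python
E = 3.2e20           # OMG particle energy, eV
m_p = 9.38272e8      # proton rest energy, eV
gamma = E / m_p      # Lorentz factor

# In the proton's frame, a 1-billion-light-year trip takes 1e9 yr / gamma
proper_time_days = 1e9 / gamma * 365.25
print(f"gamma ~ {gamma:.1e}, trip ~ {proper_time_days:.1f} days")
```

So "a billion light years in about a day" is right on the money.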
Imagining that reference frame is a beautiful thought. Galaxies and stars coming into and out of existence in real time, supernovae exploding like fireworks constantly as the entire universe zips by.
Until that trip ends by being snagged by some puny detector on an even punier planet that just so happened to intersect your trajectory. "I came across a billion lightyears, and all I got was this scientific paper". Not even a t-shirt...
- Whatever Star Trek says about warp reactors (matter / antimatter reaction) is valid. Reacts "some" amount of matter/antimatter every second.
- Klingon Bird of Prey can maintain an effectively infinite cloaking time from a human observation perspective, so it can be nearby.
- Warp reactor "occasionally" leaks (1% of 1% of 1% of reactions? I dunno...) so we might actually detect something. "Slightly" imperfect shielding.
- Problem: 5E1 J for the OMG particle (people say it's a proton). 1.8E14 J for 1 gram of matter annihilating with 1 gram of antimatter. Except: 1 proton = 1.6726231E-24 g. Proton rest mass energy is 1.503E−10 J. So the particle is more energetic than a proton/antiproton annihilation event (by a lot). It's been upshifted more than 100,000,000,000x from the rest mass energy.
- It "might" work if the one proton escaping represented a single proton gaining enough energy to overcome reactor core shielding confinement, which, in my opinion, seems somewhat plausible physics by Star Trek standards.
- PS: Personal guess is a Q-Clearance [3] experiment at a DOE lab we deny exists.
To the casual HN reader - my recommendation to you is to neither laugh at nor dismiss out of hand analyses such as this (even as the logic intrinsic to them merits as much picking apart as any other analysis one sees every day on HN).
Robin Hanson's description of long-lived stars (usually strongest in infrared - i.e. the 3 stars our eyes lack the spectral response to see for every 1 star that we do see when looking into the night sky) significantly updated my priors on the likelihood of intelligent, non-human life roaming about the cosmos.
Do those red dwarfs have the heavier atomic elements necessary to make diverse biological interactions, on a planet geologically active enough to protect the biosphere with a magnetic field?
I recently encountered a description of how Earth relies on the remnants of some interactions between stellar remnants which would have been far rarer early on, even if red dwarf stars were present for so long.
I think the simplest explanation is that we're just not accurate enough at determining the trajectory of the particle in question. They tried matching against sources "just outside its arrival direction" but without access to the paper (the DOI link just doesn't work for me, loads an empty page) or knowledge about the instruments being used it's hard to say how much that rules out.
At least looking into the Telescope Array's workings, it seems plausible that it'd struggle to produce accurate trajectory data. It works by analyzing the light produced by cascading collisions of particles with the incoming one: https://en.wikipedia.org/wiki/Air_shower_(physics). At first glance this seems like it would be very noisy for determining trajectory.
My recollection from the Pierre Auger telescopes was 1-2 degrees uncertainty. TA is likely a similar order of magnitude. So matching a single event against candidate sources might not be the most reliable analysis. (Caveat: I didn't dig up papers to check my memory from 15 years ago.)
I'm relatively lay in astrophysics. Could the origin not be galactic magnetic field lines between Earth and the galactic nucleus? Like the magnetic mirror between Earth and the Sun but much larger.
To our best understanding, particles of such high energy do not originate in our own galaxy. We don't have a good understanding of what exact process might be putting this much energy into a single particle, but the commonly accepted acceleration mechanisms of, eg., supernovae don't reach anywhere near these energies.
GP also said "nearby source (in astrophysical terms)" which in this case is code for "could be over 100 million light years".
What initial speed must the proton have for consistency with our observed final "dragged down" speed (assuming some ballpark galactic distance)? Alternatively, could a "clump" of particles have smashed into our detector? Whatever new physics winds up explaining this bonkers momentum, it'll be fascinating.
Kurzgesagt recently uploaded a video on hypothetical interplanetary weapons (https://www.youtube.com/watch?v=tybKnGZRwcU) which is the only reason I even know what the OMG particle is.
They described an "Ultra Relativistic Electron Beam" which theoretically could travel much closer to the speed of light than the OMG particle. I'm left wondering if the proximity to the pure speed of light has any bearing on total delivered energy when comparing different particles.
You’re often taught that E=mc² but the fuller equation is actually E²=m₀²c⁴+p²c² where p is the momentum, so at high speeds mass fades into irrelevance and the velocity component of momentum becomes dominant.
EDIT: in classical physics p=mv, so you might wonder what I'm banging on about when mass appears linearly on both sides, and in the first term is multiplied by c⁴ whereas in the second only by c². However, relativistic momentum is classical momentum adjusted by the Lorentz factor γ=1/√(1-[v/c]²), so it actually dominates in the limit as v tends to c.
EDIT 2: the latter term works out to be p²c² = m₀²v²c⁴/(c²-v²), and it grows without bound as v→c, so that's the pedantic answer. The first term is constant in v; the second term is anything but.
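To put rough numbers on that dominance, a sketch in eV units using the OMG-scale proton discussed upthread:

```python
import math

m0c2 = 9.38272e8     # proton rest energy, eV
E = 3.2e20           # OMG-scale total energy, eV

pc = math.sqrt(E**2 - m0c2**2)       # momentum term from E^2 = (m0 c^2)^2 + (pc)^2
gamma = E / m0c2                     # Lorentz factor, ~3.4e11
one_minus_beta = 1 / (2 * gamma**2)  # 1 - v/c in the gamma >> 1 limit

print(pc / E)          # 1.0: the rest-mass term is below double precision here
print(one_minus_beta)  # ~4.3e-24 shy of the speed of light
```

At these energies the rest-mass term is so small it vanishes entirely in 64-bit floating point.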
If the particle has zero rest mass, like a photon, it still has energy. E^2 = p^2 c^2 + m_0^2 c^4 captures that correctly, while E = mc^2 where 'm' is the relativistic mass, does not.
No, it does correctly capture that. The relativistic mass of a photon is not zero even though its rest mass is. However for a zero-mass particle you cannot use the Lorentz transformation to calculate relativistic mass, because v = c and so you would be dividing zero by zero.
You can correctly calculate the relativistic mass of a photon in other ways, such as from its momentum or its gravitational interaction, and doing so gives you the energy you'd expect from E = mc².
Oh heavens, whatever caused you to believe this? The momentum p of a photon is given by p=h/λ where h is the Planck constant and λ is the wavelength. Compton won a Nobel prize for this.
Of course p = h/λ. Thus the relativistic mass m is given by p/v = p/c = h/cλ. And a photon's energy is given by E = hc/λ. This is all in agreement with the relativistic mass you'd calculate from the photon energy E = mc².
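A quick numerical sanity check of that chain, for an arbitrary 500 nm photon (the wavelength is chosen purely for illustration):

```python
import math

h = 6.62607015e-34   # Planck constant, J*s
c = 2.99792458e8     # speed of light, m/s
lam = 500e-9         # 500 nm photon, chosen only for illustration

p = h / lam          # photon momentum, p = h/lambda
m_rel = p / c        # "relativistic mass", m = p/v with v = c
E = h * c / lam      # photon energy, E = hc/lambda

print(math.isclose(E, m_rel * c**2))   # True: E = mc^2 with relativistic mass
```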
Except that it does gravitate more. Avoiding the confusion that it doesn't is one of the conceptual benefits of relativistic mass. Gravitation is dependent on the stress-energy tensor, which contains... drum roll... relativistic mass, not rest mass!
For that matter, you've already noticed that the Lorentz transform becomes nonsensical as v → c when starting from rest mass, whereas the transform works just fine when using relativistic mass as the starting point, and nothing blows up, and you can avoid any confusion.
Neither is rest mass an actual "physical concept". When general relativity is considered, rest mass is ambiguous, or at least non-local.
Whether trendy or not, relativistic mass is however a very useful concept, because in many cases it is the relativistic mass, rather than the rest mass, that behaves the way we expect mass to behave; mainly that it is additive. For example, if you weighed a mirrored box full of bouncing photons, the scale would measure the mass of the contents as the sum of the relativistic masses, not the sum of the rest masses.
Rest mass has the awkward problem that it depends on which particles you consider to be "part of the system": A photon going left has no rest mass, a photon going right has no rest mass, but a system of two photons, one going left and the other right, does have a nonzero rest mass.
String theorists would even consider much of a particle's apparent rest mass to simply be another manifestation of relativistic mass, indicating the presence of an invisible periodic motion along a hidden dimension, resulting in kinetic energy even within a particle that seems to be "at rest".
I don't think it's a useful concept. It's deeply misleading. When you are talking about systems, mass is not a sum of the parts but a proxy for the total energy of the system. And when you are interested in the parts, just use the full equation linking energy, mass and momentum. Linking some kind of mass directly with velocity is one shortcut too far.
> When you are talking about systems then mass is not a sum of the parts but a proxy for total energy of the system
And which kind of mass is a proxy for the total energy of a system? That's right: relativistic mass, not rest mass. And this is the sum of its parts, as it should be, because energy is always conserved. Relativistic mass and energy are the exact same thing, up to a constant factor of c². Isn't energy a useful concept?
Furthermore the history of physics has often revealed what we thought of as atomic particles to actually have systems underneath. In the course of time, it seems more likely to me that we will do away with rest mass than with energy, in which case all mass will be relativistic.
> And which kind of mass is a proxy for the total energy of a system?
When considering systems there is only one mass. It's called mass. It's neither relativistic nor rest mass. It's just E/c² where E is the total energy of the system. It's not the sum of the parts because it depends on the interactions between the parts as well. The mass of a deuterium nucleus is not the sum of the relativistic masses of the proton and neutron; it's less, because they are bound.
Rest mass is a sort of useful concept because it is the bit of energy that's still there even when the object is at rest. Apparently it has something to do with the Higgs field, which I'm not super keen on because I don't understand it, but nothing seems to contradict this so far.
But talking about relativistic mass doesn't make much sense. Velocity gets involved in mass only because the equation for the energy of a particle has a momentum term. It doesn't create anything qualitatively different worth naming.

Relativistic mass is only talked about to avoid contradicting the intuition of mass as quantity of matter, which was itself introduced earlier to weed out the intuition of mass as weight that kids get from everyday language. It's really not a great idea, because it misleads people who try to think for themselves a bit further.

For example, what would happen if you tried to push a fast-moving object sideways? Would it be harder because of higher inertia due to higher relativistic mass? After all, pushing it in the direction it travels should be harder because of high relativistic mass; that's what people say when trying to explain why you can't accelerate matter to the speed of light. Suddenly relativistic mass becomes a directional property. When you use the correct equations in their full form, instead of trying to shortcut past momentum and vectors, all of this confusion disappears.
> Cosmic rays with energies of more than 100 EeV are rarely spotted — fewer than one of these particles arrives on each square kilometre of Earth each century.
How do they explain detecting such a particle at all? I would assume that the surface of the Earth has much less than 1 square kilometer worth of detectors, so on average they shouldn't have detected any 100 EeV particles since the invention of cosmic-ray detectors.
It uses photomultiplier tubes to record interactions of the cosmic rays with a good-sized chunk of atmosphere, plus detectors for the shower of particles when a cosmic ray interacts with the air. https://en.wikipedia.org/wiki/Telescope_Array_Project says it uses "a 762 km2 grid array with 1.2 km between each unit".
> The Telescope Array project ... is designed to observe air showers induced by ultra-high-energy cosmic ray using a combination of ground array and air-fluorescence techniques. ... When a cosmic ray passes through the Earth's atmosphere and triggers an air shower, the fluorescence telescopes measure the scintillation light generated as the shower passes through the gas of the atmosphere, while the array of scintillator surface detectors samples the footprint of the shower when it reaches the Earth's surface.
A very different and really neat concept that hasn't become real yet is JEM EUSO, a telescope that would be mounted on a space station, pointed at Earth, would detect air showers via fluorescence like Auger's fluorescence telescopes do on the ground. This could theoretically cover a much larger area than traditional CR observatories. https://en.wikipedia.org/wiki/JEM-EUSO
Fluorescence detectors are volumetric but only operate during moon-down at night, and in the case of both Auger and TA are paired with surface detectors. Auger uses water tanks lined with Tyvek and viewed by PMTs, detecting the electronic and muonic secondary-particle components of the EAS (extensive air shower). TA has scintillation detectors with PMTs: polyvinyl toluene sheets embedded in a steel casing. In both cases the stations have individual triggers, usually around 1 MIP (minimum ionizing particle), which trigger communication to a tower, which then polls nearby detectors for events greater than 1/3 MIP.
Or, at least, this is how it was set up 15 years ago. Both experiments have added new fluorescence and surface detectors since then.
A century is about 3 billion seconds, and Earth's surface area is about half-a-billion square kilometres, so one of these hits Earth about every 6+ seconds.
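The arithmetic, sketched out:

```python
import math

seconds_per_century = 100 * 365.25 * 24 * 3600   # ~3.16e9 s
earth_area_km2 = 4 * math.pi * 6371**2           # ~5.1e8 km^2

# flux above ~100 EeV: roughly 1 event per km^2 per century
events_per_second = earth_area_km2 / seconds_per_century
seconds_between_events = 1 / events_per_second
print(f"{seconds_between_events:.1f} s")   # ~6.2 s over the whole Earth
```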
Perhaps because the peer-reviewed paper was published only recently (23rd Nov). While not a perfect system (see the recent issues with the retraction of the superconductivity paper by Nature), I think it's not a bad thing that journalists cover such reports once they have undergone the peer-review process, however flawed it may be, and not just based on a university or research-group press release announcing results before peer review is complete.
Also, while the final result can be simply stated in a headline, performing the analysis that rules out all known sources of error is very, very difficult and can take years to validate.
A slower news cycle. In recent days US politics and the various wars have hit relatively steady states. So the news outlets are casting out for interesting science content, stuff that doesn't require math to understand.
If we're talking about a single proton, and looking back at the direction the proton came from, we see nothing; what are the chances the proton passed close enough to a black hole to deflect it a significant angle from its original path, eventually landing on earth?
I often think about this. But then I also often think about synchrotron and how they create EM radiation by accelerating electrons to high speed then bending them in magnetic fields.
When the electron bends in the field, it releases energy in the form of light, per "for every action there is an equal and opposite reaction". I can't figure out if it emits light because it's in a magnetic field, or because it's a charged particle.
Either way, when being bent by a large gravitational field, would a proton emit light and lose energy and slow down?
I think the electrons emit photons as a result of decelerating which is caused by the magnetic field (which can influence the charged electron).
And I’ve never actually pondered whether a proton changing its direction due to gravity (technically decelerating along its original path and accelerating along its new path) would emit photons. I don’t think it would because from its frame of reference it’s not accelerating… I need to do some homework, very interesting!
Edit: It seems a charged particle accelerating due to gravity will not emit photons.
Can someone explain why this particle couldn't have been gravitationally accelerated by taking a turn around a super-massive black hole just on the edge of the event horizon? Is there a limit to how much acceleration a black hole can give?
The mass of a single particle is extremely small (a proton has a mass of 1.6e-27 kg), the speed is very high and already very close to the speed of light at all but the lowest energies, so gravity "assists" are not really a way to gain energy.
These are charged particles (protons and fully ionized heavier nuclei), so electromagnetism is a much more efficient way to reach high energies.
All currently proposed mechanisms for cosmic ray acceleration involve turbulent plasmas and shock fronts that reflect particles magnetically, giving them a small bump in energy each time [1] or rotating magnetic fields [2] in sufficiently extreme environments.
Gravity just doesn't play a direct role for accelerating charged particles, it's much too weak compared to the electromagnetic force for a charged particle.
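A toy sketch of the first-order Fermi ("small bump per shock crossing") mechanism mentioned above. The gain per crossing and the escape probability here are made-up illustrative parameters, not measured values; the point is that the competition between them naturally produces a power-law spectrum like the one we observe:

```python
import math
import random

random.seed(42)

# Toy first-order Fermi acceleration (illustrative parameters only)
gain, p_esc = 0.1, 0.2

def final_energy(E0=1.0):
    """Multiply energy by (1 + gain) per shock crossing until the particle escapes."""
    E = E0
    while random.random() > p_esc:
        E *= 1 + gain
    return E

energies = [final_energy() for _ in range(100_000)]

# The gain/escape competition yields a power law N(>E) ~ E^-s,
# with s = -ln(1 - p_esc) / ln(1 + gain), ~2.3 for these parameters.
s = -math.log(1 - p_esc) / math.log(1 + gain)
frac_above_10 = sum(e > 10 for e in energies) / len(energies)
print(s, frac_above_10)
```

Most particles escape after a few crossings, but a steadily thinning tail gets multiplied up to arbitrarily high energies.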
So, would a pulsar do the trick? How far back do we simulate the movement of the Milky Way and all the galaxies around it to say for sure these come from an 'empty' place? I think our local supercluster looked quite a bit different, say, 1 billion years ago.
Thank you for the explanation it makes sense (I wonder how much time that particle 'experienced'), apart from that obvious mystery of where it came from
So, I'd say it's a little more complicated than this even.
There are orbits around black holes where particles can remain stable for long periods of time. Now, when you get a complex environment around said black hole, with lots of matter attempting to infall but not having the correct momentum, you get a lot of interaction between particles of different velocities; hence why we actually see black holes at all. You get matter crashing into each other, releasing gamma rays and such. From these interactions alone you can get gravitational particle acceleration.
It just gets more complicated from here as you add magnetic field interactions.
It's relatively easy to detect cosmic rays if you are happy with secondary muons that are produced in the atmosphere when a high energy cosmic ray is absorbed.
Detectors are relatively cheap and come out of the box for citizen science projects, this project is relatively well known:
http://www.cosmicwatch.lns.mit.edu/
For high energy cosmic rays, you need to observe many of these secondary particles that are produced during the absorption, with high temporal resolution (a few nanoseconds), to be able to tell anything about their properties.
There are essentially five or so detection principles for measuring cosmic rays at the ground, and many observatories combine multiple techniques.
First, you can observe the secondary particles that reach the ground, mainly muons and electrons. This is possible either using water tanks with photosensors inside detecting Cherenkov light or using scintillators like in the project I linked above.
Then you can also detect very short and faint light that is also emitted in the air shower. This also comes in two variants: Cherenkov light is emitted in a cone around the charged particles and results in a "light pool" of roughly 250m diameter on the ground. Fluorescence light is emitted in all directions. We build optical telescopes with extremely fast and sensitive cameras to detect Cherenkov or fluorescence light.
Last, there is also radio emission from air showers, which you can detect with antennas.
Auger in Argentina combines water tanks, scintillators and fluorescence telescopes and is investigating adding in radio antennas.
Telescope Array, the experiment which measured this event here, is using scintillators and fluorescence telescopes.
Pierre Auger Observatory is a project with scientific collaborators from all over the world. The initial construction costs were shared by 15 different countries. Ongoing work and upgrades are paid by a variety of international funding sources. To get some sense of that, see [1] and the Acknowledgements section of journal papers arising from work there, such as [2].
Most DSLRs are designed to withstand down to 0C. Most freezers are around -20C. You risk damage to the battery, condensation on interior components, contraction and mechanical damage of components with tight tolerances, etc.
It's really easy to build a basic visible cosmic ray detector - they're called cloud chambers. Here [1] is a random video, but you can find countless sources just searching for 'how to build a cloud chamber.'
If you haven't seen one of these before, it might sound more cool on paper than in practice. Cosmic rays are absurdly abundant, to the point that dozens to hundreds are passing through you per second. So a cosmic ray detector ends up turning more into something like a really neat art show than a search for a signal. Of course there is the search for 'the big one', but somehow it's not quite so romantic when you're getting plowed by these guys constantly.
I wonder if there is a common substance that could be scanned for historical interactions - like some salt flats.. that might preserve, like film, a history of energetic anomalies.
> an observed sharp enhancement of the production of cosmogenic isotopes by cosmic rays. It can be marked by a spike in the concentration of radioactive carbon isotope 14C in tree rings, as well as 10Be and 36Cl in ice cores, which are all independently dated.
320 EeV = 3.2e20 eV ~ 50 J ~ energy required to lift 5 kg by 1 m or energy sufficient to warm a cup of water from 0°C to 40°C
Edit: to launch 100 t of payload to LEO with Starship you would need ~60 million such particles, a tiny fraction of the number of protons in human DNA.
A cup of water is 0.2 kg. Water specific heat is 4200 J/kg*°C, so it takes about 840 J to heat a cup of water by 1°C. 50 J is not going to make any noticeable difference.
OTOH a tennis ball is about 0.06 kg, so e = m*v^2/2 gives about 41 m/s, a good serve that you will definitely feel if it hits you.
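Both checks in one place, using the round masses quoted in the comments:

```python
import math

E = 50.0   # J, the rough kinetic energy quoted for these particles

# Cup of water: 0.2 kg, specific heat ~4186 J/(kg*degC)
delta_T = E / (0.2 * 4186)
print(f"{delta_T:.2f} degC")   # ~0.06 degC, nowhere near 40 degC

# Tennis ball: ~0.06 kg, classical kinetic energy e = m v^2 / 2
v = math.sqrt(2 * E / 0.06)
print(f"{v:.0f} m/s")          # ~41 m/s, a solid serve
```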
There’s a lot wrong here, starting with the energy of the newly discovered particle. The particle in question had an energy of 240 EeV, which comes to 38.45 J.
By accident I used the energy of OMG particle which is also mentioned in the article, but it is not completely wrong — it’s the same order of magnitude as the new one, so it still can be used to understand how much is that.
That would only get you to 3 km: 20,000x to scale 5 kg up to 100 t, times 3,000x to scale 1 m up to 3 km. You would need another ~100x to reach orbital height. But that's trivial compared to the energy you need to get up to orbital speed. And then there's gravity drag.
I wonder what impact such a particle would have on, e.g., a human body. Would it just go through, or boil some cells?
Could someone unlucky enough to be hit by such a particle just drop dead?
Probably pass through while generating secondary radiation also powerful enough to leave the body. Nothing you’d notice. Might cause cancer but so could other cosmic rays or radioactive rocks you encounter day to day.
The particle won't dissipate all its energy inside the body, so likely it won't even hurt you noticeably. If it hit your eye, though, you'd likely see a very bright flash.
50 J is about as much as an airsoft gun pellet carries; not going to hurt you even if it manages to dump all its energy on your body.
Most impressive about nuclear physics is the knowledge that classic rigid matter is mostly emptiness: the atomic nucleus is only about 1/100,000 of the atom's diameter.
And this emptiness is so vast that to make a safety blanket against cosmic particles for interstellar flights (at the current level of radiological medicine), you need a few METERS of solid shield, or a hundred KILOMETERS of atmosphere.
A second important consideration: nuclear fission chain reactions are very slow compared to the speed of light. This is why nuclear bombs are even possible; if cores exploded instantly, all the nuclear fuel would just fly apart without reacting.
So, in all probability, if a SINGLE high-energy particle intersects a human body, nothing will happen; the particle will just fly right through without any visible effect.
And even if the particle were so fortunate as to directly hit some nucleus in the body, by conservation of momentum that nucleus would also acquire very high speed and fly out of the body before fission occurred (when high-energy particles hit nuclei in the air at high altitude, there is some time for fission, so fission traces are usually detected at the surface).
You wouldn't notice it. The energy is sufficient, but the momentum is too small. Energy goes up with square of velocity and momentum just linearly, so this particle gets more of its energy from velocity and less from mass compared to a macroscopic object. The result is lowish momentum.
It probably hits a water molecule and the heat is quickly dissipated. You don't recoil because the momentum is small.
It might kill a few cells, but far smaller than the ordinary cycle of life. So any effect is swamped.
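The momentum argument in numbers (the mosquito comparison is my own illustrative yardstick, not from the thread):

```python
E = 50.0            # J, the particle's energy
c = 2.998e8         # m/s
p = E / c           # ultra-relativistic: p ~ E/c
print(p)            # ~1.7e-7 kg*m/s

# Yardstick: a 2.5 mg mosquito drifting at 0.5 m/s
p_mosquito = 2.5e-6 * 0.5
print(p / p_mosquito)   # well under 1: less momentum than the mosquito
```

Fifty joules, delivered with less momentum than a drifting mosquito; that is why there is no recoil to feel.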
Okay so build a hundred km² (10 km per side) observatory and you’ve got yourself a ‘telescope’ to the same degree that super-kamiokande is sometimes described as being a “neutrino telescope”.
That's exactly the kind of detector that observed this event. It's been done. Multiple observatories exist that have effective observation areas of hundreds to thousands of square kilometers. Pierre Auger Observatory in Argentina is more than 3000 km².
From the link: "The scientists nicknamed the particle ‘Amaterasu’, after a Japanese Sun goddess."
Shall we conclude the Japanese Sun goddess is here to stay too?
Various devout Christians consider "Oh my God" to be contrary to the third commandment: Exodus 20:7 “You shall not take the name of the LORD your God in vain, for the LORD will not leave him unpunished who takes His name in vain.”
Madalyn Murray O'Hair, "America's Most Hated Woman" due to her advocacy of atheism, would say "oh my God" because it triggers Christians. https://youtu.be/pu5cqoSbeJA?t=832 .