Over the past few years, something killed over 5 billion giant starfish, and “science” has no idea what did it. Without the starfish, the sea urchin population exploded, turning kelp forests into deserts.
I love the LHC. But we need to seriously grow the science pie and prioritize the science of ecosystem collapse and management. There simply aren’t enough trained scientists.
[..] we do not have a fixed pot of money that we sit down and say, “What is the best way we can spend this on research?”
It’s not like there’s a fixed number of dollars and we say, “Okay, biology gets so many, chemistry gets so many, and particle physics gets so many.” This is a lesson that has been beaten home over and over again: when there’s an expensive science project that people campaign against on the theory that the money could be better spent elsewhere, and the project gets cancelled, guess what, the money does not get spent on science at all, ’cause there’s no rule that the amount of money spent on science has to be fixed.
If it were a competition between English literature and Greek philosophy, you can be absolutely certain that the fixed amount would be a tiny fraction of what it is now.
Curing cancer, infinite amounts of cheap energy, staying competitive, winning against China both economically and militarily, you really think the size of the pie would be the same without these things?
Because the costs of not adapting to climate-related environmental change will be astronomical. I mean, would you say climate adaptation is on par with the importance and urgency of curing cancer? The arguments would be interesting.
Ultimately there is a fixed amount of effort people can expend. However, the science budget isn't fixed, in the sense that projects like the LHC can generate the excitement needed to enlarge the overall pie for science (taking away from something else, of course, though that could be time spent at home unemployed).
Most science is done through small incremental advances that add up over time - one of the problems with that is it's rather hard for politicians to see or understand that.
Big projects capture the attention. Though obviously big projects can also, if they fail, damage the reputation of experimental science as a whole.
So yes, the LHC probably isn't the best use of money, and there are more pressing immediate concerns. On the other hand, everyone has heard of it; it attracts funding and people to science (and does good science, of course).
This is a huge problem when the immediate extinction threat problems are global.
See the SSC as an example. Note also that the NIH is a massive fund ($25+ billion in grants per year, all bio). The NSF is a better example because they have to fund many fields. The DOE has fixed funding for nuclear, plus soft funds for scientists in many fields (they funded the human genome long before the NIH).
You want to remain conservative, but also want everyone else to be more risk tolerant?
It's always easy to talk about spending other peoples money isn't it? ;-)
Either we have a breakthrough that shakes physics and leads to brand new discoveries that further shake engineering, or it is a waste of time. Between ~1870 and ~1950 we had a fireworks of discoveries that were shaking our view of the world. And explained transistors.
Today? Nothing special except for a bunch of people and PhD students.
We are just starting to scratch the surface of biology - this is where we need to invest. Solid state physics could be another one. But not particle physics with their crazy energies that are completely unreachable outside of an accelerator.
For context, I have a PhD done at CERN. If I were to choose again, I would go for bioinformatics or biophysics.
Money was not spent in order to confirm that the Higgs boson existed: rather, it was invested in order to improve technology to such a degree that we could have an answer. That is what people paid for, and that is what humanity will enjoy.
Whether the Standard Model is validated or challenged, it matters as much as a grain of sand on top of the Himalayas.
Thank you for the kind comment.
> it was invested in order to improve technology to such a degree that we could have an answer. That is what people paid for, and that is what humanity will enjoy.
What part exactly is humanity going to enjoy? The 8 T magnet? Where do you envision using the technology to drive particles?
Or, to be fair, which technology from the LHC was ever used for humanity?
Anything you do in biology today can have a direct effect OTOH.
This experience, coupled with the US government’s decision to not build an equivalent super collider in the 90s, led him to switch to computer science, where he had a bit of a “head start” due to his experience at CERN.
He always said the data analysis, measurement, and collection stuff was more advanced than anything else in the world at the time. The original “big data”. But in 2022, this is less of a distinguishing feature in his opinion. Also, Bell Labs probably faced similar issues.
I asked, and he agrees that he’d do molecular biology if he were starting over today. So you’re not at all crazy!
I made friends with some people working in the computer center (the famous 513 at the time; it may have changed by now). I discovered UNIX, then Linux, and I was sold. During my PhD I started working in IT for industry, and I've never stopped since.
CERN had an outstanding computer system; they invented the "grid". As for HTTP, it just happened to be invented at CERN while Tim Berners-Lee was working there (there was nothing special related to CERN). The hardware-software interface was bleeding-edge technology (built into the detectors), and the data bandwidth capacities were ahead of their time.
All of this, however, is not particle physics, and if CERN did not exist there would not be many "holes" in today's computers (as opposed to, say, DARPA or Linus).
Actually, particle accelerators somewhat funded the early investment in high-field superconducting magnets, which might now yield improvements in fusion energy Q factors due to the strong dependence on field intensity.
Of course it has to be useful. Based on that we build our world (through engineering), and if we work on something that cannot be used then it does not make sense. I do not see any use for quantum foam or quasars any time soon.
As for magnets: this certainly helps, but has CERN made any breakthroughs that were used afterwards? At some point they held the world record for magnetic field density. There is no practical use for that, neither in MRIs (which require a lower field and had already been in use for many years) nor in tokamaks (where the homogeneity of the field is paramount, on larger scales than the ones at CERN).
I am not saying that the engineering work done at CERN is useless; it is just that the money poured in there goes primarily into some "whose is bigger" contest that has exactly zero chance of being used.
No, it hasn't. "Science" means "knowledge", and the only things you can know are the things how they actually are.
A PhD from CERN should not confuse science with engineering.
Yes, but investing MM€ to know something that has no use does not make sense if there is much more important knowledge competing for the funds. I guess that knowing how to cure MS is more important than discovering a four-quark particle, right? In an ideal world we could do everything, but we have to choose wisely because the amount of funding is limited.
> A PhD from CERN should not confuse science with engineering.
Can you please tell me where I confused them?
... which started with a simple observation. 200 years ago, in 1820, Ørsted was giving a lecture when he noticed that a current-carrying wire deflects a compass needle. Of course, he had no idea what that would lead to, nor did anyone else. It was a completely 'pure', 'unusable' observation. But, as a student of Kant (what a waste of time!) he shared it in a four-page pamphlet.
It just doesn't pay enough. Society doesn't place enough value on these roles.
One example is a female friend of mine with two Master's degrees who's currently not getting paid a lot in a government job. She would love to do a PhD and start doing research in the areas she's been educated and worked in, but she hardly gets invited to interviews, and when she does, she never seems to get the job, because she lacks research experience. The longer she works outside of research, the less of a chance she has to actually get hired for a research position.
I understand that there have to be high barriers to entry, because quality needs to be high, but I feel like the balance currently might not be right.
There are a lot of other scientists who are just as skilled, may have just as good ideas, and so on. They view their work as work, enjoyable for some or just a job for others (though these are very few, as by this point they just find something less stressful that pays).
So it's not just that the lack of money doesn't help: it basically shapes the entire labor pool of scientists, selecting for the most passionate, the ones who might not be making much over a small restaurant manager when you normalize their time spent to comp. Skilled scientists who want a life and see other opportunities tend to leave science, as it's high risk and low reward, and apply their skill elsewhere. It's a loss to science.
If you look at CS, the research community is almost starving, as most leave for big tech or others that can comp 3-5x what academia can provide. I suspect big pharma is hit the same way.
> Most scientists don't do it for the money though
Naturally, a field that doesn't pay well will fail to attract people who are motivated by money.
Funding needs to come first then extra scientists if budgets permit. The problem in immunology as far as I can see from a close onlooker perspective is there is less and less funding. Staff get by on meagre wages, hoping to hold onto their role as budgets get cut and grants get smaller.
Very sad state of affairs.
The US federal government seems to have a largely arbitrary budget. No real experience or rational principle restrains it. We have never seen a clear case of negative consequences related to spending too much, so no one is quite sure how much might be too much. Sometimes a politician sees political advantage in grandstanding about fiscal responsibility, but seconds later, that same politician will be found funneling some arbitrary number of billions of $$$ in pork to their district.
I suspect the truth is that politicians mostly just don't care that much about science because they aren't smart enough to understand its value, or decent enough to care.
We could and should 10x the science budget, and it would have vastly more impact than almost everything else the government spends money on. But we don't because politicians see it as some amusement to fritter a limited amount of money on. Occasionally Republicans will publish a list of ridiculous scientific projects and no one wants to be caught in the crosshairs of that -- even though tens of billions are probably being simultaneously spent on murdering people in a country you've never heard of, or flushing thousands of tons of soybeans down a toilet.
Defense-related science gets an exception to this, which is why a big war is often also a big leap forward for science.
You're right, however, that nowadays it's structural that scientists request funding through defense; of course it needs that cover.
Most politicians understand that humanity is never going to leave earth and so all of this stuff is basically pointless to the actual business of governing. So we will miss out on the next MRI or microwave because we missed some new technology... so what? How does that make their lives, personally, worse in any practical way?
Humanity is going to melt down in the pretty near future here... if the Russia stuff doesn't escalate, and the US doesn't slide into fascism, there will be yet another wave of conflicts tomorrow due to extremity from climate change or something else. Water is going to become extremely scarce, habitable lands are going to become uninhabitable and undesirable tundra is going to become extremely desirable, and countries will fight for ownership of the reshuffled deck.
Humanity has passed the peak of this enlightenment cycle and is heading for a new dark age, maybe the Forever Dark Age, given the depletion of most of the accessible reserves of energy and critical materials. The best-case scenario is that we wind down to some steady-state existence with some level of mechanization/etc, but the exponential-growth thing mathematically cannot continue indefinitely. The Earth does have physical limits, and just because Malthus was wrong about it in 1798 doesn't mean you can grow at an exponential rate in a finite system forever.
Who cares about space? We're not going there. And their job is making themselves comfy and keeping things going mostly straight in the meantime. Science really doesn't matter in that context as long as someone else isn't drastically ahead of you such that it produces a military advantage, they don't care about the absolute advancement, only the relative to everyone else.
I'm not saying this is a good thing, I'm saying this is how it is.
It's not continuing indefinitely. Since the 1960s, women have been drastically curtailing their reproduction when given education and access to contraceptives. Fertility is above replacement only in Sub-Saharan Africa and a couple other places. Global fertility is projected to drop below replacement in this decade, leading to a population peak and decline a few decades later.
People are constantly predicting doomsday because they think humanity deserves it. Maybe we do, but by my reckoning, there is no guarantee of our imminent demise just yet.
Care to name any inventions we owe to... the Spanish-American war?
The Krag didn't have spitzer-point bullets so it had to go, despite poor tactics really being more at play than the rifles themselves. And the Krag was banned from the service-rifle category of the influential National Match shooting competition after some humiliating upsets showed this.
We've still barely scratched the surface of ecological sciences
What changed to make a slew of new discoveries possible? Is it pure chance, like Bitcoin mining? If so, what's the chance of discovering nothing for years and then, like London buses, three all coming along at once?

What are the implications of a new "particle zoo"? Can I do anything with these, like build new atoms, or use them to detect something?
I think the goal of this work is to understand the nature of the strong force. Quantum chromodynamics (QCD) is pretty difficult as far as quantum field theories go; it's strongly coupled, meaning that making first-principles predictions of what to expect is really tough. It's a huge computational effort being run on some of the biggest computers on the planet (lattice QCD).
We observe that all the hadrons in experiment are "colour singlets" meaning that the colour charge of QCD is hidden. These are usually three-quark states (protons, neutrons, etc) or quark-antiquark states (pions, kaons, etc). There are many other ways of making "colour singlets". For example, these tetra and pentaquark combinations. There are also "hybrids" made of a gluon and some combination of quarks. There is some evidence on both experimental and theoretical sides for at least a few of these hybrids. Glueballs are also possible, states made entirely of gluons, but there is only really theoretical evidence for these so far in specific limits. We just don't know if they exist in reality.
Everything is made of this stuff. Most of the mass around us comes from the strong interactions. It's important to understand it.
I think people are a bit spoiled by the Higgs leak/announcement/discovery timeline. I'm sure those in the know have known about this discovery for some time but, like you said, it takes some time to gather enough data to be confident (and to qualify as the mathematical standard set for "discovery").
"Sigma" here refers to standard deviations off the Gaussian normal mean. Zero means completely random. In psychology they publish at 2 sigma (95%), which means 20:1 odds against a spurious result, and they publish a lot of spurious results because you can generate an unlimited number of hypotheses. In physics, things are considered more deterministic, and an experiment doesn't need to recruit undergrads to be data points, so you run your LHC for a few more months and avoid wasting people's attention.
However, the approach taken by CERN is of course right. They find a result at a certain significance level and then collect more data to verify the result. As long as there aren't thousands of simultaneous verifications running, this approach is sound. Obviously yes, physicists know what they're doing.
Having said that, please don't read this comment as me approving of frequentist statistics. Bayesian methods or cross-validation are way easier to interpret where possible.
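For anyone curious how the sigma levels being discussed map to probabilities, here's a back-of-the-envelope sketch (the 2-sigma and 5-sigma thresholds are the standard conventions; the code itself is just an illustration):

```python
from math import erfc, sqrt

def p_value(sigma, one_sided=True):
    """Probability that a Gaussian fluctuation lands at least
    `sigma` standard deviations away from the mean."""
    p = 0.5 * erfc(sigma / sqrt(2))
    return p if one_sided else 2 * p

# 2 sigma, the usual psychology publication threshold (two-sided)
print(p_value(2, one_sided=False))  # ~0.046, i.e. roughly 1 in 20
# 5 sigma, the particle-physics "discovery" threshold (one-sided)
print(p_value(5))                   # ~2.9e-7, roughly 1 in 3.5 million
```

That factor of about ten thousand between the two thresholds is why the LHC keeps running for months after a hint first appears.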
But more idiot's questions if you have any thoughts....
My understanding was that particle accelerators were being used to try and deconstruct matter, to do (for want of a better word) "fission" by smashing things together and seeing what smaller bits came out, by analogy to mass spectrometry.

What seems to be going on now is that we're trying to make new particles. Have we switched to a sort of "fusion", to see if smashing things together will get them to stick in bigger configurations?

Have all the most fundamental bits (quarks?) been found now? Can we prove that those are irreducible?
That energy forms into a bunch of particles, each of which will then decay into less esoteric particles.
We have no proof (and it's probably impossible to do so), that anything we've found is fundamental.
This is a good statement.
I've seen many online state that quarks are fundamental.
There is no way we can make such a definitive statement.
Quarks may be not be fundamental.
We will never, in the lifetime of anybody who guesses we ever existed, be able to build an accelerator powerful enough to check whether it is right about gravity. So, they potter and try to show this or that family of variations (among 10^500 imagined) does or doesn't contradict details of the Standard Model we have most confidence in.
But using current accelerator technology it would require an accelerator many times the size of the earth, _many_.
I used to work at a particle accelerator, part time, when I was in college. Fun fact: I once confirmed Einstein's photoelectric effect using a high-energy x-ray beam, a copper target, and high voltage.
It's not possible to do better than that, though, because you can't prove a negative statement like "there are no more fundamental particles". Even if we understood the laws of physics completely, they could always change on us. It's all up to the guy who owns the universe simulator.
I sense that the next step in physics and ontology can only happen when we have created a new linguistic approach to capture the 'fundamental' idea here.
All other particles would be derivative, their existence caused by some base rules.
Physics will only be complete when fully explained in terms of information, regardless of the physical reality of information. The two aspects of explanation are 1. the rules and 2. what are the bits? Perhaps both those things are one.
Perhaps bits are not fundamental and quaternary bits are, but that would still implicate information as fundamental.
this entire discussion is fully outside of my knowledge wheelhouse but why should we believe that the universe is anything less than infinitely fractal at the micro scale? like you said, how would we even know if something is fundamental?
And what basis does that claim rest on?
Physics has had the other problem for a while now: they know the current theory is wrong, but they can't find any evidence to disprove it, and it's consuming generations of scientists and particle accelerators in the attempt.
Follow-up question. Why don't quark anti-quark combinations self annihilate?
I've been trying to understand this.
In the pentaquark, charm-anticharm annihilation can and will happen. The timescale for charm-anticharm annihilation is usually slow relative to light- and strange-quark hadronic interactions, though, in part because the strength of the strong interaction decreases at higher energies, and the charm quark is more massive, so the relevant energy scale for the decay is higher.
One charm-anticharm resonance, the J/psi(3097), is very long-lived even though the quarks can annihilate. In many theoretical models of these things, it's often treated as a stable particle.
When particles are collided and the result measured, there's probably lots of noise in the data. In a single picture, a single pixel (datapoint) tells you nothing. But capture enough results, and you can begin to filter the noise out, revealing patterns underneath.
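As a toy illustration of that idea (all numbers here are invented for the sketch): a small excess in one "resonance" bin is invisible in a handful of events, but stands out once enough events are accumulated.

```python
import random

random.seed(0)  # reproducible toy data

def measure_spectrum(n_events, n_bins=50, signal_bin=20, signal_rate=0.02):
    """Each event lands in a random bin (background noise), except a
    small fraction that always lands in one 'resonance' bin (signal)."""
    counts = [0] * n_bins
    for _ in range(n_events):
        if random.random() < signal_rate:
            counts[signal_bin] += 1
        else:
            counts[random.randrange(n_bins)] += 1
    return counts

few = measure_spectrum(100)         # signal bin is lost in the noise
many = measure_spectrum(1_000_000)  # signal bin clearly stands out
print(max(range(50), key=lambda i: many[i]))  # -> 20, the signal bin
```

With a million events the signal bin holds roughly double the counts of any background bin, which is the "pattern underneath" emerging.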
First, there has been a pretty constant stream of these kinds of discoveries over the last few years. Most of them were even discussed here [1, 2, 3 and more].
Second, what you need to find these rare events are two things: enough statistics and someone actually doing a specialized analysis looking for them.
Enough statistics have been accumulating over the past years, and now someone stepped forward, developed the analysis, tested it on a small-scale dataset, got it approved by the Collaboration, and ran it on the large dataset, and then chances are you don't just find one but three very similar things.
Disclaimer: I am not part of LHCb and have no insight into the specific analysis behind this discovery; I'm just sitting on the same floor as the LHCb guys and know the general procedure.
Third, as noted in a sister comment, these quark states are expected from the standard model, they are just very rare and very short lived, so hard to find. Nothing new normally refers to "no physics found incompatible with the standard model".
These newly discovered particles are composed of multiple quarks and were found (at least up to the measurement precision) to follow the expected rules of how you can combine the elementary particles into composite particles (baryons in this case).
LHCb actually discovered a bunch of those in the last couple of years.
But these are expected from the standard model, just rare and thus not experimentally confirmed before.
"Nothing new" for LHC means no new physics. And if these quark states don't happen to be much more frequent or much more rare than predicted, there is no new physics here.
Means we are in a pretty tricky timeline.
They're probably interesting to study on their own, but the engineering instinct is to want to build something out of them, or use them as tools, which seems pretty hard if they disintegrate in a quintillionth of a second!
If a particle can decay into a lighter set of particles and still obey all of the conservation principles, they will. The heavier they are, the more they're going to decay and the more "options" they have to decay. An electron isn't going to do anything because there is nothing lighter than an electron that still carries charge, etc. Something much heavier, like a free neutron, will fall apart into a proton, an electron, and an anti-electron neutrino.
These particles have options galore as to what they can fall apart into being, and so they do, and with great haste.
For the sake of my own edification, I'd like to follow this up with a few seemingly dumb questions, if you don't mind:
Is it the case that a given particle is trying to settle into the "lowest energy state" possible? I am not using physics terms here; more like conceptually, are these particles, due to the number of options available to them, decaying into the lightest stable variant allowed by the laws of physics? If that is the case, then could we perhaps find ways to engineer structures within which these particles last a whole lot longer than they should (on a human timescale)? And what is stopping us from doing that? Is it the energy cost associated with such a structure/device, or is there a more fundamental reason we can't do that?
> Is it the case that a given particle is trying to settle into a "lowest energy state" possible?
Not exactly. Energy is conserved during these decays. In fact, energy is conserved during all physical processes, so the "lowest energy state possible" is a little bit of a white lie. What makes it a white lie is that it is a very good approximation to the truth for thermodynamic systems, i.e. systems consisting of large numbers of particles. But for quantum systems, it is no longer a good approximation. In quantum systems, what happens is that you have a wave function that describes all of the possible states a system can be in. The more mass the system contains, the more possible states there are in its wave function, and so the more likely it is to end up in some state other than the one it started out in.
It is even possible for the process of decay to reverse itself, and for the constituent particles to come back together and reconstruct the original, but for that to happen all the constituents have to be brought back together, so as a practical matter this never happens spontaneously in nature. In fact, that is the whole reason for building the LHC -- to make particles (protons) come together and make high-mass systems which then decay in interesting ways.
> are these particles, due to the number of options available to them, decaying into the lightest stable variant allowed by the laws of physics?
Not the lightest stable variant, just to one of the possibilities described by that particle's wave function. These will always be subject to the constraints of conservation laws, so the decay products will always be lighter than the original. But which particular set of possible decay products is actually produced in any given decay event is fundamentally random.
> if that is the case, then could we perhaps find ways to engineer structures within which these particles last for a whole lot longer than they should (on a human timescale)?
No. The wave functions for particles are fixed by nature. They are what give particles their identities. They cannot be engineered. The only thing that we can engineer is the arrangement of particles. Particles are like Lego bricks. You can stick them together in lots of different ways, but you can't change the shape of a given brick. Sometimes quantum Lego bricks fall apart spontaneously, but there is no way to control that.
That is correct. "Particle" is another one of those "white lies."
(Wouldn’t this be example of a structure that prevents decaying?)
Both fusion and fission release energy when they occur, which seems somewhat weird to me. Is it the case that the reason a stable atom has less energy than the sum of its parts (as you pointed out) is because it gave off some energy during the fusion process?
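For concreteness, the textbook deuterium-tritium reaction shows this bookkeeping: the products weigh less than the reactants, and the difference leaves as kinetic energy (the masses below are standard published values, rounded; the sketch just computes the mass defect):

```python
# Rest masses in atomic mass units (standard published values, rounded)
m_D, m_T = 2.014102, 3.016049    # deuterium, tritium
m_He4, m_n = 4.002602, 1.008665  # helium-4, neutron
U_TO_MEV = 931.494               # energy equivalent of 1 u in MeV

# D + T -> He-4 + n: the bound helium nucleus is lighter than its
# inputs, and the missing mass is carried off as kinetic energy.
mass_defect = (m_D + m_T) - (m_He4 + m_n)
energy = mass_defect * U_TO_MEV
print(f"{energy:.1f} MeV released")  # ~17.6 MeV per D-T fusion
```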
Be careful to remember your conservation of baryon number when listing your options!
Most Particles Decay — But Why?
Most Particles Decay — Yet Some Don’t!
Neutron Stability in Atomic Nuclei
It's more accurate to think of the energy "spreading out" (remember that mass is a form of energy too, since E=mc^2). The energy can rearrange (subject to conservation laws), between being one massive particle, or several lighter ones (in fact there's a superposition of possibilities, because quantum).
In principle the probability of switching back-and-forth is equal, e.g. the probability of particle A decaying into a B+C pair, is identical to the probability of a B+C collision producing an A. However, most of the directions those light particles can take will result in them flying apart rather than colliding; that spreads out the energy, so it can no longer switch back into the massive particle configuration.
Note that this is essentially the first and second laws of thermodynamics (energy is conserved, and concentrations tend to "spread out" over time)
Sometimes they will have intermediates, which then decay, and then those products decay, and so on. That's quite common. Eventually they just ... fall apart. The more options, the faster. The greater the energy stepdown, the faster, by which I mean "can it release a gamma? Or fall apart into some much smaller things?"
However, it is independent of "nearby" structure, where nearby is any distance larger than the nucleus. So, no, we cannot contain these particles within anything to prevent their decay, it is like trying to build a bouncy castle around a hand grenade in hopes that it won't go off.
Note that there is an apparent delay in decay, from our perspective, when particles are moving very fast, like a relativistic muon lasting longer (although still a very brief period of time by our standards) than expected, simply due to special relativity. But here this also would not help.
Things fall apart, the center cannot hold, and so on.
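The "more options, the faster" intuition above has a precise form: each open decay channel contributes a partial width, the widths add, and the lifetime is ħ divided by the total width. A minimal sketch (the widths used below are made-up toy numbers):

```python
HBAR = 6.582e-22  # reduced Planck constant in MeV*s

def lifetime(partial_widths_mev):
    """Total decay rate is the sum over all open channels, so the
    lifetime is tau = hbar / sum(Gamma_i): more channels, shorter life."""
    return HBAR / sum(partial_widths_mev)

# Toy numbers: opening a second, equally strong channel halves the lifetime.
one_channel = lifetime([1.0])        # ~6.6e-22 s
two_channels = lifetime([1.0, 1.0])  # ~3.3e-22 s
print(one_channel, two_channels)
```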
That puts it in perspective. Thanks for the reply and for taking the time!
>>trying to settle into a "lowest energy state" possible?
What if it's actually the reverse: it's attempting, and succeeding, to be the most it can be given the eddy of forces around it. The particle is "becoming", not "falling apart".
Some of the heavy elements assembled in colliders are described as decaying so quickly that one side of the nucleus is coming together even as the other side is disintegrating, a sort of brief wave of existence traveling at nearly the speed of light across this thing that has been forced together and wants to fly apart.
Also note that they're not rare and there's a fair bit of neat science behind that too.
> About 10,000 muons reach every square meter of the earth's surface a minute
(from https://www.scientificamerican.com/article/muons-for-peace/ ).
There's also neat stuff with time dilation and muons ( http://hyperphysics.phy-astr.gsu.edu/hbase/Relativ/muon.html ) - there should be far fewer observed muons at the surface if muons didn't experience time dilation from their relativistic speeds.
> The historical experiment upon which the model muon experiment is based was performed by Rossi and Hall in 1941. They measured the flux of muons at a location on Mt Washington in New Hampshire at about 2000 m altitude and also at the base of the mountain. They found the ratio of the muon flux was 1.4, whereas the ratio should have been about 22 even if the muons were traveling at the speed of light, using the muon half-life of 1.56 microseconds. When the time dilation relationship was applied, the result could be explained if the muons were traveling at 0.994 c.
(note: mean lifetime and half-life are different numbers)
The thing here is that 2.2 μs is slow, but even with something that is that fast (on a human scale), there's a lot of neat science that can be done with them. They've even made muonic atoms (where the electron is replaced by a muon) https://en.wikipedia.org/wiki/Exotic_atom ... and that leads to possibilities on lowering fusion temperature ( https://en.wikipedia.org/wiki/Muon-catalyzed_fusion ) because the muon is much closer to the nucleus in its ground state.
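The Rossi-Hall numbers quoted above check out with a few lines of arithmetic (height, speed, and half-life are taken from the quote; the exact ratios depend slightly on the assumed values):

```python
from math import sqrt

c = 2.998e8       # speed of light, m/s
h = 2000.0        # altitude of the mountain station, m
v = 0.994 * c     # muon speed inferred by Rossi and Hall
t_half = 1.56e-6  # muon half-life at rest, s

t_flight = h / v                    # ~6.7 microseconds in our frame
naive = 2 ** (t_flight / t_half)    # survival ratio ignoring relativity
gamma = 1 / sqrt(1 - (v / c) ** 2)  # Lorentz factor, ~9.1
dilated = 2 ** (t_flight / (gamma * t_half))

print(f"flux ratio without time dilation: ~{naive:.0f}")   # ~20
print(f"flux ratio with time dilation:    ~{dilated:.1f}") # ~1.4
```

The dilated prediction lands right on the measured ratio of 1.4, while the naive one is off by more than a factor of ten.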
(Or at least, that's the magnitude of a Higgs boson decay, about 160 yoctoseconds.)
To save some googling: about 10^-22 seconds for a Higgs boson decay. Whereas a nanosecond, one tick of a 1 GHz clock, is 10^-9 seconds.
Lone neutrons are unstable; they decay in about 15 minutes to lighter particles. They have only about an MeV of mass difference to the stable final state (a proton + electron + anti-neutrino).
For comparison, these particles are 2000+ MeV above their ground states so they decay pretty quickly.
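For reference, the neutron's decay energy budget follows directly from the standard rest masses (values rounded):

```python
# Rest masses in MeV/c^2 (standard published values, rounded)
m_n, m_p, m_e = 939.565, 938.272, 0.511  # neutron, proton, electron

# Energy released in n -> p + e + anti-neutrino (neutrino mass ~ 0)
q_value = m_n - (m_p + m_e)
print(f"{q_value:.3f} MeV")  # 0.782 MeV, shared among the decay products
```

That tiny energy release (relative to the particle masses involved) is part of why the free neutron's 15-minute lifetime is so extraordinarily long by particle standards.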
(It might not even exist, after all…)
> There exists no formal definition of a WIMP, but broadly, a WIMP is a new elementary particle which interacts via gravity and any other force (or forces), potentially not part of the Standard Model itself, which is as weak as or weaker than the weak nuclear force, but also non-vanishing in its strength.
That's not regular matter.
It's MACHOs that are made up of regular matter (well, brown dwarfs and black holes).
There are also theories that put an undetected form of neutrino as dark matter which would be a bit more regular.
> It's MACHOs that are made up of regular matter (well, brown dwarfs and black holes).
I mean...isn't this "technically correct" on a level that's beyond even the usual extremes of pedantry? Or maybe I'm missing something?
There's no cheese in my fridge, only blocks of cheese that are made up of cheese...
A MACHO is a low-energy star or whatever that would explain the apparent presence of dark matter without actually requiring anything exotic like WIMPs. The idea is that these objects are (relatively) massive, numerous, and so low-energy that they are hard to detect and their combined mass would theoretically explain the effects we currently attribute to dark matter.
Or in other words, a WIMP would be like claiming that your fridge is disintegrating your cheese, and a MACHO would be the kid raiding the fridge for cheese at midnight when you're asleep.
Claiming such a particle would still be viewed as matter, just like all the rest of the matter that goes into the stress-energy tensor, is where the pedantry in this thread started. The original statement is pretty clear in its intent. The pedantic reading that followed makes "normal" matter just all matter by definition, so "normal" becomes redundant since there can't be abnormal matter. That clearly isn't what the first comment intended, since they actually meant something by "normal".
This is incorrect — the coupling with the Higgs field is responsible for the rest mass of electrons and isolated quarks.
> and even less to do with the mass of hadrons
Yes, this is true — the contribution of the separate masses of the constituent quarks to the mass of hadrons is small, like you write.
Edit: there is an excellent in-depth (but math-light) explanation about the Higgs mechanism by Leonard Susskind. I highly recommend it if you are interested, it lasts about 1h (plus some Q&A) and is extremely approachable, while being presented by an established authority in the field.
To use a more relatable analogy, it's a bit like using quantum mechanics to build a skyscraper. In principle it should be possible; in practice it is incalculable. Newtonian physics does the job fine in that scenario.
In general, materials researchers for something like concrete are going to be better off exploring the (very large!) high dimensional space of possible formulations of existing concrete ingredients and pushing out the pareto frontier for the best possible concrete that way. Also, one probably shouldn't be using bleeding-edge concrete tech for a skyscraper foundation - in a safety critical application like that you just build it 1.2x bigger than you need and it'll still be much cheaper and safer than a process like what you just described.
Materials research is super interesting, though, even if it's not building up from quantum-particle scale research. And atomic / molecular features of inputs can yield interesting material candidates.
Source: I work (as a software dev, not a materials researcher) at Citrine Informatics, selling software to assist companies who are trying to do practical materials things like make better concrete.
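As a toy illustration of that kind of formulation search (everything here, the mix variables, the scoring function, and the cost model, is made up for the example; real tools work in far higher-dimensional spaces with real lab data):

```python
import random

random.seed(0)

def score(mix):
    """Hypothetical (strength, cost) score for a 3-component mix.

    The coefficients and the noise term are invented for illustration.
    """
    cement, ash, aggregate = mix
    strength = 40 * cement + 15 * ash + 5 * aggregate + random.gauss(0, 1)
    cost = 100 * cement + 20 * ash + 10 * aggregate
    return strength, cost

def random_mix():
    """Three random fractions that sum to 1."""
    a, b = sorted(random.random() for _ in range(2))
    return (a, b - a, 1 - b)

candidates = [score(random_mix()) for _ in range(500)]

# Keep the Pareto-optimal set: maximize strength, minimize cost.
pareto = [p for p in candidates
          if not any(q[0] >= p[0] and q[1] <= p[1] and q != p
                     for q in candidates)]

print(f"{len(pareto)} Pareto-optimal formulations out of {len(candidates)}")
```

The idea is the same as in practice: no single formulation wins on every axis, so you keep the frontier of non-dominated candidates and let the application pick the trade-off.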
Materials science, condensed matter physics, chemistry, etc. work up from the abstraction layer of "atoms". It's a quite well defined and relevant layer. So, until particle physics brings some different configuration¹ for atoms, it will have no impact at all.
1 - It doesn't need to be new elements, but even for the interaction between the nucleus and the electron shell they haven't created anything new, and only things affecting the electron shell matter. (Even then, they haven't created anything new in the nucleus either.)
If those models had predictive power that translated to the atomic scale, you wouldn't need a multibillion-dollar collider to prove them.
One of the few molecular level effects that depends on the virtual particles that are important in the standard model is the Lamb shift https://en.wikipedia.org/wiki/Lamb_shift
In this case the virtual particle is a virtual photon, not a virtual exotic particle, so it's only scratching the surface of the standard model.
I'm not sure if the g-2 anomalous magnetic dipole moment of electrons https://en.wikipedia.org/wiki/Anomalous_magnetic_dipole_mome... can be measured with cheap equipment.
Like all things in science, has anyone stopped to think whether we should push it?
We see it day in and day out where science has developed something without slowing down to research the effects beyond the one they are scoped in on while making what they are making. I'm specifically thinking of the new chemical sciences that have brought out some formulas that are great at a specific thing, but are absolutely tragic to nature in so many more ways. The science shows these chemicals to be tragically toxic, yet that info gets shoved in a drawer so inventors can make money.
Great, we made something, but we should be able to say thanks but no thanks. Let's put that in the column of good idea, good science tech to achieve, but best left alone. Take that learning and try to achieve the same thing in a different manner so that it doesn't kill everything else.
Shelving discoveries based on the perceived effect they (could) have (who would even evaluate that?) is a slippery slope if I've ever seen one.
This is precisely what should happen though. We made ICE powered cars that used leaded gasoline because reasons, but the results of that were horrible for everything except the ICE. We shelved that tech because it was just bad.
We've shelved the widespread use of lead in paint. We've shelved the widespread use of asbestos in lots of things. There's nothing wrong with realizing the juice isn't worth the squeeze. We know that it is something that happens. Sometimes we make something that comes with a heavy cost. Obviously we don't have a way to know that until it exists. Then again, we should be able to start recognizing that particular chemical structures result in bad things, so we should be super careful with the new thing because it looks like something we've seen before. We can do this with viruses and whatnot. Why not with chemistry?
The LHC employs a lot of people working on smart things. CERN gave rise to the world wide web and there are many other innovations in computing, construction, and theory that come from the work being done there.
> The Large Hadron Collider took about a decade to construct, for a total cost of about $4.75 billion. 
> Since the opening of Mercedes-Benz Stadium in 2017, the Falcons organization has publicly pegged the cost of the building at $1.5 billion 
It's the same order of magnitude of cost as a sports stadium. It's a tiny slice of the worldwide economy.
We don't know where the key discoveries in "theory state space" are, so we continue to search. Finding the right evidence or surprises could lead to rapid changes in how we think and view the universe.
I'm sure some medieval people must have found scientific tinkerers wasteful as well.
Diversification of investment is good. It's not like all research dollars are going to high energy physics.
The inventor of leaded gasoline (Thomas Midgley) also invented CFCs, but at least we didn't already know those were bad for the ozone layer at the time.
I doubt that the science can show a compound to be tragically toxic any more than it could show a compound to be hilariously toxic, frighteningly toxic or delightfully toxic.
Apart from an observer, who is typically human (though sometimes in our mind an anthropomorphized animal or superhuman deity), I'm not sure anything in nature can be tragic. It just is. No one mourns the trilobites.
Is this big news that could lead physics out of its long stasis? Or "just" relatively small details?
They discovered two new composite particles. There are hundreds of composite particles, so it's somewhat business as usual. Anyway, most composite particles have 2 or 3 quarks, but the new particles have 4 or 5 quarks. So they are weird new composite particles.
Making calculations of particles made of a few quarks is very difficult, borderline impossible, so it's interesting to find new particles and verify that the current approximations for particles made of a few quarks are good enough or fix them.
Also, the approximations for particles made of a few quarks use virtual particles that appear and disappear. And some of these virtual particles may be an unknown new particle. So if the calculation is too wrong it may be an indirect way to discover a new elementary particle and escape the "long stasis". But I'd not be too optimistic about a groundbreaking discovery.
I don't think it's a problem yet. The current "long stasis" is overrated IMHO.
But those rules for the quantum numbers can also be fulfilled with certain combinations of four or five quarks, and there is nothing in the Standard Model that either forbids or requires these combinations to exist as real particles. So it was new information when the first resonances that could be interpreted as those kind of particles were discovered and it is interesting that there are more of these. But it is not unexpected, either, the earliest paper on pentaquarks cited on the wikipedia page is from 1987.
So it is indeed close to business as usual. It is interesting, and new, but it is still filling out the corners of the Standard Model.
Like the wikipedia article on pentaquarks says, a five-quark bound cluster was considered by Gell-Mann back in 1964.
Of course, we aren’t entirely sure where GRBs come from either :D
First, we're not sure about the process that generates them. Saying "they come from active galactic nuclei" is OK... but how did they get accelerated to such energies?
The next problem is that, being so highly energetic, they should be interacting with the cosmic background radiation if they're traveling more than about 160 Mly, which would drain off some energy ( https://en.wikipedia.org/wiki/Greisen–Zatsepin–Kuzmin_limit ), and there are some observations that appear to violate that limit ( https://en.wikipedia.org/wiki/Oh-My-God_particle ).
https://www.quantamagazine.org/cosmic-map-of-ultrahigh-energ... is also interesting to look at (very neat visualization).
Part of the problem is that we're not entirely sure what they're made of. Most theories have been working on the "they're protons" assumption, but other approaches with having them be heavier nuclei means that they don't need to travel as fast to have the same amount of energy (which also changes the equation for the GZK limit as that applies to protons).
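For a sense of scale, converting the Oh-My-God particle's energy (roughly 3.2e20 eV) into everyday units:

```python
EV_TO_J = 1.602e-19        # joules per electron-volt

omg_energy_eV = 3.2e20     # Oh-My-God particle, approximate measured energy
omg_energy_J = omg_energy_eV * EV_TO_J

# ~51 J for a single subatomic particle, often compared to the
# kinetic energy of a pitched baseball
print(f"{omg_energy_J:.0f} J")
```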
I suppose given the energies involved, we would need to observationally ascertain where in the sky the cosmic rays come from in order to put bounds on how they were made and what they are made of.
Do you know of any efforts to observe cosmic ray sources or build a cosmic ray telescope?
This is part of the challenge - the map of where they arrive hints at some hot spots ( https://skyandtelescope.org/astronomy-news/cosmic-rays-hint-... ) but as these are charged particles (not light), the path that they follow isn't necessarily a "draw a straight line back to the source" affair.
> Do you know of any efforts to observe cosmic ray sources or build a cosmic ray telescope?
We don't directly observe the cosmic rays, but rather the cascade of particles that they make as they crash through the atmosphere.
There are several different approaches to this. https://en.wikipedia.org/wiki/Cosmic-ray_observatory
For example, there's Ice Cube ( https://en.wikipedia.org/wiki/IceCube_Neutrino_Observatory ) and a visualization of some of its results - https://youtu.be/2DDQYHIbL3Q and https://youtu.be/rSwbL2coz_Y
Another cosmic ray observatory - https://www.auger.org // https://en.wikipedia.org/wiki/Pierre_Auger_Observatory
> But since these high energy particles have an estimated arrival rate of just 1 per km2 per century, the Auger Observatory has created a detection area of 3,000 km2 (1,200 sq mi)—the size of Rhode Island, or Luxembourg—in order to record a large number of these events. It is located in the western Mendoza Province, Argentina, near the Andes.
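The arrival-rate figure in that quote works out as follows:

```python
rate_per_km2_per_century = 1.0  # highest-energy events, from the quote
area_km2 = 3000.0               # Auger Observatory detection area

# Expected events per year across the whole array
events_per_year = rate_per_km2_per_century * area_km2 / 100
print(events_per_year)  # ~30 per year, hence the enormous detection area
```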
Pierre Auger is looking at air showers - https://en.wikipedia.org/wiki/Air_shower_(physics)
I'd also suggest checking out PBS Space Time (in general) and in particular this episode - The Oh My God Particle - https://youtu.be/osvOr5wbkUw
It's almost as if we're looking at a continuous function that can generate an infinite number of discrete segments.
Is there a way to even begin to assess the likelihood that research on esoteric subatomic particles is going to yield useful results for energy, health care, or transportation? Or are we just digging away ad nauseam with zero idea if the next rock contains some precious ore?
Our fundamental understanding of the Universe has barely advanced over the past half century, and it continues to slow with every anticlimactic LHC result that emerges.
What makes you think that's a fact?
LHCb refers to the specific detector and the group responsible for measurements related to the b quark.
> The eventual recognition of the muon as a simple "heavy electron", with no role at all in the nuclear interaction, seemed so incongruous and surprising at the time, that Nobel laureate I. I. Rabi famously quipped, "Who ordered that?"
These new particles are also made from quarks (4 for the tetraquark and 5 for the pentaquark), and they also have a size.
We've never seen a free quark, so we just don't know.
The same thing here: it's not known if quarks have an internal structure. And unless your probe's energy is high enough that its wavelength is smaller than the object being measured, you certainly can't tell if it's point-like.
Of course there is always a measurement limit. Current limits for the size are "smaller than 1e-19m". That's why I wrote "as far as we know", as we have no evidence that they are not point-like, and we have rather strong limits.
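That 1e-19 m limit lines up with the available probe energies: the wavelength of a probe scales roughly as ħc/E, so resolving a smaller structure needs a proportionally higher energy. A rough sketch:

```python
HBAR_C_MEV_FM = 197.327  # hbar * c in MeV * femtometers
FM_PER_M = 1e15

def probe_energy_TeV(size_m):
    """Rough probe energy whose wavelength matches a given length scale."""
    size_fm = size_m * FM_PER_M
    energy_MeV = HBAR_C_MEV_FM / size_fm
    return energy_MeV / 1e6  # MeV -> TeV

# Probing structure at the 1e-19 m quark-size limit takes TeV-scale energies,
# i.e. roughly what the LHC delivers per parton collision.
print(f"{probe_energy_TeV(1e-19):.1f} TeV")
```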
Here's an analogy that might not make sense to everybody, but to me, this feels a bit like the famous meme from the Chernobyl series: "3.6 Roentgen. Not great, not terrible." where the real answer about the radiation exposure was radically different. Some of the people tasked with coming up with that 3.6 answer might not have had bad intent, but they were at the limits of their technology to provide an answer and radically misinterpreted what they were seeing partially because of that.
"Particles" are phenomena which appear when fields are quantized. They aren't really balls of stuff. E.g. photon is a quantum of a wave in electromagnetic field.
These exotic particles are simply a confirmation of predictions made by the Standard Model, they are not surprising, it's just the first time there was enough data to test this particular prediction of the theory.
> I'm very skeptical that it's apparently just smaller and smaller particles all of the way down
The Standard Model was formulated in the 1970s, and no fundamentally new phenomena have been discovered since. So it's been strong for ~50 years.
It's known to be incomplete as it does not explain gravity, and physicists hope that better theories will be found in the future. But so far nobody has been able to formulate a theory which makes better predictions than the SM.
> What if everything here about reality is being misinterpreted?
The topic of physics is to build models which predict observed phenomena. Knowing what reality _is_ is a topic of philosophy and religion.
"The international LHCb collaboration at the Large Hadron Collider (LHC) has observed three never-before-seen particles."
But doesn't that make it a discovery? Sure, a discovery of something already predicted by the Standard Model, but unless it has been observed, we just don't know for sure if it actually exists.
The Standard Model just has been very successful at predicting stuff :)
It’s as likely as anything else given how common it is for even STEM educated humans to be religious and read into their tea leaves.
There is no center of the universe or higher power; why should a handful of physicists be unequivocally “correct”. Deference to their figurative identity is all the untrained can conjure. Doesn’t mean we need to shower them in praise and empower their agency at the expense of one’s own.
Even if the science is right they’re still just one of seven billion.
That's a question better saved for psychiatry and sociology.
Extraordinary claims need extraordinary evidence. This article doesn't really describe anything that unexpected. It's not that hard to believe.
Scientists make a theory.
The theory predicts a bunch of particles.
Scientists build LHC to check if the theory is right.
LHC finds the predicted particles.
The theory was right.
I.e., if I see some feature in the LHC data, what makes me call it a "particle" vs. some other concept / object / phenomenon?
Of note: these newly discovered particles are not smaller than previously known things. In fact they are bigger.
Well done sir.
Still the desert otherwise: a Higgs and nothing else :(
Does this in some way (maybe even indirect) help us with solving any of the environmental challenges we are facing right now?
I see CERN is using “home.cern”.
Google has “.google” but I’m not sure what the root is since they have many.
What’s the convention for the root of a company’s branded TLD?
Since the first "high energy" particle accelerator in 1932, we have used particle accelerators to smash and study subatomic particles. First we found a whole zoo of them, but then in 1964 Gell-Mann's quark model explained how all those subatomic particles are not elementary particles, but made of various combinations of quarks.
From 1975 we've had the standard model of particle physics, and after 2012 with the experimental discovery of the Higgs boson, now we've found all the particles in the standard model. But the road up to 2012 was: Keep building bigger accelerators, keep adding more energy, keep finding new particles. Why stop now?
Some skeptics say that now we have maybe found them all. And we don't have a good theory that would predict new particles, so maybe we won't find new elementary particles, no matter how we go on. Maybe we should pause and reconsider. Work on new theories.
But, we also have theories: supersymmetry, and string theory. Supersymmetry predicts the existence of superpartners for all 17 elementary particles. Maybe their discovery is just around the corner, if we just keep going? Or maybe supersymmetry is wrong, and the superpartners don't exist at all.
Timeline of discoveries of the elementary particles:
1956 electron neutrino
1962 muon neutrino
1969 down quark
1969 strange quark
1969 up quark
1974 charm quark
1977 bottom quark
1983 W boson
1983 Z boson
1995 top quark
2000 tau neutrino
There are opinions, that we should stop and reorient: https://bigthink.com/hard-science/large-hadron-collider-econ...
Or rather, planning useful projects only gets you things you already knew to ask for. Doing basic research is a way to create things you didn't know to ask for.