It's even worse if the usage of 'amount' is correct. Although from the pictures it looks like they would need a few more sockets before they became uncountable.
This is the first experiment I have ever worked on where I find myself learning of updates in the press before hearing from colleagues...mind you, I'm in Western Canada and have only just got up and checked the web before my email. If you are interested, there are pictures of it hitting the ATLAS detector here:
but the beam is only at the injection energy of 450 GeV from the SPS. The real test will be when they accelerate the beams to 6.5 TeV each, which requires the full 11 kA current in the magnets. That's when we will learn whether the two years of repairs worked and the magnets can handle the current...so fingers crossed; with a little luck, this time our understanding of the universe will break before the machine!
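As a rough sanity check (my own back-of-envelope, not an official CERN figure), the bending field needed scales linearly with beam momentum, which is why ramping from 450 GeV to 6.5 TeV demands so much more magnet current:

```python
# Back-of-envelope: dipole field B needed to keep a proton beam on a circle.
# Standard relation: B [T] = p [GeV/c] / (0.2998 * rho [m]).
# rho ~ 2804 m is an approximate effective bending radius for the LHC
# (my assumed value, used only for illustration).
def dipole_field(p_gev, rho_m=2804.0):
    return p_gev / (0.2998 * rho_m)

b_inject = dipole_field(450.0)   # at SPS injection energy
b_full = dipole_field(6500.0)    # at the full 6.5 TeV per beam
print(f"450 GeV: {b_inject:.2f} T, 6.5 TeV: {b_full:.2f} T")
```

The field (and hence the current in the superconducting coils) has to rise by the same factor of ~14 as the beam energy, which is why the full-current ramp is the real test of the repairs.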
That's not necessarily a good thing. Nothing destroys the immersive feeling of a show faster than when, just after the characters arrive at an alien planet and wander into a city, your first reaction is "Hey, that's Edmonton!". Simon Fraser University next door in BC is a particularly common location and has been in Stargate SG-1 and Battlestar Galactica: apparently 1960s concrete looks 'alien'.
It cannot be infinite. The best way to think of this paper is that there is a fundamental maximum speed in the universe defined by the properties of space-time. Then there is the speed of light. Generally these two values are so incredibly close that we can treat them as one and the same. However, light interacts with the virtual particles in the vacuum and so sometimes has a speed which is almost imperceptibly less than the fundamental maximum. Essentially this is the same effect as the slower speed of light in water, where photons interact with the water molecules and so are slowed down.
"What productivity that has helped humanity has come from knowing that there are particles smaller than we can use in life have there been?"
Says the person posting a message on a website using a computer. The fact that you have a computer is because 100 years ago Rutherford was bouncing alpha particles off a gold foil and Schrödinger, Heisenberg et al. were developing quantum mechanics. Oh, and the web itself was invented at CERN to help large, international groups of particle physicists communicate with each other. So how about we start with those two.
"If a meson is composed of both a quark and an anti-quark, how can it have an anti-partner?"
D0 meson = charm quark + up anti-quark
anti-D0 meson = charm anti-quark + up quark
Since a charm quark is distinct from a charm anti-quark these are two different particles. However you can get mesons consisting of the quark and anti-quark of the same flavour (e.g. phi meson = strange+anti-strange). These are their own anti-particles just like a photon!
"When LEP found no SUSY physicists were already casting doubt on the existence of SUSY."
Not true. Thermal models of Big Bang WIMP production predict a Dark Matter mass around 1 TeV/c². LEP could not reach anywhere even close to that. While I did not work on LEP, I was at CERN at the time and knew a lot of people who did, and I can honestly say that I knew nobody who thought the LEP results cast any doubt on SUSY - simply that the mass scale was higher than LEP could reach.
"SUSY is dead in the view of most serious physicists."
Again not true - talk to my colleagues on ATLAS. We have a huge SUSY group actively looking for it (and I'm not even a member of it!). It is true to say that the MSSM is looking somewhat constrained, but you massively overstate the case by saying we think it is dead...well, unless you think that none of us working on the LHC are "serious physicists". ;-) If you look at the mass limits for the ATLAS and CMS searches, most of these are around 1 TeV/c². At this scale the hierarchy problem is still very readily solvable by SUSY. However, if we do not see it after the long shutdown, then I would agree that at that stage the mass limits (expected to be several TeV and rising) will start to raise issues.
"What people forget is that SUSY makes the SM worse anyway, by introducing a 100 extra parameters that can have arbitrary values. The SM has about 18, I think."
Again you misstate the case. SUSY is effectively like a second Standard Model and has all the free parameters that the Standard Model had UNTIL we measured a lot of them and found them to be zero or forbidden. For example, the SM has a free parameter 'theta' which gives the strength of CP violation in strong interactions. However, this is not usually listed as a free parameter because, experimentally, theta is consistent with zero. So while the MSSM has 120+ free parameters, many of these may well turn out to be zero if SUSY is out there, and it may well be that you end up with something close to the ~25 free parameters of the SM.
So by all means say that if SUSY is out there it is not as obvious as we would have naively expected it to be but, at the moment, I would completely disagree that most physicists think it is dead as an explanation of the hierarchy problem and dark matter. It's under threat - which is the best place for a theory to be because it means we have a chance to either find it or rule it out - but we are not quite there yet.
"A former CERN particle collider - the LEP stuck the knife SUSY. The LHC then killed her completely"
Absolutely wrong on both counts. LEP came nowhere close to ruling out even the Minimal Supersymmetric Standard Model (MSSM). The LHC has ruled out large areas of the MSSM to the point where things are getting strained, but this is the _minimal_ SUSY model. Go to the next-to-minimal model (NMSSM) and there is no problem. Even the LHCb results - which the experiment keeps touting to the press as ruling out SUSY - only constrain it to appear more Standard Model-like than absolutely required.
However if none of that convinced you then ask yourself this: why are so many of us working on the LHC experiments still actively searching for it if we already know it is not there?
"when simultaneity and causality aren't well-defined concepts anyway."
Depends which universe you are living in. In mine they are very well defined: simultaneous events are ones which occur at the same time according to a given observer, and causality requires that if event A causes event B, then A had better have occurred before B for all observers.
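For a causally connected (timelike-separated) pair of events, a quick Lorentz-transform sketch shows the ordering really is preserved for every sub-light observer (units where c = 1; the event coordinates are illustrative values of my choosing):

```python
import math

def lorentz_t(t, x, v, c=1.0):
    """Time coordinate of event (t, x) seen by an observer moving at speed v."""
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    return gamma * (t - v * x / c**2)

# Event A at (t=0, x=0) causes event B at (t=2, x=1): timelike separation,
# since the time gap (2) exceeds the spatial gap (1) with c = 1.
# A comes before B for every observer with |v| < c:
for v in (-0.9, -0.5, 0.0, 0.5, 0.9):
    assert lorentz_t(0.0, 0.0, v) < lorentz_t(2.0, 1.0, v)
print("ordering preserved for all tested observers")
```

Only for spacelike-separated events (which cannot be cause and effect) can different observers disagree about the order, which is exactly why causality survives in special relativity.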
"That detector helped 2002 Nobel Laureate the late Dr Ray Davis detect “flavor (sic) changes” in solar neutrinos"
Ray Davis did not detect flavour changes in solar neutrinos! He detected a shortfall in the number of electron neutrinos relative to solar model predictions, but did not make any measurement which showed why there was such a shortfall. This led to what was called the "solar neutrino problem", where multiple experiments confirmed that the electron neutrino flux was only ~33-50% of what was expected.
It was the Sudbury Neutrino Observatory (SNO) [http://www.sno.phy.queensu.ca/] which showed that solar neutrinos change flavour, by measuring the total flux of all flavours vs. the flux of just the electron flavour.
Not convinced? Check the Nobel prize citation for Davis: "for pioneering contributions to astrophysics, in particular for the detection of cosmic neutrinos" [http://www.nobelprize.org/nobel_prizes/physics/laureates/2002/] - no mention of flavour oscillations!
" A neutrino smacking into something can cause Cherenkov radiation"
Not quite. A neutrino with sufficient energy which hits something and interacts (the latter is very unlikely) can either convert into the charged lepton of the same flavour (electron, muon or tau) or kick an electron out of the matter.
It is this high-energy charged particle which causes the Cherenkov radiation, not the neutrino itself, because a particle has to be charged to generate Cherenkov light. Also, Cherenkov radiation is not a single photon but a whole series emitted in a cone, similar to the sonic boom of a supersonic plane but with light rather than sound. The opening angle of the cone is determined by the refractive index of the material.
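The cone angle comes from the textbook relation cos θ = 1/(nβ), which also encodes the threshold: no light unless the particle outruns light in the medium. A small sketch (standard formula; the example values of β and n are mine):

```python
import math

def cherenkov_angle_deg(beta, n):
    """Opening angle of the Cherenkov cone from cos(theta) = 1/(n*beta).

    Returns None below threshold (n*beta <= 1), i.e. when the particle is
    slower than light in the medium and emits no Cherenkov light.
    """
    if n * beta <= 1.0:
        return None
    return math.degrees(math.acos(1.0 / (n * beta)))

# Ultra-relativistic charged particle (beta ~ 1) in water (n ~ 1.33),
# as in a water Cherenkov detector:
print(cherenkov_angle_deg(0.999, 1.33))  # roughly 41 degrees

# A slow particle in the same water produces nothing:
print(cherenkov_angle_deg(0.5, 1.33))    # None - below threshold
```

That ~41° cone in water is what detectors like SNO image as a ring of hits, and the sharpness of the ring is part of how they reconstruct the particle's direction.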
"It may be my fault, but it is your problem"
I think that might be the best bit of advice you've given, and one that seems to have escaped the notice of most of the commenters here. It's good advice for academia too - not that we get to negotiate pay rises! In general, though, you have to work around the problems with the system until you get to the point where you can fix them...and replace them with different problems of your own making, which others will then have to live with! ;-)
If you are interested in fusion research you should have a look at General Fusion - a company based in Vancouver. They are developing a different type of reactor which uses a spinning ball of molten lead, symmetrically squeezed by steam pistons, to compress injected plasma toroids to the point of fusion. Many of the neutrons generated are captured by the molten lead, which is laced with lithium to breed tritium fuel.
No idea whether the technology will work well enough to let them succeed but the basic physics principles look sensible (to a physicist, but not a plasma physicist).
Speaking as a particle physicist who works at CERN, but with nothing whatsoever to do with CLOUD, I think the article grossly misrepresents the facts. First, Rolf has requested that they refrain from interpreting their results, NOT that they not publish them, i.e. the data will still be out there in the public domain for all to interpret. Secondly, the DG of CERN has no ability to prevent experiments at CERN from publishing whatever they like. Technically he may be able to make CERN-employed personnel remove their names from the paper (although even that is not certain), but most of us working at CERN have positions in external institutes and have the protection of tenure (at least those of us not in the UK) for precisely this reason: we can publish what we like without fear for our jobs.
Of course you would not lightly ignore the request of the CERN DG, especially when it is a sensible, well reasoned request such as this (sticking to the data is good science and avoids unnecessary politics). However it is a request.
"By definition, if time travel was to be possible ever, it would be possible now."
Not true. General relativistic time travel using a pair of connected wormholes would only allow travel back to the point when the wormholes were created, not before. So it is at least conceivable that time travel might be possible in the future without us knowing about it today (although I doubt it!).
It's not just filling in paperwork. My sister is one of the rare primary school science teachers, and her problem is that she is heavily restricted in what she can do. For example, she got together with secondary school science teachers to arrange a programme where pupils in the final year of primary school got to go to the secondary school and use some of the more advanced lab apparatus. However, this was nixed because it wasn't safe for them to be in a secondary school lab (with a teacher present)...although magically, the following year, it all suddenly became OK.
Frankly, I think the Royal Society has got it completely wrong. You attract good teachers by allowing them to take the initiative and make a difference. Good teachers will make subjects more attractive to students and give good advice. Diluting subjects in the hope that if students take enough A levels they can't avoid hitting at least one science subject is daft, because then when they get to university they will know even less, making the job of us university profs even harder.