Friday 28 August 2009

New ideas in dark matter

I keep receiving complaints about my meager blogging output. For the last few weeks I have a good excuse: I was studying Mont Blanc (that narrow resonance visible from the LHC on a clear day); and after that I was tired; and after that I was lazy; and in any case nothing happens during the summer months.

But the coming autumn is going to be exciting. The approaching LHC start-up is one, but not the only, reason. On a completely different frontier, huge progress is expected in the area of direct detection of dark matter. The XENON100 experiment in Gran Sasso is going to kick off next month, while its bitter enemy - the LUX experiment in the Homestake mine - hopes to follow before the end of the year. Within a year, the experimental sensitivity to the WIMP-nucleon cross section should improve by two orders of magnitude, biting deep into the parameter space where numerous popular theories predict a signal.

In the best-case scenario the very first months or even days could bring a discovery. This is the prediction of the dark matter models designed to resolve the DAMA puzzle. Recall that the DAMA experiment in Gran Sasso claims to have detected dark matter by observing an annual modulation of the count rate in their sodium-iodide detector. Particle theorists struggle to reconcile the DAMA signal with the null results from a dozen other, in principle more sensitive, experiments. That is not entirely impossible, because the various experiments use different targets and different detection techniques; in particular, the masses of the target nuclei as well as the range of observable recoil energies are specific to each experiment. The game is thus to arrange the properties of dark matter such that the nuclear recoils from dark matter scattering could have been observed only by DAMA.
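
For reference, the modulation in question is the standard one expected from the Earth's orbital motion through the dark matter halo: the count rate in a given energy bin should behave as

$R(t) = S_0 + S_m \cos\left(\frac{2\pi (t - t_0)}{1\,{\rm year}}\right),$

with the phase $t_0$ around the beginning of June, when the Earth's velocity adds to the Sun's motion through the halo. DAMA sees a modulation with roughly that phase and period, which is what makes the claim hard to dismiss outright.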

The standard WIMP is not an option here: DAMA would require a cross section at a level that has been excluded by CDMS, XENON10 (the little brother of XENON100) and others. But theorists are not easily discouraged, and they keep coming up with alternative ideas. Recently, the so-called inelastic dark matter has gained a lot of publicity. In that scenario, dark matter particles scatter inelastically off nuclei into a slightly heavier state, the splitting being of order 100 keV. Thus, one needs to provide enough energy to produce the heavier state, which implies a minimum velocity of the incoming dark matter particle for the scattering to occur at all. The splitting can be tuned such that DAMA is able to see the signal while the others are not. That of course requires some amount of conspiracy. Fortunately, the inelastic dark matter theory predicts a thunderstorm of events in the upcoming runs of XENON100 and LUX.
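
The kinematics behind that statement is simple (this is the standard inelastic-scattering formula, independent of any particular model): to deposit a recoil energy $E_R$ on a nucleus of mass $m_N$ while jumping to a state heavier by $\delta$, the incoming dark matter particle must have at least the velocity

$v_{min} = \frac{1}{\sqrt{2 m_N E_R}}\left(\frac{m_N E_R}{\mu} + \delta\right),$

where $\mu$ is the dark matter-nucleus reduced mass. For $\delta$ of order 100 keV only the fastest particles in the tail of the halo velocity distribution can scatter at all, and heavy targets (which lower $v_{min}$) are strongly favored; that is how DAMA's iodine can win over the lighter germanium and silicon nuclei of CDMS.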

Until recently inelastic dark matter was the only plausible explanation of DAMA. But this week there was a paper exploring a different idea. In this new scenario, the scattering of dark matter on nucleons is elastic, but the scattering amplitude depends in a non-trivial way on the momentum transfer, hence the name form factor dark matter. If the form factor is suppressed outside the window to which DAMA happens to be sensitive, the null results of other experiments can be explained.
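
To get a feel for the momentum scales involved, here is a quick back-of-the-envelope estimate in Python (the iodine quenching factor of 0.09 is a number I am putting in by hand for illustration, as is the 2-6 keVee window):

    import math

    # Rough estimate of the momentum transfer probed by DAMA's signal
    # window, assuming elastic scattering on iodine and a quenching
    # factor of ~0.09 to convert keVee to true nuclear recoil energy.
    m_iodine = 127 * 0.9315e6   # iodine nucleus mass in keV
    quenching = 0.09            # keVee -> keV conversion (illustrative)

    for e_vee in (2.0, 6.0):            # DAMA window edges in keVee
        e_recoil = e_vee / quenching    # nuclear recoil energy in keV
        q = math.sqrt(2 * m_iodine * e_recoil)  # momentum transfer in keV
        print(f"{e_vee} keVee: E_R ~ {e_recoil:.0f} keV, q ~ {q/1e3:.0f} MeV")

So DAMA's window corresponds to momentum transfers of very roughly 70-130 MeV; a form factor peaked around that scale and suppressed elsewhere is what the model has to deliver.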

Non-trivial form factors can be arranged by some dirty model-building tricks. The paper presents an example of a scalar dark matter particle with dipole-type ($D_\mu X^\dagger D_\nu X F^{\mu\nu}$) interactions with some new hidden vector fields. The latter mix with the photon, which provides a coupling to ordinary matter. To explain the DAMA phenomenology one needs at least two vector fields with opposite couplings to the photon and comparable masses, which makes the whole construction a bit contrived. Again, the model predicts a characteristic recoil spectrum, and a large number of events in XENON100 and LUX.
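
Schematically (my own sketch of the mechanism, not the formulas from the paper, and ignoring the momentum dependence of the dipole vertex itself), the exchange of two such vectors with masses $M_{1,2}$ and photon couplings $c_{1,2}$ gives a form factor

$F(q^2) \propto \frac{c_1}{q^2 + M_1^2} + \frac{c_2}{q^2 + M_2^2}.$

Tuning the opposite-sign couplings so that the two terms cancel at zero momentum transfer, $c_1/M_1^2 = -c_2/M_2^2$, leaves

$F(q^2) \propto \frac{q^2}{(q^2 + M_1^2)(q^2 + M_2^2)},$

which vanishes at small $q^2$, falls off as $1/q^2$ at large $q^2$, and peaks around $q^2 \sim M_1 M_2$: a window-shaped behavior that can hide the signal from everyone but DAMA. One can also see why the cancellation requires both opposite couplings and comparable masses.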

What seems most valuable in these constructions is that they demonstrate that dark matter can have very different properties than the standard WIMP. That should encourage the experimenters to extend the scope of their searches; so far their search algorithms have been tailor-made for the standard WIMP case, and they could easily have missed something interesting. The fantastic experimental progress makes dark matter models testable well before the LHC can offer us any interesting data. If you have a good idea in store, now is a good time to come out with it.

Wednesday 5 August 2009

10? 6.66? Eleventeen?

Today (Wednesday) the CERN management is going to reach a decision that will affect the life of everybody on this planet. Namely, the operating energy of the LHC machine in the first year will be decided today. Senior readers may remember that the LHC used to be a 14 TeV collider. However, that energy cannot be achieved in the near future due to the poor quality of the magnets provided by industry. Reaching the nominal energy will require a long process of magnet training, and an upgrade seems unlikely within the next 3 years. For this reason, 10 TeV was the energy planned for last year's false start, as well as for the restart scheduled for mid-November.

However, the rumor is that even this reduced energy will not be achieved next year due to the well-known problems with bad splices. The hundreds of individual magnets around the LHC ring are connected using a process called soldering - an advanced, cutting-edge technique whose many aspects are shrouded in mystery. There are in fact two separate problems with soldering that have been detected at the LHC. One is the poor quality of the interconnections between the superconducting magnets. That leads to excessive resistance (on the order of nano-ohms) and, in consequence, the current flowing through the interconnection generates heat that triggers a quench of the superconductor. The other problem is faulty interconnections between the copper bus bars that are supposed to carry the current when the superconductor quenches. It is suspected that the solder in the bus bars was sometimes accidentally melted during the subsequent soldering of the superconducting cable connections. In fact, it was a combination of the two problems above that triggered the fireworks of September 19.
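
To see why nano-ohms matter, a back-of-the-envelope estimate helps. The numbers below are my rough illustrations: the dipole current scales roughly linearly with beam energy, reaching about 12 kA at the design 7 TeV per beam, and ~100 nOhm is the kind of excess resistance quoted for the worst joints.

    # Ohmic heating in a bad splice: P = I^2 R. Even a tiny resistance
    # dissipates watts at kiloamp currents - a serious heat load on a
    # joint that is supposed to sit superconducting at 1.9 K.
    def splice_power(current_amps, resistance_ohms):
        """Power dissipated in the joint, in watts."""
        return current_amps**2 * resistance_ohms

    excess_r = 100e-9  # 100 nOhm of excess resistance (illustrative)
    for energy_tev, current_amps in [(3.5, 6.0e3), (5.0, 8.5e3), (7.0, 12.0e3)]:
        watts = splice_power(current_amps, excess_r)
        print(f"{energy_tev} TeV/beam at {current_amps/1e3:.1f} kA: "
              f"P ~ {watts:.1f} W")

The quadratic dependence of the heat on the current is the whole argument for running at reduced energy.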

Bad splices are known to be present in the LHC ring, and those residing in the cold sectors cannot be repaired without a considerable slip in the schedule. So the choice is to either postpone the LHC restart or run at a lower energy (the latter implies smaller currents flowing through the magnets and thus a smaller risk of another catastrophe). During the last few months numerous simulations and experiments have been performed to determine the maximum safe current.
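
The translation between beam energy and magnet current follows from the standard bending relation $p \simeq 0.3 \, B \rho$ (with $p$ in GeV, $B$ in tesla, $\rho$ in meters), plus the fact that the dipole field is roughly proportional to the current. A small sketch, using the standard LHC bending radius and nominal field:

    # Beam energy vs dipole field/current: p [GeV] ~ 0.3 * B [T] * rho [m].
    RHO = 2804.0      # LHC dipole bending radius in meters (standard value)
    B_NOMINAL = 8.33  # dipole field at the design 7 TeV per beam, in tesla

    def collision_energy_tev(field_tesla):
        """Center-of-mass energy for two counter-rotating proton beams."""
        beam_gev = 0.2998 * field_tesla * RHO
        return 2 * beam_gev / 1000.0

    for fraction in (0.5, 0.7, 1.0):  # fraction of nominal field ~ current
        print(f"{fraction:.0%} of nominal current: "
              f"~{collision_energy_tev(fraction * B_NOMINAL):.0f} TeV collisions")

Half the nominal current buys you 7 TeV collisions, 70 percent buys you about 10 TeV.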

After a careful study of the plot above, listening to the experts, and weighing all the pros and cons, the director general is going to roll a pair of dice, and the sum of the dots will determine the LHC energy for the coming restart. As for the rumors, I have heard every rational number between 4 and 10 TeV. So now is the last moment to place your bets. Theoretical analysis of the 2-dice experiment suggests that 7 is the most likely outcome :-).
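
For completeness, the 'theoretical analysis' in question (a three-line, Monte-Carlo-free computation):

    from collections import Counter

    # Enumerate all 36 equally likely outcomes of rolling two dice.
    sums = Counter(a + b for a in range(1, 7) for b in range(1, 7))
    for total in sorted(sums):
        print(f"sum {total:2d}: {sums[total]}/36 {'#' * sums[total]}")

Seven comes out on top with probability 6/36, as every craps player knows.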

Once we know the operating energy, we will have a better idea of what kind of results to expect in the first year. It is already certain that for a while the LHC cannot compete with the Tevatron in the area of Higgs searches. In fact, almost all reasonable new physics signatures require at least one inverse femtobarn of integrated luminosity, much more than the 100 inverse picobarns expected in the first year. This leaves boring standard model signatures, including the slightly less boring top quark physics (though even for the top quark, competing with the Tevatron results may be tough if the center-of-mass energy is lower than 10 TeV). However, some spectacular (and unlikely) signatures, like a new 1 TeV Z' gauge boson or light superparticles, may be within reach if the center-of-mass energy is not much less than 10 TeV. But realistically, we have to stay patient until at least 2012.
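
To put the luminosity numbers in perspective, the expected event count is just the cross section times the integrated luminosity, $N = \sigma \times L$. The cross sections below are my own ballpark placeholders, only meant to show the scale of the problem:

    # N = sigma * L, with L = 100 inverse picobarns for the first year.
    lumi_pb = 100.0  # integrated luminosity in pb^-1

    processes = {          # cross sections in pb (illustrative guesses)
        "top pair production (~10 TeV)": 400.0,
        "optimistic 1 TeV Z'": 1.0,
        "typical SUSY point needing 1/fb": 0.05,
    }
    for name, sigma_pb in processes.items():
        print(f"{name}: ~{sigma_pb * lumi_pb:g} events")

Tens of thousands of top pairs are a measurement; a handful of events in a rare channel is not a discovery. Hence the patience.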

And the winner is...

Seven!