## July 30, 2004

### The President Answers his Critics

Thanks to the fevered imagining of Will Ferrell, we have a coherent response from Crawford, Texas to all the calumnies emanating from the Democratic National Convention in Boston:

There’s certain Liberal agitators out there who’d like you to believe my Administration’s not doing such a good job. Those are people such as Howard Stern, Richard Clark and the News.

…

There’s people out there who’d like you to believe the Economy isn’t doing so well. To that I answer, “Hey, for the two million jobs we’ve lost, that means two million unemployed people sitting at home, watching reruns of quality television such as

*The Jeffersons* or *Facts of Life*. And that just means more ad revenue to Radio and TV stations.”…

So stick with Bush. Don’t vote, and don’t listen to Liberals, Democrats or other Republican who make fun of me, or read the News or watch the News, except for Fox.

Thank you, and God bless.

Brought to you by the good folks from ACT.

## July 29, 2004

### Blackhole Production

One (unlikely, but) exciting prospect is that, if there are large extra dimensions and TeV-scale gravity, one would produce blackholes at the LHC. Giddings and Eardley’s calculation of semiclassical blackhole formation in high energy scattering leads one to believe that, if the fundamental Planck scale is low enough (say, $M_D\sim 3\text{TeV}$) and the number, $n$, of large extra dimensions large enough, one ought to see copious numbers of blackholes ($\sim 10^3$/year for $n=3$, and a luminosity of $10^{33}\text{cm}^{-2}s^{-1}$) at the LHC.
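
For the curious, the parton-level estimate underlying rate claims like these is essentially geometric: $\sigma \approx \pi r_s^2$, with $r_s$ the horizon radius of a $(4+n)$-dimensional blackhole. Here is a rough sketch in Python, using the standard Myers-Perry radius quoted in the phenomenology literature, and ignoring precisely the inelasticity discussed below; the function names and sample masses are my own illustration:

```python
import math

HBARC_GEV_FM = 0.1973  # hbar*c in GeV*fm, to convert 1/GeV to fm

def schwarzschild_radius_fm(M_bh_tev, M_D_tev=3.0, n=3):
    """Horizon radius (in fm) of a (4+n)-dimensional blackhole of mass M_bh,
    for fundamental Planck scale M_D (standard Myers-Perry form)."""
    M_bh, M_D = M_bh_tev * 1e3, M_D_tev * 1e3   # convert to GeV
    x = (M_bh / M_D) * 8 * math.gamma((n + 3) / 2) / (n + 2)
    return HBARC_GEV_FM * x ** (1 / (n + 1)) / (math.sqrt(math.pi) * M_D)

def geometric_cross_section_pb(M_bh_tev, M_D_tev=3.0, n=3):
    """Naive parton-level cross section sigma = pi * r_s^2, in picobarns,
    ignoring the energy radiated away before the blackhole forms."""
    r = schwarzschild_radius_fm(M_bh_tev, M_D_tev, n)
    return math.pi * r ** 2 * 1e10   # 1 fm^2 = 1e10 pb

for M in (5.0, 8.0, 11.0):   # sample blackhole masses in TeV, my choice
    print(f"M_BH = {M:4.1f} TeV: sigma ~ {geometric_cross_section_pb(M):7.1f} pb")
```

Note that the geometric cross section *grows* with blackhole mass; the steeply falling parton distribution functions are what actually kill the rate at large $M_{BH}$.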

This would be incredibly dramatic. But, as Yoshino and Nambu showed, there’s an important effect which needs to be included, namely that some significant fraction of the center-of-mass energy is radiated away before the blackhole forms. When you include this inefficiency, the production rate goes way down (because we need to be further out on the tail of the parton distribution function).

Anchordoqui, Feng, Goldberg and Shapere have looked at the numbers for the LHC, and compared them with what can be seen from cosmic ray observatories. In cosmic ray searches, one can look for quasi-horizontal showers produced by cosmic ray neutrinos. If the TeV-scale gravity story is correct, blackhole formation dominates the total cross section at high energies. Looking only at quasi-horizontal showers minimizes the background from other sources, so this provides a fairly clean, dramatic signal.

Cosmic ray neutrinos have some advantages over the LHC.

- You pay the price of only one parton distribution function, instead of two.
- AGASA is already taking data, and Auger will be soon.

By the time LHC turns on, they will have accumulated several more years of data. Either they will have seen blackholes, or they will have set limits which leave the LHC only a very narrow window for discovery.

There is, at least, one loophole in this analysis. Yoshino and Nambu only computed a *lower bound* on $M_{BH}/\sqrt{s}$. But it seems unlikely that the actual ratio will be significantly larger than this and — in any case — this affects both cosmic ray and LHC searches. Also, the flux of cosmic ray neutrinos isn’t really known but it is, at least, bounded below by that due to pion production from cosmic ray protons.

Now, I don’t think *either* Auger or LHC will see blackholes. But I was surprised to learn how competitive high energy cosmic ray observatories have become in doing particle physics.

## July 22, 2004

### No Information Lost Here!

The blogosphere — or, at least, that little corner of it that I pay attention to — is all a-twitter about Hawking’s announcement that he has solved the blackhole information problem (30 years after posing it). Normally, I try not to devote attention to such things, but the flurry of discussion prompted me to take a look at the transcript of Hawking’s talk.

He starts off with the amusing comment,

I adopt the Euclidean approach, the only sane way to do quantum gravity non-perturbative [sic].

Of course, among its other defects, the Euclidean path-integral he wishes to do is horribly infrared divergent. So his first step is to introduce an infrared regulator, in the form of a small negative cosmological constant. This is not merely a technicality. *None* of the subsequent arguments make any sense without it.

Anyone who hasn’t been *asleep* for the past 6 years knows that quantum gravity in asymptotically anti-de Sitter space has unitary time evolution. Blackholes may form and evaporate in the interior, but the overall evolution is unitary and is holographically dual to the evolution in a gauge theory on the boundary.

With the large accumulation of evidence for AdS/CFT, I suspect there are few hold-outs left who doubt that the above statement holds, not just in the semiclassical limit that Hawking considers, but in the full nonperturbative theory.

Nonetheless, a “bulk” explanation of what is going on is desirable, and Hawking claims to provide one. Hawking devotes a long discussion to the point that trivial topology dominates the Euclidean path-integral (at zero temperature). Since the trivial topology can be foliated by spacelike surfaces, one can straightforwardly Wick-rotate, and it follows that Minkowski-signature time evolution is unitary. Presumably, Hawking is aware of, but neglected to mention, Witten’s old paper, which not only shows the dominance of the trivial topology at low temperature, but also shows that, at high temperature, the path integral is dominated by the Hawking-Page instanton (the analytic continuation of the AdS blackhole) and that, moreover, the phase transition which separates these two regimes (which Hawking and Page argued for, in the context of semiclassical gravity in AdS) is related to the confinement/deconfinement transition in the large-N gauge theory.

All of these are true facts, well-known to anyone familiar with AdS/CFT. But the latter goes well beyond the semiclassical approximation that Hawking uses. No one (at least, no one *I* talk to) has the slightest doubt that quantum gravity has unitary time evolution in asymptotically AdS space. The blackhole information paradox is *solved* in AdS, and it was “solved” long ago.

However, most people agree that the extrapolation to *zero* cosmological constant is not straightforward. There is still room to doubt that time evolution in asymptotically flat space is unitary. On the thorny issue of extrapolating to zero cosmological constant, Hawking is silent.

## July 20, 2004

### Hair of the Dog

Longtime readers of this blog know that it started by accident. A bicycling accident, to be precise. Two summers ago at Aspen, I lived out east of town, on Laurel Mountain Drive. Cycling into town along Highway 82, on my second to last day, I hit a storm drain, and went over the handlebars…

As a result, I couldn’t write for several months — which ruled out my usual activities of doing calculations and lecturing at the blackboard. Ironically, what I *could* do was type, so I decided to download a copy of MovableType and see what I could do about putting physics on the web.

Well, I’m back in Aspen this year, and cycling about town. I’m hoping for a less eventful, if still productive stay.

### Dick Cheney — The Remix Version

[Via Crooked Timber] What really happened on the Senate Floor.

In other news, Ralph Nader is explicitly throwing in his lot with the Republicans, to get himself on the ballot in key swing states. Evidently, the Republicans have seen the light and recognize the wisdom of creating a viable 3^{rd} party.

It’s getting increasingly hard to tell the parody from the reality these days. Maybe I should stop trying.

## July 13, 2004

### “Fixed”

In the week since upgrading Apache from 2.0.49 to 2.0.50, I’ve been annoyed by what turns out to be this bug. When trying to track down a bug in a *brand-new* software release, however, one rarely looks for ones marked RESOLVED/FIXED, does one?

Anyway, the patch mentioned fixes the problem on MacOSX, too.

## July 9, 2004

### LHC

After Strings 2004, I spent the weekend in Paris, and hopped on the TGV to Geneva, where I’ve been spending the week at CERN. The big excitement here, of course, is the LHC, which is scheduled to turn on in the summer of 2007. The first half of the week was devoted to a Strings Workshop, where Fabiola Gianotti (of the ATLAS Collaboration) gave the lead-off lecture.

Wednesday afternoon, we got to tour the two main experimental facilities (ATLAS and CMS) and the magnet testing facility.

Since some of my readers aren’t physicists, let me reverse the order of things, and talk about the computing challenges involved. High energy experimental physics has always been on the cutting-edge of computing technology (the World Wide Web, you’ll recall, was invented here at CERN), but the LHC experiments raise the bar considerably.

Because of the extraordinarily high luminosity of the LHC ($10^{34} \text{cm}^{-2} \text{s}^{-1}$), each detector will “see” $10^9$ collisions/s. An impressive amount of computation takes place in the custom ASICs in the fast electronic triggers which whittle those $10^9$ events down to 100 “interesting” events/s. Here, I have already oversimplified the problem. The protons arrive in bunches, 25 ns apart. Each bunch crossing produces about 25 events. 25 ns is a *short* period of time, shorter than the time it takes a particle traveling at the speed of light to cross the detector. So there are a large number of events happening all-but-simultaneously, each producing hundreds of charged particle tracks. ATLAS’s level-1 triggers have the job of disentangling those multiple events and cutting the event rate from $10^9$/s to $10^4$/s, with a discrimination time of 2 μs. To do this, the level-1 Calorimetry trigger has to analyse more than 3000 Gbits of input data/s. The level-2 triggers cut the event rate further down to 100 events/s. These 100 events then need to be “reconstructed”. That’s done “offline” at a processor farm. Each event requires about 1 second of processing on a 1000 MIPS processor. The reconstructed event is whittled down to 0.5 MB of data. You can probably guess where I’m headed. 0.5 MB/event × 100 events/s: in a year of running you accumulate … 1.5 petabytes (1.5 *million* gigabytes) of data^{1}.
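
That last multiplication is worth checking explicitly. A couple of lines of Python suffice (the one assumption, mine, is that “a year of running” means a full 3×10^7-second calendar year, which is what makes the numbers come out):

```python
# Checking the LHC offline data-volume arithmetic, with numbers quoted above.
EVENT_SIZE_BYTES = 0.5e6     # 0.5 MB per reconstructed event
EVENT_RATE_HZ = 100          # events/s surviving the level-2 triggers
SECONDS_PER_YEAR = 3e7       # a full calendar year (assumed)

petabytes = EVENT_SIZE_BYTES * EVENT_RATE_HZ * SECONDS_PER_YEAR / 1e15
print(f"{petabytes:.1f} petabytes/year")   # → 1.5 petabytes/year
```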

And then begins the data analysis …

The construction engineering challenges are equally impressive. The CMS detector (the smaller, but heavier, of the two) is a 12,500 tonne instrument (the magnetic return yoke contains more iron than the Eiffel Tower), containing a 6 m diameter superconducting solenoid, operating at 4 Tesla. The various pieces need to be positioned with a precision of 0.01 mm, after having been lowered into a pit 100 m underground. The ATLAS cavern, 55 m long, 40 m high and 35 m wide (also located 100 m below the surface), is the largest man-made underground structure in the world, and the ATLAS detector, 45 m long, 12 m in diameter, just barely squeezes inside.

Engineering challenges aside, what we really care about is the physics.

The LHC will reach a center-of-mass energy of $\sqrt{s}= 14\text{TeV}$, with the aforementioned luminosity of $10^{34} \text{cm}^{-2}\text{s}^{-1}$. Most optimistically, this gives them a “reach” for the discovery of new particles of up to $m\sim 6\text{TeV}$. Realistically, what one can see depends very much on the decay channel. Leptons and γs are very well discriminated. Fully hadronic final states can only be extracted from the (huge) QCD background with a hard $p_T \gt 100\text{GeV}$ cut, which limits them to only very heavy objects.

To see a very light Higgs (say, 115 GeV) at the $5\sigma$ level will require a year of running, as the $H\to \gamma\gamma$ channel requires excellent EM calorimetry. For $m_H\gt 130 \text{GeV}$, the $H\to Z Z^* \to 4 \text{lepton}$ channel opens up, which will be much easier to separate from background. A squark or gluino with a mass less than 1.3 TeV will be seen in less than a month of running. Masses up to 2.5 TeV could be seen in about a year.
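
For readers who want to play with such numbers: at design luminosity, a year of live running (the usual assumption is ~10^7 s of beam time) integrates to about 100 fb^{-1}, and expected event yields are just σ × ∫L dt. A sketch in Python, where the sample cross sections are made-up illustrative values and $S/\sqrt{B}$ is only the crudest significance estimate:

```python
import math

# One year at design luminosity: 1e34 cm^-2 s^-1, with ~1e7 s of live
# beam time (a standard assumption, not a number from the text).
LUMI_CM2_S = 1e34
LIVE_SECONDS = 1e7
int_lumi_fb = LUMI_CM2_S * LIVE_SECONDS / 1e39   # 1 fb^-1 = 1e39 cm^-2

def n_events(sigma_fb):
    """Expected number of events for a process with cross section sigma_fb."""
    return sigma_fb * int_lumi_fb

def naive_significance(n_signal, n_background):
    """Crude Gaussian S/sqrt(B) estimate; adequate when B is large."""
    return n_signal / math.sqrt(n_background)

print(f"integrated luminosity ≈ {int_lumi_fb:.0f} fb^-1/year")  # → 100 fb^-1/year
# Made-up illustrative rates: a 50 fb signal over a 10 pb background
print(f"S/sqrt(B) ≈ {naive_significance(n_events(50.0), n_events(10000.0)):.1f}")  # → 5.0
```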

With a year of running, all sorts of other beyond-the-Standard-Model physics (from new $Z'$ gauge bosons to TeV-scale blackholes) will be testable too. I’m really very excited about the prospects for our field finally seeing some experimental input again.

^{1} The total storage requirement expected for the LHC is about 10 petabytes/year. Currently, the largest database in the world is the BaBar Database at SLAC, which holds just shy of a petabyte.

## July 2, 2004

### Strings 2004: a Day Late and a Dollar Short

Wednesday morning was Cosmic Strings Morning. Rob Myers gave a very pretty review of his work with Copeland and Polchinski. Nick Jones talked about the evolution of cosmic F- and D-string networks. They compute the reconnection probability for colliding F- and D-strings (which gets folded into existing computations of the evolution of string networks). They claim that the reconnection probability can be much smaller ($P\sim 10^{-3}$ for F-F and $P\sim .1$ for D-D strings) than for the standard Nielsen-Olesen strings ($P\sim 1$). I’m not sure why these results differ from the gauge theory answer (presumably because the quantum corrections to the latter have never been properly taken into account) but, in any case, the most interesting features of these string networks are that the collision of general $(p,q)$ strings produces 3-string junctions (they also compute the probability for this process) and that there exist “baryons” (D3-branes wrapped on cycles with $n$-units of $F_{(3)}$ flux) on which $n$ fundamental strings can end. These change the evolution of these string networks in ways that cannot be captured by simply taking existing simulations of string networks and changing the reconnection probability, $P$.

Probably the most exciting talks so far have been about a cluster of work by Vafa and collaborators on topological string theory and its relationship with other aspects of string theory.

Strominger talked about the surprising relation between black hole entropy in $N=2$ supergravity and the topological A-model. If you take Type IIA string theory compactified on a Calabi-Yau manifold, and look for supersymmetric blackhole solutions, you find the well-known attractor mechanism where, whatever the values of the Kähler moduli, $X^\Lambda$, of the Calabi-Yau at spatial infinity, as you approach the horizon, they are attracted to the locus

$$p^\Lambda = \mathrm{Re}\bigl[C X^\Lambda\bigr],\qquad q_\Lambda = \mathrm{Re}\bigl[C F_{0\Lambda}\bigr]$$

where $(q_\Lambda,p^\Lambda)$ are the electric and magnetic charges of the blackhole, $F_{0\Lambda}= \partial F_0/\partial X^\Lambda$ and $F_0$ is the prepotential (related to the genus-zero topological string vacuum amplitude). The Kähler form is

and the attractor equations fix $C$ and the moduli, $X^\Lambda$, up to a Kähler transformation. Lopes Cardoso, de Wit and Mohaupt found a beautiful formula for the corrections to the area-law expression for the entropy of the blackhole. Define

where $F_h$ is proportional to the genus-$h$ topological string amplitude, and $T_{\mu\nu}$ is the (anti-self dual part of the) graviphoton field strength. At the horizon, the exact attractor equation is

and the blackhole entropy

where the first term is, essentially, the area law. This formula can be recast in terms of a mixed canonical/microcanonical partition function

The stunning result is

$$Z_{BH} = \left|Z_{\text{top}}\right|^2$$

where

and $\log Z_{\text{top}}= \sum_{h=0}^\infty g_{\text{top}}^{2h-2} F_{h(\text{top})}(t^A)$ is the topological A-model partition function.

Robert Dijkgraaf talked about a 7-dimensional field theory based on work of Hitchin on $G_2$ structures, which might be called (the spacetime theory of) topological M-theory. When defined on a manifold of the form $M_{\text{CY}}\times S^1$, it provides a derivation of the proposed S-duality between the topological A-model and the topological B-model (whose spacetime theories are Kodaira-Spencer theory and Kähler gravity, respectively). There’s also an 8-dimensional theory, based on $Spin(7)$ structures.

Both Seiberg and Rastelli gave beautiful talks about $c\lt 1$ noncritical string theories, and progress in understanding them from the point of view of D-branes (whose collective field theory is none other than the Matrix model).

## July 1, 2004

### Trackbacks and MTStripControlChars

A little interlude in the physics reportage. There’s been some more controversy on the subject of Trackbacks.

A bit of background. The Trackback protocol does not discuss the issue of character encodings. Since it proceeds via an HTTP POST, in the absence of any charset declaration, it *ought* to be assumed that the charset is ISO-8859-1. But, in point of fact, it could be *anything*.

The obvious long-term solution is for the Trackback Specification to demand that a charset be declared (explicitly or implicitly) and for implementations (like MovableType) to handle the requisite transcoding to/from your blog’s native charset.

But we ain’t there yet^{1}. Right now, you just have to guess at the trackback’s charset, and try to deal intelligently with the result.
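
MovableType plugins are written in Perl, but the guessing heuristic is easy to sketch in Python. This is my own illustration of one reasonable fallback order, not the plugin’s actual logic: honor a declared charset if there is one, otherwise try UTF-8 (which fails loudly on most other encodings), and only then fall back to ISO-8859-1 (which never fails, since every byte is defined):

```python
def decode_trackback(raw, declared=None):
    """Decode a trackback POST body, guessing the charset if undeclared.
    Returns (text, charset_actually_used)."""
    candidates = ([declared] if declared else []) + ["utf-8", "iso-8859-1"]
    for charset in candidates:
        try:
            return raw.decode(charset), charset
        except (UnicodeDecodeError, LookupError):
            continue  # bad bytes, or an unknown declared charset
    # Unreachable in practice (iso-8859-1 never fails), kept as a safety net.
    return raw.decode("iso-8859-1", errors="replace"), "iso-8859-1"
```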

Over a year ago, I wrote a plugin to ensure that data (like a trackback) which is purportedly ISO-8859-1 is really valid. Sam Ruby points out that I did an incomplete job of it. There were still some invalid characters that I accepted. That is, as they say, … *unacceptable*.

So I’ve revised MTStripControlChars to be really *bulletproof*.
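
For the curious, here is roughly what “bulletproof” has to mean for ISO-8859-1 input, sketched in Python (the real plugin is Perl; this is my paraphrase of the idea, not its code). In ISO-8859-1, the C0 range 0x00–0x1F (minus tab, LF, CR), DEL (0x7F), and the C1 range 0x80–0x9F are all control characters with no business in a trackback:

```python
import re

# C0 controls (except tab, LF, CR), DEL, and the C1 block.
_CONTROL = re.compile(r"[\x00-\x08\x0b\x0c\x0e-\x1f\x7f\x80-\x9f]")

def strip_control_chars(text):
    """Remove control characters that are invalid in ISO-8859-1 text content."""
    return _CONTROL.sub("", text)
```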

^{1} After waiting around for six months, I finally implemented my own solution. This doesn’t obviate the need for **MTStripControlChars**, but it does mean that I don’t have to bone-headedly pretend that all trackbacks are iso-8859-1.