
July 30, 2004

The President Answers his Critics

Thanks to the fevered imagining of Will Ferrell, we have a coherent response from Crawford, Texas to all the calumnies emanating from the Democratic National Convention in Boston:

There’s certain Liberal agitators out there who’d like you to believe my Administration’s not doing such a good job. Those are people such as Howard Stern, Richard Clark and the News.

There’s people out there who’d like you to believe the Economy isn’t doing so well. To that I answer, “Hey, for the two million jobs we’ve lost, that means two million unemployed people sitting at home, watching reruns of quality television such as The Jeffersons or Facts of Life. And that just means more ad revenue to Radio and TV stations.”

So stick with Bush. Don’t vote, and don’t listen to Liberals, Democrats or other Republican who make fun of me, or read the News or watch the News, except for Fox.

Thank you, and God bless.

Brought to you by the good folks from ACT.

Posted by distler at 10:17 AM | Permalink | Post a Comment

July 29, 2004

Blackhole Production

One (unlikely, but) exciting prospect is that, if there are large extra dimensions and TeV-scale gravity, one would produce blackholes at the LHC. Giddings and Eardley’s calculation of semiclassical blackhole formation in high energy scattering leads one to believe that, if the fundamental Planck scale is low enough (say, $M_D \sim 3\,\text{TeV}$) and the number, $n$, of large extra dimensions is large enough, one ought to see copious numbers of blackholes ($\sim 10^3$/year for $n=3$ and a luminosity of $10^{33}\,\text{cm}^{-2}\,\text{s}^{-1}$) at the LHC.
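To put that rate in perspective, the number of events is just the production cross section times the integrated luminosity, $N = \sigma \int L\, dt$. A minimal back-of-envelope sketch in Python, working backwards from the numbers above (the $10^7$ s per accelerator “year” is my assumption, not a figure from the paper):

```python
# Invert N = sigma * (integrated luminosity) to find the blackhole
# production cross section implied by the quoted rate.
PB_IN_CM2 = 1e-36            # 1 picobarn, in cm^2

luminosity = 1e33            # cm^-2 s^-1, quoted above
seconds_per_year = 1e7       # typical accelerator "year" (my assumption)
events_per_year = 1e3        # quoted blackhole rate for n = 3

integrated_lum = luminosity * seconds_per_year   # cm^-2 accumulated per year
sigma = events_per_year / integrated_lum         # implied cross section, cm^2
print(f"Implied production cross section: {sigma / PB_IN_CM2:.2g} pb")
# -> ~0.1 pb
```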

This would be incredibly dramatic. But, as Yoshino and Nambu showed, there’s an important effect which needs to be included, namely that some significant fraction of the center-of-mass energy is radiated away before the blackhole forms. When you include this inefficiency, the production rate goes way down (because we need to be further out on the tail of the parton distribution function).

Anchordoqui, Feng, Goldberg and Shapere have looked at the numbers for the LHC, and compared them with what can be seen from cosmic ray observatories. In cosmic ray searches, one can look for quasi-horizontal showers produced by cosmic ray neutrinos. If the TeV-scale gravity story is correct, blackhole formation dominates the total cross section at high energies. Looking only at quasi-horizontal showers minimizes the background from other sources, so this provides a fairly clean, dramatic signal.

Cosmic ray neutrinos have some advantages over the LHC.

  1. You pay the price of only one parton distribution function, instead of two.
  2. AGASA is already taking data, and Auger will be soon.

By the time LHC turns on, they will have accumulated several more years of data. Either they will have seen blackholes, or they will have set limits which leave the LHC only a very narrow window for discovery.

allowed region for discovery of TeV-scale blackholes at the LHC, and bounds from Auger

The discovery reaches for the LHC (solid) for 3 different integrated luminosities and $n = 6$ extra dimensions. Also shown is the region of parameter space which can be excluded at 95% CL if no neutrino showers mediated by BHs are observed in 5 years at the PAO. The shaded (cross-hatched) region assumes 2 SM neutrino + 0 (10) hadronic background events. $M_D$ is the fundamental Planck scale, and $x_{\text{min}}$ is the mass (in units of $M_D$) of the smallest-mass blackhole which can be cleanly distinguished. Ten years of the LHC running at $10^{34}\,\text{cm}^{-2}\,\text{s}^{-1}$ represents an integrated luminosity of 1000 fb$^{-1}$ (from Anchordoqui et al.).

There is, at least, one loophole in this analysis. Yoshino and Nambu only computed a lower bound on $M_{BH}/\sqrt{s}$. But it seems unlikely that the actual ratio will be significantly larger than this and — in any case — this affects both cosmic ray and LHC searches. Also, the flux of cosmic ray neutrinos isn’t really known, but it is, at least, bounded below by that due to pion production from cosmic ray protons.

Now, I don’t think either Auger or LHC will see blackholes. But I was surprised to learn how competitive high energy cosmic ray observatories have become in doing particle physics.

Posted by distler at 9:33 AM | Permalink | Followups (1)

July 22, 2004

No Information Lost Here!

The blogosphere — or, at least, that little corner of it that I pay attention to — is all a-twitter about Hawking’s announcement that he has solved the blackhole information problem (30 years after posing it). Normally, I try not to devote attention to such things, but the flurry of discussion prompted me to take a look at the transcript of Hawking’s talk.

He starts off with the amusing comment,

I adopt the Euclidean approach, the only sane way to do quantum gravity non-perturbative [sic].

Of course, among its other defects, the Euclidean path-integral he wishes to do is horribly infrared divergent. So his first step is to introduce an infrared regulator, in the form of a small negative cosmological constant. This is not merely a technicality. None of the subsequent arguments make any sense without it.

Anyone who hasn’t been asleep for the past 6 years knows that quantum gravity in asymptotically anti-de Sitter space has unitary time evolution. Blackholes may form and evaporate in the interior, but the overall evolution is unitary and is holographically dual to the evolution in a gauge theory on the boundary.

With the large accumulation of evidence for AdS/CFT, I doubt there are many hold-outs left who dispute that the above statement holds, not just in the semiclassical limit that Hawking considers, but in the full nonperturbative theory.

Nonetheless, a “bulk” explanation of what is going on is desirable, and Hawking claims to provide one. Hawking devotes a long discussion to the point that trivial topology dominates the Euclidean path-integral (at zero temperature). Since the trivial topology can be foliated by spacelike surfaces, one can straightforwardly Wick-rotate, and it follows that Minkowski-signature time evolution is unitary. Presumably, Hawking is aware of, but neglected to mention, Witten’s old paper, which not only shows the dominance of the trivial topology at low temperature, but also shows that, at high temperature, the path integral is dominated by the Hawking-Page instanton (the analytic continuation of the AdS blackhole) and that, moreover, the phase transition which separates these two regimes (which Hawking and Page argued for, in the context of semiclassical gravity in AdS) is related to the confinement/deconfinement transition in the large-N gauge theory.

All of these statements are true, and well-known to anyone familiar with AdS/CFT. But the latter goes well beyond the semiclassical approximation that Hawking uses. No one (at least, no one I talk to) has the slightest doubt that quantum gravity has unitary time evolution in asymptotically AdS space. The blackhole information paradox is solved in AdS, and it was “solved” long ago.

However, most people agree that the extrapolation to zero cosmological constant is not straightforward. There is still room to doubt that time evolution in asymptotically flat space is unitary. On the thorny issue of extrapolating to zero cosmological constant, Hawking is silent.

Posted by distler at 11:35 AM | Permalink | Followups (12)

July 20, 2004

Hair of the Dog

Longtime readers of this blog know that it started by accident. A bicycling accident, to be precise. Two summers ago at Aspen, I lived out east of town, on Laurel Mountain Drive. Cycling into town along Highway 82, on my second to last day, I hit a storm drain, and went over the handlebars…

As a result, I couldn’t write for several months — which ruled out my usual activities of doing calculations and lecturing at the blackboard. Ironically, what I could do was type, so I decided to download a copy of MovableType and see what I could do about putting physics on the web.

Well, I’m back in Aspen this year, and cycling about town. I’m hoping for a less eventful, if still productive stay.

Posted by distler at 11:06 AM | Permalink | Followups (3)

Dick Cheney — The Remix Version

[Via Crooked Timber] What really happened on the Senate Floor.

In other news, Ralph Nader is explicitly throwing in his lot with the Republicans, to get himself on the ballot in key swing states. Evidently, the Republicans have seen the light and recognize the wisdom of creating a viable 3rd party.

It’s getting increasingly hard to tell the parody from the reality these days. Maybe I should stop trying.

Posted by distler at 10:30 AM | Permalink | Post a Comment

July 13, 2004

“Fixed”

In the week since upgrading Apache from 2.0.49 to 2.0.50, I’ve been annoyed by what turns out to be this bug. When trying to track down a bug in a brand-new software release, however, one rarely looks at reports already marked RESOLVED/FIXED, does one?

Anyway, the patch mentioned fixes the problem on MacOSX, too.

Posted by distler at 10:26 PM | Permalink | Post a Comment

July 9, 2004

LHC

After Strings 2004, I spent the weekend in Paris, and hopped on the TGV to Geneva, where I’ve been spending the week at CERN. The big excitement here, of course, is the LHC, which is scheduled to turn on in the summer of 2007. The first half of the week was devoted to a Strings Workshop, where Fabiola Gianotti (of the ATLAS Collaboration) gave the lead-off lecture.

Wednesday afternoon, we got to tour the two main experimental facilities (ATLAS and CMS) and the magnet testing facility.

Since some of my readers aren’t physicists, let me reverse the order of things, and talk about the computing challenges involved. High energy experimental physics has always been on the cutting edge of computing technology (the World Wide Web, you’ll recall, was invented here at CERN), but the LHC experiments raise the bar considerably.

Because of the extraordinarily high luminosity of the LHC ($10^{34}\,\text{cm}^{-2}\,\text{s}^{-1}$), each detector will “see” $10^9$ collisions/s. An impressive amount of computation takes place in the custom ASICs in the fast electronic triggers, which whittle those $10^9$ events down to 100 “interesting” events/s.

Here, I have already oversimplified the problem. The proton bunches cross every 25 ns, and each bunch crossing produces about 25 events. 25 ns is a short period of time, shorter than the time it takes a particle traveling at the speed of light to cross the detector. So there are a large number of events happening all-but-simultaneously, each producing hundreds of charged-particle tracks. ATLAS’s level-1 triggers have the job of disentangling those multiple events and cutting the event rate from $10^9$/s to $10^4$/s, with a discrimination time of 2 μs. To do this, the level-1 calorimetry trigger has to analyse more than 3000 Gbit of input data per second. The level-2 triggers cut the event rate further, down to 100 events/s.

These 100 events then need to be “reconstructed”. That’s done “offline”, at a processor farm. Each event requires about 1 second of processing on a 1000 MIPS processor, and the reconstructed event is whittled down to 0.5 MB of data. You can probably guess where I’m headed. 0.5 MB/event × 100 events/s: in a year of running, you accumulate … 1.5 petabytes (1.5 million gigabytes) of data¹.
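Since several rates fly by in that paragraph, here is a minimal sketch, in Python, of the same arithmetic. The only input of mine is the seconds-per-year convention: a full calendar year (about $3\times 10^7$ s) reproduces the 1.5 petabyte figure, while the $10^7$ s usually quoted for an accelerator “year” would give about a third of that.

```python
# Back-of-envelope check of the LHC trigger and storage numbers above.
collision_rate = 1e9        # collisions/s seen by each detector
level1_rate    = 1e4        # events/s surviving the level-1 triggers
level2_rate    = 100        # "interesting" events/s surviving level-2
event_size_mb  = 0.5        # MB per reconstructed event

seconds_per_year = 3e7      # full calendar year (my convention; see text)

print(f"Level-1 rejection factor: {collision_rate / level1_rate:.0e}")
print(f"Overall trigger rejection: {collision_rate / level2_rate:.0e}")

storage_rate = level2_rate * event_size_mb           # MB/s written to disk
pb_per_year = storage_rate * seconds_per_year / 1e9  # 1 PB = 1e9 MB
print(f"Storage: {storage_rate:.0f} MB/s, about {pb_per_year:.1f} PB/year")
```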

And then begins the data analysis …

Live Webcam of the ATLAS Cavern
The ATLAS Cavern

The construction engineering challenges are equally impressive. The CMS detector (the smaller, but heavier, of the two) is a 12,500 tonne instrument (the magnetic return yoke contains more iron than the Eiffel Tower), containing a 6 m diameter superconducting solenoid operating at 4 Tesla. The various pieces need to be positioned with a precision of 0.01 mm, after having been lowered into a pit 100 m underground. The ATLAS cavern, 55 m long, 40 m high and 35 m wide (also located 100 m below the surface), is the largest man-made underground structure in the world, and the ATLAS detector, 45 m long and 12 m in diameter, just barely squeezes inside.

A cross-sectional slice through the CMS Detector
Schematic Cross-Sectional view of the CMS Detector

Engineering challenges aside, what we really care about is the physics.

The LHC will reach a center-of-mass energy of $\sqrt{s} = 14\,\text{TeV}$, with the aforementioned luminosity of $10^{34}\,\text{cm}^{-2}\,\text{s}^{-1}$. Most optimistically, this gives them a “reach” for the discovery of new particles of up to $m \sim 6\,\text{TeV}$. Realistically, what one can see depends very much on the decay channel. Leptons and γ’s are very well discriminated. Fully hadronic final states can only be extracted from the (huge) QCD background with a hard $p_T > 100\,\text{GeV}$ cut, which limits them to only very heavy objects.

To see a very light Higgs (say, 115 GeV) at the $5\sigma$ level will require a year of running, as the $H \to \gamma\gamma$ channel requires excellent EM calorimetry. For $m_H > 130\,\text{GeV}$, the $H \to Z Z^* \to 4\,\text{leptons}$ channel opens up, which will be much easier to separate from background. A squark or gluino with a mass less than 1.3 TeV will be seen in less than a month of running. Masses up to 2.5 TeV could be seen in about a year.
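Statements like “a year of running” are, at bottom, counting statistics: with $S$ signal and $B$ background events, the naive significance is $S/\sqrt{B}$, which grows like the square root of the integrated luminosity. A hedged sketch; the per-fb$^{-1}$ yields below are made-up placeholders of roughly the right flavor for a light-Higgs diphoton search, not real ATLAS numbers:

```python
# How discovery significance scales with integrated luminosity.
# The yields are hypothetical placeholders, NOT real ATLAS numbers.
from math import sqrt

sig_per_fb = 60.0     # hypothetical H -> gamma gamma signal events per fb^-1
bkg_per_fb = 5000.0   # hypothetical diphoton background events per fb^-1

def significance(int_lum_fb):
    """Naive S/sqrt(B) significance at a given integrated luminosity."""
    return (sig_per_fb * int_lum_fb) / sqrt(bkg_per_fb * int_lum_fb)

# Integrated luminosity needed for a 5-sigma observation:
needed = (5.0 * sqrt(bkg_per_fb) / sig_per_fb) ** 2
print(f"5 sigma needs about {needed:.0f} fb^-1 with these made-up yields")

for lum in (10, 30, 100):
    print(f"{lum:4d} fb^-1 -> {significance(lum):.1f} sigma")
```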

With a year of running, all sorts of other beyond-the-Standard-Model physics (from new $Z'$ gauge bosons to TeV-scale blackholes) will be testable too. I’m really very excited about the prospects for our field finally seeing some experimental input again.

¹ The total storage requirements expected for the LHC are about 10 petabytes/year. Currently, the largest database in the world is the BaBar database at SLAC, which holds just shy of a petabyte.

Posted by distler at 7:42 AM | Permalink | Followups (8)

July 2, 2004

Strings 2004: a Day Late and a Dollar Short

Wednesday morning was Cosmic Strings Morning. Rob Myers gave a very pretty review of his work with Copeland and Polchinski. Nick Jones talked about the evolution of cosmic F- and D-string networks. They compute the reconnection probability for colliding F- and D-strings (which gets folded into existing computations of the evolution of string networks). They claim that the reconnection probability can be much smaller ($P \sim 10^{-3}$ for F-F and $P \sim 0.1$ for D-D strings) than for standard Nielsen-Olesen strings ($P \sim 1$). I’m not sure why these results differ from the gauge theory answer (presumably because the quantum corrections to the latter have never been properly taken into account) but, in any case, the most interesting features of these string networks are that the collision of general $(p,q)$ strings produces 3-string junctions (they also compute the probability for this process) and the existence of “baryons” (D3-branes wrapped on cycles with $n$ units of $F_{(3)}$ flux) on which $n$ fundamental strings can end. These change the evolution of these string networks in ways that cannot be captured by simply taking existing simulations of string networks and changing the reconnection probability, $P$.

Probably the most exciting talks so far have been about a cluster of work by Vafa and collaborators on topological string theory and its relationship with other aspects of string theory.

Strominger talked about the surprising relation between blackhole entropy in $N=2$ supergravity and the topological A-model. If you take Type IIA string theory compactified on a Calabi-Yau manifold, and look for supersymmetric blackhole solutions, you find the well-known attractor mechanism where, whatever the values of the Kähler moduli, $X^\Lambda$, of the Calabi-Yau at spatial infinity, as you approach the horizon, they are attracted to the locus

(1) $Re(C X^\Lambda) = p^\Lambda, \qquad Re(C F_{0\Lambda}) = q_\Lambda$

where $(q_\Lambda, p^\Lambda)$ are the electric and magnetic charges of the blackhole, $F_{0\Lambda} = \partial F_0/\partial X^\Lambda$, and $F_0$ is the prepotential (related to the genus-zero topological string vacuum amplitude). The Kähler potential is

(2) $K = -\log\left( i \overline{X}^\Lambda F_{0\Lambda} - i X^\Lambda \overline{F}_{0\Lambda} \right)$

and the attractor equations fix $C$ and the moduli $X^\Lambda$ up to a Kähler transformation. Lopes Cardoso, de Wit and Mohaupt found a beautiful formula for the corrections to the area-law expression for the entropy of the blackhole. Define

(3) $F(X,T^2) = \sum_{h=0}^\infty F_h(X)\, T^{2h}$

where $F_h$ is proportional to the genus-$h$ topological string amplitude, and $T_{\mu\nu}$ is the (anti-self-dual part of the) graviphoton field strength. At the horizon, the exact attractor equation is

(4) $C^2 T^2 = 256, \qquad Re(C X^\Lambda) = p^\Lambda, \qquad Re(C F_\Lambda(X,T^2)) = q_\Lambda$

and the blackhole entropy is

(5) $S_{\text{BH}} = \frac{i\pi}{2}\left[ \overline{C}\,\overline{X}^\Lambda q_\Lambda - p^\Lambda \overline{C}\,\overline{F}_\Lambda \right]_{\text{attr}} + 128\pi i \left[ \frac{\partial \overline{F}}{\partial \overline{T}^2} - \frac{\partial F}{\partial T^2} \right]_{\text{attr}}$

where the first term is, essentially, the area law. This formula can be recast in terms of a mixed canonical/microcanonical partition function

(6) $S_{\text{BH}} = \log Z_{\text{BH}}(\phi^\Lambda, p^\Lambda) - \phi^\Lambda \frac{\partial}{\partial \phi^\Lambda} \log Z_{\text{BH}}$
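For orientation (my gloss, not something spelled out in the talk): (6) has the form of a Legendre transform, which is what one expects if $Z_{\text{BH}}$ is a mixed-ensemble partition function, with the magnetic charges $p^\Lambda$ held fixed and the electric charges summed over with chemical potentials $\phi^\Lambda$. With sign conventions chosen to match (6),

$Z_{\text{BH}}(\phi^\Lambda, p^\Lambda) = \sum_{q_\Lambda} \Omega(q_\Lambda, p^\Lambda)\, e^{q_\Lambda \phi^\Lambda}$

where $\Omega(q,p)$ counts blackhole microstates at those charges. Evaluating the sum at its saddle point, where $q_\Lambda = \partial \log Z_{\text{BH}}/\partial \phi^\Lambda$, reduces (6) to $S_{\text{BH}} = \log \Omega$, the microcanonical entropy.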

The stunning result is

(7) $Z_{\text{BH}}(\phi^\Lambda, p^\Lambda) = \left| Z_{\text{top}}(t^A, g_{\text{top}}) \right|^2$

where

(8) $t^A = \frac{p^A + i\phi^A/\pi}{p^0 + i\phi^0/\pi}, \qquad g_{\text{top}} = \pm\frac{4\pi i}{p^0 + i\phi^0/\pi}$

and $\log Z_{\text{top}} = \sum_{h=0}^\infty g_{\text{top}}^{2h-2} F_{h(\text{top})}(t^A)$ is the topological A-model partition function.

Robert Dijkgraaf talked about a 7-dimensional field theory based on work of Hitchin on $G_2$ structures, which might be called (the spacetime theory of) topological M-theory. When defined on a manifold of the form $M_{\text{CY}} \times S^1$, it provides a derivation of the proposed S-duality between the topological A-model and the topological B-model (whose spacetime theories are Kodaira-Spencer theory and Kähler gravity, respectively). There’s also an 8-dimensional theory, based on $Spin(7)$ structures.

Both Seiberg and Rastelli gave beautiful talks about $c < 1$ noncritical string theories, and progress in understanding them from the point of view of D-branes (whose collective field theory is none other than the Matrix model).

Posted by distler at 4:52 AM | Permalink | Followups (12)

July 1, 2004

Trackbacks and MTStripControlChars

A little interlude in the physics reportage. There’s been some more controversy on the subject of Trackbacks.

A bit of background. The Trackback protocol does not discuss the issue of character encodings. Since it proceeds via an HTTP POST, in the absence of any charset declaration, it ought to be assumed that the charset is ISO-8859-1. But, in point of fact, it could be anything.

The obvious long-term solution is for the Trackback Specification to demand that a charset be declared (explicitly or implicitly) and for implementations (like MovableType) to handle the requisite transcoding to/from your blog’s native charset.
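For concreteness, here is what a charset-aware ping might look like. A minimal sketch in Python rather than Perl, with a hypothetical endpoint URL; the field names are the standard Trackback ones:

```python
# Send a trackback ping that declares its charset explicitly in the
# Content-Type header, instead of making the receiver guess.
from urllib.parse import urlencode
from urllib.request import Request, urlopen

data = urlencode({
    "title": "Naïve title with non-ASCII text",  # percent-encoded as UTF-8
    "excerpt": "A short excerpt…",
    "url": "http://example.com/archives/000123.html",
    "blog_name": "Example Blog",
}).encode("ascii")  # urlencode has already percent-encoded everything

req = Request(
    "http://example.com/cgi-bin/mt-tb.cgi/42",   # hypothetical endpoint
    data=data,
    headers={"Content-Type":
             "application/x-www-form-urlencoded; charset=utf-8"},
)
urlopen(req)  # the receiver can now transcode reliably
```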

But we ain’t there yet¹. Right now, you just have to guess at the trackback’s charset, and try to deal intelligently with the result.

Over a year ago, I wrote a plugin to ensure that data (like a trackback) which is purportedly ISO-8859-1 is really valid. Sam Ruby points out that I did an incomplete job of it. There were still some invalid characters that I accepted. That is, as they say, … unacceptable.

So I’ve revised MTStripControlChars to be really bulletproof.
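The idea, sketched in Python for clarity (the plugin itself is Perl, running inside MovableType; this is an illustration of the logic, not the plugin’s actual code): in ISO-8859-1, the C0 control characters other than tab, newline and carriage return, along with the 0x7F–0x9F range, can never appear in valid text, so they get stripped.

```python
# Illustration of the MTStripControlChars idea: remove characters that
# are never valid in ISO-8859-1 text. (The real plugin is Perl; this
# Python sketch is mine.)
import re

# C0 controls except \t \n \r, plus DEL (0x7f) and the C1 range (0x80-0x9f)
CONTROL_CHARS = re.compile(r"[\x00-\x08\x0b\x0c\x0e-\x1f\x7f-\x9f]")

def strip_control_chars(text: str) -> str:
    """Drop characters that cannot occur in valid ISO-8859-1 text."""
    return CONTROL_CHARS.sub("", text)

# A Windows-1252 "smart-quoted" excerpt mislabeled as ISO-8859-1 shows
# up as C1 control characters (0x93, 0x94) and gets cleaned out:
print(strip_control_chars("a \x93smart\x94 quote"))  # -> "a smart quote"
```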


¹ After waiting around for six months, I finally implemented my own solution. This doesn’t obviate the need for MTStripControlChars, but it does mean that I don’t have to bone-headedly pretend that all trackbacks are ISO-8859-1.

Posted by distler at 2:57 AM | Permalink | Followups (2)