## January 30, 2010

### This Week’s Finds in Mathematical Physics (Week 292)

#### Posted by John Baez

In week292 of This Week’s Finds, learn about Henry Paynter’s “bond graphs” — diagrams that engineers use to model systems made of mechanical, electronic, and/or hydraulic components: springs, gears, levers, pulleys, pumps, pipes, motors, resistors, capacitors, inductors, transformers, amplifiers, and so on. And learn how different classes of systems like this call for different kinds of mathematics: symplectic geometry, complex analysis, and even a bit of Hodge theory.

Posted at January 30, 2010 5:12 AM UTC

TrackBack URL for this Entry:   http://golem.ph.utexas.edu/cgi-bin/MT-3.0/dxy-tb.fcgi/2162

### Re: This Week’s Finds in Mathematical Physics (Week 292)

I’m not sure I ever quite got to the bottom of the Legendre/Laplace transform business. Litvinov muddied things further by calling the Legendre transform the idempotent version of the Fourier–Laplace transform here, on p. 11.

Perhaps we’ll have the chance to understand some of these issues in terms of your physical systems.

Posted by: David Corfield on January 30, 2010 12:00 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

Indeed, the whole ‘ordinary analysis / idempotent analysis’ analogy is very relevant to the business I’m talking about now.

I’ve been avoiding discussing it, because it multiplies the whole discussion by two in complexity — roughly — and I’m already struggling to fit together all the puzzle pieces I’m laying out. Having too many puzzle pieces can be confusing! (Though having too few can be even worse.)

For the most part, you can say that “Legendre transforms show up in classical mechanics, while Fourier transforms show up in quantum mechanics and Laplace transforms show up in statistical mechanics”, with classical mechanics being the $\hbar \to 0$ limit of quantum mechanics and the $T \to 0$ limit of statistical mechanics, and with quantum mechanics and statistical mechanics linked by $i \hbar \cong T$.

So, in particular, if you take the Laplace transform

$\widehat{f}(k) = \int_0^\infty e^{-k x} f(x)\, dx$

and replace ordinary analysis by idempotent analysis, you get the Legendre transform:

$\widehat{f}(k) = \inf_{x \in [0,\infty)} \left( f(x) - k x \right)$

Or something like that. If you push me harder by asking more specific questions, I can try to answer them, and then fall on my face, and then figure things out.
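Here is a numerical sketch of that limit (my own toy example, with f(x) = x²/2 and the sign conventions of the Legendre–Fenchel transform sup_x (kx − f(x)), which differ from the formulas above). By Laplace's method, the "softened" transform (1/β) log ∫ exp(β(kx − f(x))) dx tends to the Legendre transform as β → ∞, which is one concrete way to see the idempotent limit:

```python
import numpy as np

# As beta -> infinity,
#     (1/beta) * log( integral exp(beta*(k*x - f(x))) dx )
# tends to the Legendre-Fenchel transform  sup_x (k*x - f(x)).

def soft_legendre(f, xs, k, beta):
    # stabilized log-sum-exp approximation of the integral
    vals = beta * (k * xs - f(xs))
    m = vals.max()
    dx = xs[1] - xs[0]
    return (m + np.log(np.sum(np.exp(vals - m)) * dx)) / beta

def legendre(f, xs, k):
    return np.max(k * xs - f(xs))

f = lambda x: 0.5 * x**2          # convex; its Legendre transform is k**2/2
xs = np.linspace(-10.0, 10.0, 20001)
k = 1.7

for beta in (1.0, 10.0, 1000.0):
    print(beta, soft_legendre(f, xs, k, beta))   # approaches 1.445 as beta grows
print(legendre(f, xs, k))                        # 1.445
```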

There are plenty of anomalies making the puzzle hard to solve. Here’s one:

The engineers like this big chart:

                displacement    flow            momentum       effort
                q               q'              p              p'
Mechanics       position        velocity        momentum       force
(translation)
Mechanics       angle           angular         angular        torque
(rotation)                      velocity        momentum
Electronics     charge          current         flux           voltage
Hydraulics      volume          flow            pressure       pressure
                                                momentum
Thermodynamics  entropy         entropy         temperature    temperature
                                flow            momentum
Chemistry       moles           molar           chemical       chemical
                                flow            momentum       potential


Typically in engineering each row in the chart is treated ‘classically’ — but we can also treat them quantum-mechanically, or statistical-mechanically. In the first two rows this is a famous thing to do. There’s also some interesting work on quantization of electrical circuits, which I’ll discuss someday if I live long enough. But the row called thermodynamics is a bit disturbing, since thermodynamics is the macroscopic story that’s ‘explained’ by statistical mechanics! In week289 I asked about the idea of quantizing thermodynamics. It’s an odd thing to do from a physics perspective. For example, I’ve never heard people talk about a ‘temperature operator’ and ‘entropy operator’ that may fail to commute. But mathematically it’s very easy, thanks to this analogy chart!
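To make the "operators that may fail to commute" remark concrete, here is the standard construction one would presumably copy from the first row of the chart (purely illustrative, and nothing here is specific to entropy and temperature): promote the conjugate pair to operators q, p built from ladder operators, with ħ = 1, and check the canonical commutation relation in a finite truncation:

```python
import numpy as np

# The naive sense of "quantizing a row of the chart": the conjugate pair
# (by analogy with position and momentum) becomes operators with [q, p] = i.

N = 8
a = np.diag(np.sqrt(np.arange(1, N)), 1)   # annihilation operator, N-level truncation
q = (a + a.T) / np.sqrt(2)
p = (a - a.T) / (1j * np.sqrt(2))

comm = q @ p - p @ q
# comm equals i times the identity, except in the bottom-right corner,
# an artifact of truncating the infinite-dimensional space to N levels.
print(np.round(comm[:3, :3], 10))
```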

But just as easily as you can quantize thermodynamics, you can also replace $i \hbar$ by $T$ and ‘statistical-mechanize thermodynamics’. And that’s even more odd! You get two different things called temperature, for example.

But of course these curious problems are part of what makes analogies so much fun.

Posted by: John Baez on February 1, 2010 1:56 AM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

“Legendre transforms show up in classical mechanics, while Fourier transforms show up in quantum mechanics and Laplace transforms show up in statistical mechanics”

But the Laplace transform crops up in circuit analysis for other reasons.

Posted by: David Corfield on February 1, 2010 8:26 AM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

the Laplace transform crops up in circuit analysis

(Really the problem is on their end; someday they may fix it and then David's link will work.)

Posted by: Toby Bartels on February 1, 2010 6:02 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

David wrote:

But the Laplace transform crops up in circuit analysis for other reasons.

Yes: this makes it possible to treat all linear circuit elements, most notably capacitors or inductors, as resistors whose resistance is a nontrivial complex-valued function of frequency. Engineers like to use the Laplace transform to turn functions of time into functions of frequency. A Fourier transform would do just as well — the difference is minor and ‘purely technical’.
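In symbols (standard impedance bookkeeping with s = iω; the component values below are arbitrary):

```python
# Each linear element looks like a resistor with a complex,
# frequency-dependent resistance (impedance), writing s = i*omega:

def Z_resistor(R, s):
    return R

def Z_capacitor(C, s):
    return 1.0 / (s * C)

def Z_inductor(L, s):
    return s * L

# series RLC driven at omega = 1000 rad/s: impedances add, just like resistances
s = 1j * 1000.0
Z = Z_resistor(100.0, s) + Z_capacitor(1e-6, s) + Z_inductor(0.1, s)
print(Z)   # 100 ohms resistive, net 900 ohms of capacitive reactance
```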

Right now I’m thinking of this appearance of the Laplace transform as unrelated to the Legendre transforms that are showing up all over the place here. Perhaps a deeper treatment would tie it in somehow. But I’ve really got my hands full already.

Posted by: John Baez on February 1, 2010 10:14 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)


                displacement    flow            momentum       effort
                q               q'              p              p'
Mechanics       position        velocity        momentum       force
(translation)
Mechanics       angle           angular         angular        torque
(rotation)                      velocity        momentum
Electronics     charge          current         flux           voltage
Hydraulics      volume          flow            pressure       pressure
                                                momentum
Thermodynamics  entropy         entropy         temperature    temperature
                                flow            momentum
Chemistry       moles           molar           chemical       chemical
                                flow            momentum       potential



                displacement    flow            momentum       effort
                q               q'              p              p'
Maxwell         D-flux          H-field         B-flux         E-field


Remember:
d/dt E = -B*surface
d/dt H = D*surface

As a circuit, this will work in 1 dimension or 2 dimensions.
But in 3 dimensions, to do the Maxwell analogy properly, you need to generalize a circuit to a *2-complex*.
Challenge for the bond graph fans: Can you do that with bond graphs?

In the Maxwell case, I distinguished between flux and field. This is part of a subtlety that we haven’t discussed yet:

     (0-Chunk)                          (1-Chunk)
  Voltage ------(Co)Boundary----> Voltage difference
     ^                                   |
     |                                   |
 Ohm's law (Capacitor)            Ohm's law (inductor)
     |                                   |
     |                                   |
     |                                   V
 d/dt Charge <---(Co)Boundary---- d/dt Current
     (N-Chunk)                        (N-1-Chunk)


Going from voltage to current actually involves 2 steps: Taking a coboundary and applying Ohm’s law. This subtlety becomes more important as things get trickier, such as in the Maxwell case.
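The two steps can be sketched on an ordinary circuit graph (my own toy example, a 3-node loop with unit conductances; the incidence matrix plays the role of the coboundary):

```python
import numpy as np

# Step 1 (coboundary): voltages on nodes -> voltage differences on edges,
#   via the incidence matrix d.
# Step 2 ("Ohm's law"): voltage differences on edges -> currents on edges,
#   via a diagonal conductance matrix G.
# Composing with the boundary d.T gives the graph Laplacian d.T @ G @ d.

# a 3-node loop: edges 0->1, 1->2, 2->0  (rows: edges, columns: nodes)
d = np.array([[-1.0,  1.0,  0.0],
              [ 0.0, -1.0,  1.0],
              [ 1.0,  0.0, -1.0]])
G = np.diag([1.0, 1.0, 1.0])       # unit conductance on each edge

V = np.array([1.0, 0.0, 0.0])      # node voltages
edge_drop = d @ V                  # coboundary
I_edge = G @ edge_drop             # Ohm's law
I_node = d.T @ I_edge              # net current out of each node
print(I_node)
```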
Gerard

Posted by: Gerard Westendorp on February 1, 2010 7:38 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

So Ohm's Law is Hodge duality?

Posted by: Toby Bartels on February 1, 2010 8:08 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

I believe Ohm’s Law is the equivalent of Hodge duality.
Ohm’s law combines material properties with geometrical properties. Probably these material properties are left out (set constant) in many field theories.
As John suggested, n-forms, n-cochains, and what I call “n-chunks” are related.

So far, we have looked at pairs of variables whose product is power, e.g. (V, I) or (F, v). But if you want to work with field quantities, you get things like current density (j). You pair these field quantities so their product equals power per volume, e.g. (grad(V), j), (grad(p), v), (E, H).

So if you take a “chunk” of field power, you have both field quantities and a volume, e.g.

Power = sigma * Volume * j * grad(V)

If you convert a field quantity plus a space dimension into a lumped quantity, e.g. grad(V) to (V1 - V2) by integrating over X, then the other quantity becomes something that scales with d(Volume)/dX: it is proportional to a surface.

Gerard

Posted by: Gerard Westendorp on February 1, 2010 10:02 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

What I wrote is not quite correct.
(I haven’t got MathML to work yet, so I’ll try to do this in ASCII.)

I wrote:
d/dt E = -B*surface
d/dt H = D*surface

But it should be:

integral_loop (E * dx) = - d/dt integral_surface (B * dA)
integral_loop (H * dx) = d/dt integral_surface (D * dA)

Maybe it is not wise to use the words “flux” and “field”. I prefer to use the word “chunk”:

A D-chunk is D integrated over a surface.
An E-chunk is E integrated over a line.
A B-chunk is B integrated over a surface.
An H-chunk is H integrated over a line.

So:

                displacement    flow            momentum       effort
                q               q'              p              p'
Maxwell         D-chunk         H-chunk         B-chunk        E-chunk


Gerard

Posted by: Gerard Westendorp on February 2, 2010 10:27 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

I have no idea how this might fit into what’s being discussed here, but you may be interested in recent work of Artstein-Avidan and Milman and Alesker, Artstein-Avidan, and Milman which shows that classical duality transformations like the Legendre and Fourier transforms are quite canonical. (I don’t know whether they deal explicitly with Laplace transforms.) For example, modulo technicalities and up to changes of coordinates, the Fourier transform is the unique bijective transformation of functions that turns convolution into multiplication and vice versa, and the Legendre transform is the unique involution on the space of convex functions which reverses inequality.
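The involution property is easy to check numerically (a discretized sketch with an arbitrary convex function; the grids and tolerances are mine):

```python
import numpy as np

# For a convex function, applying the Legendre-Fenchel transform
#     f*(y) = sup_x (y*x - f(x))
# twice returns the function you started with.

xs = np.linspace(-5.0, 5.0, 2001)
ks = np.linspace(-5.0, 5.0, 2001)

def lf(f_vals, grid, dual_grid):
    # f*(y) = max over the grid of y*x - f(x)
    return np.max(dual_grid[:, None] * grid[None, :] - f_vals[None, :], axis=1)

f = 0.5 * xs**2                   # convex, and self-dual under Legendre
f_star = lf(f, xs, ks)            # ~ 0.5 * ks**2 on the k grid
f_star_star = lf(f_star, ks, xs)  # recovers f on the x grid
print(np.max(np.abs(f_star_star - f)))   # tiny discretization error
```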

Posted by: Mark Meckes on February 2, 2010 2:32 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

Thanks for pointing this out!

Posted by: John Baez on February 3, 2010 3:50 AM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

Hi John,

I really missed the so-called “memristor” in your recent Weeks. It’s been the focus of a lot of press, as well as new avenues for innovation. What is its place in this whole scheme?

Posted by: Daniel de Franca MTd2 on January 30, 2010 2:58 PM | Permalink | Reply to this

### Each went his own way; Re: This Week’s Finds in Mathematical Physics (Week 292)

Interdepartmental communication of my interdisciplinary PhD research (1973–1977) was difficult, because the Biologists on my Dissertation Committee didn’t know about Laplace Transforms (as I was composing subsystems into systems with feedback), the Computer Scientists didn’t know any Metabolomics (as basic as the Michaelis–Menten equations, ODEs for chemical flux in systems of enzymes), the Mathematicians didn’t see the point of the interactive computer graphics front end that I built for my simulations, and so forth. This mutual ignorance among the disciplines is why each developed different vocabulary and notation, and why this half century of unifying meta-theory was so urgently needed.

Posted by: Jonathan Vos Post on January 30, 2010 5:21 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

Daniel wrote:

I really missed the so-called “memristor” in your recent Weeks. It’s been the focus of a lot of press, as well as new avenues for innovation. What is its place in this whole scheme?

It’s coming, along with some other funky 1-ports I haven’t gotten to yet.

From one viewpoint, I already have enough 1-ports, 2-ports and 3-ports to illustrate the overall scheme, and extra examples just need to be put in their proper slots. Are memristors active or passive? If passive, are they conservative or dissipative? Are they linear or nonlinear? Once we know that, we know where they belong in the grand scheme.

From another viewpoint, each individual $n$-port deserves its own detailed study! I would need to learn more about memristors to do them justice.

Posted by: John Baez on February 1, 2010 2:07 AM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

The word reminds me of some SF stories by Stanisław Lem from the early 1970s, where smart robots worked by what translators called “mnestronen”.

Posted by: Thomas on February 1, 2010 9:06 AM | Permalink | Reply to this

### typo

All of the Thévenin’s have an extra byte before the é – hex 81. This displays badly, in my browser at least.

Posted by: Aaron on January 31, 2010 12:54 AM | Permalink | Reply to this

### Re: typo

Thanks — fixed! The dangers of naive cut-and-paste.

Posted by: John Baez on January 31, 2010 9:03 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

I like seeing the unification of particle physics with engineering techniques as in your “big analogy chart”.

I would add that it is relatively easy to also incorporate economics techniques, since basically physics and engineering deal with energy economics.

Posted by: Doug on January 31, 2010 2:43 PM | Permalink | Reply to this

### Money to Burn; Re: This Week’s Finds in Mathematical Physics (Week 292)

Not to get into Econophysics (I know Economists and Physicists who love the term, and others who hate it). But a simple back-of-the-envelope question: How cost-efficient is power from burning one-dollar bills? Say, several tons at a time, to boil water for a steam plant? What emissions are there? Do they have to be beneficiated by removing a metal wire, or is that only in higher-denomination bills? This could be a useful baseline/normalization for public exposition, or business plans, on power generation systems.

Posted by: Jonathan Vos Post on January 31, 2010 7:27 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

Doug wrote:

I like seeing the unification of particle physics with engineering techniques as in your “big analogy chart”.

Thanks! But I didn’t really say much about particle physics, except for the humble case of classical point particles. There is indeed an analogy between circuit diagrams and Feynman diagrams, but I haven’t said much about that yet.

I would add that it is relatively easy to also incorporate economics techniques, since basically physics and engineering deal with energy economics.

As the ‘smart grid’ grows and matures, this may become a big deal.

In week289 I mentioned a version of bond graphs where instead of

*position, momentum, velocity and force*

we have

*inventory, economic momentum, product flow and price*

But I haven’t studied this version and I don’t know how much sense it makes.

James Dolan also pointed out an analogy where time and interest are canonically conjugate variables, with the present value of a future income stream $f(t)$ at interest rate $k$ being the Laplace transform

$\widehat{f}(k) = \int_0^\infty e^{-k t} f(t)\, dt$
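As a sanity check (my own toy numbers): a constant income stream f(t) = c at interest rate k should have present value c/k, and a discretized version of the Laplace integral agrees:

```python
import numpy as np

# Present value of an income stream f(t) at interest rate k:
#     PV = integral_0^inf exp(-k*t) f(t) dt
# approximated by the trapezoid rule on a truncated interval.

def present_value(f, k, t_max=400.0, n=400001):
    ts = np.linspace(0.0, t_max, n)
    vals = np.exp(-k * ts) * f(ts)
    dt = ts[1] - ts[0]
    return (np.sum(vals) - 0.5 * (vals[0] + vals[-1])) * dt

c, k = 100.0, 0.05                # $100/year at 5% interest
pv = present_value(lambda t: np.full_like(t, c), k)
print(pv)                         # close to c/k = 2000
```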

Jonathan wrote:

But a simple back-of-the-envelope question: How cost-efficient is power from burning one-dollar bills?

Here’s what I learned online. Maybe it’s true, maybe not.

Colored office paper has a gross calorific value of about 6300 BTU per pound.

A British Thermal Unit is about 1055 joules, and a pound is about 454 grams.

That gives me about 14,600 joules per dollar bill.

That seems like a shockingly large number at first, so let me compare: a cup of whole milk has about 150 ‘calories’, but these are really kilocalories, and a kilocalorie is 4184 joules, so that’s 628,000 joules.

So, okay, that sounds about right: you’d need to eat about 40 dollar bills to get the caloric equivalent of a cup of milk, assuming you could metabolize cellulose and extract all the energy you could from burning it! For example, if you were a termite…
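Redoing the arithmetic in code (the one-gram mass of a US bill is my own assumption; the other figures are the ones quoted above):

```python
# Back-of-the-envelope: energy from burning a dollar bill vs. drinking milk.
btu_per_lb = 6300.0        # gross calorific value of colored office paper
joules_per_btu = 1055.0
grams_per_lb = 454.0
bill_mass_g = 1.0          # assumption: a US banknote weighs about one gram

joules_per_bill = btu_per_lb * joules_per_btu / grams_per_lb * bill_mass_g
milk_joules = 150 * 4184   # a cup of whole milk, 150 kcal

print(round(joules_per_bill))          # roughly 14,600 J per bill
print(milk_joules / joules_per_bill)   # roughly 43 bills per cup of milk
```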

Now let’s compare gasoline. For example, we can imagine a car that runs on dollar bills, and see how it would stack up against a normal petrol-powered car.

Conveniently, it says here that “If it were possible for human beings to digest gasoline, a gallon would contain about 31,000 food calories – the energy in a gallon of gasoline is equivalent to the energy in about 110 McDonalds hamburgers!”

So, a gallon of gasoline, which costs about $3 in California right now, contains about 130,000,000 joules. That’s about 40,000,000 joules per dollar. Whereas the dollar bill itself provides only about 14,600 joules if you burn it. That’s a factor of almost 3000.

Posted by: John Baez on February 1, 2010 1:04 AM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

I’ll completely miss the point of a back-of-the-envelope calculation by mentioning that (as a result of having watched Inside Man, I know that) American currency is not made of paper but a mixture of 25 percent linen and 75 percent cotton. Internet references only give BTU values for “cotton waste”, not “usable cotton”, but that’s apparently 8000–9000 BTU per pound.

(Incidentally, looking at oil/gasoline “energy as food” gets taken one step further in the energy slave “unit system”.)

Posted by: bane on February 1, 2010 4:29 AM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

John Baez wrote:

So, a gallon of gasoline, which costs about $3 in California right now, contains about 130,000,000 joules. That’s about 40,000,000 joules per dollar. Whereas the dollar bill itself provides only about 14,600 joules if you burn it. That’s a factor of almost 3000.

That factor will soon decrease as gasoline gets more expensive, unless it is subsidized (some people will argue that it already is massively subsidized in the USA - indirectly - but I mean that the government spends money explicitly, say by handing everyone that buys a gallon one dollar).

And if you try to print the money that was spent e.g. on the 2009 bailout as one-dollar notes, you will probably find that the cost to produce a one-dollar note will soon surpass one dollar… so, taking second-order effects into account, we may need another envelope…

Posted by: Tim van Beek on February 1, 2010 10:48 AM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

John - talking of analogies between different physical systems, any chance you could comment on Verlinde’s new paper on the origin of gravity and the laws of Newton? Some friends were talking about it as a Big Deal - everything derived from entropic forces - and I wondered whether I’d find any talk about it here. It’s 1001.0785v1 on the arXiv. Thanks!

Posted by: Monty on February 1, 2010 4:07 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

any chance you could comment on Verlinde’s new paper

There was some discussion in the comments to Week 191.

Posted by: Toby Bartels on February 1, 2010 6:09 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

Oh! Right you are (for “191” read “291”). I searched the blog for “Verlinde” but the only hits were quite old. Maybe comments aren’t indexed.

Posted by: Monty on February 1, 2010 7:23 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

I might get around to talking about Verlinde’s paper sometime. If I were trying to ‘boost my ratings’ and get as many people to read This Week’s Finds as possible, I would have done it already!

For now, let me just say that I found Jacobson’s paper incredibly interesting back in 1995 — so I again urge everyone to read that. It’s short, and you can get a lot out of it even if you skip the 6 numbered equations.

Posted by: John Baez on February 2, 2010 2:38 AM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

I respect that you probably don’t want to discuss this topic here, at least not right now - but the natural question is: “Are there any interesting follow-ups to Jacobson’s paper?”

(Oh my, when it was published, I was still a barely sentient, jellyfish-like being in high school.)

Reading the last paragraph disclosed a basic gap in my knowledge of GR: I suppose that everyone has seen the derivation of linear gravitational waves (and the spectacular evidence for their existence from the orbital decay of binary pulsar systems). I just realized that I never saw or thought about nonlinear gravitational waves, so a hint about that would be appreciated, too.

Posted by: Tim van Beek on February 2, 2010 12:35 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

Tim vB wrote:

I respect that you probably don’t like to discuss this topic here, at least not right now - but the natural question is: “Are there any interesting follow-ups to Jacobson’s paper”?

Try these.

I just realized that I never saw or thought about nonlinear gravitational waves, so a hint about that would be appreciated, too.

There’s a lot of work on these. If you have some spare time, read The Global Nonlinear Stability of the Minkowski Space by Demetrios Christodoulou and Sergiu Klainerman. It’s 514 pages long and it gives a rigorous proof that Minkowski spacetime is stable in general relativity: small perturbations will not grow, but form gravitational radiation that fades away as it spreads out. I’m sure you’ll be relieved to hear this!

Posted by: John Baez on February 2, 2010 5:17 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

“Are there any interesting follow-ups to Jacobson’s paper”?

Via the arXiv trackbacks, I saw that Jacques Distler mentions the work of Bousso (and follow-ups thereto).

Posted by: Blake Stacey on February 2, 2010 6:53 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

Thanks! It will take me some time to digest this…

As for the “nonlinear” gravitational waves, maybe I should explain why Jacobson’s paper made me ask about it:

For sufficiently high sound frequency or intensity one knows that the local equilibrium condition breaks down, entropy increases, and sound no longer propagates in a time reversal invariant manner.

If we accept the existence of linear gravitational waves, the next stepping stone in pushing the analogy between sound waves and gravitational waves further could be the question of what classical GR can tell us about nonlinear effects in wave-like phenomena (entropy does not enter the picture yet, but less trivial interactions of the “wave” with its “background” do). I already suspected that much is known about this, but curiously enough it’s not mentioned in the textbooks that I know.

To get ahold of the book by Christodoulou and Klainerman will take some time and effort, but meanwhile I found an article in Living Reviews in Relativity that contains a (very, very) short introduction to this topic (and further references):

Posted by: Tim van Beek on February 3, 2010 9:35 AM | Permalink | Reply to this

### Kerr-Newman; Re: This Week’s Finds in Mathematical Physics (Week 292)

For one thing, gravitational radiation is quadrupole, unlike the compression waves of sound, or the transverse dipole electromagnetic waves. More interesting is the transducing (nonlinear) aspect of scattering gravitational radiation from charged, spinning black holes (Kerr-Newman). Incoming gravitational radiation yields outgoing electromagnetic waves, and vice versa.

Posted by: Jonathan Vos Post on February 3, 2010 4:43 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

Tim vB wrote:

To get ahold of the book by Christodoulou and Klainerman will take some time and effort…

I wasn’t seriously suggesting that you read that book! I was just pointing out its existence. It’s a tour de force of nonlinear analysis, quite difficult to follow, which won Christodoulou a MacArthur ‘genius grant’ and various other prizes. You can read about nonlinear aspects of gravitational waves in many other simpler papers. For example:

Posted by: John Baez on February 5, 2010 6:10 AM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

Professor Baez,

You speak of linear passive circuit elements and then at the end say “If we drop the linearity assumption and consider fully general circuits…”

Can you explain what the difference is between linear and nonlinear circuits? I am not clear on this.

Posted by: kim on February 2, 2010 1:10 AM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

For now, you can think of a linear circuit as one made of linear resistances, capacitances and inductances, along with all the 2-ports and 3-ports I’ve introduced (which are all linear). I defined resistances, capacitances and inductances in week290, and I said what it meant for them to be ‘linear’.

More generally, an $n$-port is defined by $n$ equations relating $n$ efforts, $n$ flows, and perhaps the time variable $t$. If these equations are all linear in the efforts and flows (for each fixed time $t$), we say the $n$-port is linear. Then, we say a circuit is linear if it’s built from linear $n$-ports.

Of course this definition only works for circuits built from $n$-ports.

I suppose I should fix week292 a bit so that it includes an answer to your question.

Posted by: John Baez on February 2, 2010 2:20 AM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

In electronics we often look at current as a function of voltage and resistance. Does it matter whether p’=f(q’) or q’=f(p’) ?

Posted by: kim on February 2, 2010 4:52 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

Ah, interesting question! I was planning to talk about that later. The short answer is no: it doesn’t really matter, you can usually think about it either way.

But in the case where the resistance is zero (short circuit) we can’t think of the current as a function of voltage. And when the resistance is infinite (open circuit) we can’t think of the voltage as a function of current. A really good theory has to handle these cases! And this pushes the math in some interesting directions.
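One way to sketch the direction this pushes the math (my own illustration, not necessarily the formalism John has in mind): treat a linear 1-port not as a function but as a relation, i.e. a line through the origin in the (V, I) plane. Then the short circuit and the open circuit are just two more lines, with no special cases:

```python
import math

# A linear resistive 1-port as a *line* through the origin in the
# (voltage, current) plane, rather than as the function I = V/R.

def one_port(R):
    """Unit vector spanning the allowed (V, I) pairs of a resistor R."""
    if math.isinf(R):                  # open circuit: any voltage, zero current
        return (1.0, 0.0)
    n = math.hypot(R, 1.0)
    return (R / n, 1.0 / n)           # the line V = R*I; R = 0 gives (0, 1)

print(one_port(0.0))        # (0.0, 1.0): short circuit, zero voltage, any current
print(one_port(math.inf))   # (1.0, 0.0): open circuit, any voltage, zero current
v, i = one_port(2.0)        # V = 2*I along this line
```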

Posted by: John Baez on February 3, 2010 3:47 AM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

I know how to derive the principle of least energy from the principle of least action by starting with a conservative system and imposing the assumption that it’s static. But how about the principle of least power? Where does this come from?

I don’t know. If you know, tell me!

Well, I’ll give it a try…
Previously, I already mentioned space-time circuits.

Action is a relativistic concept.
Conventional circuit elements are “power chunks”: the power in the system is split *spatially* (discretised) over the components. But *time* discretisation is not done: the laws for a capacitor and an inductor involve continuous time. Fundamental mathematical physics usually tries to treat space and time on an equal footing, so let’s try to discretise time in the same way.
If a quantity on a vertex in a “space circuit” is dV/dt, it becomes a *difference* between 2 vertices separated by a time interval:

Conventional:
dV/dt
---0---

Space time:
-------> time
V(t)    ______    V(t+dt)
O------[______]------O


So the variables on the vertices get changed from dV/dt to V. The same happens to the variables on the edges. The result is that their product, formerly Power, becomes the double time integral of Power: Action.

So in a space time circuit, the circuit elements are chunks of action.
The principle of least power becomes the principle of least action.

I don’t really know an everyday-life meaning of the concept of action. The way I see it, action is something that arises when you look at the world as space-time, rather than just space.

Gerard

Posted by: Gerard Westendorp on February 2, 2010 9:57 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

Will my question(s) be answered? Or are these discussions only good for PhDs? As a beginner, am I somehow excluded from the discussion?

Some of the notes in ‘This Week’s Finds’ contain basic information, but sometimes it’s mixed with a bunch of postdoc-level stuff.

Posted by: kim on February 3, 2010 2:06 AM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

Patience! I tried to answer your second question this morning, but I didn’t succeed in posting my reply. I posted it just now.

Posted by: John Baez on February 3, 2010 3:49 AM | Permalink | Reply to this

### Only n-ports?

Am I the only one to be puzzled by your treatment of bond graphs compared to Paynter’s? His diagrams you show us manifestly have e.g. 2 inputs and one output??

Posted by: jim stasheff on February 5, 2010 1:47 PM | Permalink | Reply to this

### Re: Only n-ports?

Jim wrote:

Am I the only one to be puzzled by your treatment of bond graphs compared to Paynter’s? His diagrams you show us manifestly have e.g. 2 inputs and one output??

An $n$-port has a total of $n$ inputs and outputs. I drew a resistor, which is a 1-port:

          e
  ----------------- R
          f


This has 1 input at left, and no outputs; the resistor R is not an output: ‘the buck stops here’.

Paynter’s picture also includes 2-ports and 3-ports: the reservoir is a 1-port, the fluid power transmission is a 2-port, and the speed governor is a 3-port.

Is that what’s bothering you? That I only drew a 1-port? I wanted to draw some 2-ports and 3-ports myself, but I discovered it would take some work to draw them nicely. So, I decided to use Paynter’s picture for those.

If you’re having trouble understanding this stuff, I’m sure lots of other people are, too. So please let me know what’s the problem.

Posted by: John Baez on February 5, 2010 4:02 PM | Permalink | Reply to this

### Re: This Week’s Finds in Mathematical Physics (Week 292)

It’s hard to find clear online treatments of bond graphs. But these lectures are very thoughtful and nice. I thank C. J. Fearnley for pointing them out:

Posted by: John Baez on February 7, 2010 4:36 AM | Permalink | Reply to this

### Quantisation of a circuit.

I was wondering,

Suppose you have a circuit configuration for which you can calculate the Action (S). Is there some magical quantisation rule like:

amplitude = exp (iS/h_bar)?

Then integrate over all possible configurations… I remember Derek Wise was doing something like that. (There was some subtlety with gauge degrees of freedom that you had to eliminate first, I’ll revive that later)

One thing I can’t figure out is how to impose an amount of energy on the circuit. You need some way to tell it that it is not in the vacuum state, but that it has a finite temperature (T).

This is something I would really like to understand.
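For the temperature question, ordinary statistical mechanics suggests one standard answer (a sketch on my own initiative, with arbitrary component values): a lossless LC circuit quantizes to a harmonic oscillator with omega = 1/sqrt(L*C), and "giving it temperature T" means weighting each energy level by the Boltzmann factor exp(-E_n/(k_B*T)) instead of sitting in the vacuum state:

```python
import math

# Thermal state of a quantized LC circuit, treated as a harmonic oscillator
# with levels E_n = hbar*omega*(n + 1/2), weighted by Boltzmann factors.

hbar = 1.054571817e-34
kB = 1.380649e-23

def mean_energy(L, C, T, nmax=4000):
    omega = 1.0 / math.sqrt(L * C)
    Es = [hbar * omega * (n + 0.5) for n in range(nmax)]
    ws = [math.exp(-E / (kB * T)) for E in Es]
    Z = sum(ws)                        # partition function
    return sum(E * w for E, w in zip(Es, ws)) / Z

# a 1 nH / 1 fF circuit (omega ~ 1e12 rad/s) at room temperature is nearly
# classical: mean energy close to the equipartition value kB*T
E_mean = mean_energy(1e-9, 1e-15, 300.0)
print(E_mean / (kB * 300.0))
```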

Gerard

Posted by: Gerard Westendorp on February 8, 2010 11:19 PM | Permalink | Reply to this

Post a New Comment