Ferromagnets are solids in pretty much every instance I can recall (though I suppose it's not impossible to imagine an itinerant Stoner magnet that's a liquid below its Curie temperature, and here is one apparent example). There's a neat paper in Science this week, reporting liquid droplets that act like ferromagnets and can be reshaped.

The physics at work here is actually a bit more interesting than just a single homogeneous material that happens to be liquid below its magnetic ordering temperature. The liquid in this case is a suspension of magnetite nanoparticles. Each nanoparticle is magnetic, as the microscopic ordering temperature for Fe_{3}O_{4} is about 858 K. However, the individual particles are so small (22 nm in diameter) that they are superparamagnetic at room temperature, meaning that thermal fluctuations are energetic enough to reorient how the little north/south poles of the single-domain particles are pointing. Now, if the interface at the surface of the suspension droplet confines the nanoparticles sufficiently, they jam together with such small separations that their magnetic interactions are enough to lock their magnetizations, killing the superparamagnetism and leading to a bulk magnetic response from the aggregate. Pretty cool! (Extra-long-time readers of this blog will note that this hearkens waaaay back to this post.)
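As a back-of-the-envelope check on why 22 nm particles would be superparamagnetic, here is a sketch of the standard Néel-relaxation estimate. The anisotropy constant and attempt time are typical literature values for magnetite assumed for illustration, not numbers from the paper:

```python
import math

# Rough estimate: is a 22 nm magnetite particle superparamagnetic at
# room temperature?  K and tau0 below are typical literature values
# for magnetite (assumed for illustration, not from the article).
k_B = 1.380649e-23      # Boltzmann constant, J/K
K = 1.1e4               # uniaxial anisotropy energy density, J/m^3 (assumed)
tau0 = 1e-9             # attempt time for moment reversal, s (assumed)
T = 300.0               # room temperature, K

d = 22e-9                                     # particle diameter, m
V = (4.0 / 3.0) * math.pi * (d / 2.0) ** 3    # particle volume, m^3

barrier_ratio = K * V / (k_B * T)   # anisotropy barrier in units of k_B T
tau = tau0 * math.exp(barrier_ratio)          # Neel relaxation time, s

# Rule of thumb: a barrier below ~25 k_B T means the moment flips within
# a ~100 s measurement window, i.e. the particle looks superparamagnetic.
print(f"barrier = {barrier_ratio:.1f} k_B T, tau = {tau:.2e} s")
```

With these assumed parameters the barrier comes out well under 25 k_B T, consistent with the particles' moments fluctuating freely until jamming locks them together.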

Applied Category Theory 2019 happened last week! It was very exciting: about 120 people attended, and they’re pushing forward to apply category theory in many different directions. The topics ranged from ultra-abstract to ultra-concrete, sometimes in the same talk.

The talks are listed above — click for a more readable version. Below you can read what Jules Hedges and I wrote about all those talks:

• Jules Hedges, Applied Category Theory 2019.

I tend to give terse summaries of the talks, with links to the original papers or slides. Jules tends to give his impressions of their overall significance. They’re nicely complementary.

You can also see videos of some talks, created by Jelle Herold with help from Fabrizio Genovese:

• Giovanni de Felice, Functorial question answering.

• Antonin Delpeuch, Autonomization of monoidal categories.

• Colin Zwanziger, Natural model semantics for comonadic and adjoint modal type theory.

• Nicholas Behr, Tracelets and tracelet analysis of compositional rewriting systems.

• Dan Marsden, No-go theorems for distributive laws.

• Christian Williams, Enriched Lawvere theories for operational semantics.

• Walter Tholen, Approximate composition.

• Erwan Beurier, Interfacing biology, category theory & mathematical statistics.

• Stelios Tsampas, Categorical contextual reasoning.

• Fabrizio Genovese, idris-ct: A library to do category theory in Idris.

• Michael Johnson, Machine learning and bidirectional transformations.

• Bruno Gavranović, Learning functors using gradient descent.

• Zinovy Diskin, Supervised learning as change propagation with delta lenses.

• Bryce Clarke, Internal lenses as functors and cofunctors.

• Ryan Wisnesky, Conexus AI.

• Ross Duncan, Cambridge Quantum Computing.

• Erwan Beurier, Memoryless systems generate the class of all discrete systems.

• Blake Pollard, Compositional models for power systems.

• Martti Karvonen, A comonadic view of simulation and quantum resources.

• Quanlong Wang, ZX-Rules for 2-qubit Clifford+T quantum circuits, and beyond.

• James Fairbanks, A compositional framework for scientific model augmentation.

• Titouan Carette, Completeness of graphical languages for mixed state quantum mechanics.

• Antonin Delpeuch, A complete language for faceted dataflow languages.

• John van de Wetering, An effect-theoretic reconstruction of quantum mechanics.

• Vladimir Zamdzhiev, Inductive datatypes for quantum programming.

• Octavio Malherbe, A categorical construction for the computational definition of vector spaces.

• Vladimir Zamdzhiev, Mixed linear and non-linear recursive types.

Now the Applied Category Theory 2019 *school* is about to start. But we shouldn’t let the momentum built up at the conference dissipate.

Yoram Alhassid asked the question (what distinguishes quantum thermodynamics from quantum statistical mechanics?) at the end of my Yale Quantum Institute colloquium last February. I knew two facts about Yoram: (1) He belongs to Yale’s theoretical-physics faculty. (2) His PhD thesis’s title—“On the Information Theoretic Approach to Nuclear Reactions”—ranks among my three favorites.^{1}

Over the past few months, I’ve grown to know Yoram better. He had reason to ask about quantum statistical mechanics, because his research stands up to its ears in the field. If forced to synopsize quantum statistical mechanics in five words, I’d say, “study of many-particle quantum systems.” Examples include gases of ultracold atoms. If given another five words, I’d add, “Calculate and use partition functions.” A partition function is a measure of the number of states, or configurations, accessible to the system. Calculate a system’s partition function, and you can calculate the system’s average energy, the average number of particles in the system, how the system responds to magnetic fields, etc.
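A minimal illustration of “calculate and use partition functions,” for a toy two-level system; the energy gap and temperature below are arbitrary numbers chosen for the sketch, not tied to any system in the post:

```python
import math

# Two-level system with energies 0 and eps, in natural units (k_B = 1).
T = 1.0
beta = 1.0 / T
eps = 0.5                     # energy gap (arbitrary illustrative value)
energies = [0.0, eps]

# Partition function: sum of Boltzmann weights over accessible states.
Z = sum(math.exp(-beta * E) for E in energies)

# Average energy the direct way: Boltzmann-weighted mean of the levels.
E_avg = sum(E * math.exp(-beta * E) for E in energies) / Z

# ...and via the standard identity <E> = -d(ln Z)/d(beta),
# checked here with a centered finite difference.
def ln_Z(b):
    return math.log(sum(math.exp(-b * E) for E in energies))

h = 1e-6
E_from_lnZ = -(ln_Z(beta + h) - ln_Z(beta - h)) / (2 * h)

print(E_avg, E_from_lnZ)
```

The two routes to the average energy agree, which is the whole point: once you have Z, the thermodynamic averages follow by differentiation.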

My colloquium concerned quantum thermodynamics, which I’ve blogged about many times. So I should have been able to distinguish quantum thermodynamics from its neighbors. But the answer I gave Yoram didn’t satisfy me. I mulled over the exchange for a few weeks, then emailed Yoram a 502-word essay. The exercise grew my appreciation for the question and my understanding of my field.

An adaptation of the email appears below. The adaptation should suit readers who’ve majored in physics, but don’t worry if you haven’t. Bits of what distinguishes quantum thermodynamics from quantum statistical mechanics should come across to everyone—as should, I hope, the value of question-and-answer sessions:

One distinction is a return to the operational approach of 19th-century thermodynamics. Thermodynamicists such as Sadi Carnot wanted to know how effectively engines could operate. Their practical questions led to fundamental insights, such as the Carnot bound on an engine’s efficiency. Similarly, quantum thermodynamicists often ask, “How can this state serve as a resource in thermodynamic tasks?” This approach helps us identify what distinguishes quantum theory from classical mechanics.
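The Carnot bound mentioned above is simple enough to state in a few lines; the reservoir temperatures below are arbitrary illustrative numbers:

```python
# Carnot bound: no engine running between a hot reservoir at t_hot and a
# cold reservoir at t_cold can exceed efficiency 1 - t_cold / t_hot.

def carnot_efficiency(t_cold: float, t_hot: float) -> float:
    """Maximum efficiency of a heat engine between two reservoirs (kelvin)."""
    if not 0 < t_cold < t_hot:
        raise ValueError("need 0 < t_cold < t_hot")
    return 1.0 - t_cold / t_hot

print(carnot_efficiency(300.0, 600.0))   # 0.5
```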

For example, quantum thermodynamicists found an advantage in charging batteries via nonlocal operations. Another example is the “MBL-mobile” that I designed with collaborators. Many-body localization (MBL), we found, can enhance an engine’s reliability and scalability.

Asking, “How can this state serve as a resource?” leads quantum thermodynamicists to design quantum engines, ratchets, batteries, etc. We analyze how these devices can outperform classical analogues, identifying which aspects of quantum theory power the outperformance. This question and these tasks contrast with the questions and tasks of many non-quantum-thermodynamicists who use statistical mechanics. They often calculate response functions and (e.g., ground-state) properties of Hamiltonians.

These goals of characterizing what nonclassicality is and what it can achieve in thermodynamic contexts resemble upshots of quantum computing and cryptography. As a 21st-century quantum information scientist, I understand what makes quantum theory quantum partially by understanding which problems quantum computers can solve efficiently and classical computers can’t. Similarly, I understand what makes quantum theory quantum partially by understanding how much more work you can extract from a singlet (a maximally entangled state of two qubits) than from a product state in which the reduced states have the same forms as in the singlet, (I/2) ⊗ (I/2).
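A sketch of that singlet-versus-product comparison, using the standard result that the work extractable from n qubits with trivial Hamiltonian, in contact with a bath at temperature T, is W = k_B T ln 2 × (n − S(ρ)), with S the von Neumann entropy in bits; units are chosen so k_B T ln 2 = 1, and the trivial Hamiltonian is an assumption made for the illustration:

```python
import numpy as np

def von_neumann_entropy_bits(rho: np.ndarray) -> float:
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # convention: 0 log 0 = 0
    return float(-np.sum(evals * np.log2(evals)))

# Singlet |psi-> = (|01> - |10>)/sqrt(2): a pure two-qubit state.
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)
rho_singlet = np.outer(psi, psi)

# Product of the singlet's reduced states: each qubit maximally mixed.
rho_product = np.kron(np.eye(2) / 2.0, np.eye(2) / 2.0)

n = 2
w_singlet = n - von_neumann_entropy_bits(rho_singlet)  # 2 units of kT ln 2
w_product = n - von_neumann_entropy_bits(rho_product)  # 0: no work at all
print(w_singlet, w_product)
```

The singlet, being pure, yields the full two units of work; the product of its (maximally mixed) marginals yields none, even though the two states look identical qubit-by-qubit.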

As quantum thermodynamics shares its operational approach with quantum information theory, quantum thermodynamicists use mathematical tools developed in quantum information theory. An example consists of generalized entropies. Entropies quantify the optimal efficiency with which we can perform information-processing and thermodynamic tasks, such as data compression and work extraction.

Most statistical-mechanics researchers use just the Shannon and von Neumann entropies, H_{Sh} and H_{vN}, and perhaps the occasional relative entropy. These entropies quantify optimal efficiencies in large-system limits, e.g., as the number of messages compressed approaches infinity and in the thermodynamic limit.

Other entropic quantities have been defined and explored over the past two decades, in quantum and classical information theory. These entropies quantify the optimal efficiencies with which tasks can be performed (i) if the number of systems processed or the number of trials is arbitrary, (ii) if the systems processed share correlations, (iii) in the presence of “quantum side information” (if the system being used as a resource is entangled with another system, to which an agent has access), or (iv) if you can tolerate some probability that you fail to accomplish your task. Instead of limiting ourselves to H_{Sh} and H_{vN}, we also use “ε-smoothed entropies,” Rényi divergences, hypothesis-testing entropies, conditional entropies, etc.
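A small sketch of one such generalized family, the Rényi entropies, which recover the Shannon entropy in the limit α → 1; the probability distribution below is an arbitrary example:

```python
import numpy as np

def renyi_entropy(p, alpha: float) -> float:
    """Renyi entropy H_alpha(p) = log2(sum_i p_i^alpha) / (1 - alpha), in bits.

    At alpha = 1 the formula is singular; the limit is the Shannon entropy.
    """
    p = np.asarray(p, dtype=float)
    if np.isclose(alpha, 1.0):            # Shannon limit
        q = p[p > 0]
        return float(-np.sum(q * np.log2(q)))
    return float(np.log2(np.sum(p ** alpha)) / (1.0 - alpha))

# An arbitrary four-outcome distribution.
p = np.array([0.5, 0.25, 0.125, 0.125])
for alpha in (0.5, 1.0, 2.0):
    print(f"H_{alpha}(p) = {renyi_entropy(p, alpha):.4f} bits")
```

The printed values decrease as α grows: larger α weights the most probable outcomes more heavily, which is exactly the kind of single-shot, worst-case information the smoothed and Rényi quantities are built to capture.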

Another hallmark of quantum thermodynamics is results’ generality and simplicity. Thermodynamics characterizes a system with a few macroscopic observables, such as temperature, volume, and particle number. The simplicity of some quantum thermodynamics served a chemist collaborator and me, as explained in the introduction of https://arxiv.org/abs/1811.06551.

Yoram’s question reminded me of one reason why, as an undergrad, I adored studying physics in a liberal-arts college. I ate dinner and took walks with students majoring in economics, German studies, and Middle Eastern languages. They described their challenges, which I analyzed with the physics mindset that I was acquiring. We then compared our approaches. Encountering other disciplines’ perspectives helped me recognize what tools I was developing as a budding physicist. How can we know our corner of the world without stepping outside it and viewing it as part of a landscape?

^{1}The title epitomizes clarity and simplicity. And I have trouble resisting anything advertised as “the information-theoretic approach to such-and-such.”

Eilers and I looked at how the kinematics of disk populations depend on their element-abundance ratios. And we built a model to capitalize on these differences without binning the data: We parameterized the dependence of the kinematics (the phase-space distribution function) on element abundances and then re-fit our dynamical model. It didn't work great; we don't yet understand why.