## May 16, 2013

### The Propositional Fracture Theorem

#### Posted by Mike Shulman

Suppose $X$ is a topological space and $U\subseteq X$ is an open subset, with closed complement $K = X\setminus U$. Then $U$ and $K$ are, of course, topological spaces in their own right, and we have $X = U\sqcup K$ as a set. What additional information beyond the topologies of $U$ and $K$ is necessary to enable us to recover the topology of $X$ on their disjoint union?

Recall that the subspace topologies of $U$ and $K$ say that for each open $V\subseteq X$, the intersections $V\cap U$ and $V\cap K$ are open in $U$ and $K$, respectively. Thus, if a subset of $X$ is to be open, it must yield open subsets of $U$ and $K$ when intersected with them. However, this condition is not in general sufficient for a subset of $X$ to be open — it does define a topology on $X$, but it’s the coproduct topology, which may not be the original one.
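For finite spaces this failure is easy to exhibit by brute force. Here is a minimal Python sketch; the three-point space in it is a made-up toy example (not one from the discussion above), chosen so that the coproduct topology is strictly finer than the original one.

```python
from itertools import chain, combinations

# Toy space (an assumption for illustration): X = {0,1,2} with opens
# ∅, {0}, {0,1}, X; U = {0} is open and K = {1,2} is its closed complement.
X = frozenset({0, 1, 2})
opens_X = {frozenset(), frozenset({0}), frozenset({0, 1}), X}
U = frozenset({0})
K = X - U

opens_U = {V & U for V in opens_X}   # subspace topology on U
opens_K = {V & K for V in opens_X}   # subspace topology on K

# all subsets of X whose intersections with U and K are open there
subsets = [frozenset(s) for s in chain.from_iterable(
    combinations(sorted(X), r) for r in range(len(X) + 1))]
coproduct = {V for V in subsets if V & U in opens_U and V & K in opens_K}

# the coproduct topology is strictly finer: {1} lies in it but is not open in X
assert opens_X < coproduct
assert frozenset({1}) in coproduct - opens_X
```

Here `{1}` has open traces on both `U` and `K` but is not open in `X`, which is exactly the gap the rest of the post sets out to fill.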

One way we could start is by asking what sort of structure relating $U$ and $K$ we can deduce from the fact that both are embedded in $X$. For instance, suppose $A\subseteq U$ is open. Then there is some open $V\subseteq X$ such that $V\cap U = A$. But we could also consider $V\cap K$, and ask whether this defines something interesting as a function of $A$.

Of course, it’s not clear that $V\cap K$ is a function of $A$ at all, since it depends on our choice of $V$ such that $V\cap U = A$. Is there a canonical choice of such $V$? Well, yes, there’s one obvious canonical choice: since $U$ is open in $X$, $A$ is also open as a subset of $X$, and we have $A\cap U = A$. However, $A\cap K = \emptyset$, so choosing $V=A$ wouldn’t be very interesting.

The choice $V=A$ is the smallest possible $V$ such that $V\cap U = A$. But there’s also a largest such $V$, namely the union of all such $V$. This set is open in $X$, of course, since open sets are closed under arbitrary unions, and since intersections distribute over arbitrary unions, its intersection with $U$ is still $A$.

Let’s call this set $i_\ast(A)$. In fact, it’s part of a triple of adjoint functors $i_! \dashv i^\ast \dashv i_\ast$ between the posets $O(U)$ and $O(X)$ of open sets in $U$ and $X$, where $i^\ast:O(X)\to O(U)$ is defined by $i^\ast(V) = V\cap U$, and $i_!:O(U)\to O(X)$ is defined by $i_!(A)=A$. Here $i$ denotes the continuous inclusion $U\hookrightarrow X$.
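On a finite space all three functors can be computed by exhaustive search. The following Python sketch uses a made-up three-point space (an assumption for illustration, not an example from the post) and checks both adjunctions $i_! \dashv i^\ast \dashv i_\ast$ as order conditions on every pair of opens.

```python
# Toy space (an assumption for illustration): X = {0,1,2} with opens
# ∅, {0}, {0,1}, X; the open subspace is U = {0}.
X = frozenset({0, 1, 2})
opens_X = [frozenset(), frozenset({0}), frozenset({0, 1}), X]
U = frozenset({0})

def i_shriek(A):
    """i_!(A) = A itself, which is open in X because U is open."""
    return A

def i_upper(V):
    """i^*(V) = V ∩ U."""
    return V & U

def i_lower(A):
    """i_*(A): the largest open V of X with V ∩ U = A (the union of all such V)."""
    return frozenset().union(*[V for V in opens_X if V & U == A])

opens_U = sorted({i_upper(V) for V in opens_X}, key=len)

# the adjunctions i_! ⊣ i^* ⊣ i_*, stated as order conditions between the posets
for A in opens_U:
    for V in opens_X:
        assert (i_shriek(A) <= V) == (A <= i_upper(V))   # i_! ⊣ i^*
        assert (V <= i_lower(A)) == (i_upper(V) <= A)    # i^* ⊣ i_*

assert i_lower(frozenset({0})) == X         # every open V has V ∩ U ⊆ {0}
assert i_lower(frozenset()) == frozenset()  # only ∅ misses U entirely
```

Note that $i_\ast(\{0\})$ is all of $X$ in this example, so its trace on the closed complement $\{1,2\}$ is everything: a tiny preview of the $j^\ast i_\ast$ computation below.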

Now we can consider the intersection $i_\ast(A) \cap K$, which I’ll also denote $j^\ast i_\ast(A)$, where $j:K\hookrightarrow X$ is the inclusion. It turns out that this is interesting! Consider the following example, which is easy to visualize:

• $X = \mathbb{R}^2$.
• $U = \{ (x,y) | x \lt 0 \}$, the open left half-plane.
• $K = \{ (x,y) | x \ge 0 \}$, the closed right half-plane.

If an open subset $A\subseteq U$ “doesn’t approach the boundary” between $U$ and $K$, such as the open disc of radius $1$ centered at $(-2,0)$, then it’s fairly easy to see that $i_\ast(A) = A \cup \{(x,y) | x \gt 0 \}$, and therefore $j^\ast i_\ast(A) = \{(x,y) | x \gt 0 \}$ is the open right half-plane.

On the other hand, consider some open subset $A\subseteq U$ which does approach the boundary, such as $A = \{ (x,y) | x^2 + y^2 \lt 1 \;\text{and}\; x \lt 0 \}$, the intersection with $U$ of the open disc of radius $1$ centered at $(0,0)$. A little thought should convince you that in this case, $i_\ast(A)$ is the union of the open right half-plane with the whole open disc of radius $1$ centered at $(0,0)$. Therefore, $j^\ast i_\ast(A)$ is the open right half-plane together with the strip $\{ (0,y) | -1 \lt y \lt 1 \}$.

This example suggests that in general, $j^\ast i_\ast(A)$ measures how much of the “boundary” between $U$ and $K$ is “adjacent” to $A$. I leave it to some enterprising reader to try to make that precise. Here’s another nice exercise: what can you say about $i^\ast j_\ast(B)$ for an open subset $B\subseteq K$?

Let us however go back to our original question of recovering the topology of $X$. Suppose $A\subseteq U$ and $B\subseteq K$ are open such that $A\cup B$ is open in $X$; how does this latter fact manifest as a property of $A$ and $B$? Note first that $(A\cup B) \cap U = A$. Thus, since $i_\ast(A)$ is the largest $V$ such that $V\cap U = A$, we have $A\cup B \subseteq i_\ast(A)$, and therefore $B = j^\ast(A\cup B) \subseteq j^\ast i_\ast(A)$. Let me say that again: $B \subseteq j^\ast i_\ast(A).$ This is a relationship between $A$ and $B$ which is expressed purely in terms of the topological spaces $U$ and $K$ and the function $j^\ast i_\ast : O(U) \to O(K)$, which we have just shown is necessary for $A\cup B$ to be open in $X$.

In fact, it is also sufficient! For suppose this to be true. Since $B$ is open in $K$, there is some open $C\subseteq X$ such that $C\cap K = B$. Given such a $C$, the union $C\cup U$ also has this property, since $U\cap K = \emptyset$. Note that in fact $C\cup U = B\cup U$, and also $B\cup U = j_\ast (B)$, the largest open subset of $X$ whose intersection with $K$ is $B$. (Since $K$, unlike $U$, is not open, there may not be a smallest such subset, but there is always a largest one.) Now I claim we have $A \cup B = j_\ast (B) \cap i_\ast(A).$ To show this, it suffices to show that the two sides become equal after intersecting with $U$ and with $K$. For the first, we have $(j_\ast (B) \cap i_\ast(A)) \cap U = j_\ast (B) \cap (i_\ast(A) \cap U) = j_\ast (B) \cap A = A = (A\cup B) \cap U,$ where $j_\ast(B) \cap A = A$ because $A \subseteq U \subseteq j_\ast(B)$. For the second, we have $(j_\ast (B) \cap i_\ast(A)) \cap K = (j_\ast (B) \cap K) \cap i_\ast(A) = B \cap i_\ast(A) = B = (A\cup B) \cap K,$ using the assumption $B \subseteq j^\ast i_\ast(A) \subseteq i_\ast(A)$ at the step $B \cap i_\ast(A) = B$.
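For finite spaces this whole argument can be verified mechanically. Here is a Python sketch over a made-up three-point space (an assumption for illustration): it checks that whenever $B \subseteq j^\ast i_\ast(A)$ we have $A\cup B = i_\ast(A)\cap j_\ast(B)$, and that every open of $X$ arises from exactly such a pair.

```python
# Toy space (an assumption for illustration): X = {0,1,2} with opens
# ∅, {0}, {0,1}, X; U = {0} is open and K = {1,2} is closed.
X = frozenset({0, 1, 2})
opens_X = {frozenset(), frozenset({0}), frozenset({0, 1}), X}
U = frozenset({0})
K = X - U
opens_U = {V & U for V in opens_X}
opens_K = {V & K for V in opens_X}

# i_*(A): union of all opens V with V ∩ U = A (the largest such V)
i_lower = lambda A: frozenset().union(*[V for V in opens_X if V & U == A])
# j_*(B) = B ∪ U: the largest open of X whose intersection with K is B
j_lower = lambda B: B | U

glued = set()
for A in opens_U:
    for B in opens_K:
        if B <= i_lower(A) & K:          # the gluing condition B ⊆ j^* i_*(A)
            # the two descriptions of the glued open agree...
            assert A | B == i_lower(A) & j_lower(B)
            glued.add(A | B)

# ...and the qualifying pairs recover exactly the opens of X
assert glued == opens_X
```

Every open of `X` shows up exactly once this way, matching the bijective description summarized below.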

In conclusion, the topology of $X$ is entirely determined by

• the induced topology of an open subspace $U\subseteq X$,
• the induced topology on its closed complement $K = X\setminus U$, and
• the induced function $j^\ast i_\ast : O(U) \to O(K)$.

Specifically, the open subsets of $X$ are those of the form $A\cup B$ — or equivalently, by the above argument, $i_\ast(A) \cap j_\ast(B)$ — where $A\subseteq U$ is open in $U$, $B\subseteq K$ is open in $K$, and $B\subseteq j^\ast i_\ast(A)$.

An obvious question to ask now is, suppose given two arbitrary topological spaces $U$ and $K$ and a function $f:O(U)\to O(K)$; what conditions on $f$ ensure that we can define a topology on $X\coloneqq U\sqcup K$ in this way, which restricts to the given topologies on $U$ and $K$ and induces $f$ as $j^\ast i_\ast$? We may start by asking what properties $j^\ast i_\ast$ has. Well, it preserves inclusion of open sets (i.e. $A\subseteq A' \Rightarrow j^\ast i_\ast(A) \subseteq j^\ast i_\ast(A')$) and also finite intersections ($j^\ast i_\ast(A\cap A') = j^\ast i_\ast(A) \cap j^\ast i_\ast(A')$), including the empty intersection ($j^\ast i_\ast(U) = K$). In other words, it is a finite-limit-preserving functor between posets. Perhaps surprisingly, it turns out that this is also sufficient: any finite-limit-preserving $f:O(U) \to O(K)$ allows us to glue $U$ and $K$ in this way; I’ll leave that as an exercise too.
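The converse direction can also be tested on a small case. The sketch below glues two discrete spaces along a hand-picked finite-meet-preserving $f$ (both the spaces and $f$ are assumptions chosen for illustration) and checks that $\{A\cup B : B\subseteq f(A)\}$ really is a topology, restricts correctly to $U$ and $K$, and induces $f$ back as $j^\ast i_\ast$.

```python
# Two discrete spaces glued along f : O(U) -> O(K); everything here is a
# made-up illustration.  U = {0,1}, K = {2}, and f(A) = K iff 0 ∈ A.
U = frozenset({0, 1})
K = frozenset({2})
opens_U = [frozenset(s) for s in ([], [0], [1], [0, 1])]
opens_K = [frozenset(), K]
f = lambda A: K if 0 in A else frozenset()

# f preserves finite meets, including the empty meet f(U) = K
assert f(U) == K
assert all(f(A & A2) == f(A) & f(A2) for A in opens_U for A2 in opens_U)

# opens of the glued space X = U ⊔ K: the sets A ∪ B with B ⊆ f(A)
opens_X = {A | B for A in opens_U for B in opens_K if B <= f(A)}

# it is a topology (finite case, so binary unions and intersections suffice)...
assert all(V | W in opens_X and V & W in opens_X
           for V in opens_X for W in opens_X)
assert frozenset() in opens_X and (U | K) in opens_X

# ...restricting to the given topologies on U and K...
assert {V & U for V in opens_X} == set(opens_U)
assert {V & K for V in opens_X} == set(opens_K)

# ...and f is recovered as j^* i_*
i_lower = lambda A: frozenset().union(*[V for V in opens_X if V & U == A])
assert all(i_lower(A) & K == f(A) for A in opens_U)
```

Note that the glued topology here (six opens on a three-point set) is neither the coproduct nor any subspace-induced topology; it is genuinely manufactured from $f$.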

Okay, that was some fun point-set topology. Now let’s categorify it. Open subsets of $X$ are the same as 0-sheaves on it, i.e. sheaves of truth values, or of subsingleton sets, and the poset $O(X)$ is the (0,1)-topos of 0-sheaves on $X$. So a certain sort of person immediately asks, what about $n$-sheaves for $n\gt0$?

In other words, suppose we have $X$, $U$, and $K$ as above; what additional data on the toposes $Sh(U)$ and $Sh(K)$ of sheaves (of sets, or groupoids, or homotopy types, etc.) allows us to recover the topos $Sh(X)$? As in the posetal case, we have adjunctions $i_! \dashv i^\ast \dashv i_\ast$ and $j^\ast \dashv j_\ast$ relating these toposes, and we may consider the composite $j^\ast i_\ast : Sh(U) \to Sh(K)$.

The corresponding theorem is then that $Sh(X)$ is equivalent to the comma category of $Id_{Sh(K)}$ over $j^\ast i_\ast$, i.e. the category of triples $(A,B,\phi)$ where $A\in Sh(U)$, $B\in Sh(K)$, and $\phi:B \to j^\ast i_\ast(A)$. This is true for 1-sheaves, $n$-sheaves, $\infty$-sheaves, etc. Moreover, the condition on a functor $f:Sh(U) \to Sh(K)$ ensuring that its comma category is a topos is again precisely that it preserves finite limits. Finally, this all works for arbitrary toposes, not just sheaves on topological spaces. I mentioned in my last post some applications of gluing for non-sheaf toposes (namely, syntactic categories).

One new-looking thing does happen at dimension 1, though, relating to what exactly the equivalence $Sh(X) \simeq (Id_{Sh(K)} \downarrow j^\ast i_\ast)$ looks like. The left-to-right direction is easy: we send $C\in Sh(X)$ to $(i^\ast C, j^\ast C, \phi)$ where $\phi : j^\ast C \to j^\ast i_\ast i^\ast C$ is $j^\ast$ applied to the unit of the adjunction $i^\ast \dashv i_\ast$. But in the other direction, suppose given $(A,B,\phi)$; how can we reconstruct an object of $Sh(X)$?

In the case of open subsets, we obtained the corresponding object (an open subset of $X$) as $A\cup B$, but now we no longer have an ambient “set of points” in which to take such a union. However, we also had the equivalent characterization of the open subset of $X$ as $i_\ast(A) \cap j_\ast(B)$, and in the categorified case we do have objects $i_\ast(A)$ and $j_\ast(B)$ of $Sh(X)$. We might initially try their cartesian product, but this is obviously wrong because it doesn’t incorporate the additional datum $\phi$. It turns out that the right generalization is actually the pullback of $j_\ast(\phi)$ and the unit of the adjunction $j^\ast\dashv j_\ast$ at $i_\ast(A)$: $\array{ C & \to & j_\ast(B) \\ \downarrow && \downarrow^{j_\ast(\phi)} \\ i_\ast(A) & \to & j_\ast j^\ast i_\ast(A) }$ In particular, any object $C\in Sh(X)$ can be recovered from $i^\ast C$ and $j^\ast C$ by this pullback: $\array{ C & \to & j_\ast j^\ast C \\ \downarrow && \downarrow \\ i_\ast i^\ast C & \to & j_\ast j^\ast i_\ast i^\ast C }$

Now let’s shift perspective a bit, and ask what all this looks like in the internal language of the topos $Sh(X)$. Inside $Sh(X)$, the subtoposes $Sh(U)$ and $Sh(K)$ are visible through the left-exact idempotent monads $i_\ast i^\ast$ and $j_\ast j^\ast$, whose corresponding reflective subcategories are equivalent to $Sh(U)$ and $Sh(K)$ respectively. In the internal type theory of $Sh(X)$, $i_\ast i^\ast$ and $j_\ast j^\ast$ are modalities, which I will denote $I_U$ and $J_U$ respectively. Thus, inside $Sh(X)$ we can talk about “sheaves on $U$” and “sheaves on $K$” by talking about $I_U$-modal and $J_U$-modal types (or sets).

Moreover, these particular modalities are actually definable in the internal language of $Sh(X)$. Open subsets $U\subseteq X$ can be identified with subterminal objects of $Sh(X)$, a.k.a. h-propositions or “truth values” in the internal logic. Thus, $U$ is such a proposition. Now $I_U$ is definable in terms of $U$ by $I_U(C) = (U\to C)$. I’m using type-theorists’ notation here, so $U\to C$ is the exponential $C^U$ in $Sh(X)$. The other modality $J_U$ is also definable internally, though a bit less simply: it’s the following pushout: $\array{ U\times C & \to & C\\ \downarrow & & \downarrow \\ U & \to & J_U(C)}.$ In homotopy-theoretic language, $J_U(C)$ is the join of $C$ and $U$, written $U\ast C$. And if we identify $Sh(U)$ and $Sh(K)$ with their images under $i_\ast$ and $j_\ast$, then the functor $j^\ast i_\ast : Sh(U) \to Sh(K)$ is just the modality $J_U$ applied to $I_U$-modal types.

Finally, the fact that $Sh(X)$ is the gluing of $Sh(U)$ with $Sh(K)$ means internally that any type $C$ can be recovered from $I_U(C)$, $J_U(C)$, and the induced map $J_U(C) \to J_U(I_U(C))$ as a pullback: $\array{ C & \to & J_U(C) \\ \downarrow && \downarrow \\ I_U(C) & \to & J_U(I_U(C)) }$ Now recall that internally, $U$ is a proposition: something which might be true or false. Logically, $I_U(C) = (U\to C)$ has a clear meaning: its elements are ways to construct an element of $C$ under the assumption that $U$ is true.

The logical meaning of $J_U$ is somewhat murkier, but there is one case in which it is crystal clear. Suppose $U$ is decidable, i.e. that it is true internally that “$U$ or not $U$”. If the law of excluded middle holds, then all propositions are decidable — but of course, internally to a topos, the LEM may fail to hold in general. If $U$ is decidable, then we have $U + \neg U = 1$, where $\neg U = (U\to 0)$ is its internal complement. It’s a nice exercise to show that under this assumption we have $J_U(C) = (\neg U \to C)$.

In other words, if $U$ is decidable, then the elements of $J_U(C)$ are ways to construct an element of $C$ under the assumption that $U$ is false. In the decidable case, we also have $J_U(I_U(C))=1$, so that $C = I_U(C) \times J_U(C)$ — and this is just the usual way to construct an element of $C$ by case analysis, doing one thing if $U$ is true and another if it is false.
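This product decomposition can be checked directly in a toy Boolean model. The sketch below is entirely an assumption for illustration: it models types over a two-point discrete space as pairs of sets (so every proposition is decidable), computes $I_U(C)$ and $J_U(C) = (\neg U\to C)$ stalkwise, and confirms that their product has the same size as $C$ at each point.

```python
# Types over the discrete two-point space {u, k}, modelled as pairs of sets
# (a made-up toy model); a proposition is a pair of booleans.
# U holds at u and fails at k.
def exp(P, C):
    """The exponential P -> C, computed pointwise: (True -> S) = S, and
    (False -> S) is a one-element set."""
    return tuple(S if holds else {()} for holds, S in zip(P, C))

U     = (True, False)
not_U = (False, True)
C     = ({'a', 'b'}, {'x', 'y', 'z'})     # an arbitrary made-up type

I_U = exp(U, C)       # (U -> C): C at u, trivial at k
J_U = exp(not_U, C)   # for decidable U, the join U * C is (not U -> C)

# C = I_U(C) x J_U(C): case analysis on the decidable proposition U
sizes = tuple(len(a) * len(b) for a, b in zip(I_U, J_U))
assert sizes == (len(C[0]), len(C[1]))
```

At the point where $U$ is true only the $I_U$ factor carries information, and at the point where it is false only the $J_U$ factor does, which is exactly the case analysis described above.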

This suggests that we might regard internal gluing as a “generalized sort of case analysis” which applies even to non-decidable propositions. Instead of ordinary case analysis, where we have to do two things:

• assuming $U$, construct an element of $C$; and
• assuming not $U$, construct an element of $C$

in the non-decidable case we have to do three things:

• assuming $U$, construct an element of $C$;
• construct an element of the join $U\ast C$; and
• check that the two constructions agree in $U\ast(U\to C)$.

I have no idea whether this sort of generalized case analysis is useful for anything. I kind of suspect it isn’t, since otherwise people would have discovered it, and be using it, and I would have heard about it. But you never know, maybe it has some application. In any case, I find it a neat way to think about gluing.

Let me end with a tantalizing remark (at least, tantalizing to me). People who calculate things in algebraic topology like to work by “localizing” or “completing” their topological spaces at primes, since it makes lots of things simpler. Then they have to try to put this “prime-by-prime” information back together into information about the original space. One important class of tools for this “putting back together” is called fracture theorems. A simple fracture theorem says that if $X$ is a $p$-local space (meaning that all primes other than $p$ are inverted) and some technical conditions hold, then there is a pullback square: $\array{ X & \to & X^{\wedge}_p\\ \downarrow & & \downarrow\\ X_{\mathbb{Q}} & \to & (X^{\wedge}_p)_{\mathbb{Q}} }$ where $(-)^{\wedge}_p$ denotes $p$-completion and $(-)_{\mathbb{Q}}$ denotes “rationalization” (inverting all primes). A similar theorem applies to any space $X$ (with technical conditions), yielding a pullback square $\array{ X & \to & \prod_p X_{(p)}\\ \downarrow & & \downarrow \\ X_{\mathbb{Q}} & \to & \Big(\prod_p X_{(p)}\Big)_{\mathbb{Q}} }$ where $(-)_{(p)}$ denotes localization at $p$.

Clearly, there is a formal resemblance to the pullback square involved in the gluing theorem. At this point I feel like I should be saying something about $Spec(\mathbb{Z})$. Unfortunately, I don’t know what to say! Maybe some passing expert will enlighten us.

Posted at May 16, 2013 7:47 PM UTC


### Re: The Propositional Fracture Theorem

I have been wondering about this exact thing recently, though not in quite so much detail! This post gave me some new ideas that I will have to sleep on, but hopefully someone can come by and enlighten us both.

Posted by: Patrick Durkin on May 17, 2013 4:33 AM

### Re: The Propositional Fracture Theorem

Because you mention join, and because there is a natural pairing $U \times (U\to C) \to C$, there is also a Hopf construction $U\star (U \to C) \to \Sigma C$, where we might as well have $\Sigma C := bool \star C$; I’m with you, at the moment, in that I don’t know if it’s good for anything, but it seemed worth mentioning anyway. Of course, this is supposing that what you mean by “join” is something close to what I mean by “join”, though perhaps the coincidence is invariant under semantics of join?

Posted by: Jesse McKeown on May 17, 2013 5:01 AM

### Re: The Propositional Fracture Theorem

I mean the same thing by join as a topologist does: the homotopy pushout of two spaces under their cartesian product.

Posted by: Mike Shulman on May 21, 2013 9:08 PM
