# Tropical string mathematics

Draft of 2018.10.16

May include: games ↗ combinatorics ↖ &c.

In doing some background reading for another project, I came across this readable and interesting introduction to Tropical Mathematics from several years ago. Since the other project involves language design for genetic programming, maybe the confluence led to the dream I had last night about matrices of strings. Who knows?

Tropical arithmetic focuses on the *tropical semiring* \(\left(\mathbb{R} \cup \{\infty\}, \oplus, \odot \right)\), where \(\oplus\) represents the “tropical sum” operator (which produces the *minimum* of its two numerical arguments), and \(\odot\) the “tropical product” (which produces the *sum* of its two arguments). That is, for example,

\[3 \oplus 5 = 3 \qquad 3 \odot 5 = 8\]

As it happens, familiar behavior from traditional algebra holds here as well: associativity, commutativity, and so on (though not subtraction or division, apparently). All kinds of interesting mathematical and useful models can be derived from this simple beginning, because (as you might imagine) just as a lot of linear algebraic formalism follows from the simple facts of traditional arithmetic, a lot of interesting tropical linear algebra follows from these variations.
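The two tropical operators are easy to poke at in code; here’s a minimal sketch in Python (the names `oplus` and `odot` are mine):

```python
# Tropical "sum" is the minimum; tropical "product" is ordinary addition.
INF = float("inf")  # the adjoined infinity

def oplus(a, b):
    return min(a, b)

def odot(a, b):
    return a + b

print(oplus(3, 5))    # → 3
print(odot(3, 5))     # → 8
print(oplus(3, INF))  # → 3; infinity is the identity for oplus
```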

But I’m going in a different direction here today, since it’s taking the form of a game or “what if” exercise or an exploration or a distraction from the work I should be doing… or whatever.

## Tropical string manipulations

Let me hand-wave in my non-mathematical way and draw an analogy from the minimization and addition operations in tropical arithmetic to a slightly different pair of string operations. Specifically, let me define nearly-identical *lexical* operators for strings: \(\oplus\) will return the “minimum” or *lexically first* of its two arguments, and \(\odot\) will produce the “concatenation” of its arguments. That is, for example,

\[``\text{foo''} \oplus ``\text{bar''} = ``\text{bar''} \qquad ``\text{foo''} \odot ``\text{bar''} = ``\text{foobar''}\] \[``\text{foo''} \oplus \infty = ``\text{foo''} \qquad ``\text{foo''} \odot \epsilon = ``\text{foo''}\]

There are some noteworthy differences and slack definitions in play here. This all came to me in the form of a dream as I was waking up this morning, and this whole piece is merely notes on that dream, so maybe we’re not all the way to rigor yet, OK?

First, there’s the troubling and/or interesting observation that unlike the case with tropical arithmetic, \(\left(a \odot b\right) \neq \left(b \odot a\right)\). I guess we’ll have to live with that for now.

Then there’s that character \(\epsilon\) I’ve dropped into the examples above. That’s just there to signify the empty string, “”. I’ll probably use “” more often than \(\epsilon\), because I can type it faster.

The other difference comes from my weird retention of the \(``\infty\text{''}\) symbol I’ve slapped in there. I suspect we all have a notion of what *an* infinite string might be: a string of characters going on forever. And given a rule for alphabetization that places `"abc"` before `"aaaa"` in a list, it is the case that for *any* infinitely long string of characters, a finite string will appear before it in the list. It turns out that there is such a rule, typically used in the field of combinatorics, and it’s called *shortlex* order. That is, shorter strings appear before longer ones (in all instances), and within a fixed length we alphabetize the strings.
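Shortlex order is easy to state in code: compare lengths first, then alphabetically. A quick sketch (the function name is mine):

```python
def shortlex_key(s):
    # Shorter strings sort first; ties are broken alphabetically.
    return (len(s), s)

words = ["aaaa", "abc", "bbb", "z"]
print(sorted(words, key=shortlex_key))
# → ['z', 'abc', 'bbb', 'aaaa']
```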

If we were using traditional lexicographic ordering, where `"aaaa"` appears before `"bbb"`, we might be in trouble. If we were to make that design decision, we would need to consider what an “infinite string” might mean if *any* infinitely-long sequence of letters would be salted infinitely often here and there within a size-independent alphabetically-sorted sequence, rather than at the end, as though they were the irrational numbers written out in decimal form on the number line. You might find that notion interesting and useful to pursue, or it might hurt your head, or (probably) both. In any case, here I’ll use *shortlex* order.

OK then.

What else do we get? Group Theorists seem to be concerned with commutativity and suchlike. Do we have any of that-there stuff with strings?

Seems like we have associativity for \(\oplus\) and \(\odot\):

\[\left(a \oplus b\right) \oplus c = a \oplus \left(b \oplus c\right)\] \[\left(a \odot b\right) \odot c = a \odot \left(b \odot c\right)\]

Which is to say:

\[\begin{align*} \left(``\text{foo''} \oplus ``\text{bar''}\right) \oplus ``\text{baz''} &= \\ ``\text{foo''} \oplus \left(``\text{bar''} \oplus ``\text{baz''}\right) &= ``\text{bar''} \end{align*}\]

and:

\[\begin{align*} \left(``\text{foo''} \odot ``\text{bar''}\right) \odot ``\text{baz''} &= \\ ``\text{foo''} \odot \left(``\text{bar''} \odot ``\text{baz''}\right) &= ``\text{foobarbaz''} \end{align*}\]

We have commutativity for \(\oplus\):

\[a \oplus b = b \oplus a\] \[\begin{align*} ``\text{foo''} \oplus ``\text{bar''} &= \\ ``\text{bar''} \oplus ``\text{foo''} &= ``\text{bar''} \end{align*}\]

But not for \(\odot\):

\[a \odot b \neq b \odot a\] \[\begin{align*} ``\text{foo''} \odot ``\text{bar''} &= ``\text{foobar''}\\ ``\text{bar''} \odot ``\text{foo''} &= ``\text{barfoo''} \end{align*}\]

How about left distributivity?

\[a \odot ( b \oplus c) \stackrel{?}{=} (a \odot b) \oplus (a \odot c)\] \[\begin{align*} ``\text{foo''} \odot ( ``\text{bar''} \oplus ``\text{baz''}) &= ``\text{foobar''} \\ (``\text{foo''} \odot ``\text{bar''}) \oplus (``\text{foo''} \odot ``\text{baz''}) &= \\ ``\text{foobar''} \oplus ``\text{foobaz''} &= ``\text{foobar''} \end{align*}\]

How about right distributivity?

\[( b \oplus c) \odot a \stackrel{?}{=} (b \odot a) \oplus (c \odot a)\] \[\begin{align*} \left(``\text{bar''} \oplus ``\text{baz''}\right) \odot ``\text{foo''} &= ``\text{barfoo''} \\ (``\text{bar''} \odot ``\text{foo''}) \oplus (``\text{baz''} \odot ``\text{foo''}) &= \\ ``\text{barfoo''} \oplus ``\text{bazfoo''} &= ``\text{barfoo''} \end{align*}\]

Seems to check out. `¯\_(ツ)_/¯`
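With \(\oplus\) as shortlex minimum and \(\odot\) as concatenation, all of these identities are easy to spot-check mechanically; a sketch (the operator names are mine):

```python
def oplus(a, b):
    # Shortlex minimum: shorter string wins; ties broken alphabetically.
    return min(a, b, key=lambda s: (len(s), s))

def odot(a, b):
    # String "product" is just concatenation.
    return a + b

a, b, c = "foo", "bar", "baz"

# Associativity of both operators
assert oplus(oplus(a, b), c) == oplus(a, oplus(b, c)) == "bar"
assert odot(odot(a, b), c) == odot(a, odot(b, c)) == "foobarbaz"

# Commutativity holds for oplus but fails for odot
assert oplus(a, b) == oplus(b, a) == "bar"
assert odot(a, b) != odot(b, a)

# Left and right distributivity
assert odot(a, oplus(b, c)) == oplus(odot(a, b), odot(a, c)) == "foobar"
assert odot(oplus(b, c), a) == oplus(odot(b, a), odot(c, a)) == "barfoo"
print("all identities check out")
```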

Maybe there are some weirdnesses about the additive and multiplicative identities, and/or zero? The identities feel about right: “adding” the infinite string leaves a string unchanged, as does “multiplying by” (concatenating) the empty string. But there doesn’t seem to be a string such that, if we “multiply” (concatenate) it onto any other, the result is wiped out. That is, there doesn’t seem to be an absorbing “zeroing” term under \(\odot\).
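One way to poke at the identities in code is to stand in for the “infinite string” with a sentinel value that \(\oplus\) treats as later than everything; a sketch (the `None` sentinel is my own choice):

```python
INF = None  # stands in for the infinitely long string

def oplus(a, b):
    # The infinite string loses to any finite string under shortlex.
    if a is INF:
        return b
    if b is INF:
        return a
    return min(a, b, key=lambda s: (len(s), s))

def odot(a, b):
    return a + b

assert oplus("foo", INF) == "foo"  # INF acts as the additive identity
assert odot("foo", "") == "foo"    # "" acts as the multiplicative identity
# But there is no string z with odot(a, z) == z for every a, so
# concatenation appears to have no absorbing "zero" element.
```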

So something something maybe-not-a-semiring I guess?

## Hapless fun with vectors and matrices

Given “addition” and “multiplication”, as you might expect there is a reasonably well-behaved version of matrix multiplication using tropical arithmetic, and indeed it’s quite useful in many graph theory and combinatorics algorithms.
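For the numeric case, the \((\min, +)\) matrix product is the engine behind shortest-path computations: squaring a weighted adjacency matrix gives shortest paths of at most two hops. A small sketch (the graph and function name are my own invention):

```python
INF = float("inf")

def trop_matmul(A, B):
    # (min, +) matrix product: entry (i, j) is min over k of A[i][k] + B[k][j]
    n = len(A)
    return [[min(A[i][k] + B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Adjacency matrix of a 3-node graph with edge weights
# (INF = no edge, 0 on the diagonal).
D = [[0,   4,   INF],
     [INF, 0,   1],
     [2,   INF, 0]]

D2 = trop_matmul(D, D)
print(D2[0][2])  # shortest 0 -> 2 path within two hops: 4 + 1 = 5
```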

Eliding for the foreseeable future what a “vector” or “matrix of strings” *even means* (geometrically), let’s play with them as if we were on firm theoretical ground already.

First off, let me dispense with the quotes around strings like “foo” and “bar” entirely (I’ll reserve the use of quotes as scare quotes around my slapdash terminology), and use letters to signify strings in question.

It looks like we can multiply a string “scalar” by a “vector” \(v_n\) of \(n\) strings like this:

\[s \odot (v_1, v_2, v_3, \dots, v_n) = (s \odot v_1, s \odot v_2, s \odot v_3, \dots, s \odot v_n)\]

but of course \((s \odot v_i) = sv_i\) (indicating concatenation), so

\[s \odot (v_1, v_2, v_3, \dots, v_n) = (sv_1, sv_2, sv_3, \dots, sv_n)\]

That seems OK. In functional programming terms, I’ve just mapped some kind of `prepend` function over this ordered collection of strings.
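In Python terms, this scalar “multiplication” really is just a map of string prepending over the vector (a sketch; the function name is mine):

```python
def scalar_odot(s, v):
    # "Multiplying" a string scalar onto a vector of strings
    # just prepends s to every component.
    return [s + vi for vi in v]

print(scalar_odot("foo", ["bar", "baz", "qux"]))
# → ['foobar', 'foobaz', 'fooqux']
```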

Let’s jump all the way to a simple matrix or two. Sticking with the tried-and-true \(2 \times 2\) versions for the moment:

\[\begin{bmatrix} a_1 & a_2 \\ a_3 & a_4 \end{bmatrix} \begin{bmatrix} b_1 & b_2 \\ b_3 & b_4 \end{bmatrix} = \begin{bmatrix} (a_1 b_1) \oplus (a_2 b_3) & (a_1 b_2) \oplus (a_2 b_4) \\ (a_3 b_1) \oplus (a_4 b_3) & (a_3 b_2) \oplus (a_4 b_4) \end{bmatrix}\]

Whew. That was fun to typeset. But it’s not intuitively obvious what is going on here, so let me try it with one-letter strings (and leaving out all the fricking quotes):

\[\begin{bmatrix} \text{L} & \text{O} \\ \text{V} & \text{E} \end{bmatrix} \begin{bmatrix} \text{H} & \text{A} \\ \text{T} & \text{E} \end{bmatrix} = \begin{bmatrix} \text{LH} \oplus \text{OT} & \text{LA} \oplus \text{OE} \\ \text{VH} \oplus \text{ET} & \text{VA} \oplus \text{EE} \end{bmatrix} = \begin{bmatrix} \text{LH} & \text{LA} \\ \text{ET} & \text{EE} \end{bmatrix}\]

So what *appears* to be happening here is that the row-column application of \(\odot\) is constructing an interesting subset of concatenations of the strings in those elements, and the final application of \(\oplus\) is dropping most of that set to return only the alphabetically earliest one.

Let me try something a little *teeny* bit more challenging, just to see if I have this straight:

Well that’s interesting. Seems to do *something*, doesn’t it?

There’s a sense of “expanding” strings combinatorially in the first phase where we concatenate several pairs of them together, but then there’s a sense of “collapse” towards smaller strings closer to the start of the alphabet in the second phase where we apply shortlex minimization.

Is this at all useful? I’m not the one to judge. It’s been a fun exercise for me to typeset these things in MathJax, so it’s surely “useful” in the sense of being a fun exercise.

It may also have some applications (out along the usual hand-wavy sort of path) in bioinformatics and/or finite automata grammar stuff. But “surely” (note the scare quotes) concatenation and alphabetization don’t have much to do with natural language processing. I’m thinking there may be some fun things here revolving around the combinatorics of variable naming. And to be frank, there may be something interesting going on here for code generation schemes in GP, with concatenation paired not with “alphabetical minimization” but perhaps with something involving performance measures of the resulting concatenated strings.

Maybe I can play with that some day. But it’s fun enough for now. It’s from a dream. Maybe someday it will bubble up again in reality, now that I’ve got it out of my system.