# Algebra is about composition

This is an episode of *Thoughts on Functional Programming*, a podcast by Eric Normand.

Subscribe: RSS · Apple Podcasts · Google Play · Overcast

When we look at the definitions of algebraic properties, we often see that we are defining how things compose. This is one of the main advantages of using algebraic properties to constrain our operations. If we define how they should compose before we implement them (as a unit test, for instance), we can guarantee that things will compose.

## Transcript

Eric Normand: Algebra is all about composition. In this episode, I'm going to follow up with a previous episode I did, all about algebra and algebraic properties, and why they were important.

It's something I forgot to talk about, which is that one of the reasons that we use algebra is that it helps us define how things compose.

My name is Eric Normand, and I help people thrive with Functional Programming.

Like I said before, a few episodes ago, I talked all about algebra and why I liked the ideas. As soon as it was published and everything, I realized, "Oh, man. I left out the why. Why is this important?"

Functional programmers talk a lot about composition, about building things out of small, simple pieces that compose really well. That's how you make more complex, more useful things. You make them out of smaller things by composing.

That's all well and good. It's true. But then, what does the composition really mean? How do you ensure that things compose well? Algebra and algebraic properties are really the study of that kind of composition. I want to try to explain why.

If you have some algebraic property, usually it is about...OK, I don't want to say "usually," because I haven't done a full, exhaustive study. Very often, if you take something like associativity, this is the case.

Very often, it's a formula, right? It's some equation, some equality, that this thing is equal to that thing. They're equivalent, so you could swap between them in your code, those two things. What are the two things that are equivalent?

In associativity, if you're calling an operator or a function with the first two elements, and then calling it again with that result and a third element, that is equal to calling it with the first element and the result of applying it to the second two.
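That associativity equation can be checked directly. Here's a minimal sketch in Python, using addition as the example operation:

```python
# Associativity: f(f(a, b), c) == f(a, f(b, c))
# A minimal sketch using addition as the operation.

def f(x, y):
    return x + y

a, b, c = 1, 2, 3

left = f(f(a, b), c)   # combine the first two, then use that result with the third
right = f(a, f(b, c))  # combine the second two, then use that result with the first

assert left == right   # addition is associative, so both sides are 6
```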

What are you doing? You're comparing the function call with itself. You're basically showing how these function calls can compose. Take commutativity. Commutativity says, "F of A and B is equal to F of B and A." You're comparing the function to itself, OK?
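The commutativity formula is the same kind of comparison of a function with itself. A quick sketch, using multiplication as an operation where the law holds and subtraction as one where it doesn't:

```python
# Commutativity: f(a, b) == f(b, a)

def f(x, y):
    return x * y  # multiplication is commutative

assert f(3, 4) == f(4, 3)

def g(x, y):
    return x - y  # subtraction is not commutative

assert g(3, 4) != g(4, 3)
```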

There's no real composition there, except in the sense that the arguments are being combined in a certain order. Associativity is actually a better example.

What you're doing is, you're showing that these things can compose in certain regular ways. By ensuring that you have that property, you're ensuring that you can do things down the line later with those operators, with those functions.

Often, we talk about composition, but we don't know how to build it in. Maybe we use our intuitions. Maybe we use our experience like, "Oh, if I do something this way, I know I'll be able to do this other thing later with it."

These algebraic properties, at least many of them, are ways of doing that up front, ahead of time. That is one of the reasons why I think algebraic properties are important for functional programming.

We talk about composition. This is where composition comes from. These are the ways that we compose things. Now, I'm not a stickler for the mathematical algebraic properties, meaning I think you could come up with new ones.

You could come up with new ways of composing functions that either haven't been named yet by mathematicians, or they have, but you don't have to know the name. You can make up your own formulas. You can start with the formulas and modify them a little bit.

I think that's all great. Just know that when you go off the path, you don't have as much to rely on, but sometimes you don't need it. Most of the time you don't need it, so that's fine.

An example of going off the path is one I like to talk about: merge, that is, merging two hash maps. It is associative. Everyone would agree it's associative; it's a clear example of associativity. However, it's not commutative in general. The order of the arguments does matter.

If you have two hash maps and they share a key but have different values, one of those values is going to have to win. One of them is going to overwrite the other. Which one usually depends on the argument position: typically, later arguments overwrite earlier ones.
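Both halves of that claim can be demonstrated in a few lines. Here's a sketch using Python dicts as a stand-in for Clojure hash maps, with a `merge` where later arguments win on shared keys, like `clojure.core/merge`:

```python
# Merging maps: associative, but not commutative in general.
# Python dicts stand in for Clojure hash maps here.

def merge(a, b):
    return {**a, **b}  # b's values win on shared keys

a = {"x": 1}
b = {"x": 2, "y": 3}
c = {"z": 4}

# Associative: how you group the merges doesn't matter.
assert merge(merge(a, b), c) == merge(a, merge(b, c))

# Not commutative: the shared key "x" goes to whichever map comes last.
assert merge(a, b) == {"x": 2, "y": 3}
assert merge(b, a) == {"x": 1, "y": 3}
```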

Order matters, so it's not commutative. Except, if you put a condition around it, you could say it's commutative: it is commutative if the two maps don't share keys. If there are no keys for one map to overwrite in the other, it's commutative.

Very often, at least in Clojure and other languages where you use hash maps a lot, you know the keys that you have. You often know that there are no shared keys. Merge, in those cases, is commutative.

How would that translate into something useful, or something practical? If you were doing a property-based test, you could put that as a condition on the data that gets generated.

By generated, I mean: generate two random hash maps and filter out all the pairs that have keys in common. Now those two hash maps don't share any keys, and merge is commutative with those two arguments.
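Here's a hand-rolled sketch of that idea using only Python's standard library, again with dicts standing in for Clojure hash maps. In practice you'd use a property-based testing library like test.check in Clojure or Hypothesis in Python; the key-generation scheme here is just an illustration:

```python
import random

def merge(a, b):
    return {**a, **b}  # later argument wins on shared keys

def random_map():
    # A toy generator: a few random keys drawn from a small pool.
    keys = random.sample("abcdefghij", k=random.randint(1, 5))
    return {k: random.randint(0, 100) for k in keys}

# Property: merge is commutative when the generated maps share no keys.
checked = 0
while checked < 100:
    a, b = random_map(), random_map()
    if a.keys() & b.keys():
        continue  # shared keys: filter this pair out, as the condition requires
    assert merge(a, b) == merge(b, a)
    checked += 1
```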

I like to play with it. I start with the standard definitions, and then change it from there. Often, you're not dealing with strict equality, you're dealing with some other kind of equality that doesn't take into account timestamps, or something like that.

It's not strictly equals; the equals is flexible, and so on. But you're still dealing with composition, and that's what I wanted to talk about.

OK, I feel like I've touched on this enough, so I'm going to wrap it up. If you like this episode, you can find all the past episodes at lispcast.com/podcast. There, you'll find all the past episodes with audio, video, and text transcripts, however you want to consume it.

You'll also find links to subscribe. Please do subscribe, so that you'll get all the new episodes as they come out, and links to find me on social media, where you can get in touch with me and ask me questions, comment. A lot of these topics come from questions.

This has been my thought on Functional Programming. My name is Eric Normand. Thank you for listening, and rock on.