The Many Faces of Flat‑Map: Part 5

Episode #46 • Feb 4, 2019 • Subscriber-Only

Finishing our 3-part answer to the all-important question “what’s the point?”, we show that, standing on the foundation of our understanding of map, zip, and flatMap, we can now ask, and concisely answer, very complex questions about the nature of these operations.

Introduction
00:05
Function composition and flatMap
02:23
Nested containers
10:03
Map from flatMap
18:29
Zip from flatMap: Optional and Array
22:58
Zip from flatMap: Result and Validated
25:06
Zip from flatMap: Func and Parallel
27:13
The point
30:42

Introduction

So let’s talk about the third and final part of “what’s the point?”. We’ve now spent a bunch of time getting comfortable with the idea of flatMap, justifying why we should use it, and why we should build an intuition for it. Once we did that, we convinced ourselves that the signature of flatMap and its friends is so important that we’re going to defend it from anyone who may disparage it: you shouldn’t change its signature, because it’s there for a reason.

The reason we’ve done all this work is that now we can build off that foundation and ask very complex questions: questions that may have been seemingly intractable had we not taken this deep journey of discovery.

We’re going to look at composition of functions when it comes to flatMap. We saw that map had a wonderful property: the map of the composition is the same as the composition of the maps. What that meant was that if you have a big chain of maps, you can collapse all of that into a single map and call it once with the composition of all the units of work. Is there a version of this for flatMap? There is!
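That property for map can be sketched in a few lines of Swift (the functions here are just illustrative examples):

```swift
let xs = [1, 2, 3]
let increment: (Int) -> Int = { $0 + 1 }
let stringify: (Int) -> String = { "\($0)!" }

// Two traversals of the array...
let twoPasses = xs.map(increment).map(stringify)
// ...versus a single traversal with the composed unit of work.
let onePass = xs.map { stringify(increment($0)) }

// Both produce ["2!", "3!", "4!"]: the map of the composition
// equals the composition of the maps.
```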

Next, we know that flatMap can flatten nested containers, like optionals of optionals and results of results, but what about nested containers of different types, like an array of results, or an array of parallels, etc.? Is there anything we can discover with those kinds of nested containers?

Finally, what is the precise relationship between map, zip, and flatMap? Can some operations be derived from others? What does that say about the types that support them, and is there some kind of hierarchy among these operations?

These are some pretty complicated questions, and we can finally answer them!
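To give a flavor of one such relationship before diving in: whenever a type has a flatMap, a map comes for free by re-wrapping the transformed value. A minimal sketch with Optional:

```swift
// A map on Optional derived purely from flatMap: transform the
// value, then wrap it back up with `.some` so the types line up.
func map<A, B>(_ f: @escaping (A) -> B) -> (A?) -> B? {
  return { optionalA in
    optionalA.flatMap { a in .some(f(a)) }
  }
}

let doubled = map { (x: Int) in x * 2 }
// doubled(3)   == .some(6)
// doubled(nil) == nil
```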

Exercises

  1. Implement flatMap on the nested type Result<A?, E>. It would have the signature:

    func flatMap<A, B, E>(
      _ f: @escaping (A) -> Result<B?, E>
      ) -> (Result<A?, E>) -> Result<B?, E> {
    
      fatalError("Implement me!")
    }
    
    Solution

    This function cannot be implemented easily using just the map and flatMap on Result and Optional. We have to drop down into explicit switch destructuring to handle all of the cases:

    func flatMap<A, B, E>(
      _ f: @escaping (A) -> Result<B?, E>
      ) -> (Result<A?, E>) -> Result<B?, E> {
    
      return { resultOfOptionalA in
        switch resultOfOptionalA {
        case let .success(.some(a)):
          return f(a)
        case .success(.none):
          return .success(.none)
        case let .failure(error):
          return .failure(error)
        }
      }
    }
    
  2. Implement flatMap on the nested type Func<A, B?>. It would have the signature:

    func flatMap<A, B, C>(
      _ f: @escaping (B) -> Func<A, C?>
      ) -> (Func<A, B?>) -> Func<A, C?> {
    
      fatalError("Implement me!")
    }
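A possible solution, assuming Func is the single-field function wrapper from earlier in the series (the stored property is named run here; yours may differ): run the outer function, and only chain into f when it actually produces a value.

```swift
struct Func<A, B> {
  let run: (A) -> B
}

func flatMap<A, B, C>(
  _ f: @escaping (B) -> Func<A, C?>
  ) -> (Func<A, B?>) -> Func<A, C?> {

  return { funcAB in
    Func<A, C?> { a in
      // Run the outer function; only if it yields a value do we
      // run the function `f` gives us, on the same input `a`.
      guard let b = funcAB.run(a) else { return nil }
      return f(b).run(a)
    }
  }
}
```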
    
  3. Implement flatMap on the nested type Parallel<A?>. It would have the signature:

    func flatMap<A, B>(
      _ f: @escaping (A) -> Parallel<B?>
      ) -> (Parallel<A?>) -> Parallel<B?> {
    
      fatalError("Implement me!")
    }
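A possible solution, assuming the callback-based definition of Parallel from earlier in the series (sketched below; your definition may differ): wait for the outer value, chain into f only when it is non-nil, and otherwise complete immediately with nil.

```swift
struct Parallel<A> {
  let run: (@escaping (A) -> Void) -> Void
}

func flatMap<A, B>(
  _ f: @escaping (A) -> Parallel<B?>
  ) -> (Parallel<A?>) -> Parallel<B?> {

  return { parallelOfOptionalA in
    Parallel<B?> { callback in
      parallelOfOptionalA.run { optionalA in
        // If the first computation produced nil, short-circuit;
        // otherwise hand its value to `f` and forward the callback.
        guard let a = optionalA else { return callback(nil) }
        f(a).run(callback)
      }
    }
  }
}
```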
    
  4. Do you see anything in common with all of the implementations in the previous 3 exercises? It turns out that if a generic type F<A> has a flatMap operation, then you can define a flatMap on F<A?> in a natural way.
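That natural definition can be sketched concretely with Array standing in for the generic F: flatMap over the outer container, and handle the inner nil by lifting it back into the container (for Array, the single-element array [nil] plays that lifting role):

```swift
// flatMap on the nested type [A?], derived from Array's own flatMap.
// The only extra ingredient is a way to lift a bare `nil` back into
// the container: for Array, that is [nil].
func flatMap<A, B>(_ f: @escaping (A) -> [B?]) -> ([A?]) -> [B?] {
  return { xs in
    xs.flatMap { optionalA -> [B?] in
      guard let a = optionalA else { return [nil] }
      return f(a)
    }
  }
}

let fanOut: (Int) -> [Int?] = { n in [n, n + 1] }
// flatMap(fanOut)([1, nil, 3]) == [1, 2, nil, 3, 4]
```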

  5. Implement flatMap on the nested type Func<A, Result<B, E>>. It would have the signature:

    func flatMap<A, B, C, E>(
      _ f: @escaping (B) -> Func<A, Result<C, E>>
      ) -> (Func<A, Result<B, E>>) -> Func<A, Result<C, E>> {

      fatalError("Implement me!")
    }
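A possible solution, assuming the Func wrapper from earlier in the series and Swift's standard Result (which requires E: Error; the episodes' own Result type has no such constraint): run the outer function, then flatMap on the inner Result, threading the same input through both steps.

```swift
struct Func<A, B> {
  let run: (A) -> B
}

func flatMap<A, B, C, E: Error>(
  _ f: @escaping (B) -> Func<A, Result<C, E>>
  ) -> (Func<A, Result<B, E>>) -> Func<A, Result<C, E>> {

  return { funcAB in
    Func<A, Result<C, E>> { a in
      // Run the outer function, then chain on the inner Result,
      // threading the same input `a` through both steps.
      funcAB.run(a).flatMap { b in f(b).run(a) }
    }
  }
}
```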
    

References

Railway Oriented Programming — error handling in functional languages

Scott Wlaschin • Wednesday Jun 4, 2014

This talk explains a nice metaphor to understand how flatMap unlocks stateless error handling.

When you build real world applications, you are not always on the “happy path”. You must deal with validation, logging, network and service errors, and other annoyances. How do you manage all this within a functional paradigm, when you can’t use exceptions, or do early returns, and when you have no stateful data?

This talk will demonstrate a common approach to this challenge, using a fun and easy-to-understand “railway oriented programming” analogy. You’ll come away with insight into a powerful technique that handles errors in an elegant way using a simple, self-documenting design.

A Tale of Two Flat‑Maps

Brandon Williams & Stephen Celis • Tuesday Mar 27, 2018

Up until Swift 4.1 there was an additional flatMap on sequences that we did not consider in this episode, but that’s because it doesn’t act quite like the normal flatMap. Swift ended up deprecating the overload, and we discuss why this happened in a previous episode:

Swift 4.1 deprecated and renamed a particular overload of flatMap. What made this flatMap different from the others? We’ll explore this and how understanding that difference helps us explore generalizations of the operation to other structures and derive new, useful code!

Monad (functional programming)

Well, the cat’s out of the bag. For the past 5 episodes, while we’ve been talking about flatMap, we were really talking about something called “monads.” Swift cannot (yet) fully express the idea of monads, but we can still leverage the intuition of how they operate.

This reference is to the Wikipedia page for monads, which is brief but informative.