PRACTICAL CATS
Sharing what I’ve learnt
ABOUT MYSELF
CATS….
To use Cats effectively, understand what each
construct does:
Functor
Monads
Applicatives
Monoids
Semigroups
Validated
SO MANY ACRONYMS !!!
Building blocks …
Understand that they are building blocks

so that you can write code that is pure and
code that has side-effects — separation of
concerns.
Typeclasses …
Each type class (e.g. functors, monoids,
monads, etc.) is governed by laws.

Typeclasses are behaviours that can be
“inherited” by your code.
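Since typeclass instances are resolved through implicits, here is a minimal sketch (not from the slides) of how a behaviour gets “inherited”: a hand-rolled Show typeclass, an instance for Int, and a summoner. The names are illustrative only, not cats.Show.

// A hand-rolled typeclass; illustrative only, not cats.Show.
trait Show[A] {
  def show(a: A): String
}

object Show {
  // Provide ("inherit") the behaviour for your own type via an implicit instance.
  implicit val intShow: Show[Int] = new Show[Int] {
    def show(a: Int): String = s"Int($a)"
  }

  // Summon whichever instance is in implicit scope.
  def show[A](a: A)(implicit ev: Show[A]): String = ev.show(a)
}

// Show.show(42)  // "Int(42)"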
Semigroups - what are they?
trait Semigroup[A] {
  def combine(x: A, y: A): A
}

A general structure to define things that can be combined.

*Cats provides “default” implementations; developers (like you & me)
need to provide implementations that conform to the traits.*
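As a small illustration of the “you provide the instance” point, here is a minimal sketch assuming a hypothetical Stats type (not from the slides); once the instance is in scope, Cats’ |+| syntax works for it.

import cats.Semigroup
import cats.implicits._

// Hypothetical Stats type, not from the slides.
case class Stats(requests: Int, errors: Int)

// Our own instance conforming to the Semigroup trait.
implicit val statsSemigroup: Semigroup[Stats] =
  new Semigroup[Stats] {
    def combine(x: Stats, y: Stats): Stats =
      Stats(x.requests + y.requests, x.errors + y.errors)
  }

// Stats(10, 1) |+| Stats(5, 0)  // Stats(15, 1)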
Monoids - what are they?
trait Monoid[A] extends Semigroup[A] {
  def empty: A
  def combine(x: A, y: A): A
}

A general structure to define things that can be combined and that
have a “default” (empty) element.

*Cats provides “default” implementations; developers (like you & me)
need to provide implementations that conform to the traits.*
Monoids - what are they?
> import cats._, data._, implicits._

> Monoid[String].combine("hi", "there")
// res0: String = "hithere"

> "hi" |+| "there"
// res1: String = "hithere"
Use case for Monoids/Semigroups
They’re good for combining 2 or more things of a similar
nature

(Diagram: a data stream from an end-point is parsed into data-type-a or
data-type-b, and a collector combines values of either type.)
Use case #1 - Monoids for “smashing” values
* all names used here do not reflect the actuals *
// Monoid[DataTypeAB] defined somewhere else
def buildDataFromStream(datatypeA : DataTypeA,
                        datatypeB : DataTypeB,
                        accumulator: DataTypeAB) =
  validateData(datatypeA, datatypeB).fold(
    onError => {
      // `onError` is lifted into the datatype
      val errors = Monoid[DataTypeAB].empty.copy(lift(onError))
      Monoid[DataTypeAB].combine(accumulator, errors)
    },
    parsedValue => {
      // `parsedValue` is lifted into the datatype
      val newValue = Monoid[DataTypeAB].empty.copy(lift(parsedValue))
      Monoid[DataTypeAB].combine(accumulator, newValue)
    }
  )
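A minimal sketch of what such an accumulator Monoid could look like; DataTypeAB’s fields here are assumptions for illustration, not the actuals from the slide.

import cats.Monoid
import cats.implicits._

// Hypothetical accumulator type; the fields are illustrative only.
case class DataTypeAB(values: List[String], errors: List[String])

implicit val abMonoid: Monoid[DataTypeAB] = new Monoid[DataTypeAB] {
  def empty: DataTypeAB = DataTypeAB(Nil, Nil)
  def combine(x: DataTypeAB, y: DataTypeAB): DataTypeAB =
    DataTypeAB(x.values ++ y.values, x.errors ++ y.errors)
}

// "Smashing" a stream of partial results into one accumulator:
// List(DataTypeAB(List("a"), Nil), DataTypeAB(Nil, List("bad record"))).combineAll
// // DataTypeAB(List(a), List(bad record))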
Functors - what are they?
trait Functor[F[_]] {
  def map[A,B](fa: F[A])(f: A => B): F[B]
}

A general structure to represent something that can be mapped over.
If you’ve been using Lists, Options, Eithers, Futures in Scala,
you’ve been using functors.

!!! They are very common structures indeed ☺ !!!
* Functors are used in clever things like recursion-schemes *
Functors - what are they?
> import cats._, data._, implicits._

> Functor[List].lift((x:Int) => x + 1)

// res0: List[Int] => List[Int]

> res0(List(1))

// res1: List[Int] = List(2)
* Nugget of info: Functors preserve “structure” *
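A minimal sketch (not from the slides) of supplying a Functor instance for your own container; map and lift then come from Cats for free. Box is a hypothetical type used only for illustration.

import cats.Functor

// Hypothetical single-value container.
case class Box[A](value: A)

implicit val boxFunctor: Functor[Box] = new Functor[Box] {
  def map[A, B](fa: Box[A])(f: A => B): Box[B] = Box(f(fa.value))
}

// Functor[Box].map(Box(1))(_ + 1)              // Box(2)
// Functor[Box].lift((x: Int) => x + 1)(Box(1)) // Box(2)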
Monads
Monads are meant for 

sequencing computations
Monads
someList.flatMap(element =>
  someOtherList.map(element2 =>
    (element, element2)))

*No tuples are generated if either “someList”
or “someOtherList” is empty.*
Monads
someList.flatMap(element =>
  someOtherList.map(element2 =>
    (element, element2)))

Monads allow for short-circuiting of
computations.
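For instance, a minimal sketch with Option (not from the slides): the first None stops the rest of the for-comprehension. parsePort and the port values are illustrative assumptions.

// Hypothetical parsing step; None short-circuits the rest of the comprehension.
def parsePort(s: String): Option[Int] = scala.util.Try(s.toInt).toOption

val rawPort: Option[String] = Some("8080")

val port: Option[Int] = for {
  raw <- rawPort
  p   <- parsePort(raw)                  // a None here skips the remaining steps
  _   <- if (p > 0) Some(()) else None   // so does a failed check
} yield p
// port: Option[Int] = Some(8080)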
Monads - a quick summary?
Writers - information can be carried along with the computation.

Readers - compose operations that depend on some input.

State - allows state to be “propagated”.

Eval - abstracts over eager vs lazy evaluation.
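Eval is the only one of these not shown later in the deck, so here is a minimal sketch of its three evaluation strategies (now, later, always).

import cats.Eval

// now: evaluated eagerly, once.  later: lazily, memoised.  always: lazily, every time.
val eager  = Eval.now    { println("computed now"); 1 }
val cached = Eval.later  { println("computed once"); 2 }
val fresh  = Eval.always { println("computed each time"); 3 }

// eager.value                   // 1 ("computed now" was printed at construction)
// cached.value + cached.value   // prints "computed once" a single time
// fresh.value;  fresh.value     // prints "computed each time" twice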
Monads - examples
> List(1,2,3) >>= (value => List(value+1))

> res0: List[Int] = List(2,3,4)
def >>=[A,B](fa: F[A])(f:A => F[B]): F[B] =
flatMap(fa)(f)
“>>=” is also known as “bind” (in Cats, it’s really “flatMap”).
Monads - examples
> Monad[List].lift((x:Int) => x + 1)(List(1,2,3))

> res1: List[Int] = List(2,3,4)
Typeclasses allow you to define reusable code by
lifting functions.
Monads - examples
> Monad[List].pure(4)

> res2: List[Int] = List(4)
pure lifts a value into a context, in this case a
monadic context.
Monads - flow
scala> def first = Reader( (x:Int) => Monad[List].ifM(List(true,true))(x::Nil, Nil) )

first: cats.data.Reader[Int,List[Int]]

scala> def second = Reader( (x:Int) => Monad[List].ifM(List(true,true))(x::Nil, Nil) )

second: cats.data.Reader[Int,List[Int]]

scala> for { g <- first(4); gg <- second(g) } yield gg

res21: List[Int] = List(4, 4, 4, 4)
Writer Monad
“Writers” are typically used to carry not only the value
of a computation but also some other information (typically
logging info).
source: http://eed3si9n.com/herding-cats/Writer.html
Writer Monad
scala> def logNumber(x: Int): Writer[List[String], Int] =
         Writer(List("Got number: " + x.show), 3)
logNumber: (x: Int)cats.data.Writer[List[String],Int]
scala> def multWithLog: Writer[List[String], Int] =
         for {
           a <- logNumber(3)
           b <- logNumber(5)
         } yield a * b
multWithLog: cats.data.Writer[List[String],Int]
scala> multWithLog.run
res2: cats.Id[(List[String], Int)] = (List(Got number: 3, Got number: 5),9)
scala> multWithLog.reset
res6: cats.data.WriterT[cats.Id,List[String],Int] = WriterT((List(),9))
scala> multWithLog.swap
res8: cats.data.WriterT[cats.Id,Int,List[String]] = WriterT((9,List(Got number: 4, Got
number: 3)))
scala> multWithLog.value
res9: cats.Id[Int] = 9
scala> multWithLog.written
res10: cats.Id[List[String]] = List(Got number: 4, Got number: 3)
source: http://eed3si9n.com/herding-cats/Writer.html
Compose Writers
Reader Monad
“Readers” allow us to compose operations which depend
on some input.
source: http://eed3si9n.com/herding-cats/Reader.html
Reader Monad
case class Config(setting: String, value: String)

def getSetting = Reader { (config: Config) => config.setting }

def getValue = Reader { (config: Config) => config.value }

for {
  s <- getSetting
  v <- getValue
} yield Config(s, v)
Compose Readers
FP-style to abstract and
encapsulate.
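A minimal sketch of actually running the composed Reader above (Config, getSetting and getValue are as defined on the slide); the Config value passed to run is just an illustrative assumption.

import cats.data.Reader

val combined: Reader[Config, Config] =
  for {
    s <- getSetting
    v <- getValue
  } yield Config(s, v)

// combined.run(Config("timeout", "30s"))  // Config(timeout,30s)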
State Monad
Allows us to pass state-information around in a computation.
http://eed3si9n.com/herding-cats/State.html
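A minimal sketch of State in action (not from the slides): a counter whose state is threaded through a for-comprehension and only supplied when the computation is run.

import cats.data.State

// Each step returns the new state and the value produced at that step.
val nextId: State[Int, Int] = State(counter => (counter + 1, counter))

val threeIds: State[Int, (Int, Int, Int)] =
  for {
    a <- nextId
    b <- nextId
    c <- nextId
  } yield (a, b, c)

// threeIds.run(0).value  // (3, (0, 1, 2)) : final state plus the yielded ids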
Use case #3 - Reader + State Monad
def process: Reader[Elem, Seq[Mapping]] = Reader { (xml: Elem) =>
  for {
    groups   <- extractGroups(xml).toOption
    group    <- groups
    grpCfg   <- loadGroupConfig(group).toOption
    stateObj <- ConfigState(grpCfg).pure[Option]
    records  <- loadRecords(group).toOption
    record   <- records
    row      <- processRecord(i)(stateObj)(record).pure[Option]
  } yield {
    // processing …
  }
}

case class ConfigState(init: Config) {
  private[this] var currentState: Config = init

  def storeCfg: State[Config, Config] =
    State { (cfg: Config) =>
      val prevState = currentState
      currentState = cfg
      (currentState, prevState)
    }

  def loadCfg: Config =
    (for {
      s <- State.get[Config]
    } yield s).runA(currentState).value
}
Use case #3 - Reader + State Monad
def process: Reader[Elem, Seq[Mapping]] = Reader {
(xml: Elem) =>
for {
groups <- extractGroups(dataXml).toOption
group <- groups
grpCfg <- loadGroupConfig(group).toOption
stateObj <- ConfigState(grpCfg).pure[Option]
records <- loadRecords(group).toOption
record <- records
row <- processRecord(i)(stateObj)(record).pure[Option]
} yield {
// processing …
}
}
case class ConfigState(init: Config) {
private[this] var currentState: Config = init
def storeCfg : State[Config, Config] =
State{ (cfg: Config) =>
val prevState = currentState
currentState = cfg
(currentState, prevState) }
def loadCfg : Config =
( for {
s <- State.get[Config]
} yield s ).runA(currentState).value
}
Separation of concerns
State management
Applicative
Applicatives allow functions to be lifted over a
structure (a Functor).

Because both the function and the value it’s being applied to
sit inside a structure, the two structures need to be
combined.
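Concretely, a minimal sketch (not from the slides) using ap, which applies a function that is itself inside the structure to a value inside the structure.

import cats.Applicative
import cats.implicits._

// The function lives inside a List, and so does the value it is applied to.
val fs: List[Int => Int] = List((x: Int) => x + 1)

Applicative[List].ap(fs)(List(1, 2, 3))
// res: List[Int] = List(2, 3, 4)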
Applicative - examples
scala> Applicative[List].lift((x: Int) => x + 1)
res1: List[Int] => List[Int] = <function1>

scala> Applicative[List].lift(
     |   (x: List[Int => Int]) => x.map(f => f(2)))(
     |   List(List((x: Int) => x + 1)))
res7: List[List[Int]] = List(List(3))

scala> val fs = List(List((x: Int) => x + 1))
fs: List[List[Int => Int]] = List(List(<function1>))

scala> Nested(fs).map(_(2))
res15: cats.data.Nested[List,List,Int] = Nested(List(List(3)))
Applicative is like a Functor
Applying a function which is nested.
Cats has “Nested” to achieve the same.
Applicative - examples
A typical application is to leverage Applicatives when writing
logic to validate configurations, forms, etc.
import cats.Cartesian
import cats.data.Validated
import cats.instances.list._ // Semigroup for List
type AllErrorsOr[A] = Validated[List[String], A]
Cartesian[AllErrorsOr].product(
Validated.invalid(List("Error 1")),
Validated.invalid(List("Error 2")) )
// res1: AllErrorsOr[(Nothing, Nothing)] = Invalid(List(Error 1,Error 2))
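Note that Cartesian comes from an older Cats release; in Cats 1.0 and later it was renamed Semigroupal, and the tupled/mapN syntax is the usual way to accumulate errors. A minimal sketch of the same example in that style:

import cats.data.Validated
import cats.implicits._

type AllErrorsOr[A] = Validated[List[String], A]

// Both sides are invalid, so the List[String] errors accumulate via its Semigroup.
val checked: AllErrorsOr[(Int, Int)] =
  (Validated.invalid[List[String], Int](List("Error 1")),
   Validated.invalid[List[String], Int](List("Error 2"))).tupled
// checked: Invalid(List(Error 1, Error 2))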
Applicative - examples
package xxx.config

import scala.concurrent.duration.{Duration, FiniteDuration}
import cats._
import cats.data._
import cats.implicits._
import cats.data.Validated
import cats.data.Validated.{Invalid, Valid}

// code that needs to remain hidden

sealed abstract class ConfigError
final case class MissingConfig(field: String) extends ConfigError
final case class ParseError(field: String) extends ConfigError

case class Config(map: Map[String, String])

case class HuffConfig(
  clusterName: String,
  clusterPort: Int,
  clusterAddress: String,
  hostname: String,
  listeningPort: Int)

object Validator {
  def getHuffConfig(config: Config): ValidatedNel[ConfigError, HuffConfig] =
    Apply[ValidatedNel[ConfigError, ?]].map5(
      config.parse[String]("DL_CLUSTER_NAME").toValidatedNel,
      config.parse[Int]("DL_CLUSTER_PORT").toValidatedNel,
      config.parse[String]("DL_CLUSTER_ADDRESS").toValidatedNel,
      config.parse[String]("DL_HTTP_ADDRESS").toValidatedNel,
      config.parse[Int]("DL_HTTP_PORT").toValidatedNel) {
      case (clusterName, clusterPort, clusterAddress, httpAddr, httpPort) =>
        HuffConfig(clusterName, clusterPort, clusterAddress, httpAddr, httpPort)
    }
}
Define types to represent “errors”.
Validate and read into the configuration object.
Validation logic.
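The config.parse[T] helper used in getHuffConfig is part of the “code that needs to remain hidden”, so it is not shown. Purely as a hypothetical sketch (all names and behaviour here are assumptions, not the author’s actual helper), it might look like an extension method that reads the Config map and reports ConfigError values as Either, which toValidatedNel then lifts:

// Hypothetical only: the real parse helper is hidden on the slide.
// Uses the Config, MissingConfig and ParseError types defined above.
trait FieldParser[A] { def parse(raw: String): Option[A] }

object FieldParser {
  implicit val stringParser: FieldParser[String] = new FieldParser[String] {
    def parse(raw: String): Option[String] = Some(raw)
  }
  implicit val intParser: FieldParser[Int] = new FieldParser[Int] {
    def parse(raw: String): Option[Int] = scala.util.Try(raw.toInt).toOption
  }
}

implicit class ConfigOps(config: Config) {
  // Missing key => MissingConfig, unparseable value => ParseError.
  def parse[A](field: String)(implicit p: FieldParser[A]): Either[ConfigError, A] =
    config.map.get(field) match {
      case None      => Left(MissingConfig(field))
      case Some(raw) => p.parse(raw).toRight(ParseError(field))
    }
}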
How does one create a stack of Monads?
Monad Transformers
Let’s take a closer look
scala> case class Cat(name: String, alive: Boolean)
defined class Cat
scala> def isAlive = Reader{ (c: Cat) => if (c.alive) c.asRight[Throwable].toOption :: Nil
     |   else scala.util.Try(throw new Exception("Dead!")).asLeft[Cat].toOption :: Nil }
isAlive: cats.data.Reader[Cat,List[Option[Cat]]]
scala> def lookup = Cat("cat", true).some::Nil
lookup: List[Option[Cat]]
scala> for {
| someCat <- lookup
| } yield {
| for {
| cat <- someCat
| } yield isAlive(cat)
|}
res47: List[Option[cats.Id[List[Option[Cat]]]]] = List(Some(List(Some(Cat(cat,true)))))
Let’s say we’d like to look up a cat and find out whether it’s alive.
We would use Option[Cat] to say whether we can find one, and perhaps
Either[Throwable, Cat] to represent the outcome: when the cat is dead we
return an exception, otherwise we return the Cat.
First Attempt
Let’s take a closer look
scala> case class Cat(name: String, alive: Boolean)
defined class Cat
scala> def isAlive =
| Reader{ (u: Cat) => if (u.alive) OptionT( u.asRight[Throwable].toOption:: Nil)
| else OptionT( scala.util.Try(throw new Exception("Dead!")).asLeft[Cat].toOption::Nil) }
isAlive: cats.data.Reader[Cat,cats.data.OptionT[List, Cat]]
scala> def lookup = OptionT(Cat("cat", true).some::Nil)
lookup: cats.data.OptionT[List, Cat]
scala> for {
| cat <- lookup
| checked <- isAlive(cat)
| } yield checked
res32: cats.data.OptionT[List, Cat] = OptionT(List(Some(Cat(cat,true))))
The nested for-yield loops can quickly get very confusing …
that’s where Monad Transformers help!
Second Attempt
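A minimal sketch of getting back out of the transformer; it reuses lookup and isAlive from the slide above, and OptionT’s value unwraps the stack again.

// Reuses lookup and isAlive from the slide above.
val checkedCat: cats.data.OptionT[List, Cat] =
  for {
    cat     <- lookup
    checked <- isAlive(cat)
  } yield checked

// checkedCat.value  // List[Option[Cat]] = List(Some(Cat(cat,true)))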
Effectful Monads aka Eff-Monads

An alternative to Monad Transformers
http://atnos-org.github.io/eff/
Use-case #4
Putting in the type definitions: making use of the
Reader, Writer, and Either effects from Eff!
import xxx.workflow.models.{WorkflowDescriptor, Service}
import scala.language.{postfixOps, higherKinds}
import org.atnos.eff._, all._, syntax.all._
import com.typesafe.config._
import com.typesafe.scalalogging._
class LoadWorkflowDescriptorEff {
import cats._, data._, implicits._
import io.circe._, io.circe.generic.auto._, io.circe.parser._, io.circe.syntax._
lazy val config = ConfigFactory.load()
lazy val logger = Logger(getClass)
type WorkflowIdReader[A] = Reader[String, A]
type WriterString[A] = Writer[String,A]
type DecodeFailure[A] = io.circe.DecodingFailure Either A
type ParseFailure[A] = io.circe.ParsingFailure Either A
// ...
}
import java.time._
type LoadDescStack =
Fx.fx6[WorkflowIdReader, WriterString, DecodeFailure, ParseFailure, Throwable Either ?, Eval]
def loadDescriptor : Eff[LoadDescStack, WorkflowDescriptor] =
for {
workflowId <- ask[LoadDescStack,String]
_ <- tell[LoadDescStack,String](s"[${Instant.now()}] About to load data about workflow: $workflowId")
contents <- fromEither[LoadDescStack,java.lang.Throwable,String](loadContents(workflowId))
_ <- tell[LoadDescStack,String](s"[${Instant.now()}] Data is loaded from storage: $contents")
json <- fromEither[LoadDescStack,io.circe.ParsingFailure,io.circe.Json](parse(contents))
_ <- tell[LoadDescStack, String](s"[${Instant.now()}] Workflow descriptor parsed successfully")
desc <- fromEither[LoadDescStack, io.circe.DecodingFailure, WorkflowDescriptor](json.as[WorkflowDescriptor])
_ <- tell[LoadDescStack, String](s"[${Instant.now()}] Workflow descriptor hydrated into object.")
} yield desc
// Below is a test and you can choose either runEval or attemptEval
// attemptEval is a better option as it captures any errors met during the
// computation.
//println(loadDescriptor.runReader("1").runWriter.runEither.runEither.runEither.runPure)
lazy val result = {
val a = loadDescriptor.runReader("1").runWriter.runEither.runEither.runEither.runPure
val t = a.get
t.joinRight
}
// the logging version
lazy val result2 = {
val a = loadDescriptor.runReader("1").runWriterLog.runEither.runEither.runEither.runPure
val t = a.get
t.joinRight
}
}
Eff-Monads allow us to stack computations.
Learning resources
https://www.haskell.org/tutorial/monads.html
http://eed3si9n.com/herding-cats/
http://typelevel.org/cats/
http://blog.higher-order.com/
https://gitter.im/typelevel/cats
That’s it from me :)
Questions ?
