
Functional Programming

Presented by Vytautas Šaltenis / @rtfb

Intro and agenda

  • What FP is and isn't
  • Some examples in various lingos
  • Motivation
  • Patterns
  • More patterns
  • Then a workshop

What is functional programming?

Not procedural

Not imperative

Not object-oriented

What is functional programming?

Intuitive explanation:

Break down your problem in such a way that it could be solved by functions and functions only. Like this:

f(g(h(i(initial_data))))
=> final_result

No ifs, no loops, no variables, no nothing. Just functions.

Purity

...but most importantly, no side effects!

Every function should be mathematically pure, taking parameters and producing its output independently from anything else in the world.
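
A minimal Python sketch (made-up names, not from the talk) of the difference: the impure version reads and mutates state outside itself, the pure one depends only on its argument.

counter = 0

def impure_inc(x):
    global counter
    counter += 1          # side effect: mutates state outside the function
    return x + counter    # result depends on hidden state, not just x

def pure_inc(x):
    return x + 1          # same input always yields the same output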

Full purity is impossible

Because I/O is also a side effect!

Random fact:

FP is based on the lambda calculus, invented in the 1930s

Alonzo Church

Random fact 2:

First implemented in Lisp in 1958

John McCarthy

Random fact 3:

The purest FP language I know is Haskell

Simon Peyton Jones

Let's see some Python code:

def triplicate(lst):
    return map(lambda ch: ch*3, lst)

triplicate(['a', 'b', 'c'])
=> ['aaa', 'bbb', 'ccc']

And now in JavaScript:

function triplicate(arr) {
    return arr.map(function(item) {
        return item + item + item;
    });
}

triplicate(['a', 'b', 'c']);
=> ["aaa", "bbb", "ccc"]

And in Lisp, just for the kicks :-)

(defun triplicate (list)
  (mapcar (lambda (item)
            (concatenate 'string item item item))
          list))

(triplicate '("a" "b" "c"))
=> ("aaa", "bbb", "ccc")

Why bother?

Parallelism!

Abstractions

Formal provability

Modularity

Composability

Lazy evaluation

Unix

Parallelism

A modern GPU can run ~14k threads at once!

A modern server farm can do much more

Yes, you will use server farms, most likely in the cloud

And it's pretty much impossible to program either of the above without a functional approach
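
A small sketch of why purity matters here, using Python's standard multiprocessing module (the triple function is made up): because triple has no side effects, swapping the ordinary map for a parallel one changes nothing else in the program.

from multiprocessing import Pool

def triple(ch):
    # pure: no shared state, so it is safe to run on any worker
    return ch * 3

if __name__ == '__main__':
    pool = Pool(4)
    print(pool.map(triple, ['a', 'b', 'c']))
    # => ['aaa', 'bbb', 'ccc']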

Abstractions

// Spell the iteration out by hand:
vector<string> vs{"a", "b", "c"};
for (vector<string>::iterator it = vs.begin(); it != vs.end(); ++it) {
    do_something_with(*it + *it + *it);
}

// Or let a range-based for hide the iterator plumbing:
vector<string> vs{"a", "b", "c"};
for (auto &s : vs) {
    do_something_with(s + s + s);
}

What does this code do?

var arr = [3, 6, 2];
for (var i = 0; i < arr.length; ++i) {
    print(arr[i]);
}

What does this code do?

var arr = [3, 6, 2];
for (var i = arr.length - 1; i >= 0; --i) {
    print(arr[i]);
}

Why not write it like this?

var arr = [3, 6, 2];
arr.reverse().forEach(function(item) {
    print(item);
});

Formal provability

The idea is to construct a formal proof of correctness with the help of invariants.

Works for other programming styles as well, but FP makes it easier

Not as practical as some researchers would like us to believe :-)

Modularity

Since FP imposes some draconian constraints, it becomes necessary to break things down into smaller pieces. And smaller pieces are better pieces.

Composability

Remember?

f(g(h(i(initial_data))))
=> final_result

This is more useful than it seems at first glance, since you can recompose functions taken from other places. You can do that with other code too, but FP makes it easier, as long as you have uniform data structures.
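
A small Python sketch of what such recomposition can look like (the compose helper is made up, not from the talk):

from functools import reduce

def compose(*funcs):
    # compose(f, g, h)(x) == f(g(h(x)))
    return reduce(lambda f, g: lambda x: f(g(x)), funcs)

shout = compose(str.upper, str.strip)
shout('  hello  ')
=> 'HELLO'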

Lazy evaluation

Infinite data structures, generators and such...

Found in Python, C#, F#, Haskell and elsewhere.
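
For example, a Python generator expression over an infinite sequence (a small sketch using itertools): nothing is computed until something actually asks for values.

from itertools import count, islice

squares = (n * n for n in count())   # an infinite, lazy sequence; nothing computed yet
list(islice(squares, 5))
=> [0, 1, 4, 9, 16]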

Unix

$ tr 'A-Z' 'a-z' < ~/shakespeare.txt | tr -sc 'A-Za-z' '\n' | \
    sort | uniq -c | sort -n -r | head
  27843 the
  26847 and
  22538 i
  19882 to
  18307 of
  14800 a
  13928 you
  12490 my
  11563 that
  11183 in
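
Roughly the same pipeline sketched in Python (the file name is made up): each stage is just a function applied to the previous stage's output.

import re
from collections import Counter

text = open('shakespeare.txt').read().lower()
words = re.findall('[a-z]+', text)          # split into lowercase words
for word, count in Counter(words).most_common(10):
    print(count, word)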

Algorithms and patterns

Map

Filter

Zip

Reduce

Map

Takes a function and a list, produces a new list:

map(chr, [79, 72, 65, 73, 33])
=> ['O', 'H', 'A', 'I', '!']

Average age of people?

person = {
    name: 'John',
    surname: 'Foo',
    age: 27,
    height: 172,
    weight: 70.2,
    eye_color: '#ff00ff',
    ...
}
people = [...]

Explicit loop

total = 0
for p in people:
    total += p.age
total / len(people)

sum()

sum([1, 2, 3, 4, 5])
=> 15

Map

sum(map(lambda p: p.age, people)) / len(people)

Lambda??

lambda argument: do_something_with(argument)

def func(argument):
    return do_something_with(argument)

Could be written as:

def get_age(person):
    return person.age

sum(map(get_age, people)) / len(people)

How does map() work?

def map(fn, lst):
    result = []
    for i in lst:
        result.append(fn(i))
    return result

Average age of adults?

person = {
    name: 'John',
    surname: 'Foo',
    age: 27,
    height: 172,
    weight: 70.2,
    eye_color: '#ff00ff',
    ...
}
people = [...]

Adults?

adults = pick_adults(people)

Filter

adults = filter(lambda p: p.age >= 18, people)

Filter and sum

adults = filter(lambda p: p.age >= 18, people)
sum(map(lambda p: p.age, adults)) / len(adults)

How does filter() work?

def filter(predicate, lst):
    result = []
    for i in lst:
        if predicate(i):
            result.append(i)
    return result

Predicates

In general, a function that takes an object and returns a boolean answer about it is called a predicate

In Lisp there's a convention to name predicates with a -p suffix:

adultp
numberp
listp
standard-char-p
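
In the deck's Python pseudocode, a named predicate is just a function you can hand to filter():

def is_adult(person):
    # a predicate: takes a person, answers a yes/no question about it
    return person.age >= 18

adults = filter(is_adult, people)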

Side story: AoS vs. SoA

AoS: Array of Structs

SoA: Struct of Arrays
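
A tiny illustration of the two layouts (made-up data); zip on the next slide is what bridges them.

# AoS: one record per person
people_aos = [
    {'name': 'Luke', 'age': 21},
    {'name': 'Leia', 'age': 18},
    {'name': 'Han', 'age': 29},
]

# SoA: one list per field
people_soa = {
    'name': ['Luke', 'Leia', 'Han'],
    'age': [21, 18, 29],
}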

Zip

names = ['Luke', 'Leia', 'Han']
ages = [21, 18, 29]
zip(names, ages)
=> [('Luke', 21), ('Leia', 18), ('Han', 29)]

Usually used in conjunction with something else:

map(process_person, zip(names, ages))

Reduce

aka fold

Takes a list and reduces it to a single value

We've seen one special case already:

sum([21, 18, 29])

How could it work?

def sum(lst):
    acc = 0
    for i in lst:
        acc += i
    return acc

But it could be more general:

import operator
reduce(operator.add, [21, 18, 29])

How could reduce work?

def my_reduce(func, lst):
    if len(lst) < 2:
        return lst[0]
    return func(lst[0], my_reduce(func, lst[1:]))

does roughly this:

func(a, func(b, func(c, func(d, ...))))

More useful example:

input_list = [[1, 2, 3, 4, 5], [2, 3, 4, 5, 6], [3, 4, 5, 6, 7]]
reduce(set.intersection, map(set, input_list))
=> set([3, 4, 5])

Tail recursion

What do you think happens if I call this?

my_reduce(operator.add, range(1000))

That's what :-)

File "mr.py", line 7, in my_reduce
return func(lst[0], my_reduce(func, lst[1:]))
File "mr.py", line 7, in my_reduce
return func(lst[0], my_reduce(func, lst[1:]))
File "mr.py", line 7, in my_reduce
return func(lst[0], my_reduce(func, lst[1:]))
RuntimeError: maximum recursion depth exceeded

So ideally a functional language should have tail-call optimization. Python doesn't. (And note that my_reduce above isn't even in tail form: func still wraps the recursive call, so its stack frame couldn't be discarded anyway.)
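
For comparison, a sketch of a tail-recursive version with an explicit accumulator; here the recursive call is the very last thing that happens, so a language with tail-call optimization could run it in constant stack space (Python still can't, so it hits the same limit):

def my_reduce_tail(func, lst, acc):
    if not lst:
        return acc
    # tail call: nothing is left to do after the recursive call returns
    return my_reduce_tail(func, lst[1:], func(acc, lst[0]))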

Recap

  • Higher order abstractions
  • Tighter code
  • Robust code
  • Parallel-ready code
  • Fun! code :-)

Questions?
