19. Simulations

19.1. Monte Carlo Estimation
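
A classic first exercise is to estimate π by drawing random points in the unit square and counting how many fall inside the quarter circle. Here is a minimal sketch (the function name and the number of samples are illustrative choices):

    import random

    def estimate_pi(n_samples=1_000_000):
        """Estimate pi by sampling points uniformly in the unit square."""
        inside = 0
        for _ in range(n_samples):
            x, y = random.random(), random.random()
            if x * x + y * y <= 1.0:   # the point falls inside the quarter circle
                inside += 1
        return 4 * inside / n_samples

    if __name__ == "__main__":
        print(estimate_pi())           # prints a value close to 3.1416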

19.2. Fractals

Fractals are figures that are self-similar at several scales.

  • Write a script that displays the Koch snowflake

    Hints:

    • use the turtle module

    • use recursion

    My solution: koch0.py
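
If you want to check your approach before looking at koch0.py, here is one possible recursive sketch using the turtle module (the function names, lengths and recursion depth are illustrative choices, not the course solution):

    import turtle

    def koch(t, length, depth):
        """Draw one Koch curve: four smaller copies of itself with 60-degree turns."""
        if depth == 0:
            t.forward(length)
            return
        koch(t, length / 3, depth - 1)
        t.left(60)
        koch(t, length / 3, depth - 1)
        t.right(120)
        koch(t, length / 3, depth - 1)
        t.left(60)
        koch(t, length / 3, depth - 1)

    def snowflake(t, length, depth):
        """The snowflake is three Koch curves arranged as a triangle."""
        for _ in range(3):
            koch(t, length, depth)
            t.right(120)

    if __name__ == "__main__":
        t = turtle.Turtle()
        t.speed(0)
        t.penup(); t.goto(-150, 90); t.pendown()
        snowflake(t, 300, 3)
        turtle.done()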

19.3. Formal systems

19.3.1. MU Puzzle

A seminal book in Cognitive Science is Gödel, Escher, Bach: An Eternal Golden Braid by Douglas Hofstadter. Its main topic is recursion and self-reference (see also I Am a Strange Loop by the same author).

According to Hofstadter, the formal system that underlies all mental activity transcends the system that supports it. If life can grow out of the formal chemical substrate of the cell, if consciousness can emerge out of a formal system of firing neurons, then so too will computers attain human intelligence. Gödel, Escher, Bach is a wonderful exploration of fascinating ideas at the heart of cognitive science: meaning, reduction, recursion, and much more (from https://medium.com/@alibedirhan.d/mu-puzzle-f651ef3957c5)

The book is filled with puzzles, including Hofstadter’s famous MU puzzle. The MU puzzle involves a simple formal system called MIU.

A starting string, MI, is given. Four rules for changing the string of characters into a new one are provided (see below). At each step, the current string can be transformed into a new string by applying one of the four rules. Note that the rules are one-way! If several rules are applicable, nothing dictates which one you should use; it is up to you. Here are the rules:

  1. If you possess a string whose last letter is I, you can add on a U at the end. For example, MIUI can be rewritten MIUIU. This rule can be written xI -> xIU, where x represents any string.

  2. Suppose you have Mx. Then you may rewrite it as Mxx. For example, from MIU you may get MIUIU (here x = IU, so Mxx = MIUIU); from MUM you may get MUMUM; from MU you may get MUU; …

  3. If III occurs in one of the strings, you may make a new string with U in place of III. For example,

    From UMIIIMU, you could make UMUMU; From MIIII, you could make MIU (also MUI). From IIMII, you can’t get anywhere using this rule because the three I’s have to be consecutive.

  4. If UU occurs inside one of your strings, you can drop it. From UUU, you get U. From MUUUIII, you get MUIII.

The MU puzzle asks whether, starting from the string MI, there exists a derivation, that is, a sequence of applications of the rules, that yields the string MU.

Exercise: Write a Python script that explores the set of strings generated by this formal system: study the length of the string produced after n steps (run this process several times and compute a histogram).
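
One way to structure such a script is sketched below: compute the rules applicable to the current string, apply one chosen at random, repeat for n steps, and tally the resulting lengths (the function names and the text histogram are illustrative choices):

    import random
    from collections import Counter

    def applicable_moves(s):
        """Return every string reachable from s by one application of the MIU rules."""
        moves = []
        if s.endswith("I"):                      # rule 1: xI -> xIU
            moves.append(s + "U")
        if s.startswith("M"):                    # rule 2: Mx -> Mxx
            moves.append(s + s[1:])
        for i in range(len(s) - 2):              # rule 3: replace III with U
            if s[i:i + 3] == "III":
                moves.append(s[:i] + "U" + s[i + 3:])
        for i in range(len(s) - 1):              # rule 4: drop UU
            if s[i:i + 2] == "UU":
                moves.append(s[:i] + s[i + 2:])
        return moves

    def random_derivation(n_steps, start="MI"):
        """Apply n_steps randomly chosen rules and return the final string."""
        s = start
        for _ in range(n_steps):
            moves = applicable_moves(s)
            if not moves:                        # never happens when starting from MI
                break
            s = random.choice(moves)
        return s

    if __name__ == "__main__":
        lengths = [len(random_derivation(10)) for _ in range(1000)]
        for length, count in sorted(Counter(lengths).items()):
            print(f"{length:5d} {'#' * (count // 10)}")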

Then you may read:

19.3.2. Cellular Automata

Learn about Conway’s Game of Life. Watch this video and that one.

  • Implement an Elementary cellular automaton. The aim is to reproduce the graphics shown at the bottom of the previous page. You can take inspiration from the excellent Think Complexity by Allen B. Downey (a minimal sketch is also given after this list). My solution is at cellular-automata/1d-ca.py.

  • Implement the Game of Life in 2D.

  • Going further: If you enjoy cellular automata, you can read Stephen Wolfram’s A New Kind of Science. A more general book about complexity is Melanie Mitchell’s Complexity: A Guided Tour.
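
For the elementary cellular automaton, a minimal NumPy sketch could look like the following (the rule number, the grid size and the text rendering are illustrative choices):

    import numpy as np

    def step(row, rule):
        """Compute the next row of an elementary cellular automaton.

        rule is the Wolfram rule number (0-255); each cell's new state is looked up
        from the states of its left neighbour, itself and its right neighbour.
        """
        # Neighbourhood code 0..7 for every cell (the boundaries wrap around).
        codes = 4 * np.roll(row, 1) + 2 * row + np.roll(row, -1)
        table = np.array([(rule >> i) & 1 for i in range(8)], dtype=np.uint8)
        return table[codes]

    def run(rule=30, width=81, n_steps=40):
        """Return an (n_steps + 1, width) grid whose first row has a single live cell."""
        grid = np.zeros((n_steps + 1, width), dtype=np.uint8)
        grid[0, width // 2] = 1
        for t in range(n_steps):
            grid[t + 1] = step(grid[t], rule)
        return grid

    if __name__ == "__main__":
        for row in run():
            print("".join("#" if cell else " " for cell in row))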

19.3.3. Natural Language Parsing

Parsing refers to building the syntactic structure of a sentence from the linear sequence of words that compose it.
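
As a first experiment, and assuming you have the nltk package installed, you can parse a sentence with a toy context-free grammar (the grammar and the sentence below are purely illustrative):

    import nltk

    # A toy context-free grammar, just large enough to parse a short sentence.
    grammar = nltk.CFG.fromstring("""
    S -> NP VP
    NP -> Det N
    VP -> V NP
    Det -> 'the' | 'a'
    N -> 'dog' | 'cat'
    V -> 'chases' | 'sees'
    """)

    parser = nltk.ChartParser(grammar)
    sentence = "the dog chases a cat".split()
    for tree in parser.parse(sentence):
        tree.pretty_print()    # displays the syntactic structure as ASCII art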

19.4. Artificial Neural networks

19.4.1. Hopfield network

Read https://towardsdatascience.com/hopfield-networks-are-useless-heres-why-you-should-learn-them-f0930ebeadcd and do not look at the Jupyter notebook implementation provided by the author. Try to program a Hopfield network and teach it a few patterns. Only then, check the author’s solution.
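
Once you have made your own attempt, you can compare it with a minimal sketch of Hebbian storage and asynchronous recall such as the one below (an illustrative sketch, not the author’s notebook):

    import numpy as np

    def train(patterns):
        """Hebbian learning: store the given +1/-1 patterns in a weight matrix."""
        n = patterns.shape[1]
        w = np.zeros((n, n))
        for p in patterns:
            w += np.outer(p, p)
        np.fill_diagonal(w, 0)                    # no self-connections
        return w / len(patterns)

    def recall(w, state, n_iters=100):
        """Asynchronous updates until the state settles on a stored pattern."""
        rng = np.random.default_rng()
        state = state.copy()
        for _ in range(n_iters):
            i = rng.integers(len(state))          # pick one unit at random
            state[i] = 1 if w[i] @ state >= 0 else -1
        return state

    if __name__ == "__main__":
        patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                             [1, 1, 1, 1, -1, -1, -1, -1]])
        w = train(patterns)
        noisy = patterns[0].copy()
        noisy[0] *= -1                            # corrupt one bit
        print(recall(w, noisy))                   # should recover patterns[0]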

To go further you can read:

  • Ramsauer, Hubert, Bernhard Schäfl, Johannes Lehner, Philipp Seidl, Michael Widrich, Thomas Adler, Lukas Gruber, et al. 2020. “Hopfield Networks Is All You Need.” ArXiv:2008.02217 [Cs, Stat], December. http://arxiv.org/abs/2008.02217.

19.4.2. Perceptron

  1. Read about the Perceptron at https://medium.com/@thomascountz/perceptrons-in-neural-networks-dc41f3e4c1b9

  2. Implement a Perceptron in Python

For a solution, check: https://blog.dbrgn.ch/2013/3/26/perceptrons-in-python/
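
For comparison, a minimal perceptron trained on the logical AND function might look like the sketch below (the learning rate, the number of epochs and the AND task are illustrative choices):

    import numpy as np

    def train_perceptron(X, y, lr=0.1, epochs=20):
        """Classic perceptron learning rule on inputs X with 0/1 labels y."""
        w = np.zeros(X.shape[1])
        b = 0.0
        for _ in range(epochs):
            for xi, target in zip(X, y):
                pred = 1 if xi @ w + b > 0 else 0
                update = lr * (target - pred)     # zero when the prediction is correct
                w += update * xi
                b += update
        return w, b

    if __name__ == "__main__":
        # Logical AND is linearly separable, so the perceptron can learn it.
        X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
        y = np.array([0, 0, 0, 1])
        w, b = train_perceptron(X, y)
        print([1 if xi @ w + b > 0 else 0 for xi in X])   # expect [0, 0, 0, 1]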

19.4.3. Backpropagation

To understand the basics of artificial neural networks, I recommend that you first read https://victorzhou.com/blog/intro-to-neural-networks/ and then watch the four excellent videos at https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi . The last two focus on the backpropagation algorithm, which allows one to train a network to learn mappings.

Next, you can read and try to understand this implementation of the backpropagation algorithm.
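
To make the algorithm concrete, here is a minimal NumPy sketch of backpropagation for a tiny fully connected network trained on XOR (the layer sizes, learning rate and number of epochs are illustrative choices; this is not the implementation linked above):

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    # XOR is not linearly separable, so a hidden layer is needed.
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    # One hidden layer of 4 sigmoid units and one sigmoid output unit.
    W1 = rng.normal(size=(2, 4)); b1 = np.zeros((1, 4))
    W2 = rng.normal(size=(4, 1)); b2 = np.zeros((1, 1))
    lr = 1.0

    for epoch in range(10000):
        # Forward pass.
        hidden = sigmoid(X @ W1 + b1)
        output = sigmoid(hidden @ W2 + b2)

        # Backward pass: gradients of the squared error, propagated layer by layer.
        d_output = (output - y) * output * (1 - output)
        d_hidden = (d_output @ W2.T) * hidden * (1 - hidden)

        # Gradient descent step.
        W2 -= lr * hidden.T @ d_output
        b2 -= lr * d_output.sum(axis=0, keepdims=True)
        W1 -= lr * X.T @ d_hidden
        b1 -= lr * d_hidden.sum(axis=0, keepdims=True)

    print(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2).round(2))   # usually close to [[0], [1], [1], [0]]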

Then, see a modern and efficient implementation of neural networks: https://pytorch.org/tutorials/beginner/deep_learning_nlp_tutorial.html

More readings:

19.4.4. Using tensorflow and keras to build neural nets

The following book is a gentle introduction: https://www.goodreads.com/book/show/53483757-ai-and-machine-learning-for-coders
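
To get a first taste of the Keras API before reading it, here is a minimal sketch that fits a small classifier on toy data (the architecture and the toy task are illustrative choices; it assumes tensorflow is installed):

    import numpy as np
    import tensorflow as tf

    # Toy task: predict whether the sum of two numbers drawn in [0, 1) exceeds 1.
    X = np.random.rand(1000, 2).astype("float32")
    y = (X.sum(axis=1) > 1.0).astype("float32")

    model = tf.keras.Sequential([
        tf.keras.Input(shape=(2,)),
        tf.keras.layers.Dense(8, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
    model.fit(X, y, epochs=10, batch_size=32, verbose=0)
    print(model.evaluate(X, y, verbose=0))    # [loss, accuracy]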