EPFL

Algo+LMA

Report 3 on Phase Transition - 28 Apr 2010

Report 1 on Applications of Coding Theory in Neuroscience - 28 Apr 2010

Report 2 on Applications of Coding Theory in Neuroscience - 10 May 2010

Report 3 on Applications of Coding Theory in Neuroscience - 17 May 2010

Report 4 on Applications of Coding Theory in Neuroscience - 6 June 2010

Applications of Coding Theory in Biological Systems (PhD Candidacy Exam Report)

This is the report for my candidacy exam, in which I consider possible applications of coding theory in both genetics and neuroscience. As suggested by the exam committee members, I am going to focus on the applications in neuroscience, as they seem more promising than those in genetics. In particular, the relationship between coding theory and associative memory, a mechanism in the brain for storing and recalling memories in the presence of noise, will be investigated in the future.

Report 5 on Applications of Coding Theory in Neuroscience - 8 July 2010

The first part of this report addresses the other project, on the Taylor expansion of the entropy function. The rest summarizes an attempt to extend density evolution to a weighted bipartite graph with a neural update rule on both sides. It can be shown that if the weights are i.i.d. Gaussian random variables, one can obtain a recursive equation for the probability of error at any iteration in a bipartite Hopfield graph.
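The exact recursion derived in the report is not reproduced here; as a purely illustrative sketch, a density-evolution-style analysis of Hopfield-like recall often takes the form of a scalar recursion where the signal term scales with (1 - 2p_t) and the interference is Gaussian. The functional form below (and the parameters n, M) is an assumption for illustration, not the report's result:

```python
import math

def q_func(x):
    """Gaussian tail probability Q(x) = P(N(0,1) > x)."""
    return 0.5 * math.erfc(x / math.sqrt(2))

def iterate_error_prob(n, M, p0, iterations=10):
    """Illustrative density-evolution-style recursion (assumed form,
    NOT the report's exact equation): each node sees a signal term
    proportional to (1 - 2*p_t) against Gaussian interference whose
    relative strength is sqrt(M/n), giving
        p_{t+1} = Q((1 - 2*p_t) * sqrt(n / M)).
    """
    p = p0
    trace = [p]
    for _ in range(iterations):
        p = q_func((1 - 2 * p) * math.sqrt(n / M))
        trace.append(p)
    return trace

# Starting from 10% bit errors with n = 1000 nodes and M = 100 patterns,
# the iterated error probability shrinks toward a small fixed point.
trace = iterate_error_prob(n=1000, M=100, p0=0.1)
```

Iterating such a recursion numerically is how one reads off whether an initial error rate is inside the basin of attraction of the error-free fixed point.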

Report 6 on Applications of Coding Theory in Neuroscience - 1 September 2010

This report summarizes Berrou's recent papers on coded Hopfield networks and how to increase their storage capacity.

Report 7 on Applications of Coding Theory in Neuroscience - 13 September 2010

This report contains an analysis of Berrou's recent paper “Coded Hopfield Networks,” which was presented at ISTC 2010. In this paper, he and his colleague try to increase the storage capacity of Hopfield networks by using Walsh-Hadamard sequences, which have relatively low correlation. However, they add an ML decoding network on top of the traditional Hopfield network, which does not seem to be biologically plausible. We hope to come up with an algorithm that increases the storage capacity of Hopfield networks without such an additional ML decoding stage.
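As a small self-contained sketch (our own illustration, not code from the paper), the Sylvester construction generates Walsh-Hadamard sequences, and their mutual orthogonality makes them exact fixed points of a standard outer-product Hopfield network whenever fewer patterns than nodes are stored:

```python
import numpy as np

def hadamard(k):
    """Sylvester construction: a 2^k x 2^k +/-1 matrix whose rows are
    mutually orthogonal Walsh-Hadamard sequences."""
    H = np.array([[1]])
    for _ in range(k):
        H = np.block([[H, H], [H, -H]])
    return H

n, M = 32, 16
patterns = hadamard(5)[:M]          # M orthogonal +/-1 sequences of length n

# Orthogonality: inner products of distinct rows are exactly zero.
gram = patterns @ patterns.T

# Outer-product (Hebbian) weights with zero self-connections.
W = patterns.T @ patterns
np.fill_diagonal(W, 0)

# For orthogonal patterns, (W x^k)_i = (n - M) * x^k_i, so every stored
# pattern is an exact fixed point of the sign update whenever M < n.
stable = all(np.array_equal(np.sign(W @ x), x) for x in patterns)
```

The zero cross-correlation at zero shift is what gives these sequences their appeal for associative memory; the exactness of the fixed points above follows directly from it.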

Report 8 on Applications of Coding Theory in Neuroscience - 3 October 2010

In this report, we briefly review the principles of Hopfield networks. We then introduce the idea of using Hopfield networks to memorize patterns with a relatively large minimum distance (as opposed to purely random patterns) and make a first attempt to investigate the iterative behavior of Hopfield networks using coding-theoretic arguments. At this point, we do not consider Hopfield's traditional weighting scheme. Instead, we approximate the weights by a Gaussian distribution and provide some biological evidence in favor of this approximation. Nevertheless, as discussed in the report, in some cases the biological weight distribution obeys a truncated Gaussian distribution (meaning that only positive weights are allowed), and in most other cases it follows a log-normal distribution.
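A minimal sketch of the setting, under the assumption of a symmetric Gaussian weight matrix with zero diagonal (the network size and update schedule below are illustrative choices): asynchronous sign updates never increase the Hopfield energy, which is what guarantees convergence of the iterative dynamics to a fixed point.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50

# Symmetric weight matrix with Gaussian entries and zero diagonal,
# approximating the weight distribution instead of using Hopfield's
# traditional outer-product rule.
A = rng.normal(size=(n, n))
W = (A + A.T) / 2
np.fill_diagonal(W, 0)

def energy(W, x):
    """Hopfield energy E(x) = -1/2 x^T W x."""
    return -0.5 * (x @ W @ x)

# Asynchronous updates: each neuron in turn aligns with its local field.
# For symmetric W with zero diagonal, each single flip can only lower
# the energy, so the trajectory converges to a stable state.
x = rng.choice([-1, 1], size=n)
energies = [energy(W, x)]
for _ in range(10):                 # sweeps over all neurons
    for i in range(n):
        h = W[i] @ x
        if h != 0:
            x[i] = 1 if h > 0 else -1
    energies.append(energy(W, x))

monotone = all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:]))
```

The monotone-energy argument is what makes the "iterative behavior" analyzable: the state space is finite and the energy is a Lyapunov function.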

Report 9 on Applications of Coding Theory in Neuroscience - 18 October 2010

This report summarizes our efforts to modify the Hopfield weighting rule by introducing a diagonal matrix into the outer product of the patterns matrix and its transpose. The goal is to overcome the issue of all weights becoming zero when we use the codewords of a linear code as our desired patterns.
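The cancellation itself is easy to verify numerically. In the sketch below (the [7,4] Hamming code is our illustrative choice of linear code, not necessarily the one used in the report), storing the ±1 images of all codewords with the plain outer-product rule yields an identically zero weight matrix, because for i ≠ j the bit c_i + c_j is a nonzero linear functional and hence balanced over the code:

```python
import numpy as np
from itertools import product

# Systematic generator matrix of the [7,4] Hamming code
# (an illustrative example of a linear code).
G = np.array([
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
])

# All 16 codewords, mapped to +/-1 patterns via x = (-1)^c.
codewords = np.array([(np.array(m) @ G) % 2 for m in product([0, 1], repeat=4)])
patterns = 1 - 2 * codewords

# Standard Hopfield outer-product rule: W_ij = sum_m x^m_i x^m_j, i != j.
W = patterns.T @ patterns
np.fill_diagonal(W, 0)

# The columns of G are distinct, so c_i + c_j is a balanced linear
# functional for i != j: each off-diagonal sum has as many +1 terms
# as -1 terms and cancels to zero.
all_zero = not W.any()
```

This is exactly the degenerate case that motivates inserting a diagonal matrix into the learning rule.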

Report 10 on Applications of Coding Theory in Neuroscience - 12 November 2010

In this report, we give a summary of our activities on the BioCoding project from 19.10.2010 until 12.11.2010. It includes a summary of our discussions with Prof. Gerstner and his group, Prof. Cevher, and Prof. Leveque, as well as our recent progress on applying coding theory to increase the storage capacity of Hopfield networks using low-correlation sequences.

Report 11 on Applications of Coding Theory in Neuroscience - 13 December 2010

This report summarizes the stability of Gold sequences when they are memorized by Hopfield networks, as well as their performance in the presence of noise. We show that although the performance in the absence of noise, i.e. stability, is excellent, the performance in the presence of noise is unacceptable: the network cannot correct even a single bit error.
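For reference, Gold sequences of length 31 can be generated by XOR-ing a preferred pair of m-sequences against all cyclic shifts. The sketch below is our own illustration; the characteristic polynomials x^5 + x^2 + 1 and x^5 + x^4 + x^3 + x^2 + 1 are an assumed standard preferred pair (both are primitive over GF(2)), and the sanity check uses the classical m-sequence property that the ±1 periodic autocorrelation is -1 at every nonzero shift:

```python
import numpy as np

def m_sequence(taps, degree, seed=1):
    """One period (2^degree - 1 bits) of a Fibonacci LFSR whose
    recurrence is s[k+degree] = XOR of s[k+t] for t in `taps`.
    A primitive characteristic polynomial yields an m-sequence."""
    state = [(seed >> i) & 1 for i in range(degree)]
    out = []
    for _ in range(2 ** degree - 1):
        out.append(state[0])
        fb = 0
        for t in taps:
            fb ^= state[t]
        state = state[1:] + [fb]
    return np.array(out)

# Assumed preferred pair for length 31:
# x^5 + x^2 + 1  -> s[k+5] = s[k] ^ s[k+2]
# x^5 + x^4 + x^3 + x^2 + 1 -> s[k+5] = s[k] ^ s[k+2] ^ s[k+3] ^ s[k+4]
u = m_sequence(taps=(0, 2), degree=5)
v = m_sequence(taps=(0, 2, 3, 4), degree=5)

# The Gold family: u XOR-ed with every cyclic shift of v, mapped to +/-1.
gold = [1 - 2 * (u ^ np.roll(v, s)) for s in range(31)]

# m-sequence sanity check: +/-1 periodic autocorrelation of u equals -1
# at all 30 nonzero shifts.
x = 1 - 2 * u
autocorr = [int(x @ np.roll(x, s)) for s in range(1, 31)]
```

These are the sequences whose stability under the Hopfield dynamics the report examines.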

Report 12 on Applications of Coding Theory in Neuroscience - 16 December 2010

In this report, we return one more time to Gold sequences. This time, we consider applying our earlier idea of using diagonal matrices to derive the weights for Gold sequences.
We suggest a scheme based on the original Hopfield network and this modified weighting rule, which is able to store M = n patterns in a network of n nodes while correcting a number of erroneous bits (4 for the case of n = 127). The proposed method is iterative, meaning that we let the network converge to a stable state, in contrast to our previous attempts, which were mainly based on a single-shot approach. This probably gives us more degrees of freedom, in addition to the new weighting scheme. We have simulated the performance of the proposed method and obtained promising results.
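The report's specific diagonal matrix is not reproduced here; as an illustrative sketch only, weighting each pattern's outer product by a distinct positive coefficient, W = X^T D X with D = diag(d_1, ..., d_M), breaks the cancellation that plain Hebbian learning suffers on the codewords of a linear code. The choice d_m = 2^m below is a hypothetical one, picked because it makes every off-diagonal entry an odd integer and therefore provably nonzero:

```python
import numpy as np
from itertools import product

# [7,4] Hamming code generator (illustrative choice of linear code).
G = np.array([
    [1, 0, 0, 0, 1, 1, 0],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
])
codewords = np.array([(np.array(m) @ G) % 2 for m in product([0, 1], repeat=4)])
X = 1 - 2 * codewords                      # 16 +/-1 patterns of length 7

# Plain Hebbian rule cancels to zero on the full linear code ...
W_plain = X.T @ X
np.fill_diagonal(W_plain, 0)

# ... but a diagonally weighted rule W = X^T D X does not.
# Hypothetical coefficients d_m = 2^m: each off-diagonal entry is a
# signed sum of distinct powers of two including +/-1, hence odd.
D = np.diag(2 ** np.arange(16))
W_weighted = X.T @ D @ X
np.fill_diagonal(W_weighted, 0)

off_diag = ~np.eye(7, dtype=bool)
no_cancellation = bool((W_weighted[off_diag] != 0).all())
```

Whether such a choice of D also helps the iterative error-correction performance is exactly the kind of question the simulations in the report address.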