/* This is the template for project details pages */

/*
  The database entry:
  "type" is one of the following: phd thesis, phd semester, master thesis, master semester, bachelor semester
  "status" is one of the following: available, taken, completed (please update accordingly!)
  "by" should be filled in as soon as the project is taken/completed
  "completed_dt" is the date when the project was completed (YYYY-MM-DD).
  "output_media" is the link to the pdf of the project (wiki syntax)
  "table" must be "projects" => don't touch it!
*/
---- dataentry project ----
title : Simple neural networks with error correcting abilities
contactname : Amir Hesam Salavati
contactmail_mail : hesam.salavati@epfl.ch
contacttel : 021 - 693 81 37
contactroom : BC 160
type : master semester
status : available
created_dt : 2010-11-15
taken_dt : YYYY-MM-DD
completed_dt : YYYY-MM-DD
by : the full name of the student
output_media : en:projects:neural_storage_capacity.pdf|Download Abstract in PDF Format
table : projects
======
template:datatemplates:project
----

/* Description of the project */
===== Background =====
Memorizing patterns and correctly recalling them later is an essential ingredient of neural activity. Over the past 25 years, a number of neural networks have been devised to memorize and recall patterns. Interestingly, some of these networks are able to recall the correct pattern even if the input pattern contains errors, i.e. is partially corrupted by noise. In this regard, these artificial networks resemble error correcting codes: they are able to recognize the correct pattern in the presence of noise.
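
As a toy illustration of this recall behavior, the following sketch (not part of the original project description; network size, pattern count and noise level are illustrative, and NumPy is assumed) trains a Hopfield-style network with the Hebbian rule and lets it iterate from a corrupted pattern back to the stored one.

<code python>
import numpy as np

rng = np.random.default_rng(0)

n, num_patterns = 64, 3
patterns = rng.choice([-1, 1], size=(num_patterns, n))  # +/-1 patterns to store

# Hebbian rule: sum of outer products of the stored patterns, zero diagonal
W = patterns.T @ patterns / n
np.fill_diagonal(W, 0)

# Corrupt one stored pattern by flipping a few of its entries
x = patterns[0].copy()
flip = rng.choice(n, size=5, replace=False)
x[flip] *= -1

# Synchronous recall: iterate x <- sign(Wx) until a fixed point is reached
for _ in range(20):
    x_new = np.sign(W @ x)
    x_new[x_new == 0] = 1
    if np.array_equal(x_new, x):
        break
    x = x_new

print("recovered the stored pattern:", np.array_equal(x, patterns[0]))
</code>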

However, the storage capacity of these networks is quite small compared to that of their counterparts in coding theory. Given that modern codes use the same basic structure for error correction as neural networks do, namely a bipartite graph with local message passing, it is interesting to consider applying modern coding theory to increase the storage capacity of neural networks by finding appropriate weights for the neural graph.
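
To make the coding-theory side concrete, here is a minimal sketch of local message passing on a bipartite graph, in the simple form of Gallager-style bit flipping on a (7,4) Hamming parity-check matrix (this decoder is a generic textbook variant, not one prescribed by the project).

<code python>
import numpy as np

H = np.array([[1, 1, 0, 1, 1, 0, 0],   # rows are check nodes,
              [1, 0, 1, 1, 0, 1, 0],   # columns are variable nodes
              [0, 1, 1, 1, 0, 0, 1]])

def bit_flip_decode(y, H, max_iters=10):
    x = y.copy()
    for _ in range(max_iters):
        syndrome = H @ x % 2            # which parity checks fail
        if not syndrome.any():
            break                       # all checks satisfied: x is a codeword
        # each bit counts its failing checks; flip the most implicated bits
        fails = syndrome @ H
        x[fails == fails.max()] ^= 1
    return x

received = np.zeros(7, dtype=int)       # start from the all-zero codeword
received[2] ^= 1                        # introduce a single bit error
print(bit_flip_decode(received, H))     # converges back to all zeros
</code>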

Up to this point, a few weighting schemes (including the Hebbian rule) have been tested without much success. Considering further weighting schemes, such as the BCM rule, is the next step toward increasing the storage capacity, which is the main objective of this project.
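
As a hint of what trying another scheme involves, the sketch below implements the BCM update for a single neuron with a sliding threshold (learning rate, time constant and input statistics are illustrative assumptions, not values from the project).

<code python>
import numpy as np

rng = np.random.default_rng(1)
n = 16
w = rng.normal(scale=0.1, size=n)       # synaptic weights of one neuron
theta = 1.0                             # sliding modification threshold
eta, tau = 0.01, 100.0                  # learning rate, threshold time constant

for _ in range(1000):
    x = rng.choice([-1.0, 1.0], size=n) # random input pattern
    y = w @ x                           # post-synaptic activity
    # BCM: potentiate when y > theta, depress when y < theta
    w += eta * y * (y - theta) * x
    # the threshold tracks a running average of y^2
    theta += (y ** 2 - theta) / tau

print("final weights:", np.round(w, 3))
</code>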


===== Project Goals =====
The objectives of this project are:
1) To understand the principles of neural networks and modern codes.
2) To apply different weighting schemes to neural networks and evaluate their performance for storing the codewords of a linear code (a sketch of such an experiment follows this list).
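
The following is a minimal sketch of the experiment meant in goal 2, under assumed toy parameters: store the codewords of a small (5,2) linear code with the Hebbian rule and measure the recall rate under single-bit noise. For such a tiny network the measured rate may well be low, which is precisely the capacity limitation the project aims to address.

<code python>
import numpy as np
from itertools import product

rng = np.random.default_rng(2)

G = np.array([[1, 0, 1, 1, 0],          # generator matrix of a toy (5,2) code
              [0, 1, 0, 1, 1]])
messages = np.array(list(product([0, 1], repeat=2)))
codewords = messages @ G % 2
patterns = 2 * codewords - 1            # map {0,1} -> {-1,+1}

n = patterns.shape[1]
W = patterns.T @ patterns / n           # Hebbian weights, zero diagonal
np.fill_diagonal(W, 0)

trials, successes = 200, 0
for _ in range(trials):
    p = patterns[rng.integers(len(patterns))]
    x = p.copy()
    x[rng.integers(n)] *= -1            # flip one random entry
    for _ in range(10):                 # iterate recall to a fixed point
        nxt = np.sign(W @ x)
        nxt[nxt == 0] = 1
        if np.array_equal(nxt, x):
            break
        x = nxt
    successes += np.array_equal(x, p)

print(f"recall rate with one flipped entry: {successes / trials:.2f}")
</code>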

This project is suitable for students who prefer running simulations in order to identify an appropriate framework for subsequent theoretical analysis.

The prerequisites are:
1) Basic knowledge of coding theory.
2) Familiarity with a suitable programming language (C/C++, MATLAB).