/*
  The database entry:
  "type" is one of the following: phd theses, phd semester, master thesis, master semester, bachelor semester
  "state" is one of the following: available, taken, completed (please update accordingly!)
  "by" should be filled in as soon as the project is taken/completed
  "completed_dt" is the date when the project was completed (YYYY-MM-DD).
*/
---- dataentry project ----
title : Nonlinear Dimensionality Reduction Techniques and Their Application in Neural Networks
contactname: Amir Hesam Salavati
contactmail_mail: hesam.salavati@epfl.ch
contactroom: BC 160
type : master semester
state : completed
created_dt : 2011-09-23
taken_dt : 2013-01-17
completed_dt : 2013-06-12
by : Michael Hobbs
output_media : en:projects:master_semester:hobbs_salavati_semester_project_report_2013.pdf|Download Project Report in PDF Format
table : projects
======
template:datatemplates:project
----
\\
\\
/* Description of the project */
===== Project Description =====
\\
Dimensionality reduction is a widely used technique in machine learning and data processing. Since dealing with large amounts of high-dimensional data is difficult, one can use dimensionality reduction to decrease the number of variables (dimensions) and then use this compressed version of the input data for further processing.
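As a small illustration of the idea, here is a toy MATLAB sketch of the classical //linear// approach, principal component analysis (PCA); the data matrix and the target dimensionality ''k'' below are arbitrary placeholders, not part of the project specification.
<code matlab>
% Toy example: reduce 64-dimensional data to k dimensions with PCA.
X  = randn(500, 64);                       % 500 samples, 64 variables (toy data)
mu = mean(X, 1);                           % per-variable mean (1 x 64)
Xc = bsxfun(@minus, X, mu);                % center the data
[~, ~, V] = svd(Xc, 'econ');               % columns of V = principal directions
k  = 10;                                   % target (reduced) dimensionality
Z  = Xc * V(:, 1:k);                       % 500 x k low-dimensional representation
Xhat = bsxfun(@plus, Z * V(:, 1:k)', mu);  % approximate reconstruction in 64-D
</code>
Nonlinear techniques replace the fixed projection ''V'' with a learned nonlinear mapping, as in the autoencoder sketched further below.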
  
It is believed that the human neural system also performs a great deal of dimensionality reduction in order to deal efficiently with natural stimuli.
  
In this project, we focus on studying and implementing some of the widely used nonlinear dimensionality reduction techniques based on neural networks. The implemented approaches are then applied to a dataset of natural stimuli (images or sounds) to extract important features from the data. The final goal of the project is to see whether such features can help us increase the storage capacity of various artificial neural memories.
  
The implementation can be done in MATLAB or C/C++ (MATLAB is preferred, though).
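As a possible starting point, here is a minimal MATLAB sketch of one common neural-network-based technique, a single-hidden-layer autoencoder trained by plain gradient descent; the data, the layer sizes, and the learning rate are all assumptions for illustration only.
<code matlab>
% Minimal autoencoder: 64 inputs -> 10 nonlinear hidden units -> 64 outputs,
% trained to reconstruct its input (squared error, batch gradient descent).
X = rand(64, 1000);                        % 64-dim inputs, 1000 samples (toy data)
[n, m] = size(X);
h   = 10;                                  % hidden (reduced) dimensionality
eta = 0.1;                                 % learning rate (arbitrary)
W1 = 0.1*randn(h, n);  b1 = zeros(h, 1);   % encoder parameters
W2 = 0.1*randn(n, h);  b2 = zeros(n, 1);   % decoder parameters
for epoch = 1:500
    Z    = 1 ./ (1 + exp(-bsxfun(@plus, W1*X, b1)));  % sigmoid hidden code
    Xhat = bsxfun(@plus, W2*Z, b2);                   % linear reconstruction
    E    = Xhat - X;                                  % reconstruction error
    % Backpropagate the mean squared reconstruction error:
    dW2 = E*Z'/m;                   db2 = mean(E, 2);
    D   = (W2'*E/m) .* Z .* (1-Z);                    % error at the hidden layer
    dW1 = D*X';                     db1 = sum(D, 2);
    W1  = W1 - eta*dW1;  b1 = b1 - eta*db1;
    W2  = W2 - eta*dW2;  b2 = b2 - eta*db2;
end
Z = 1 ./ (1 + exp(-bsxfun(@plus, W1*X, b1)));  % low-dimensional features of X
</code>
The hidden activations ''Z'' are the reduced representation; the nonlinear techniques studied in the project (e.g. deeper or sparse autoencoders) follow the same pattern.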
  
To read more about nonlinear dimensionality reduction, see the corresponding Wikipedia entries: [[http://en.wikipedia.org/wiki/Dimension_reduction|Dimension Reduction]] and [[http://en.wikipedia.org/wiki/Nonlinear_dimensionality_reduction|Nonlinear Dimensionality Reduction]].
\\
\\
\\

====== Report ======
The report is available via the following link:
\\
{{:en:projects:master_semester:hobbs_salavati_semester_project_report_2013.pdf|Nonlinear Dimensionality Reduction Techniques and Their Application in Neural Networks}}