Post here (preferably projects that are still being maintained):

The OpenCog Framework - http://wiki.opencog.org
Emergent Neural Network Simulation System - http://grey.colorado.edu/emergent/index.php/Main_Page
Neuroph Java Neural Network Framework - http://neuroph.sourceforge.net
Encog Machine Learning Framework - http://www.heatonresearch.com/encog

These are just a few of the best I've found. If you know of any good ones, post 'em!
2/27/2013 3:14:29 PM
It depends on what you're trying to do, I guess. I use a lot of machine learning techniques, but they aren't for neural simulation, so I use R for most of my work:
http://cran.r-project.org/web/views/MachineLearning.html

My favorite packages include gradient boosting machines (gbm), random forests (randomForest), and multivariate adaptive regression splines (earth). I'm not a huge fan of the neural network packages in R. I use a lot of these as base learners that I then combine, or "ensemble", into higher-level meta-learners.

Deep learning has received quite a bit of attention this year. Deep learning refers to training multi-layer neural networks (i.e. "deep" nets) that are capable of learning complicated structures. There is a nice toolbox for MATLAB and a site dedicated to deep learning techniques:
http://deeplearning.net/
http://deeplearning.net/software_links/

Deep nets have come to dominate speech recognition, at least that's my understanding. The press has picked up on some of Google's deep learning experiments:
http://deeplearning.net/2012/12/13/googles-large-scale-deep-learning-experiments/

There is an excellent free online class (not a MOOC) going on at NYU right now, but it's not based on neural nets:
http://cilvr.cs.nyu.edu/doku.php?id=courses:bigdata:start

There was a really good class on neural networks on Coursera taught by Hinton, who is one of the major pioneers in the field, but I don't know if they plan to teach it again.

If you want to get your hands dirty, pick your favorite method and try one of the contests on Kaggle:
http://www.kaggle.com/

I know that Python is getting popular too. NumPy and SciPy get used a lot and implement a variety of techniques, but I've come to dislike Python, so I rarely use it.

[Edited on February 27, 2013 at 4:07 PM. Reason : ]
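Since the thread is about libraries, here's a rough sketch of what that base-learner / ensembling workflow can look like in R, assuming gbm, randomForest, and earth are installed. The data frame names (train, test) and the response column y are made up for illustration, and simple prediction averaging stands in for a real meta-learner:

library(gbm)
library(randomForest)
library(earth)

# Fit three base learners on the same (hypothetical) training frame.
fit_gbm  <- gbm(y ~ ., data = train, distribution = "gaussian",
                n.trees = 500, interaction.depth = 3, shrinkage = 0.05)
fit_rf   <- randomForest(y ~ ., data = train, ntree = 500)
fit_mars <- earth(y ~ ., data = train)

# Crude "meta-learner": average the base-learner predictions on new data.
# (In practice you'd fit another model on held-out predictions instead.)
pred <- rowMeans(cbind(
  predict(fit_gbm,  newdata = test, n.trees = 500),
  predict(fit_rf,   newdata = test),
  predict(fit_mars, newdata = test)[, 1]
))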
2/27/2013 4:04:08 PM
Oh damn, I had forgotten about deep learning. I remember hearing about it and thinking "oh well, that's not new," and for that reason I guess I dismissed it and never did much research on it. I won't make that mistake again.
2/27/2013 4:14:17 PM
Hinton gave a good talk about deep learning at NIPS this year:
http://videolectures.net/nips2012_hinton_networks/

It's about a technique to improve them, but it should get you up to speed on deep nets as well. Might as well check out the other NIPS talks while you're at it:
http://videolectures.net/nips2012_laketahoe/

[Edited on February 27, 2013 at 4:21 PM. Reason : ]

Oh yeah, one of the best technical books out there, The Elements of Statistical Learning, is available for free as a PDF as well:
http://www-stat.stanford.edu/~tibs/ElemStatLearn/
http://www-stat.stanford.edu/~tibs/ElemStatLearn/printings/ESLII_print10.pdf

[Edited on February 27, 2013 at 4:47 PM. Reason : ]
2/27/2013 4:20:51 PM
Thanks. VideoLectures has filled many idle hours in the past, but I had not actually found anything particularly interesting recently. So this is good.
2/27/2013 4:46:29 PM
Don't know how I forgot about NELL:
http://rtw.ml.cmu.edu/rtw/
2/27/2013 8:12:58 PM
bump! I was hoping more people had some stuff. Oh well
3/3/2013 11:13:56 PM
http://conductrics.com/data-science-resources/
http://conductrics.com/data-science-resources-2
3/4/2013 9:09:58 AM
there's some library i used for making neural networks in matlab once upon a time, but i've pretty much forgotten it at this point

fuck, let me look at some old code and see if i can find a function call i can google

[Edited on March 4, 2013 at 9:39 AM. Reason : found it http://www.mathworks.com/products/neural-network/]
3/4/2013 9:37:30 AM
^To build on that, I'm using a deep learning toolbox currently:
https://github.com/rasmusbergpalm/DeepLearnToolbox

Vowpal Wabbit (it's pronounced the way Elmer Fudd would pronounce Vorpal Rabbit) is good for big data sets:
https://github.com/JohnLangford/vowpal_wabbit/wiki

Weka is a popular Java machine learning platform:
http://www.cs.waikato.ac.nz/ml/weka/

Everything I've posted in this thread so far would largely fall under probabilistic methods, meaning they attempt to find associations between inputs and outputs. There is a large literature on finding causal relationships. Judea Pearl is the big name in this field. Here is his blog:
http://www.mii.ucla.edu/causality/
and his seminal book on the topic:
http://www.amazon.com/gp/product/052189560X/ref=pd_lpo_k2_dp_sr_1?pf_rd_p=486539851&pf_rd_s=lpo-top-stripe-1&pf_rd_t=201&pf_rd_i=0521773628&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=0B0GY7PT27CB569XQPXW

Here are some blogs I frequently read that are at least related to ML:
http://hunch.net
http://normaldeviate.wordpress.com
http://nuit-blanche.blogspot.it
http://rjlipton.wordpress.com
http://www.scottaaronson.com/blog/
3/4/2013 10:50:30 AM
If you're looking for some data to try these methods on, check out:
http://archive.ics.uci.edu/ml/
http://www.sigkdd.org/kddcup/index.php
http://datamob.org
http://www.fedstats.gov
http://www.census.gov/main/www/access.html

If you really want to get some experience using these methods, I'll recommend again that you sign up at Kaggle and enter some competitions. There's also a quick sketch below of pulling a UCI dataset into R.

Do you have any specific interests or projects you are currently working on, or would like to work on?
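Here's that sketch: grabbing one of the UCI datasets and trying a method on it in R. The iris.data URL and column names are my assumptions about the archive layout, so swap in whatever dataset you actually pull:

library(randomForest)

# Pull the classic iris data straight from the UCI repository (assumed URL).
url <- "http://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data"
flowers <- read.csv(url, header = FALSE,
                    col.names = c("sepal.len", "sepal.wid",
                                  "petal.len", "petal.wid", "species"))
flowers$species <- factor(flowers$species)

# Hold out roughly a third of the rows for a quick sanity check.
test_idx <- sample(nrow(flowers), floor(nrow(flowers) / 3))
fit <- randomForest(species ~ ., data = flowers[-test_idx, ])

# Accuracy on the held-out rows.
mean(predict(fit, flowers[test_idx, ]) == flowers$species[test_idx])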
3/4/2013 11:02:02 AM
InsaneMan?
3/4/2013 11:52:43 AM
Julia is supposed to be the next big thing in scientific computing, but right now it's still pretty immature:
http://julialang.org/
3/4/2013 2:50:47 PM
Not a library, but related:
http://www.nytimes.com/2013/02/18/science/project-seeks-to-build-map-of-human-brain.html?pagewanted=1&_r=0
3/5/2013 10:10:24 AM
The time for mobility is ending, the time for AI is coming!
10/9/2018 3:08:51 PM
neolithic was ahead of his time; a lot of those resources still hold up today.
10/9/2018 9:50:22 PM