Machine Learning Beginner level. Give me all the Resourcezzzzzz



  • I am extremely curious about this machine learning stuff and I want to know the best place to start.

    1. I would like to learn hands-on, using either Python, C#, or F#.

    2. I am sufficiently familiar with both C# and Python. I know very little F#, but I would not mind investing time in it.

    3. I would like something that is not math-heavy. A book that is 423% math is not okay; a book that is 30% math, 40% code, and 30% theory would be fine.

    4. I want to get a broad taste of what machine learning entails and how it is done through programming. Nothing too specific.

    5. I do not have the moneyz, so expensive online courses are out of the question.

    So yeah, any recommendations are appreciated and will help get me started on this front instead of just thinking about it.

    Thank you.



  • Are you sure you want machines learning? All the documentaries I've seen say that's a bad thing.

    Filed under: Terminator and I Robot are documentaries right?



  • Elon Musk does, too! 😉



  • @powerlord said:

    Are you sure you want machines learning? All the documentaries I've seen say that's a bad thing.

    Filed under: Terminator and I Robot are documentaries right?

    Stop complaining and go fix the AE-35 Unit, Dave.



  • I did machine learning for a few years.

    Machine learning isn't at a state where you can do it without the math.

    But the math isn't hard. The big idea behind machine learning is the same as the big idea behind calculus/statistics: turning complex inputs into linear approximations (i.e., finding the matrix/linear transformation that best "explains" the data, or the underlying transformation the data represents).
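
    To make that idea concrete, here is a minimal sketch (plain NumPy, made-up data, not taken from any particular book) of finding the linear transformation that best explains some noisy observations via least squares:

    ```python
    import numpy as np

    # Made-up data: y is (roughly) a linear function of the inputs, plus noise.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))          # 100 samples, 3 input features
    true_w = np.array([2.0, -1.0, 0.5])    # the transformation we hope to recover
    y = X @ true_w + 0.1 * rng.normal(size=100)

    # Least squares: find the w that minimizes ||X @ w - y||^2.
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    print(w)                               # should come out close to true_w
    ```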

    There are lots of algorithms for doing just that. The most basic is probably principal component analysis, where you calculate a covariance matrix for the data and then find its eigenvectors. You can then express the data points using the eigenvectors as a basis.
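
    A rough sketch of exactly those steps (covariance matrix, eigenvectors, re-expressing the points in the eigenvector basis) on made-up data:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    data = rng.normal(size=(200, 5))           # 200 made-up samples, 5 features

    centered = data - data.mean(axis=0)        # PCA works on mean-centered data
    cov = np.cov(centered, rowvar=False)       # 5x5 covariance matrix

    # eigh handles symmetric matrices; eigenvalues come back in ascending order.
    eigenvalues, eigenvectors = np.linalg.eigh(cov)
    order = np.argsort(eigenvalues)[::-1]      # largest-variance directions first
    components = eigenvectors[:, order]

    # Express the data points using the eigenvectors as a basis.
    projected = centered @ components
    print(projected.shape)                     # (200, 5); keep the first k columns to reduce dimension
    ```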

    Then there are support vector machines, where the data points themselves act as the basis: each point is given a weight, and the points that end up with non-zero weight (the support vectors) define the decision boundary.
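
    If you just want to see one in action, scikit-learn (an assumption on my part, since no library has been mentioned) wraps all of this up; a minimal sketch on two made-up clusters:

    ```python
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(2)
    # Two made-up clusters of 2-D points, labeled 0 and 1.
    X = np.vstack([rng.normal(loc=-2, size=(50, 2)), rng.normal(loc=2, size=(50, 2))])
    y = np.array([0] * 50 + [1] * 50)

    clf = SVC(kernel="linear").fit(X, y)

    # Only the points with non-zero weight survive as support vectors.
    print(len(clf.support_vectors_), "support vectors out of", len(X), "points")
    print(clf.predict([[0.5, 0.5]]))
    ```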

    Then there are non-linear methods, like Bayesian networks or neural networks. Bayesian networks take advantage of Bayes' theorem and are good for classification problems. Neural networks are a similar idea but more general: the network you build keeps track of weights, and results from later in the network can feed back into earlier parts. There is a lot of theory behind these. They're also slow, so avoid them unless you have really non-linear (in every basis!) data and lots of input data.
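
    For a quick taste of the classification side, here is a hedged sketch using scikit-learn's ready-made pieces: GaussianNB is a much-simplified Bayes'-theorem classifier (naive Bayes, not a full Bayesian network), and MLPClassifier is a small feed-forward neural network. Made-up data again:

    ```python
    import numpy as np
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(3)
    X = np.vstack([rng.normal(loc=-1, size=(100, 4)), rng.normal(loc=1, size=(100, 4))])
    y = np.array([0] * 100 + [1] * 100)

    # Naive Bayes: applies Bayes' theorem assuming the features are independent.
    nb = GaussianNB().fit(X, y)

    # A tiny feed-forward neural network trained on the same data.
    nn = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0).fit(X, y)

    print(nb.score(X, y), nn.score(X, y))
    ```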

    "Finally," there's genetic programming, where you essentially write a "special" compiler (actually, you would write plumbing around an existing compiler or interpreter) that knows how to mutate a program/network/etc and knows how to search for the best one given the empirical data it is fed.

