Your choice of essentials seems a bit odd to me. If you're getting into neural networks, or more broadly "machine learning", you can throw out a few of those.
Here's what my list would look like:
Languages
- Python - Absolutely required. This is the language of the entire field; nearly everything high-level is done in it, and the most popular machine learning frameworks all use it as their primary interface language.
- CUDA (C++) - Heavily recommended for GPU acceleration. Nvidia has a stranglehold over machine learning and AI, so if you have an Nvidia GPU, or access to a VM that does, you're in luck: you get an easy-to-use interface for massively accelerating your machine learning code. CUDA itself is essentially C++ with a few extensions for writing GPU kernels; if you know either C or C++ you'll catch on quickly. If your machine has an AMD GPU it's not impossible to achieve the same thing, but it's much harder, with less help and fewer community resources. (There are also free-for-learning Nvidia compute VMs on the internet if you look.)
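Before writing any kernels yourself, it's worth knowing that the Python frameworks already wrap CUDA for you. Here's a minimal sketch, assuming PyTorch is installed, of checking for a usable GPU from Python:

```python
# Minimal sketch (assumes PyTorch is installed): check whether a CUDA GPU
# is available and run a matrix multiply on it if so.
import torch

if torch.cuda.is_available():
    device = torch.device("cuda")
    print("Using GPU:", torch.cuda.get_device_name(0))
else:
    device = torch.device("cpu")
    print("No CUDA device found, falling back to CPU")

# Moving tensors to the chosen device is all it takes to use the GPU.
x = torch.randn(1024, 1024, device=device)
y = x @ x  # this matrix multiply runs on the GPU if one was found
```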
Frameworks
Pick one (or make your own):
- PyTorch - Facebook's machine learning framework, higher-level than TensorFlow. Generally used for research and development, though TorchServe now lets you take a PyTorch model to production.
- TensorFlow - Google's counterpart to PyTorch, lower-level than PyTorch, so to speak. Generally used in production environments.
- Keras - A high-level neural network framework usually used as a wrapper around TensorFlow; it makes TensorFlow easier to work with for a lot of people.
There are countless others, but those are the big two (counting Keras + TensorFlow as one) used in research, development, and production.
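To give you a feel for what a framework buys you, here's a small sketch in PyTorch; the layer sizes and the fake data are made up purely for illustration:

```python
# Small sketch of a neural network in PyTorch. The layer sizes and the fake
# data are arbitrary; the point is how little code a framework needs.
import torch
import torch.nn as nn
import torch.nn.functional as F

model = nn.Sequential(
    nn.Linear(784, 128),  # e.g. a flattened 28x28 image in, 128 hidden units
    nn.ReLU(),
    nn.Linear(128, 10),   # 10 output classes
)

x = torch.randn(32, 784)                # fake batch of 32 inputs
labels = torch.randint(0, 10, (32,))    # fake labels
logits = model(x)                       # forward pass
loss = F.cross_entropy(logits, labels)  # how wrong the network is
loss.backward()                         # backpropagation fills in the gradients
print(loss.item())
```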
Fields to study
Calculus I/II - Heavily used in the field. The holy grail of neural networks, the backpropagation learning algorithm, works by treating the whole network as one big differentiable function and applying the chain rule to it. You should at least understand how it works (there's a small sketch after this list).
Linear algebra - A neural network's weights are stored as matrices, and the forward pass is mostly matrix multiplication, so you should know how to read and manipulate matrices.
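To make both of those concrete, here's a toy sketch of one training step for a single linear layer, written with plain NumPy so the chain rule and the matrix algebra are both out in the open. All shapes and numbers are made up:

```python
# Toy sketch of backpropagation through one linear layer in plain NumPy,
# so both the calculus (chain rule) and the linear algebra are visible.
import numpy as np

x = np.random.randn(4, 3)      # batch of 4 inputs, 3 features each
W = np.random.randn(3, 2)      # weight matrix: 3 inputs -> 2 outputs
b = np.zeros(2)
y_true = np.random.randn(4, 2)

# Forward pass: linear algebra (a matrix multiply plus a bias)
y_pred = x @ W + b
loss = np.mean((y_pred - y_true) ** 2)   # mean squared error

# Backward pass: calculus (chain rule) gives a gradient for each parameter
dloss_dy = 2 * (y_pred - y_true) / y_pred.size   # d(loss)/d(y_pred)
dW = x.T @ dloss_dy                              # d(loss)/d(W)
db = dloss_dy.sum(axis=0)                        # d(loss)/d(b)

# One gradient-descent step: nudge the weights against the gradient
learning_rate = 0.1
W -= learning_rate * dW
b -= learning_rate * db
```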
Why not these?
- Assembly: Pretty much unrelated to neural networks, though it can come up if you ever export a trained network to run on a low-level hardware target. As an aside, some modern CPUs use simple neural predictors to guess which way branches in machine code will go, to improve performance.
- Haskell: As cool and interesting as Haskell is, it has essentially no use in machine learning over Python or C++.
- Verilog: Not needed at all unless you're building FPGAs or similar hardware, i.e. hard-wiring a trained network into a dedicated AI device.
Thank you for your well reasoned reply. It helps tremendously.
In fact I love it enough to have printed it out for myself!
Again, thank you so much, not just for your programming knowledge, but also for your clear, succinct way of writing and formatting your message.
It'll be useful to reference in the coming weeks to support me in my quest to sharpen my writing skills.
No problem, glad it helped you.