End-to-end theory of the ML algorithms is covered, along with visualizations and illustrative examples in MS Excel, followed by hands-on application in Python to various finance and risk topics such as algorithmic trading, pricing, and risk management.

Module 1 – Primer

| Sl. No. | Topic | Details |
| --- | --- | --- |
| 01 | Introduction to Machine Learning | a. What is Machine Learning? b. Types of Learning (Supervised, Unsupervised, Reinforcement) c. Structured vs. Unstructured Data d. Machine Learning and the World Today e. Specific Use Cases in Finance |
| 02 | Math Toolbox for ML | a. Linear Algebra: Vector Algebra (Addition, Product, Projections); Matrix Algebra (Transpose, Multiplication, Inverse, Eigenvalues) b. Optimization: Maxima and Minima (Calculus-Based); Lagrange Multipliers; Gradient Descent c. Parameter Estimation: Maximum Likelihood Estimation (MLE); Maximum a Posteriori (MAP) |
| 03 | Getting Started with Python | a. Importing Libraries b. Data Types and Functions c. Data Preprocessing (Missing Data, Categorical Encoding) d. Splitting into Training, Validation, and Test Sets e. Feature Scaling |
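The gradient-descent topic from the math toolbox can be illustrated in a few lines of Python. The quadratic function, learning rate, and iteration count below are illustrative choices, not part of the course material:

```python
# Minimal gradient-descent sketch: minimize f(x) = (x - 3)^2.

def gradient_descent(grad, x0, learning_rate=0.1, n_iters=200):
    """Repeatedly step opposite the gradient, starting from x0."""
    x = x0
    for _ in range(n_iters):
        x = x - learning_rate * grad(x)
    return x

# f(x) = (x - 3)^2 has gradient f'(x) = 2 * (x - 3) and its minimum at x = 3.
minimum = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(minimum, 4))  # converges to ~3.0
```

Each step shrinks the distance to the minimum by a constant factor (here 0.8), which is why the iterate converges geometrically for this convex function.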

Module 2 – Supervised Learning (Regression)

| Sl. No. | Topic | Details |
| --- | --- | --- |
| 04 | Linear Regression | Introduction and Objective; Cost Function (MSE vs. MAD); Simple vs. Multiple Linear Regression; Regression Assumptions (Multicollinearity, Exogeneity, Serial Correlation, Homoscedasticity); Parameter Estimation (Analytical and Gradient Descent) |
| 05 | Stepwise Regression for High-Dimensional Data | Forward Selection; Backward Elimination; Least Angle Regression (LARS) |
| 06 | Polynomial Regression | |
| 07 | Support Vector Regression | Linear SVR; Kernel SVR |
| 08 | Decision Tree Regression | Splitting and Stopping; Bagging and Boosting |
| 09 | Random Forest Regression | |
| 10 | Regression Model Selection and Performance | |
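The analytical parameter estimation mentioned under Linear Regression can be sketched with the normal equation, beta = (X'X)^-1 X'y. The synthetic dataset and coefficient values below are illustrative, not course data:

```python
# Sketch of analytical OLS estimation via the normal equation.
import numpy as np

rng = np.random.default_rng(0)
n = 200
x = rng.uniform(0, 10, n)
y = 2.0 + 1.5 * x + rng.normal(0, 0.5, n)   # true intercept 2.0, slope 1.5

X = np.column_stack([np.ones(n), x])         # design matrix with intercept column
beta = np.linalg.solve(X.T @ X, X.T @ y)     # solve (X'X) beta = X'y

print(beta.round(2))  # close to [2.0, 1.5]
```

Using `np.linalg.solve` on the normal equations is numerically preferable to explicitly inverting X'X; gradient descent (the other estimation route the syllabus lists) reaches the same solution iteratively.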


Module 3 – Supervised Learning (Classification)

| Sl. No. | Topic | Details |
| --- | --- | --- |
| 11 | Supervised Learning – Classification | Introduction and Objective; Loss Functions (Logistic Loss, Hinge Loss); Confusion Matrix and ROC |
| 12 | Classification – Logistic Regression | Logistic Regression; Multinomial Logit; Ordinal Logit |
| 13 | K-Nearest Neighbors (kNN) | K-Nearest Neighbor Rule; K-Means Prototypes |
| 14 | Naïve Bayes Classifier | Bayes' Theorem |
| 15 | Linear and Quadratic Discriminant Analysis (LDA/QDA) | |
| 16 | Support Vector Machines | |
| 17 | Decision Tree Classification | |
| 18 | Random Forest Classification | |
| 19 | Classification Model Selection and Performance | |
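Two of the module's building blocks, fitting logistic regression by minimizing the logistic loss and summarizing predictions in a confusion matrix, can be sketched together. The 1-D synthetic dataset, learning rate, and iteration count are illustrative assumptions:

```python
# Sketch: logistic regression via gradient descent, plus a confusion matrix.
import numpy as np

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 1, 100),    # class 0 samples
                    rng.normal(2, 1, 100)])    # class 1 samples
y = np.concatenate([np.zeros(100), np.ones(100)])

X = np.column_stack([np.ones_like(x), x])      # add intercept column
w = np.zeros(2)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ w))               # sigmoid probabilities
    w -= 0.1 * X.T @ (p - y) / len(y)          # gradient of mean logistic loss

pred = (1 / (1 + np.exp(-X @ w)) >= 0.5).astype(int)
tn = int(((pred == 0) & (y == 0)).sum())
fp = int(((pred == 1) & (y == 0)).sum())
fn = int(((pred == 0) & (y == 1)).sum())
tp = int(((pred == 1) & (y == 1)).sum())
print([[tn, fp], [fn, tp]])   # confusion matrix: [[TN, FP], [FN, TP]]
```

Sweeping the 0.5 decision threshold and recording the resulting true-positive and false-positive rates is exactly what traces out the ROC curve listed in topic 11.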

Module 4 – Anomaly Detection

| Sl. No. | Topic | Details |
| --- | --- | --- |
| 20 | One-Class SVM | |
| 21 | Isolation Forest | |
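An Isolation Forest flags points that are easy to isolate with random splits. A minimal sketch, assuming scikit-learn as the implementation vehicle (the course's hands-on sessions are in Python); the dataset and contamination rate are illustrative:

```python
# Sketch of Isolation Forest anomaly detection with scikit-learn.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(0, 1, size=(50, 2))   # inliers clustered near the origin
outlier = np.array([[8.0, 8.0]])          # one obvious anomaly
X = np.vstack([normal, outlier])

model = IsolationForest(contamination=0.05, random_state=0)
labels = model.fit_predict(X)             # +1 = inlier, -1 = outlier
print(labels[-1])                         # label assigned to the extreme point
```

Because the point at (8, 8) sits far from the cluster, random axis-aligned splits isolate it in very few steps, giving it a high anomaly score.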

Module 5 – Unsupervised Learning

| Sl. No. | Topic | Details |
| --- | --- | --- |
| 22 | Dimensionality Reduction | Principal Component Analysis (PCA); Kernel PCA |
| 23 | Clustering | Hierarchical Clustering; K-Means Clustering; Density-Based Clustering |
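PCA, the first dimensionality-reduction topic, ties directly back to the eigenvalue material in the math toolbox: the principal components are the eigenvectors of the covariance matrix. A sketch on a synthetic, strongly correlated 2-D dataset (all numbers illustrative):

```python
# Sketch of PCA via eigendecomposition of the sample covariance matrix.
import numpy as np

rng = np.random.default_rng(0)
t = rng.normal(0, 3, 500)
X = np.column_stack([t, 2 * t + rng.normal(0, 0.5, 500)])  # correlated features

Xc = X - X.mean(axis=0)                      # center the data
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)       # eigh: ascending eigenvalues
order = eigvals.argsort()[::-1]              # reorder to descending
explained = eigvals[order] / eigvals.sum()   # variance explained per component

scores = Xc @ eigvecs[:, order]              # data in principal-component axes
print(explained.round(3))                    # first component dominates
```

Since the second feature is almost a linear function of the first, nearly all the variance lies along one direction, and the first component alone summarizes the data.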

Module 6 – Reinforcement Learning

| Sl. No. | Topic | Details |
| --- | --- | --- |
| 24 | Upper Confidence Bound | |
| 25 | Thompson Sampling | |
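The Upper Confidence Bound topic can be sketched on a multi-armed bandit, the standard setting for both UCB and Thompson Sampling. The three Bernoulli payout probabilities and the horizon below are illustrative assumptions:

```python
# Sketch of the UCB1 algorithm on a 3-armed Bernoulli bandit.
import math
import random

random.seed(0)
probs = [0.2, 0.5, 0.8]        # hidden payout probability of each arm
counts = [0, 0, 0]             # pulls per arm
rewards = [0.0, 0.0, 0.0]      # total reward per arm

for t in range(1, 2001):
    if 0 in counts:                    # play each arm once to initialize
        arm = counts.index(0)
    else:                              # sample mean + exploration bonus
        ucb = [rewards[a] / counts[a]
               + math.sqrt(2 * math.log(t) / counts[a])
               for a in range(3)]
        arm = ucb.index(max(ucb))
    reward = 1.0 if random.random() < probs[arm] else 0.0
    counts[arm] += 1
    rewards[arm] += reward

print(counts)   # the best arm (index 2) accumulates the most pulls
```

The exploration bonus shrinks as an arm is pulled more often, so play concentrates on the highest-paying arm while suboptimal arms are still sampled occasionally.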

Module 7 – Natural Language Processing (NLP)

| Sl. No. | Topic | Details |
| --- | --- | --- |
| 26 | Processing Unstructured Data | |
| 27 | Word2Vec | |
| 28 | Feature Selection (Chi-Square and Mutual Information) | |

Module 8 – Deep Learning

| Sl. No. | Topic | Details |
| --- | --- | --- |
| 29 | Artificial Neural Network (ANN) | |
| 30 | Recurrent Neural Network (RNN) and Long Short-Term Memory (LSTM) | |
| 31 | Convolutional Neural Network (CNN) | |
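The ANN topic can be illustrated with a one-hidden-layer network trained by backpropagation on the classic XOR problem, which no linear model can solve. The architecture, seed, and hyperparameters below are illustrative assumptions:

```python
# Sketch: one-hidden-layer neural network trained by backpropagation on XOR.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)       # XOR targets

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)       # hidden layer, 8 units
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)       # output layer

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

for _ in range(5000):
    h = sigmoid(X @ W1 + b1)                          # forward pass
    out = sigmoid(h @ W2 + b2)
    d_out = (out - y) * out * (1 - out)               # backprop: output layer
    d_h = (d_out @ W2.T) * h * (1 - h)                # backprop: hidden layer
    W2 -= h.T @ d_out;  b2 -= d_out.sum(axis=0)       # gradient steps (lr = 1)
    W1 -= X.T @ d_h;    b1 -= d_h.sum(axis=0)

mse = float(((out - y) ** 2).mean())
print(out.round(2).ravel())   # should approach [0, 1, 1, 0] as mse falls
```

The same forward-pass/backward-pass loop, scaled up and with specialized layer types, underlies the RNN, LSTM, and CNN architectures listed above.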


Ans. 1. Anyone with a finance background, for example someone who has studied some level of CFA, FRM, or actuarial science, can join this program.

Ans. 2. Math and Python primers are included in the program, so no prior experience is expected.

Ans. 3. The course is long and comprehensive because the entire curriculum is covered in three parts: theory discussion, visualizations in Excel, and practical implementation through hands-on sessions in Excel and Python.

Ans. 4. To receive the certificate, you need to complete all topic-wise assignments, finish the master project, and pass the final exam.

Ans. 5. You can choose either 1-year access or lifetime access. Please note that lifetime access carries an additional charge.

Ans. 6. The website integrates a customized P2T player that plays the encrypted classes. There is no limit on the number of views, and the player is compatible with Windows, Mac, Android, and iPhone.

Ans. 7. To interact with the trainer, we have a dedicated forum, 'D-Forum'. Questions asked on D-Forum are answered within 24 hours by the trainers and a team of moderators and experts.

Ans. 8. Exams are presently conducted in mid-August and mid-January, and you can choose either cohort. If you do not pass the exam on the first attempt, you can re-book at a nominal charge.

Ans. 9. Every class is supported by OneNote files, Excel sheets, Python notebooks, assignments, and quizzes, all available in the course section.

Ans. 10. You receive a Letter of Recommendation, physically delivered within 60 days of passing the exam. The LOR also mentions your chosen specialization and project details.
