There are always new skills to learn in fast-moving fields like Data Science and Artificial Intelligence, new features to implement, and existing procedures to optimize. As a lifelong learner, I approach these challenges with curiosity and enthusiasm. In particular, I have sought every opportunity during my tenure as a PhD student to broaden my technical horizons and prepare myself more explicitly for roles in industry. Below you can view some of these endeavours, including Neuroscience and Deep Learning bootcamps and LinkedIn Learning courses on a variety of topics such as machine learning, data science, SQL, and AI.

Contents

  1. Graduate Coursework
  2. Workshops and Bootcamps
  3. LinkedIn Learning
  4. Awards and Achievements

Graduate Coursework

In my fourth year as a PhD candidate, I decided to augment my personal learning in data science and machine learning with formal graduate-level coursework from the Duke University Department of Electrical & Computer Engineering. I hope to continue taking these courses throughout the rest of my time at Duke, which may even result in a Master's in Electrical & Computer Engineering.

Introduction to Deep Learning (ECE 685D)

  • Professor: Vahid Tarokh, Ph.D.
  • Lecture Topics:
    1. Introduction: What is Machine Learning? Data Sets
    2. Mathematical Background: Linear Algebra, Calculus, Probability and Statistics
    3. Linear and Logistic Regression
    4. Multi-Layer Perceptron (MLP) Networks: Weights, Biases, Initialization, Non-linear Activation, Loss Functions
    5. Back-propagation: Chain rule of multivariable calculus
    6. Optimization for Training Deep Networks: Stochastic Gradient Descent, Nesterov Acceleration Methods, Stochastic Methods with Momentum, Adagrad, RMSProp, Adam, Second Order Methods, Learning Rate
    7. Underfitting, Overfitting, Training Tricks: Bias and Variance Trade-off, Underfitting, Overfitting, Regularization, Weight Decay (L2-norm), Sparsity and the L1-norm, Dropout, Early Stopping, Image Augmentation
    8. Convolutional Neural Networks (CNNs): Biological inspiration, Receptive Fields, Parameter Sharing, Convolution and Correlation Operation, Multiple Input and Output Channels, Filters and Feature Maps, Stride and Padding, Pooling, Max-pooling, Average Pooling, Batch-Normalization, Back-propagation for CNNs, Modern Convolutional Neural Networks
    9. Applications of CNNs to Computer Vision: Image Classification and Detection
    10. Generative Models I: Naive Bayes and LDA, Graphical Models, Directed Graphical Models, Undirected Graphical Models, Hidden Markov Models (HMM), Linear Factor Models and Factor Analysis, PCA, Probabilistic-PCA, Slow Feature Analysis, ICA, Sparse Coding and Dictionary Learning
    11. Generative Models II: Restricted Boltzmann Machine, Deep Belief Networks
    12. Generative Models III: Auto-Encoders, Variational Autoencoders, Importance Weighted Autoencoders (IWAE), Conditional VAEs
    13. Generative Models IV: Generative Adversarial Networks (GANs), Vanilla-GAN, DCGAN, Conditional GAN, InfoGAN, f-GAN, CycleGAN, Energy-Based GAN, Coupled GAN
    14. Recurrent Neural Networks (RNNs) and Time Series: Sequential Datasets, Time Series and Prediction, ARIMA Models, RNN Architecture, Gated Recurrent Unit (GRU), Long Short-Term Memory (LSTM), Deep RNNs, Bi-directional RNNs
    15. Introduction to Natural Language Processing (NLP) with RNNs: Text Processing, Language Models, Attention Mechanisms, Sequence-to-Sequence Models
    16. Advanced Topics: Physics-Informed DNNs, Image and Video Inpainting, Information Bottleneck and Invariance, Deep Neural Compression, Neural Architecture Search, Adversarial Examples in DNNs, Computational Neuroscience, Theory of Deep Learning
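Several of these lecture topics (MLP networks, back-propagation via the chain rule, and gradient-descent training) fit together naturally in code. The sketch below is a minimal NumPy illustration of a two-layer MLP trained on the XOR toy problem with manual back-propagation; it is my own example, not material from the course:

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR: the classic toy problem that no linear model can solve
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# One hidden layer of 8 tanh units; small random weight initialization
W1, b1 = rng.normal(0, 1, (2, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 1, (8, 1)), np.zeros(1)

sigmoid = lambda z: 1 / (1 + np.exp(-z))

lr = 1.0  # learning rate for full-batch gradient descent
for step in range(2000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)           # hidden non-linear activation
    p = sigmoid(h @ W2 + b2)           # output probability
    loss = -np.mean(y * np.log(p) + (1 - y) * np.log(1 - p))

    # Backward pass: chain rule of multivariable calculus, layer by layer
    dz2 = (p - y) / len(X)             # grad of loss w.r.t. output pre-activation
    dW2, db2 = h.T @ dz2, dz2.sum(0)
    dz1 = (dz2 @ W2.T) * (1 - h ** 2)  # tanh'(z) = 1 - tanh(z)^2
    dW1, db1 = X.T @ dz1, dz1.sum(0)

    # Gradient descent update
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2
```

After training, `loss` is the final cross-entropy on the four XOR points; the same skeleton extends directly to the optimizers in topic 6 by changing only the update rule.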

Workshops and Bootcamps

NeuroMatch Academy 2020, Computational Neuroscience

Interactive Track: intensive (10 hrs/day, M-F, for 3 weeks), cohort-based design.

Syllabus

  • Modeling (Types, Practice, Fitting)
  • Machine Learning
  • Dimensionality Reduction
  • Bayesian Statistics
  • Reinforcement Learning
  • Deep Learning
  • Linear Systems
  • Decision Making
  • Dynamic Networks & Network Causality

Projects

  • Social versus Non-Social Decision Making in the Brain
    • Techniques:
      • Bayesian Likelihood Approximation
      • L1/L2 Regularization
      • Logistic Regression
      • K-fold cross-validation
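As a rough illustration of how these project techniques combine, the sketch below fits an L2-regularized logistic regression with gradient descent and scores it with k-fold cross-validation. The data here is a synthetic stand-in generated for the example, not the project's actual behavioral data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in: 200 "trials" with 5 features and a logistic ground truth
X = rng.normal(size=(200, 5))
true_w = np.array([2.0, -1.0, 0.0, 0.0, 0.5])
y = (rng.random(200) < 1 / (1 + np.exp(-(X @ true_w)))).astype(float)

def fit_logreg(X, y, lam=1.0, lr=0.1, steps=500):
    """Gradient descent on the L2-regularized logistic loss."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1 / (1 + np.exp(-(X @ w)))
        # Data-fit gradient plus the L2 penalty term (scaled per-sample)
        grad = X.T @ (p - y) / len(y) + lam * w / len(y)
        w -= lr * grad
    return w

def kfold_accuracy(X, y, k=5, lam=1.0):
    """Average held-out accuracy over k folds."""
    folds = np.array_split(np.arange(len(y)), k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        w = fit_logreg(X[train], y[train], lam=lam)
        pred = (X[test] @ w > 0).astype(float)
        accs.append((pred == y[test]).mean())
    return float(np.mean(accs))

acc = kfold_accuracy(X, y, k=5, lam=1.0)
```

Swapping the L2 penalty for an L1 term (or a mix of both) recovers the sparsity-inducing regularization listed above.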

NeuroMatch Academy 2021, Deep Learning

Observer Track: self-paced (5 hrs/day, M-F, for 3 weeks), solo design.

Syllabus

  • Linear DL and MLPs
  • Optimization
  • Regularization
  • Convnets and RNNs
  • Attention and Transformers
  • Generative Models (VAEs & GANs)
  • Unsupervised and Self-supervised Learning
  • Deep Reinforcement Learning
  • Continual Learning

Duke University Neuroscience Bootcamp

Two-week intensive bootcamp (6 hrs/day) covering neurobiological, computational, and cognitive neuroscience methods.


LinkedIn Learning

Completed Learning Paths

  • Become a Machine Learning Engineer (10 hr 58 min)
  • Master the Fundamentals of AI and Machine Learning (14 hr 5 min)

Individual courses are listed below, organized by topic.


Data Science and Machine Learning

Python

  • Pandas Essential Training
  • Advanced Pandas
  • Machine Learning with Scikit-Learn

Algorithms

  • Algorithmic Thinking with Python: Foundations
  • Python Data Structures and Algorithms
    • Breadth-First Search
    • Depth-First Search
    • A* Search
    • Stacks, Queues, and Priority Queues (i.e., heaps)
  • AI Algorithms for Gaming
    • Minimax Algorithm
    • Alpha-Beta Pruning
    • Depth-Limited Search
    • Iterative Deepening
  • Machine Learning with Python: Foundations
  • Applied Machine Learning: Algorithms
    • Logistic Regression
    • SVM
    • MLPs
    • Random Forest
    • Boosting
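To illustrate one of the gaming algorithms listed above, here is a minimal sketch of minimax with alpha-beta pruning over a hand-built toy game tree. The tree and its values are my own example, not course material:

```python
import math

def alphabeta(node, depth, alpha, beta, maximizing):
    """Minimax with alpha-beta pruning over a nested-list game tree.

    Leaves are numeric payoffs; internal nodes are lists of children.
    """
    if depth == 0 or not isinstance(node, list):
        return node
    if maximizing:
        value = -math.inf
        for child in node:
            value = max(value, alphabeta(child, depth - 1, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:  # prune: the minimizer will never allow this branch
                break
        return value
    else:
        value = math.inf
        for child in node:
            value = min(value, alphabeta(child, depth - 1, alpha, beta, True))
            beta = min(beta, value)
            if alpha >= beta:  # prune: the maximizer already has a better option
                break
        return value

# Three-ply toy tree (max -> min -> max -> leaves); its minimax value is 5
tree = [[[3, 5], [6, 9]], [[1, 2], [0, -1]]]
best = alphabeta(tree, 3, -math.inf, math.inf, True)
```

Pruning never changes the minimax value; it only skips branches that provably cannot affect the result, which is what makes depth-limited and iterative-deepening search practical.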

Predictive Analytics

  • Machine Learning and AI Foundations: Decision Trees with SPSS
    • CHAID Decision Tree Algorithm
    • CART Decision Tree Algorithm
  • Machine Learning and AI: Advanced Decision Trees with SPSS
    • QUEST Decision Tree Algorithm
    • C5.0 Decision Tree Algorithm
  • Machine Learning with Python: Decision Trees
    • Classification and Regression Decision Trees
    • Recursive Partitioning
    • Entropy, Gini, and Sum of Squared Residuals as impurity measures
    • Cost-Complexity Pruning
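The impurity measures behind these decision-tree algorithms are simple to compute directly. The following is a small illustrative sketch (my own example, not course code) of entropy, Gini impurity, and the impurity reduction achieved by a candidate split:

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy (in bits) of a label distribution."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def gini(labels):
    """Gini impurity: chance of mislabeling a randomly drawn sample."""
    n = len(labels)
    return 1 - sum((c / n) ** 2 for c in Counter(labels).values())

def split_gain(parent, left, right, impurity=entropy):
    """Impurity reduction from splitting `parent` into `left` and `right`."""
    n = len(parent)
    weighted = (len(left) / n) * impurity(left) + (len(right) / n) * impurity(right)
    return impurity(parent) - weighted

# A perfectly mixed parent node split into two mostly-pure children
parent = ["a"] * 5 + ["b"] * 5
left, right = ["a"] * 4 + ["b"], ["b"] * 4 + ["a"]
gain = split_gain(parent, left, right)
```

Recursive partitioning greedily picks the split with the largest such gain at each node; cost-complexity pruning then walks back up, trading a small loss in purity for a simpler tree.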

Deep Learning and Artificial Intelligence

  • Deep Learning: Getting Started
  • Artificial Intelligence Foundations: Thinking Machines
  • Artificial Intelligence Foundations: Machine Learning
  • Artificial Intelligence Foundations: Neural Networks
  • Training Neural Networks in Python
  • Building and Deploying Deep Learning Applications with TensorFlow
  • Building Deep Learning Applications with Keras 2.0
  • Learning XAI: Explainable Artificial Intelligence
  • Reinforcement Learning Foundations

SQL

  • SQL Essential Training

Miscellaneous

  • Developing Chatbots with Azure
  • Cognitive Technologies: The Real Opportunities for Business
  • AI The LinkedIn Way: A Conversation with Deepak Agarwal
  • Artificial Intelligence for Cybersecurity
  • Artificial Intelligence for Project Managers
  • AI Accountability Essential Training

Awards and Achievements

  • Charles Lafitte Foundation (2021, 2022)
  • NSF Graduate Research Fellowship Program - Honorable Mention (2019)
  • James B. Duke Fellowship (2019 - Present)
  • Trainee Professional Development Award - Society for Neuroscience (2018)
  • Departmental Citation for Outstanding Undergraduate Research - UC Davis (2017)
  • Regents Scholarship - UC Davis (2013 - 2017)