

Designs and teaches many courses at Stanford University and on Coursera, including Massive Open Online Courses such as the Deep Learning Specialization and the Natural Language Processing Specialization. At Stanford, he helped build three AI courses and taught two of them for several years with Andrew Ng. He has built 50+ AI programming assignments with AI autograders and given hundreds of lectures.


Classification and Vector Spaces

This is the first course of the Natural Language Processing Specialization. It covers text processing techniques for Naive Bayes and logistic regression, and introduces students to PCA and locality-sensitive hashing for approximate nearest neighbors.

Probabilistic Models in NLP

This is the second course of the Natural Language Processing Specialization. Students implement auto-correct and auto-complete, learn about the Viterbi algorithm, which is used in speech recognition and part-of-speech tagging, and train word vectors from scratch.

Sequence Models in NLP

This is the third course of the Natural Language Processing Specialization. You will learn to use neural networks for sentiment analysis, named entity recognition, character generation, and question duplicate detection. Models covered include NNs, GRUs, LSTMs, and Siamese networks with hard negative mining.

Attention Models in NLP

This is the fourth course of the specialization, focusing on attention mechanisms, transformers, and reformers. Students build machine translation systems, summarizers, question-answering programs, and chatbots, with an emphasis on dialog systems.

Neural Networks and Deep Learning

This course covers the basics of deep learning. Students learn to build neural networks from scratch.

Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization

Hyperparameter tuning is an important skill in AI. In this course, students learn how to choose their hyperparameters.

Structuring Machine Learning Projects

Understanding how to properly structure a machine learning project can save you months of work.

Convolutional Neural Networks

CNNs are fundamental to computer vision. This course covers the basics and has you build them from scratch.

Sequence Models

Sequence models have many applications and are widely used in the AI community. This course has you build projects such as word generation and debiasing word vectors.

Applied Machine Learning

This course is Stanford's undergraduate machine learning course, CS129. I helped create and teach it because many undergraduates lacked a smooth transition to the upper-division courses. It covers concepts ranging from linear regression all the way to SVMs and collaborative filtering.

Deep Learning

Known as CS230 at Stanford, this course is based on the Deep Learning Specialization I helped build.

Teaching Artificial Intelligence

Known as CS93 at Stanford. In this course, I show students how to evaluate AI papers and how to teach state-of-the-art AI, since transforming research papers into lectures can be difficult.

Get in touch. Let's chat.