Machine.Learning

Machine Learning in Python and R

Machine Learning A-Z

  • Getting the dataset
  • Importing the Libraries
  • Importing the Dataset
  • For Python learners: a summary of object-oriented programming (classes and objects)
  • Missing Data
  • Splitting the Dataset into the Training set and Test set
  • Feature Scaling
  • Data Preprocessing Template
  • Simple Linear Regression Intuition
  • Simple Linear Regression in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Simple Linear Regression to the Training set
    • Predicting the Test set results
    • Visualising the Training set results
    • Visualising the Test set results
  • Simple Linear Regression in Python - Backward Elimination
  • Simple Linear Regression in R
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Simple Linear Regression to the Training set
    • Predicting the Test set results
    • Visualising the Training set results
    • Visualising the Test set results
  • Simple Linear Regression in R - Backward Elimination
  • Multiple Linear Regression Intuition
  • What is the P-Value?
  • Multiple Linear Regression in Python
    • Importing the libraries
    • Importing the dataset
    • Encoding categorical data
    • Avoiding the Dummy Variable Trap
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Multiple Linear Regression to the Training set
    • Predicting the Test set results
  • Multiple Linear Regression in Python - Backward Elimination
  • Multiple Linear Regression in Python - Automatic Backward Elimination
  • Multiple Linear Regression in R
    • Importing the dataset
    • Encoding categorical data
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Multiple Linear Regression to the Training set
    • Predicting the Test set results
  • Multiple Linear Regression in R - Backward Elimination
  • Multiple Linear Regression in R - Automatic Backward Elimination
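
A quick illustration of the shared workflow behind the regression chapters above: import the data, split it, fit the estimator, predict. This is a minimal sketch only; the file name Salary_Data.csv and its columns are assumed for illustration and are not taken from this project.

```python
# Minimal Simple Linear Regression sketch (file and column names are assumed).
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression

# Importing the dataset
dataset = pd.read_csv('Salary_Data.csv')      # assumed example file
X = dataset.iloc[:, :-1].values               # feature matrix (all but last column)
y = dataset.iloc[:, -1].values                # target vector (last column)

# Splitting the dataset into the Training set and Test set
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Fitting Simple Linear Regression to the Training set
regressor = LinearRegression()
regressor.fit(X_train, y_train)

# Predicting the Test set results
y_pred = regressor.predict(X_test)
```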

  • Polynomial Regression Intuition
  • Polynomial Regression in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Linear Regression to the dataset
    • Fitting Polynomial Regression to the dataset
    • Visualising the Linear Regression results
    • Visualising the Polynomial Regression results
    • Visualising the Polynomial Regression results (for higher resolution and smoother curve)
    • Predicting a new result with Linear Regression
    • Predicting a new result with Polynomial Regression
  • Polynomial Regression in R
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Linear Regression to the dataset
    • Fitting Polynomial Regression to the dataset
    • Visualising the Linear Regression results
    • Visualising the Polynomial Regression results
    • Visualising the Regression Model results (for higher resolution and smoother curve)
    • Predicting a new result with Linear Regression
    • Predicting a new result with Polynomial Regression
  • SVR Intuition
  • SVR in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting SVR to the dataset
    • Predicting a new result
    • Visualising the SVR results
    • Visualising the SVR results (for higher resolution and smoother curve)
  • SVR in R
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting SVR to the dataset
    • Predicting a new result
    • Visualising the SVR results
    • Visualising the SVR results (for higher resolution and smoother curve)
  • Decision Tree Regression Intuition
  • Decision Tree Regression in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Decision Tree Regression to the dataset
    • Predicting a new result
    • Visualising the Decision Tree Regression results (higher resolution)
  • Decision Tree Regression in R
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Decision Tree Regression to the dataset
    • Predicting a new result with Decision Tree Regression
    • Visualising the Decision Tree Regression results (higher resolution)
    • Plotting the tree
  • Random Forest Regression Intuition
  • Random Forest Regression in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Random Forest Regression to the dataset
    • Predicting a new result
    • Visualising the Random Forest Regression results (higher resolution)
  • Random Forest Regression in R
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Random Forest Regression to the dataset
    • Predicting a new result with Random Forest Regression
    • Visualising the Random Forest Regression results (higher resolution)
  • R-Squared Intuition
  • Adjusted R-Squared Intuition
  • Evaluating Regression Models Performance
  • Interpreting Linear Regression Coefficients
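
For the R-Squared and Adjusted R-Squared entries above, a small sketch of how both metrics can be computed once a regression model has produced predictions; the numbers below are made-up illustration data, not results from this project.

```python
# Sketch: R-squared and adjusted R-squared from predictions (illustrative values).
from sklearn.metrics import r2_score

y_true = [3.0, 5.0, 7.5, 9.0]   # illustrative ground-truth values
y_pred = [2.8, 5.3, 7.1, 9.4]   # illustrative model predictions

n = len(y_true)                 # number of observations
p = 2                           # number of predictors (example value)

r2 = r2_score(y_true, y_pred)
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)
print(f"R^2 = {r2:.3f}, adjusted R^2 = {adj_r2:.3f}")
```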


  • Logistic Regression Intuition
  • Logistic Regression in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Logistic Regression to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • Logistic Regression in R
    • Importing the dataset
    • Encoding the target feature as factor
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Logistic Regression to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • K-NN Intuition
  • K-NN in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting K-NN to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • K-NN in R
    • Importing the dataset
    • Encoding the target feature as factor
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting K-NN to the Training set and Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • SVM Intuition
  • SVM in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting SVM to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • SVM in R
    • Importing the dataset
    • Encoding the target feature as factor
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting SVM to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • Kernel SVM Intuition
  • Mapping to a higher dimension
  • The Kernel Trick
  • Types of Kernel Functions
  • Kernel SVM in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Kernel SVM to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • Kernel SVM in R
    • Importing the dataset
    • Encoding the target feature as factor
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Kernel SVM to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • Bayes' Theorem
  • Naive Bayes Intuition
  • Naive Bayes in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Naive Bayes to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • Naive Bayes in R
    • Importing the dataset
    • Encoding the target feature as factor
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Naive Bayes to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • Decision Tree Classification Intuition
  • Decision Tree Classification in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Decision Tree Classification to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • Decision Tree Classification in R
    • Importing the dataset
    • Encoding the target feature as factor
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Decision Tree Classification to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
    • Plotting the tree
  • Random Forest Classification Intuition
  • Random Forest Classification in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Random Forest Classification to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • Random Forest Classification in R
    • Importing the dataset
    • Encoding the target feature as factor
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Random Forest Classification to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
    • Choosing the number of trees
  • False Positives & False Negatives
  • Confusion Matrix
  • Accuracy Paradox
  • CAP Curve
  • CAP Curve Analysis
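
Every classification chapter above repeats the same template: split, scale, fit, predict, confusion matrix. A minimal sketch of that template with a logistic regression classifier; the synthetic dataset is generated here purely for illustration, and any other classifier from the list (K-NN, SVM, Naive Bayes, trees) can be swapped in.

```python
# Sketch of the shared classification template (synthetic data for illustration).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix, accuracy_score

X, y = make_classification(n_samples=400, n_features=2, n_informative=2,
                           n_redundant=0, random_state=0)

# Splitting the dataset into the Training set and Test set
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

# Feature Scaling
sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)

# Fitting the classifier to the Training set
classifier = LogisticRegression(random_state=0)
classifier.fit(X_train, y_train)

# Predicting the Test set results and Making the Confusion Matrix
y_pred = classifier.predict(X_test)
print(confusion_matrix(y_test, y_pred))
print(accuracy_score(y_test, y_pred))
```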

  • K-Means Clustering Intuition
  • K-Means Random Initialization Trap
  • K-Means Selecting The Number Of Clusters
  • K-Means Clustering in Python
    • Importing the libraries
    • Importing the dataset
    • Using the elbow method to find the optimal number of clusters
    • Fitting K-Means to the dataset
    • Visualising the clusters
  • K-Means Clustering in R
    • Importing the dataset
    • Using the elbow method to find the optimal number of clusters
    • Fitting K-Means to the dataset
    • Visualising the clusters
  • Hierarchical Clustering Intuition
  • Hierarchical Clustering: How Dendrograms Work
  • Hierarchical Clustering: Using Dendrograms
  • Hierarchical Clustering in Python
    • Importing the libraries
    • Importing the dataset
    • Using the dendrogram to find the optimal number of clusters
    • Fitting Hierarchical Clustering to the dataset
    • Visualising the clusters
  • Hierarchical Clustering in R
    • Importing the dataset
    • Using the dendrogram to find the optimal number of clusters
    • Fitting Hierarchical Clustering to the dataset
    • Visualising the clusters
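
The clustering chapters above revolve around choosing the number of clusters before fitting. A sketch of the elbow method followed by a K-Means fit, on synthetic data generated only for illustration:

```python
# Sketch: elbow method for choosing k, then fitting K-Means (synthetic data).
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans

X, _ = make_blobs(n_samples=300, centers=5, random_state=42)

# Using the elbow method to find the optimal number of clusters
wcss = []
for k in range(1, 11):
    km = KMeans(n_clusters=k, init='k-means++', n_init=10, random_state=42)
    km.fit(X)
    wcss.append(km.inertia_)    # within-cluster sum of squares
print(wcss)                     # look for the "elbow" where the decrease flattens

# Fitting K-Means to the dataset with the chosen number of clusters
kmeans = KMeans(n_clusters=5, init='k-means++', n_init=10, random_state=42)
y_kmeans = kmeans.fit_predict(X)
```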

  • Apriori Intuition
  • Apriori in Python
    • Importing the libraries
    • Data Preprocessing
    • Training Apriori on the dataset
    • Visualising the results
  • Apriori in R
    • Data Preprocessing
    • Training Apriori on the dataset
    • Visualising the results
  • Eclat Intuition
  • Eclat in R
    • Data Preprocessing
    • Training Eclat on the dataset
    • Visualising the results
  • Thompson Sampling Intuition
  • Algorithm Comparison: UCB vs Thompson Sampling
  • Thompson Sampling in Python
    • Importing the libraries
    • Importing the dataset
    • Implementing Thompson Sampling
    • Visualising the results - Histogram
  • Thompson Sampling in R
    • Importing the dataset
    • Implementing Thompson Sampling
    • Visualising the results
  • Natural Language Processing Intuition
  • Natural Language Processing in Python
    • Importing the libraries
    • Importing the dataset
    • Cleaning the texts
    • Creating the Bag of Words model
    • Splitting the dataset into the Training set and Test set
    • Fitting Naive Bayes to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
  • Natural Language Processing in R
    • Importing the dataset
    • Cleaning the texts
    • Creating the Bag of Words model
    • Importing the dataset
    • Encoding the target feature as factor
    • Splitting the dataset into the Training set and Test set
    • Fitting Random Forest Classification to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the results
  • Plan of attack
  • The Neuron
  • The Activation Function
  • How do Neural Networks work?
  • How do Neural Networks learn?
  • Gradient Descent
  • Stochastic Gradient Descent
  • Backpropagation
  • Business Problem Description
  • Artificial Neural Networks in Python
    • Installing Theano, TensorFlow and Keras
    • Part 1 - Data Preprocessing
      • Importing the libraries
      • Importing the dataset
      • Encoding categorical data
      • Splitting the dataset into the Training set and Test set
      • Feature Scaling
    • Part 2 - Making the ANN
      • Importing the Keras libraries and packages
      • Initialising the ANN
      • Adding the input layer and the first hidden layer
      • Adding the second hidden layer
      • Adding the output layer
      • Compiling the ANN
      • Fitting the ANN to the Training set
    • Part 3 - Making the predictions and evaluating the model
      • Predicting the Test set results
      • Making the Confusion Matrix
  • Artificial Neural Networks in R
    • Importing the dataset
    • Encoding the categorical variables as factors
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting ANN to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
  • Plan of attack
  • What are convolutional neural networks?
  • Step 1 - Convolution Operation
  • Step 1(b) - ReLU Layer
  • Step 2 - Pooling
  • Step 3 - Flattening
  • Step 4 - Full Connection
  • Softmax & Cross-Entropy
  • Convolutional Neural Networks in Python
    • Installing Theano, TensorFlow and Keras
    • Part 1 - Building the CNN
      • Importing the Keras libraries and packages
      • Initialising the CNN
      • Step 1 - Convolution
      • Step 2 - Pooling
      • Adding a second convolutional layer
      • Step 3 - Flattening
      • Step 4 - Full connection
      • Compiling the CNN
    • Part 2 - Fitting the CNN to the images
  • Principal Component Analysis (PCA) Intuition
  • Principal Component Analysis in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Applying PCA
    • Fitting Logistic Regression to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • Principal Component Analysis in R
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Applying PCA
    • Fitting SVM to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • Linear Discriminant Analysis (LDA) Intuition
  • Linear Discriminant Analysis in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Applying LDA
    • Fitting Logistic Regression to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • Linear Discriminant Analysis in R
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Applying LDA
    • Fitting SVM to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Visualising the Test set results
  • Kernel PCA in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Applying Kernel PCA
    • Fitting Logistic Regression to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Before Kernel PCA
    • After Kernel PCA
  • Kernel PCA in R
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Applying Kernel PCA
    • Fitting Logistic Regression to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Visualising the Training set results
    • Before Kernel PCA
    • After Kernel PCA
  • k-Fold Cross Validation in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Kernel SVM to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Applying k-Fold Cross Validation
  • Grid Search in Python
    • Importing the libraries
    • Importing the dataset
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Kernel SVM to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Applying k-Fold Cross Validation
    • Applying Grid Search to find the best model and the best parameters
  • k-Fold Cross Validation in R
    • Importing the dataset
    • Encoding the target feature as factor
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Kernel SVM to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Applying k-Fold Cross Validation
  • Grid Search in R
    • Importing the dataset
    • Encoding the target feature as factor
    • Splitting the dataset into the Training set and Test set
    • Feature Scaling
    • Fitting Kernel SVM to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Applying k-Fold Cross Validation
    • Applying Grid Search to find the best parameters
  • XGBoost in Python
    • Installing XGBoost
    • Importing the libraries
    • Importing the dataset
    • Encoding categorical data
    • Splitting the dataset into the Training set and Test set
    • Fitting XGBoost to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Applying k-Fold Cross Validation
  • XGBoost in R
    • Importing the dataset
    • Encoding the categorical variables as factors
    • Splitting the dataset into the Training set and Test set
    • Fitting XGBoost to the Training set
    • Predicting the Test set results
    • Making the Confusion Matrix
    • Applying k-Fold Cross Validation
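
The model selection chapters close the course with k-Fold Cross Validation and Grid Search. A hedged sketch of both steps wrapped around a kernel SVM; the parameter grid and the synthetic data are illustrative assumptions, not the project's own settings.

```python
# Sketch: k-Fold Cross Validation and Grid Search for a kernel SVM (synthetic data).
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score, GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=4, random_state=0)

# Applying k-Fold Cross Validation (10 folds)
classifier = SVC(kernel='rbf', random_state=0)
accuracies = cross_val_score(estimator=classifier, X=X, y=y, cv=10)
print(accuracies.mean(), accuracies.std())

# Applying Grid Search to find the best model and the best parameters
parameters = [{'C': [0.25, 0.5, 0.75, 1.0], 'kernel': ['linear']},
              {'C': [0.25, 0.5, 0.75, 1.0], 'kernel': ['rbf'],
               'gamma': [0.1, 0.3, 0.5, 0.7, 0.9]}]
grid_search = GridSearchCV(estimator=classifier, param_grid=parameters,
                           scoring='accuracy', cv=10)
grid_search.fit(X, y)
print(grid_search.best_score_, grid_search.best_params_)
```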