ML Algorithms Implementation
Rohan and I started coding a Python library with basic ML algorithms; it took off, and we ended up with over 30 contributors from all over the world.
Oh, the good old college days... This is one of my first big projects, built during my 2nd year of college. Rohan and I were learning ML on our own, because why not, and we thought it would be cool to create our own Python library with all the basic ML algorithms.
The idea really took off, and a bunch of our friends soon joined in. At some point we made it official and opened the library up for open-source contributions. With over 30 contributors from all over the world, we were able to put together a pretty decent library covering all the basic ML algorithms.
The project implements algorithms across several categories: activation functions, optimizers, models, backend utilities, pre-processing methods, loss functions, regularizers, and metrics.
Activation functions such as Sigmoid, Tanh, Softmax, Softsign, ReLU, Leaky ReLU, ELU, Swish, and Unit Step are implemented in activations.py.
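For a flavor of what these look like, here is a minimal NumPy sketch of a few of them (the names and signatures are illustrative, not the library's exact API):

```python
import numpy as np

def sigmoid(x):
    # The logistic function: 1 / (1 + e^-x)
    return 1.0 / (1.0 + np.exp(-x))

def leaky_relu(x, alpha=0.01):
    # Pass positives through unchanged, scale negatives by alpha
    return np.where(x > 0, x, alpha * x)

def softmax(x):
    # Shift by the row max for numerical stability before exponentiating
    e = np.exp(x - np.max(x, axis=-1, keepdims=True))
    return e / np.sum(e, axis=-1, keepdims=True)
```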
Optimizers like Gradient Descent, Stochastic Gradient Descent, Mini Batch Gradient Descent, Momentum Gradient Descent, Nesterov Accelerated Descent, Adagrad, Adadelta, and Adam are available in optimizers.py.
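The simplest of these, plain gradient descent, just steps opposite the gradient until convergence, and momentum adds a decaying average of past gradients. A toy sketch (again illustrative, not the library's actual interface):

```python
import numpy as np

def gradient_descent(grad_fn, w0, lr=0.1, steps=100):
    # Vanilla gradient descent: w <- w - lr * grad(w), repeated
    w = np.asarray(w0, dtype=float)
    for _ in range(steps):
        w = w - lr * grad_fn(w)
    return w

def momentum_gd(grad_fn, w0, lr=0.1, beta=0.9, steps=100):
    # Momentum: accumulate a decaying average of past gradients
    w, v = np.asarray(w0, dtype=float), 0.0
    for _ in range(steps):
        v = beta * v - lr * grad_fn(w)
        w = w + v
    return w

# Minimize f(w) = (w - 3)^2, whose gradient is 2 * (w - 3)
print(gradient_descent(lambda w: 2 * (w - 3), w0=0.0))  # ~3.0
print(momentum_gd(lambda w: 2 * (w - 3), w0=0.0))       # ~3.0
```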
The models, located in models.py, include Linear Regression, Logistic Regression, Decision Tree Classifier, KNN Classifier/Regressor, Naive Bayes (Gaussian, Multinomial, and Bernoulli), Random Forest Classifier, K-Means Clustering, Divisive Clustering, Agglomerative Clustering, Bayes Optimization, Numerical Outliers, Principal Component Analysis, Z-Score, and Sequential Neural Networks.
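KNN is a nice example of how simple some of these models are at their core: classify a point by majority vote among its k nearest training points. A hypothetical standalone sketch, not the library's actual model class:

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    # Majority vote among the k nearest neighbors of x
    dists = np.linalg.norm(X_train - x, axis=1)  # Euclidean distances
    nearest = np.argsort(dists)[:k]              # indices of the k closest
    return Counter(y_train[nearest]).most_common(1)[0][0]

X = np.array([[0, 0], [0, 1], [5, 5], [6, 5]])
y = np.array([0, 0, 1, 1])
print(knn_predict(X, y, np.array([5, 6])))  # -> 1
```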
The backend utilities include Autograd in autograd.py, Tensor operations in tensor.py, and Functional operations in functional.py.
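Autograd is the fun one: record the operations applied to each value, then walk the resulting graph backwards applying the chain rule. A toy scalar-only version in the same spirit (not the library's actual implementation) fits in a few dozen lines:

```python
class Value:
    # A scalar that remembers the ops that produced it, so gradients
    # can be propagated backwards through the resulting graph.
    def __init__(self, data, _parents=()):
        self.data = data
        self.grad = 0.0
        self._parents = _parents
        self._backward = lambda: None

    def __add__(self, other):
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad           # d(a+b)/da = 1
            other.grad += out.grad          # d(a+b)/db = 1
        out._backward = _backward
        return out

    def __mul__(self, other):
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad  # d(a*b)/da = b
            other.grad += self.data * out.grad  # d(a*b)/db = a
        out._backward = _backward
        return out

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse
        order, seen = [], set()
        def visit(v):
            if v not in seen:
                seen.add(v)
                for p in v._parents:
                    visit(p)
                order.append(v)
        visit(self)
        self.grad = 1.0
        for v in reversed(order):
            v._backward()

a, b = Value(2.0), Value(3.0)
c = a * b + a          # c = 2*3 + 2 = 8
c.backward()
print(a.grad, b.grad)  # dc/da = b + 1 = 4.0, dc/db = a = 2.0
```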
Pre-processing methods such as Bell Curve, Standard Scaler, MaxAbs Scaler, Z Score Normalization, Mean Normalization, Min-Max Normalization, and Feature Clipping are defined in preprocessor_utils.py.
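Most of these amount to a line or two over NumPy arrays. For example, Min-Max Normalization and the Standard Scaler might look like this (hypothetical sketches, not the library's exact functions):

```python
import numpy as np

def min_max_normalize(X):
    # Rescale each feature column to the [0, 1] range
    lo, hi = X.min(axis=0), X.max(axis=0)
    return (X - lo) / (hi - lo)

def standard_scale(X):
    # Center each feature at mean 0 with standard deviation 1 (z-score)
    return (X - X.mean(axis=0)) / X.std(axis=0)

X = np.array([[1.0, 10.0], [2.0, 20.0], [3.0, 30.0]])
print(min_max_normalize(X))  # each column mapped onto [0, 1]
```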
Loss functions include Mean Squared Error, Logarithmic Error, Absolute Error, Cosine Similarity, Log Cosh, and Huber, implemented in loss_func.py. Regularizers like L1 and L2 are in regularizer.py.
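As a rough sketch of what a couple of these amount to (illustrative names, not the library's API):

```python
import numpy as np

def mse(y_true, y_pred):
    # Mean Squared Error: average of squared residuals
    return np.mean((y_true - y_pred) ** 2)

def huber(y_true, y_pred, delta=1.0):
    # Quadratic for small residuals, linear for large ones
    r = np.abs(y_true - y_pred)
    return np.mean(np.where(r <= delta,
                            0.5 * r ** 2,
                            delta * (r - 0.5 * delta)))

def l2_penalty(weights, lam=0.01):
    # L2 regularization term added onto a loss: lam * sum(w^2)
    return lam * np.sum(weights ** 2)
```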
Metrics including Confusion Matrix, Precision, Accuracy, Recall, and F1 Score are available in metrics.py.
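These all fall out of the confusion-matrix counts. A minimal binary-classification sketch (assumed names, not the library's exact functions):

```python
import numpy as np

def precision_recall_f1(y_true, y_pred):
    # Count true positives, false positives, and false negatives
    tp = np.sum((y_true == 1) & (y_pred == 1))
    fp = np.sum((y_true == 0) & (y_pred == 1))
    fn = np.sum((y_true == 1) & (y_pred == 0))
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

y_true = np.array([1, 0, 1, 1, 0])
y_pred = np.array([1, 0, 0, 1, 1])
print(precision_recall_f1(y_true, y_pred))  # (0.666..., 0.666..., 0.666...)
```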