Supply Chain & Product Life Cycle Augmentation
Deep Learning Algorithms:
– Deep Boltzmann Machine (DBM)
– Deep Belief Networks (DBN)
– Convolutional Neural Network (CNN)
– Stacked Auto-Encoders
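A minimal sketch of one entry from this group, a small convolutional network, assuming TensorFlow/Keras is available; the MNIST data, layer sizes, and single training epoch are illustrative choices, not part of the original list:

```python
# Minimal CNN sketch (assumes TensorFlow/Keras is installed; MNIST is illustrative).
import tensorflow as tf
from tensorflow.keras import layers

(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train[..., None].astype("float32") / 255.0   # add channel dim, scale to [0, 1]

model = tf.keras.Sequential([
    layers.Conv2D(16, 3, activation="relu", input_shape=(28, 28, 1)),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(x_train, y_train, epochs=1, batch_size=128)
```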
Artificial Neural Network Algorithms:
– Radial Basis Function Network (RBFN)
– Perceptron
– Backpropagation
– Hopfield Network
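A minimal sketch of two of the network models above with scikit-learn; the synthetic dataset and hyperparameters are illustrative assumptions:

```python
# Sketch of classic neural-network models with scikit-learn (illustrative data).
from sklearn.datasets import make_classification
from sklearn.linear_model import Perceptron
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Single-layer perceptron (linear threshold unit).
clf = Perceptron(max_iter=1000, random_state=0).fit(X, y)
print("Perceptron accuracy:", clf.score(X, y))

# Multi-layer network trained with backpropagation.
mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0).fit(X, y)
print("MLP (backprop) accuracy:", mlp.score(X, y))
```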
Ensemble Algorithms:
– Random Forest
– Gradient Boosting Machines (GBM)
– Boosting
– Bootstrap Aggregation (Bagging)
– AdaBoost
– Stacked Generalization (Blending)
– Gradient Boosted Regression Trees (GBRT)
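A minimal sketch of two ensemble methods from this group (Random Forest and gradient boosting) with scikit-learn; the synthetic data and settings are illustrative assumptions:

```python
# Sketch of two ensemble methods on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Bagging of decision trees (Random Forest).
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("Random Forest accuracy:", rf.score(X_test, y_test))

# Sequential boosting of shallow trees (GBM / GBRT).
gbm = GradientBoostingClassifier(random_state=0).fit(X_train, y_train)
print("Gradient Boosting accuracy:", gbm.score(X_test, y_test))
```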
Regularization Algorithms:
– Ridge Regression
– Least Absolute Shrinkage and Selection Operator (LASSO)
– Elastic Net
– Least Angle Regression (LARS)
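A minimal sketch of the regularized linear models above with scikit-learn; the alpha values and synthetic data are illustrative assumptions:

```python
# Sketch of Ridge, LASSO, and Elastic Net on synthetic regression data.
from sklearn.datasets import make_regression
from sklearn.linear_model import Ridge, Lasso, ElasticNet

X, y = make_regression(n_samples=200, n_features=30, noise=5.0, random_state=0)

for model in (Ridge(alpha=1.0), Lasso(alpha=0.1), ElasticNet(alpha=0.1, l1_ratio=0.5)):
    model.fit(X, y)
    print(type(model).__name__, "R^2:", round(model.score(X, y), 3))
```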
Rule-Based Algorithms:
– Cubist
– One Rule (OneR)
– Zero Rule (ZeroR)
– Repeated Incremental Pruning to Produce Error Reduction (RIPPER)
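These rule learners are not bundled with scikit-learn, so a hand-rolled ZeroR baseline serves as a minimal sketch here; the function name and toy labels are hypothetical:

```python
# Hand-rolled ZeroR baseline (illustrative; the other rule learners need dedicated packages).
import numpy as np

def zero_r(y_train):
    """ZeroR: ignore the features and always predict the majority class."""
    values, counts = np.unique(y_train, return_counts=True)
    return values[np.argmax(counts)]

y_train = np.array(["ham", "ham", "spam", "ham", "spam"])
majority = zero_r(y_train)
print("ZeroR predicts:", majority)       # "ham"
print([majority for _ in range(3)])      # same answer for every new example
```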
Regression Algorithms:
– Linear Regression
– Ordinary Least Squares Regression (OLSR)
– Stepwise Regression
– Multivariate Adaptive Regression Splines (MARS)
– Locally Weighted Scatterplot Smoothing (LOWESS)
– Logistic Regression
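A minimal sketch of ordinary least squares and logistic regression with scikit-learn; the synthetic datasets are illustrative assumptions:

```python
# Sketch of OLS and logistic regression on synthetic data.
from sklearn.datasets import make_regression, make_classification
from sklearn.linear_model import LinearRegression, LogisticRegression

X_r, y_r = make_regression(n_samples=200, n_features=5, noise=10.0, random_state=0)
ols = LinearRegression().fit(X_r, y_r)        # ordinary least squares fit
print("OLS R^2:", round(ols.score(X_r, y_r), 3))

X_c, y_c = make_classification(n_samples=200, n_features=5, random_state=0)
logit = LogisticRegression().fit(X_c, y_c)    # logistic regression for class labels
print("Logistic regression accuracy:", round(logit.score(X_c, y_c), 3))
```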
Bayesian Algorithms:
– Naive Bayes
– Averaged One-Dependence Estimators (AODE)
– Bayesian Belief Network (BBN)
– Gaussian Naive Bayes
– Multinomial Naive Bayes
– Bayesian Network (BN)
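A minimal sketch of Gaussian Naive Bayes with scikit-learn; the Iris dataset and train/test split are illustrative choices:

```python
# Sketch of Gaussian Naive Bayes on the Iris data.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

nb = GaussianNB().fit(X_train, y_train)   # class-conditional Gaussians per feature
print("Gaussian Naive Bayes accuracy:", round(nb.score(X_test, y_test), 3))
```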
Decision Tree Algorithms:
– Classification and Regression Tree (CART)
– Iterative Dichotomiser 3 (ID3)
– C4.5
– C5.0
– Chi-squared Automatic Interaction Detection (CHAID)
– Decision Stump
– Conditional Decision Trees
– M5
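A minimal sketch of a CART-style tree with scikit-learn (whose DecisionTreeClassifier implements an optimized CART); the Iris data and depth limit are illustrative choices:

```python
# Sketch of a CART-style decision tree, printed as text rules.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=load_iris().feature_names))
```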
Dimensionality Reduction Algorithms:
– Principal Component Analysis (PCA)
– Partial Least Squares Regression (PLSR)
– Sammon Mapping
– Multidimensional Scaling (MDS)
– Projection Pursuit (PP)
– Principal Component Regression (PCR)
– Partial Least Squares Discriminant Analysis
– Mixture Discriminant Analysis (MDA)
– Quadratic Discriminant Analysis (QDA)
– Flexible Discriminant Analysis (FDA)
– Linear Discriminant Analysis (LDA)
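A minimal sketch of PCA and LDA with scikit-learn, projecting the Iris data down to two components; the dataset and component count are illustrative choices:

```python
# Sketch of unsupervised (PCA) and supervised (LDA) projections.
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

X, y = load_iris(return_X_y=True)

X_pca = PCA(n_components=2).fit_transform(X)                             # ignores labels
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)   # uses labels
print(X_pca.shape, X_lda.shape)                                          # (150, 2) (150, 2)
```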
Instance-Based Algorithms:
– k-Nearest Neighbors (kNN)
– Learning Vector Quantization (LVQ)
– Self-Organizing Map (SOM)
– Locally Weighted Learning (LWL)
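A minimal sketch of k-Nearest Neighbors with scikit-learn; k = 5 and the Iris data are illustrative choices:

```python
# Sketch of kNN classification.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print("kNN accuracy:", round(knn.score(X_test, y_test), 3))
```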
Clustering Algorithms:
– k-Means
– k-Medians
– Expectation Maximization (EM)
– Hierarchical Clustering
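A minimal sketch of k-Means with scikit-learn on synthetic blobs; assuming the number of clusters is known in advance is an illustrative simplification:

```python
# Sketch of k-Means clustering on synthetic data.
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

X, _ = make_blobs(n_samples=300, centers=3, random_state=0)
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("Cluster centers:\n", km.cluster_centers_)
print("First ten labels:", km.labels_[:10])
```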