In 2022, we are celebrating the 10th anniversary of [at] – Alexander Thamm.
10 years ago, we were the first consultancy in the German-speaking area to take up the cause of Data & AI. Today, we can say that artificial intelligence has the potential to make an important contribution to solving some of the major economic and social challenges of our time: AI plays a role in the energy transition and the fight against climate change, in autonomous driving, in the detection and treatment of diseases, and in pandemic control. AI increases the efficiency of production processes and makes companies more adaptable to market changes by delivering real-time information and predictions.
The economic significance of this technology is growing rapidly: More than two-thirds of German companies now use artificial intelligence and machine learning.
With #AITOP10, we show you what’s hot right now in the field of Data & AI. Our TOP10 lists present podcast highlights, industry-specific AI trends, AI experts, tool recommendations, and much more. You get a broad cross-section of the Data & AI universe that has been driving us for 10 years now.
Enjoy the read – and feel free to expand the list!
10th place Metaflow
Metaflow is a Python framework developed by Netflix and released as an open-source project in December 2019. It addresses some of the challenges data scientists face around scalability and version control, speeding up the time-to-production of ML projects. Metaflow snapshots code, data, and dependencies in a content-addressed datastore, allowing you to resume workflows, reproduce past results, and inspect workflows in notebooks. The framework also solves scalability issues: each step can run on AWS on a separate node with its own dependencies, without you having to worry about inter-node communication. Metaflow is useful if you plan to manage, deploy, and run your ML workflow in a production environment, especially on AWS.
9th place H2O
The H2O platform solves many well-known difficulties of ML model development and deployment in one package. It provides off-the-shelf ML algorithms like GLM, XGBoost, random forest, DNN, GAM, or k-means, and is easily extensible with Hadoop, Spark, and other frameworks. Furthermore, it provides an intuitive UI and AutoML functionality to speed up ML model development by automating feature engineering, model training, model tuning, and common boilerplate code. With support for many programming languages and MLOps capabilities for advanced engineers, H2O is widely used by big enterprises for many use cases in insurance, finance, and healthcare.
8th place Apache MXNet
This lightweight deep-learning framework, developed under the umbrella of the Apache Software Foundation and adopted by Amazon as the deep-learning framework of choice for AWS, is suited for flexible research prototyping and production. MXNet is used to define, train, and deploy neural networks. It provides an ultra-scalable infrastructure and a flexible neural network model that can be implemented in a variety of programming languages. Designed from the ground up to perform on cloud infrastructure, the framework can be used to distribute deep learning workloads across multiple CPUs or GPUs with near-linear scalability. Its frontend provides the Gluon API, which offers a full set of plug-and-play neural network building blocks, including predefined layers, optimizers, and initializers.
7th place FastAI
Developing ML model prototypes is often time-consuming, especially when building a new model from scratch. As the name “FastAI” suggests, the framework speeds up neural network model development. With a high-level API, built-in algorithms, and well-thought-out defaults, practitioners can achieve working ML model prototypes in less time. When digging deeper, the framework also offers capabilities for researchers and heavy users to achieve sophisticated, fine-tuned, and optimized ML models by using the low-level API. It enables practitioners to achieve state-of-the-art results effortlessly and simplifies deep experimentation for researchers. The goal of the non-profit organization behind the framework is to “make neural networks uncool again”. This is not to diminish the popularity of neural networks, but to expand the accessibility of the technology beyond the academic elite and experts.
6th place XGBoost
If you are working with structured or tabular data, a decision-tree-based algorithm should be part of your shortlist. Consider the XGBoost framework, as it combines software and hardware optimization techniques to speed up the gradient boosted trees algorithm. With APIs in Python, C++, R, Java, Scala, and Julia, the framework supports different implementations of gradient boosted trees and can run on CPUs, GPUs, and distributed computing resources. Having powered the winning entries of many Kaggle competitions, XGBoost can be thought of as gradient boosting on steroids. The combination of hardware optimization, parallelization, and cloud integration offers great possibilities to speed up decision-tree-based ML tasks.
5th place Apache Spark
If you are searching for a one-size-fits-all solution rather than a highly specialized algorithmic framework, then hang on. Being one of the biggest open-source projects in history, Apache Spark is a unified computing engine for large-scale data analytics. As one of the most actively developed open-source engines, Spark is a prominent tool for any developer or data scientist interested in big data. The engine supports multiple widely used programming languages and includes libraries for diverse tasks ranging from SQL to streaming and machine learning. With Spark MLlib, the engine offers diverse machine learning algorithms and workflow tools to experiment, deploy, and scale your ML model using Spark’s high-speed computing engine. Apache Spark runs anywhere from a laptop’s CPU to a cluster of thousands of servers and is therefore an easy system to start with and scale up to big data processing.
4th place Scikit-Learn
Especially for beginners, Scikit-Learn is an easy-to-master framework for predictive data analysis. The high-level framework with a set of simple and efficient tools for data mining and data analysis is considered incredibly robust and easy to understand. It is built on top of several popular Python packages, namely NumPy, SciPy, and matplotlib. Its consistent estimator API also helps newcomers get a grasp of the typical machine learning workflow. Considering its simplicity and many well-described examples, it’s an accessible tool for non-experts and new ML engineers, enabling quick application of machine learning algorithms to data.
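The typical workflow mentioned above fits in a few lines; this sketch uses the bundled Iris dataset and a simple scaler-plus-classifier pipeline as one example among many.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Load data and hold out a test split.
X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Chain preprocessing and model into one estimator with fit/predict/score.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_tr, y_tr)

score = clf.score(X_te, y_te)  # accuracy on the held-out data
```

Because every estimator shares the same `fit`/`predict` interface, swapping the logistic regression for a random forest or an SVM is a one-line change.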
3rd place Keras
Neural nets can be tough to develop, especially when just starting out with deep learning. The Keras framework provides a Python API for artificial neural networks and a suitable way to define and train nearly any kind of deep-learning model. Originally developed for scholars and researchers, the framework is meant to reduce the time from idea to experiment with neural networks. Keras offers consistent and simple APIs, clear error messages that ease debugging, and reduces the actions needed for common use cases. With the performance of TensorFlow and a high-level API designed for faster experimentation, Keras is an industry-strength framework used by many researchers and multimillion-dollar companies.
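A minimal sketch of the define-compile-fit cycle; the layer sizes, synthetic data, and labeling rule are invented for the example.

```python
import numpy as np
from tensorflow import keras

# A tiny fully-connected classifier built with the Sequential API.
model = keras.Sequential([
    keras.Input(shape=(20,)),
    keras.layers.Dense(32, activation="relu"),
    keras.layers.Dense(2, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Synthetic data: label 1 when the feature sum exceeds its mean of 10.
X = np.random.rand(128, 20).astype("float32")
y = (X.sum(axis=1) > 10).astype("int32")

model.fit(X, y, epochs=3, batch_size=32, verbose=0)
probs = model.predict(X, verbose=0)  # class probabilities, shape (128, 2)
```

Those four calls — build, `compile`, `fit`, `predict` — cover the common case; callbacks and custom training loops are there when you outgrow them.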
2nd place PyTorch
In your search for a flexible and highly scalable deep learning framework, PyTorch might have crossed your path already – for a good reason. One of the main advantages of using PyTorch for deep learning is its use of dynamic computation graphs. While static computational graphs, like those used in TensorFlow, are defined prior to runtime, dynamic graphs are defined “on the fly” via the forward computation. In other words, the graph is rebuilt from scratch on every iteration. This is especially helpful when defining recurrent neural networks, for example in NLP use cases with variable input sequence lengths. As PyTorch is designed for a low-level environment, it provides significant levels of flexibility and power, though it can be challenging for a beginner.
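The dynamic-graph idea can be shown with a toy recurrence; the `tanh` update rule and the single weight below are illustrative stand-ins for a real recurrent cell.

```python
import torch

w = torch.tensor(0.5, requires_grad=True)

def forward(xs):
    # The computation graph is built step by step as this Python loop runs,
    # so each call can process a sequence of a different length with no
    # padding and no ahead-of-time graph definition.
    h = torch.tensor(0.0)
    for x in xs:
        h = torch.tanh(w * h + x)
    return h

loss = forward(torch.tensor([0.5, -0.2, 0.1]))  # a length-3 sequence
loss.backward()  # autograd walks back through exactly 3 unrolled steps
```

Calling `forward` again with a length-7 sequence builds a fresh 7-step graph, which is exactly the "rebuilt from scratch on every iteration" behavior described above.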
1st place TensorFlow
Developed by the Google Brain team and open-sourced in 2015, TensorFlow has grown into the most widely adopted end-to-end platform for machine learning. Beyond model building through its high-level Keras API, the ecosystem covers the full production lifecycle: TensorBoard for visualization, TensorFlow Serving for deployment, TensorFlow Lite for mobile and edge devices, and TensorFlow.js for the browser. With support for CPUs, GPUs, and TPUs and a huge community, TensorFlow remains the go-to choice for taking deep learning models from research into large-scale production.
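At its core sits automatic differentiation, which TensorFlow exposes directly through `tf.GradientTape`; this tiny sketch differentiates an arbitrary example function.

```python
import tensorflow as tf

# A trainable scalar and a function y = x^2 + 2x recorded on the tape.
x = tf.Variable(3.0)
with tf.GradientTape() as tape:
    y = x ** 2 + 2.0 * x

grad = tape.gradient(y, x)  # dy/dx = 2x + 2, evaluated at x = 3
print(float(grad))  # 8.0
```

Everything from Keras's `fit` loop to custom research training loops is built on this mechanism, which is why the same framework spans beginners and production teams alike.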
What is your favourite framework to work with when it comes to ML and Deep Learning?
Share your favourite with us below this article!