Algorithmia Integrates AI Model Governance with GitOps

In AI by James Kobielus

The News: Algorithmia announced integration of its machine-learning (ML) model management and source-code management capabilities with GitHub. The GitHub integration is now available to Algorithmia public users and to existing Algorithmia Enterprise customers. It also announced a new on-premises offering, Algorithmia Enterprise on VMware, that allows customers to run integrated ML/code DevOps across multicloud environments that implement VMware’s virtualization technology. Read the full Algorithmia post and Release Notes.

Analyst Take: Algorithmia’s announcement of AI model governance integration with GitOps acknowledges that Git repositories are becoming a core platform for continuous integration and continuous deployment (CI/CD) of machine learning (ML) models within enterprise application development pipelines.

Enterprises are Managing ML Development with DevOps Discipline

DevOps refers to a widely adopted set of tools, practices, and skills that organizations use to deliver software releases more continuously and rapidly without sacrificing quality, security, or consistency.

Data science is the core skillset of the next-generation developer. Increasingly, it’s also an operational IT staff function that’s necessary for managing a constant stream of updates to the machine learning, deep learning, predictive analytics, and other data-science deliverables that power recommendation engines, e-commerce chatbots, and other intelligent applications. Consequently, DevOps is just as relevant to the success of statistical modelers, data engineers, and other data science professionals as it is to programmers and other traditional developers.

To scale up to mature application-release pipelines, data science teams are adopting collaboration and workflow tools that implement DevOps practices. Data science workbenches such as Algorithmia are the focus of much ML development in the application lifecycle. These environments allow data scientists and ML engineers to automate DevOps workflows around ML models by enabling developers to connect data sources, orchestration engines, and step functions, and to deploy models from major frameworks, languages, platforms, and tools.

The latest generation of these tools is leveraging cloud-native infrastructure and interfaces, such as source-control repositories, to deploy and manage a steady stream of ML models and code builds all the way to the edge.

Source-control repositories—such as GitHub—are where all builds, logs, and other pipeline artifacts are stored, managed, and controlled throughout every step of the DevOps lifecycle. The repository serves as the hub for collaboration, reuse, and sharing of all pipeline artifacts by all participants. It provides access to the model and code libraries that data scientists and other developers incorporate in their projects. It’s the point from which consistent policies for workflow, sharing, reuse, permissioning, check-in/check-out, change tracking, versioning, training, configuration, deployment, monitoring, testing, evaluation, performance assessment, and quality assurance are enforced across all pipeline artifacts.

ML Models are Integral to More Application Development Practices

GitOps is an emerging practice for automating software release workflows in cloud-native application environments. Under GitOps, DevOps teams store and manage every application artifact in a Git repository. This generally includes all policies, code, configuration, and events that are integral to an application’s design, as well as ML models that are integral to deployed AI applications.

This approach requires a Git repository manager with built-in CI/CD capabilities. These capabilities should encompass end-to-end planning, creation, configuration, verification, packaging, integration, deployment, monitoring, and management of every artifact that will be integral to production applications.

This is the paradigm that Algorithmia is enabling through the new integration of its ML model management and source-code management capabilities with GitHub. Connecting their Algorithmia and GitHub accounts enables developers to store and manage their source code on GitHub and deploy it directly to Algorithmia to be integrated with their ML models. This allows developers to leverage existing GitHub DevOps workflows, access GitHub Actions and all other GitHub features, and ensure that source code visibility is restricted to those with proper GitHub permissions.
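To make that workflow concrete, here is a minimal sketch of what the GitHub-hosted source behind an Algorithmia algorithm might look like: a Python entry point exposing the apply() function that Algorithmia invokes on each API call. The file name, model path, and feature handling are hypothetical placeholders, and the snippet assumes the Algorithmia Python client and its hosted-data API for fetching a stored model.

```python
# src/churn_classifier.py -- hypothetical algorithm source hosted in a GitHub repo
# that Algorithmia builds and deploys whenever the linked repository is updated.
import pickle

import Algorithmia

client = Algorithmia.client()

# Assumed: a serialized model stored in an Algorithmia hosted-data collection.
MODEL_PATH = "data://.my/models/churn_classifier.pkl"


def load_model():
    # Download the model file once at load time so each API call stays fast.
    local_path = client.file(MODEL_PATH).getFile().name
    with open(local_path, "rb") as f:
        return pickle.load(f)


model = load_model()


def apply(input):
    # Algorithmia calls apply() with the JSON payload sent to the API endpoint.
    features = [input["tenure_months"], input["monthly_spend"]]
    score = model.predict_proba([features])[0][1]
    return {"churn_probability": float(score)}
```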

Developers can host source code on the public version of Algorithmia, in their own private Algorithmia Enterprise instance, or in their preferred local repository. They can use GitHub Actions to automate software reviews as well as build, test, tracking, dependency-management, release, and auditing workflows. And they can set up Azure Pipelines or other integrations by linking Algorithmia with GitHub repositories.
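For instance, once the repositories are linked, a GitHub Actions or Azure Pipelines job could run a short post-deployment smoke test along these lines. This is only a sketch: the algorithm name, version, payload, and expected output are hypothetical, and it assumes the Algorithmia Python client with an API key injected into CI as an environment variable.

```python
# tests/test_smoke.py -- hypothetical post-deployment smoke test a CI job could run.
import os

import Algorithmia

# Hypothetical algorithm reference: account/name/version of the deployed model.
ALGO_REF = "my_org/churn_classifier/0.2.1"


def test_deployed_endpoint_responds():
    # Assumed: the CI workflow exposes the API key as a secret environment variable.
    client = Algorithmia.client(os.environ["ALGORITHMIA_API_KEY"])
    algo = client.algo(ALGO_REF)

    result = algo.pipe({"tenure_months": 12, "monthly_spend": 79.0}).result

    # The endpoint should return a probability between 0 and 1.
    assert 0.0 <= result["churn_probability"] <= 1.0
```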

Integrated ML/code DevOps is Spanning Multiclouds

GitOps is becoming the predominant approach in open application ecosystems that span public, private, hybrid, and multicloud environments.

VMware Inc., the 20-year-old virtualization pioneer, has become the go-to partner in the multicloud market. In the race to migrate enterprise workloads from on-premises platforms to their respective clouds, the leading public cloud providers—Amazon Web Services, Microsoft Azure, and Google Cloud Platform—have each enlisted VMware as their strategic partner to help customers “lift and shift” their application workloads to complex cloud environments.

In the same announcement as its GitHub integration, Algorithmia announced a new offering, Algorithmia Enterprise on VMware, which allows customers to run integrated ML/code DevOps across multicloud environments that implement VMware’s cloud virtualization technology. The new solution is deployable on existing VMware on-premises compute and storage infrastructure or on new on-premises infrastructure that has been integrated with any supported multicloud environment. To create end-to-end GitOps and other DevOps workflows across multiclouds, users can set up GitHub Actions linked with GitHub repositories, use Azure Pipelines, or add other integrations.

In addition, the vendor announced new model auditing tools that support detailed analysis of how its ML pipeline environment, at the granular level of individual models and function calls, consumes hardware resources (CPUs and GPUs) across multicloud Algorithmia GitOps/DevOps deployments.

The Takeaway – Algorithmia Brings the Machine Learning Pipeline into GitOps Workflows

GitHub is central infrastructure for modern application collaboration, development, and deployment processes. With this announcement, Algorithmia has made it easier to use GitHub to break down the silos that have traditionally kept ML developers and application coders from integrating tightly within today’s continuous DevOps workflows.

My take is that Algorithmia understands exactly what today’s data scientists need to be effective in larger enterprise application development teams. Hubbed on Git repositories of the user’s choice, the new offering allows multiple developers to contribute to the same ML model, collaborate on a centralized codebase, and engage in code reviews, version control, pull requests, and issue tracking. They can easily Git-push their pre-trained model, functions, or algorithm into production, while automatically creating a versioned, permissioned, scalable API endpoint that any application or model can call.
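As a rough illustration of what such an endpoint looks like to a calling application, the sketch below posts JSON to a versioned Algorithmia REST URL. The account, algorithm name, version, and payload are hypothetical, and the URL pattern and Simple-keyed Authorization header are assumptions based on Algorithmia’s published API conventions rather than a guaranteed contract.

```python
# Hypothetical caller: any application can invoke the published, versioned model endpoint.
import requests

API_KEY = "YOUR_ALGORITHMIA_API_KEY"  # placeholder; issued per user or per application
# Assumed URL pattern: /v1/algo/<account>/<algorithm>/<version>
ENDPOINT = "https://api.algorithmia.com/v1/algo/my_org/churn_classifier/0.2.1"

response = requests.post(
    ENDPOINT,
    json={"tenure_months": 12, "monthly_spend": 79.0},
    headers={"Authorization": "Simple " + API_KEY},
    timeout=30,
)
response.raise_for_status()

# Algorithmia wraps the algorithm's return value in a "result" field.
print(response.json()["result"])
```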

Going forward, I hope that Algorithmia will also support other common cloud-native source repositories—especially Bitbucket, CloudForge, and SourceForge—for storing and managing source code and deploying it to Algorithmia for integration with ML models. Considering the cross-platform complexity of many multicloud environments, there are likely to be more hybridized repository environments that will need to be supported in the future.

Futurum Research provides industry research and analysis. These columns are for educational purposes only and should not be considered in any way investment advice.

Image Credit: Algorithmia

The original version of this article was first published on Futurum Research.

James has held analyst and consulting positions at SiliconANGLE/Wikibon, Forrester Research, Current Analysis and the Burton Group. He is an industry veteran, having held marketing and product management positions at IBM, Exostar, and LCC. He is a widely published business technology author, has published several books on enterprise technology, and contributes regularly to InformationWeek, InfoWorld, Datanami, Dataversity, and other publications.
