Charmed Kubeflow now integrates with MindSpore
Canonical
on 8 November 2022
Tags: AI/ML , Charmed Kubeflow , deep learning , Huawei , Kubeflow , MLOps , Open source
The integration allows users to leverage deep learning for AI/ML projects within the MLOps platform
On 8 November 2022, at Open Source Experience Paris, Canonical announced that Charmed Kubeflow, Canonical’s enterprise-ready Kubeflow distribution, now integrates with MindSpore, a deep learning framework open-sourced by Huawei.
Charmed Kubeflow is an end-to-end MLOps platform, designed for use with Kubernetes, with capabilities optimised for training complex models. The new, native integration with MindSpore joins a robust set of existing integrations and provides access to unified APIs and end-to-end AI capabilities for model development, execution and deployment. MindSpore offers AI-native execution modes that make full use of the computing power of Huawei's hardware.
MindSpore: an innovative AI framework with wide industry adoption
Industry partners and developers have actively participated in MindSpore's development since it was first open sourced in 2020. MindSpore serves over 5,000 businesses and has been downloaded over 2.49 million times. More than 6,600 developers have contributed code to the project, and more than 110 universities and 40 top research institutes have used MindSpore for scientific research and teaching.
MindSpore is optimised for Huawei hardware and supports Ascend, GPU and CPU targets, and it can be quickly deployed in cloud, edge and mobile scenarios, allowing engineers to develop solutions for a wide spectrum of use cases. MindSpore runs in Jupyter Notebook, a web-based interactive computing platform. When creating a notebook, engineers can choose the MindSpore image from the default JupyterLab image list, assign CPU resources and spin up a notebook instance that runs MindSpore and its Vision suite for machine learning experiments.
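As a rough illustration of the kind of experiment such a notebook can run, here is a minimal sketch that builds a small fully connected network and performs one forward pass; the device target, shapes and layer sizes are illustrative only, and it assumes mindspore and numpy are available in the notebook image.

```python
# Minimal MindSpore sketch: a tiny multilayer perceptron and one forward pass.
# Illustrative only; shapes, sizes and the CPU target are arbitrary choices.
import numpy as np
import mindspore as ms
from mindspore import nn, Tensor

ms.set_context(device_target="CPU")  # "GPU" or "Ascend" where available

# A small fully connected network built from MindSpore cells.
net = nn.SequentialCell([
    nn.Dense(10, 32),
    nn.ReLU(),
    nn.Dense(32, 2),
])

x = Tensor(np.random.rand(4, 10).astype(np.float32))
print(net(x).shape)  # (4, 2)
```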
Get started using our practical guide.
Robust integrations for MLOps at scale
By providing a native integration with MindSpore and collaborating with Huawei on this open-source project, Canonical is providing the MLOps ecosystem with an increasingly rich toolset. Charmed Kubeflow integrates with multiple AI/ML tools, such as MLflow, which provides a central model registry, and Spark, which facilitates data streaming. The MindSpore integration provides access to the solution's various features, such as unified programming and operators, automatic model partitioning, and dynamic and static computation graphs.
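To make the last point concrete, here is a minimal, hedged sketch of switching between MindSpore's dynamic (PyNative) and static (Graph) execution modes; it is not taken from the integration itself, and the network and input are placeholders.

```python
# Minimal sketch of MindSpore's dynamic and static computation graph modes.
# The same cell runs in both; the layer and input below are placeholders.
import numpy as np
import mindspore as ms
from mindspore import nn, Tensor

net = nn.Dense(4, 2)
x = Tensor(np.ones((1, 4), np.float32))

ms.set_context(mode=ms.PYNATIVE_MODE)  # dynamic graph: operators execute eagerly
print(net(x))

ms.set_context(mode=ms.GRAPH_MODE)     # static graph: compiled before execution
print(net(x))
```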
Learn more at the Open Source Experience conference in Paris
Canonical and Huawei will be present at Open Source Experience in Paris on 8-9 November 2022. Schedule a meeting to learn more about MindSpore and Charmed Kubeflow. Event attendees can see a demo of a real use case that benefits from the integration, running on top of Atlas 500 Pro AI Edge Servers.
Read more about Kubeflow at Open Source Experience and meet us at booth E32
About Canonical
Canonical is the publisher of Ubuntu, the leading operating system for container, cloud and hyperscale computing. Ubuntu is the OS for most public cloud workloads as well as the emerging categories of smart gateways, self-driving cars and advanced robots. Canonical provides enterprise security, support, and services to commercial users of Ubuntu. Established in 2004, Canonical is a privately held company. For more information on Canonical and Ubuntu, visit www.canonical.com and www.ubuntu.com.
Run Kubeflow anywhere, easily
With Charmed Kubeflow, deployment and operations of Kubeflow are easy for any scenario.
Charmed Kubeflow is a collection of Python operators that define the integration of the apps inside Kubeflow, like katib or pipelines-ui.
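For illustration only, here is a minimal sketch of what such a Python operator (a Juju charm built on the ops framework) can look like; the charm name and event handling are hypothetical and do not reproduce the actual katib or pipelines-ui charms.

```python
# Hypothetical, minimal charm skeleton using the ops framework.
# Real Charmed Kubeflow operators manage their workloads and relations here.
from ops.charm import CharmBase
from ops.main import main
from ops.model import ActiveStatus


class ExampleKubeflowAppCharm(CharmBase):
    """Illustrative operator for a single Kubeflow component."""

    def __init__(self, *args):
        super().__init__(*args)
        # React to the standard lifecycle event Juju fires on (re)configuration.
        self.framework.observe(self.on.config_changed, self._on_config_changed)

    def _on_config_changed(self, event):
        # A real charm would render configuration and update its workload here.
        self.unit.status = ActiveStatus()


if __name__ == "__main__":
    main(ExampleKubeflowAppCharm)
```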
Use Kubeflow on-prem, desktop, edge, public cloud and multi-cloud.
What is Kubeflow?
Kubeflow makes deployments of Machine Learning workflows on Kubernetes simple, portable and scalable.
Kubeflow is the machine learning toolkit for Kubernetes. It extends Kubernetes' ability to run independent and configurable steps with machine learning-specific frameworks and libraries.
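As a sketch of what those independent, configurable steps look like in practice, here is a minimal two-step pipeline written with the kfp SDK (assuming kfp v2 is installed); the component and pipeline names are illustrative.

```python
# Minimal, illustrative Kubeflow Pipelines definition using the kfp v2 SDK.
from kfp import compiler, dsl


@dsl.component
def preprocess(message: str) -> str:
    # Each component runs as its own containerised, independently configurable step.
    return message.upper()


@dsl.component
def train(data: str) -> str:
    return f"trained on: {data}"


@dsl.pipeline(name="example-pipeline")
def example_pipeline(message: str = "hello kubeflow"):
    prep = preprocess(message=message)
    train(data=prep.output)


if __name__ == "__main__":
    # Compile to a YAML spec that can be uploaded through the Kubeflow Pipelines UI.
    compiler.Compiler().compile(example_pipeline, "example_pipeline.yaml")
```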
Install Kubeflow
The Kubeflow project is dedicated to making deployments of machine learning workflows on Kubernetes simple, portable and scalable.
You can install Kubeflow on your workstation, local server or public cloud VM. It is easy to install with MicroK8s on any of these environments and can be scaled to high availability.