
Meet Canonical at MLOps World 2023

Andreea Munteanu

on 28 September 2023



The AI Roadshow lands in the USA

Date: 25-26 October 2023

Location: Renaissance Austin Hotel, Austin, Texas

The Canonical AI Roadshow will soon cross the Atlantic and stop in Austin, Texas. We will be at MLOps World, as well as the Generative AI Summit, a co-located event. Machine learning operations (MLOps), large language models (LLMs) and open source are just some of the topics shaping technology’s future. The latest advances will be covered from a business and technical perspective during the two events.

At the conference, pioneers in the AI world, organisations that run machine learning at scale, and creators will discover how to use open source tools to further their initiatives, from getting started with lightweight tools such as Charmed MLflow to building portable and reproducible workloads with open source MLOps solutions such as Kubeflow, Spark and OpenSearch.
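To give a sense of how lightweight that starting point can be, here is a minimal, hypothetical sketch of experiment tracking with MLflow. The tracking URI, experiment name, parameters and metric are assumptions standing in for wherever your Charmed MLflow (or any MLflow) instance runs and whatever you actually train.

```python
# Minimal MLflow tracking sketch (hypothetical endpoint and values).
import mlflow

# Assumption: an MLflow tracking server is reachable at this address,
# for example one deployed with Charmed MLflow.
mlflow.set_tracking_uri("http://localhost:5000")
mlflow.set_experiment("getting-started")

with mlflow.start_run():
    # Log a couple of illustrative parameters and a metric for the run.
    mlflow.log_param("model_type", "logistic_regression")
    mlflow.log_param("max_iter", 100)
    mlflow.log_metric("accuracy", 0.92)
```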

Canonical AI Roadshow at MLOps World

MLOps World is an international community of practitioners who are mastering the science of deploying ML models into live production environments. It is led by the Toronto Machine Learning Society (TMLS), which seeks to unite and support the wider AI ecosystem.

Our team of experts will travel to the event to deliver a workshop about large language models (LLMs), give a talk about MLOps with highly sensitive data and answer your questions about AI, MLOps and open source. We will showcase a wide variety of demos for different industries.

Bonus: everyone who visits our booth can interact with a conversational assistant, similar to ChatGPT and built entirely with open source components, to learn more about Ubuntu and Canonical.

Topics we will cover at MLOps World and the Generative AI Summit include:

  • How Canonical built an end-to-end solution to run AI at scale
  • How Canonical collaborates with partners to help enterprises kickstart and take their AI projects to production
  • Canonical MLOps and future plans
  • Common challenges to AI adoption at scale
  • AI use cases within different industries

MLOps on highly sensitive data

MLOps is used in many organisations that operate on very sensitive datasets. Pharmaceutical and life science companies handle human DNA samples, healthcare institutions train models on patient data, and telecom and financial services companies in highly regulated environments face similar challenges. End users, machine learning engineers and data scientists all need to be vigilant about software vulnerabilities, data leaks and missing data protection measures.

Maciej Mazur and I will give a talk about MLOps on highly sensitive data. The presentation will cover how you can improve compliance and security with features such as Kubernetes strict confinement, blockchain-based tokenization and privacy-enhancing technologies like confidential computing. We will use a real-life case study to demonstrate how these technology building blocks fit together. After the talk, you will understand how to apply them yourself in cloud environments using Kubeflow for MLOps.
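To make the Kubeflow side of this concrete, here is a minimal, hypothetical sketch using the Kubeflow Pipelines v2 SDK: a single-step pipeline whose component would process a sensitive dataset entirely inside the cluster. The component body, base image and data path are placeholders for illustration, not material from the talk itself.

```python
# Minimal Kubeflow Pipelines v2 sketch (illustrative only).
from kfp import dsl, compiler


@dsl.component(base_image="python:3.11")
def train_on_sensitive_data(data_path: str) -> str:
    # Placeholder step: in a real pipeline this would read the dataset from
    # in-cluster storage and train a model without the data leaving the cluster.
    print(f"Training on data at {data_path}")
    return "model-v1"


@dsl.pipeline(name="sensitive-data-pipeline")
def sensitive_pipeline(data_path: str = "/mnt/secure/dataset"):
    train_on_sensitive_data(data_path=data_path)


if __name__ == "__main__":
    # Compile to a YAML spec that can be submitted to a Kubeflow deployment.
    compiler.Compiler().compile(sensitive_pipeline, "sensitive_pipeline.yaml")
```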

Build your own ChatGPT with open source tooling

Large language models are at the forefront of enterprise attention. From sentiment analysis to privately developed chatbots, there are plenty of applications across industries that benefit from LLMs. Yet professionals are often confused by the scattered landscape and the apparently high cost of experimenting with the latest technologies.

Open source is at the heart of a big shift in the AI industry. There is a clear trend towards making tools easy for everyone to use, so getting started with your first LLM-based project is simpler than ever. Starting with the architecture needed to build a chatbot on your company's data, the workshop will walk you through all the steps required to end up with a fully functional project.

At the end of the workshop, you will be able to try out your own chatbot and have a better understanding of the considerations involved in building your own project with LLMs. It will help you use open source tools such as Kubeflow and OpenSearch, and give you a better overview of how to choose the right LLM for your use case.
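As a taste of the pattern, here is a minimal, hypothetical retrieval step using the opensearch-py client: fetch a few relevant company documents from an OpenSearch index and fold them into a prompt for whichever LLM you choose. The host, index name and field names are assumptions for illustration only.

```python
# Hypothetical retrieval-augmented prompt building with OpenSearch.
from opensearchpy import OpenSearch

# Assumption: an OpenSearch cluster is reachable here with a "company-docs"
# index whose documents carry a "text" field.
client = OpenSearch(hosts=[{"host": "localhost", "port": 9200}])


def build_prompt(question: str, k: int = 3) -> str:
    # Simple keyword match; a production setup might use vector (k-NN) search instead.
    response = client.search(
        index="company-docs",
        body={"query": {"match": {"text": question}}, "size": k},
    )
    context = "\n".join(hit["_source"]["text"] for hit in response["hits"]["hits"])
    return (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )


print(build_prompt("What is our refund policy?"))
```

The retrieved context would then be passed to the LLM of your choice; which model fits best depends on your use case, which is exactly what the workshop covers.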

Generative AI: the open source way

Generative AI is probably the topic of the year. Leaders across the world feel the pressure of missed opportunities, professionals are concerned about the impact the technology will have on their roles, and data scientists and machine learning engineers need to upskill quickly. Will generative AI deliver on its promises?

This lightning talk will explore the opportunities that open source offers to accelerate innovation with generative AI. If you can’t join us at MLOps World, be sure to keep an eye on the rest of our AI Roadshow.

What is the Canonical AI Roadshow?

The Canonical AI Roadshow is a series of events highlighting generative AI use cases powered by open source software. Between mid-September and mid-November, the roadshow will take us around the globe to talk about the latest innovations in the industry, demo some of the most exciting use cases from sectors including financial services, telco, and oil and gas, and answer questions about AI, MLOps, big data and more.

We will stop in:


