I was asked this question: What is the connection between AI, Cloud-Native and Edge devices?
On first impression, it sounds like an amalgamation of every conceivable buzzword - but I think there is a coherent answer that points to a business need.
Let us start with the term ‘Cloud Native.’
Cloud-native computing is an approach to software development that uses cloud computing technologies such as:
- Containers
- Microservices
- Continuous Delivery
- DevOps
Using Cloud Native technologies, we can create loosely coupled systems that are scalable and resilient.
In practice, this means
a) The system is built as a set of microservices that run in Docker containers
b) The containers may be orchestrated via Kubernetes
c) Deployment of the containers is managed through a CI/CD process
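As a concrete illustration of step (a), a minimal Dockerfile for one such microservice might look like the sketch below. The base image, file names, and port are assumptions made for the example, not a prescription:

```dockerfile
# Hypothetical microservice image - names and versions are illustrative
FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY app.py .
EXPOSE 8080
CMD ["python", "app.py"]
```

For step (b), this image would be referenced from a Kubernetes Deployment manifest and rolled out with `kubectl apply`; for step (c), the CI/CD pipeline rebuilds and pushes the image on every change.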
In itself, this approach is valuable and follows a stack that is rapidly emerging at the Enterprise level.
But how does it tie to Edge devices?
- Docker allows you to create a single packaged deployment - a container image - that provides a consistent, isolated runtime environment on the target device. AI models are typically trained in the cloud and deployed on edge devices, and the container/cloud-native format lets you run the same AI workload across various environments, including at the Edge. This container-based architecture is especially relevant for AI on edge devices because of the sheer diversity of edge hardware and operating systems.
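To make the train-in-the-cloud, run-at-the-edge pattern concrete, here is a minimal, self-contained sketch: a model artifact produced by cloud training is loaded inside the edge container and used for on-device inference. The JSON artifact format and the toy linear model are invented for illustration; a real deployment would ship a serialized TensorFlow, PyTorch, or ONNX model instead.

```python
import json

# Hypothetical model artifact, as it might be baked into (or mounted
# into) the edge container after cloud training. A real artifact would
# be a serialized TensorFlow/PyTorch/ONNX model, not JSON.
MODEL_ARTIFACT = json.dumps({"weights": [0.5, -0.2], "bias": 1.0})

def load_model(artifact: str) -> dict:
    """Deserialize the model shipped inside the container image."""
    return json.loads(artifact)

def predict(model: dict, features: list[float]) -> float:
    """Run one inference step on-device (a toy linear model)."""
    score = model["bias"]
    for w, x in zip(model["weights"], features):
        score += w * x
    return score

model = load_model(MODEL_ARTIFACT)
print(predict(model, [2.0, 3.0]))  # 1.0 + 0.5*2.0 - 0.2*3.0
```

Because the container carries the runtime, the same image behaves identically on a cloud VM, a gateway box, or an edge device, which is exactly the point being made above.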
- Secondly, AI models need to be refreshed and redeployed frequently - including on edge devices. Here, too, the cloud-native container architecture helps: a model update becomes just another container image pushed through the CI/CD pipeline.
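One way to sketch that refresh cycle: the edge device periodically compares the model version it is running against the latest version published by the pipeline, and pulls a new container image only when a newer one exists. The tag format and the `should_update` helper below are hypothetical; in practice this role is played by the container orchestrator or an over-the-air update agent.

```python
# Hypothetical version check an edge agent might perform before
# pulling a refreshed model container. Tag names are illustrative.

def parse_version(tag: str) -> tuple[int, ...]:
    """Turn a tag like 'model-1.4.2' into a comparable tuple (1, 4, 2)."""
    return tuple(int(part) for part in tag.rsplit("-", 1)[-1].split("."))

def should_update(running: str, published: str) -> bool:
    """Pull a new image only if the published model is newer."""
    return parse_version(published) > parse_version(running)

print(should_update("model-1.4.2", "model-1.5.0"))  # True: newer model available
print(should_update("model-1.5.0", "model-1.5.0"))  # False: already current
```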
I welcome thoughts and comments.
Image source: Cloud Native Definition
from Featured Blog Posts - Data Science Central https://ift.tt/3isVCqh
via Gabe's Musings