IoT Edge Computing with MicroK8s: A hands-on approach to building, deploying, and distributing production-ready Kubernetes on IoT and Edge platforms
- Length: 416 pages
- Edition: 1
- Language: English
- Publisher: Packt Publishing
- Publication Date: 2022-09-30
- ISBN-10: 1803230630
- ISBN-13: 9781803230634
A step-by-step, comprehensive guide that includes real-world use cases to help you successfully develop and run applications and mission-critical workloads using MicroK8s
Key Features
- An easy-to-follow guide that helps you get started with MicroK8s and other Kubernetes components
- Understand the key concepts and constraints for building IoT and edge architectures
- Get guidance on how to develop and deploy use cases and examples on IoT and edge computing platforms
Book Description
Are you facing challenges developing, deploying, monitoring, clustering, storing, securing, and managing Kubernetes in production because you're not familiar with infrastructure technologies? MicroK8s, a zero-ops, lightweight, CNCF-compliant Kubernetes distribution with a small footprint, is the apt solution for you.
This book gets you up and running with production-grade, highly available (HA) Kubernetes clusters on MicroK8s using best practices and examples based on IoT and edge computing.
Beginning with an introduction to Kubernetes, MicroK8s, and IoT and edge computing architectures, this book shows you how to install, deploy sample apps, and enable add-ons (like DNS and dashboard) on the MicroK8s platform. You’ll work with multi-node Kubernetes clusters on Raspberry Pi and networking plugins (such as Calico and Cilium) and implement service mesh, load balancing with MetalLB and Ingress, and AI/ML workloads on MicroK8s. You’ll also understand how to secure containers, monitor infrastructure and apps with Prometheus, Grafana, and the ELK stack, manage storage replication with OpenEBS, resist component failure using a HA cluster, and more, as well as take a sneak peek into future trends.
By the end of this book, you'll be able to use MicroK8s to build and implement scenarios for IoT and edge computing workloads in a production environment.
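The quick-start flow the book opens with (installing MicroK8s, enabling add-ons such as DNS and the dashboard, and deploying a sample app) looks roughly like the following on an Ubuntu machine. This is a sketch based on the standard MicroK8s snap commands; the `hello` deployment name and `nginx` image are illustrative, not the book's exact example.

```shell
# Install MicroK8s from the snap store
sudo snap install microk8s --classic

# Wait until the node's services are up
microk8s status --wait-ready

# Enable common add-ons covered early in the book
microk8s enable dns dashboard

# Deploy a sample application and expose it inside the cluster
microk8s kubectl create deployment hello --image=nginx
microk8s kubectl expose deployment hello --port=80

# Verify the pod and service are running
microk8s kubectl get pods,svc
```

Note that `microk8s kubectl` is the bundled client; if you already have a standalone `kubectl`, you can point it at the cluster with `microk8s config` instead.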
What you will learn
- Get a holistic view of MicroK8s features using a sample application
- Understand IoT and edge computing and their architecture constraints
- Create, scale, and update HA Raspberry Pi multi-node clusters
- Implement AI/ML use cases with the Kubeflow platform
- Work with various networking plugins, and monitoring and logging tools
- Perform service mesh integrations using Istio and Linkerd
- Run serverless applications using Knative and OpenFaaS frameworks
- Secure your containers using Kata and strict confinement options
Who this book is for
This book is for DevOps and cloud engineers, SREs, and application developers who want to implement efficient techniques for deploying their software solutions. It will also be useful for technical architects and technology leaders who are looking to adopt cloud-native technologies. A basic understanding of container-based application design and development, virtual machines, networking, databases, and programming will be helpful for using this book.
Table of Contents
- Front matter: Cover; Title Page; Copyright and Credits; Contributors; About the Reviewers; Table of Contents; Preface
- Part 1: Foundations of Kubernetes and MicroK8s
  - Chapter 1: Getting Started with Kubernetes – The evolution of containers; Kubernetes overview – understanding Kubernetes components; Interacting with a Kubernetes cluster; Understanding pods; Understanding deployments; Understanding StatefulSets and DaemonSets; StatefulSets; DaemonSets; Understanding jobs and CronJobs; Jobs; CronJob; Understanding services; Summary
  - Chapter 2: Introducing MicroK8s – Introducing MicroK8s Kubernetes; Quick installation; Technical requirements; Step 1 – Installation; Step 2 – Verify the installation; Deploying a sample application; Enabling add-ons; Full list of add-ons; Starting/stopping MicroK8s; Configuring MicroK8s to use local images; Configuring MicroK8s to use its built-in registry; Configuring MicroK8s to use private/public registries; Configuring MicroK8s services; Troubleshooting application and cluster issues; The application level; The cluster level; Summary
- Part 2: Kubernetes as the Preferred Platform for IoT and Edge Computing
  - Chapter 3: Essentials of IoT and Edge Computing – What is IoT?; Key elements of an IoT solution; What is edge computing?; How are IoT and the edge related?; Benefits of edge computing; What does it take to enable edge computing, edge analytics, and edge intelligence?; Summary
  - Chapter 4: Handling the Kubernetes Platform for IoT and Edge Computing – Deployment approaches for edge computing; Deployment of the entire Kubernetes cluster at the edge; Deployment of Kubernetes nodes at the edge; Deployment of virtual Kubernetes nodes at the edge; Deployment of Kubernetes devices at the edge; Propositions that Kubernetes offers; Summary
- Part 3: Running Applications on MicroK8s
  - Chapter 5: Creating and Implementing Updates on a Multi-Node Raspberry Pi Kubernetes Cluster – Creating a MicroK8s multi-node cluster using a Raspberry Pi; What we are trying to achieve; Configuring Wi-Fi access settings; Installing and configuring MicroK8s; Adding the worker node; Deploying a sample containerized application; Performing rolling updates to the application with a new software version; Scaling the application deployment; Guidelines on multi-node cluster configuration; Cluster-level configuration/settings; Container life cycle management; Deploying and sharing HA applications; Summary
  - Chapter 6: Configuring Connectivity for Containers – CNI overview; Communication flow from Pod to Pod; Configuring Calico; Requirements; Step 1 – Creating a MicroK8s Raspberry Pi cluster; Step 2 – Enabling the Calico CNI add-on; Step 3 – Deploying a sample containerized application; Step 4 – Applying isolation by using NetworkPolicy; Step 5 – Enabling access; Configuring Cilium; Step 1 – Enabling the Cilium add-on; Step 2 – Enabling the DNS add-on; Step 3 – Deploying a sample containerized application; Step 4 – Applying isolation by using NetworkPolicy; Step 5 – Enabling access; Configuring Flannel CNI; Disabling the HA cluster to enable the Flannel add-on; Guidelines on choosing a CNI provider; Key considerations when choosing a CNI provider; Summary
  - Chapter 7: Setting Up MetalLB and Ingress for Load Balancing – Overview of MetalLB and Ingress; Configuring MetalLB to load balance across the cluster; Requirements; Step 1 – Creating a MicroK8s Raspberry Pi cluster; Step 2 – Enabling the MetalLB add-on; Step 3 – Deploying a sample containerized application; Step 4 – Verifying the load balancer mechanism; Configuring Ingress to expose Services outside the cluster; Option 1 – Using the Ingress NodePort method; Option 2 – Using Ingress and a load balancer; Guidelines on how to choose the right load balancer for your applications; Summary
  - Chapter 8: Monitoring the Health of Infrastructure and Applications – Overview of monitoring, logging, and alerting options; Configuring a monitoring and alerting stack using the Prometheus, Grafana, and Alertmanager tools; Requirements for setting up a MicroK8s Raspberry Pi cluster; Step 1 – Creating a MicroK8s Raspberry Pi cluster; Step 2 – Configuring Prometheus, Grafana, and Alertmanager; Step 3 – Accessing Prometheus, Grafana, and Alertmanager; Configuring a logging, monitoring, and alerting stack using the EFK toolset; Step 1 – Enabling the Fluentd add-on; Step 2 – Defining an index pattern; Step 3 – Filtering and viewing the data; Key metrics that need to be monitored; Kubernetes events; Summary
  - Chapter 9: Using Kubeflow to Run AI/MLOps Workloads – Overview of the ML workflow; Introduction – Kubeflow and its components; Introduction to the ML workflow; Kubeflow components in each phase; Kubeflow Pipelines; Deploying Kubeflow; What we are trying to achieve; Step 1 – Installing and configuring MicroK8s; Step 2 – Installing Juju Operator Lifecycle Manager; Step 3 – Post-installation configurations; Accessing the Kubeflow dashboard; Creating a Kubeflow pipeline to build, train, and deploy a sample ML model; Step 1 – Launching a new notebook server from the Kubeflow dashboard; Step 2 – Creating a Kubeflow pipeline; Step 3 – Compiling and running; Recommendations – running AI/ML workloads on Kubernetes; Best practices for running AI/ML workloads; Summary
  - Chapter 10: Going Serverless with Knative and OpenFaaS Frameworks – Overview of the Knative framework; Build components; Serving components; Eventing components; Enabling the Knative add-on; Deploying and running a sample service on Knative; Overview of the OpenFaaS framework; Enabling the OpenFaaS add-on; Deploying and running a sample function on OpenFaaS; Best practices for developing and deploying serverless applications; Serverless function = specific function; Using microservices; Using appropriate stacks for various resources; Applying the principle of least privilege; Performing load testing; Using a CI/CD pipeline; Constant monitoring is required; Auditing in addition to monitoring; Auditing software dependencies; Summary
- Part 4: Deploying and Managing Applications on MicroK8s
  - Chapter 11: Managing Storage Replication with OpenEBS – Overview of OpenEBS; Control plane; Data plane; Storage engines; Configuring and implementing a PostgreSQL stateful workload; Requirements; Step 1 – Creating the MicroK8s Raspberry Pi cluster; Step 2 – Enabling the OpenEBS add-on; Step 3 – Deploying the PostgreSQL stateful workload; Step 4 – Creating the test data; Step 5 – Simulating node failure; Kubernetes storage best practices; Guidelines on choosing OpenEBS data engines; Summary
  - Chapter 12: Implementing Service Mesh for Cross-Cutting Concerns – Overview of the Linkerd service mesh; Enabling the Linkerd add-on and running a sample application; Step 1 – Enabling the Linkerd add-on; Step 2 – Deploying the sample application; Step 3 – Exploring the Linkerd dashboard; Overview of the Istio service mesh; Enabling the Istio add-on and running a sample application; Step 1 – Enabling the Istio add-on; Step 2 – Deploying the sample application; Step 3 – Exploring the Istio service dashboard; Common use cases for a service mesh; Guidelines on choosing a service mesh; Best practices for configuring a service mesh; Summary
  - Chapter 13: Resisting Component Failure Using HA Clusters – An overview of HA topologies; Setting up an HA Kubernetes cluster; Requirements; Step 1 – Creating the MicroK8s Raspberry Pi cluster; Step 2 – Examining the HA setup; Step 3 – Deploying a sample containerized application; Step 4 – Simulating control plane node failure; Kubernetes HA best practices; Summary
  - Chapter 14: Hardware Virtualization for Securing Containers – Overview of Kata Containers; How Kata Containers works; Enabling the Kata add-on and running a sample application; Step 1 – Enabling the Kata add-on; Step 2 – Deploying a sample application; Container security best practices; Utilizing DevSecOps; Scanning external vulnerabilities via dependency scanning; Analyzing container images using image scanning tools; Enforcing image content trust; Securing registries; Securing your host; Securing your runtime; Reviewing container privileges; Using real-time event and log auditing; Monitoring resource usage; Common security misconfigurations and remediation; Summary
  - Chapter 15: Implementing Strict Confinement for Isolated Containers – Overview of Snap, Snapcraft, and Ubuntu Core; Setting up Ubuntu Core on a Raspberry Pi board; What we are trying to achieve; Requirements; Step 1 – Setting up an Ubuntu Core image on an SD card; Step 2 – Creating an Ubuntu SSO account; Step 3 – Generating an SSH key pair; Step 4 – Booting Ubuntu Core on Raspberry Pi; Setting up MicroK8s on Ubuntu Core; Adding the worker node; Deploying a sample containerized application; Summary
  - Chapter 16: Diving into the Future – How MicroK8s is uniquely positioned for accelerating IoT and Edge deployments; Some of the notable challenges in operating IoT edge; How MicroK8s Kubernetes is benefiting edge devices; Looking forward – Kubernetes trends and industry outlook; Trend 1 – Security is still everyone's concern; Trend 2 – GitOps for continuous deployment; Trend 3 – App store for operators; Trend 4 – Serverless computing and containers; Trend 5 – AI/ML and data platforms; Trend 6 – Stateful applications; Summary; Further reading
- Back matter: Frequently Asked Questions About MicroK8s; Index; Other Books You May Enjoy
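The multi-node clustering flow that recurs throughout Parts 3 and 4 (adding a worker node to a MicroK8s cluster) follows MicroK8s's standard `add-node`/`join` workflow. A rough sketch, where the IP address and `<token>` are placeholders (the real join command, token included, is printed by `microk8s add-node`):

```shell
# On the control plane node: generate a one-time join token.
# This prints a ready-made "microk8s join ..." command to copy.
microk8s add-node

# On the worker (e.g. Raspberry Pi) node: run the printed command, e.g.
# microk8s join 192.168.1.10:25000/<token> --worker
# (--worker joins as a worker-only node; omit it for a voting node)

# Back on the control plane: confirm the new node reports Ready
microk8s kubectl get nodes
```

With three or more voting nodes joined this way, MicroK8s forms the HA cluster that Chapter 13's failure-simulation steps exercise.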
How to download the source code
1. Go to https://github.com/PacktPublishing.
2. In the Find a repository… box, search for the book title: IoT Edge Computing with MicroK8s: A hands-on approach to building, deploying, and distributing production-ready Kubernetes on IoT and Edge platforms. If the full title returns no results, search for the main title only.
3. Click the book title in the search results.
4. Click Code and choose Download ZIP.