Kubermatic today announced the release of Kubermatic Kubernetes Platform (KKP) 2.15. Significant work went into simplifying the installation process with the new KKP installer and into introducing external cluster support.
Read on for more details about these and other major improvements we made in this release.
Improved Installation Experience
The 2.15 release marks the first technical preview of the new KKP installer. Throughout the past releases, we learned a lot about how organizations are setting up and managing KKP, and now we can make this process much easier.
While the KKP Operator, introduced in 2.14, manages KKP itself, the new installer is responsible for installing/upgrading the operator and auxiliary components, like nginx-ingress-controller, cert-manager, and others.
In this preview, the installer can be used to set up a KKP master cluster. It helps with validation, follows the proper installation procedure, and guides the user in managing DNS records. Most importantly, the installer takes care of upgrades, reducing manual migration steps to a minimum.
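To give a rough idea of what the installer consumes, a minimal configuration sketch might look like the following. The API group/version and field names here are assumptions based on the operator introduced in 2.14; consult the KKP documentation for the authoritative schema.

```yaml
# Hypothetical minimal KubermaticConfiguration consumed by the KKP installer.
# API version and field names are assumptions, not verified schema.
apiVersion: operator.kubermatic.io/v1alpha1
kind: KubermaticConfiguration
metadata:
  name: kubermatic
  namespace: kubermatic
spec:
  ingress:
    # The installer guides the user in creating DNS records for this domain.
    domain: kkp.example.com
```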
Simplified User Cluster Management With New etcd-launcher
With this release, we introduce a technical preview of our new etcd-launcher, which simplifies user cluster management.
In the past, KKP ran a static 3-node etcd ring for each user Kubernetes cluster. The new etcd-launcher provides a wrapper that runs the etcd ring and allows for more advanced operational and Day 2 capabilities.
Once the etcd-launcher is enabled, users can scale the etcd ring to up to nine nodes. Additionally, the etcd-launcher enables the etcd ring to recover automatically from volume or node failures without external intervention. This makes user clusters more resilient and allows operators to offer different availability and performance configurations for their user clusters.
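As an illustration, opting a cluster into the launcher and resizing its etcd ring might be expressed on the Cluster object roughly as follows. The feature-gate name and field paths are assumptions; check the KKP Cluster API reference for the exact schema.

```yaml
# Hypothetical excerpt of a KKP Cluster object.
# Feature-gate name and field paths are assumptions, not verified schema.
spec:
  features:
    etcdLauncher: true   # opt this user cluster into the new etcd-launcher
  componentsOverride:
    etcd:
      clusterSize: 5     # grow the ring from the default 3, up to 9 nodes
```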
More Flexibility With External Cluster Support
Kubermatic Kubernetes Platform automates the deployment and Day 2 operations of Kubernetes clusters across any infrastructure from one single management UI. With the introduction of external cluster support, platform operators benefit from an even higher degree of flexibility and freedom of choice in their setup. Users can now connect any existing Kubernetes cluster to the system and display its details.
After the user provides a kubeconfig and a cluster name, the cluster is added to the KKP project. KKP retrieves the connected cluster's nodes, events, and metrics and displays them in the KKP UI. External clusters appear alongside Kubermatic clusters inside each project, empowering operators to keep track of and control all their clusters from one single pane of glass.
Enhanced Control With Project Restrictions
To give administrators more flexibility and control in project management, we have added the possibility to restrict or limit project creation for regular users. This feature is now available from the admin settings.
Even Less Manual Work With Dynamic Data Center Support
The 2.15 release introduces the dynamic data center feature, which allows users to manage the data center options available for creating user clusters. As before, data centers are part of the KKP seed, but instead of manually editing the seed objects to add or change data centers, this can now be done through the API and UI. (This feature is currently only available in the KKP Enterprise Edition.)
Run Kubernetes As You Like
Kubermatic Kubernetes Platform 2.15 supports Kubernetes 1.19. To meet the highest security and reliability standards, we always run Kubermatic Kubernetes Platform master components with a Kubernetes version that addresses recently discovered vulnerabilities.
Moreover, with the 2.15 release, Kubermatic no longer supports Kubernetes 1.15 and 1.16. All existing clusters will automatically be upgraded to Kubernetes 1.17.