In our previous posts, we’ve explored the Red Hat container ecosystem, the Red Hat Container Development Kit (CDK), OpenShift as a local deployment and OpenShift in production. In this final post of the series, we’re going to take a look at how a team can take advantage of the advanced features of OpenShift in order to automatically move new versions of applications from development to production — a process known as Continuous Delivery (or Continuous Deployment, depending on the level of automation).
Continue reading “Continuous Delivery / Deployment with OpenShift Enterprise”
In a previous blog post we took a look at the Red Hat Container Development Kit (CDK) and how it can be used to build and deploy applications within a development environment that closely mimics a production OpenShift cluster. In this post, we’ll take an in-depth look at what a production OpenShift cluster looks like — the individual components, their functions, and how they relate to each other. We’ll also check out how OpenShift supports scaling up and scaling out applications in a production environment.
Continue reading “OpenShift Enterprise in Production”
IT decision makers seem to be abuzz with discussions of “next generation” technologies. In the past three months it has been nearly impossible to hold a conversation where the terms cloud, OpenStack, or (Linux) containers don’t surface. Hot topics and buzzwords aside, it has become clear (to me) that the right mix of market conditions is causing organizations to express a renewed interest in enterprise virtualization.
Many organizations are now ready to adopt the next generation of server hardware. The popular Sandy Bridge and Ivy Bridge processor generations from Intel are four to five years old, and those who purchased such hardware tend to refresh their equipment every four to five years. In addition, Intel’s Haswell technology is approaching its third anniversary. Organizations that lease hardware on a three-year cycle will also be looking at what the next generation of hardware has to offer.
What does a potential wave of hardware refresh have to do with a renewed interest in enterprise virtualization? To no one’s surprise
Continue reading “Why Now is the Perfect Time to Adopt Red Hat Enterprise Virtualization”
There is a lot of confusion around which pieces of your application you should break into multiple containers and why. I recently responded to this thread on the Docker user mailing list, which led me to write today’s post. In this post I plan to examine an imaginary Java application that historically ran on a single Tomcat server and to explain why I would break it apart into separate containers. In an attempt to make things interesting, I will also aim to
Continue reading “Container Tidbits: When Should I Break My Application into Multiple Containers?”
The rapid rise of Linux containers as an enterprise-ready technology in 2015, thanks in no small part to the technology provided by the Docker project, should come as no surprise: Linux containers offer a broad array of benefits to the enterprise, from greater application portability and scalability to the ability to fully leverage the benefits of composite applications.
But these benefits aside, Linux containers can, if IT security procedures are not followed, also cause serious harm to mission-critical operations. As Red Hat’s Lars Herrmann has pointed out, containers aren’t exactly transparent when it comes to seeing and understanding all of their internal code. This means that tools and technologies to actually see inside a container are critical to enterprises that want to deploy Linux containers in mission-critical scenarios.
Continue reading “Schrodinger’s Container: How Red Hat is Building a Better Linux Container Scanner”
When it comes to adopting containers, security is the highest adoption barrier according to 53 percent of IT operations and development professionals working with containers today. While there is no shortage of container security news, there is still some debate about the best way to properly secure containers.
Continue reading “13-JAN Webcast: Container Security and Authentication in Red Hat Atomic Enterprise Platform”
Red Hat recently announced the availability of Red Hat Atomic Enterprise Platform as a public preview. What key capabilities does Red Hat Atomic Enterprise Platform provide and what materials are available for you to get started? Attend the webcast on Wednesday, December 16, 2015 from 11:00 AM to 12:00 PM (ET), to find out.
Continue reading “Join Us on 12/16 for a Webcast on Getting Started with Red Hat Atomic Enterprise Platform”
This morning, Red Hat announced the general availability of OpenShift Enterprise 3.1 as well as a public preview of Red Hat Atomic Enterprise Platform. Red Hat’s updated container offerings are:
- OpenShift Enterprise 3.1, the latest version of Red Hat’s application platform designed to build, deploy and run stateful and stateless applications on private and public cloud infrastructure.
- Red Hat Atomic Enterprise Platform Public Preview, an optimized container infrastructure platform for deploying, running and managing containers across the enterprise.
Both enable enterprises to develop, integrate, deploy, and manage a variety of applications consistently across a more secure, container-optimized infrastructure. If you’re looking to adopt container-based architectures, OpenShift and Atomic allow you to use Docker-formatted Linux containers to create microservices-based applications and modernize traditional workloads – all with the security of a consistent foundation based on Red Hat Enterprise Linux.
Continue reading “Announcing OpenShift Enterprise 3.1 and Red Hat Atomic Enterprise Platform Public Preview”
Given the recent massive spike in interest in Linux containers, you could be forgiven for wondering, “Why now?” It has been argued that the increasingly prevalent cloud computing model more closely resembles hosting providers than traditional enterprise IT, and that containers are a perfect match for this model.
Despite the sudden ubiquity of container technology, like so much in the world of open source software, containerization depends on a long series of previous innovations, especially in the operating system. “One cannot resist an idea whose time has come.” Containers are such an idea, one that has been a long time coming.
Continue reading “The History of Containers”
If you’re looking at running Linux containers, you should be heading to ContainerCon in Seattle next week. Co-located with LinuxCon and CloudOpen, ContainerCon is where leading contributors in Linux containers, the Linux kernel, and related projects will get together to educate the community on containers and related innovations.
Red Hatters are contributing to over 40 sessions on this year’s agenda, including a keynote from Red Hat VP of Engineering Matt Hicks. In “Revolutionizing Application Delivery with Linux and Containers,” Matt will focus on how Linux containers are changing the way that companies develop, consume, and manage applications, and will emphasize how open source communities and projects like Docker and Kubernetes are delivering this next wave of enterprise application architecture.
If you’re attending ContainerCon, check out Matt’s keynote and some of the sessions below:
Continue reading “See You at ContainerCon in Seattle”