IT decision makers seem to be abuzz with discussions of “next generation” technologies. In the past three months it has been nearly impossible to hold a conversation where the terms cloud, OpenStack, or (Linux) containers don’t surface. Hot topics and buzzwords aside, it has become clear (to me) that the right mix of market conditions is causing organizations to express a renewed interest in enterprise virtualization.
Many organizations are now ready to adopt the next generation of server hardware. The popular Ivy Bridge and Sandy Bridge chipsets from Intel are four to five years old, and those who purchased such hardware tend to refresh their equipment every four to five years. In addition, Intel’s Haswell technology is approaching its third anniversary. Organizations that lease hardware on a three-year cycle will also be looking at what the next generation of hardware has to offer.
What does a potential wave of hardware refresh have to do with a renewed interest in enterprise virtualization? To no one’s surprise
Continue reading “Why Now is the Perfect Time to Adopt Red Hat Enterprise Virtualization”
Red Hat has long advocated for the importance of cross-industry IT standards, with the intention of enabling ecosystems with broad industry participation and providing a common basis for innovation. Perhaps even more importantly, these standards can help drive adoption of new technologies within enterprises, pushing the cycle of innovation even further along.
With ARM being one of these emerging ecosystems, we wanted to provide a snapshot of a recent event that highlights some of the standards-based work happening in this growing community: last week’s Linaro Connect conference in Bangkok, Thailand.
Continue reading “Connecting the Dots at Linaro Connect”
In last year’s blog series, I covered both direct and indirect Active Directory integration options. But I never explained what we actually recommend. Some customers looking at indirect integration saw only the overhead of providing an interim server and the costs related to managing it. To be clear, these costs are real and the overhead does exist. But we still recommend
Continue reading “Why is Indirect Integration Better?”
Yogi Berra, the late baseball great and oft-quoted source of humorous statements about the condition of the world, once said, “It’s tough to make predictions, especially about the future.” Some of his most celebrated remarks were eerily prescient on the subject of using technology to predict the future. As many IT managers today ponder the best way forward with predictive analytics, it might be interesting to think about it from his perspective. Consider predictive analytics in the context of the following classic Yogi-isms
Continue reading “Yogi Berra, Predictive Analytics, and SAP HANA Running on Red Hat Enterprise Linux for SAP HANA”
Over the last several months, in meetings with many Red Hat customers, I have been asked about best practices for migrating from an existing third-party identity management solution to Red Hat’s Identity Management (IdM) solution. In today’s post I will share some of my thoughts on this matter…
Continue reading “When to Migrate: Red Hat Identity Management vs. Third-Party Solutions”
Setting up a local development environment that corresponds as closely as possible to production can be a time-consuming and error-prone task. For OpenShift deployments, however, we have the Red Hat Container Development Kit (CDK), which does a good job of solving this and also provides a great environment for experimenting with containers and the Red Hat container ecosystem in general.
In this blog post we will cover deploying applications using the OpenShift Enterprise PaaS that comes with the CDK. The whole process will be driven via the OpenShift CLI, in contrast to our last post, which focused on OpenShift’s web interface. If you haven’t yet installed the CDK, check out the previous blog post for instructions.
By the end of this article you will know how to build existing applications on OpenShift, whether they already use
Continue reading “Test Driving OpenShift with the Red Hat Container Development Kit (CDK)”
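The CLI-driven flow described above can be sketched in a few commands. This is a minimal illustration, not the post’s exact steps: the server address and credentials assume the CDK’s default development setup, and the project name, app name, and Git URL are hypothetical placeholders.

```shell
# Log in to the OpenShift instance running inside the CDK VM
# (address and user assume the CDK's default development configuration).
oc login https://10.1.2.2:8443 -u openshift-dev

# Create a project to hold the application.
oc new-project demo

# Build and deploy an existing application straight from its Git repository;
# OpenShift selects a matching source-to-image builder for the code it finds.
oc new-app https://github.com/example/my-app.git --name=my-app

# Expose the service with a route so the app is reachable from the host.
oc expose service my-app

# Check on the build and deployment as they progress.
oc status
```

The same `oc new-app` command also accepts a local directory or an explicit builder image, which is handy when experimenting inside the CDK.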
Docker containers are used to package software applications into portable, isolated units. Developing software with containers helps developers create applications that will run the same way on every platform. However, modern microservice deployments typically use a scheduler such as Kubernetes to run in production. To fully simulate the production environment, developers need a local version of those production tools. In the Red Hat stack, this is supplied by the Red Hat Container Development Kit (CDK).
The Red Hat CDK is a customized virtual machine that makes it easy to run complex deployments resembling production. This means complex applications can be developed using production-grade tools from the very start
Continue reading “Getting Started with the Red Hat Container Development Kit (CDK)”
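The packaging idea above can be illustrated with a minimal Dockerfile. This is a sketch only: the base image tag, file names, and port are hypothetical, not taken from the post.

```dockerfile
# Package a small Python web application into a portable, isolated image.
# Base image tag is illustrative; pick one appropriate for your stack.
FROM registry.access.redhat.com/rhel7

# Copy the application source into the image.
COPY app.py /opt/app/app.py
WORKDIR /opt/app

# Document the port the application listens on.
EXPOSE 8080

# Run the application when a container is started from this image.
CMD ["python", "app.py"]
```

An image built from this file (`docker build -t my-app .`) carries the application and its runtime together, so it behaves the same on a developer’s laptop, inside the CDK, or under a scheduler such as Kubernetes.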
Red Hat will once again have a booth at this year’s RSA Conference. This time, however, we will have a bigger presence and more staff, featuring a number of Red Hat security experts with a variety of backgrounds. We will be covering not only Identity Management (IdM) but the broader landscape of security-related topics. Whether you’re interested in talking about high-level security strategy, a vision for adopting IdM at your organization, or are simply seeking practical tips on how to solve specific problems related to risk assessment, governance, compliance, or
Continue reading “Red Hat at RSA Conference 2016”
In a commissioned study conducted by Forrester Consulting on behalf of Red Hat, 44% of IT professionals identified performance as one of their top three concerns when adopting container technologies. Benchmarks indicate that containers deliver equal or better performance than virtual machines in almost all cases, with the runtime costs of containers described as “negligible”.
What are the abstraction costs, and what do you need to consider when running container-based applications on the Atomic Enterprise Platform Public Preview?
Continue reading “10-FEB Webcast: Wicked Fast Container-Based Apps and Performance Tuning with Atomic Enterprise Platform”
The rapid rise of Linux containers as an enterprise-ready technology in 2015, thanks in no small part to the technology provided by the Docker project, should come as no surprise: Linux containers offer a broad array of benefits to the enterprise, from greater application portability and scalability to the ability to fully leverage the benefits of composite applications.
But these benefits aside, Linux containers can, if IT security procedures are not followed, also cause serious harm to mission-critical operations. As Red Hat’s Lars Herrmann has pointed out, containers aren’t exactly transparent when it comes to seeing and understanding all of their internal code. This means that tools and technologies to actually see inside a container are critical to enterprises that want to deploy Linux containers in mission-critical scenarios.
Continue reading “Schrodinger’s Container: How Red Hat is Building a Better Linux Container Scanner”