Red Hat has just launched new updates to Red Hat AI, its portfolio of products and services designed to accelerate the development and deployment of AI solutions in the hybrid cloud. Red Hat AI provides an enterprise AI platform for model training and inference, delivering deeper expertise, greater flexibility, and a streamlined experience for deploying systems anywhere in the hybrid cloud.
As they work to reduce the cost of putting large language models (LLMs) into production for a growing number of use cases, companies still face the challenge of integrating these models with their proprietary data and running them anywhere, whether in a data center, the public cloud, or at the edge.
By combining Red Hat OpenShift AI and Red Hat Enterprise Linux AI (RHEL AI), Red Hat AI addresses these concerns with an enterprise AI platform built around more efficient, optimized models that can be fine-tuned on business-specific data, deployed across the hybrid cloud, and trained on a wide range of computing architectures.
According to Joe Fernandes, Vice President and General Manager of Red Hat’s AI Business Unit, this update enables organizations to be precise and cost-effective in their AI journeys. “Red Hat understands that companies will need ways to manage the rising costs of their generative AI deployments as they bring more use cases into production and operate at scale. Red Hat AI helps organizations address these challenges by allowing them to access more efficient, purpose-built models, trained with their data, and enabling flexible inference in on-premises, cloud, and edge environments.”
Red Hat OpenShift AI
Red Hat OpenShift AI offers a comprehensive AI platform for managing the lifecycles of predictive and generative AI (Gen AI) in the hybrid cloud, including machine learning operations (MLOps) and large language model operations (LLMOps) capabilities. The platform provides functionalities for building predictive models and fine-tuning Gen AI models, along with tools to simplify AI model management—from data science pipelines and models to model monitoring, governance, and more.
The latest version of the platform, Red Hat OpenShift AI 2.18, adds new updates and capabilities to support Red Hat AI’s goal of bringing more optimized and efficient AI models to the hybrid cloud. Key features include:
●Distributed serving: Delivered through the vLLM inference server, distributed serving allows IT teams to split model serving across multiple graphics processing units (GPUs). This helps alleviate the load on any single server, speeds up training and fine-tuning, and promotes more efficient use of computing resources while distributing serving across nodes for AI models (see the vLLM sketch after this list).
●End-to-end model tuning experience: Using InstructLab and Red Hat OpenShift AI data science pipelines, this new feature helps simplify the fine-tuning of LLMs, making it more scalable, efficient, and auditable in large production environments, while delivering management through the Red Hat OpenShift AI dashboard.
●AI Guardrails: Red Hat OpenShift AI 2.18 improves the accuracy, performance, latency, and transparency of LLMs through a technology preview of AI Guardrails, which monitors and protects both user inputs and model outputs. AI Guardrails offers additional detection capabilities to help IT teams identify and mitigate potentially hateful, abusive, or profane speech, personally identifiable information, competitor data, or other content restricted by corporate policies.
●Model evaluation: Using the language model evaluation component (lm-eval) to provide key insights into overall model quality, model evaluation allows data scientists to compare the performance of their LLMs across a range of tasks, from logical and mathematical reasoning to adversarial natural language, helping create more effective, responsive, and tailored AI models (see the lm-eval sketch after this list).
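To illustrate what the distributed serving capability described above looks like at the vLLM level, here is a minimal sketch that splits a model across two GPUs with tensor parallelism. The model id and GPU count are placeholder assumptions, and on Red Hat OpenShift AI this configuration is normally handled by the platform's serving runtime rather than hand-written Python.

```python
# Minimal vLLM sketch: tensor-parallel serving across multiple GPUs.
# Assumes a host with at least two GPUs and the vllm package installed;
# the model id and parallelism degree below are illustrative placeholders.
from vllm import LLM, SamplingParams

llm = LLM(
    model="ibm-granite/granite-3.1-8b-instruct",  # assumed model id
    tensor_parallel_size=2,                       # shard weights across 2 GPUs
)

params = SamplingParams(temperature=0.2, max_tokens=128)
outputs = llm.generate(["Summarize the benefits of hybrid cloud AI."], params)
print(outputs[0].outputs[0].text)
```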
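Similarly, here is a hedged sketch of the model evaluation flow built on lm-eval (the lm-evaluation-harness project). The model id, task list, and few-shot settings are assumptions for illustration; OpenShift AI surfaces these results through its own tooling rather than a local script.

```python
# Minimal lm-evaluation-harness sketch: score a Hugging Face model on tasks.
# Model id, tasks, and settings are illustrative assumptions only.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",                                   # Hugging Face backend
    model_args="pretrained=ibm-granite/granite-3.1-8b-instruct",
    tasks=["gsm8k", "hellaswag"],                 # reasoning and language tasks
    num_fewshot=5,
    batch_size=8,
)
print(results["results"])                         # per-task metric summary
```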
RHEL AI
Part of the Red Hat AI portfolio, RHEL AI is a foundation model platform for developing, testing, and running LLMs more consistently to power enterprise applications. RHEL AI packages Granite LLMs and the InstructLab model alignment tools in a bootable Red Hat Enterprise Linux image that can be deployed across the hybrid cloud.
Launched in February 2025, RHEL AI 1.4 brought several improvements, including:
●Granite 3.1 8B model support as the latest addition to the open source-licensed Granite model family. The model adds multilingual support for inference and taxonomy/knowledge customization (developer preview), along with a 128k context window for improved summarization and Retrieval-Augmented Generation (RAG) tasks (see the Granite long-context sketch after this list).
●New graphical user interface for contributing skills and knowledge, available as a developer preview, designed to simplify data ingestion and chunking while allowing users to add their own skills and contributions to AI models.
●Document Knowledge-bench (DK-bench) to facilitate comparisons between AI models fine-tuned with relevant private data and the performance of the same untuned base models.
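As a rough sketch of how the 128k context window helps with RAG-style workloads, the example below packs several retrieved passages into a single long prompt for a Granite model served with vLLM. The model id, passages, and question are assumptions for illustration; RHEL AI ships the model and serving stack, and a real deployment would pull passages from a vector store or search index.

```python
# Minimal RAG-style sketch with a long-context Granite model via vLLM.
# The model id, passages, and question are illustrative assumptions.
from vllm import LLM, SamplingParams

retrieved_passages = [
    "Passage 1: excerpt from an internal policy document...",
    "Passage 2: excerpt from a product manual...",
]
question = "What does the policy say about data retention?"

# With a 128k-token context window, many long passages can be included in
# one prompt instead of being aggressively truncated.
prompt = (
    "Answer the question using only the context below.\n\n"
    "Context:\n" + "\n\n".join(retrieved_passages) + "\n\n"
    "Question: " + question + "\nAnswer:"
)

llm = LLM(model="ibm-granite/granite-3.1-8b-instruct")  # assumed model id
outputs = llm.generate([prompt], SamplingParams(temperature=0.0, max_tokens=256))
print(outputs[0].outputs[0].text)
```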
Red Hat AI InstructLab on IBM Cloud
Increasingly, companies are seeking AI solutions that prioritize the accuracy and security of their data while keeping costs and complexity as low as possible. Red Hat AI InstructLab, available as a service on IBM Cloud, is designed to simplify, scale, and help improve security in the training and deployment of AI systems. By simplifying InstructLab model tuning, organizations can build more efficient platforms tailored to their unique needs while maintaining control of their confidential information.
Free AI Fundamentals Training
AI is a transformative opportunity that is redefining how businesses operate and compete. To support organizations in this dynamic landscape, Red Hat offers free online training on AI Fundamentals. The company is providing two AI learning certificates, aimed at both experienced senior leaders and beginners, helping educate users at all levels about how AI can help transform business operations, streamline decision-making, and drive innovation.
Availability
Red Hat OpenShift AI 2.18 and Red Hat Enterprise Linux AI 1.4 are now available. More information about additional features, improvements, bug fixes, and how to update your version of Red Hat OpenShift AI to the latest can be found here, and the latest version of RHEL AI can be found here.
Red Hat AI InstructLab on IBM Cloud will be available soon. Red Hat’s AI Fundamentals training is already available to customers.