Red Hat has just released new updates to Red Hat AI, its portfolio of products and services designed to accelerate the development and deployment of AI solutions in the hybrid cloud. Red Hat AI offers an enterprise AI platform for model training and inference, providing greater efficiency, flexibility, and a simplified experience for deploying systems anywhere in the hybrid cloud.
In the quest to reduce the implementation costs of large language models (LLMs) to serve an increasing number of use cases, companies still face the challenge of integrating these systems with their proprietary data and accessing them from anywhere: whether in a data center, in the public cloud, or even at the edge.
By integrating both Red Hat OpenShift AI and Red Hat Enterprise Linux AI (RHEL AI), Red Hat AI addresses these concerns with an enterprise AI platform that enables the adoption of more efficient and optimized models, tuned with business-specific data, and deployable across the hybrid cloud to train and run models on a wide range of computing architectures.
According to Joe Fernandes, Vice President and General Manager of Red Hat's AI Business Unit, the update enables organizations to be more precise and cost-effective in their AI journeys. Red Hat knows that companies will need ways to manage the rising costs of their generative AI deployments as they bring more use cases into production and operate at scale. Red Hat AI helps organizations address these challenges by providing more efficient, purpose-built models trained on their own data, along with flexible inference across on-premises, cloud, and edge environments.
Red Hat OpenShift AI
Red Hat OpenShift AI offers a comprehensive AI platform to manage the lifecycle of predictive and generative AI (gen AI) in the hybrid cloud, including machine learning operations (MLOps) and Large Language Model Operations (LLMOps) capabilities. The platform provides functionalities to build predictive models and tune gen AI models, along with tools to simplify AI model management, from data science pipelines and models to model monitoring, governance, and much more.
The latest version of the platform, Red Hat OpenShift AI 2.18, adds new updates and capabilities to support Red Hat AI's goal of bringing more optimized and efficient AI models to the hybrid cloud. The main features include:
● Distributed serving: Available through the vLLM inference server, distributed serving allows IT teams to split model serving across multiple graphics processing units (GPUs). This helps alleviate the load on any single server, speeds up training and fine-tuning, and promotes more efficient use of computing resources, while also helping distribute serving of AI models across nodes (see the sketch after this list).
● End-to-end model tuning experience: Using InstructLab and the Red Hat OpenShift AI data science pipelines, this new feature helps simplify the fine-tuning of LLMs, making it more scalable, efficient, and auditable in large production environments, while also providing management through the Red Hat OpenShift AI dashboard.
● AI Guardrails: Red Hat OpenShift AI 2.18 helps improve the accuracy, performance, latency, and transparency of LLMs through a preview of AI Guardrails technology, which monitors and safeguards both user inputs and model outputs. AI Guardrails offers additional detection capabilities to help IT teams identify and mitigate potentially hateful, abusive, or profane speech, personally identifiable information, competitor data, or other information restricted by corporate policies (a conceptual sketch follows this list).
● Model evaluation: Using the language model evaluation component (lm-eval) to provide important insights into overall model quality, model evaluation lets data scientists compare the performance of their LLMs across a range of tasks, from logical and mathematical reasoning to adversarial natural language, helping to create more effective, responsive, and tailored AI models (an example invocation follows below).
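To make the distributed serving idea concrete, here is a minimal sketch using the upstream vLLM Python API to shard a model across GPUs with tensor parallelism. The model ID and GPU count are illustrative assumptions, not an OpenShift AI-specific configuration.

```python
# Minimal sketch of multi-GPU serving with the upstream vLLM Python API.
# The model ID and GPU count are illustrative assumptions.
from vllm import LLM, SamplingParams

# tensor_parallel_size=2 shards the model's weights across two GPUs, so no
# single card has to hold (or serve) the entire model on its own.
llm = LLM(
    model="ibm-granite/granite-3.1-8b-instruct",
    tensor_parallel_size=2,
)

sampling = SamplingParams(temperature=0.2, max_tokens=128)
outputs = llm.generate(
    ["Explain the benefit of distributed inference in one paragraph."],
    sampling,
)
print(outputs[0].outputs[0].text)
```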
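The guardrails idea described above amounts to screening user inputs and model outputs against policy checks before anything reaches the user. The sketch below is purely conceptual, assuming simple regex-based PII and restricted-term checks; it is not the Red Hat OpenShift AI Guardrails API.

```python
# Conceptual sketch of input/output guardrails: NOT the OpenShift AI Guardrails
# API, just an illustration of the screening pattern described above.
import re

# Hypothetical policy: flag obvious PII and a configurable restricted-term list.
PII_PATTERNS = [
    re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),            # email addresses
    re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"),  # phone-number-like strings
]
RESTRICTED_TERMS = {"competitor x"}  # placeholder corporate-policy terms

def violates_policy(text: str) -> bool:
    lowered = text.lower()
    if any(term in lowered for term in RESTRICTED_TERMS):
        return True
    return any(pattern.search(text) for pattern in PII_PATTERNS)

def guarded_generate(prompt: str, generate) -> str:
    """Wrap any generate(prompt) -> str callable with input and output checks."""
    if violates_policy(prompt):
        return "Request blocked: input violates content policy."
    response = generate(prompt)
    if violates_policy(response):
        return "Response withheld: output violates content policy."
    return response

# Example with a stub model standing in for a real LLM call:
print(guarded_generate("My email is jane@example.com", lambda p: "All clear."))
```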
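For model evaluation, the lm-eval name matches the open source lm-evaluation-harness, so a minimal sketch with that upstream library is shown below; the model ID, task selection, and the exact way OpenShift AI wires this in are assumptions for illustration.

```python
# Minimal sketch using the upstream lm-evaluation-harness ("lm-eval") library;
# the OpenShift AI integration itself is not shown here.
import lm_eval

# Evaluate a Hugging Face model on reasoning-oriented benchmark tasks.
# Model ID and task selection are illustrative assumptions.
results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=ibm-granite/granite-3.1-8b-instruct",
    tasks=["arc_challenge", "gsm8k"],
    batch_size=8,
)

# Per-task metrics (e.g., accuracy) live under results["results"].
for task, metrics in results["results"].items():
    print(task, metrics)
```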
RHEL AI
Part of the Red Hat AI portfolio, RHEL AI is a foundation model platform for developing, testing, and running LLMs more consistently to power enterprise applications. RHEL AI offers the Granite family of LLMs and the InstructLab model alignment tools, which are packaged as a bootable Red Hat Enterprise Linux image that can be deployed across the hybrid cloud.
Released in February 2025, RHEL AI 1.4 brought several improvements, including:
● Support for the Granite 3.1 8B model: the latest addition to the open source-licensed Granite model family. The model adds multilingual support for inference and taxonomy/knowledge customization (developer preview), as well as a 128k context window to improve summarization and retrieval-augmented generation (RAG) results (see the loading sketch after this list).
● New graphical user interface for contributing skills and knowledge: available as a developer preview, it aims to simplify data ingestion and chunking, and to let users add their own skills and knowledge contributions to AI models.
● Document Knowledge-bench (DK-bench): facilitates comparisons between AI models fine-tuned with relevant private data and the performance of the same base models without fine-tuning.
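As an illustration of the 128k context window in practice, the sketch below loads a Granite 3.1 8B Instruct build from Hugging Face with transformers and runs a long-document summarization prompt. The model ID refers to the upstream Granite release and is an assumption here, not a RHEL AI-packaged artifact; the input file is a placeholder.

```python
# Minimal sketch: long-context summarization with a Granite 3.1 8B Instruct
# build via Hugging Face transformers. The model ID is an assumed upstream
# release, not a RHEL AI-packaged artifact.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ibm-granite/granite-3.1-8b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# The 128k-token context window means very long reports (or large batches of
# retrieved RAG passages) can be placed directly in the prompt.
long_document = open("quarterly_report.txt").read()  # placeholder input file
messages = [{"role": "user", "content": "Summarize this report:\n" + long_document}]

inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```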
Red Hat AI InstructLab on IBM Cloud
More and more, companies are seeking AI solutions that prioritize the accuracy and security of their data while keeping costs and complexity as low as possible. The Red Hat AI InstructLab, available as a service on IBM Cloud, was designed to simplify, scale, and help improve security in the training and deployment of AI systems. By simplifying the model tuning process in InstructLab, organizations can build more efficient platforms tailored to their unique needs while maintaining control over their confidential information.
Free training on the Fundamentals of AI
AI is a transformative opportunity that is redefining how companies operate and compete. To support organizations in this dynamic scenario, Red Hat offers free online training on AI Fundamentals. The company is offering two AI learning certificates, aimed at both experienced senior leaders and beginners, helping to educate users of all levels on how AI can help transform business operations, streamline decision-making, and drive innovation.
Availability
Red Hat OpenShift AI 2.18 and Red Hat Enterprise Linux AI 1.4 are now available. More information about additional features, improvements, bug fixes, and how to update Red Hat OpenShift AI to the latest version can be found here, and the latest version of RHEL AI can be found here.
The Red Hat AI InstructLab on IBM Cloud will be available soon. Red Hat's AI Fundamentals training is now available to customers.