Generative artificial intelligence (GenAI) is on many people's minds right now. Some are excited to see what kind of productivity boosts they can get with it, while others are concerned about the ethics of how the technology is being developed.
I will let you be the judge of that. Meanwhile, many organizations have been pushing for more openness in the field of LLM development, with one of the most recent being Red Hat, which has introduced a new foundation model platform built around open source.
Let's see what they have been up to. 😃
What Is RHEL AI?
Red Hat Enterprise Linux AI is a foundation model platform for simplifying the development, testing, and running of GenAI models in an enterprise setting.
RHEL AI builds on the open-source InstructLab project: Red Hat has combined IBM's open-source Granite family of models with InstructLab's model alignment tools (based on the LAB methodology) into a bootable RHEL image that can be quickly deployed on servers.
With this in place, developers can get started quickly with GenAI development, as the bootable image includes popular AI libraries and an optimized software stack for hardware accelerators (GPUs) from the likes of NVIDIA, Intel, and AMD.
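To give a feel for what working against such a stack looks like, here is a minimal sketch of querying a locally served Granite model through an OpenAI-compatible chat endpoint, the kind of interface InstructLab's model server exposes. The URL, port, and model name are illustrative assumptions, not values taken from Red Hat's documentation.

```python
# Minimal sketch: send a chat request to a locally served model via an
# OpenAI-compatible endpoint. Endpoint URL and model name are assumptions.
import requests

API_URL = "http://localhost:8000/v1/chat/completions"  # assumed local endpoint

payload = {
    "model": "granite-7b-lab",  # hypothetical model identifier
    "messages": [
        {"role": "user", "content": "Summarize what RHEL AI provides in one sentence."}
    ],
    "temperature": 0.2,
}

response = requests.post(API_URL, json=payload, timeout=60)
response.raise_for_status()

# OpenAI-compatible servers return the generated text under choices[0].
print(response.json()["choices"][0]["message"]["content"])
```

The appeal of an OpenAI-compatible interface is that existing client code and libraries can point at the local server with little more than a URL change.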
Commenting on the launch, Joe Fernandes, VP and GM of Foundation Model Platforms at Red Hat, noted that:
RHEL AI provides the ability for domain experts, not just data scientists, to contribute to a built-for-purpose gen AI model across the hybrid cloud, while also enabling IT organizations to scale these models for production through Red Hat OpenShift AI.
I suggest you give the previously published announcement blog a read, as it contains in-depth information on how RHEL AI came to be.
💰 Pricing and Availability
If you are interested, you will have to contact Red Hat via the official RHEL AI product page for pricing. Currently, you can run it on a server of your choice or as a "bring your own subscription" offering on AWS and IBM Cloud.
Red Hat mentions that RHEL AI will also be available directly through its OEM partners, and that it is working on expanding its range of cloud and OEM partners.