AI and the BooM Stack
Containerized Frameworks
openEuler supports a number of popular AI development frameworks in containerized form for easy deployment and integration. In a few clicks, you can prepare a complete development environment for:
- PyTorch
- TensorFlow
- MindSpore
Other images support the CUDA and CANN (Compute Architecture for Neural Networks, Huawei's toolkit for Ascend NPUs) development environments. This modular, containerized design lets you pull in exactly the development tools you need without the clutter of the tools you don't, and if your needs change, you can adapt just as easily.
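Exact image names and tags vary by openEuler release, so check the project's container registry for the current PyTorch, TensorFlow, and MindSpore images. Once inside a PyTorch container, a quick sanity check like the following (a minimal sketch that assumes only a working PyTorch install) confirms that the framework imports cleanly and reports whether an accelerator backend is visible:

```python
# Minimal sanity check inside a containerized PyTorch environment:
# verify the framework version and whether a CUDA device is visible.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available: ", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))
```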
Acceleration
An accelerated system means faster training and shorter response times for AI queries. openEuler integrates several tools for accelerated operation in an AI context. The sysHAX LLM acceleration runtime enhances inference performance, improving CPU throughput via NUMA-aware scheduling, parallelized matrix operations, and inference operators optimized for the Arm Scalable Vector Extension (SVE). openEuler also provides the Expert Kit (EK) high-performance framework for scalable Mixture of Experts (MoE) LLM inference. (MoE is a machine learning technique in which a gating network routes each input to a small subset of specialized expert networks, so the experts divide the problem space and only a fraction of the model's parameters are active per query. MoE layers are often used in Transformer models.)
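To make the MoE idea concrete, the sketch below (an illustration of the general technique, not the Expert Kit or sysHAX implementation) builds a toy MoE layer in PyTorch: a gating network scores the experts for each token, and only the top two experts actually run, which is why MoE models can be cheaper to serve than dense models of the same parameter count:

```python
# Toy Mixture of Experts (MoE) layer: a gating network routes each
# token to its top-k experts, and only those experts are evaluated.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, dim, num_experts=8, top_k=2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, num_experts)            # the router
        self.experts = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(),
                           nn.Linear(4 * dim, dim))
             for _ in range(num_experts)]
        )

    def forward(self, x):                                  # x: (tokens, dim)
        scores = self.gate(x)                              # (tokens, num_experts)
        weights, idx = scores.topk(self.top_k, dim=-1)     # pick top-k experts
        weights = F.softmax(weights, dim=-1)               # normalize their weights
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                      # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

tokens = torch.randn(16, 64)                               # 16 tokens, 64 features
print(MoELayer(64)(tokens).shape)                          # torch.Size([16, 64])
```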
Another powerful acceleration technology built into openEuler is KTransformers (pronounced “Quick Transformers”), an innovative tool developed at Tsinghua University. The Python-centric KTransformers framework offers kernel optimizations and parallel processing strategies. According to the KTransformers developers, “By implementing and injecting an optimized module with a single line of code, users gain access to a Transformers-compatible interface, RESTful APIs compliant with OpenAI and Ollama, and even a simplified ChatGPT-like web UI” [3].
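Because the server side speaks the OpenAI REST dialect, any standard chat-completions client can talk to it. The snippet below is a minimal sketch using only the Python standard library; the localhost URL, port, and model name are assumptions that depend on how the local server is launched:

```python
# Illustrative client for an OpenAI-compatible chat endpoint such as
# the one KTransformers can expose. The URL, port, and model name are
# assumptions; adjust them to match your local server configuration.
import json
import urllib.request

payload = {
    "model": "local-model",                        # hypothetical model name
    "messages": [{"role": "user",
                  "content": "Explain what a Mixture of Experts layer does."}],
}
req = urllib.request.Request(
    "http://localhost:8000/v1/chat/completions",   # assumed local endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)

print(reply["choices"][0]["message"]["content"])
```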