RaiderChip expands its supported LLM portfolio with the addition of OpenGPT-X Teuken-7B
Combining this open-source model with RaiderChip's NPU enables portable AI solutions trained in all 24 official EU languages and compliant with European AI regulations.
Spain, December 5th, 2024
On December 2, the prestigious German research institute Fraunhofer IAIS presented the Teuken-7B LLM, a multilingual, open-source generative AI model from the German OpenGPT-X project.
This groundbreaking model offers significant advantages, including support for all 24 official EU languages, optimized efficiency when processing non-English text, and strict compliance with European regulations. These features position Teuken-7B as a standout alternative for commercial solutions that need to meet these rigorous standards.
RaiderChip’s decision to support this model aligns with its strategy to provide clients with all the flexibility and options needed to bring their generative AI-based products to market. In the words of Victor Lopez, its CTO: “The industry needs to bring AI solutions to the market that not only handle a wide range of tasks but also interact with us in the most natural way possible and in our native language. This is essential to fully integrate the potential of this technology into our daily lives, enabling its application in sectors such as automotive, medical devices, home automation systems, or customer service solutions. To achieve this, it is necessary to incorporate multilingual models that run in real-time, and that is what our NPU devices allow, in a fully offline and stand-alone manner.”
RaiderChip’s NPU already delivers Generative AI solutions locally, privately, and offline, designed to enable personal intelligent assistants. The arrival of the Teuken-7B model adds the capability to do so efficiently in the native languages of millions of users. This is made possible by the innovative work of the OpenGPT-X team, which developed a highly efficient multilingual tokenizer.
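For readers who want to see what an efficient multilingual tokenizer means in practice, the minimal sketch below counts tokens for the same sentence in a few EU languages: fewer tokens per sentence means less compute per prompt and response, which matters for real-time, on-device inference. The Hugging Face repository id and the use of trust_remote_code are assumptions for illustration, not details confirmed by RaiderChip or OpenGPT-X.

```python
# Minimal sketch: compare token counts across EU languages with a
# multilingual tokenizer. The repo id below is an assumption; check the
# openGPT-X organization on Hugging Face for the exact model name.
from transformers import AutoTokenizer

MODEL_ID = "openGPT-X/Teuken-7B-instruct-research-v0.4"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)

samples = {
    "English": "The assistant answers questions about the device.",
    "German":  "Der Assistent beantwortet Fragen zum Gerät.",
    "Spanish": "El asistente responde preguntas sobre el dispositivo.",
}

# Fewer tokens per sentence means fewer inference steps and lower latency.
for language, text in samples.items():
    n_tokens = len(tokenizer(text)["input_ids"])
    print(f"{language}: {n_tokens} tokens")
```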
Companies interested in trying the Generative AI NPU can contact RaiderChip for access to its FPGA demo or for a consultation on how its IP cores can accelerate their AI workloads.
More information at https://raiderchip.ai/technology/hardware-ai-accelerators