
Deci Open-Sources Foundational Models, Surpasses Meta’s LLaMA & Stable Diffusion

Deci, a deep learning company, has launched its generative AI foundation models, which deliver up to 15 times the throughput of Meta’s comparable model. DeciDiffusion, DeciLM 6B, and its new Inference Software Development Kit (SDK) set a new standard for performance and cost efficiency in generative AI. Unlike closed-source API models, Deci provides unrestricted access to its models, which can be self-hosted anywhere.

The computational requirements for training and inference of generative AI models hinder teams from cost-effectively launching and scaling genAI applications. With its latest releases, the Israel-based company is directly addressing this gap, making inference at scale efficient, cost-effective, and ready for enterprise-grade integration.

Better and More Accurate

AI researchers can reduce their inference compute costs by up to 80% by using Deci’s open-source models and Infery LLM, running on widely available GPUs such as the NVIDIA A10. The models cater to diverse applications, ranging from content and code generation to image creation and chatbots.

DeciDiffusion 1.0 is a blazing-fast text-to-image model that generates high-quality images in under a second, three times faster than its competitor, Stable Diffusion 1.5.

DeciLM 6B, with its 5.7 billion parameters, stands apart for its inference speed, which is 15 times faster than Meta’s LLaMA 2 7B. Rounding out the lineup is DeciCoder, a 1-billion-parameter code generation LLM that delivers exceptional inference speed while matching or exceeding accuracy standards.
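
Since the models are released openly, a minimal sketch of how one might load DeciLM 6B for text generation with the Hugging Face transformers library is shown below. The model identifier Deci/DeciLM-6b, the trust_remote_code flag, and the generation settings are assumptions based on Deci’s public Hub releases rather than details stated in this announcement; consult the official model card before use.

```python
# Minimal sketch: running DeciLM 6B for text generation with Hugging Face
# transformers. The Hub ID "Deci/DeciLM-6b" and the need for
# trust_remote_code are assumptions; check Deci's model card for exact usage.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Deci/DeciLM-6b"  # assumed Hugging Face Hub identifier

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,   # half precision fits on a single 24 GB A10
    device_map="auto",
    trust_remote_code=True,       # the model ships a custom architecture
)

prompt = "Write a short product description for a smart thermostat."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

At bfloat16 precision, a 5.7-billion-parameter model occupies roughly 11–12 GB of weights, which is why a single 24 GB GPU such as the A10 is sufficient for inference.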

Yonatan Geifman, Deci’s CEO and co-founder, emphasised the need for mastery over model quality, the inference process, and cost in the world of generative AI. 

These models were crafted using Deci’s proprietary Neural Architecture Search technology, AutoNAC. Alongside its foundation models, Deci has introduced Infery LLM, an inference SDK that enables developers to achieve a significant speed-up on existing LLMs while retaining the desired accuracy.

“With Deci’s solutions, companies receive both enterprise-grade quality and control, as well as the flexibility to customise models and the inference process according to their precise requirements,” said Prof. Ran El-Yaniv, Chief Scientist and co-founder of Deci.

“DeciLM-6B has set a new gold standard, outperforming Llama 2 7B’s throughput by an astonishing 15 times. This achievement is attributed to Deci’s cutting-edge neural architecture search engine, AutoNAC,” said Akshay Pachaar, lead data scientist at TomTom, in a tweet.

This post first appeared on Analytics India Magazine.
