
Clika is building a platform to make AI models run faster


Ben Asaf spent several years building developer infrastructure at Mobileye, the autonomous driving startup that Intel acquired in 2017, while working on methods to accelerate AI model training at Hebrew University.

An expert in MLOps (machine learning operations) — tools for streamlining the process of taking AI models to production and then maintaining and monitoring them — Asaf was inspired to launch a company that would remove the major roadblocks keeping software engineers and firms from deploying AI models into production.

“When I first came to the idea of starting a company, not many companies and people had the industrial experience of having built or implemented the practice of MLOps into their AI development pipeline,” Asaf told TechCrunch in an email interview. “I thought we could make AI ‘tiny’ for lighter, faster and more affordable to productize and commercialize.”

In 2021, Asaf teamed up with Nayul Kim, his wife, who’d been working as a digital transformation consultant for enterprises, to found Clika, one of the startups competing in the Startup Battlefield 200 competition at TechCrunch Disrupt. Clika provides a toolkit for companies to automatically “downsize” their internally developed AI models, reducing the amount of compute power they consume and, as an added benefit, speeding up their inferencing.

“With Clika, you simply connect your pre-trained AI models and get an ‘auto-magically’ compressed model fully compatible with a target device — a server, the cloud, the edge or an embedded device,” Asaf said.

To achieve this feat, Clika relies on techniques such as quantization, which reduces the number of bits — the smallest unit of data on a computer — needed to represent a model's information. By sacrificing some precision, quantization can shrink a model without interfering with its ability to perform a particular task, such as identifying different dog breeds.
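Clika hasn't published the details of its compression engine, but as a rough illustration of the general idea, here is a minimal sketch of post-training quantization using PyTorch's built-in tooling. The toy model and the choice of framework are assumptions for the example, not Clika's actual method:

```python
import torch
import torch.nn as nn

# Stand-in for a pre-trained model (illustrative only).
model = nn.Sequential(
    nn.Linear(128, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)
model.eval()

# Post-training dynamic quantization: the Linear layers' weights are stored
# as 8-bit integers instead of 32-bit floats, shrinking the model and
# typically speeding up CPU inference at the cost of some precision.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

with torch.no_grad():
    x = torch.randn(1, 128)
    print(quantized(x).shape)  # torch.Size([1, 10])
```

The trade-off is the one described above: fewer bits per weight means a smaller, faster model, with accuracy usually close to the original on the task it was trained for.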

Clika also generates a report suggesting changes that could further improve a model's performance.

Interest in ways to make models more efficient is growing as the AI industry faces supply chain issues related to the hardware necessary to run these models. Microsoft recently warned in an earnings report that Azure customers might face service disruptions due to AI hardware shortages. Meanwhile, Nvidia’s best-performing AI chips — the H100 GPU series — are reportedly sold out until 2024.

Of course, Clika isn’t the only startup chasing after approaches to compress AI models. There’s Deci, backed by Intel; OctoML, which, like Clika, automatically optimizes and packages models for an array of different hardware; and CoCoPie, a startup creating a platform to optimize AI models specifically for edge devices.

But Asaf argues that Clika has a technological advantage.

“While other solutions use rule-based techniques for compression, Clika’s compression engine has [an] AI approach of understanding different AI model structures and applying the best compression method for each unique AI model,” he said. “We have the world’s best compression toolkit for vision AI, outperforming the performance of existing solution[s] developed by Meta and Nvidia.”

“World’s best” is quite the claim. But, for what it’s worth, Clika has managed to win over investors, raising $1.1 million in a pre-seed round last year with participation from Kimsiga Lab, Dodam Ventures, D-Camp and angel investor Lee Sanghee.

Asaf wasn’t ready to talk about customer momentum just yet — Clika’s currently pre-revenue, running a closed beta for a “select few businesses.” But he said that Clika plans to pursue seed funding sometime “soon.”
