The explosion of AI companies has pushed demand for computing power to new extremes, and companies like CoreWeave, Together AI, and Lambda Labs have capitalized on that demand, attracting immense amounts of attention and capital for their ability to provide distributed compute capacity.
But most companies still store data with the big three cloud providers, AWS, Google Cloud, and Microsoft Azure, whose storage systems were built to keep data close to their own compute resources, not spread across multiple clouds or regions.
“Modern AI workloads and AI infrastructure are choosing distributed computing instead of big cloud,” Ovais Tariq, co-founder and CEO of Tigris Data, told TechCrunch. “We want to provide the same option for storage, because without storage, compute is nothing.”
Tigris, founded by the team that developed Uber’s storage platform, is building a network of localized data storage centers that it claims can meet the distributed compute needs of modern AI workloads. The startup’s AI-native storage platform “moves with your compute, [allows] data [to] automatically replicate to where GPUs are, supports billions of small files, and provides low-latency access for training, inference, and agentic workloads,” Tariq said.
To do all of that, Tigris recently raised a $25 million Series A round led by Spark Capital, with participation from existing investors including Andreessen Horowitz, TechCrunch has exclusively learned. The startup is going up against the incumbents, which Tariq calls “Big Cloud.”
Tariq feels these incumbents not only offer a more expensive data storage service, but also a less efficient one. AWS, Google Cloud, and Microsoft Azure have traditionally charged egress fees (dubbed a “cloud tax” in the industry) if a customer wants to migrate to another cloud provider, or download and move their data if they want to, say, use a cheaper GPU or train models in different parts of the world simultaneously. Think of it like having to pay your gym extra if you want to stop going there.
According to Batuhan Taskaya, head of engineering at Fal.ai, one of Tigris’ customers, these costs once accounted for the majority of Fal’s cloud spending.
Beyond egress fees, Tariq says there’s still the issue of latency with bigger cloud providers. “Egress fees were just one symptom of a deeper problem: centralized storage that can’t keep up with a decentralized, high-speed AI ecosystem,” he said.
Most of Tigris’ 4,000+ customers are like Fal.ai: generative AI startups building image, video, and voice models, which tend to have large, latency-sensitive datasets.
“Imagine talking to an AI agent that’s doing local audio,” Tariq said. “You want the lowest latency. You want your compute to be local, close by, and you want your storage to be local, too.”
Big clouds aren’t optimized for AI workloads, he added. Streaming large datasets for training or running real-time inference across multiple regions can create latency bottlenecks, slowing model performance. But being able to access localized storage means data is retrieved faster, which means developers can run AI workloads reliably and more cost-effectively using decentralized clouds.
“Tigris lets us scale our workloads in any cloud by providing access to the same data filesystem from all these places without charging egress,” Fal’s Taskaya said.
There are other reasons why companies want their data closer to their distributed cloud deployments. For example, in highly regulated fields like finance and healthcare, one big roadblock to adopting AI tools is that enterprises need to ensure data security.
Another motivation, says Tariq, is that companies increasingly want to own their data, pointing to how Salesforce earlier this year blocked its AI rivals from using Slack data. “Companies are becoming more and more aware of how important the data is, how it’s fueling the LLMs, how it’s fueling the AI,” Tariq said. “They want to be more in control. They don’t want someone else to be in control of it.”
With the fresh funds, Tigris intends to keep building out its data storage centers to support growing demand; Tariq says the startup has grown 8x annually since its founding in November 2021. Tigris already has three data centers, in Virginia, Chicago, and San Jose, and wants to continue expanding in the U.S. as well as in Europe and Asia, particularly in London, Frankfurt, and Singapore.