AI has transformative potential. But ask the co-founders of Modular, a startup emerging from stealth today, and they’ll tell you that the software used to develop it is “monolithic,” fragmented into silos and layered with complexity. Big Tech companies have made helpful contributions such as TensorFlow and PyTorch, the AI development frameworks maintained by Google and Facebook, respectively. But these companies, Modular’s co-founders say, favor their own tooling and infrastructure at the expense of AI’s advancement.
Modular wants to change that. Founded by former Apple and Google engineers and execs, the company today closed a $30 million round led by GV (formerly Google Ventures), with participation from Greylock, The Factory and SV Angel, to deliver its vision of a streamlined, platform-agnostic AI development system.
“The industry struggles to maintain and scale fragmented, custom toolchains that differ across research and production, training and deployment, server and edge,” Modular CEO Chris Lattner told TechCrunch in an email interview. “Many of the world’s largest companies outside of Big Tech naively believe that the open source community, and the open source infrastructure owned by Google, Meta and Nvidia, will eventually provide this, when those companies’ priorities and limitations demonstrate otherwise.”
Lattner has an impressive résumé, having spearheaded the creation of Swift, the programming language that powers much of the Apple ecosystem. He was previously the VP of Tesla’s self-driving division and president of engineering and product at SiFive, which provides intellectual property to chip design companies. During his tenure at Google, Lattner managed and built a range of AI-related products, including TPUs at Google Brain, one of Google’s AI-focused research divisions, and TensorFlow.
Modular’s other co-founder, Tim Davis, helped define the vision, strategy and roadmaps for Google’s machine learning products, from small research groups to production systems. From 2020 to early 2022, Davis was the product lead for Google’s machine learning APIs, compilers and runtime infrastructure for server and edge devices.
“The most pressing issue for non-Big Tech companies is how to produce AI within the confines of performance, cost, time and talent. The opportunity costs of this challenge are enormous. For individual companies, it means slower innovation to market, inferior product experiences and, ultimately, a negative impact on their bottom line,” said Lattner. “AI can change the world, but not until this fragmentation is resolved and the global developer community can focus on solving real-world problems rather than on the infrastructure itself.”
Modular’s solution is a platform that unites popular AI framework frontends through modular, “composable” common components. Details are a bit murky (it’s early days, Lattner cautioned), but Modular’s goal is to let developers plug in custom hardware to train AI systems, deploy those systems to edge devices or servers, and otherwise “seamlessly scale” them across hardware so that deploying the latest AI research into production “just works,” Lattner said.
In this, Modular overlaps with the emerging category of MLOps vendors, which deliver tools to collect, label and transform the data needed to train AI systems, as well as workflows for writing, deploying and monitoring AI. MLOps, short for “machine learning operations,” seeks to streamline the AI lifecycle by automating and standardizing development workflows, much as DevOps did for software.
Driven by the accelerating adoption of AI, analytics firm Cognilytica predicts that the global market for MLOps solutions will be worth $4 billion by 2025, up from $350 million in 2019. In a recent survey, Forrester found that 73% of companies believe MLOps adoption would keep them competitive, while 24% say it would make them a market leader.
“Modular’s main competition is the mindset that dominates AI software development within Big Tech, and Big Tech itself,” said Lattner. “The reason those companies are successful in deploying AI is that they are amassing armies of incredibly talented AI developers and using their massive computing power and financial resources to advance their own efforts and products, including their own clouds and AI hardware. Despite their incredible contributions to the field, their self-preferencing highlights a deep divide in AI and puts an industry-limiting ceiling on the rest of the world’s ability to use this technology to tackle some of our most important socioeconomic and environmental problems.”
Lattner, without naming names, claims that Modular is already working with “some of the biggest [firms] in technology.” The short-term focus is on expanding Modular’s 25-person team and getting the platform ready for launch in the coming months.
“Changing economic conditions mean the world’s largest AI companies, which have spent billions on AI, must focus on putting AI into production, and making money with it, rather than tinkering,” Lattner said. “Many of the best and brightest computer scientists, essentially the 100x engineers inside organizations where 10x engineers are the norm, are just fighting to maintain these systems and make them work for basic use cases, most of which center on revenue-optimization projects. To that end, technical decision-makers are looking for infrastructure that is more usable, flexible and performant, streamlining end-to-end AI development and deployment and enabling AI research to move into production faster. Really, they are just looking for far greater value from AI at lower implementation cost.”