Productization is an affirmation. When any good or service evolves to a point where it can be productized and boxed, it denotes a certain solidity and branded consumability. Whether we’re talking about a packet of instant noodles, a pair of shoes or a standard haircut, when a commercial proposition is productized, it becomes easier for consumers to know its price, function and scope.
Back in pre-millennial times, all consumer-level (and a good deal of enterprise-level) software was sold this way. Microsoft Office used to come in a smart-looking box, sometimes with a tough clear plastic outer shell. Rapid application development tools would come in a nice package, sometimes with a manual… and the expansive Adobe Creative Suite master collection would come in the biggest box of all, housing around eight disks inside.
As we know, times have changed and software that works at this level now comes as a web-based download or as a cloud-hosted always-on service. But one thing hasn’t changed: software is still productized as a means of signaling ratified, validated functionality with a clearly defined scope.
That same process is now happening with artificial intelligence and we call it Model-as-a-Service, or MaaS. This productization technique enables cloud-centric software engineers to get hold of prebuilt, preconfigured, pre-trained machine learning models for a whole range of AI functions.
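To make the MaaS consumption pattern concrete, the sketch below shows the shape of a typical request to a hosted, pre-trained model. The endpoint URL, model name and payload fields are hypothetical placeholders for illustration only; real providers each define their own API, but the principle is the same: the developer selects a pre-built model by name and sends data to it, with no local training or infrastructure required.

```python
import json

# Hypothetical MaaS endpoint -- a real provider would publish its own URL.
MAAS_ENDPOINT = "https://api.example-maas.com/v1/completions"


def build_request(model: str, prompt: str, max_tokens: int = 256) -> dict:
    """Assemble the JSON payload a MaaS provider would typically expect.

    The caller picks a pre-trained model by name; all training and
    hosting concerns stay on the provider's side.
    """
    return {
        "model": model,          # pre-built, pre-trained model, selected by name
        "prompt": prompt,        # the input the hosted model should process
        "max_tokens": max_tokens,
    }


# Build a request body ready to POST to the (hypothetical) endpoint.
payload = build_request("example-llm-small", "Summarize this invoice:")
body = json.dumps(payload)
```

The network call itself is deliberately omitted; the point is that integrating a MaaS model reduces to assembling a request like this and handling the response, which is why providers can onboard developers with little more than documentation and an API key.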
MaaS Goes En Masse
“MaaS has emerged as a groundbreaking paradigm that revolutionizes the deployment and utilization of generative AI models. MaaS represents a paradigm shift in how we use AI technologies and provides a scalable and accessible solution for developers and users to leverage pre-trained AI models without the need for extensive infrastructure or expertise in model training,” notes a 2023 white paper on this subject written in collaboration between three Chinese universities and the University of Illinois Chicago.
Quickly gaining popularity, especially in the realm of generative AI, MaaS is argued to be more efficient, more cost-effective and easier to scale. Where AI models have been vetted and agreed to be free of bias and the risk of hallucination, this approach may also be said to be more robust. In this burgeoning space, MaaS providers offer documentation, tutorials and support, so developers can start integrating AI capabilities into software more quickly and competently.
Among the organizations working at this level, NTT Data has now launched its Tsuzumi large language model (in both Japanese and English) through the Microsoft Azure AI MaaS service. Named after a traditional Japanese drum, Tsuzumi is capable of adjusting model size without compromising performance. This operational adaptability is achieved by the model using efficient tuning processes and industry adapters for customized knowledge learning. The company insists that this allows the technology to be highly relevant and versatile, quickly adjusting to specific use-case requirements at lower service provisioning costs – and, it’s now available on MaaS (en masse, if you will), through Microsoft Azure.
“Supporting the launch of Tsuzumi on Microsoft Azure AI exemplifies our dedication to empowering organizations globally to harness the power of generative AI through models optimized for performance and price and backed by the world’s most trusted cloud,” said Eric Boyd, corporate vice president for Azure AI platform at Microsoft. He notes in line with NTT Data that this development marks a fresh milestone in a 25-year collaboration committed to technological solutions that drive sustainability and innovation.
Production-ized Product Efficiency
The MaaS productization trend is being played out across the technology industry. Earlier this year we heard from data and AI solutions company SAS as the organization unveiled lightweight, industry-specific AI models for individual licenses. The company says it is equipping organizations with readily deployable AI technology to ‘productionize’ real-world use cases. SAS has specific experience in a range of industries including financial services, healthcare, manufacturing and government.
“An area that is ripe for SAS is productizing models built on SAS’ core assets, talent and IP from its wealth of experience working with customers to solve industry problems,” offered Chandana Gopal, research director for future of intelligence, IDC. She reflects on the suggestion made by SAS itself that the consumption of AI models is primarily focused on large language models for generative AI, but in reality, LLMs are a very small part of the modeling needs of real-world production deployments of AI and decision-making for businesses.
With the new offering, SAS says it is moving beyond LLMs and offering deterministic AI models for industries that span use cases such as fraud detection, supply chain optimization, entity management, document conversion and healthcare payment integrity. These industry-specific AI models are engineered for quick integration to provide operationalized (i.e. usable, workable) trustworthy AI technology.
Flourishing Frameworks
“Models are the perfect complement to our existing solutions and SAS Viya platform offerings and cater to diverse business needs across various audiences, ensuring that innovation reaches every corner of our ecosystem,” said Udo Sglavo, vice president for AI and Analytics, SAS. “By tailoring our approach to understanding specific industry needs, our frameworks empower businesses to flourish in their distinctive environments.”
SAS says it is democratizing AI by offering out-of-the-box, lightweight AI models starting with an AI assistant for warehouse space optimization. Using large language model technologies, these assistants cater to nontechnical users, translating interactions into optimized workflows seamlessly and aiding in faster planning decisions.
Will the productization of model-based AI start to enable the technology industry to embed new strains of smart intelligence deeper into our applications, so that we don’t have to hear about artificial intelligence innovation X, Y & Z every single week as we do now? Don’t get your hopes up; there’s plenty of hype left in this cycle.
The wider trend may be reflective of AI becoming a more embedded utility in all applications, which some of the braver spokespersons in the IT space think could happen by the end of this decade. Until then, AI continues to go en masse on MaaS.