Parsing the A16Z report on how enterprises are building & buying generative AI & its implications for partners
Image source: Amazon Titan Image Generator G1 on Amazon Bedrock

Andreessen Horowitz (A16Z), the famed Silicon Valley VC firm, recently published the report '16 Changes to the Way Enterprises Are Building and Buying Generative AI', based on a survey of enterprise leaders, on how enterprises are adopting generative AI. The report highlights 16 top-of-mind considerations across resourcing, models, and use cases that shape how enterprise leaders are using, buying, and budgeting for generative AI.

In this article, I will share my observations from the report and what they mean for consulting and technology partners.

At the outset, let me note a couple of things: 1/ the survey sample size of 70 is small, and 2/ the report focuses more on software startups than on services and hardware providers (rightfully so, as it is authored by a VC firm). That said, the report provides many meaningful insights for customers looking to invest in generative AI, for technology providers, and for the enabling ecosystem of partners.

I am sharing my observations for each of the 16 considerations in the report:

  1. The report claims that budgets for generative AI are skyrocketing, with an estimated 2x to 5x (average 2.5x) increase in spend per enterprise. Directionally this makes sense, as other independent publications also project exponential growth in generative AI spend. The IDC FutureScape report from October 2023 forecasts enterprise spend on generative AI to grow from $16B in 2023 to $143B in 2027, almost a tenfold increase in just four years. Generative AI is a disruptive technology of our time, with a sizable addressable market and potential impact: according to McKinsey, the incremental economic impact of generative AI on the global economy could be $2.6-4.4T, and according to Goldman Sachs Research, generative AI could drive a 7% (almost $7T) increase in global GDP.

  2. This is a very software-centric view. "On a much smaller scale, we’ve also started to see some leaders deploying their genAI budget against headcount savings, particularly in customer service. We see this as a harbinger of significantly higher future spend on genAI if the trend continues. One company cited saving ~$6 for each call served by their LLM-powered customer service—for a total of ~90% cost savings—as a reason to increase their investment in genAI eightfold." This magnitude of investment increase may be an outlier rather than the norm. It does, however, point to one of the top early use cases of generative AI (in terms of adoption and maturity) and the large-scale savings customers can realize in customer service.

  3. Measuring ROI for generative AI investments is rapidly coming to the fore as a key consideration for customers, in addition to security, privacy, control, accuracy, and explainability. Customers are increasingly demanding detailed insights into the returns and into how much generative AI is actually going to cost them.

  4. Customers are looking to technology providers and consulting partners to provide the technical expertise to build generative AI-powered applications, as they do not have this expertise in-house and training internal technical teams takes time. This points to a massive opportunity for consulting partners to help customers, and for technology partners who can provide the tooling (I am thinking no-code/low-code tools or orchestration tools) that makes it easier to build generative AI applications in-house.

  5. Excellent exhibits and insights here. No single model fits all use cases, so multi-model is the way forward, a fundamental approach that AWS pioneered with the release of Amazon Bedrock and that other providers are now following. Customers are preparing for a scenario where they can easily repoint their applications to a different foundation model if their use case or business conditions demand it (a sketch after this list illustrates one way to keep an application model-agnostic). There is also an opportunity for partners to help customers make these model choices and to facilitate migrations from one model provider to another. Combined with some of the other insights in this report, such as increasing open-source model adoption, customers emphasizing security and privacy, and customers avoiding dependency on a single model provider, migration will soon start to become a secular trend.

  6. The model landscape will remain competitive, whether it is 1/ proprietary vs. open-source models, 2/ large vs. small models, or 3/ the choice among prompt engineering, fine-tuning, and building a foundation model from scratch. Customers will be best served by making an informed decision on their long-term approach and leading with data as a differentiator.

  7. Control and customizability will remain key drivers for adopting open-source models over proprietary models. While cost is not a key driver right now, once adoption matures it will also become a key consideration for enterprise customers. There are already startups receiving investments that are focused on cost reduction for generative AI.

  8. Customers in regulated industries and with sensitive use cases continue to demand top-notch control over privacy and security, to ensure that their private data is not exposed and that they do not unintentionally land in legal and compliance trouble.

  9. While creating custom models remains a niche, much of the early adoption has been RAG-based rather than fine-tuning-driven (a minimal sketch of the retrieval step appears after this list). This may change with innovation by model providers (e.g., larger context windows, mixture-of-experts models, Mamba-based models), the cost of pre-training coming down, and cost-efficient fine-tuning techniques.

  10. Cloud providers play a key role in helping customers decide on model selection, as seen in 'Azure customers preferring OpenAI while AWS customers preferring Anthropic or Cohere', as called out by the report. This is likely driven by customers' cloud vendor choices and by the fact that data has gravity: customers prefer to bring models to their data rather than take their data to the models. (This may change as data-rich customers look at SLMs or at pre-training their own models, in which case the key consideration is infrastructure, which is again heavily influenced by cloud providers, rather than models.)

  11. Early-to-market features do not necessarily influence customers' key decision criteria, because the pace of innovation in this space is unparalleled. Model providers are constantly coming up with better features such as longer context windows, guardrails, model evaluation, and agent functionality.

  12. Model performance convergence is real, and any leader's advantage may not be sustainable given the pace of innovation. Customers still have to make an informed decision on model selection, whether through standard or custom evaluations and internal, external, or project-specific benchmarks (a simple evaluation harness is sketched after this list). This is where partners (both consulting and tech partners) can play a major role.

  13. Given the breadth of choice and, again, the rapid pace of innovation, customers prefer services that offer them choice, flexibility, and ease of use and building.

  14. Building custom apps remains a key theme, and gaps remain in generative AI-native, industry- or use case-specific applications and software in the market. However, expect more robust industry-, domain-, and use case-specific generative AI-powered software offerings.

  15. Internal use cases are where customers are starting their generative AI adoption, before moving on to external customer/consumer-facing use cases. Customer service, knowledge search and management, and software development remain the top internally focused use cases.

  16. A $5B run-rate revenue for the model API market by the end of 2024 points to explosive growth in this space. This will lead to growth across the stack for other software providers with complementary offerings such as security, privacy, LLMOps, vector databases, and other categories, and also for services partners, who will play a major role in creating, customizing, and deploying these applications for customers who may not have the expertise in-house.
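
Relating to point 5 above, here is a minimal sketch of one way to keep an application model-agnostic so it can be repointed to a different foundation model. All class and function names below are hypothetical placeholders rather than any vendor's actual SDK; in practice each adapter would wrap the provider API of your choice.

```python
# Minimal sketch: isolate model access behind one interface so the application
# can switch foundation models without rewriting business logic.
from abc import ABC, abstractmethod


class TextModel(ABC):
    """Provider-agnostic interface the application codes against."""

    @abstractmethod
    def generate(self, prompt: str, max_tokens: int = 512) -> str: ...


class ProviderAAdapter(TextModel):
    def generate(self, prompt: str, max_tokens: int = 512) -> str:
        # Hypothetical placeholder: call provider A's API here and return the completion.
        raise NotImplementedError


class ProviderBAdapter(TextModel):
    def generate(self, prompt: str, max_tokens: int = 512) -> str:
        # Hypothetical placeholder: call provider B's API here and return the completion.
        raise NotImplementedError


def summarize_ticket(model: TextModel, ticket_text: str) -> str:
    # Application logic depends only on the interface, so repointing to a
    # different model becomes a configuration change rather than a rewrite.
    return model.generate(f"Summarize this support ticket:\n{ticket_text}")
```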
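
Relating to point 9 above, here is a minimal sketch of the retrieval step in a RAG pipeline, assuming you already have an embedding model and pre-computed document vectors; the embed function is a hypothetical placeholder for whatever embedding API you use, and the brute-force similarity search stands in for a vector database.

```python
# Minimal sketch: retrieve the most relevant documents for a query and inject
# them into the prompt, instead of fine-tuning the model on that content.
import numpy as np


def embed(text: str) -> np.ndarray:
    # Hypothetical placeholder: call your embedding model of choice here.
    raise NotImplementedError


def top_k_documents(query: str, docs: list[str], doc_vectors: np.ndarray, k: int = 3) -> list[str]:
    """Return the k documents whose embeddings are most similar to the query."""
    q = embed(query)
    # Cosine similarity between the query vector and every document vector.
    sims = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q) + 1e-9)
    return [docs[i] for i in np.argsort(sims)[::-1][:k]]


def build_prompt(query: str, context_docs: list[str]) -> str:
    # Retrieved passages become grounding context for the foundation model.
    context = "\n\n".join(context_docs)
    return f"Answer using only the context below.\n\nContext:\n{context}\n\nQuestion: {query}"
```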
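
Relating to point 12 above, here is a minimal sketch of a project-specific evaluation harness: run each candidate model over a small golden set and score the answers. The call_model callable is a hypothetical placeholder for whichever model SDK you use, and the substring-match scoring is only an illustration; real evaluations usually need richer metrics or human review.

```python
# Minimal sketch: compare candidate models on the same task-specific golden set.
from typing import Callable

golden_set = [
    {"prompt": "Classify the sentiment: 'The release fixed our issue.'", "expected": "positive"},
    {"prompt": "Classify the sentiment: 'Support never responded.'", "expected": "negative"},
]


def evaluate(call_model: Callable[[str, str], str], model_ids: list[str]) -> dict[str, float]:
    """Return the fraction of golden-set examples each model answers correctly."""
    scores = {}
    for model_id in model_ids:
        correct = 0
        for example in golden_set:
            answer = call_model(model_id, example["prompt"])
            # Naive scoring: check whether the expected label appears in the answer.
            correct += int(example["expected"] in answer.lower())
        scores[model_id] = correct / len(golden_set)
    return scores
```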

In summary:

  • Massive opportunity for both consulting and technology partners in 2024 and the coming years to help customers build & productionize custom generative AI applications

  • Opportunity for software providers to fill critical application gaps and develop maturity (function-, industry-, or domain-specific use cases)

  • For partners, investing in partnerships with cloud providers is going to be a critical success factor, for both consulting services and technology providers.

  • Opportunity for both technology partners (to simplify generative AI application development and fill use case gaps) and consulting partners (to provide technical and industry expertise) to drive adoption of generative AI with customers.

PS. The views expressed in this article are my own and not those of my employer.
