Generative AI in the Enterprise Sector

January 01, 2024

 

How AI Technology Suppliers Can Meet the Hype

 

Generative Pretrained Transformer (GPT) applications like ChatGPT have ignited massive interest in Generative Artificial Intelligence (Gen AI) since ChatGPT's release in late 2022. So far, the opportunities for Gen AI have been constrained to the Business-to-Consumer (B2C) space, with the Business-to-Business (B2B) opportunity largely undefined.

In the long run, Gen AI will reshape the enterprise, with ABI Research forecasting that it will contribute roughly $450 billion in value across various verticals by 2030. Several enterprise challenges are holding back adoption right now, but the value of Gen AI is too great to pass up, from improved employee productivity and enhanced operational efficiency to service augmentation and even widespread automation.

 

Chart 1: Generative AI Vertical Value Creation

World Markets: 2023 to 2030

(Source: ABI Research)

[Chart image: enterprise generative AI value creation forecast across industries]

We’re still at a very early stage of enterprise generative AI, with the first deployments reserved for low-hanging use cases where the stakes aren’t so high. But broader adoption is currently unlikely, as risk-reward alarms are ringing among C-suite executives over data security, Intellectual Property (IP) protection, copyright infringement, and the possibility of fragmentation, all underpinned by a significant skills and knowledge gap. Case in point: a recent SAS report found that 76% of enterprises have data privacy concerns with Gen AI.

Moreover, today’s generative AI models are too big and general-purpose, lacking the performance, task efficiency, cost-effectiveness, and security necessary to meet enterprise investment criteria.

But the supply side of the market remains active as vendors look to build out a successful B2B proposition. Many juggernaut tech firms, as well as startups, are developing generative AI models, applications, and services. However, as they explore this emerging opportunity, they must grapple with a steep learning curve while navigating a new commercial domain. Most significantly, a cost crisis is emerging: building, training, and running generative AI models carries massive overheads. Overcoming this crisis cannot rely on the cash-light B2C market or existing “freemium” revenue models, so suppliers must be proactive and start testing new monetization and product strategies.

Generative AI Use Cases for Enterprises  

While enterprise generative AI implementations today are limited to low-risk use cases, this won’t always be the case. Already, the list of enterprise use cases is considerable, and it continues to expand. Five of the top verticals contributing to Gen AI investment are retail & e-commerce, marketing/advertising/creative, financial services, energy/utilities/mining, and law. This section samples some of those use cases, along with some early examples. For a more detailed list, please see the ABI Research report Generative AI Business Outcomes: Identifying Enterprise Commercial Opportunities.

 

Table 1: Enterprise Vertical Use Cases and Early Market Activity 

(Source: ABI Research) 

Retail & E-Commerce

Potential Use Cases:

  • Develop visual search tools to support customer accessibility. 
  • Build customer chatbots and intelligent assistants. 
  • Personalization of product recommendations. 
  • Product content generation (e.g., descriptions, summaries, product images). 
  • Personalized marketing. 
  • Fraud identification (e.g., product listings, comments). 
  • Product design. 
  • Automated website development. 
  • Generation of copy for websites. 
  • Trend analysis and data synthesis for merchants and e-commerce stores. 

Early Market Activity:

  • Amazon is rolling out a review summary feature. 
  • eBay is implementing an automated listing description feature using image-to-text generation. 
  • Carrefour has launched Hopla, an online chatbot, to support description generation and purchasing. 
  • Ask Instacart is an AI-powered search tool offering personalized recommendations. 
  • Shopify offers a tool that supports merchants by generating product descriptions. 
  • 7-Eleven has implemented DeepBrain AI’s “AI human” and “AI kiosk” to aid customer engagement. 

Marketing, Advertising & Creative

Potential Use Cases:

  • Create digital artwork. 
  • Develop new marketing content. 
  • Integrate design tools into creative software. 
  • Write scripts or storytelling for videos/adverts. 
  • Stock image generation. 
  • Transform Two-Dimensional (2D) images and digital content into Three-Dimensional (3D). 
  • Post-marketing surveillance and data analysis. 
  • Develop short-form content from long-form media. 
  • Localization and personalization of marketing content. 
  • Eliminate cookies through generative AI and first-party data. 
  • Augment copywriting and social media posting. 
  • React to end users and personalize advertisements based on interests, emotions, and experiences. 

Early Market Activity:

  • Sports Illustrated, Anthropologie, and Transcend have integrated with Jasper AI to support copywriting, trend identification, and topic development. 
  • Adobe has implemented a range of generative AI image and video editing tools through Firefly. 
  • Coca-Cola is building new marketing tools using OpenAI. 
  • WPP is partnering with NVIDIA to develop a generative content engine for digital advertising, and is looking to integrate additional 3D content to personalize advertising. 
  • BBDO uses Stable Diffusion to augment content generation. 
  • Meta’s Advantage+ suite provides an AI sandbox for testing early versions of new generative AI tools (e.g., ad generator tools). 
  • Google is beginning to use Bard and generative AI tools to automate and develop advertising content. 
  • Code and Theory has struck up a partnership with Oracle to build ad tools. 
  • Snap has announced AI-generated sponsored links that can personalize ads based on a user’s conversations with chatbots. 


Financial Services

Potential Use Cases:

  • Stock market trend analysis. 
  • Internal database management. 
  • Bank account fraud protection through anomaly detection and threat intelligence. 
  • Drafting client contracts. 
  • Customer-facing sales customization and query chatbots (e.g., insurance customer chatbots). 
  • Automated credit decisions. 
  • Risk management. 
  • IT automation. 
  • Financial literature analysis and summarization. 
  • AI-supported security. 

Early Market Activity:

  • Bloomberg has built a generative Large Language Model (LLM) trained on financial data to perform Natural Language Processing (NLP) tasks. 
  • Deutsche Bank is embedding NVIDIA AI into fraud detection, intelligent avatar assistants, and speech AI. 
  • EY has modernized internal employee payroll through ChatGPT integration. 
  • Zurich uses ChatGPT for claims analysis and data mining. 


Energy, Utilities, and Mining

Potential Use Cases:

  • Generate new routing and scheduling strategies to lower energy transportation costs. 
  • Automate pricing based on market trends and historical data. 
  • Identify potential hazards within the mining process. 
  • Demand forecasting and energy output forecasting. 
  • Grid management and optimization. 
  • Customization of customer offerings. 
  • Energy storage optimization. 

Early Market Activity:

  • Gridmatic uses AI to predict weather to inform energy supply/demand expectations. 
  • Lightsource bp has launched an AI-generated home energy assistant. 
  • Octopus Energy has implemented generative AI in its customer service operations, with 44% of emails being answered by the service. 
  • BHP and Microsoft are building generative AI tools to implement operational changes that improve ore recovery. 
  • Shell and SparkCognition have partnered to support subsurface imaging with generative AI. 

Law

Potential Use Cases:

  • Automate contract drafting, analysis, and summarization. 
  • Augment research processes. 
  • Organize and summarize documents. 
  • Provide intelligent legal chatbot assistants. 
  • Assess enterprise risk against pre-determined criteria (e.g., analyze internal documents and assess known risks). 

Early Market Activity:

  • Allen & Overy has implemented Harvey AI for the creation of and access to legal content. 
  • PwC has implemented Harvey AI to support due diligence, regulatory compliance, and contract analysis. 
  • Clifford Chance (and many others) is working with Robin AI to speed up the drafting and reviewing of contracts. 

 

Three Waves of Generative AI Adoption  

The road to more ubiquitous enterprise generative AI will be gradual, with ABI Research expecting adoption to come in three waves as the technology matures. These waves are provided in detail below. 

1.) Employee Augmentation: In this initial adoption phase, enterprises will use generative AI for low-hanging use cases to boost productivity, such as content generation, creative assets, data analysis tools, chatbots, and research support. The entertainment, media, marketing, and education verticals stand to gain the most value from this first wave of adoption.

2.) Service Enablement: In the second wave of adoption, mission-critical services will begin to be implemented with generative AI in the healthcare, legal, financial services, and telco spaces. This is when enterprises will start using Gen AI to support employees in more complex ways, with risk management, product lifecycle management, and client contract drafting being prime examples. 

3.) Process Automation: The third and final wave of generative AI adoption will see tremendous value creation for critical sectors like energy, manufacturing, and transport & logistics. Generative AI can help facilitate progress for process automation, predictive maintenance, and system optimization. The potential cost savings across supply chains, logistics, manufacturing processes, etc., are robust and significant. 

Although some verticals, such as manufacturing, are not expected to realize the full value of generative AI until the third wave, these enterprises should, once they have defined a clear corporate strategy, look to get their hands on the technology now for low-risk applications (e.g., content generation with human oversight). This will help them get accustomed to the ins and outs of generative AI and build organizational readiness for the future of AI in terms of technological expertise, frameworks, governance, and regulation.

 

What’s Holding Generative AI Back in the Enterprise Right Now?

Unlike the B2C space, enterprise deployment of AI brings significant risks that need to be weighed against potential rewards prior to investment. This risk has led to notable “bans” on third-party, “black box” generative AI services like ChatGPT. While generative AI enterprise use cases continue to emerge, delivering business value cannot rely on large, generalized models, as they are slow, insecure, expensive, not adapted for the tasks they serve, and subject to dangerous hallucinations. Instead, smaller, contextualized models fine-tuned on specific datasets will offer a much greater Return on Investment (ROI). For example, an investment firm on Wall Street may want to develop an AI-based tool to analyze the stock market and inform users of key trends, or a utility provider may want a generative AI model that can predict future energy demand.
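To make the "smaller, contextualized model" idea concrete, the sketch below trains a compact forecaster on synthetic hourly demand data of the kind a utility might hold internally. The data, features, and model choice are illustrative assumptions only (a classical regressor stands in for a generative model), not a reference implementation.

```python
# Minimal sketch: a small, domain-specific model for hourly energy demand.
# All data here is synthetic; a real deployment would use the utility's own history.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
hours = np.arange(24 * 365)                                  # one year of hourly points
temp = 15 + 10 * np.sin(2 * np.pi * hours / (24 * 365)) + rng.normal(0, 2, hours.size)
hour_of_day = hours % 24
demand = (30 + 0.8 * np.abs(temp - 18)                       # heating/cooling load
          + 5 * np.sin(2 * np.pi * hour_of_day / 24)         # daily cycle
          + rng.normal(0, 1.5, hours.size))                  # noise

X = np.column_stack([temp, hour_of_day])
X_train, X_test, y_train, y_test = train_test_split(X, demand, test_size=0.2, shuffle=False)

model = GradientBoostingRegressor(n_estimators=200, max_depth=3)
model.fit(X_train, y_train)
print("Mean absolute error:", round(mean_absolute_error(y_test, model.predict(X_test)), 2))
```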

But a lack of smaller, contextualized models is not the only challenge for enterprise generative AI adoption. Other significant barriers exist, as outlined in the diagram below:

 

Figure 1: Generative AI Challenges

(Source: ABI Research)

[Figure image: the most prevalent challenges to implementing Gen AI in the enterprise]

 

Generative AI adoption will always come with enterprise risk, but addressing these challenges will help mitigate it and accelerate the B2B market.

 

Why Smaller, “Fine-Tuned” Models Are the Future of Generative AI

While studies confirm that training generative AI models on enormous datasets speeds up training convergence and improves accuracy, such models are not ideal for executing specific business functions. The true inflection point for the adoption of generative AI in the enterprise sector will come when smaller, fine-tuned generative AI models are built for specific applications or use cases. Smaller models are less resource intensive, with lower training and inference costs; provide greater transparency and explainability through retrieval-based inferencing; and offer faster time to enterprise deployment. Moving from giant, generalized models to smaller, fine-tuned ones will help alleviate the data privacy, performance, and trustworthiness concerns that enterprises currently have. By deploying fine-tuned models, enterprises gain the following advantages:

  • Lower Cost: Training and inferencing costs are massively lowered, and fine-tuned iterations are much less resource intensive. 
  • Explainability and Trustworthiness: Fine-tuned models do not rely solely on memorized knowledge; instead, they can use retrieval-based inference, referencing sources to back up output. Direct access to original data can limit hallucinations and data approximations (a minimal retrieval sketch follows this list).
  • IP Ownership: Unlike public AI models, fine-tuned models only leverage internal data that the enterprise can be sure it has the rights to.
  • Performance Optimization and Contextualization: When generative AI solutions are tailored for specific business outcomes, the number of parameters required decreases and performance improves. As an example, a general-purpose model like GPT-3 involves around 175 billion parameters, while a contextualized model may need only about 1 billion parameters or fewer. These fine-tuned models can also be catered to specific hardware, which decreases cost and improves utilization. Furthermore, because the models are more tailored and use well-curated data, the risk of hallucinations is far lower. 
  • Better Economies of Scale: Open-source models supported by ML service tools, on which fine-tuned generative AI applications and use cases are built, are cheaper than API-based services and better optimized for owned hardware. Scaling usage of an API such as GPT-3, for example, will not satisfy an enterprise’s economies-of-scale ambitions. Deploying open-source generative AI on owned servers, on the other hand, translates into cost savings, as the enterprise does not need to pay per token. 
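As a concrete illustration of the retrieval-based inference mentioned in the list above, the sketch below grounds a prompt in an enterprise's own documents and records which sources were used. The toy documents, the TF-IDF retriever (scikit-learn), and the prompt format are simplifying assumptions; production systems typically use dense embeddings and a vector database.

```python
# Minimal retrieval-grounded prompting sketch (toy TF-IDF retriever, invented documents).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = {
    "policy_2023.txt": "Claims above 50,000 EUR require sign-off from two senior underwriters.",
    "handbook.txt": "Employees accrue 2.5 vacation days per month of service.",
    "contract_q3.txt": "The Q3 supplier contract renews automatically unless cancelled with 30 days' notice.",
}

vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents.values())

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k most similar document names for a query."""
    scores = cosine_similarity(vectorizer.transform([query]), doc_matrix)[0]
    ranked = sorted(zip(documents, scores), key=lambda pair: pair[1], reverse=True)
    return [name for name, _ in ranked[:k]]

query = "What approvals are required for claims above 50,000 EUR?"
sources = retrieve(query)
context = "\n".join(f"[{name}] {documents[name]}" for name in sources)

# The grounded prompt is what would be sent to a (small, fine-tuned) model;
# the cited source names give reviewers a trail back to the original data.
prompt = f"Answer using only the sources below and cite them.\n{context}\n\nQuestion: {query}"
print(prompt)
```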

Building generative AI applications based on fine-tuned models has been a significant challenge for enterprises, with internal skillsets and the risk of creating siloed business units at the forefront of worries. However, advancements in open-source generative AI and ML service tools are making fine-tuned models a more realistic opportunity for enterprises.
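For a sense of what such fine-tuning looks like in practice, below is a minimal parameter-efficient fine-tuning sketch using the Hugging Face transformers and peft libraries. The base model (distilgpt2), the toy "internal documents," and the hyperparameters are placeholders chosen for brevity, not recommendations.

```python
# Minimal LoRA fine-tuning sketch: adapt a small open-source model on toy internal text.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

base = "distilgpt2"                       # small placeholder base model
tok = AutoTokenizer.from_pretrained(base)
tok.pad_token = tok.eos_token
model = AutoModelForCausalLM.from_pretrained(base)

# Wrap the frozen base model with low-rank adapters so only a small fraction of weights train.
lora = LoraConfig(task_type=TaskType.CAUSAL_LM, r=8, lora_alpha=16, lora_dropout=0.05,
                  target_modules=["c_attn"], fan_in_fan_out=True)
model = get_peft_model(model, lora)
model.print_trainable_parameters()

# Toy "internal documents" standing in for curated enterprise data.
docs = ["Q3 grid demand peaked at 41 GW during the evening ramp.",
        "Contract 8812 renews automatically unless notice is given."]
batch = tok(docs, return_tensors="pt", padding=True)
batch["labels"] = batch["input_ids"].clone()

optim = torch.optim.AdamW((p for p in model.parameters() if p.requires_grad), lr=2e-4)
model.train()
loss = model(**batch).loss                # causal-LM loss on the toy batch
loss.backward()
optim.step()
print(f"one adapter update done, loss={loss.item():.3f}")
```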

The year 2023 kicked off with vendors discussing trillion-plus parameter models, highlighting the industry trend of increasing the size, and thus the accuracy, of generative AI. The problem is that massive generative AI models are expensive, resource intensive, and time-consuming to build and run. The future of generative AI will require a more effective business case, which can only be achieved with tailored, fine-tuned models.
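A rough, back-of-the-envelope comparison shows why the business case pushes toward smaller, self-hosted models. Every price and volume below is a hypothetical placeholder, not a quoted rate:

```python
# Hypothetical cost comparison: pay-per-token API vs. a self-hosted fine-tuned model.
api_price_per_1k_tokens = 0.03      # assumed API list price (USD)
tokens_per_request = 1_500          # prompt + completion, assumed
requests_per_month = 2_000_000      # assumed enterprise workload

api_monthly = requests_per_month * tokens_per_request / 1_000 * api_price_per_1k_tokens

gpu_server_monthly = 12_000         # assumed amortized hardware + power + operations
self_host_monthly = gpu_server_monthly   # marginal cost per token is near zero once deployed

print(f"API:         ${api_monthly:,.0f}/month")
print(f"Self-hosted: ${self_host_monthly:,.0f}/month")
breakeven = gpu_server_monthly / (tokens_per_request / 1_000 * api_price_per_1k_tokens)
print(f"Self-hosting breaks even above ~{breakeven:,.0f} requests/month")
```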

 

Open-Source or Closed-Source Models? Or Both?

Open-source generative AI models are rapidly advancing and are destined to be the future of generative AI; however, closed-source models will remain practical for many enterprises, at least in the short term. Given the significant pros and cons of both, enterprises and suppliers will refrain from committing solely to one or the other. Ultimately, ABI Research anticipates that a meshed, “hybrid” approach will induce the greatest industry value for generative AI, enabling implementers to leverage federated learning in an economical way while ensuring data remain secure within an enterprise’s walled garden.
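One way to picture the hybrid approach is a thin routing layer that keeps sensitive prompts on an in-house open-source model and sends everything else to a hosted closed-source API. The sensitivity rule and both backends below are hypothetical stubs used only to illustrate the pattern:

```python
# Hypothetical hybrid routing sketch: sensitive prompts stay on-premises,
# everything else goes to a hosted closed-source API. Both backends are stubs.
import re

SENSITIVE_PATTERNS = [r"\bIBAN\b", r"\bSSN\b", r"\bconfidential\b", r"contract \d+"]

def is_sensitive(prompt: str) -> bool:
    return any(re.search(p, prompt, re.IGNORECASE) for p in SENSITIVE_PATTERNS)

def local_open_source_model(prompt: str) -> str:
    # Stand-in for an on-prem, fine-tuned open-source model behind the firewall.
    return f"[on-prem model] answer to: {prompt}"

def hosted_closed_api(prompt: str) -> str:
    # Stand-in for a closed-source API call; no real endpoint is assumed here.
    return f"[hosted API] answer to: {prompt}"

def route(prompt: str) -> str:
    backend = local_open_source_model if is_sensitive(prompt) else hosted_closed_api
    return backend(prompt)

print(route("Summarize confidential contract 8812 for legal review."))
print(route("Draft a cheerful product announcement for our new app."))
```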

 

Table 2: Evaluation of Open- and Closed-Source Models for Enterprises

(Source: ABI Research)

Open-Source Models

Opportunities:

  • Enterprises can customize/fine-tune using their own data.
  • External innovation can support improved performance.
  • Low vendor lock-in.
  • More scalable for enterprise use cases.
  • The ecosystem is rapidly expanding.
  • Emerging open-source security frameworks can be embedded alongside models/applications.

Challenges:

  • Requires in-house development expertise or third-party support, which can be prohibitive for early-stage or Small and Medium Enterprises (SMEs).
  • Not suitable for mission-critical or sensitive use cases, as security vulnerabilities are publicly known.
  • Often requires on-premises servers or infrastructure to run models.
  • Heavily reliant on fine-tuning for optimized performance.

Closed-Source Models

Opportunities:

  • Market-leading performance.
  • Security frameworks embedded.
  • Ease of access without any internal skills needed.
  • APIs can be consumed in a flexible model, ensuring enterprises of all sizes can access them.

Challenges:

  • Risk of vendor lock-in for users.
  • High API costs and egress fees.
  • Lacks explainability, transparency, and observability.

 

How Can Suppliers Seize the Generative AI Market Opportunities?

The generative AI supply chain is rapidly evolving: new foundation models are being released, an extensive list of applications/plug-ins is being deployed weekly, and new vendors are entering this potentially lucrative space through professional services or partnerships. In the following sections, we summarize the activity unfolding across the generative AI supply chain, the market opportunities waiting for each type of supplier, and some “out-of-the-box” monetization strategies.

 

Table 3: Market Opportunities for Gen AI Suppliers

(Source: ABI Research)

  • Research & Development: Guardrail development can create a competitive advantage in a market concerned about data privacy and environmental impact.
  • Hardware Providers: Growing demand for accelerators, developing a strong value proposition at the edge, and building full-stack services leveraging hardware innovation.
  • Foundation Model Providers: “Responsible AI,” cost offsetting to application developers, and tapping into huge customer bases can establish a market-leading position.
  • Data Services: Increased demand for data privacy (data synthesis, curation, and monitoring) and integrating data services into emerging Gen AI platforms.
  • ML Service Tools: Embracing and monetizing open-source models, profiting from renewed enterprise engagement with AI, and developing no-code tools.
  • Application Developers: Leveraging open-source models to build full-stack applications and seizing the largely untapped space for fine-tuned, contextualized applications.
  • Enterprise Services: MSPs can assist smaller enterprises with their Gen AI efforts across Day 0, 1, and 2 operations, and build highly secure and transparent implementations.

 

Research and Development

Hardware vendors, cloud service providers, system integrators, and consultants spend billions on Research and Development (R&D) each year, making it the catalyst for generative AI evolution. Whether it’s building/training a new AI model or improving hardware utilization, R&D has an enormous impact across the generative AI realm. Generative AI R&D has so far focused on increasing parameters, performance, and so on, but ABI Research believes that safety and security will be the main focus going forward, given tighter regulation and standardization efforts.

 

Hardware Providers

The hardware market is dominated by a few vendors with huge budgets, making it the toughest generative AI layer to enter for newcomers. NVIDIA has established itself as the market leader for Graphics Processing Units (GPUs), which are used for AI training and inferencing. NVIDIA has also developed generative AI and ML service tools to augment enterprise adoption of generative AI. Intel, AMD, and Qualcomm are also heavy hitters in the hardware space. Qualcomm has proven its ability to run Stable Diffusion inference (a 1+ billion parameter text-to-image generative AI application from Stability AI) on-device. Meanwhile, Intel and service provider BCG have collaborated to help offer services directly to the enterprise.

 

Foundation Model Providers

The foundation model market is another tough one to penetrate, requiring high hardware spending, strong AI training expertise, and access to data. Building and training a foundation model is a significantly expensive endeavor; as a result, hyperscalers like Microsoft, Meta, Google, and Baidu have overshadowed other firms. A few startups are also active, such as OpenAI (which receives significant financial backing from Microsoft), Anthropic, Cohere, and AI21. Amazon Bedrock, which will monetize Titan and support enterprise adoption of generative AI-based applications, exemplifies how foundation model providers are now beginning to capitalize on the commercial opportunities of generative AI.

Until now, closed-source models have dominated this space, but that won’t always last. The evolution of open-source models, copyright challenges, data constraints, ethical issues, and enormous costs will inevitably cause friction. Foundation model providers must demonstrate their commitment to protecting customer data.

 

Data Services

While initial generative AI models have relied on public information (e.g., GPT-3 was trained largely on text scraped from the Internet), data privacy and copyright challenges will make this approach to training more difficult. This will benefit companies offering data services, as enterprises will need synthetic data for training once regulation and cost barriers reduce the quantity of available data. Moreover, enterprises implementing generative AI will find it incredibly valuable to have internal data curated and labeled to fine-tune their models. Showcasing the perceived value in this area, several startups have received investment recently, with Accenture’s recent stake in curation and labeling provider Stardog being a good example.
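As a toy illustration of the data-synthesis idea, the sketch below fits simple per-column statistics on a handful of invented records and samples synthetic stand-ins that preserve those statistics. Commercial data services use far richer generative techniques; this is only meant to show the shape of the workflow:

```python
# Toy synthetic-data sketch: sample new rows that mimic per-column statistics.
# Real data services use far more sophisticated generative models than this.
import numpy as np

rng = np.random.default_rng(42)

# Invented "real" records (e.g., anonymized transaction amounts and segments).
amounts = np.array([120.5, 89.9, 240.0, 310.2, 99.0, 175.5])
segments = np.array(["retail", "retail", "sme", "corporate", "retail", "sme"])

# Fit simple marginal statistics.
mu, sigma = amounts.mean(), amounts.std()
labels, counts = np.unique(segments, return_counts=True)
probs = counts / counts.sum()

# Sample synthetic rows that preserve those marginals.
n = 5
synthetic_amounts = rng.normal(mu, sigma, n).round(2)
synthetic_segments = rng.choice(labels, size=n, p=probs)

for amount, segment in zip(synthetic_amounts, synthetic_segments):
    print(f"amount={amount:>8}  segment={segment}")
```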

 

ML Service Tools

While ML platforms have so far been more established in other AI domains like computer vision, they are increasingly being used for Natural Language Processing (NLP) services. Over time, these ML-based tools and services will be integral to enterprise generative AI deployment because they enable fine-tuning and application development. Some of the tools that ease the development, deployment, operation, and management of generative AI models for enterprises are outlined below:

  • Optimization tools bolster the overall performance and efficiency of generative AI models/applications, allowing generative AI to run faster and more accurately (see the quantization sketch after this list).
  • Integration services ensure that generative AI applications interoperate with existing enterprise processes and workflows.
  • Cloud platforms act as a central hub where developers and enterprises can deploy, monitor, and manage generative AI deployments across cloud infrastructure.
  • AI security services are a big deal considering the scrutiny surrounding generative AI legal and ethical concerns. Enterprise adoption will not proliferate until stakeholders are confident that generative AI solutions won’t impact data privacy, security, or IP.
  • Low/no-code platforms, which offer graphical interfaces, drag-and-drop functionality, pre-built application frameworks, generative AI model APIs, etc., enable developers to build generative AI software applications and workflows quickly and easily.
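To ground the optimization-tools point referenced above, here is a minimal sketch that applies PyTorch's dynamic quantization to a toy stand-in model, shrinking its linear layers to 8-bit integers to cut memory and speed up CPU inference. The model is deliberately trivial and not a real generative network:

```python
# Minimal optimization sketch: dynamic int8 quantization of a toy model's linear layers.
import torch
import torch.nn as nn

# Toy stand-in for a much larger generative model.
model = nn.Sequential(nn.Linear(512, 512), nn.ReLU(), nn.Linear(512, 128))

quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

x = torch.randn(1, 512)
with torch.no_grad():
    out_fp32, out_int8 = model(x), quantized(x)

def size_mb(m: nn.Module) -> float:
    """Approximate parameter memory of a float model in megabytes."""
    return sum(p.numel() * p.element_size() for p in m.parameters()) / 1e6

print(f"fp32 parameter size: {size_mb(model):.2f} MB")
print(f"max output drift after quantization: {(out_fp32 - out_int8).abs().max().item():.4f}")
```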

 

Application Developers

Within the generative AI world, three general application categories have emerged: 1) User Interfaces (UIs) for foundation models (e.g., ChatGPT and Bard); 2) application plug-ins based on public generative AI models (e.g., WriteMage); and 3) full-stack applications built from fine-tuned generative AI models.

The first two categories don’t require extensive AI skills and are well suited to the B2C domain, but less so to the B2B market. ABI Research sees the third option, full-stack applications built with specific business use cases in mind, as the enterprise opportunity. Over the next six months, application developers will increasingly leverage open-source models to build the fine-tuned, full-stack applications that enterprises want. We anticipate that a high proportion of these generative AI deployments will be on-premises, particularly where data sensitivity is a concern.
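A minimal sketch of what such an on-premises, full-stack deployment might look like is shown below: a small open-source model wrapped behind an internal HTTP endpoint using FastAPI and the Hugging Face transformers pipeline. The model name, route, and field names are illustrative choices, and any fine-tuning is assumed to have happened beforehand.

```python
# Minimal on-prem serving sketch: a small open-source model behind an internal API.
# Assumes a model has already been fine-tuned; "distilgpt2" is just a small placeholder.
from fastapi import FastAPI
from pydantic import BaseModel
from transformers import pipeline

generator = pipeline("text-generation", model="distilgpt2")
app = FastAPI(title="internal-genai-service")

class GenerateRequest(BaseModel):
    prompt: str
    max_new_tokens: int = 64

@app.post("/generate")
def generate(req: GenerateRequest) -> dict:
    out = generator(req.prompt, max_new_tokens=req.max_new_tokens, num_return_sequences=1)
    return {"completion": out[0]["generated_text"]}

# Run inside the enterprise network with, e.g.:
#   uvicorn app:app --host 0.0.0.0 --port 8080
```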

 

Enterprise Services

Enterprises often lack the skills, internal processes, and strategic understanding needed to successfully implement generative AI into business operations and processes. This creates a lucrative opportunity for third-party services, such as business consultants, systems integrators, vertical Independent Software Vendors (ISVs), cloud vendors, Managed Service Providers (MSPs), and resellers. This space is mainly fueled by partnerships between business consultants and system integrators, and examples abound. On the business consultant side, notable collaborations include Bain and OpenAI, BCG and Intel, and PwC and Harvey AI. Among system integrators, Google Cloud has partnered with Tata Consultancy Services, Wipro, Cognizant, and Capgemini, while Accenture and Scale AI have teamed up.

As depicted in Chart 2 below, the total revenue opportunity for software within the generative AI supply chain will increase from US$1.2 billion in 2023 to nearly US$57 billion by 2030.
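For context, the compound annual growth rate implied by those two endpoints can be computed directly (a quick check on the figures above, not an additional forecast):

```python
# Implied compound annual growth rate (CAGR) from the 2023 and 2030 revenue figures above.
start, end, years = 1.2, 57.0, 2030 - 2023
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR 2023-2030: {cagr:.1%}")   # roughly 74% per year
```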

 

Chart 2: Software Revenue Opportunity for the Gen AI Supply Chain

World Markets: 2023 vs 2030

(Source: ABI Research)

 


Fueling revenue creation going forward requires suppliers to complement current “freemium,” subscription, and consumption-based models with “out-of-the-box” strategies that have proven successful in adjacent markets. Some of these revenue opportunities are as follows:

  • Open-Source Productization: ISVs, ML tool providers, and enterprise service providers should build products using increasingly competitive open-source models.
  • Advertising: Already proven as a way to monetize search tools, advertising could be integrated into products by ISVs, hyperscalers, and ML tool providers.
  • Center of Excellence (CoE) Support: Startups and ISVs lack the infrastructure and capital to build and scale generative AI applications. Hyperscalers, hardware vendors, integrators, and consultants should look to build accelerators using their tools/infrastructure to support application ecosystem development. This strategy can increase resource and platform usage, while also driving returns through equity.
  • Watermark/Citation Removal: More applicable in image generation, this strategy can create revenue in both the B2B and B2C markets. It also fits with the wider industry trend toward intellectual property protection.

How Should Generative AI Suppliers Support Their Commercial Proposition?

The supplier ecosystem still has a way to go before achieving commercial success. The biggest challenges to overcome include safety/security concerns, huge operational costs, and a lack of clearly defined monetization strategies. In this final section, ABI Research provides several ways for suppliers to carve out a leading commercial position in the market:

  • Be a Leader in “Responsible AI”: Establishing a regulatory framework for generative AI will require effort from both enterprises and vendors: a top-down approach must be complemented by a bottom-up approach led by vendors. Vendors can lead in “responsible AI” by proactively developing and enforcing safeguards and guardrails to ease enterprise anxiety. This encompasses the data used for training, energy usage, model accuracy requirements, and the use of watermarks/citations for AI-generated content (a minimal guardrail sketch follows this list).
  • Develop Easy-to-Use No-Code AI Platforms: For enterprise adoption of generative AI to really take off, vendors must offer solutions that enable enterprises to deploy fine-tuned applications (e.g., Harvey AI and Jasper) in a straightforward manner. Current products like GPT are expensive, susceptible to hallucinations, hardware intensive, and lacking in explainability. To tap into the B2B space, vendors must develop generative AI that provides context and is tailored to specific business outcomes. Meanwhile, open-source models and easy-to-use ML service tools (low/no-code platforms) will enable enterprises to be bolder in their generative AI endeavors, as they lower the barriers to deployment.
  • Harness the Practicality of Commercial Partnerships: Striking strategic partnerships has been a common trend in the generative AI supply chain; examples include NVIDIA and Snowflake, Bain and OpenAI, and Cohere and LivePerson. These vendors realize that going it alone in generative AI will not be commercially advantageous and that it is more efficient to lean on the specialized skills and tools offered by partners.
  • Identify and Deploy a Diverse Range of Monetization Strategies Built around Open Source: Monetizing generative AI is a significant challenge for vendors. Venture Capital (VC) funding won’t pour in forever, so it’s imperative that vendors start creating robust revenue streams for open-source solutions to keep pace with demand for generative AI. Long-term B2B success requires careful consideration of various industry dynamics. From pay-as-you-go to turnkey platforms, vendors can choose from many new revenue models. However, choosing the right one will depend on four main considerations:
    • Customers: Customer demand for elasticity strongly influences what revenue models will work for stakeholders.
    • Core Competencies: Does the vendor have the necessary skills to deploy enterprise services? And have third parties or ISVs been identified to build an application marketplace?
    • Partner Ecosystem: Some revenue models, such as turnkey platforms, revenue share or commission, and transformative consulting, often require partnerships to ensure cross-chain competencies.
    • Competitor Models: Creating value for customers is all about differentiating your products from the competition. Therefore, vendors must always be surveying other generative AI solution providers to identify gaps in existing offers.
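As a small illustration of the vendor-led guardrails mentioned in the first bullet above, the sketch below redacts obvious personal data from a prompt before it reaches any model and keeps an audit trail of what was removed. The patterns are deliberately simple examples, not a complete safeguard:

```python
# Minimal input-guardrail sketch: redact obvious personal data before prompts reach a model.
# The patterns are simple examples only; real guardrails combine many more checks.
import re

REDACTIONS = {
    "EMAIL": r"[\w.+-]+@[\w-]+\.[\w.]+",
    "CARD": r"\b(?:\d[ -]?){13,16}\b",
    "PHONE": r"\+?\d[\d -]{7,}\d",
}

def apply_guardrails(prompt: str) -> tuple[str, list[str]]:
    """Return the redacted prompt plus an audit log of what was removed."""
    audit = []
    for label, pattern in REDACTIONS.items():
        prompt, n = re.subn(pattern, f"[{label} REDACTED]", prompt)
        if n:
            audit.append(f"{label}: {n} match(es) redacted")
    return prompt, audit

safe_prompt, log = apply_guardrails(
    "Email jane.doe@example.com about invoice 4417, card 4111 1111 1111 1111."
)
print(safe_prompt)
print(log)
```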

Frequently Asked Questions

 

How can generative AI models be used in business?

Generative AI models can be used in business to improve productivity, efficiency, and decision-making. Some key business uses include:

  • Retail & E-commerce – Automating product descriptions, powering chatbots, providing personalized recommendations, detecting fraud, and assisting with product design.
  • Marketing & Advertising – Creating digital artwork, generating ad copy, personalizing campaigns, transforming 2D images into 3D assets, and analyzing customer trends.
  • Financial Services – Detecting fraud, supporting risk management, drafting contracts, automating credit decisions, and analyzing large volumes of financial data.
  • Energy & Utilities – Forecasting demand, optimizing grid management, improving scheduling, and detecting hazards in mining or energy production.
  • Law – Drafting and summarizing contracts, streamlining research, organizing legal documents, and providing intelligent chatbot assistants.

 

How will generative AI reshape the enterprise?

Generative AI will reshape the enterprise by driving productivity, efficiency, and automation. ABI Research forecasts it will create around $450 billion in value by 2030 across industries such as retail, marketing, financial services, energy, and law.

Adoption will happen in three waves:

  • Employee Augmentation – Using Gen AI for content creation, chatbots, research, and data analysis to support workers.
  • Service Enablement – Integrating Gen AI into mission-critical areas like healthcare, legal services, and finance.
  • Process Automation – Large-scale use in manufacturing, logistics, and energy for predictive maintenance, optimization, and supply chain automation.

Even though enterprises are cautious today, the long-term impact will be widespread automation and entirely new ways of working.

 

What are the key challenges in deploying generative AI in business?

The key challenges with deploying generative AI in business include: 

  • Data privacy and security risks – Concerns about protecting sensitive information.
  • Intellectual Property (IP) and copyright – Risks of infringement and unclear ownership.
  • High costs – Building, training, and running models is resource-intensive.
  • Lack of fine-tuned models – Current large, general-purpose models are inefficient for business-specific tasks.
  • Skills gap – Many organizations lack the expertise to build and deploy Gen AI responsibly.
  • Hallucinations and reliability issues – Output may be inaccurate or unexplainable, reducing trust.

The path forward is in developing smaller, fine-tuned models trained on enterprise-specific data, which lowers costs, improves performance, and increases trustworthiness.

 

What role do AI technology suppliers play in enterprise generative AI?

The generative AI ecosystem relies on multiple types of suppliers, each with a critical role:

  • Hardware providers – Companies like NVIDIA, Intel, AMD, and Qualcomm supply GPUs and edge hardware that power training and inference.
  • Foundation model providers – Hyperscalers (Microsoft, Google, Meta, Amazon) and startups (OpenAI, Anthropic, Cohere) create large language and multimodal models that enterprises can build on.
  • Data services – Firms curate, synthesize, and label data to fine-tune models while ensuring compliance with privacy and IP rules.
  • ML service tools – Platforms and low/no-code tools that help enterprises fine-tune, deploy, secure, and manage AI models.
  • Application developers – Build full-stack, fine-tuned generative AI solutions tailored to specific business use cases.
  • Enterprise services – Consultants, system integrators, and managed service providers (e.g., PwC, Accenture, Bain) guide companies through deployment and strategy.

Together, these suppliers form the value chain that enables enterprises to safely and effectively adopt generative AI.

Tags: AI & Machine Learning
