The 4 trends that prevail on the Gartner Hype Cycle for AI, 2021
Smarter with Gartner I 12:18 pm, 23rd September
Through the use of natural language processing (NLP) and emerging technologies such as generative AI, knowledge graphs and composite AI, organizations are increasingly using AI solutions to create new products, improve existing products and grow their customer base.
However, the prime focus for organizations is to accelerate the speed at which the proofs of concept (POCs) move into production. Hence, the following four trends dominate this year’s AI landscape:
- Operationalizing AI initiatives
- Efficient use of data, models and compute
- Responsible AI
- Data for AI
Trend No. 1: Operationalizing AI initiatives
For most organizations, continuously delivering and integrating AI solutions into enterprise applications and business workflows remains a complex afterthought.
“On average, it takes about eight months to get an AI-based model integrated within a business workflow and for it to deliver tangible value,” says Shubhangi Vashisth, Senior Principal Analyst, Gartner. “However, to reduce AI project failures, organizations must efficiently operationalize their AI architectures.”
Gartner expects that by 2025, 70% of organizations will have operationalized AI architectures due to the rapid maturity of AI orchestration initiatives.
Organizations should consider model operationalization (ModelOps) for operationalizing AI solutions. ModelOps reduces the time it takes to move AI models from pilot to production with a principled approach that can help ensure a high degree of success. It also offers a system for governance and lifecycle management of all AI (graphs, linguistic, rule-based systems and others) and decision models.
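To make the governance and lifecycle idea concrete, here is a minimal sketch of a ModelOps-style promotion gate in Python. The stage names, the accuracy threshold and the `ModelRecord`/`promote` API are illustrative assumptions, not any specific product's interface:

```python
# Hypothetical ModelOps sketch: a registry record whose promotion from pilot
# to production is gated behind a validation check, with an audit trail.
from dataclasses import dataclass, field

STAGES = ["pilot", "staging", "production"]

@dataclass
class ModelRecord:
    name: str
    version: str
    metrics: dict
    stage: str = "pilot"
    history: list = field(default_factory=list)  # audit trail for governance

def promote(record: ModelRecord, min_accuracy: float = 0.85) -> ModelRecord:
    """Advance the model one stage, but only if it passes its validation gate."""
    if record.metrics.get("accuracy", 0.0) < min_accuracy:
        raise ValueError(f"{record.name} v{record.version} failed the gate")
    idx = STAGES.index(record.stage)
    if idx == len(STAGES) - 1:
        return record  # already in production
    record.history.append(record.stage)
    record.stage = STAGES[idx + 1]
    return record

m = ModelRecord("churn", "1.2", metrics={"accuracy": 0.91})
promote(m)
promote(m)
print(m.stage, m.history)  # production ['pilot', 'staging']
```

The point of the gate is that every move from pilot toward production is principled and recorded, which is the "high degree of success" and governance that ModelOps is meant to provide.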
Trend No. 2: Efficient use of data, models and compute
As organizations continue to innovate in AI, they also need to efficiently use all resources — data, models and compute.
For example, composite AI combines "connectionist" AI approaches, such as deep learning, with "symbolic" approaches, such as rule-based reasoning, graph analysis, agent-based modeling and optimization techniques. Combining these techniques (among others) yields a composite AI system that solves a wider range of business problems more efficiently.
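A minimal sketch of that combination: a learned score (here a toy stand-in for a trained model) whose output is constrained by symbolic business rules. The loan-screening scenario, rules and thresholds are all illustrative assumptions:

```python
# Hypothetical composite AI sketch: a "connectionist" score combined with
# "symbolic" rules for loan screening. All names and thresholds are invented.

def ml_score(applicant: dict) -> float:
    """Stand-in for a learned model: a toy linear score clipped to [0, 1]."""
    score = 0.5 + 0.3 * (applicant["income"] / 100_000) - 0.2 * applicant["defaults"]
    return max(0.0, min(1.0, score))

RULES = [
    # (predicate, decision) pairs: symbolic knowledge that overrides the model.
    (lambda a: a["defaults"] >= 3, "reject"),    # hard regulatory rule
    (lambda a: a["income"] < 10_000, "review"),  # route edge cases to a human
]

def composite_decision(applicant: dict) -> str:
    for predicate, decision in RULES:
        if predicate(applicant):
            return decision
    return "approve" if ml_score(applicant) >= 0.6 else "reject"

print(composite_decision({"income": 80_000, "defaults": 0}))  # approve
print(composite_decision({"income": 80_000, "defaults": 3}))  # reject (rule fires)
```

The rules handle cases the model should never decide alone (regulation, edge cases), while the learned score handles the bulk of ordinary cases; neither component could cover the full range of problems by itself.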
Organizations can apply generative AI that creates original media content, synthetic data and models of physical objects. For example, generative AI was used to create a drug to treat obsessive compulsive disorder (OCD) in less than 12 months. Gartner estimates that by 2025, more than 30% of new drugs and materials will be systematically discovered using generative AI techniques.
Trend No. 3: Responsible AI
The more AI replaces human decisions at scale, the more it amplifies the positive and negative impacts of those decisions. Left unchecked, AI-based approaches can perpetuate bias, leading to ethical issues and to lost productivity and revenue.
While algorithms can infer race and gender from proxy variables, such as typically female first names or postal codes with a dominant racial demographic, more implicit bias is harder to spot. For example, a data scientist might overlook that the number of clicks on a website can act as a proxy for age and so discriminate against older users. AI can perfectly classify a stereotypical Western wedding yet fail to recognize wedding ceremonies in India or Africa.
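One simple way to surface such proxy features is to check, on held-out data, how strongly each candidate feature correlates with a protected attribute that is excluded from training. The sketch below does this with a Pearson correlation; the data and the 0.5 flagging threshold are illustrative assumptions:

```python
# Hypothetical proxy-bias check: flag model features that correlate strongly
# with a protected attribute (here, age) that is held out of training.
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

ages = [22, 35, 47, 58, 63, 71]      # protected attribute (not a model input)
clicks = [120, 95, 60, 45, 30, 18]   # candidate feature: clicks on the website

r = pearson(clicks, ages)
if abs(r) > 0.5:  # illustrative flagging threshold
    print(f"'clicks' is a likely age proxy (r = {r:.2f})")
```

A correlation check like this only catches linear proxies; real audits also look for nonlinear and combined-feature proxies, which is why implicit bias remains hard to spot.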
Moving forward, organizations must develop and operate AI systems fairly and transparently, and attend to safety, privacy and society at large.
Trend No. 4: Data for AI
Disruptions such as the COVID-19 pandemic are causing historical data that reflects past conditions to quickly become obsolete, breaking many production AI and ML models.
Data and analytics (D&A) and IT leaders are now turning to new analytics techniques known as "small data" and "wide data." Together, these techniques use available data more effectively, either by working with low volumes of data or by extracting more value from unstructured, diverse data sources.
By 2025, Gartner expects that 70% of organizations will be compelled to shift their focus from big to small and wide data, providing more context for analytics and making AI less data-hungry.