The 4 trends that prevail on the Gartner Hype Cycle for AI, 2021
Smarter with Gartner | 12:18 pm, 23rd September
Through the use of natural language processing (NLP) and emerging technologies such as generative AI, knowledge graphs and composite AI, organizations are increasingly using AI solutions to create new products, improve existing products and grow their customer base.
However, the prime focus for organizations is to accelerate the speed at which proofs of concept (POCs) move into production. Hence, the following four trends dominate this year’s AI landscape:
- Operationalizing AI initiatives
- Efficient use of data, models and compute
- Responsible AI
- Data for AI
Trend No. 1: Operationalizing AI initiatives
For the majority of organizations, continuously delivering and integrating AI solutions within enterprise applications and business workflows remains a complex effort that is too often treated as an afterthought.
“On average, it takes about eight months to get an AI-based model integrated within a business workflow and for it to deliver tangible value,” says Shubhangi Vashisth, Senior Principal Analyst, Gartner. “However, to reduce AI project failures, organizations must efficiently operationalize their AI architectures.”
Gartner expects that by 2025, 70% of organizations will have operationalized AI architectures due to the rapid maturity of AI orchestration initiatives.
Organizations should consider model operationalization (ModelOps) for operationalizing AI solutions. ModelOps reduces the time it takes to move AI models from pilot to production with a principled approach that can help ensure a high degree of success. It also offers a system for governance and lifecycle management of all AI (graphs, linguistic, rule-based systems and others) and decision models.
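The stage-gated lifecycle that ModelOps implies can be sketched as a simple registry in which every model advances through governed stages from pilot to production. The stage names and the registry API below are illustrative assumptions for this sketch, not a specific product or Gartner reference design.

```python
# Illustrative ModelOps-style lifecycle sketch: models advance one governed
# stage at a time; skipping stages is refused. Stage names are assumptions.
PIPELINE = ["registered", "validated", "pilot", "production", "retired"]

class ModelRegistry:
    def __init__(self):
        self.stages = {}  # model name -> current lifecycle stage

    def register(self, name):
        self.stages[name] = "registered"

    def promote(self, name):
        """Advance a model exactly one governed stage and return the new stage."""
        i = PIPELINE.index(self.stages[name])
        if i + 1 >= len(PIPELINE):
            raise ValueError(f"{name} is already {self.stages[name]}")
        self.stages[name] = PIPELINE[i + 1]
        return self.stages[name]

registry = ModelRegistry()
registry.register("churn-model")
registry.promote("churn-model")   # -> validated
registry.promote("churn-model")   # -> pilot
registry.promote("churn-model")   # -> production
print(registry.stages["churn-model"])  # production
```

The point of the one-stage-at-a-time rule is governance: each promotion is an auditable decision point rather than an ad hoc deployment.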
Trend No. 2: Efficient use of data, models and compute
As organizations continue to innovate in AI, they also need to efficiently use all resources — data, models and compute.
For example, composite AI is currently about combining "connectionist" AI approaches like deep learning, with "symbolic" AI approaches like rule-based reasoning, graph analysis, agent-based modeling or optimization techniques. The result of combining those techniques (among others) is a composite AI system that solves a wider range of business problems in a more efficient manner.
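In miniature, the composite idea is a learned statistical scorer paired with explicit symbolic rules that can veto or confirm it. The weights, threshold, and business rule below are illustrative assumptions, not a Gartner reference design; a toy logistic scorer stands in for a trained deep learning model.

```python
# Composite AI sketch: "connectionist" scorer + "symbolic" rule layer.
# All weights, thresholds and rules here are illustrative assumptions.
import math

# Connectionist component: a toy logistic scorer standing in for a trained model.
WEIGHTS = [2.5, 1.5]
BIAS = -2.0

def model_score(features):
    z = sum(w * x for w, x in zip(WEIGHTS, features)) + BIAS
    return 1.0 / (1.0 + math.exp(-z))  # probability-like score in (0, 1)

# Symbolic component: an explicit domain rule that can override the model.
def composite_predict(features):
    if features[0] > 0.95:   # hard business rule: always flag this case
        return 1
    return int(model_score(features) > 0.5)

print(composite_predict([0.85, 0.90]))  # model-driven decision
print(composite_predict([0.99, 0.10]))  # rule-driven override
```

The division of labor is the point: the learned component generalizes from data, while the symbolic layer encodes constraints the business cannot allow the model to violate.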
Organizations can apply generative AI that creates original media content, synthetic data and models of physical objects. For example, generative AI was used to create a drug to treat obsessive compulsive disorder (OCD) in less than 12 months. Gartner estimates that by 2025, more than 30% of new drugs and materials will be systematically discovered using generative AI techniques.
Trend No. 3: Responsible AI
The more AI replaces human decisions at scale, the more it amplifies the positive and negative impacts of those decisions. Left unchecked, AI-based approaches can perpetuate bias, leading to ethical issues and losses in productivity and revenue.
While algorithms can deduce race and gender from proxy parameters, such as typically female names or postal codes associated with a dominant racial demographic, more implicit bias is difficult to spot. For example, a data scientist might overlook that the number of clicks on a website can act as a proxy that discriminates by age. An AI model can perfectly classify a stereotypical Western wedding yet fail to recognize weddings in India or Africa.
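One basic check the paragraph implies is comparing a model's positive-outcome rate across groups, for instance age bands that a click-count proxy might encode. The data, group labels, and the four-fifths threshold below are illustrative assumptions for this sketch, not a complete fairness audit.

```python
# Hedged fairness-check sketch: selection rates per group and their ratio
# (the "four-fifths rule" heuristic). Data and labels are illustrative.

def selection_rates(records):
    """records: list of (group, predicted_label) pairs with labels 0 or 1."""
    totals, positives = {}, {}
    for group, label in records:
        totals[group] = totals.get(group, 0) + 1
        positives[group] = positives.get(group, 0) + label
    return {g: positives[g] / totals[g] for g in totals}

def disparate_impact(rates):
    """Ratio of the lowest to the highest group selection rate."""
    lo, hi = min(rates.values()), max(rates.values())
    return lo / hi if hi else 1.0

records = [("under_40", 1)] * 8 + [("under_40", 0)] * 2 \
        + [("over_40", 1)] * 4 + [("over_40", 0)] * 6
rates = selection_rates(records)
print(rates)                    # {'under_40': 0.8, 'over_40': 0.4}
print(disparate_impact(rates))  # 0.5 -- below the common 0.8 threshold
```

A ratio well below 0.8 does not prove discrimination, but it flags exactly the kind of proxy-driven disparity a data scientist might otherwise overlook.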
Moving forward, organizations must develop and operate AI systems with fairness and transparency while safeguarding safety, privacy and society at large.
Trend No. 4: Data for AI
Disruptions such as the COVID-19 pandemic are causing historical data that reflects past conditions to quickly become obsolete, breaking many production AI and ML models.
D&A and IT leaders are now turning to new analytics techniques known as “small data” and “wide data.” Taken together, they are capable of using available data more effectively, either by working with low volumes of data or by extracting more value from unstructured, diverse data sources.
By 2025, Gartner expects that 70% of organizations will be compelled to shift their focus from big to small and wide data, providing more context for analytics and making AI less data-hungry.