
Big Data Paris: what are the stakes of this year's tech event?

Over the next two days, on 26 and 27 September, the Big Data & AI trade show will be held at the Palais des Congrès in Paris.

Big Data & AI Paris: what makes this show stand out?

For the third consecutive year, this event, born of the merger of the Big Data Paris and AI Paris trade fairs, is a meeting place for tech players, bringing together 250 of the most influential European companies in the sector and some 15,000 participants. Interest in the event continues to grow as artificial intelligence becomes indispensable to every industry. Indeed, in a decade, the big data and AI ecosystem has become an economic sector in its own right, influencing and transforming society in depth.

This 2022 edition breaks away from strictly “business” concerns and takes the gamble of opening up to social and environmental questions. It is therefore natural that the opening conference examines the sector's externalities. The conference, entitled “Big data and the digital revolution: allies or enemies of the environment?”, will focus on sustainable applications of big data and AI and will be given by Gilles BABINET, entrepreneur, writer, and co-chairman of the Conseil national du numérique.

In the background, data security and European regulation will also be on the agenda of this 2022 edition. Florence RAYNAL, deputy director of the CNIL, will co-host a conference entitled “The AI Act, DMA, DSA, DGA… How to prepare for the application of the new European regulations”, with data protection specialists Cody OLSON (Groupe Rocher) and Emmanuel PERNOT-LEPLAY (Deloitte). Adopting a forward-looking stance on these issues is essential, both to ensure that technical choices meet compliance requirements and, above all, to avoid practices that could provoke public rejection and, by a domino effect, slow the deployment of every solution.

From the era of data to the era of data culture? 

The era of big data has materialized in an unprecedented collection and accumulation of data. The volume generated by the economy is increasing by 40% annually and is expected to reach 163 trillion gigabytes (163 zettabytes) by 2025. If, with the Harvard Business Review, we describe this data as “the black gold of the 21st century”, then the companies that master the refining of these colossal, almost unquantifiable quantities will be the majors of this century.
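As a back-of-the-envelope illustration of how quickly a 40% annual growth rate compounds (the starting volume below is hypothetical, chosen only to show the multiplier):

```python
# Illustrative compound-growth sketch. The 40% rate comes from the article;
# the starting volume is a hypothetical placeholder.
def projected_volume(start_zb: float, rate: float, years: int) -> float:
    """Project a data volume forward at a constant annual growth rate."""
    return start_zb * (1 + rate) ** years

# At 40% per year, any volume multiplies by 1.4 each year:
# after 5 years the factor is 1.4 ** 5, i.e. roughly 5.4x.
growth_5y = projected_volume(1.0, 0.40, 5)
print(f"5-year multiplier at 40%/yr: {growth_5y:.2f}")
```

In other words, at that pace the volume more than quintuples every five years, which is why "refining" capacity, not collection, becomes the bottleneck.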

The challenge for companies specializing in artificial intelligence is therefore to develop the tools that are essential for organizations to understand data and to instill a data culture, which means raising awareness of the essential nature of data structuring and exploitation. 

This is a challenge the sector's players seem ready to take up. According to a study by Xerfi Global on artificial intelligence, cited by the show's organizers, companies' investments in AI have already exceeded 67 billion dollars, in a market expected to be worth 241 billion dollars in 2025.

Semantics, the Holy Grail of AI? 

Semantic analysis, which consists of extracting meaning from text, occupies a central place among the challenges of artificial intelligence, and many players have positioned themselves in this segment. The first applications were geared toward efficient conversational agents capable of standing in for a human. This market is still growing and is expected to generate 10 billion in revenue by 2026.

Subsequently, semantic analysis was positioned as a response to the problems of organizing and exploiting increasingly rich and complex databases. The applications of NLP (Natural Language Processing) are the subject of a workshop, “Understanding texts thanks to AI and Deep Learning: Uses and Technologies”, led by François-Régis CHAUMARTIN, VP Semantic Data Science at Dassault Systèmes and founder of Proxem, the group's semantic analysis subsidiary.

Beyond commercial applications, crucial political and social challenges revolve around this discipline. Disinformation tops the list of scourges to which AI can provide solutions. Julien Mardas decided to answer this call in 2019 by founding the start-up Buster.Ai, dubbed “the information antivirus” by Challenges. In concrete terms, Buster.Ai develops and markets an AI capable of fact-checking claims against millions of sources in record time and with unprecedented accuracy. The solution has already been adopted by some of the biggest names in the media, and the company will be present throughout the show. While this technology gives newsrooms the ability to fact-check sources, in the long term it can be seen as a tool in the service of democracies, contributing to a healthier and calmer public debate.

What are the challenges for companies? 

The era of Big Data, in which companies are inundated with indicators, trackers, and insights, has made the processes of selecting, classifying, and exploiting data more complex. 

The mission of the CDO (Chief Data Officer) is now to have a positive impact on their organization by translating data into tools for understanding and decision-making, capable of accelerating cycles at a time when speed has become paramount. It is a mission claimed by 96% of CDOs, according to Expert.Ai, which offers a natural language platform, i.e. semantic analysis, capable of revealing the essence of a document to enable optimal, exhaustive, and rapid exploitation.

While it has become a cliché to write that deploying AI in organizations is a major growth lever, measuring its impact is essential. According to a 2017 joint study by Accenture and Frontier Economics, creating new work processes that rethink the relationship between machines and humans could double annual economic growth rates by 2035 while increasing labor productivity by up to 40%.

For example, AI offers far more efficient supply chain optimization solutions than traditional automation. In the supply chain of an average Fortune 100 company, these productivity gains could free up between $50 million and $100 million in cash every day.

To achieve these ambitions, adopting the tools is only the first step in a process of change management that shifts organizations toward new models. And these emerging challenges hinge on people's trust in freely accessible, reliable, and verifiable information.