International Journal of Computer Applications
Foundation of Computer Science (FCS), NY, USA
Volume 187 - Issue 79
Published: February 2026
Authors: Anil Mandloi
DOI: 10.5120/ijca2026926341
Anil Mandloi. Event-Driven Architectures with Apache Kafka: Supporting Agentic AI and Big Data Analytics in Banking Transformations. International Journal of Computer Applications. 187, 79 (February 2026), 5-13. DOI=10.5120/ijca2026926341
@article{ 10.5120/ijca2026926341,
author = { Anil Mandloi },
title = { Event-Driven Architectures with Apache Kafka: Supporting Agentic AI and Big Data Analytics in Banking Transformations },
journal = { International Journal of Computer Applications },
year = { 2026 },
volume = { 187 },
number = { 79 },
pages = { 5-13 },
doi = { 10.5120/ijca2026926341 },
publisher = { Foundation of Computer Science (FCS), NY, USA }
}
%0 Journal Article
%D 2026
%A Anil Mandloi
%T Event-Driven Architectures with Apache Kafka: Supporting Agentic AI and Big Data Analytics in Banking Transformations
%J International Journal of Computer Applications
%V 187
%N 79
%P 5-13
%R 10.5120/ijca2026926341
%I Foundation of Computer Science (FCS), NY, USA
The banking sector is experiencing profound digital transformation, driven by demands for real-time processing, hyper-personalized customer experiences, advanced analytics, and stringent regulatory compliance [1][2]. Event-Driven Architectures (EDA) powered by Apache Kafka have become foundational for building scalable, resilient systems capable of managing massive data volumes with minimal latency [3][4]. This paper investigates Kafka's pivotal role in enabling EDA within banking, facilitating big data analytics for applications such as real-time fraud detection, risk management, and customer 360-degree views [5][6]. Furthermore, it explores Kafka's support for agentic AI—autonomous agents that perceive environments, reason over data, and execute actions in multi-agent ecosystems, leveraging real-time event streams for coordination [7][8]. Key advantages include loose coupling of services, fault-tolerant processing, and integration with stream processors like Apache Flink [9]. Real-world deployments at institutions including Rabobank, Nationwide Building Society, ING Bank, Capital One, and Alpian Bank demonstrate tangible benefits, such as reduced fraud losses and enhanced operational agility [10][11]. Challenges like exactly-once semantics, schema evolution, security in regulated environments, and governance for AI agents are examined, alongside best practices and emerging protocols such as Model Context Protocol (MCP) and Agent-to-Agent (A2A) [12][13]. The synergy of EDA, data streaming, and agentic AI underscores Kafka's position as an essential technology for future-proof banking platforms.
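The loose coupling the abstract attributes to EDA can be illustrated with a minimal sketch. The following is not from the paper and does not use a real Kafka client; it is a toy in-memory publish/subscribe bus standing in for Kafka topics, with a hypothetical fraud-detection consumer and threshold, to show how producers and consumers stay decoupled through a named topic.

```python
# Illustrative sketch only: an in-memory pub/sub bus mimicking the loose
# coupling that Kafka topics provide in an event-driven architecture.
# Topic name, handler, and the 10,000 threshold are hypothetical.
from collections import defaultdict
from typing import Callable

class EventBus:
    """Toy stand-in for a Kafka cluster: producers publish to named
    topics; consumers subscribe without knowing who produces."""
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Deliver the event to every subscriber of the topic.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
alerts: list[dict] = []

# Fraud-detection consumer: flags transactions above an illustrative threshold.
def fraud_detector(event: dict) -> None:
    if event["amount"] > 10_000:
        alerts.append({"account": event["account"], "reason": "amount-threshold"})

bus.subscribe("payments", fraud_detector)

# A payments producer publishes events with no reference to any consumer;
# new consumers (risk scoring, customer-360 updates) could subscribe later
# without changing the producer at all.
bus.publish("payments", {"account": "A-1", "amount": 2_500})
bus.publish("payments", {"account": "A-2", "amount": 25_000})

print(alerts)
```

In a real deployment the bus would be a Kafka cluster with durable, partitioned topics and consumer groups, but the decoupling pattern is the same: producers and consumers share only a topic name and an event schema.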