Life Is But A Stream

Ep 12 - Shift Left, Stream Fast: The Real-Time Data Strategy Fueling Swedbank’s Future

Episode Summary

When data moves faster, risk goes down, and impact goes up. Rami Al Lolah, Lead Architect at Swedbank, explains how real-time data streaming and the shift left approach are transforming how the bank delivers data-driven services with speed, safety, and scale.

Episode Notes

In a highly regulated industry where every millisecond matters, Swedbank is shifting its data strategy to move fast without breaking trust. In this episode, Rami Al Lolah, Lead Architect at Swedbank’s Integration Center of Excellence, shares how the bank built a future-proof foundation using Apache Kafka® with Confluent Cloud.

From early investments in building reusable data products to a shift-left approach, Swedbank’s strategy goes beyond infrastructure. It’s about enabling interoperability, eliminating data silos, and empowering teams to think differently about how data should flow across a modern financial institution.

You’ll Learn:

Whether you're leading modernization for a financial institution or scaling real-time data for your organization, this episode offers a blueprint for transformation. 

About the Guest:
Rami Al Lolah is a seasoned technologist and domain expert with 25+ years in the application and systems integration domain. His expertise lies in designing and delivering major integration solutions built on world-leading integration technologies, leading technical and services teams, and directing major integration implementations.

Guest Highlight: 
“We need to embrace the [shift left] mindset and methodology, and try to move critical considerations, for example — compliance, security, governance, and data quality earlier in the development process. This means that when teams build something like a Kafka topic, or even an API or a file based integration, they are not just focusing on making it work technically. They are already thinking about who owns it, who owns the data, what the contract, what the schemas look like, who will access this data, and how the access will be managed.”

Episode Timestamps: 
*(06:40) - Data Streaming Goodness
*(33:15) - The Runbook: Winning Strategies
*(41:15) - Data Streaming Street Cred
*(47:40) - Quick Bytes
*(54:30) - Joseph’s Top 3 Takeaways

Dive Deeper into Data Streaming:

Links & Resources:

Our Sponsor:  
Your data shouldn’t be a problem to manage. It should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy, real-time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster, and maximize efficiency with a true data streaming platform from the pioneers in data streaming. Learn more at confluent.io

 

Episode Transcription

0:00:00.2 Joseph Morais: Welcome to Life is But A Stream, the web show for tech leaders who need real time insights. I'm Joseph Morais, technical champion and data streaming evangelist here at Confluent. My goal, helping leaders like you harness data streaming to drive instant analytics, enhance customer experiences and lead innovation. Today I'm talking to Rami Al Lolah, lead architect for the Integration center of Excellence at Swedbank. In this episode we're looking at how one of the largest banks in the Nordics is modernizing data strategies with the help of real time streaming. You'll hear how Swedbank is ditching batch processing and building a real time foundation for integration, governance and scale. All powered by a centralized data sharing platform that's helping them deliver better customer experiences and real business impact. But first, a quick word from our sponsor.

0:00:54.8 Announcer: Your data shouldn't be a problem to manage, it should be your superpower. The Confluent data streaming platform transforms organizations with trustworthy real time data that seamlessly spans your entire environment and powers innovation across every use case. Create smarter, deploy faster and maximize efficiency with the true data streaming platform from the pioneers in data streaming.

0:01:23.5 Joseph Morais: Joining me now is Rami Al Lolah, lead architect for the Integration center of Excellence at Swedbank. How are you today, Rami?

0:01:31.5 Rami Al Lolah: Thanks, Joseph. I'm really good. We are still waiting for the never coming summer in Stockholm and it's never coming. I don't know why, but it's. Yeah, we're still hoping.

0:01:42.0 Joseph Morais: I was actually just in San Francisco. It is June and it was colder there in June than when I was there in February. I don't understand how that works.

0:01:52.6 Rami Al Lolah: We are having strange times here.

0:01:54.8 Joseph Morais: Well, let's jump right into it. What do you and your team do at Swedbank?

0:01:59.4 Rami Al Lolah: So I serve as the lead architect within the Integration Platforms Group, formerly known as the Integration Center of Excellence. It is still considered an integration center of excellence. Our group is made up of highly skilled integration engineers and platform specialists who are responsible for running, operating and governing the bank's core integration platforms. These platforms are critical to our ability to move data securely and effectively across systems and between the business domains that we have in the bank. The group consists of three main teams today: the API Management team, the Data Streaming team, a.k.a. the Enterprise Kafka team, and the Managed File Transfer team. And there is a fourth team that will be established very soon; it's going to handle enterprise-level data replication and change data capture services. So we support a wide range of integration demands across the bank: enabling real-time event and data streaming through Confluent Kafka and other technology products, securing and managing APIs, and operating our file-based integration backbone. In short, my role as a lead architect involves setting the strategic direction for how we integrate systems and data for these particular teams and making sure that the group integration strategy is being followed and maintained.

[overlapping conversation]

0:03:39.1 Rami Al Lolah: Yeah.

0:03:39.7 Joseph Morais: As I say, it's quite a bit of responsibility. I mean integrating data for a bank really doesn't get much higher stakes than that.

0:03:46.2 Rami Al Lolah: That's true, yeah, absolutely, yeah, yeah.

0:03:48.7 Joseph Morais: So who are Swedbank's customers?

0:03:51.2 Rami Al Lolah: So Swedbank's customers here are mainly the internal customers. It's essentially the internal teams or the domains that need to integrate systems, exchange data, expose or consume APIs or build real-time capabilities into their products and services. This includes product teams working on mobile and internet banking on the channel side, customer onboarding, payments, lending, risk, anti-financial crime, AML, fraud and pretty much every business area you can think of, basically. So if a team needs to connect an application, for example, stream events, move files securely or build APIs, they come to us basically.

0:04:37.3 Joseph Morais: So when you say integrations, so are you integrating things like mainframes, various flavors of databases with your data streaming? Are those the type of integrations you're talking about?

0:04:46.4 Rami Al Lolah: Yes. So basically we are involved in mainly everything that is related to integration. Our core banking services on the Swedish side are actually hosted on the mainframe, on the z/OS mainframe. So most of the integration goes from the channels towards these systems in the back end on the mainframe. It's still API-based integration, and batch as well, transferring between whatever they have on the mainframe and outside the mainframe. But it's not only this; we have lots of integrations outside of the mainframe as well, in the bank.

0:05:29.3 Joseph Morais: Got it. Well that's great. Thank you for all that detail. 

0:05:32.0 Rami Al Lolah: Absolutely. 

0:05:32.4 Joseph Morais: So you know, at a very high level, right, just like the elevator pitch, because we're going to go through the full details of it through the episode. But what is Swedbank's data streaming strategy?

0:05:40.9 Rami Al Lolah: Our data streaming strategy, you can think about it as being all about enabling real-time, governed and reusable data movement across the bank, using our Confluent platform today as the foundation. It's like the main backbone for data streaming in the bank today. We have moved away from traditional point to point; we actually banned point-to-point integration in principle, and we are trying to move away also from batch-heavy integrations. We are building a central event streaming backbone that allows different domains to publish and consume data as events in real time, securely, reliably and at scale. And we actually use the Confluent platform as the primary enabler for this.

0:06:40.5 Joseph Morais: So we set the stage. So let's dig deeper into your data streaming journey in our first segment. So tell me, before implementing data streaming, what were the biggest challenges you were facing day to day?

0:06:52.2 Rami Al Lolah: So as a bank we are very keen to provide very competitive financial services, but with the highest compliance and security at the operational level as well. So we wanted to be able to secure data, analyze it, and examine and see what's going on in our operational data. That was the first thing people would think about. We also wanted to democratize the reach to data for the different functions and units, something to allow them to build analytics and reporting that can drive accurate, safe and value-driven actions and business decisions, basically. The legacy integrations from which we wanted to source the most valuable information were tightly coupled. Often they are based on batch-driven integrations and lack any real-time sense. Believe me, you don't want to, for example, find out that there is a suspicious activity going on in some financial activity the next day or the week after.

0:07:55.5 Joseph Morais: Right. So it sounds like the slower your data is, the more risk you take on as a bank.

0:08:01.4 Rami Al Lolah: Exactly, yeah. And that is one of the key challenges that drove us in the beginning to start the Confluent Kafka platform. We also had challenges with data silos. We wanted data to be reusable. We didn't want the data to be hijacked by a specific area that doesn't share it; we wanted to achieve interoperability in the organization, where data can be shared across the organization in a safe and responsible way. Apart from analytics, one of the biggest challenges that we had, which also drove this fortunate implementation of what we have today, is that we lacked simple asynchronous application integration, event and data streaming use cases. People used to have, for example, MQ-based integrations, file-based integrations, even APIs. But it didn't work exactly as you would expect in that sense. You want a capable platform that can actually cater for real time, not near real time, or not as some people like to market it. So that was one of the biggest drivers as well to take further steps in implementing data streaming. And the...

0:09:21.8 Joseph Morais: Yes, the other options are either not fast enough or they don't scale well enough. And the beautiful thing about the way we've built our technology is it was built for both of those challenges. So you already mentioned one, because I'm curious about the customer experience before data streaming. The obvious example you gave, which is a fantastic one, is you don't want to find out that someone stole your credit card tomorrow, right? You want to find out about that charge right now. Are there any other challenges that came to mind before implementing data streaming?

0:09:54.1 Rami Al Lolah: Of course. I think the fact that you cannot get data in time, analyze it and take fast decisions made teams have slow development. It's not a full stop, but teams wanted data, so they wanted to find ways to bring this data into their systems, and bringing this data in was not an easy job. For example, going through traditional, non-real-time data integration caused bottlenecks in easing the migration of some core services that we had towards an event-driven architecture. I think one major challenge is that it was harder to deliver a truly personalized, responsive, if you like, banking experience. When your architecture, for example, is built around yesterday's data, or the day before yesterday's, you cannot actually make proactive decisions. And it's a key reason we have invested heavily in real-time streaming: to create a foundation where data flows immediately, reliably and securely. It came to analytics, and it came to application integration, and that actually helped lots of teams adapt quickly and transform in a more effective way in doing things and doing integrations.

0:11:19.8 Joseph Morais: That's fantastic. So I'm curious, the audience now understands the challenges you were having before data streaming, but I'm curious if you could break down how you specifically ended up selecting data streaming as the new architecture to solve these. Was there a specific tipping point? Was there previous experience with Kafka? I'm just curious how you ultimately said yes, data streaming is what's going to fix this for us.

0:11:43.2 Rami Al Lolah: It was rather a gradual realization across the group that our traditional integration methods, especially batch-based processing and point-to-point APIs, for example, just couldn't keep up with the growing demand for speed, scalability and responsiveness for our digital services. As more teams started building cloud-native applications and microservices, working with advanced analytics and expecting near-real-time, or more recently real-time, insights, the limitations of our legacy patterns became more and more visible for everyone, basically. A clear turning point came when we started aligning our technology direction with Swedbank's broader tech and data strategy. It emphasized modularity, data-driven decision making and cloud-native thinking. And that is when we actually began laying a foundation for real-time, event-driven architecture. Confluent became a natural fit at that time for that vision, when we actually started looking for a technology. It offered a governed, enterprise-grade Kafka platform that we could use to decouple systems, stream data securely and enable reusability. While there wasn't really a single crisis moment, if you like, the convergence of business demand, tech strategy and architectural limitations made it clear that data streaming wasn't just nice to have; it was a necessity to keep pace with what the bank wants and what our customers expected from us.

0:13:24.5 Joseph Morais: Right. And it's interesting because I think there's this expectation globally of everything working now, right? We all have phones and we can press a button on our phone and a car comes to pick us up, or food gets delivered to us, or I look and I see I have a notification of a fraud alert. And the expectations now, you know, broadly are that everything should work when I press a button. And I think it's interesting that as you were running into these challenges, there were multiple layers that said, batch is a blocker here, batch is a blocker there. We need to get the real time. And then of course that led you to data streaming. So now that the audience understands the challenges and how you select the data streaming, can you share what Swedbank has built with data streaming or is currently building?

0:14:13.2 Rami Al Lolah: We have built a core integration capability that now underpins many of our most critical internal data flows. One of the most important things I can say we have done is establish a central platform, a data streaming platform, a governed enterprise streaming backbone that connects different domains and platforms across the bank in real time. This platform serves not only as a gateway for data ingestion towards analytics, but also as a base real-time data and event streaming channel for application integration as well. It allows product teams to publish and consume business events securely and reliably, also enabling the long-demanded feature of real decoupling between producers and consumers. That was really a great capability there. Also event-driven communication instead of the traditional point to point. We are also building real-time data pipelines that stream events into analytics environments such as our operational data lake, where teams can build real-time dashboards, reports and models without relying on batch extracts or traditional integration. We started enriching this data and transforming those streams in motion, supporting smarter decisioning directly at the event level. All of this is still evolving, but we are well past the experimentation phase, if you want. Data streaming is now, for us, a key part of how we move, govern and make use of data. So we are not just providing a platform; we are providing a complete offering: frameworks and Confluent Kafka as a service for data streaming for everyone in the bank today.
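
To make the "enriching streams in motion" idea concrete for readers, here is a minimal consume-enrich-produce sketch using the confluent-kafka Python client. The broker address, topic names, fields and enrichment rule are illustrative assumptions, not Swedbank's actual pipeline.

```python
# Minimal sketch of enriching a stream "in motion" (hypothetical topics and fields),
# using the confluent-kafka Python client. Not Swedbank's actual implementation.
import json
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",   # assumption: local dev cluster
    "group.id": "payments-enricher",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})

consumer.subscribe(["payments.raw"])          # hypothetical source topic

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        event = json.loads(msg.value())
        # Enrich and transform close to the source instead of in the data lake:
        event["amount_sek"] = round(event["amount"] * event.get("fx_rate", 1.0), 2)
        event["is_large"] = event["amount_sek"] > 100_000
        producer.produce("payments.enriched", key=msg.key(), value=json.dumps(event))
        producer.poll(0)                      # serve delivery callbacks
finally:
    consumer.close()
    producer.flush()
```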

0:16:09.5 Joseph Morais: That's beautiful. As you were talking through that, I thought of something that we talk about a lot here at Confluent, and this is the idea of shifting left, right? For the audience: if you think of an operational data streaming platform on the left side of the architecture and an analytics platform on the right, the operational platform is feeding the analytics platform, and sometimes there's some bidirectionality there, depending on the integration. When we talk about shift left, it's the idea of bringing your processing closer to the source, in this case your operational estate. And it sounds like, through some stream processing with Confluent Platform, you're actually doing that at the bank now. You're trying to do the processing on that data and build those data products as early as you can, before getting them to your analytical estate. Is that correct?

0:17:01.1 Rami Al Lolah: That's absolutely right. I think this is one of the main reasons we have a data directive that actually encourages the shift left paradigm: we need to embrace the mindset and methodology and try to move critical considerations, for example compliance, security, governance and data quality, earlier in the development process, essentially, as you said, to the left on the timeline of a typical delivery life cycle. So instead of handling those aspects at the end, at the right side, as you said, during production or go-live, or even after the data has landed in the data lake or whatever data system people want to consume the data from, we want to tackle this before that. We want to make sure that the design is correct upfront, during development. And this means that when teams build something like, for example, a Kafka topic, or even an API or a file-based integration, they are not just focusing on making it work technically; they are already thinking about who owns it, who owns the data, what the contract is, what the schemas look like, for example, and who will access this data, and how the access will be managed.

0:18:23.0 Rami Al Lolah: So even what policies can be applied, and how we will ensure traceability and compliance. All of these things are guided today in the bank by frameworks and, you know, guardrails and really strong guidelines that have been created centrally through many functions, from architecture, from group data and AI and so on. All these things need to be followed, need to be assessed, need to be verified and approved. So there is a complete process, if you want, that you go through to ensure that shift left is there, and that we really assess the right data and make sure that we have quality data at the end.
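
As an illustration of what "shifting left" on ownership, contracts and access can look like in practice, here is a hypothetical contract descriptor a team might fill in at design time, before anything is provisioned. All field names and values are invented for the example; they do not reflect Swedbank's internal directive or tooling.

```python
# Hypothetical "shift-left" contract descriptor for a new Kafka topic, captured at
# design time rather than after go-live. Field names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class TopicContract:
    topic: str
    data_owner: str                  # who owns the data
    producing_domain: str            # who owns the integration
    schema_subject: str              # contract registered in Schema Registry
    classification: str              # e.g. "internal", "confidential", "gdpr"
    allowed_consumers: list[str] = field(default_factory=list)
    retention_days: int = 7

contract = TopicContract(
    topic="payments.transactions",
    data_owner="payments-domain",
    producing_domain="payments-platform",
    schema_subject="payments.transactions-value",
    classification="gdpr",
    allowed_consumers=["aml-screening", "odl-ingestion"],
    retention_days=7,
)
# A provisioning pipeline can validate this descriptor (ownership, classification,
# access) before any topic or schema is actually created.
```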

0:19:05.2 Joseph Morais: Well, in my opinion you did it the right way. I mean you started with the integration challenges and that's really the hard part. Decoupling everything, using connectors if you have to pull things from mainframes, from databases, et cetera. But then the next logical step is now you have all that data in your data streaming platform and you have stream processing. Let's start doing those manipulations as early as we can. And I'm glad you were able to find success in doing that. So I'm curious with this integration platform, is there a specific KPI? Is it how integrated things are, is it maybe latency? I'm curious what is your key performance indicator for this platform?

0:19:42.6 Rami Al Lolah: Yeah, I think this is a really good question, and I think it's a concern for many stakeholders in the bank who want to see how our investment is actually going. So one of the primary KPIs we track today is the time to integrate: how quickly a product team can connect to another system, for example, publish or consume events, and go live with a new integration. With the automation and quick onboarding scheme we founded, and close collaboration and support in place, we have significantly reduced that time. What used to take weeks or even months can now often be done in days. We are also improving this even further in the future, enabling full self-onboarding and steering of how our customers can use our platform. Another important metric is platform adoption, for example, and specifically the number of domains actively publishing and consuming from Kafka topics, and the number of events flowing through the platform daily. That gives us a really good sense of whether streaming is becoming the default integration model across the bank rather than an exception. For example, today we really have long waiting lists of people applying to onboard to the enterprise Confluent Kafka platform in the bank, trying to move to this event streaming and data streaming.

0:21:14.1 Rami Al Lolah: We also look at governance and data quality indicators. When we prove that we have good governance and good data quality when using data streaming versus other integrations, then we are employing the right guidelines and the right patterns in that integration, so people will be encouraged to use it and to go with it. For example: how many schemas are being reused versus duplicated, how many topics are properly classified and access controlled, and whether the data being produced is consistent and reliable for downstream use. It ties a lot into our regulatory and compliance goals and internal audit requirements. I think that is a quite important KPI, and this is why we are monitoring a lot of this very carefully and making lots of progress on that.

0:22:12.3 Joseph Morais: Yeah, that's fantastic. So you mentioned data governance, and my next question is actually about that. My initial question was going to be how do you approach it, and I think you did broach that. But what I'm curious about is: how important is tracking and enforcing data quality as data enters your systems at Swedbank?

0:22:31.3 Rami Al Lolah: Yeah, I think tracking and enforcing the quality of data as it enters the system is one of the foundational principles of our data governance model. It's not an optional step, basically, and for good reason. If poor-quality data, for example, gets in early, it spreads quickly downstream, causing inconsistencies in reporting, for example, broken integrations, failed automations, and in the worst case, you know, compliance risks and poor customer experience. So we treat data quality not just as a check at the reporting or analytics layer, for example, but as something that must be validated upstream, as early as possible in the data flow. And that's why the data directive that has been published requires every process that creates or captures data to have clearly defined business terms, documented data elements and formal data quality requirements. That includes everything from formatting and completeness to timeliness, lineage and logical consistency. So we enforce these quality gates at the integration layer, especially in our Kafka platform and in our APIs. Every new schema, for example, in our platform must be registered and versioned through Schema Registry, ensuring structural consistency and making sure that producers and consumers have properly established contracts that comply with our data directives. And we make sure that we are following these governance steps for that.
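
For readers who want to see what registering a versioned contract looks like, here is a minimal sketch using the Schema Registry client that ships with the confluent-kafka Python package. The registry URL, subject name and schema are placeholders, not Swedbank's actual contracts.

```python
# Minimal sketch of registering a versioned schema (the producer/consumer contract)
# before any data is produced. URL, subject and fields are placeholders.
from confluent_kafka.schema_registry import SchemaRegistryClient, Schema

sr = SchemaRegistryClient({"url": "http://localhost:8081"})  # assumption: local registry

payment_schema = Schema(
    schema_str="""
    {
      "type": "record",
      "name": "Payment",
      "namespace": "se.example.payments",
      "fields": [
        {"name": "payment_id", "type": "string"},
        {"name": "amount", "type": "double"},
        {"name": "currency", "type": "string"},
        {"name": "booked_at", "type": {"type": "long", "logicalType": "timestamp-millis"}}
      ]
    }
    """,
    schema_type="AVRO",
)

# Registering under the topic's value subject makes the contract explicit and versioned;
# incompatible changes are rejected by the registry's compatibility checks.
schema_id = sr.register_schema("payments.transactions-value", payment_schema)
print(f"registered schema id {schema_id}")
```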

0:24:15.0 Joseph Morais: Yeah, that's wonderful, and I think you're taking absolutely the right approach. It's really surprising to me how many organizations out there have implemented data streaming without governance. They don't even use schemas; they just throw blobs of JSON back and forth at each other and somehow make it work. But I can't imagine that is tenable, especially at the scale of a bank. So something you touched on earlier was data retention. I'm curious, how long do you need to hold on to operational data, and is that driven by compliance?

0:24:45.6 Rami Al Lolah: We have complete policies for data retention in different areas, basically. So we don't have one fixed retention here and there; even within our data systems, like our operational data lake, there are layers of data and layers of policies. For example, for this layer we keep the data for this many years; this layer, for example, contains GDPR data which we cannot keep for more than a specific time, and that is very critical. In the Confluent platform, we are very strict on that side. We don't use the platform as a sort of tiered storage in that sense. We don't store data in the platform today; rather, we use the platform as a sort of hub for exchanging data between things. Still, there are services that retain the data and keep it there for further processing. So the retention varies from a few days to months and years. And we do not enforce a specific retention. We leave that to the business and to the product teams to get in touch with compliance and risk and all these people and get their blessing that this data can remain in the system for that period of time, and from there decide when we should start purging data and how to do that, where and when and why we should do it. So we provide the capabilities in the platform to do this, but we are not enforcing certain rules on that side.
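
A small sketch of how per-topic retention can be set at creation time with the Kafka AdminClient, in the spirit of the layered policies Rami describes. Topic names and retention values are illustrative only; in practice the periods are agreed with compliance per use case.

```python
# Minimal sketch of applying different retention policies per topic at creation time,
# using the confluent-kafka AdminClient. Values are illustrative, not Swedbank's policy.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "localhost:9092"})  # assumption: local cluster

topics = [
    # Short retention: the platform acts as a hub, not long-term storage.
    NewTopic("payments.enriched", num_partitions=6, replication_factor=3,
             config={"retention.ms": str(3 * 24 * 60 * 60 * 1000)}),      # ~3 days
    # GDPR-sensitive data: retention agreed with compliance, kept deliberately short.
    NewTopic("customer.profile-updates", num_partitions=6, replication_factor=3,
             config={"retention.ms": str(24 * 60 * 60 * 1000)}),          # ~1 day
]

for topic, fut in admin.create_topics(topics).items():
    try:
        fut.result()
        print(f"created {topic}")
    except Exception as exc:
        print(f"failed to create {topic}: {exc}")
```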

0:26:27.6 Joseph Morais: So it's interesting. You've talked about connectors, we've talked about variable retention, we've talked about stream processing, we talked about schema registries and governance, and really security too, and I think that's an important one. What you've really just described is what we here at Confluent call the data streaming platform, right? You can use something like open source Kafka and open source Flink to achieve a lot of the things you talked about here. But the data streaming platform is about simplifying all of that, about giving you a package that has all that integrated and includes things like high levels of security, whether it's ACLs or role-based access or open authentication, OAuth. All of that you don't really get out of the box if you try to build it yourself. And I think all of those advantages of the Confluent data streaming platform, and correct me if I'm wrong, really helped Swedbank adopt data streaming without any of the rough edges and some of the missing pieces that you'd have to build yourself. Do you agree?

0:27:31.9 Rami Al Lolah: Absolutely. I think there are lots of considerations when we talk about this. When I joined the bank, the Confluent platform was already there, basically. It was already established for analytics, for example. Then we started to expand the scope of the Confluent platform into not only doing analytics, but also integrating microservices, integrating systems together in pure application integration, publish/subscribe patterns. And so lots of people started thinking about open source Apache Kafka, for example, or RabbitMQ, which some of the product teams individually started implementing. But they actually miss the key point here, which is that you don't have the full package. Basically, if you want to integrate, for example, with MQ, you need to create your own connector by yourself. You don't need to reinvent the wheel if there is already a certified, supported product that comes with this whole ecosystem around Apache Kafka, or around Kafka itself, that really gives you the power to harness data from other data systems.

0:28:52.1 Rami Al Lolah: I myself see that Confluent connectors, ksqlDB and Kafka Streams are the most powerful components that come in this package, and I don't think you can reinvent them in a way that scales properly; you can use them for the real and the right use cases that you want to do. You don't need to recreate things again. So that's why it was the direction we wanted to move in with the platform, and we wanted to eliminate individual implementations of smaller platforms here and there that don't actually comply with our data strategy and all of that. So we wanted to have a centralized platform. But we didn't want to compromise agility or time to market. Yes.
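
To show why a certified connector beats hand-building one, here is a minimal sketch of provisioning Confluent's JDBC source connector against an Oracle database through the Kafka Connect REST API. Host names, credentials, the table and the topic prefix are placeholders, and the secret reference assumes a file config provider is configured on the Connect workers.

```python
# Minimal sketch: provision a certified JDBC source connector (Oracle example) via the
# Kafka Connect REST API instead of writing a bespoke integration. All values are
# placeholders, not Swedbank's configuration.
import requests

connector = {
    "name": "oracle-payments-source",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB1",
        "connection.user": "kafka_connect",
        # Assumes a file config provider on the Connect worker; adjust to your secrets setup.
        "connection.password": "${file:/secrets/oracle.properties:password}",
        "table.whitelist": "PAYMENTS",
        "mode": "incrementing",
        "incrementing.column.name": "PAYMENT_ID",
        "topic.prefix": "oracle.",
        "tasks.max": "1",
    },
}

resp = requests.post("http://connect:8083/connectors", json=connector, timeout=30)
resp.raise_for_status()
print(resp.json()["name"], "created")
```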

0:29:43.6 Joseph Morais: That's wonderful, and thank you for sharing that. And there's an entire Connect engineering team here at Confluent that would be very delighted to hear what you just said. So thank you. Now, I am fresh out of the Databricks Data and AI Summit, so I'm kind of curious about this. You mentioned all types of integrations, right? Whether it's relational databases, mainframes, et cetera. I'm curious, are you looking to provide or integrate the platform with any cloud service provider or ISV services?

0:30:11.8 Rami Al Lolah: We have today actually integrations with Confluent platform towards Azure for example.

0:30:18.5 Joseph Morais: Okay, great.

0:30:19.5 Rami Al Lolah: And we do have some sort of archival data and things like this. We use Kafka solely, actually, in real time to move this data to the cloud providers that we use in the bank. We actually have Azure as our standard cloud provider. There have also been discussions that we can integrate SaaS services in different other cloud providers with the bank's core services using the Kafka platform. But currently Azure is the main region and site where we actually take data and use it in that sense, and we move it between our on-site, on-prem data center and Azure using Kafka.

0:31:12.1 Joseph Morais: It's great you were able to use Confluent to bridge your on premise to the cloud. And I think that's a challenge that many organizations, financial services or otherwise have. So it's wonderful that you've been successful doing that. So Rami, it's 2025, this is a tech conversation so I have to ask, what is the future of data streaming and artificial intelligence at Swedbank?

0:31:33.9 Rami Al Lolah: So in the Kafka platform team we actually think about what the best ways are to use AI, for example to do analysis on the things that we use today. For example, predicting errors that people hit while using connectors or transformations: basically feeding this data into large models, analyzing it and predicting what the solution could be or what could improve this pattern of integration for these customers. We also think maybe AI could help in the future to ease onboarding onto the platform, by self-creating clients for customers without even the need to program them or write code. It's very early, I think we are not at that stage yet, but I think there are really good steps that we are trying to take in that sense in the bank. And I think it's going to be really fun.

0:32:38.0 Joseph Morais: Yeah, it definitely sounds exciting and frankly you've built this integration platform and it essentially future proofs your initiatives at the bank. Whether it's AI now or quantum computing in five years from now or whatever it is, you have a platform that you can build a connector, use a commercial connector, build an API and really get your data to any type of data system in the future, regardless of what it is, whether it exists today or someone builds it tomorrow.

0:33:06.4 Rami Al Lolah: Exactly, exactly.

0:33:14.6 Joseph Morais: So next up is the runbook, where we break down strategies to overcome common challenges and set your data in motion. So tell me Rami, what led Swedbank to choose the Confluent data streaming platform over various open source and vendor products?

0:33:28.2 Rami Al Lolah: The main reason is that we wanted to make sure that we have a full package of services. We don't want to reinvent the wheel in developing what needs to be developed. For example, we know that we need integrations with data systems beyond just the simple ones; we need, for example, to pull some data from MQ on the mainframe. We need to have connections towards an Oracle database where we publish events or read events from these databases. The problem is that with open source products you cannot do this out of the box. You need to create these things, you need to test them. And most importantly, we work mostly in a federated way, basically. So we have domains, we have product teams that belong to certain domains, technology domains and business domains, and they actually run the development. So these teams expect to focus on business, focus on creating business value and developing their products. They don't want to invest in creating a technology platform in their domain, for example, maintaining it, paying for the infrastructure, bringing in developers to create the required development, dashboards, monitoring, security and these things.

0:34:51.2 Rami Al Lolah: So they don't want this. That was the main reason why we wanted to use the Confluent platform, for the ecosystem that was really built around it. And what I like about it is that they keep the development of this platform consistent, so every time we hear that something good is coming, basically. For example, RBAC versus ACLs, the access control lists: that was a major milestone for us in adopting the platform and deciding that yes, we want to go with this. It opened the doors to multi-tenancy for us, for example, so everyone in the bank can use this platform without anyone compromising data or accessing someone else's data or something like this. And still there are lots of things coming, which is really good. And this is why we like that this product development is consistent and going down the right path.
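
For context on the RBAC capability Rami credits with enabling multi-tenancy, a role binding scoped to a team's topic prefix might be created roughly like the sketch below. The confluent CLI command and flags are an assumption and differ between Confluent Platform and Confluent Cloud and across CLI versions; the principal, role and cluster ID are placeholders.

```python
# Approximate sketch of granting an RBAC role binding so one team can only write to its
# own prefixed topics. CLI flags are assumptions; check your Confluent CLI version.
import subprocess

subprocess.run(
    [
        "confluent", "iam", "rbac", "role-binding", "create",
        "--principal", "User:svc-payments",      # placeholder service principal
        "--role", "DeveloperWrite",
        "--resource", "Topic:payments.",
        "--prefix",                               # scope to the team's topic prefix
        "--kafka-cluster-id", "<cluster-id>",     # placeholder
    ],
    check=True,
)
```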

0:35:54.7 Joseph Morais: Yeah, thank you. That's wonderful. I think that's ultimately what every ISV is trying to achieve. We're trying to put out products that enable you to do something faster so you can focus on the things that only your engineers can do, and that's building specific value for Swedbank's customers; only your engineers can do that. We figured out data streaming just like other companies have figured out their technologies. So why not utilize something that's already battle tested and proven, and get off to a quicker start? So other than Kafka, what is the top tool you rely on for data streaming today?

0:36:36.1 Rami Al Lolah: We don't actually run other data streaming tools today except the Confluent platform. Today our Confluent platform is the main backbone for data streaming in the bank. So it's not only for analytics and ingesting data into the operational data lake and other platforms for downstream applications and reporting and all these things; we are also heavily using the platform for integrations and for migrating legacy applications onto our microservices and development platforms that enable people to create microservices and applications. It also enforces that, if we are using a specific platform and we are satisfied with the capabilities built there, and we are using it and supporting it in the right way and providing the right governance around it, if you like, we don't allow the creation of another shadow data streaming system, basically. That would create inconsistency and force people to create their own siloed data products, and that's what the bank does not want. So they want you to use one platform that governs and controls this capability, and everyone uses it, everyone shares information with it, everyone speaks the same language and uses the same data. There are no side roads. Basically, it's the highway that you need to drive on.

0:38:11.4 Joseph Morais: Yeah, Swedbank's all in on Confluent and that warms my heart. So thank you, that's a fantastic answer. So I'm curious through the approach and converging on data streaming. Are there any tools or approaches that you actively avoid through those learnings?

0:38:29.7 Rami Al Lolah: To be frank with you, as a lead architect for integration in general, we don't say no to everything, basically. I mean, we need to be very realistic here: data streaming is for data that you need to really move on, events that you need to know about immediately, critical data that you want to make decisions on. And that is the future, that is what we do today, and that is what most of the products, product teams and services in the bank aim for and want to be in, basically. But there are still traditional integrations alongside that. There is still MQ, we still have file-based integration. We want to move away from these, but we also want to make sure that the business case to move away from them is sufficient. So I don't accept people being onboarded to the data streaming platform because they exchange three messages every day, basically, for just an integration. That is not data streaming. And we don't accept that people go to MQ and ship 300 or 1 million messages per second between systems on MQ.

0:39:51.8 Rami Al Lolah: We don't allow that either. So we want to make sure that the right use case goes to the right pattern and the right platform. And I think Kafka and Confluent today are basically serving this purpose, in this sense, for everything that is related to data streaming and real-time event streaming, publish/subscribe for exchanging information between two applications; that is also adaptable and we are using it for that. By the way, I think part of the tech and data strategy, the guidelines and directives that we have, is that we also want to try to move away from traditional message queuing, for example, and file-based integration. We want to use Kafka, for example, for this. And we are trying today to build up the patterns and document them, to make sure that we can use these patterns in a secure and compliant way to replace, for example, an MQ integration with Kafka, properly and securely for everyone.

0:40:57.0 Joseph Morais: Yeah, I really like that approach. You don't say no, you instead say let's use the right tool for the use case. It's like more about building guardrails as opposed to stop signs. I really do like that. All right, so we've talked about the tech tools and tactics. But none of that moves the needle without the right people behind it. Let's dive into how you got Swedbank to fully commit to data streaming. So how did you convince leadership to get on board with your solution? Was it smooth sailing or was it more of a bit of a roller coaster?

0:41:33.2 Rami Al Lolah: That's a very interesting story, because I think we didn't need much effort to convince people to use it. When I joined the bank, one of my responsibilities was actually to build some sort of a target view of what we lack in the integration area in terms of capabilities we need. And I did lots of interviews with lots of architects, product owners and chief product owners. And I would tell you that the majority, I would say 90%, of the feedback I got was that they are missing an event streaming and data streaming capability today, and they have lots of use cases they want to do. The problem was that we used Confluent Kafka at that time only for analytics and stream analytics and those use cases. We did not allow anything to be built on the platform beyond that. So if you had a microservice and you wanted to integrate it, you would do that with MQ, and you would live through difficult times there. It's not designed for this, basically. So people wanted a real data streaming platform, one that is fully automated, where people can onboard and do things without the need to wait behind an IT team to do it for them.

0:43:04.4 Rami Al Lolah: So it was not very hard. We took the case, we built the case and we took it to our stakeholders, and we said, okay, we want to take this Confluent platform that was built for analytics and we want to expand its scope. And we want to create a bigger team around it and also supply people with support on how to create and do these things. It didn't actually take too much thinking, basically. It was logistics thinking, I mean, who is going to move, who is going to join here and there, basically. But the concept itself was highly demanded and people really wanted to do it and make it happen. So it was not hard, it was smooth. It took a little bit of time, but it eventually happened.

0:43:54.6 Joseph Morais: No, but that's a really valuable answer, because we've had different guests who were at different parts of that journey. For some, there was no data streaming in the company at all. In your case, data streaming was already adopted for a very specific use case, but for you it was about how to broaden that adoption, because you could do so much more with it. And I'm glad that you didn't run into too many challenges.

0:44:18.9 Rami Al Lolah: No, no. 

0:44:19.6 Joseph Morais: The other thing that gets mentioned when we talk about adoption is, you know, having our documentation. For companies that don't necessarily have the staff, we have professional services and we have all this training. So all of this knowledge around Confluent, I think, is also a real boon to anyone who's trying to adopt it. Now I'm curious, can you share a specific tactic that has significantly improved the adoption of data streaming at Swedbank? You talked about the challenge; what was the tactic? What did you use to get that adoption improved?

0:44:51.7 Rami Al Lolah: When I joined the bank and I started talking with people about Kafka, or Confluent Kafka, one of the main painful things they were talking about was: okay, if I want to set up security for Kafka, for example using Kerberos, then I need to order AD groups and I need to order service accounts and they need to start creating keytabs. And they said, okay, that's a long process, it takes weeks, and it's a headache. Do you have any other ways to do that? Can we use certificates, for example, for the authentication? But when RBAC was introduced, I mean, that simplified the process a lot and gave us the opportunity as an enterprise Kafka team to build automation around this. So we created pipelines that feed the platform with orders. For example, I want to create a topic. Today you create the topic via a pipeline; you don't actually use Control Center. So you access the pipeline with your own users, with all the monitoring and all these things, and you onboard to the platform by creating your own service account.

0:46:06.0 Rami Al Lolah: We have APIs now that can create AD groups and things. I think the main tactic was: how can we make the platform as easy as possible to self-onboard? And that was a game changer for product teams, because they no longer have to wait for AD groups to be created or something. They don't have to contact the Kafka team, for example, to map RBAC to certain topics or something. They can do that themselves via the pipelines and the automation, without even asking someone else. So people really wanted to do that, and they started onboarding one by one. And then, when they realized that this is something that actually works, they asked: why do we need to invest in creating an open source system in our domain and invest in bringing developers and things? No, we will give this team what they want and we will use this as a service. That is the main marketing point that we had: we are providing Kafka as a service for our teams in the bank. It's much more like the cloud concept, if you would...
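
To illustrate the self-service onboarding idea, here is a hypothetical sketch of a pipeline step that validates a declarative request against a few guardrails (naming, declared ownership, a registered schema) before provisioning the topic. The request fields and rules are invented for the example and are not Swedbank's actual pipeline.

```python
# Illustrative sketch of a self-service onboarding step: validate a declarative request
# against guardrails, then provision the topic. Fields and rules are hypothetical.
import re
from confluent_kafka.admin import AdminClient, NewTopic

REQUEST = {
    "domain": "payments",
    "topic": "payments.settlements",
    "owner": "team-payments@example.com",
    "schema_subject": "payments.settlements-value",
    "partitions": 6,
    "retention_ms": 7 * 24 * 60 * 60 * 1000,
}

def validate(req: dict) -> list[str]:
    errors = []
    if not re.fullmatch(rf"{req['domain']}\.[a-z0-9-]+", req["topic"]):
        errors.append("topic must follow '<domain>.<name>' naming")
    if not req.get("owner"):
        errors.append("an owner must be declared before provisioning")
    if not req.get("schema_subject"):
        errors.append("a registered schema subject is required (shift left)")
    return errors

problems = validate(REQUEST)
if problems:
    raise SystemExit("onboarding rejected: " + "; ".join(problems))

admin = AdminClient({"bootstrap.servers": "localhost:9092"})  # assumption: local cluster
futures = admin.create_topics([NewTopic(
    REQUEST["topic"],
    num_partitions=REQUEST["partitions"],
    replication_factor=3,
    config={"retention.ms": str(REQUEST["retention_ms"])},
)])
futures[REQUEST["topic"]].result()
print(f"provisioned {REQUEST['topic']} for {REQUEST['owner']}")
```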

0:47:20.1 Joseph Morais: Yeah, that's a really strong set of tactics. I mean basically you know, fully utilizing all the features of Confluent. That was one of them. Building automation and abstraction away from the actual platform itself to make it self service is another, really making it even easier to adopt. That's a fantastic tactic. Now let's shift gears and dive into the real hard hitting content. The data streaming meme of the week.

0:48:00.2 Rami Al Lolah: Yes, I think it's a spot-on meme. There is no business in this world that wants to have old data, yesterday's data, receive it today and then have to decide on it or build business decisions on top of it based on that data. So I think most of today's integrations actually demand real-time data streaming to make proper business decisions in a timely fashion. Compare that to a Formula One race, for example, where the cars are equipped with thousands of sensors and send you the data in real time so you can take immediate decisions on what to do on the next lap. Should we take on some fuel? Should we change tires? Should we do some fine tuning to the engine or something? It actually creates a difference of milliseconds between the winner and the loser. And it's exactly the same thing with financial services today. If you are not up to date with the data, if you cannot take decisions, if you cannot personalize the experience and predict what your customer wants, you will lose that customer on the next hit. He will click on that button trying to understand, for example, what's the best credit card he can use, and if he does not get the right information on time and the right business decision for it, he will click on the next link and move to your competitor. And that is something most people today have started to realize: it's no longer an option, it's a necessary step that needs to be taken. So I very much agree with this.

0:50:00.3 Joseph Morais: I'm glad that one really resonated with you. So before we let you go, we're going to do a lightning round: bite-sized questions, bite-sized answers, that's B-Y-T-E, like hot takes, but schema-backed and serialized. Are you ready, Rami?

0:50:18.5 Rami Al Lolah: Yes.

0:50:19.6 Joseph Morais: What is something you hate about IT?

0:50:22.5 Rami Al Lolah: I think the excessive evolution of technology, which has become so advanced and, in some sense, so frightening. I think we really need to be very careful with adopting new technologies, and make sure that we are on the right moral and ethical path in how we do things and how we adopt them. So it is very nice that we see all these rapid developments, but we need to be very careful and really sure that we are employing these technologies on the right path.

0:51:01.0 Joseph Morais: Yeah, that makes sense to me. What's the last piece of media you streamed?

0:51:04.8 Rami Al Lolah: It was actually a media that I streamed in the bank for our fellow architects and fellow product teams. It was a huge crowd, around 700, 800 people. And we were discussing data streaming and event streaming for our enterprise Kafka platform in the bank. It was a deep dive, technical discussion and presentation around this.

0:51:31.1 Joseph Morais: Very appropriate for this setting. What's the hobby that you enjoy that helps you think differently about working with data across a large enterprise?

0:51:40.1 Rami Al Lolah: As I said in the previous question, I really like watching the Formula One championship. It always fascinates me how they build this technology, how they use this data, this massive amount of data that can be fed into systems and give them the right way to decide on things. It's really becoming, you know, the separator between who wins and who loses, basically: who has the latest real data that can drive the right decision.

0:52:12.0 Joseph Morais: Right. Who has the freshest data is often the winner.

0:52:14.6 Rami Al Lolah: Absolutely, yeah.

0:52:15.7 Joseph Morais: Can you name a book or resource that has influenced your approach to building event driven architecture or implementing data streaming?

0:52:22.6 Rami Al Lolah: When I started looking at this around three, four years ago, there was a book called Streaming Integration, by, I think, Steve Wilkes, and it was really a good book about data streaming and streaming things. There was an interesting section in that book, from before people started talking about AI and machine learning, a complete section that talks about how, for example, Kafka can be used to feed real-time data into these large models. So instead of feeding static data, as we do today, for example uploading static data via BDFS or something and asking AI to summarize or understand things, it is more powerful to feed this data in real time into these machine learning models. So that was a very, very unusual section: it talks about how, if you want machine learning to analyze unpredicted patterns, for example, patterns that no human can actually understand or predict, you can feed this real-time data into the... It was a really interesting read. I recommend it, I recommend that section. I know a lot of people know the whole stuff, but that section was really interesting in that book.

0:53:47.2 Joseph Morais: Great. So what is your advice for a first time Chief Data Officer or someone with a similar impressive title?

0:53:52.7 Rami Al Lolah: Yeah, so I think I... Maybe I am not the right person here. That....

0:54:00.1 Joseph Morais: Your learnings are just as valid as anyone else's.

0:54:02.2 Rami Al Lolah: Exactly. Yeah. But I think one of the main reasons is that we have a really great team today that has ambitions to build more and more data governance and make sure that data quality is employed in the bank and so on. But if I were speaking directly to a first-time Chief Data Officer, I would say first, maybe there is no need to rush into tools and technology. I think the biggest value early on will come from creating clarity around roles, ownership and accountability before investing in technology, before investing in any new tools or capabilities. I think the second thing is that I would ask them to treat data governance as a business enabler, not as a control function. If governance is framed as a blocker, people will run away from it and try to avoid it, and I have seen that in lots of organizations previously, before I joined. So I think if you position it in a way that helps teams and moves them faster, and builds trust, it will definitely come out with a different result.

0:55:19.0 Rami Al Lolah: Another tip is to start small but make it visible. So you can, for example, pick one of these critical business processes, try to apply these governance rules and principles to it, and make it a flagship of success, a success story; then everyone else will be encouraged to follow. I think the last tip for a Chief Data Officer is that it's good to build bridges with IT, and not only IT, maybe security, legal and architecture teams as well, from day one, because they will need these partners to ensure policies are practical and the implementation is technically feasible, for example. So they need all these stakeholders to make sure that this data governance and this data strategy will be fulfilled, and that there will be people behind this, supporting it.

0:56:23.7 Joseph Morais: That's really great advice. So any final thoughts or anything to plug?

0:56:31.1 Rami Al Lolah: Not really. It was really amazing to join your podcast and that was a really amazing opportunity.

0:56:39.7 Joseph Morais: All right. Well, thank you so much for joining me today, Rami. And for the audience, please stick around after this because I'm giving you my three takeaways, my three top takeaways in two minutes. Man, that was a cool, great conversation with Rami. Here are my top three takeaways. First: democratize the reach to data, no data silos. I thought this was such an incredible task. I mean, of course Rami is an integration engineer, so that is literally his job. But the idea of using data streaming to democratize that data, right, to make it available to those who didn't have access to it before, maybe removing some of the sensitive parts through stream processing so it can be shared, is a very powerful paradigm. Second, something that Rami mentioned and I really love: the slower the data, the more the risk. I might have summarized it that way, but it was his point, and I absolutely love it because it's true. I mean, think about it. It's financial services. The slower you react to a deposit, a withdrawal, a fraudulent charge, all of that creates risk. Risk that you'll duplicate a deposit or duplicate a withdrawal, or that a person will not be able to respond to some type of cybercriminal or some type of identity theft quickly enough to actually mitigate the potential damage.

0:58:07.9 Joseph Morais: So the slower the data, the more the risk. That's a really good one. And then third: without real-time data, you cannot make proactive decisions. I thought this was really powerful because it's true. I mean, the only thing better than real time is having data from the future, and we haven't gotten there yet. Maybe when we get quantum computers. But being able to make proactive decisions, being able to change an offer or respond to a new customer need: without real-time data, you can't do any of that. That's it for this episode of Life is But A Stream. Thanks again to Rami for joining us, and thanks to you for tuning in. As always, we're brought to you by Confluent. The Confluent data streaming platform is the data advantage every organization needs to innovate today and win tomorrow. Your unified solution to stream, connect, process and govern your data starts at confluent.io. If you'd like to connect, find me on LinkedIn. Tell a friend or coworker about us, and subscribe to the show so you never miss an episode. We'll see you next time.