Time for an Enterprise data paradigm shift

Looking back at the significant data trends over the last 20 years, we have moved from relational databases to data warehouses, data lakes, and now data mesh. We can insert a few more concepts in between, like non-relational databases (NoSQL), data virtualization, the move from on-premises to the cloud, and more. But have we succeeded and made significant progress in managing and mastering Enterprise data? The results are somewhat mixed.

On paper, data mesh is an attractive idea and makes sense, with its concepts of domain owners, data-as-a-product, self-service, and federated data governance. But implementing it will be challenging and take a long time. Not to mention that getting there one hundred percent is probably an illusion.

Many data simplification and rationalization projects have delivered too little, if they have not failed outright. There are multiple reasons for this. First, let’s recognize that it’s a complicated problem to solve. Second, there is always something more important to do in terms of new critical data requirements. Third, data requirements keep evolving, and there is no perfect “master” data model that can handle everything. I’ll stop here, as my objective is not to provide an exhaustive list.

Maybe it is time to consider a different data paradigm and approach the problem from a different angle. While we should continue to simplify the data landscape and put better data governance in place – I would definitely push the concept of data mesh – we should also recognize that this will be a very long journey and that some data “mess” will remain for a very long time, if not forever. So why not acknowledge and accept it and link all this data “mess” together via an abstraction layer? And I am NOT talking about data virtualization like Denodo and others would think about it.

This is where the latest advancements in artificial intelligence will play a crucial role. What I am proposing here would not have been possible 5-10 years ago. We need two things: 1) an intelligent engine to connect disparate databases and 2) some generative AI to help the user, whether a human or a machine calling through APIs, get the data she needs from these disparate databases. These tools exist today and could be deployed across the Enterprise. I am currently discussing the idea with a Swiss startup, and a proof of concept could be running within a few months. A full deployment could also be very fast.
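
To make this more concrete, here is a minimal sketch of what such an abstraction layer could look like. This is my own illustration, not the startup’s design: the in-memory SQLite databases stand in for disparate Enterprise systems, the catalog stands in for what the “intelligent engine” would discover, and llm_generate_sql is a stub for the generative AI step.

```python
import sqlite3

# In-memory SQLite databases stand in for disparate Enterprise systems.
DATABASES = {
    "crm": sqlite3.connect(":memory:"),
    "core": sqlite3.connect(":memory:"),
}
DATABASES["crm"].execute("CREATE TABLE clients (client_id, name, segment)")
DATABASES["crm"].execute("INSERT INTO clients VALUES (1, 'Alice', 'UHNW')")
DATABASES["core"].execute("CREATE TABLE positions (client_id, asset, market_value)")
DATABASES["core"].executemany(
    "INSERT INTO positions VALUES (?, ?, ?)",
    [(1, "AAPL", 120.0), (1, "T-Bond", 80.0)],
)

# Hypothetical catalog the "intelligent engine" would build and maintain.
CATALOG = {
    "crm": {"clients": ["client_id", "name", "segment"]},
    "core": {"positions": ["client_id", "asset", "market_value"]},
}

def llm_generate_sql(question: str, catalog: dict) -> dict:
    """Stub for the generative-AI step: map a natural-language question to
    per-database SQL using the catalog. A real engine would prompt an LLM;
    here we return a canned plan for demonstration purposes."""
    return {
        "core": "SELECT client_id, SUM(market_value) FROM positions GROUP BY client_id"
    }

def answer(question: str) -> list:
    """Run the generated plan against each database and merge the rows."""
    rows = []
    for name, sql in llm_generate_sql(question, CATALOG).items():
        rows.extend(DATABASES[name].execute(sql).fetchall())
    return rows

print(answer("What is each client's total market value?"))  # [(1, 200.0)]
```

In a real deployment, the hard problems sit around that stub: catalog discovery, validating the generated SQL, and access control across all those databases.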

This could revolutionize our thinking about Enterprise data. Stay tuned. I will continue to discuss this topic over the coming weeks and months.

#digitaltransformation #datamesh

Back to building monolith applications?

Most experienced engineers would likely recommend implementing microservices, APIs, and serverless architectures in the current technology landscape, and few would dare to talk about building a monolith. But that is what one Amazon Prime Video team did because of excessive infrastructure costs and scaling bottlenecks. By ditching serverless, microservices, and AWS Lambda, they cut their infrastructure costs by 90% and solved their performance issues. Read the full story here.

Implementing the latest infrastructure and development concepts for the sake of it does not necessarily bring the best solutions. It’s like pushing database normalization to the extreme, to the detriment of performance. Denormalization is often necessary.
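
As a toy illustration of that trade-off (my example, not from the Prime Video story): in a fully normalized schema, every read of an order with its customer name pays for a join, while a denormalized copy answers the same question from a single table, at the cost of duplicated data to keep in sync.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    -- Normalized: order rows reference customers, so reads need a join.
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER,
                         total REAL);
    INSERT INTO customers VALUES (1, 'Acme');
    INSERT INTO orders VALUES (10, 1, 99.0);
    -- Denormalized: the customer name is copied onto each order row,
    -- trading storage and update cost for cheaper reads.
    CREATE TABLE orders_denorm (id INTEGER PRIMARY KEY, customer_name TEXT,
                                total REAL);
    INSERT INTO orders_denorm VALUES (10, 'Acme', 99.0);
""")

# Same question, two costs: a join versus a single-table read.
print(db.execute("""SELECT c.name, o.total FROM orders o
                    JOIN customers c ON c.id = o.customer_id""").fetchall())
print(db.execute("SELECT customer_name, total FROM orders_denorm").fetchall())
```

The Prime Video team’s actual fix was architectural rather than relational, but the lesson is the same: measure the cost of the “pure” design before committing to it.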

Every case is different when building a [new] IT solution and comes with its own requirements. Different IT architectures will have various pros and cons, and no solution will be perfect. You must challenge your team to think out of the box, assessing “modern” ways of doing things while not excluding traditional ones. You must also recognize the existing IT landscape and technical debt, because it’s not like you can erase everything and start from scratch. If necessary, build a minimum viable product to prove your proposed architecture is scalable and delivers the required performance.

Paradigm shift to replace the legacy technology stack of banks [and wealth and asset managers]

The McKinsey & Company article “Banks’ core technology conundrum reaches an inflection point” presents an insightful perspective on the core technology challenge that banks are currently facing, which is now reaching a critical point. While this issue has been discussed for a long time, two factors are making the situation more pressing than ever. Firstly, banks will soon face a talent shortage in their legacy technologies. At the same time, they will have to fight for talent in new technologies. Both talent shortages will put significant pressure on their ability to maintain and evolve their systems. Secondly, legacy technologies are consuming a growing share of banks’ budgets, leaving them with limited resources to pursue strategic initiatives that can drive innovation and transformation.

Another interesting element discussed in this article is how Thought Machine is thinking about solving part of the problem by running products as code and making them independent of the platform, which, for incumbent banks, can be composed of tens if not hundreds of different systems: “We have a system of smart contracts that run on the platform, but they’re separate from it,” says Paul Taylor, founder and CEO of Thought Machine. Brian Ledbetter, a senior partner at McKinsey & Company, also brings up the concept of putting risk controls in code, not in processes, for risk management. After infrastructure as code, which we have been discussing for quite some time, we are adding controls as code and products as code: significant paradigm shifts that are complicated for incumbent banks still dealing with mainframes and systems that are 20+ years old.
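
Thought Machine’s actual smart-contract API is not described in the article, so the following is only an illustrative sketch of the “products as code” and “controls as code” ideas: a deposit product expressed as parameterized code that carries its own risk control, independent of whichever platform executes it.

```python
from dataclasses import dataclass

@dataclass
class SavingsProduct:
    """Illustrative 'product as code': the product is its parameters plus
    its behavior, deployable independently of the underlying platform."""
    name: str
    annual_rate: float            # e.g., 0.02 for 2% p.a.
    max_daily_withdrawal: float

    def daily_interest(self, balance: float) -> float:
        """Behavior shipped with the product, not buried in a core system."""
        return balance * self.annual_rate / 365

    def check_withdrawal(self, amount: float) -> None:
        """Illustrative 'control as code': the risk rule is executable and
        versioned with the product, not described in a process manual."""
        if amount > self.max_daily_withdrawal:
            raise ValueError(f"{self.name}: {amount} exceeds daily "
                             f"withdrawal limit {self.max_daily_withdrawal}")

product = SavingsProduct("Basic Savings", annual_rate=0.02,
                         max_daily_withdrawal=5_000)
print(round(product.daily_interest(10_000), 4))   # 0.5479
product.check_withdrawal(1_000)                   # passes; 6_000 would raise
```

The point is not the few lines of Python but the operating model: a new product or a changed control becomes a tested, versioned code deployment rather than a multi-system change program.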

The challenge of legacy IT stacks and technical debt for incumbent banks has been discussed for decades. Incumbent banks must not look at this as a systems replacement, but as an enabler and a necessity for their future. To be successful, incumbent banks must educate their business on technology, have a technology talent strategy, and bring people to the center of their digital transformation.

Interesting video from the CEO of Thought Machine.

Risk Management

Once in a while, I will discuss some topics that are not totally related to the digital transformation of wealth and asset managers. This is one, even if I could argue that a digital transformation cannot be run without taking and managing some risks.

If we want to talk about risk, not many activities are more dangerous and relevant than mountain climbing. Jimmy Chin, an American professional mountain athlete, photographer, film director, and author, discusses risk management in the context of climbing at a Goldman Sachs talk. 

The first element he brings up is embracing failure, especially since there are a lot of failures in climbing. His team’s first attempt to climb Meru Peak (a mountain in the Garhwal Himalayas) failed. On their way down, they were already making decisions about things they had to change for their next expedition, like being lighter and taking warmer sleeping bags.

A second element is embracing the process, managing the variables you can control, and identifying those you cannot. By embracing the process, you focus on everything you need to pull together to succeed, not only on the ultimate goal (in Jimmy’s case, reaching the summit). That’s how you get there.

The third element is fear, which can be healthy: it sharpens the senses and motivates. Fear is not helpful when it becomes paralyzing or turns into panic.

Not surprisingly, a key component of risk management is anticipating all the potential problems that can emerge and having pre-defined solutions. When a risk materializes, it is essential to take stock of the situation, distinguish the perceived risks from the actual ones, and then really focus on the actual risks.

Jimmy also brings up the notion of trust and of understanding how people function in different situations.

Reference:
Listen to the talk; there is much more there. Goldman Sachs, Talks at GS with Jimmy Chin: https://www.goldmansachs.com/insights/talks-at-gs/jimmy-chin.html

2023 Gartner Emerging Technologies and Trends Impact Radar

Gartner has released its 2023 Gartner Emerging Technologies and Trends Impact Radar. Let me try to give it a read with the eyes of a wealth or asset manager.

Artificial Intelligence is all over the radar, with Foundation Models, Self-Supervised Learning, Generative AI, and more. Wealth and asset managers must seriously look into Artificial Intelligence and understand how it can be leveraged across their value chain, from investments to operations, risk management, and compliance. Part of the solution will come from their solution providers, like Bloomberg, BlackRock Aladdin, or State Street Alpha, to name a few. But wealth and asset managers cannot rely only on their providers. Instead, they must acquire the necessary skills and talent and experiment with artificial intelligence technologies themselves. The war for talent will make this complicated.

Blockchain is also quite present, with Web3 and Tokenization both in the 3-6-year horizon. Most wealth and asset managers are already testing tokenization in one way or another, and they should continue. Tokenization will bring many benefits to the industry: speeding up transactions, eliminating some intermediaries (and therefore reducing costs), opening some asset classes to smaller investors, improving the liquidity of some assets, and more. It will, however, require some industry alignment and standards.

No surprise, Digital Twins are here too. Gartner is probably thinking more about Digital Twins in the context of manufacturing and industrial activities. But as discussed in this blog, the potential for digital twins in the financial industry is real and massive.

Then there is hardware and infrastructure, with Neuromorphic Computing, 6G, and Hyperscale Edge Computing. If wealth and asset managers continue to move to the cloud, they will be able to leverage these latest hardware technologies as they become available there.

Using digital twins in wealth and asset management

I have always been fascinated by digital twins and the potential they offer. There are many examples of companies using them. BMW has partnered with NVIDIA and uses real-time digital twin factories to optimize its production and conduct predictive maintenance. Emirates Team New Zealand uses digital twins to design and test its boats. SpaceX uses a digital twin of the Dragon capsule to monitor and adjust trajectories, loads, and propulsion systems. McKinsey says that companies can achieve ~50% faster time-to-market, ~25% improvement in product quality, and ~10% revenue uplift with digital twins.

Let’s start with some definitions. What is a digital twin, and how does it differ from simulations and standard CAD (Computer-Aided Design)? Simulations are usually limited to one process (i.e., narrow scope) and do not leverage real-time data. In contrast, a digital twin is a virtual representation of a real, complete system, fed with real-time data and lasting the system’s entire lifecycle. Digital twins allow rapid iterations and optimization of the system. The next big thing is linking digital twins to augmented and virtual reality, interconnecting digital twins, and finally creating the enterprise metaverse.

How about we use digital twins in wealth and asset management? I am hesitant to say that we have already been using them for a long time to model portfolios, test investment strategies, and assess the impact of certain events, to name a few examples. A “purist” might say these are more simulations than digital twins. And this is correct in many cases. The backtest of a portfolio is a simulation. But when an asset manager builds models to optimize portfolios daily, using near-real-time data, it’s getting very close to being a digital twin.
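
Here is a minimal sketch of that distinction, under my own assumptions: the toy twin below is fed price ticks throughout the portfolio’s life and keeps a target-versus-actual view current at all times, whereas a backtest would consume a fixed historical dataset once and stop.

```python
class PortfolioTwin:
    """Toy digital twin of a portfolio: it is fed (near-)real-time price
    updates over its whole life and keeps a target-vs-actual view current."""

    def __init__(self, holdings: dict, target_weights: dict):
        self.holdings = holdings        # asset -> number of units held
        self.targets = target_weights   # asset -> target weight
        self.prices = {}                # asset -> latest observed price

    def on_price(self, asset: str, price: float) -> None:
        """Ingest one market-data tick and refresh the twin's state."""
        self.prices[asset] = price

    def drift(self) -> dict:
        """Actual weight minus target weight per asset, at latest prices."""
        values = {a: self.holdings[a] * self.prices[a]
                  for a in self.holdings if a in self.prices}
        total = sum(values.values())
        return {a: values[a] / total - self.targets[a] for a in values}

twin = PortfolioTwin({"equities": 100, "bonds": 400},
                     {"equities": 0.6, "bonds": 0.4})
twin.on_price("equities", 50.0)   # e.g., ticks from a streaming data feed
twin.on_price("bonds", 10.0)
print(twin.drift())  # {'equities': -0.044..., 'bonds': 0.044...}
```

The line between simulation and twin is then mostly about data freshness and lifecycle: feed this object a static CSV once and it is a backtest; feed it a live market-data stream for years and it starts to deserve the name digital twin.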

Traditional wealth and asset managers have yet to fully leverage the potential of digital twins. Their use of data is still limited (less so for quantitative asset managers): they could leverage more near-real-time and alternative data, and they could apply data across their entire value chain, from market research to portfolio construction, product development, marketing, and sales and distribution.

Many solutions are available to wealth and asset managers to use and leverage more data. But it requires more than tools. It requires technical skills and talent. Investment teams must have developers within their teams to use advanced market research solutions. Product Development teams must learn data science (e.g., Python) to get the best out of markets’, customers’, and competitors’ data. And the IT team must support these platforms.

Another challenge is where to start and how to build digital twins. A mistake would be to try to build a full-fledged digital twin at once. It is better to start small and evolve that first version. A good suggestion is to run hackathons to develop prototypes quickly and test the initial concepts. And the beauty of a hackathon is that you get a multi-disciplinary team working together: portfolio managers, product development people, and engineers.

To be successful, wealth and asset managers must make this a Firm objective, driven from the top. They must invest in talent and team upskilling, and ensure the right innovation culture is in place.

Let’s look at the digital transformation of other industries – The Washington Post

Looking at other industries to think about innovation and how to leverage technology in wealth management is always interesting. In that context, the digital transformation journey of the Washington Post, led by Shailesh Prakash, is very insightful.

They quickly recognized that The Post needed to achieve excellence in both journalism and technology. It was a radical transformation: moving the IT department from a mindset of babysitting IT systems (systems in which the newsroom staff had very little confidence) to a product development mindset of building and inventing digital products. Part of the journey was adopting agile and colocating the engineers with their partners from the newsroom. They decided to build versus buy, set up a culture of fast experimentation and innovation, and developed an obsession with products. Attracting and retaining the best engineers was key in that journey.

Through their excellence in technology, they built a set of tools they could sell to other publishers: Arc Publishing (Arc XP) was born, generating tens of millions of dollars of revenues for The Post.

Deep dive into the digital transformation of the Washington Post with the University of Virginia case study. There are also plenty of resources on the web; just search for Shailesh Prakash (who is now at Google). If you like this digital journey, I also recommend reading the Goldman Sachs Digital Journey case study from Harvard Business School. A great read.

Some thoughts on innovation

Trying to put together some thoughts about innovation, I have reread some chapters of “Non-Bullshit Innovation” by David Rowan, a book I recommend reading. I particularly like the chapters about Autodesk (“Find your blind spot”) and ARUP (“Empower your team”).

Without trying to be exhaustive or pretending this is “the answer,” here are a few simple principles for innovating, adapting, and potentially surviving:

  1. Fund long-term experiments;
  2. Be obsessed with the future;
  3. Have a lab, change the culture, and show that taking some risks is fine (and necessary);
  4. Get involved out of curiosity, not for public relations;
  5. Follow the 3-horizon framework: 1. Maintain today’s core business; 2. Nurture emerging businesses that could become significant; 3. Conceive new future businesses in a more speculative way;
  6. Create organizational tensions that challenge status-quo thinking (link this back to the 3-horizon framework);
  7. Make sure you keep up with the speed of change in the industry. Otherwise, you will fall behind;
  8. Innovation must make it to the real world. Otherwise, it is not innovation;
  9. Tell stories, great stories.

Now, take these nine principles and turn them into questions, e.g., do we have a lab? Do we fund long-term experiments? Are we obsessed with the future? And so on. Where do you stand in terms of innovation?

Recommended reading: Non-Bullshit Innovation: Radical Ideas from the World’s Smartest Minds

Who will pick up the bill when your smart contract is buggy and everything is lost?

I have always thought people underestimate the challenge of blockchain and smart contracts. While smart contracts have many benefits, like information security, no third party required to verify authenticity, efficiency in execution, and much more, they are still code: code that can be complicated, buggy, and lead to disasters. Chasing bugs on the blockchain (MIT Technology Review) provides examples of such disasters, where tens of millions of dollars have been lost and cannot be recovered.
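
To illustrate the kind of bug that leads to such disasters, here is a deliberately simplified sketch in Python (not a real smart-contract language): the classic reentrancy flaw, where funds are paid out before the state is updated, so a malicious recipient can drain the vault by re-entering the withdraw function.

```python
class VulnerableVault:
    """Deliberately simplified sketch of the classic reentrancy flaw seen
    in real smart contracts: the payout happens BEFORE the balance is
    zeroed, so a malicious recipient can re-enter withdraw() and get paid
    repeatedly from the same balance."""

    def __init__(self):
        self.balances = {}

    def deposit(self, user, amount):
        self.balances[user] = self.balances.get(user, 0) + amount

    def withdraw(self, user, send):
        amount = self.balances.get(user, 0)
        if amount > 0:
            send(user, amount)           # BUG: external call first...
            self.balances[user] = 0      # ...state update second.

    def withdraw_fixed(self, user, send):
        amount = self.balances.get(user, 0)
        if amount > 0:
            self.balances[user] = 0      # Fix: update state first, then pay
            send(user, amount)           # ("checks-effects-interactions").

vault = VulnerableVault()
vault.deposit("attacker", 100)
payouts = []

def evil_send(user, amount):
    """The attacker's receive hook: it re-enters withdraw() while the
    vault still shows the old balance."""
    payouts.append(amount)
    if len(payouts) < 3:
        vault.withdraw(user, evil_send)

vault.withdraw("attacker", evil_send)
print(sum(payouts))  # 300 drained from a 100 deposit
```

On a blockchain, the vulnerable version cannot be patched after deployment, which is exactly why a second pair of eyes before launch is worth paying for.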

That opens the door to a new business opportunity that will attract many players over the coming years: auditing blockchains and the code of smart contracts. Not making them bulletproof or guaranteeing that there will be no bugs, but ensuring the code is robust and that smart contracts will behave as expected. The challenge: finding the right talent to staff these “audit teams” with top-notch engineers who can make a difference.

I am not saying you should not push blockchain and smart contracts. But make sure you have an experienced team building them, and consider having a second pair of eyes look at the code. Because once they are out there, it might be too late to make changes and avoid a disaster; by the time you learn about the problem, the money may already be gone.