A Modern Data Platform for Investment Decision-Making

  • Michelle Schulz
  • 3 days ago
  • 4 min read

“This project was more than just a technical implementation for me – it was one of my first major projects where we didn’t just deliver a solution, but also built real trust. We started with a clearly defined use case, but it quickly became clear that it was really about much more – data, processes, governance, and ultimately how decisions are made.”

Georgi Pargov

Project lead and Analytics Engineer


With every step, it became evident that the platform not only addressed an existing problem, but also unlocked new opportunities. That was the moment when a project turned into a long-term collaboration.


The challenge

Before the project started, core investment processes at the client were still heavily Excel-based. This led to manual effort, versioning issues, inconsistent figures, and a high risk of errors.

Our solution

We built a scalable data platform on Databricks – including automated pipelines, centralized data storage, governance with Unity Catalog, and interactive decision support via Streamlit.

At the same time, we designed the data structure in a way that allows it to be used directly for advanced analytics and AI use cases, without requiring additional data preparation.

Enterprise Investment Analytics Platform – Governed Lakehouse Architecture

The result

  • Single source of truth for investment data

  • Faster, consistent analyses and reporting

  • Reduced operational risk

  • Greater autonomy for investment teams in scenario analysis and portfolio simulations

  • Structured platform for future AI analytics and experimentation

  • A long-term, scalable platform that can easily adapt to future requirements

To achieve this, we structured the implementation into three steps.

Step 1: Moving beyond Excel – building a scalable foundation

The first phase focused on migrating core investment workflows from Excel into Databricks. At that time, Unity Catalog had not yet reached its current level of maturity, so the focus was on fast value delivery: automated pipelines, centralized data storage, and reproducible analytics.

This step alone created immediate value:

  • A central single source of truth for investment data

  • Less manual effort for investment teams

  • Faster access to consistent, up-to-date information

  • Reduced operational risk in daily reporting

Most importantly, investment managers were able to spend less time reconciling numbers and more time interpreting them.
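The core of this first step can be sketched in a few lines: consolidating competing Excel versions into one authoritative record per key. The record structure and figures below are purely illustrative (the real platform stores these as governed Databricks tables), but they show the "latest version wins" rule the pipelines enforce.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record structure; in the real platform these live in Delta tables.
@dataclass(frozen=True)
class Position:
    instrument: str
    as_of: date
    market_value: float
    version: int

def consolidate(rows: list[Position]) -> dict[tuple[str, date], Position]:
    """Keep only the latest version per (instrument, as_of) key,
    mimicking the single source of truth the pipelines enforce."""
    latest: dict[tuple[str, date], Position] = {}
    for row in rows:
        key = (row.instrument, row.as_of)
        if key not in latest or row.version > latest[key].version:
            latest[key] = row
    return latest

raw = [
    Position("Bond A", date(2024, 3, 31), 100.0, version=1),
    Position("Bond A", date(2024, 3, 31), 101.5, version=2),  # corrected figure
    Position("Fund B", date(2024, 3, 31), 250.0, version=1),
]
clean = consolidate(raw)
```

In the Excel world, both versions of "Bond A" would circulate in parallel; here, only the corrected figure survives into downstream reporting.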

From the outset, we treated the platform as an evolving capability – not a one-off IT project. Technological developments and new platform features were continuously monitored to ensure long-term future readiness.

Step 2: From functional to enterprise-grade – governance with Unity Catalog

With the introduction of Unity Catalog as the standard for governance, security, and data lineage in Databricks, we deliberately evolved the architecture.

The initial setup without Unity Catalog was functional, but came with limitations:

  • No centralized governance across teams and workspaces

  • Limited access control for sensitive investment data

  • Lack of transparent traceability of data usage

We therefore migrated the entire platform to Unity Catalog and introduced a clear environment strategy, fully integrated with Azure DevOps for version control and controlled deployments:

DEV → TEST → PRP (Pre-Production) → PRD (Production)
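One way such an environment strategy shows up in code is a simple naming convention that maps each environment to its own governed catalog, so no notebook ever hard-codes a production table name. The catalog and domain names below are hypothetical; the client's actual convention may differ.

```python
# Hypothetical convention: one Unity Catalog catalog per environment,
# promoted in the order DEV -> TEST -> PRP -> PRD.
ENVIRONMENTS = ["dev", "test", "prp", "prd"]

def catalog_for(env: str, domain: str = "investments") -> str:
    """Map an environment to its governed catalog, e.g. 'investments_dev'."""
    if env not in ENVIRONMENTS:
        raise ValueError(f"unknown environment: {env!r}")
    return f"{domain}_{env}"

def promotion_path(start: str = "dev") -> list[str]:
    """Catalogs a change passes through on its way to production."""
    return [catalog_for(e) for e in ENVIRONMENTS[ENVIRONMENTS.index(start):]]
```

Centralizing this mapping means an Azure DevOps pipeline can deploy the same code to each stage by changing a single parameter.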

This step was not just a technical improvement, but a key enabler for risk and compliance requirements.

Business impact:

  • Clear separation between development and production environments

  • Improved auditability and compliance

  • Controlled access to sensitive investment and risk data

  • Lower risk when introducing new analytics features

For a regulated, global organization, governance is not overhead – it is the foundation for trust, scalability, and sustainable data-driven decision-making.

This governance layer created the stable foundation required to rethink how insights are delivered.

Step 3: From dashboards to decision support – introducing Streamlit

Once data pipelines and governance were in place, the next challenge was user experience.

The existing solution relied on Databricks dashboards. While suitable for static reporting, they reached their limits whenever business users needed to analyze scenarios, adjust assumptions, or interact with data.

We therefore introduced Streamlit as a new interface layer for investment analytics.

Streamlit is a modern framework for data applications – not just visualization.

This fundamentally changed how investment managers work with data:

  • Interactive applications instead of static dashboards

  • Custom workflows for scenario analysis, stress testing, and portfolio simulations

  • Rapid iteration: new requirements can be implemented within days

  • Tight integration with Databricks, keeping data logic and user interaction closely connected
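The scenario logic such an app exposes can be sketched as a plain function; in the Streamlit app, the shock parameter would come from a widget like st.slider and the results would be rendered with st.metric. The portfolio, position names, and numbers below are illustrative, not the client's.

```python
# Hypothetical stress-testing helper of the kind a Streamlit app wires to a
# slider: apply a uniform relative market shock and report stressed values.
def stress_portfolio(positions: dict[str, float], shock_pct: float) -> dict[str, float]:
    """Return position values after a relative shock (e.g. -0.10 = -10%)."""
    return {name: value * (1.0 + shock_pct) for name, value in positions.items()}

portfolio = {"Equities": 600_000.0, "Bonds": 400_000.0}
stressed = stress_portfolio(portfolio, shock_pct=-0.10)
total_before = sum(portfolio.values())
total_after = sum(stressed.values())
```

Because the logic is ordinary Python running next to the Databricks data, a new scenario type is a function change, not a dashboard rebuild, which is what makes the "implemented within days" iteration speed realistic.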

Business impact:

  • Faster adaptation to market and regulatory requirements

  • Tools that reflect real investment workflows

  • Greater autonomy for business users in analyzing and evaluating data

One thing became clear to us: traditional BI tools like Power BI remain essential for standardized reporting. Streamlit, however, excels in decision support and shifts the focus from “What happened?” to “What should we do next?”

Enabling the business: analytics tools for business users

Technology only creates value if it is usable.

To reduce dependency on technical teams, we developed custom Python packages to simplify how users interact with data:

  • Standardized functions to read and write Databricks tables

  • Simple helpers for analyzing datasets

  • Reusable building blocks for common investment analytics

This abstraction layer reduces technical complexity while ensuring consistency and data quality.
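A minimal sketch of what such a helper might look like, assuming an in-memory stand-in for the table store: a single, standardized entry point for reads, so notebooks never hand-build fully qualified names or silently read a missing table. The real package would wrap Databricks access (e.g. Spark or the Databricks SQL connector) instead; all names here are hypothetical.

```python
# In-memory stand-in for governed tables, keyed by three-level Unity Catalog
# names. The real helper would query Databricks instead of this dict.
_TABLES: dict[str, list[dict]] = {
    "investments_prd.core.positions": [{"instrument": "Bond A", "value": 101.5}],
}

def read_table(catalog: str, schema: str, name: str) -> list[dict]:
    """Read a table via its governed three-level name, failing loudly if absent."""
    fqn = f"{catalog}.{schema}.{name}"
    if fqn not in _TABLES:
        raise KeyError(f"table not found: {fqn}")
    return _TABLES[fqn]

rows = read_table("investments_prd", "core", "positions")
```

Funneling all access through one function is what makes consistency enforceable: validation, logging, and access conventions live in one place rather than in every notebook.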

At the same time, these standardized access patterns create a foundation that allows AI models and advanced analytics to be implemented directly on top of the data platform, without additional data preparation.

Business impact:

  • Faster time-to-insight

  • Less friction between IT and investment teams

  • Lower operational risk through standardized access

  • Greater independence for business users

Overall impact: beyond technology

This transformation delivered measurable value across the organization:

For investment teams:

  • Faster access to trusted data

  • Interactive tools for real decision-making

  • Reduced manual effort

For the organization:

  • Stronger governance and regulatory readiness

  • Scalable foundation for future use cases

  • Lower risk when evolving the platform

  • Higher development speed without compromising control

Our role went beyond implementation. We actively advised on architecture, governance, and operating models, and enabled internal teams to further develop the platform independently.

Conclusion

The result is not just a modern data platform, but a sustainable capability within the organization: a foundation that enables faster, more informed, and more reliable investment decisions.

At the same time, the platform provides a structured and governed data foundation that allows future AI use cases to be developed, tested, and scaled efficiently.

 
 
 
