
What’s New in SAP Datasphere (May 2025)

Marilena Sass

May 21, 2025

There have been a number of updates to SAP Datasphere since our last blog post in April. If you missed our latest What’s New, you can find that post here: Blog post. We are pleased to present the latest innovations from May 2025. This blog post focuses on the analysis model in the area of data modeling and also looks at the data catalog and data integration.

Data integration

In our overview of the last updates, we introduced the object store. It offers the option of receiving data directly from an SAP BW or SAP BW/4HANA system via a push procedure. If the required data is already available in the source system, it does not have to be integrated via SAP Datasphere; it can be transferred directly to the object store. There are some limitations, however. Among other things, the properties of the local table generated by SAP BW cannot be changed. In addition, delta capture is preset as enabled or disabled and cannot be changed afterwards. Snapshot versions are retained when delta capture is disabled.

SAP BW Push Method

In addition to the object store, the options for data federation have been extended. With the connection type SAP ABAP, you can now use the ABAP SQL service for remote access at SQL level to data from ABAP-managed CDS view entities in SAP S/4HANA. SQL access enables a more targeted pre-selection of data, so that elaborate post-processing can be dispensed with, especially for extensive CDS views. For SQL access, the “Remote Tables” section has also been added; it includes the Cloud Connector option for data flows and replication flows, while classic remote tables use the “direct” option.
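To illustrate the kind of SQL-level pre-selection this enables, here is a minimal sketch that reads a CDS view entity through the ODBC driver for ABAP from Python. It assumes an ODBC data source has already been set up for the exposed SQL service; the DSN, schema, entity, and field names are placeholders rather than objects from a real system.

```python
import pyodbc  # assumes the ODBC driver for ABAP is installed and a DSN is configured

# Placeholder DSN and credentials; replace with the data source configured
# for the SQL service exposed by your SAP S/4HANA system.
conn = pyodbc.connect("DSN=S4H_SQL_SERVICE;UID=SQL_USER;PWD=secret")
cursor = conn.cursor()

# Pre-select on the SQL level so that only the required slice of the
# (hypothetical) CDS view entity is transferred, instead of post-processing
# a full extract later on.
cursor.execute(
    'SELECT "SalesOrder", "NetAmount" '
    'FROM "ZSQL_SCHEMA"."I_SALESORDER" '
    'WHERE "SalesOrganization" = ?',
    ["1010"],
)
for row in cursor.fetchmany(10):
    print(row)

conn.close()
```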

Data modeling

Additional SAP Datasphere innovations allow fundamental and flexible adjustments to analysis models. A new analysis model can now be created from an existing one. The new model takes over useful characteristics of the source model, such as associations, variables, data access controls, and more; for a detailed overview, refer to the SAP documentation. This makes it possible to set up extended or additional reporting without additional effort and to compare planned changes against the status quo.

Another innovation makes it possible to replace the fact source of an analysis model. Note that the new fact source must have the same attributes, associations, metrics, and input parameters, among other things. Validation can be used to verify that the adjustments are correct. This reduces the number of subsequent adjustments in the layers above, such as the reporting layer.
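To make the requirement tangible, the following Python sketch compares two hypothetical fact source definitions for missing columns and type mismatches, roughly the kind of check the validation performs. The column names and types are invented for the example and are not Datasphere metadata.

```python
# Hypothetical column definitions for the current and the candidate fact source;
# the names and types are placeholders, not objects from a Datasphere tenant.
current_fact = {"SalesOrder": "string", "Customer": "string", "NetAmount": "decimal"}
new_fact = {"SalesOrder": "string", "Customer": "string", "GrossAmount": "decimal"}

missing = set(current_fact) - set(new_fact)      # required columns the new source lacks
extra = set(new_fact) - set(current_fact)        # columns only the new source has
mismatched = {
    col for col in set(current_fact) & set(new_fact)
    if current_fact[col] != new_fact[col]        # same column, different data type
}

print("missing in new fact source:", missing)
print("only in new fact source:", extra)
print("type mismatches:", mismatched)
```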

Since the last update, the validation options for semantic views have been extended with an exciting feature: the validation of facts. Previously, keys could already be checked for uniqueness and for the absence of “NULL” values in the corresponding fields. The latest extension adds a referential integrity check.

This check ensures that all values in a foreign key column actually exist in the corresponding dimension table. When a fact is replaced or new dimensions are added, potential errors can therefore be detected right away and corrected immediately. A valuable improvement that strengthens the consistency and quality of data in semantic views.
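Conceptually, the check answers a simple question: does every foreign key value in the fact have a counterpart in the dimension? The following Python sketch illustrates this with pandas on purely invented data.

```python
import pandas as pd

# Invented fact and dimension data, used only to illustrate what a
# referential integrity check verifies.
fact = pd.DataFrame({"order_id": [1, 2, 3], "customer_id": ["C1", "C2", "C9"]})
dim_customer = pd.DataFrame({"customer_id": ["C1", "C2", "C3"]})

# Rows whose foreign key value does not exist in the dimension table
# would be flagged by the referential integrity check.
orphans = fact[~fact["customer_id"].isin(dim_customer["customer_id"])]
print(orphans)  # here: the row with customer_id "C9"
```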

The update also introduces new features for working with metrics in the analysis model. In the Formatting section of the editor, you can now adjust the format of a metric, such as the number of decimal places or the scaling.

In addition, another feature has been introduced that provides deeper insight into the structure of the analysis model: It is now possible to evaluate both the origin and effects of key figures directly in the model. This extension not only offers more flexibility in the presentation of key figures but also facilitates the traceability and optimization of the analysis models.

With the new capabilities in the analysis model, SAP now offers even more flexibility in dealing with changes and extensions to the existing structure – without having to adapt the data sources or rebuild the model. These customizations can be used directly in SAP Analytics Cloud, which greatly simplifies integration and development. One of the most exciting new features is the ability to analyze the performance of a view using runtime metrics. In the Data Builder, it is now possible to execute two database queries directly to determine key metrics such as execution time, the number of sources used, data accesses, or maximum memory consumption. In addition, an Explain Plan can be generated, which shows the resource requirements of a view during its creation.
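A comparable plan can also be generated outside the Data Builder against the underlying SAP HANA database, for example with the hdbcli Python client, assuming a database user with access to the space's open SQL schema and the required privileges. The connection details, schema, and view name below are placeholders, and the sketch only approximates what the new runtime metrics show in the Data Builder.

```python
from hdbcli import dbapi  # SAP HANA client for Python

# Placeholder connection details; use the host, port, and database user
# provided for your SAP Datasphere space.
conn = dbapi.connect(
    address="<host>", port=443, user="<user>", password="<password>", encrypt=True
)
cursor = conn.cursor()

# Generate an explain plan for a query on a (hypothetical) view to see
# which sources are accessed and how the statement is executed.
cursor.execute(
    "EXPLAIN PLAN SET STATEMENT_NAME = 'view_check' FOR "
    'SELECT * FROM "MY_SPACE"."MY_ANALYTIC_VIEW" LIMIT 1000'
)
cursor.execute(
    "SELECT OPERATOR_NAME, OPERATOR_DETAILS FROM EXPLAIN_PLAN_TABLE "
    "WHERE STATEMENT_NAME = 'view_check' ORDER BY OPERATOR_ID"
)
for operator_name, operator_details in cursor.fetchall():
    print(operator_name, operator_details)

conn.close()
```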

These performance analyses enable targeted optimization of resource consumption and customization of the model. In addition, updates can be timed in line with other tasks to avoid system congestion – a clear advantage for more efficient and stable data modeling.

Data catalog

As already announced, the data catalog will be a focus area in the coming quarters. The data catalog can now be configured more specifically for an SAP Datasphere space: if a data product is no longer needed, it can be removed from that space. This allows resources to be deliberately redistributed among the spaces and unnecessary resources to be freed up.

Another adjustment also applies to the data catalog: Two collections have received new names. The “SAP Business Data Cloud Data Products” collection has been renamed “Data Products.” At the same time, the “Marketplace Data Products” collection was given the new name “Data Products (Marketplace).”

These renames provide greater clarity in the data catalog and facilitate the orientation and mapping of the various data products within the platform. A small step to further enhance the user experience.

Find an active data product in SAP Datasphere

The integration of Databricks with SAP BDC opens up new opportunities for companies by providing them with valuable insights from large amounts of data and combining them with AI-based models. In the future, the metadata from the Databricks Unity Catalog is to be shared with the SAP Datasphere data catalog.

Conclusion

Flexible adaptation of the data basis and optimization of resource consumption remain central concerns in data processing. That is why SAP is committed to continuous updates that not only enable comprehensive and powerful data modeling but also provide a better view of the system resources required. These advances are critical to meeting the ever-increasing demands on data processing and to ensuring efficient use of existing technologies.

If you are curious about future innovations, you can take a look at the upcoming developments in the SAP Road Map Explorer. Plans and ideas that will further shape the quality of SAP Datasphere are presented there regularly.

valantic will be pleased to assist you and provide support or answer questions about SAP Datasphere.

Visit our website for more information & a free consultation!

Our SAP Datasphere website
