
adesso BLOG

Methodology

26.09.2024 By Annette Kauppinen

Data Mesh – A Revolution In Data Management


Today's data-driven business environment is challenging for many organizations. Data silos, poor data quality and a lack of cooperation between data producers and consumers hinder decision-making, slow down innovation and limit the added value of data. Fortunately, the solution already exists – it's called Data Mesh.

Read more
Industries

Efficient power plant processes and well-founded decisions are hard to imagine today without data-driven analyses. In this blog post, you will find out how Power BI can make the work of power plant employees easier through automated reporting and comprehensive data integration. You will learn how operating data from different sources can be evaluated efficiently, plant performance optimised and decision-making processes accelerated.

Read more
Methodology

In modern data processing, companies are faced with the challenge of choosing the right database technology for their specific requirements. PostgreSQL and Databricks are two widely used solutions, each with its own strengths. In this blog post, I will highlight the differences between PostgreSQL and Databricks, analyse their respective advantages and disadvantages and give specific use cases that justify a switch to Databricks.

Read more
Methodology

06.06.2024 By Christian Del Monte

Change Data Capture for Data Lakehouse


Change Data Capture (CDC) is a technique that captures all data changes in a data store, collects them and prepares them for transfer and replication to other systems, either as a batch process or as a stream. This blog post focuses on the application of CDC in data lakehouses using the example of Change Data Feed, a variant of CDC developed by Databricks in the context of Delta Lake-based data lakehouses.
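The idea behind a change data feed can be illustrated in a few lines of plain Python. This is only a sketch of the concept, not Databricks' implementation; the `_change_type` and `_commit_version` field names mirror the columns that Delta's Change Data Feed exposes:

```python
# Minimal sketch of a change data feed: every mutation on the table is
# also appended to a feed, tagged with a change type and commit version
# (mirroring the _change_type / _commit_version columns of Delta's
# Change Data Feed; an illustration, not the real implementation).

class FeedTable:
    def __init__(self):
        self.rows = {}      # key -> current row
        self.feed = []      # append-only list of change records
        self.version = 0

    def _emit(self, change_type, row):
        self.feed.append({**row, "_change_type": change_type,
                          "_commit_version": self.version})

    def upsert(self, key, row):
        self.version += 1
        if key in self.rows:
            self._emit("update_preimage", self.rows[key])
            self._emit("update_postimage", row)
        else:
            self._emit("insert", row)
        self.rows[key] = row

    def delete(self, key):
        self.version += 1
        self._emit("delete", self.rows.pop(key))

    def table_changes(self, starting_version):
        """Read all changes from a given commit version onwards."""
        return [c for c in self.feed
                if c["_commit_version"] >= starting_version]

t = FeedTable()
t.upsert("a", {"id": "a", "qty": 1})
t.upsert("a", {"id": "a", "qty": 5})
t.delete("a")
for change in t.table_changes(2):
    print(change["_change_type"], change["_commit_version"])
```

A consumer that remembers the last version it processed can call `table_changes` with that version and pick up only the new inserts, updates and deletes, whether it runs as a batch job or as a stream.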

Read more
Methodology

In a distributed software system, data changes always pose a challenge. How is it possible to track the change history of data located in one part of the system in order to synchronise connected data stores in other subsystems? Change Data Capture (CDC) offers an answer to this question. I explain what this is all about in this blog post.
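A minimal sketch of that synchronisation, under the assumption that changes arrive as an ordered log with offsets (all names here are illustrative, not tied to any particular CDC tool): a downstream replica replays the log and remembers the last offset it has applied, so repeated deliveries are skipped.

```python
# Hypothetical change log: each entry records an operation, a key and
# a monotonically increasing offset in the change history.
change_log = [
    {"offset": 0, "op": "insert", "key": "u1", "value": {"name": "Ada"}},
    {"offset": 1, "op": "update", "key": "u1", "value": {"name": "Ada L."}},
    {"offset": 2, "op": "insert", "key": "u2", "value": {"name": "Alan"}},
    {"offset": 3, "op": "delete", "key": "u1", "value": None},
]

class Replica:
    """A connected data store in another subsystem."""
    def __init__(self):
        self.store = {}
        self.applied_offset = -1   # position in the change history

    def sync(self, log):
        """Apply only the changes this replica has not yet seen."""
        for change in log:
            if change["offset"] <= self.applied_offset:
                continue           # already applied -> idempotent replay
            if change["op"] == "delete":
                self.store.pop(change["key"], None)
            else:                  # insert and update both write the value
                self.store[change["key"]] = change["value"]
            self.applied_offset = change["offset"]

replica = Replica()
replica.sync(change_log[:2])       # first batch
replica.sync(change_log)           # later batch: offsets 0-1 are skipped
print(replica.store)               # {'u2': {'name': 'Alan'}}
```

Tracking the applied offset is what makes replay safe: the same log can be delivered more than once without corrupting the replica.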

Read more
Industries

Every day, employees struggle with manual reporting processes that cause high personnel costs, limited opportunities for process optimisation and quality deficiencies. Despite the crucial importance of KPIs for management, manual reporting processes are widespread in production. In this blog post, I explain why companies in the IIoT sector are starting with production reporting.

Read more
Methodology

Metadata-driven data pipelines are a game changer for data processing in companies. These pipelines use metadata to update processes dynamically instead of manually revising each step every time a data source changes. Just as with the pipelines themselves, however, metadata maintenance can become a bottleneck in the maintenance and further development of a pipeline framework. In this blog post, I use practical examples to show how the Jsonnet template language makes it easier to maintain metadata.
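To make the idea concrete, here is a small sketch in Python. Jsonnet itself renders templated metadata to JSON; in this illustration a template-like helper function and a plain dict stand in for the rendered file, and the `load_incremental`/`load_full` step names are invented for the example:

```python
# Sketch of a metadata-driven pipeline: the steps are generated from
# metadata instead of being hand-coded per source. A template helper
# plays the role Jsonnet functions play when rendering JSON metadata.

def make_source_meta(name, columns, incremental=True):
    """One call per source instead of a hand-written block each time."""
    return {"name": name, "columns": columns, "incremental": incremental}

metadata = {
    "sources": [
        make_source_meta("orders", ["id", "amount"]),
        make_source_meta("customers", ["id", "name"], incremental=False),
    ]
}

def build_pipeline(meta):
    """Turn metadata into a list of pipeline steps (illustrative names)."""
    steps = []
    for src in meta["sources"]:
        load = "load_incremental" if src["incremental"] else "load_full"
        steps.append(f"{load}({src['name']}, cols={src['columns']})")
    return steps

for step in build_pipeline(metadata):
    print(step)
```

Adding a new source is then one metadata entry rather than a new hand-written pipeline step, which is exactly the maintenance burden the templating is meant to reduce.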

Read more
AI

Workflow orchestration and workflow engines are crucial components in modern data processing and software development, especially in the field of artificial intelligence (AI). These technologies make it possible to efficiently manage and coordinate various tasks and processes within complex data pipelines. In this blog post, we present Prefect, an intuitive tool for orchestrating workflows in AI development.
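What a workflow engine manages can be sketched in pure Python: tasks with dependencies, executed in an order that respects them. Prefect wraps this (plus retries, scheduling and observability) behind its `@task` and `@flow` decorators; the stand-alone version below, using only the standard library, illustrates just the dependency handling:

```python
# Conceptual sketch of workflow orchestration: a dependency graph of
# tasks is resolved into a valid execution order and then run. The
# extract/transform/load task names are illustrative placeholders.

from graphlib import TopologicalSorter

def extract():   return [3, 1, 2]
def transform(): return "transformed"
def load():      return "loaded"

# task -> set of tasks it depends on
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
}
tasks = {"extract": extract, "transform": transform, "load": load}

order = list(TopologicalSorter(dag).static_order())
results = {name: tasks[name]() for name in order}
print(order)       # ['extract', 'transform', 'load']
```

A real engine adds what this sketch lacks: parallel execution of independent tasks, retries on failure, scheduling and a UI for observing runs.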

Read more
Methodology

In the ever-evolving world of data analytics and data management, Snowflake plays a prominent role in shaping the industry. This blog post looks at the development of Snowflake and why it is considered a ground-breaking solution for businesses.

Read more