
Implementing Data Contracts in the Data Warehouse

Monte Carlo

In this article, Chad Sanderson, Head of Product, Data Platform, at Convoy and creator of Data Quality Camp, introduces a new application of data contracts: in your data warehouse. In the last couple of posts, I've focused on implementing data contracts in production services.


From Big Data to Better Data: Ensuring Data Quality with Verity

Lyft Engineering

High-quality data is necessary for the success of every data-driven company. It is now the norm for tech companies to have a well-developed data platform. This makes it easy for engineers to generate, transform, store, and analyze data at the petabyte scale. What and Where is Data Quality?



Data Quality Score: The next chapter of data quality at Airbnb

Airbnb Tech

However, for all of our uncertified data, which remained the majority of our offline data, we lacked visibility into its quality and didn’t have clear mechanisms for up-leveling it. How could we scale the hard-fought wins and best practices of Midas across our entire data warehouse?


What is dbt Testing? Definition, Best Practices, and More

Monte Carlo

Your test passes when no rows are returned, which indicates your data meets your defined conditions. Teams often run custom data tests as part of a deployment pipeline, or schedule them on production systems via job schedulers like Apache Airflow or dbt Cloud, or via the built-in scheduler in your data warehouse solution.
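
The zero-rows-returned convention described above can be illustrated with a minimal, self-contained sketch. This uses plain Python and SQLite rather than dbt itself, and the orders table, its columns, and the validation condition are hypothetical examples:

```python
import sqlite3

# Minimal sketch of the "no rows returned = pass" convention used by
# dbt-style data tests. The orders table and its columns are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [(1, 25.0), (2, 40.5), (3, 12.99)],
)

# The test query selects rows that VIOLATE the expectation
# (NULL or negative order amounts).
failing_rows = conn.execute(
    "SELECT order_id, amount FROM orders WHERE amount IS NULL OR amount < 0"
).fetchall()

# Zero failing rows means the data meets the defined condition: the test passes.
if failing_rows:
    raise AssertionError(f"data test failed for {len(failing_rows)} row(s): {failing_rows}")
print("data test passed: no failing rows returned")
```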


The Symbiotic Relationship Between AI and Data Engineering

Ascend.io

While data engineering and Artificial Intelligence (AI) may seem like distinct fields at first glance, their symbiosis is undeniable. The foundation of any AI system is high-quality data. Here lies the critical role of data engineering: preparing and managing data to feed AI models.


The Rise of the Data Engineer

Maxime Beauchemin

Data modeling is changing. Typical data modeling techniques — like the star schema — which defined our approach to data modeling for the analytics workloads typically associated with data warehouses, are less relevant than they once were.


What is Data Fabric: Architecture, Principles, Advantages, and Ways to Implement

AltexSoft

A data fabric is an architecture design presented as an integration and orchestration layer built on top of multiple disjointed data sources like relational databases, data warehouses, data lakes, data marts, IoT, legacy systems, etc., to provide a unified view of all enterprise data.