Implementing Data Contracts in the Data Warehouse

Monte Carlo

In this article, Chad Sanderson, Head of Product, Data Platform at Convoy and creator of Data Quality Camp, introduces a new application of data contracts: in your data warehouse. In the last couple of posts, I’ve focused on implementing data contracts in production services.
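To make the idea concrete, here is a minimal sketch of what a data contract can look like in code. Everything here is an assumption for illustration: the `ShipmentEvent` entity, its fields, and the `validate` helper are hypothetical, not taken from the article.

```python
from dataclasses import dataclass

# Hypothetical sketch of a data contract: a producer-owned schema that
# consumers can validate records against before they land in the warehouse.
@dataclass(frozen=True)
class ShipmentEvent:  # hypothetical entity name, not from the article
    shipment_id: str
    status: str
    created_at: str  # ISO-8601 timestamp

ALLOWED_STATUSES = {"created", "in_transit", "delivered"}

def validate(record: dict) -> list[str]:
    """Return a list of contract violations (empty means the record passes)."""
    errors = []
    for field in ("shipment_id", "status", "created_at"):
        if field not in record:
            errors.append(f"missing required field: {field}")
    if record.get("status") not in ALLOWED_STATUSES:
        errors.append(f"invalid status: {record.get('status')!r}")
    return errors
```

The point of the contract is that the producer owns this definition, so downstream warehouse consumers can reject or quarantine records that violate it instead of discovering breakage later.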

Data Engineering Weekly #167

Data Engineering Weekly

[link] Intel: Four Data Cleaning Techniques to Improve Large Language Model (LLM) Performance. If someone asks me to define LLMs, this is my one-line definition. Large Language Models: turning messy data into surprisingly coherent nonsense since 2023. High-quality data is the cornerstone of LLM performance.
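As a flavor of what such cleaning techniques involve, here is a small, illustrative pass over a text corpus. This is a generic sketch (unicode normalization, whitespace collapsing, empty-document removal, and deduplication), not the specific four techniques from the Intel article.

```python
import re
import unicodedata

def clean_corpus(docs: list[str]) -> list[str]:
    """Illustrative cleaning pass: normalize unicode, collapse whitespace,
    drop empty strings, and deduplicate while preserving order."""
    seen = set()
    cleaned = []
    for doc in docs:
        # NFKC folds compatibility characters (e.g. full-width forms) together
        text = unicodedata.normalize("NFKC", doc)
        # Collapse runs of whitespace and trim the ends
        text = re.sub(r"\s+", " ", text).strip()
        if text and text not in seen:
            seen.add(text)
            cleaned.append(text)
    return cleaned
```

Deduplication in particular matters for LLM training, since repeated documents skew what the model memorizes.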

5 Layers of Data Lakehouse Architecture Explained

Monte Carlo

You know what they always say: data lakehouse architecture is like an onion. …ok, nobody actually says that. But they should! Data lakehouse architecture combines the benefits of data warehouses and data lakes, bringing together the structure and performance of a data warehouse with the flexibility of a data lake.

From Big Data to Better Data: Ensuring Data Quality with Verity

Lyft Engineering

High-quality data is necessary for the success of every data-driven company. It is now the norm for tech companies to have a well-developed data platform, which makes it easy for engineers to generate, transform, store, and analyze data at petabyte scale. What and Where is Data Quality?

How to Use DBT to Get Actionable Insights from Data?

Workfall

With DBT, data teams weave powerful SQL spells to create data models that capture the essence of their organization’s information. DBT’s superpowers include seamlessly connecting with databases and data warehouses, performing amazing transformations, and effortlessly managing dependencies to ensure high-quality data.
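The "managing dependencies" part deserves a concrete picture. dbt resolves the `ref()` calls between models into a DAG and runs models in dependency order; the same ordering guarantee can be sketched with Python's standard-library topological sorter. The model names below are hypothetical examples, not from the article.

```python
from graphlib import TopologicalSorter

# Hypothetical model dependency graph: keys are dbt models, values are the
# upstream models each one ref()'s. Staging models have no upstream refs;
# marts build on top of them.
deps = {
    "stg_orders": set(),
    "stg_customers": set(),
    "fct_orders": {"stg_orders", "stg_customers"},
    "customer_ltv": {"fct_orders"},
}

# static_order() yields the models in an order where every model comes
# after all of its dependencies, which is how dbt schedules a run.
run_order = list(TopologicalSorter(deps).static_order())
```

This is why, in dbt, you never hand-maintain a run order: changing a `ref()` changes the DAG, and the scheduler follows.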

A New Horizon for Data Reliability With Monte Carlo and Snowflake

Monte Carlo

Monte Carlo will analyze your data in Snowflake to understand how long each query runs, how much compute it uses, and how it interacts with the other queries in your warehouse, helping you optimize not just your pipelines but your Snowflake performance and spend as well. Our promise: we will show you the product.
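The kind of rollup described above can be sketched in a few lines. This is not Monte Carlo's implementation; it is a hedged illustration over hypothetical rows shaped like entries from Snowflake's query history (warehouse name, elapsed seconds).

```python
from statistics import mean

# Hypothetical sample rows: (warehouse, elapsed_seconds) per query.
history = [
    ("transforming", 12.4),
    ("transforming", 98.0),
    ("reporting", 3.1),
    ("reporting", 4.7),
]

def avg_runtime_by_warehouse(rows):
    """Group query runtimes by warehouse and average them -- the kind of
    rollup used to spot slow pipelines and overprovisioned warehouses."""
    by_wh = {}
    for wh, secs in rows:
        by_wh.setdefault(wh, []).append(secs)
    return {wh: mean(times) for wh, times in by_wh.items()}
```

A real version would pull rows from Snowflake's query history rather than a literal list, then feed aggregates like these into alerting on spend and runtime regressions.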