Sun, Jan 05, 2025

2024 retrospective on waitingforcode.com

Waitingforcode

Even though I was blogging less in the second half of last year, the retrospective is still the blog post I'm waiting for each year. Every year I summarize what happened over the past 12 months and share my plans for the future with you. It's time for the 2024 edition!

Private Listing and Masking Policies for Cross-Region Data Sharing

Cloudyard

In the evolving landscape of data sharing, Snowflake offers two primary methods: Direct Share and Private Listing. While both facilitate data sharing, they cater to different needs and scenarios. This blog focuses on Private Listing and Masking Policies for Cross-Region Data Sharing, comparing it with Direct Share and examining how masking policies ensure data privacy in shared environments.
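To give a rough flavour of the masking side of that setup, here is a minimal sketch of defining a Snowflake masking policy and attaching it to a column before the table is exposed through a share or listing. It assumes the snowflake-connector-python client; the connection values, table, column, role, and policy names are hypothetical placeholders, not details from the post.

```python
# Minimal sketch: create a masking policy and bind it to a column that will be
# shared cross-region. All names and connection values are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # placeholder
    user="my_user",            # placeholder
    password="my_password",    # placeholder
    role="ACCOUNTADMIN",
    warehouse="COMPUTE_WH",
    database="SALES_DB",
    schema="PUBLIC",
)
cur = conn.cursor()

# Mask email addresses for every role except the data owner; consumers of the
# listing therefore only ever see the masked value.
cur.execute("""
    CREATE MASKING POLICY IF NOT EXISTS email_mask
      AS (val STRING) RETURNS STRING ->
      CASE
        WHEN CURRENT_ROLE() IN ('DATA_OWNER') THEN val
        ELSE '***MASKED***'
      END
""")

# Attach the policy to the column to be shared.
cur.execute(
    "ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask"
)

cur.close()
conn.close()
```

Because the policy travels with the column, the same masking rules apply no matter which region or account consumes the shared data.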

HR Data Integration: Challenges, Benefits, and Best Practices

Hevo

In today’s information-driven world, organizations increasingly make decisions based on data, and human resources departments are no exception. Integration of HR data has become an important step in smoothing the flow of HR processes, improving the employee experience, and ensuring compliance in a technology-enabled environment.

Data Engineering Weekly #202

Data Engineering Weekly

Try fully managed Apache Airflow for free: run Airflow without the hassle and management complexity. Take Astro (the fully managed Airflow solution) for a test drive today and unlock a suite of features designed to simplify, optimize, and scale your data pipelines. For a limited time, new signups will receive a complimentary Airflow Fundamentals Certification exam (normally $150).

Apache Airflow® Crash Course: From 0 to Running your Pipeline in the Cloud

With over 30 million monthly downloads, Apache Airflow is the tool of choice for programmatically authoring, scheduling, and monitoring data pipelines. Airflow enables you to define workflows as Python code, allowing for dynamic and scalable pipelines suitable for any use case, from ETL/ELT to running ML/AI operations in production. This introductory tutorial provides a crash course for writing and deploying your first Airflow pipeline.
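As a hint of what "workflows as Python code" looks like in practice, here is a minimal, illustrative DAG sketch assuming Airflow 2.x with the TaskFlow API; the DAG name, tasks, and schedule are made-up placeholders, not taken from the tutorial.

```python
# Minimal Airflow DAG sketch: two tasks defined as plain Python functions with
# the TaskFlow API. Names and schedule are illustrative placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def hello_pipeline():
    @task
    def extract() -> list[int]:
        # Stand-in for pulling data from a source system.
        return [1, 2, 3]

    @task
    def load(rows: list[int]) -> None:
        # Stand-in for writing to a destination.
        print(f"loaded {len(rows)} rows")

    # Calling one task with the output of another sets the dependency:
    # extract runs first, then load.
    load(extract())


hello_pipeline()
```

Dropping a file like this into the DAGs folder is enough for the scheduler to pick it up and run it on the declared schedule, which is what makes Airflow pipelines both dynamic and versionable like any other Python code.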