January, 2025


The Data Engineering Toolkit: Essential Tools for Your Machine

Simon Späti

To be proficient as a data engineer, you need to know a varied toolkit, from fundamental Linux commands to different virtual environments to techniques for working efficiently. This article focuses on the building blocks of data engineering work, such as operating systems, development environments, and essential tools. We'll start from the ground up, exploring crucial Linux commands, containerization with Docker, and the development environments that make modern data engineering possible.


The Emerging Role of AI Data Engineers - The New Strategic Role for AI-Driven Success

Data Engineering Weekly

The Critical Role of AI Data Engineers in a Data-Driven World. How does a chatbot seamlessly interpret your questions? How does a self-driving car understand a chaotic street scene? The answer lies in unstructured data processing—a field that powers modern artificial intelligence (AI) systems. Unlike neatly organized rows and columns in spreadsheets, unstructured data—such as text, images, videos, and audio—requires advanced processing techniques to derive meaningful insights.


Trending Sources


How to ensure consistent metrics in your warehouse

Start Data Engineering

1. Introduction 2. Centralize Metric Definitions in Code Option A: Semantic Layer for On-the-Fly Queries Option B: Pre-Aggregated Tables for Consumers 3. Conclusion & Recap 4. Required Reading 1. Introduction If you've worked on a data team, you've likely encountered situations where multiple teams define metrics in slightly different ways, leaving you to untangle why discrepancies exist.


How Meta discovers data flows via lineage at scale

Engineering at Meta

Data lineage is an instrumental part of Meta's Privacy Aware Infrastructure (PAI) initiative, a suite of technologies that efficiently protect user privacy. It is a critical and powerful tool for scalable discovery of relevant data and data flows, which supports privacy controls across Meta's systems. This allows us to verify that our users' everyday interactions are protected across our family of apps, such as their religious views in the Facebook Dating app, the example we'll walk through in this post.


Apache Airflow® 101 Essential Tips for Beginners

Apache Airflow® is the open-source standard for managing workflows as code. It is a versatile tool used in companies across the world, from agile startups to tech giants to flagship enterprises in all industries. Due to its widespread adoption, Airflow knowledge is paramount to success in the field of data engineering.


Data Integrity for AI: What’s Old is New Again

Precisely

Artificial Intelligence (AI) is all the rage, and rightly so. By now most of us have experienced how Gen AI and the LLMs (large language models) that fuel it are primed to transform the way we create, research, collaborate, engage, and much more. Yet along with the AI hype and excitement come very appropriate sanity checks asking whether AI is ready for prime time.

More Trending


10 GitHub Repositories to Master Math

KDnuggets

Learn math through roadmaps, courses, tutorials, Python frameworks for solving equations, guides, exercises, textbooks, and more.


Data News — Week 25.02

Christophe Blefari

Happy new year ✨ I wish you the best for 2025. There are multiple ways to start a new year: with new projects, new ideas, new resolutions, or by just keeping on playing the same music. I hope you will enjoy 2025. The Data News is here to stay; the format might vary during the year, but here we are for another year. Thank you so much for your support through the years.


DeepSeek R1 on Databricks

databricks

Deepseek-R1 is a state-of-the-art open model that, for the first time, introduces the reasoning capability to the open source community. In particular, the.


The insertInto trap in Apache Spark SQL

Waitingforcode

Even though Apache Spark SQL provides an API for structured data, the framework sometimes behaves unexpectedly. That's the case with the insertInto operation, which can even lead to data quality issues. Why? Let's try to understand in this short article.
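As a quick illustration of the kind of surprise the post is about (not the author's exact example), here is a minimal PySpark sketch: insertInto resolves columns by position rather than by name, so a reordered DataFrame silently lands values in the wrong columns. The table and column names are hypothetical.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[*]").getOrCreate()

# Hypothetical target table with columns (id INT, amount INT)
spark.sql("CREATE TABLE IF NOT EXISTS sales (id INT, amount INT) USING parquet")

# Source DataFrame with the same columns, but in a different order
df = spark.createDataFrame([(100, 1)], ["amount", "id"])

# insertInto resolves columns by POSITION, not by name,
# so the value 100 lands in the 'id' column without any error being raised
df.write.insertInto("sales")

spark.sql("SELECT * FROM sales").show()
```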


Apache Airflow® Best Practices: DAG Writing

Speaker: Tamara Fingerlin, Developer Advocate

In this new webinar, Tamara Fingerlin, Developer Advocate, will walk you through many Airflow best practices and advanced features that can help you make your pipelines more manageable, adaptive, and robust. She'll focus on how to write best-in-class Airflow DAGs using the latest Airflow features like dynamic task mapping and data-driven scheduling!
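For readers new to the dynamic task mapping feature mentioned above, here is a minimal, hypothetical TaskFlow sketch (not taken from the webinar), assuming Airflow 2.4 or later: one mapped task instance is created per element returned at runtime.

```python
from datetime import datetime
from airflow.decorators import dag, task

@dag(start_date=datetime(2025, 1, 1), schedule=None, catchup=False)
def mapped_example():
    @task
    def list_files():
        # In a real pipeline this might list objects in a bucket
        return ["a.csv", "b.csv", "c.csv"]

    @task
    def process(path: str):
        print(f"processing {path}")

    # Dynamic task mapping: one 'process' task instance per file, decided at runtime
    process.expand(path=list_files())

mapped_example()
```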


Data Engineering Interview Series #2: System Design

Start Data Engineering

1. Introduction 2. Guide the interviewer through the process 2.1. [Requirements gathering] Make sure you clearly understand the requirements & business use case 2.2. [Understand source data] Know what you have to work with 2.3. [Model your data] Define data models for historical analytics 2.4. [Pipeline design] Design data pipelines to populate your data models 2.5.


Strobelight: A profiling service built on open source technology

Engineering at Meta

We're sharing details about Strobelight, Meta's profiling orchestrator. Strobelight combines several technologies, many open source, into a single service that helps engineers at Meta improve efficiency and utilization across our fleet. Using Strobelight, we've seen significant efficiency wins, including one that has resulted in an estimated 15,000 servers' worth of annual capacity savings.


What Are Large Language Models? A Beginner’s Guide for 2025

KDnuggets

Curious about what LLMs are and how they work? Explore the full guide right here, right now!


Global Fishing Watch – Illuminating Vessel Activity On the Open Ocean

ArcGIS

Having fish for dinner tonight? Ever wondered if anyone is monitoring where it's coming from?


Optimizing The Modern Developer Experience with Coder

Many software teams have migrated their testing and production workloads to the cloud, yet development environments often remain tied to outdated local setups, limiting efficiency and growth. This is where Coder comes in. In our 101 Coder webinar, you’ll explore how cloud-based development environments can unlock new levels of productivity. Discover how to transition from local setups to a secure, cloud-powered ecosystem with ease.


What Is PDFMiner And Should You Use It – How To Extract Data From PDFs

Seattle Data Guy

PDF files are one of the most popular file formats today. Because they can preserve the visual layout of documents and are compatible with a wide range of devices and operating systems, PDFs are used for everything from business forms and educational material to creative designs. However, PDF files also present multiple challenges when it…
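For a sense of what the article covers, here is a minimal text-extraction sketch using the pdfminer.six high-level API; the input file name is hypothetical and this is not the author's code.

```python
# pip install pdfminer.six
from pdfminer.high_level import extract_text

# 'invoice.pdf' is a hypothetical input file
text = extract_text("invoice.pdf")
print(text[:500])  # print the first 500 extracted characters
```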


Delta Lake and restore - traveling in time differently

Waitingforcode

Time travel is quite a popular Delta Lake feature. But did you know it's not the only one you can use to interact with past versions? An alternative is the RESTORE command, and it'll be the topic of this blog post.
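As a rough sketch of the difference (not the post's own example): time travel reads a past snapshot, while RESTORE rolls the table's current state back to it. The table name and version number are hypothetical, and the syntax assumes a reasonably recent Delta Lake and Spark version.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # assumes a Delta-enabled Spark session

# Time travel: read version 3 of a (hypothetical) table without modifying it
old = spark.sql("SELECT * FROM events VERSION AS OF 3")

# RESTORE: rewrite the table so its current state matches version 3 again
spark.sql("RESTORE TABLE events TO VERSION AS OF 3")

# Equivalent Python API
DeltaTable.forName(spark, "events").restoreToVersion(3)
```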


Testing and Development for Databricks Environment and Code.

Confessions of a Data Guy

Every once in a great while, the question comes up: “How do I test my Databricks codebase?” It’s a fair question, and if you’re new to testing your code, it can seem a little overwhelming on the surface. However, I assure you the opposite is the case. Testing your Databricks codebase is no different than […]


Predictions 2025: AI As Cybersecurity Tool and Target

Snowflake

Though AI is (still) the hottest technology topic, it's not the overriding issue for enterprise security in 2025. Advanced AI will open up new attack vectors and also deliver new tools for protecting an organization's data. But the underlying challenge is the sheer quantity of data that overworked cybersecurity teams face as they try to answer basic questions such as, "Are we under attack?"


15 Modern Use Cases for Enterprise Business Intelligence

Large enterprises face unique challenges in optimizing their Business Intelligence (BI) output due to the sheer scale and complexity of their operations. Unlike smaller organizations, where basic BI features and simple dashboards might suffice, enterprises must manage vast amounts of data from diverse sources. What are the top modern BI use cases for enterprise businesses to help you get a leg up on the competition?


Don’t Manage Your Python Environments, Just Use Docker Containers

KDnuggets

Python environment management can sometimes give you that awful feeling in the pit of your stomach. So don't do it: just use Docker containers.


Color Schemes for the Global Wind Atlas

ArcGIS

Mix colors to build a theme for the new multidimensional Global Wind Atlas that's now available in ArcGIS Living Atlas.


ILA Evo: Meta’s journey to reimagine fiber optic in-line amplifier sites

Engineering at Meta

Today’s rapidly evolving landscape of use cases that demand highly performant and efficient network infrastructure is placing new emphasis on how in-line amplifiers (ILAs) are designed and deployed. Meta's ILA Evo effort seeks to reimagine how an ILA site could be deployed to improve speed and cost while making a step-function improvement in power efficiency.


2024 retrospective on waitingforcode.com

Waitingforcode

Even though I was blogging less in the second half of the previous year, the retrospective is still the blog post I look forward to each year. Every year I summarize what happened in the past 12 months and share my future plans with you. It's time for the 2024 Edition!


Prepare Now: 2025's Must-Know Trends For Product And Data Leaders

Speaker: Jay Allardyce, Deepak Vittal, Terrence Sheflin, and Mahyar Ghasemali

As we look ahead to 2025, business intelligence and data analytics are set to play pivotal roles in shaping success. Organizations are already starting to face a host of transformative trends as the year comes to a close, including the integration of AI in data analytics, an increased emphasis on real-time data insights, and the growing importance of user experience in BI solutions.


How HP is optimizing the 3D Printing supply chain using Delta Sharing

databricks

Javier Lagares is a Principal Data Engineer at HP, where he leads the development of data-driven solutions for the 3D printing business. With.


Data Pruning MNIST: How I Hit 99% Accuracy Using Half the Data

Towards Data Science

Building more efficient AI. TL;DR: Data-centric AI can create more efficient and accurate models. I experimented with data pruning on MNIST to classify handwritten digits. (Figure: best runs for furthest-from-centroid selection compared to the full dataset. Image by author.) What if I told you that using just 50% of your training data could achieve better results than using the full dataset?
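To make the selection strategy concrete, here is a small, hypothetical NumPy sketch of furthest-from-centroid pruning (not the author's implementation): for each class, keep only the examples that lie furthest from the class centroid.

```python
import numpy as np

def furthest_from_centroid(X: np.ndarray, y: np.ndarray, keep_frac: float = 0.5) -> np.ndarray:
    """Return indices keeping the keep_frac of examples per class furthest from the class centroid."""
    keep = []
    for label in np.unique(y):
        idx = np.where(y == label)[0]
        centroid = X[idx].mean(axis=0)
        dists = np.linalg.norm(X[idx] - centroid, axis=1)
        n_keep = max(1, int(len(idx) * keep_frac))
        keep.append(idx[np.argsort(dists)[-n_keep:]])  # largest distances = hardest examples
    return np.concatenate(keep)

# Hypothetical usage on flattened MNIST-like arrays (X_train: [n, 784], y_train: [n])
# pruned_idx = furthest_from_centroid(X_train.reshape(len(X_train), -1), y_train, keep_frac=0.5)
# model.fit(X_train[pruned_idx], y_train[pruned_idx])
```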


How I Would Learn Python in 2025 (If I Could Start Over)

KDnuggets

I've been programming with Python for over 6 years now. But if I could start over, here's how I'd go about learning Python in 2025.


Using Spatial Components in Spatial Statistics

ArcGIS

Learn how to use spatial components to help with various spatial statistics workflows.


Apache Airflow® Crash Course: From 0 to Running your Pipeline in the Cloud

With over 30 million monthly downloads, Apache Airflow is the tool of choice for programmatically authoring, scheduling, and monitoring data pipelines. Airflow enables you to define workflows as Python code, allowing for dynamic and scalable pipelines suitable to any use case from ETL/ELT to running ML/AI operations in production. This introductory tutorial provides a crash course for writing and deploying your first Airflow pipeline.
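To give beginners a feel for the "workflows as Python code" idea, here is a minimal, hypothetical DAG sketch (not part of the tutorial itself); the DAG and task names are made up, and the schedule parameter assumes Airflow 2.4 or later.

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling data")  # placeholder for a real extract step

def load():
    print("writing data")  # placeholder for a real load step

with DAG(
    dag_id="hello_pipeline",       # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",             # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task      # run extract before load
```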


Building a Fast, Light, and CHEAP Lake House with DuckDB, Delta Lake, and AWS Lambda

Confessions of a Data Guy

Building fun things is a real part of Data Engineering. Using your creative side when building a Lake House is possible, and using tools that are outside the normal box can sometimes be preferable. Check out this video where I dive into how I build just such a Lake House using Modern Data Stack tools like […]
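As a flavor of the stack (not the video's actual code), here is a tiny DuckDB sketch that queries a Delta table on S3 via the delta and httpfs extensions; the bucket path is hypothetical, and extension availability depends on your DuckDB version.

```python
import duckdb

con = duckdb.connect()

# Extensions for reading Delta tables over S3 (availability depends on DuckDB version)
con.execute("INSTALL delta")
con.execute("LOAD delta")
con.execute("INSTALL httpfs")
con.execute("LOAD httpfs")

# Hypothetical Delta table path; in the video this kind of query would run inside AWS Lambda
result = con.sql(
    "SELECT count(*) AS row_count FROM delta_scan('s3://my-bucket/lakehouse/events')"
).df()
print(result)
```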


Event time skew and global watermark in Apache Spark Structured Streaming

Waitingforcode

A few months ago I wrote a blog post about event time skew and how dangerous it is for a stateful streaming job. Since it was a high-level explanation, I didn't cover Apache Spark Structured Streaming deeply at that moment. Now the watermark topic is back in my learning backlog, and it's a good opportunity to return to the event skew topic and see the dangers it brings for Structured Streaming stateful jobs.
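For context, here is a minimal, hypothetical Structured Streaming sketch (not from the post) showing where the watermark enters a stateful aggregation; because the watermark is global, a partition whose event times lag behind can see its late rows dropped.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Stand-in source for the sketch; a real job would read from Kafka or files
events = (
    spark.readStream.format("rate").load()
    .withColumnRenamed("timestamp", "event_time")
)

# The watermark is derived from the max event time seen across ALL input partitions,
# so a "fast" partition can advance it and cause late rows from a skewed partition to be dropped
counts = (
    events
    .withWatermark("event_time", "10 minutes")
    .groupBy(F.window("event_time", "5 minutes"))
    .count()
)

query = counts.writeStream.outputMode("update").format("console").start()
```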


Enhancing Neural Network Training at Yelp: Achieving 1,400x Speedup with WideAndDeep

Yelp Engineering

At Yelp, we encountered challenges that prompted us to enhance the training time of our ad-revenue generating models, which use a Wide and Deep Neural Network architecture for predicting ad click-through rates (pCTR). These models handle large tabular datasets with small parameter spaces, requiring innovative data solutions. This blog post delves into our journey of optimizing training time using TensorFlow and Horovod, along with the development of ArrowStreamServer, our in-house library for lo
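For readers unfamiliar with the architecture, here is a bare-bones Keras wide-and-deep sketch for a pCTR-style binary output; the feature widths are made up, and this is not Yelp's model or training setup.

```python
import tensorflow as tf

# Hypothetical feature widths for the sketch
wide_in = tf.keras.Input(shape=(100,), name="wide_features")   # sparse/cross features
deep_in = tf.keras.Input(shape=(20,), name="deep_features")    # dense features

deep = tf.keras.layers.Dense(64, activation="relu")(deep_in)
deep = tf.keras.layers.Dense(32, activation="relu")(deep)

# The "wide" linear path and the "deep" MLP path are merged before the output
merged = tf.keras.layers.concatenate([wide_in, deep])
pctr = tf.keras.layers.Dense(1, activation="sigmoid", name="pctr")(merged)

model = tf.keras.Model(inputs=[wide_in, deep_in], outputs=pctr)
model.compile(optimizer="adam", loss="binary_crossentropy")
model.summary()
```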
