
Making dbt Cloud API calls using dbt-cloud-cli

dbt Developer Hub

dbt Cloud is a hosted service that many organizations use for their dbt deployments. When jobs run (e.g., on a cron schedule or via an API trigger), they generate various artifacts that contain valuable metadata related to the dbt project and the run results. What is dbt-cloud-cli, and why should you use it?
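To make the "API trigger" path concrete: dbt Cloud's v2 Administrative API exposes a job-run endpoint that takes a service token. Below is a minimal sketch of building such a request with the standard library; the account ID, job ID, and token are placeholders, and in practice you would send the request with `urllib.request.urlopen` (or use dbt-cloud-cli, which wraps these calls for you).

```python
# Hedged sketch: constructing (not sending) a dbt Cloud v2 job-trigger request.
# Account ID, job ID, and token below are placeholders, not real credentials.
import json
import urllib.request

DBT_CLOUD_API = "https://cloud.getdbt.com/api/v2"

def build_trigger_request(account_id: int, job_id: int, token: str,
                          cause: str = "Triggered via API") -> urllib.request.Request:
    """Return a POST request that would trigger the given dbt Cloud job."""
    url = f"{DBT_CLOUD_API}/accounts/{account_id}/jobs/{job_id}/run/"
    body = json.dumps({"cause": cause}).encode()
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"Token {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_trigger_request(1234, 5678, "***")
# urllib.request.urlopen(req) would submit the run (requires a real token)
```

The response to a real call includes the run ID, which you can poll to wait for completion and then download the generated artifacts.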

Data News — Recommendations

Christophe Blefari

Which makes me sad, but I understand. There is a big difference between, say, a podcast and news blogging like I'm doing. It was my first step in this journey to make my handpicked links browsable and usable for everyone. Christophe, why did you make this? No one asked for it.

The Spiritual Alignment of dbt + Airflow

dbt Developer Hub

Airflow and dbt are often framed as either/or: you either build SQL transformations using Airflow's SQL database operators (like SnowflakeOperator), or develop them in a dbt project. You either orchestrate dbt models in Airflow, or you deploy them using dbt Cloud.


Snowpark Offers Expanded Capabilities Including Fully Managed Containers, Native ML APIs, New Python Versions, External Access, Enhanced DevOps and More

Snowflake

At this year’s Summit, we are excited to announce a series of advancements to Snowpark runtimes and libraries, making the deployment and processing of non-SQL code in Snowflake even simpler, faster, and more secure. Snowpark — Set of libraries and runtimes for secure deployment and processing of non-SQL code on the Snowflake Data Cloud.

DataOps: What Is It, Core Principles, and Tools For Implementation

phData: Data Engineering

In order to make an informed decision on any of these questions, you need data! If you have hundreds of thousands of users across a variety of products, you need tooling and processes at your disposal that enable a thoughtful and scalable approach. Without them, you end up working for your data instead of your data working for you.
