Build An Automated Data Pipeline With Snowpark

Snowpark is a powerful framework for data manipulation, ML model training, and so much more. But it's only as fast as the data pipeline automation behind it. Learn how to get more of these workflows into production and off of your maintenance backlog in this hands-on lab.

About the Lab

Snowpark is an ideal data engineering framework. It’s flexible, easy to use, and backed by a rich ecosystem of libraries and tools. That’s why today’s data teams use it for anything data-related: pipelines, data transformation, machine learning, app development, and more. Now with native support in Snowflake’s Data Cloud, teams are using Python libraries to get more control over advanced workflows.

The only challenge is managing the data pipelines that feed these models. Somehow they still break on a regular basis, forcing constant updates to the code and manual reruns of the pipeline.

Ascend.io takes away the hassle of manual pipeline orchestration by automating everything about these data flows, including change propagation and error recoveries. This lets you focus on the fun parts of a Snowpark pipeline without any of the drag.

Join this instructor-led, virtual, hands-on lab to learn how to build one of these automated Snowpark pipelines in just 15 minutes.

In 60 minutes, you’ll:

– Set up your local Snowpark dev environment using the Ascend CLI

– Build your first ‘hello world’ pipeline using open source data

– Use the Ascend automation engine to put execution and maintenance of this pipeline on autopilot

This lab is perfect for data professionals who are familiar with SQL and Python for data engineering and want to learn more about the specifics of Snowpark libraries.

Prerequisites to participate in this lab:

We welcome everyone to attend and watch this virtual hands-on lab. However, to actively participate, you will need Ascend and Snowflake accounts.

– Account Admin role access for Snowflake or a Snowflake free trial

– Ascend cloud free account configured with Snowflake credentials (process here)


Familiarity with Python is optional. We will provide you with code for the lab.


More Resources

Data Pipeline Automation: A Definitive Guide

Explore how to automate your data pipelines & discover how automation eliminates tedious data engineering tasks so you can focus on delivering value.

Data Pipeline Optimization: How to Reduce Costs

Data pipeline optimization helps cut costs, enhance efficiency, pinpoint hotspots, and drive data-driven success.

Python for Data Engineering

Python is the ideal data engineering language. Learn why and how to get started with Python for your data engineering workflows.