5 Takeaways from the Data Pipeline Automation Summit 2023
Discover the top takeaways from the Data Pipeline Automation Summit 2023, including the future of automation, live data sharing, and cloud cost savings.
Get insights and advice on automating repetitive data engineering, optimizing data platform costs, and accelerating data initiatives at your company.
Discover five reasons to join our 2023 Data Pipeline Automation Summit — completely virtual and free — and register now!
We’re excited to introduce Robin Rostratter as Head of EMEA Sales to better engage and support our existing and new business across Europe, the Middle East, and Africa.
Jenny Hurn, our People Experience champion, dives into what makes Ascend one of the best places to work and shares what it means to create an award-winning workplace and culture.
In this episode, Sean and I talk with Jon Saltzman about all things data lineage and its importance at every step of the data pipeline.
In this episode, we go back to the foundations of data engineering and data pipelines with a deep dive into data ingest.
In this episode, we discuss the concept of accidental software: when a team is slowed down by the burden of managing and maintaining commoditized software.
In this episode, we do things a bit differently and chat with one of Ascend’s lead engineers, Nandan Tammineedi, about his work helping develop the Ascend for Snowflake platform.
Data orchestration is the step in the data management process that leverages software to optimally and efficiently move and process data.
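To make that definition concrete, here is a minimal sketch of the core ordering problem an orchestrator solves: run each step only after its upstream dependencies have completed. The pipeline steps and dependencies are hypothetical, and Python's standard-library TopologicalSorter stands in for a real orchestration engine.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each step name maps to the set of steps it depends on.
pipeline = {
    "ingest_orders": set(),
    "ingest_customers": set(),
    "join_orders_customers": {"ingest_orders", "ingest_customers"},
    "publish_report": {"join_orders_customers"},
}

def run_step(name: str) -> None:
    # Placeholder for real work: extract, transform, or load data.
    print(f"running {name}")

# The orchestrator's job in miniature: execute steps in an order that
# respects every dependency.
for step in TopologicalSorter(pipeline).static_order():
    run_step(step)
```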
In this episode, we look at a foundational aspect of data engineering workloads—automation, including the confusion that frequently surrounds data automation and how some data teams can avoid the major pitfall of going too far too fast.
Data ingestion is the process of transporting data from one or more sources to a storage location where it can be analyzed and used by an organization.
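As an illustration of that pattern, here is a minimal extract-and-land sketch in Python. The source URL and destination file are hypothetical placeholders; a production pipeline would add scheduling, incremental loads, schema handling, and error recovery.

```python
import csv
import json
from urllib.request import urlopen

# Hypothetical source API and landing file; any system returning JSON records
# and any durable storage location would follow the same pattern.
SOURCE_URL = "https://api.example.com/orders"
DESTINATION = "orders_raw.csv"

def ingest(source_url: str, destination: str) -> int:
    # Extract: pull raw records from the source system.
    with urlopen(source_url) as response:
        records = json.load(response)

    # Load: land the records, unmodified, where downstream tools can use them.
    with open(destination, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=records[0].keys())
        writer.writeheader()
        writer.writerows(records)

    return len(records)

if __name__ == "__main__":
    print(f"ingested {ingest(SOURCE_URL, DESTINATION)} records")
```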
In this episode, we chat about one of the foundational principles of data engineering: data orchestration. What it is, who needs it, and how data engineering organizations need it to evolve over the next few years.