Build an end-to-end data pipeline that solves a real problem. Leverage AI to build, deploy, and maintain it. Complete the challenge to win up to $500.



Leverage Otto, the data engineering agent, to build an end-to-end pipeline on Ascend. Your pipeline must include at least 4 components, including at least 1 ingestion component and 1 transformation component. Use at least 1 agentic feature to build or manage your data pipeline.
Join the Ascend Community Slack and find the #agentic-data-hackathon channel to ask questions. You can also attend optional Office Hours on February 3 & 5 to connect with the Ascend team and get any help you may need on your project. Click here to register to attend office hours.
Share a brief writeup or demo video about your project. Please include a description of your data pipeline, how you used AI, and what you learned along the way. All entries must be submitted by 11:59 PM PST on Friday, February 6th to be considered for prizes.
Join the Ascend Community Slack and find the #agentic-data-hackathon channel to ask questions, get help troubleshooting, and connect with other participants.
Get help via Zoom from the Ascend team during scheduled sessions on Feb 3 and 5.
Click here to register for office hours
Find additional resources including How-to Guides, Tutorials, and labs on our docs site at docs.ascend.io
The Hackathon is an event following our 2-day Agentic Data & Analytics Bootcamp, aimed at helping participants put their new skills into practice. However, it is open to anyone, regardless of participation in the Bootcamp, including data engineers, analysts, and anyone interested in building with Ascend. You must be 18+ to participate and receive prizes.
Your submission must meet these criteria:
✅ At least 1 agentic feature - Use one or more of Ascend's agentic capabilities to build or manage your pipeline:
- Otto chat to build components conversationally
- Custom Rules or Agents to customize agentic outputs
- Automated agentic responses to system events or schedules
✅ Minimum 4 components - Your pipeline must include at least 4 components, including:
- At least 1 ingestion component (Read Connector)
- At least 1 transformation component
✅ Orchestration/scheduling - Your pipeline should be scheduled or orchestrated (not just run ad hoc) and deployed to your Deployment.
Pipelines that don't meet these minimum requirements won't be eligible for judging. For a rough sense of what a qualifying pipeline shape looks like, see the sketch below.
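As a conceptual illustration only - plain Python with a hypothetical API endpoint and column names, not Ascend's actual component API - a 4-component pipeline might break down like this:

```python
# Conceptual sketch of a 4-component pipeline: 1 ingestion component,
# 2 transformation components, and 1 write component. The endpoint and
# column names are hypothetical placeholders.
import requests
import pandas as pd

def ingest_orders() -> pd.DataFrame:
    """Ingestion component: read raw orders from a hypothetical API."""
    resp = requests.get("https://example.com/api/orders")  # placeholder URL
    resp.raise_for_status()
    return pd.DataFrame(resp.json())

def clean_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Transformation component: dedupe and parse timestamps."""
    deduped = raw.drop_duplicates(subset="order_id").copy()
    deduped["ordered_at"] = pd.to_datetime(deduped["ordered_at"])
    return deduped

def daily_revenue(clean: pd.DataFrame) -> pd.DataFrame:
    """Transformation component: aggregate revenue by day."""
    return (clean.groupby(clean["ordered_at"].dt.date)["amount"]
                 .sum()
                 .rename_axis("order_date")
                 .reset_index())

def write_report(report: pd.DataFrame) -> None:
    """Write component: persist results for downstream consumers."""
    report.to_csv("daily_revenue.csv", index=False)

if __name__ == "__main__":
    write_report(daily_revenue(clean_orders(ingest_orders())))
```

In Ascend, each of these steps would typically be its own component (for example, a Read Connector for the ingestion step), with the platform handling orchestration between them.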
No. You can use public datasets, mock data, or sample data. What matters is that your pipeline demonstrates how you'd solve a real problem - the actual data source can be representative rather than proprietary.
Your pipeline must be built and run in Ascend, but you can absolutely use additional agentic capabilities from your data warehouse. For example, you might pull from an API, load the data into Snowflake, and leverage Snowflake's native AI capabilities with Cortex.
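As a rough sketch of that pattern - connection parameters, table, and column names below are placeholders, and Cortex availability depends on your Snowflake account and region - a downstream step could invoke a Cortex function in SQL:

```python
# Hypothetical sketch: after a pipeline lands data in Snowflake, a
# transformation step can call Snowflake Cortex functions via SQL.
import snowflake.connector

# Placeholder credentials - substitute your own connection details.
conn = snowflake.connector.connect(
    account="your_account",
    user="your_user",
    password="your_password",
    warehouse="your_warehouse",
    database="your_database",
    schema="your_schema",
)

cur = conn.cursor()
# SNOWFLAKE.CORTEX.SENTIMENT scores free text between -1 and 1.
cur.execute("""
    SELECT review_id,
           SNOWFLAKE.CORTEX.SENTIMENT(review_text) AS sentiment
    FROM customer_reviews
    LIMIT 10
""")
for review_id, sentiment in cur.fetchall():
    print(review_id, sentiment)
cur.close()
conn.close()
```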
Your writeup or video should cover:
- The problem you're solving
- Your data sources & pipeline design
- How you used Ascend's agentic features specifically
- What you learned during the project
You should also include links or screenshots so judges can verify it meets the requirements.
Writeup: 3-5 minute read (roughly 500-800 words)
Video: 5 minutes max
Office Hours will be held February 3 & 5 at 9am PT.
Can't make it live? You can ask questions async in Slack. We'd be happy to support you!
Judges will score each submission on three main areas:
1. Use of Agentic Features (35%)
Did you effectively leverage Otto to build your pipeline? Are you using agentic features to automate or simplify your workflow? Did you go beyond basic usage to show what makes agents and AI powerful?
2. Real Impact (35%)
Does this solve an actual problem data teams face? Is the use case realistic and valuable? Would this pipeline be useful in production? Does it address a meaningful pain point or inefficiency?
3. Creativity and Curiosity (30%)
Is there creative or innovative use of agents and AI features? Is there a clear, compelling explanation of your solution and what you learned? How did you use AI to move from zero to working pipeline quickly?
Judges will review your writeup/video and assess based on these criteria. All three areas matter - a pipeline that works perfectly but doesn't leverage AI creatively won't score as well as one that solves a trivial problem but demonstrates interesting use cases for AI in data engineering.