The Build Plane

Program your data pipelines end-to-end on a single pane of glass.

Diagram of the Ascend platform with the build plane highlighted in red to build end-to-end data pipelines on a single pane of glass.

What Is The Build Plane?

Eliminate tool-specific silos and consolidate your pipelines.

Unlike other data tech stacks, Ascend lets you build intelligent data pipelines entirely end-to-end on a single pane of glass. You can use the rich UI to program ingestion and transformation logic, visualize lineage, and monitor operations across development, test, and production. You can also develop pipelines programmatically with rich data engineering functions in the CLI/SDK, and incorporate them into your CI/CD workflows.
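As a toy illustration of the kind of check a CI/CD workflow might run against a programmatically defined pipeline, consider validating that every component's inputs refer to defined components. The spec layout, component names, and `validate` function below are our own sketch, not Ascend's SDK:

```python
# Hypothetical pipeline spec that a CI job could validate before deployment.
# Field and component names are illustrative, not Ascend's actual API.
pipeline = {
    "read_orders": {"type": "ingest", "inputs": []},
    "clean_orders": {"type": "transform", "inputs": ["read_orders"]},
    "publish": {"type": "share", "inputs": ["clean_orders"]},
}

def validate(spec):
    """CI-style gate: every declared input must be a defined component."""
    errors = []
    for name, component in spec.items():
        for upstream in component["inputs"]:
            if upstream not in spec:
                errors.append(f"{name}: unknown input {upstream!r}")
    return errors

errors = validate(pipeline)
```

A gate like this fails the build before a broken pipeline definition ever reaches production.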

Collection of data sources Ascend has connectors for to ingest data and build end-to-end data pipelines on the platform.

Ingest

Loading data is the first step for any data pipeline. The Ascend ingestion framework provides over 300 connectors to get any data from anywhere, and load it into your data clouds.

All ingestion is incremental by default, including database CDC, with full resync functions also available. The ingestion process includes the generation of metadata for the control plane to automate the downstream pipelines.
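The incremental-by-default pattern can be sketched as a watermark-based loader: each run picks up only rows changed since the last high-water mark, and a full resync discards that state. This is a generic illustration; the class and field names are ours, not Ascend's API:

```python
from dataclasses import dataclass, field

# Stand-in source table: rows carry an id and a modification timestamp.
SOURCE = [
    {"id": 1, "updated_at": 100},
    {"id": 2, "updated_at": 150},
    {"id": 3, "updated_at": 200},
]

@dataclass
class IncrementalLoader:
    """Tracks a high-water mark so each run loads only new or changed rows."""
    watermark: int = 0
    loaded: list = field(default_factory=list)

    def run(self, source):
        new_rows = [r for r in source if r["updated_at"] > self.watermark]
        self.loaded.extend(new_rows)
        if new_rows:
            self.watermark = max(r["updated_at"] for r in new_rows)
        return new_rows

    def full_resync(self, source):
        """Discard incremental state and reload everything from scratch."""
        self.watermark = 0
        self.loaded = []
        return self.run(source)

loader = IncrementalLoader()
loader.run(SOURCE)          # first run loads all three rows
SOURCE.append({"id": 4, "updated_at": 250})
delta = loader.run(SOURCE)  # second run loads only the new row
```

The watermark is the metadata the paragraph above alludes to: persisting it is what lets downstream steps react only to deltas instead of full reloads.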

Use a Dedicated deployment to securely access private data sources in your network.

Product screenshot to show the data transformation feature in the Ascend platform

Transform

Data flows seamlessly from ingestion into the DAG of intelligent data pipelines. Use the SDK and the intuitive UI to program the data logic in each of the transformation steps. The control plane autonomously runs the pipelines by sending the transformation logic to the data planes in the right order. The data never leaves your data clouds, where you can query it at any point in the pipelines, anytime.
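As a minimal sketch of transform logic executing where the data lives, here an in-memory SQLite database stands in for a data cloud, and the transformation step is plain SQL; the table names are invented for the example:

```python
import sqlite3

# In-memory stand-in for a data cloud warehouse.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 50.0, "complete"), (2, 20.0, "cancelled"), (3, 30.0, "complete")],
)

# One transformation step: the SQL logic is sent to the data, not vice versa.
conn.execute("""
    CREATE TABLE clean_orders AS
    SELECT id, amount FROM raw_orders WHERE status = 'complete'
""")

# Intermediate results remain queryable at any point in the pipeline.
total = conn.execute("SELECT SUM(amount) FROM clean_orders").fetchone()[0]
```

Because each step materializes inside the warehouse, any intermediate table can be inspected mid-pipeline without exporting data.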

Product screenshot to show the data orchestration feature in the Ascend platform.

Orchestrate

The UI automatically generates the lineage from the transform logic provided by the user in the form of SQL, Python, and Java. Following this lineage, the control plane self-orchestrates the sequences of workloads for each incremental change to propagate the data through the pipelines. The user controls key orchestration parameters such as refresh rates, data set repartitioning rules, and actionable data quality rules.

Ascend pipelines can also be connected to external orchestrators to control pipeline flow and receive signals of pipeline execution.
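The lineage-driven propagation described above can be sketched with a topological sort over the dependency graph: execution order falls out of the lineage, and an incremental change triggers only its downstream steps. Step names are illustrative:

```python
from graphlib import TopologicalSorter

# Lineage derived from transform logic: each step maps to its upstreams.
lineage = {
    "clean_orders": {"raw_orders"},
    "clean_users": {"raw_users"},
    "revenue_report": {"clean_orders", "clean_users"},
}

# A valid execution order derived purely from lineage.
order = list(TopologicalSorter(lineage).static_order())

def steps_to_refresh(changed, lineage):
    """Return the downstream steps affected by one incremental change."""
    affected = set()
    frontier = {changed}
    while frontier:
        node = frontier.pop()
        for step, upstreams in lineage.items():
            if node in upstreams and step not in affected:
                affected.add(step)
                frontier.add(step)
    return affected
```

For example, a change to `raw_orders` refreshes `clean_orders` and `revenue_report` but leaves `clean_users` untouched, which is what keeps incremental runs cheap.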

Product screenshot to show the share functionality on the Ascend platform.

Share

Intelligent data pipelines can be tapped at any point to share data. They serve as the foundation for data mesh implementations.

First, they can be linked with other data pipelines. These links guarantee continuity of lineage, schema, and orchestration across huge webs of pipelines — even if these span data clouds.

Second, they can be written to any number of external systems, commonly known as reverse ETL. The data in all destinations is always kept in sync with the data pipeline. This way, multiple teams are always working with one shared version of the truth.
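Keeping a destination in sync amounts to an upsert-plus-delete reconciliation on each run. A minimal sketch, with the destination modeled as a dict keyed by `id` (the function and data shapes are ours, for illustration):

```python
def sync_destination(pipeline_data, destination):
    """Reconcile a destination with the pipeline: upsert current rows,
    delete rows that no longer exist upstream (keyed by id)."""
    pipeline_ids = {row["id"] for row in pipeline_data}
    # Remove stale rows so the destination never drifts from the pipeline.
    for stale_id in set(destination) - pipeline_ids:
        del destination[stale_id]
    # Upsert every current row.
    for row in pipeline_data:
        destination[row["id"]] = row
    return destination

pipeline = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
dest = {2: {"id": 2, "v": "old"}, 3: {"id": 3, "v": "gone"}}
sync_destination(pipeline, dest)
```

Running this after every pipeline change is what keeps all destinations on the same shared version of the truth.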

Ways to Get Started

Learn

Stay up to date with the team building the future of data pipeline automation.

Use

Are you a developer? Get started for free. No credit card required.

Talk

Do you lead a data team? Talk to a Field CTO about refocusing your team on accelerating value.