
New Feature: Scala & Java Transforms

Apache Spark has grown tremendously over the years, attracting an increasingly broad group of developers, use cases, tools, and languages. When we first launched Ascend on top of Spark, we focused on SparkSQL as the primary language, given its declarative nature and, consequently, our ability to deeply understand the logic and intelligently orchestrate pipelines as code and data changed. Our users had a number of use cases that required capabilities not supported by SQL, however, which led us to add support for Python and PySpark.

Still, something didn’t seem quite right. Spark started with Scala and Java support, only adding SparkSQL and PySpark later. Many of our users had existing pipelines built in Scala or Java and didn’t want to port them over. While PySpark and SparkSQL may be the fancy new kids on the block, how could we forget our roots?

Today we’re excited to formally announce support for Scala & Java transforms. Not only does this expand our support to two of the most popular languages among data engineers, but it also marries these languages with the advanced orchestration and optimizations provided by Ascend.

Key benefits of Scala & Java on Ascend are:

  • More Logic, Less Configuration: simply implement an entrypoint function and your data logic, and the rest — scheduling, scaling, monitoring, and more — is automatically taken care of by Ascend.
  • Secure JAR Access: provide Ascend pointers to your JAR in a secure location.
  • JAR Caching: Ascend will keep a copy of your JAR within your Spark environment to ensure fast access.
  • Automatic JAR Change Detection: we’ll monitor your JAR for changes and refetch it when needed.
  • Code & Library Reuse: package common libraries and dependencies in your JAR, and even use the same JAR across multiple transforms.
  • Broad Spark Version Support: select from multiple Spark and Scala versions that best suit your needs.
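To make the entrypoint idea concrete, here is a simplified sketch in plain Scala. Note the assumptions: ordinary Scala collections stand in for Spark DataFrames so the example is self-contained, and the `Record` type, `MyTransform` object, and `transform` signature are all illustrative, not Ascend's exact interface.

```scala
// Simplified sketch of the entrypoint pattern. Plain Scala collections
// stand in for Spark DataFrames so the example is self-contained; in a
// real transform you would receive org.apache.spark.sql.DataFrame
// inputs. The `transform` name and signature are illustrative
// assumptions, not Ascend's exact interface.
case class Record(userId: String, amount: Double)

object MyTransform {
  // Entrypoint: receives the upstream dataset(s), returns one output dataset.
  def transform(inputs: Seq[Seq[Record]]): Seq[Record] =
    inputs.head
      .filter(_.amount > 0)  // drop refunds and invalid rows
      .groupBy(_.userId)     // one output row per user
      .map { case (user, recs) => Record(user, recs.map(_.amount).sum) }
      .toSeq
}
```

You write only the body of `transform`; in Ascend, the scheduling, scaling, and monitoring around that call are handled by the platform.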

Oh, and did we mention? You can even use these interchangeably with your PySpark and SparkSQL transforms!

To see Scala & Java transforms in action, we’ve created a short tutorial for you:


Head into your Ascend environment’s Build view, and click on Scala Transform to get started.
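Getting started also assumes you have a JAR to point Ascend at. A minimal `build.sbt` for producing one might look like the following sketch; the project name, version numbers, and the use of the sbt-assembly plugin are assumptions here, so match them to the Spark and Scala versions you select in Ascend.

```scala
// build.sbt — hypothetical sketch of a project that packages a
// transform as a single ("fat") JAR using the sbt-assembly plugin
// (added in project/plugins.sbt). Versions are examples only.
name         := "my-ascend-transforms"
scalaVersion := "2.12.18"

// Spark is supplied by the execution environment, so mark it
// Provided to keep it out of the packaged JAR.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.2" % Provided
```

Running `sbt assembly` then produces the JAR you upload to your secure location and reference from your transforms.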

With Scala & Java transforms, builders have a rich toolbox for modeling and wrangling data of any complexity. As always, we can’t wait to see what you build next.

Ready to start with Scala & Java transforms? Sign up for a free trial and start building! Want to learn more about this feature and the other parts of the Ascend Unified Data Engineering platform? Schedule a live demo with one of our data engineers. Have a favorite function you’d like to see us add support for? Join us in Slack (#feature-requests) and let us know — we’d love to hear from you! 
