Dataiku Honored with the Prestigious 2023 Databricks AI Partner of the Year Award

Dataiku, the platform for Everyday AI, today announced that it has been honored with the prestigious 2023 Databricks AI Partner of the Year Award. This recognition, presented at the Databricks Partner Summit during the annual Data + AI Summit in San Francisco, underscores Dataiku's unwavering commitment to innovation and collaboration in the dynamic AI and ML landscape.

Particularly in the emerging field of Generative AI, Dataiku has been instrumental in delivering multi-cloud AI and ML solutions that measurably enhance enterprise AI maturity, drive impactful digital transformations, and maximize return on investment.

As technological advancement accelerates, Dataiku and Databricks are committed to meeting the needs of users across organizations, encouraging collaboration, and supporting data and AI projects at scale. The partnership allows everyone from data professionals to business experts to blend Apache Spark™ code and visual recipes in Dataiku, all running on the Databricks Lakehouse Platform, fueling the development of successful data and AI initiatives.

"Through the transformative integration of Dataiku and Databricks, we are not only reshaping how businesses use AI and ML, but also powering the advancement of Generative AI as an everyday technology,” said Abhijit Madhugiri, Vice President, Global Technology Alliances at Dataiku. “We are proud to empower users with advanced analytics, making the seemingly impossible, possible. This strategic alliance is indeed a leap forward in our vision to enable all organizations to effectively utilize data, optimizing operations and creating new opportunities through Generative AI.”

This partnership provides an end-to-end platform that supports flexible workloads and instant access to data, empowering users to develop, deploy, and operate AI & analytics initiatives at enterprise scale. Key integration points between Dataiku and Databricks include:

  • Named Databricks Connector: Connect directly to a Databricks Lakehouse to read and write Delta tables in Dataiku, with data never leaving Databricks.
  • SQL Pushdown Computation: Push down visual and SQL recipes to the Databricks engine, leveraging the computational power of Databricks.
  • Databricks Connect “v2”: Write PySpark code in Python recipes or code notebooks to be executed on a Databricks cluster.
  • Exchange MLflow Models: Import a previously trained MLflow model from Databricks into Dataiku for native evaluation and operationalization, or export a model trained in Dataiku to Databricks.

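To illustrate the Databricks Connect “v2” integration point above, the sketch below shows how PySpark code might be pointed at a remote Databricks cluster using the `databricks-connect` package. This is an illustrative connection fragment, not Dataiku's implementation: the workspace URL, token, and cluster ID are placeholders, and within Dataiku these details would typically come from the configured Databricks connection rather than being hard-coded.

```python
# Illustrative sketch only: executing PySpark on a remote Databricks
# cluster via Databricks Connect "v2" (databricks-connect >= 13.x).
from databricks.connect import DatabricksSession

# All connection details below are placeholders; supply your own
# workspace URL, personal access token, and cluster ID.
spark = DatabricksSession.builder.remote(
    host="https://<workspace>.cloud.databricks.com",
    token="<personal-access-token>",
    cluster_id="<cluster-id>",
).getOrCreate()

# This DataFrame is evaluated on the remote Databricks cluster,
# not on the machine running this script.
df = spark.read.table("samples.nyctaxi.trips")
df.groupBy("pickup_zip").count().show()
```

In a Dataiku Python recipe, the same pattern means that code written locally in a notebook runs against Databricks compute, which is what allows visual recipes and Spark code to share the same engine.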
"With Dataiku and Databricks, everyone from data professionals to business experts has what they need to collaborate and develop successful data and AI projects at scale," said Roger Murff, VP of Technology Partners at Databricks. "Customers have requested this integration between Dataiku and Databricks, making it simple for data analysts and domain experts to mix Spark code and visual recipes in Dataiku and running the compute on Databricks."

To learn more about the partnership and how Dataiku and Databricks are driving innovation in Generative AI, tune into their session at the Data + AI Summit. For further insights: 

  • Read how you can Have your Cake and Eat It Too With Dataiku and Databricks
  • Get started with Databricks and Dataiku
  • Visit the Dataiku page on Databricks.com 
