Databricks Data + AI Summit 2025 has come and gone, and it offered plenty to be excited about. With over 22,000 attendees, two days of keynotes, and 700+ sessions, it brought together the people shaping the future of data and AI. There were many announcements over the conference’s three days in San Francisco, but here are a few of the Databricks products that we’re eager to help our clients leverage.

Lakebase (Now in Public Preview)

A Postgres database built for the Lakehouse and optimized for AI.

Lakebase is Databricks’ fully managed Postgres database, integrated directly into the Lakehouse and purpose-built for AI applications. Operational databases (OLTP) have historically lived in separate systems from analytics platforms, often requiring complex, brittle extract, transform, load (ETL) pipelines to connect to the modern data stack. Lakebase eliminates this divide by natively supporting both transactional and analytical workloads in a single unified platform.

Because Lakebase separates compute from storage, your teams can scale infrastructure independently while maintaining low-latency transaction performance. Rather than spending time on provisioning or upkeep, you can rely on the platform to handle the heavy lifting in the background. Under the hood, Lakebase runs on Neon (from Databricks’ recent acquisition), making it a modern, cloud-native setup that’s perfect for AI-driven apps.

The most exciting part about Lakebase is its implications for AI agents and intelligent apps. Lakebase makes it easy to spin up transactional databases as part of an AI-driven workflow. The complexity of managing separate stacks is gone, and AI-driven systems can deploy, query, and operate OLTP databases with minimal friction. Looking ahead, the prediction is that the vast majority of Lakebase deployments will be created by AI agents rather than by humans.
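Because Lakebase is standard Postgres under the hood, existing Postgres tooling should connect without special drivers. Here's a minimal sketch of building a connection string for it; the host, database, and credentials below are illustrative placeholders, not real Databricks values:

```python
def lakebase_dsn(host: str, dbname: str, user: str, password: str,
                 port: int = 5432) -> str:
    """Build a libpq-style connection string for a Postgres endpoint.

    Lakebase speaks the standard Postgres wire protocol, so any Postgres
    client that accepts a DSN should work unchanged.
    """
    return (
        f"host={host} port={port} dbname={dbname} "
        f"user={user} password={password} sslmode=require"
    )

# Usage against a live instance (requires `pip install psycopg2-binary`);
# the host and credentials here are hypothetical:
#
#   import psycopg2
#   conn = psycopg2.connect(
#       lakebase_dsn("my-lakebase.example.cloud", "orders_db", "app_user", "***")
#   )
#   with conn, conn.cursor() as cur:
#       cur.execute("SELECT order_id, status FROM orders LIMIT 5")
#       print(cur.fetchall())

print(lakebase_dsn("my-lakebase.example.cloud", "orders_db", "app_user", "***"))
```

The point of the sketch is that no Lakebase-specific SDK is needed: an ordinary Postgres driver and DSN are enough, which is what makes it easy for an agent or app to provision and query a database programmatically.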

What Lakebase Means for Our Clients

For many clients, integrating operational data into their analytics stack has long been a pain point. Lakebase offers a seamless way to bring both under the same roof. Clients with siloed operational and analytical systems can now unify their data with a single provider, simplifying architecture and reducing time-to-insight. That said, there are real considerations: operational databases are business-critical and often carry high I/O volumes, demanding robust design along with specific governance and monitoring strategies. With Databricks managing the infrastructure, our clients can focus on building data-driven applications instead of maintaining backend systems.

Mosaic AI Agent Bricks (Now in Beta)

Auto-optimized agents for production-ready AI.

This was the announcement that created buzz around the Moscone Center. Agent Bricks lets your teams build, test, and deploy domain-specific AI agents through an intuitive interface where you describe the task the agent should handle. Whether you use it for information extraction, question answering, or as a knowledge assistant, you can rely on the platform to automatically select models, assess quality and accuracy, optimize performance, and generate high-quality, production-ready agents.

Agent Bricks dramatically accelerates the time to bring agents to life. What once took months of infrastructure setup, trial and error, and tuning can now happen in days, all within a managed, scalable environment. It’s a leap forward for operational AI: agents become deployable business assets rather than research experiments.

What Agent Bricks Means for Our Clients

Brooklyn Data clients now have a way to fast-track building and operationalizing AI agents without a dedicated machine learning (ML) engineering team. However, to take full advantage of Agent Bricks, they need to be AI-ready. This means having the right data foundation in place: clean, accessible, well-structured datasets that can feed intelligent agents. We’re entering a world where production AI is no longer reserved for tech giants. With tools like these, lean data teams can build impactful AI-driven services, provided their data is prepared to support it.

If you’re curious about our approach to AI readiness, take a look at the post by our VP of Data Engineering, David Gelman.

AI/BI Genie, Together with Databricks One

Natural language business intelligence (BI) isn’t a new idea, but now it works for everyone.

Genie was first announced last June, but it’s already seeing real use from Databricks customers who want easier ways to work with their data. AI/BI Genie is now generally available to all Databricks users and brings conversational AI directly into the Databricks platform, making it easier than ever to ask data questions in plain English and get instant, useful insights. It understands context and previous queries, integrates with dashboards and notebooks, and can even generate charts and summaries on the fly.

However, Genie’s real power comes from its integration with Databricks One, the new unified workspace announced at the summit. You no longer need to jump between tools or reformat data to get answers — everything happens within one seamless, end-to-end environment. Non-technical users can easily navigate the workspace, interact with trusted data sources, and generate insights on their own. Genie works right alongside your notebooks, dashboards, and SQL queries, extending Databricks into a conversational interface for analytics.

“Genie Deep Research” was also announced as an experimental feature coming this summer. Genie will be able to tackle more complex questions, break them apart, and even suggest what to explore next. Instead of just giving you a quick answer, Genie will be able to automatically generate multi-step queries, explore different angles of the data, and surface trends, anomalies, or correlations. 

What AI/BI Genie Means for Our Clients

The introduction of AI/BI Genie gives more stakeholders the ability to interact with data directly via natural language. This is a powerful unlock for a data-driven culture. But like Agent Bricks, its success depends heavily on the underlying data model and AI readiness. Teams are now under even more pressure to ensure their data and AI foundations are solid, so that AI-driven tools can generate useful and trustworthy results. 

Lakeflow Designer (Coming Soon to Private Preview)

Visual ETL pipelines, no code required.

Lakeflow Designer is Databricks’ new visual pipeline builder, enabling you to construct ETL workflows with drag-and-drop components and natural language commands. This democratizes the data engineering process for a broader audience, making it more accessible for teams that may not have dedicated data engineers.

Lakeflow’s key advantage over other visual ETL tools is its direct integration into the Databricks ecosystem. You can build, monitor, and iterate on pipelines without ever leaving your Databricks environment. 

What Lakeflow Means for Our Clients

Lean data organizations often rely on team members who can wear multiple hats. Lakeflow Designer offers analytics and BI professionals increased ownership over upstream pipeline development, shrinking the gap between raw data and insights. For those who already use Databricks for analytics and rely on third-party ETL tools for ingestion, this provides a path to consolidation, reducing tool sprawl and improving pipeline visibility across teams.


Want to know how these Databricks tools fit into your data strategy?

We can walk you through their benefits, show you how to leverage them, and explain how to get your team AI-ready so you can maximize their potential.

Curious About These Features, But Aren’t a Databricks Customer?

Try Databricks Free Edition!

While not necessarily a feature, the announcement of Databricks Free Edition is a breath of fresh air for data enthusiasts who are excited by the latest in data and AI, but don’t have access to a modern platform to try it for themselves. Sign up and learn more about it here.

The Future of Data and AI Is Here — Are You Ready to Embrace It?

The 2025 Data + AI Summit revealed the ways Databricks is simplifying and accelerating the path from raw data to intelligent action. From Lakebase’s operational and analytics unification to Agent Bricks’ streamlined AI deployment, Genie’s natural language BI, and Lakeflow’s accessible ETL, it’s clear that Databricks is building not just for data engineers and scientists but for every team involved in turning data into impact.

These tools are more than just technical upgrades — they represent a shift toward enabling leaner teams to do more with less, and empowering non-technical users to participate meaningfully in the data ecosystem. But as always, the key to unlocking this value is having the right foundation: clean, well-governed, and accessible data. 

Whether you're already a Databricks customer or just curious, this is a great time to reconsider your data strategy and AI readiness. The future of data and AI is no longer on the horizon — it’s here, and it's operational. Want to learn more about Databricks’ offerings or assess your AI-readiness? Reach out. Our data experts would be happy to chat with you.
