Databricks held a Data and AI tour for Asia Pacific customers today, hosted by Ali Ghodsi, co-founder and CEO of Databricks. Ghodsi explained Databricks' "data lakehouse," saying "the data lakehouse offers a better path - an integrated and collaborative role-based experience with APIs."
Ghodsi explained, "A data lake alone presents challenges," and noted a company's well-intentioned data lake can easily turn into a data swamp if security, reliability, quality, and performance are not maintained. He noted performance is often a problem with data lakes because they are, by their very nature, geared towards loading both structured and unstructured data, with no specific format.
Instead, Ghodsi said, Databricks' lake-first approach builds upon the freshest and most complete data in an organisation, with AI and ML built in from the ground up, support for multi-cloud and inter-cloud, support for all use cases on a single platform, all built on open source and open standards.
Theory is one thing; Mercury NZ's Bryan Campbell took to the stage to explain his organisation's data journey and what it meant in practice.
Mercury NZ is a "gentailer," meaning the New Zealand-based company is both an electricity generator and an energy retailer. It generates electricity from 100% renewable resources spanning hydro, geothermal, and wind. It operates nine hydro stations and five geothermal stations, and is currently building NZ's largest wind farm. The business is listed on both the ASX and NZX and has approximately 800 team members across NZ.
"It's a broad and diverse business, which is great if you love data," Campbell said, explaining the business has complex assets, unpredictable fuels, and wholesale and financial markets to deal with along with a retail arm.
Campbell explained Mercury NZ's internal stakeholders were seeking data and insights around customer lifetime value, risk propensities, and personalised marketing options. Looking further ahead, the company recognised the value of demand and generation forecasting, market pricing risks, digital twins, and asset master data management.
Yet the company was constrained in achieving these goals by a high degree of data redundancy. This, Campbell said, was compounded by expense. "Data follows compute, where we were moving data to where it was needed. This was expensive in terms of time and cost, and in terms of quality, with data often being changed along the way."
Mercury NZ knew there must be a better way. Driven by the need to enable advanced analytics, create a self-service culture, implement technology to attract and retain talent, scale economically, and be faster to market, the company investigated solutions, a search that led it to Databricks and the data lakehouse design.
The solution that was implemented is shown below.
While Campbell noted Mercury NZ is still early in its journey, the lakehouse is operational and the business has already achieved wins. The drivers behind the project have been addressed by the Databricks-enabled solution, and the initial needs - customer lifetime value, risk propensities, and personalised marketing - have all been met.
With these wins under its belt, Mercury NZ is now looking to the future and the next set of items on its list, backed by a proven and stable data operation that has delivered business value, increased efficiency, and reduced cost.