3/10/2024

AT&T migration to Azure Databricks catalyzes technical staff, advances business goals

AT&T’s on-premises data architecture prevented decision-makers from connecting to a single source of truth and created data silos, costly compute workarounds, and barriers to innovation.

In migrating to Azure Databricks, AT&T has achieved significant savings by shutting down datacenters and moving members of its technical staff up the value chain to work on optimizing processes rather than patching operating systems.

The migration to Azure Databricks is helping AT&T achieve a five-year ROI of 300% while retiring more than 80 schemas, streamlining its overall data footprint, and accelerating its data science cycles by roughly three times.


In the business-to-consumer telecom space, companies are seeing demand for their services driven primarily by evolving user preferences, especially as new devices emerge with ever-higher requirements for data. According to Statista, the number of US smartphone users is expected to increase from 296.8 million in 2020 to a projected 320.4 million in 2025. Statista also notes that in 2024, US consumers spend around eight hours a day with digital media.

Keenly in tune with US consumer trends, and with a mission to bring connectivity to everyone no matter where they live or come from, AT&T knew it needed to scale its enterprise to prepare for this massive consumer growth. This included reducing the cost and complexity of maintaining its existing infrastructure and freeing staff from repetitive tasks so they could deliver more powerful experiences to customers.

Deploying a thoughtful, strategic approach to massive data migration

AT&T initiated its data migration in 2020 with a strong emphasis on a few key imperatives. The company wanted to empower its data teams by democratizing data and scaling up its AI efforts without overburdening its DevOps teams. It also wanted to make better use of insights to help improve customer experiences and increase efficiency across the enterprise. However, moving such a large, complex architecture comes with risks. At that time, AT&T’s data platform ecosystem ingested more than 10 petabytes of data per day, managed 100 million petabytes across the network, and came with a high cost of ownership.

“Our primary motivations for our cloud migration were scalability, elasticity, nimbleness, time to market, delivering additional value to our customers, and reducing cost,” recalls Mark Holcomb, AVP, Solution Architecture at AT&T. “Our datacenter was capital-intensive in terms of software costs and maintenance, and we had tens of thousands of CPUs on an enormous number of servers.” Holcomb notes that AT&T’s IT resources were spending too much time on low-value activities, such as patching operating systems. So, it began looking for a more flexible, scalable, and cost-effective solution in the cloud to help it move more of its resources up the value chain so that staff could focus on optimizing the business and delivering more value to its customers.

Across its prior architecture, AT&T ran several data management platforms, which locked the efforts of data teams into silos, making it difficult to access data and leading to data duplication and latency issues. As a result, some data metrics were missing a single source of truth.

“We had situations where different business leaders would report on the same metric, but each one was presenting slightly different numbers,” Holcomb explains. “For example, customer churn is a very important metric in our business, but it’s difficult to address the problem when each business unit has its own version of the truth. We had different teams analyzing different versions of data and making decisions based on that analysis. It was time to unify everyone on a single, reliable set of business data.”

Transforming company data culture with modern cloud architecture

Because AT&T serves more than 70 million postpaid wireless phone subscribers and over 7.5 million broadband households, the company wanted to ensure business continuity during its migration. AT&T prioritized data privacy, security, and governance as it took steps to democratize data. Shortly after the company began evaluating potential data engineering and AI solutions, Azure Databricks emerged as a strong candidate.

“We were looking specifically for a solution that could support all the use cases we had been running on-premises up to that point,” says Praveen Vemulapalli, Director – Data & Gen AI Architecture, Chief Data Office at AT&T.


“Azure Databricks came in as a highly mature product that checked all our boxes. We quickly determined that it could also support us in transforming our huge SAS analytics platform—including data storage, data science, and advanced analytics—from on-premises to the cloud. Most importantly, Azure Databricks supported our need to build everything in a private environment that would satisfy the stringent data security regulations under which we operate.”

Even though AT&T’s employees had spent many years on the company’s previous data architecture, they embraced Azure Databricks. The company collaborated with Microsoft and Databricks to develop and implement extensive training materials. It also selected change leaders from more than 60 business units, providing them with detailed, step-by-step job aids that would help them complete daily tasks on Azure Databricks.

Retiring old architecture, ushering in revitalized collaboration

This thoughtful strategy paid off. AT&T spent the next nine months moving its workloads to Azure Databricks and established enterprise data assets that replaced siloed data warehouses.

“Moving to Azure Databricks has transformed the data culture at AT&T,” Vemulapalli reports. “Instead of people analyzing data on their own laptops and saving the results locally, they’re all coming to the cloud to collaborate in one place. They’re sharing data assets with the rest of the company in notebooks, folders, and tables, and enhancing the overall quality of our analysis.”


Across AT&T, data teams and business users alike use Microsoft Power BI to build business intelligence dashboards that access data through a SQL warehouse in Delta Lake storage. When collaborators want to drill down to a machine learning model underneath a report or dashboard, they can easily access everything they need in the data lake. Azure Databricks has become the platform of choice for AT&T’s data engineers.
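AT&T has not published its pipeline code, but the pattern described above, in which one governed Delta table feeds both Power BI dashboards (through the SQL warehouse) and downstream data science work, can be sketched in a few lines of PySpark. The paths, table names, and columns below are purely illustrative, and spark refers to the session that Azure Databricks provides in every notebook.

```python
# Hypothetical sketch: publish one curated Delta table that both BI dashboards
# (via a Databricks SQL warehouse) and ML pipelines read. All names are invented.
# `spark` is the SparkSession the Azure Databricks notebook runtime provides.
from pyspark.sql import functions as F

raw = spark.read.parquet("/mnt/landing/wireless/churn_events/")  # hypothetical path

curated = (
    raw.withColumn("event_date", F.to_date("event_ts"))
       .groupBy("account_id", "event_date")
       .agg(F.count("*").alias("events"),
            F.max("churn_flag").alias("churned"))
)

# One write, one source of truth: the same Delta table backs dashboards and features.
(curated.write
        .format("delta")
        .mode("overwrite")
        .saveAsTable("analytics.customer_churn_daily"))

# Analysts, or Power BI through the SQL warehouse, query it with plain SQL.
spark.sql("""
    SELECT event_date, SUM(churned) AS churned_accounts
    FROM analytics.customer_churn_daily
    GROUP BY event_date
    ORDER BY event_date
""").show()
```

Because reports and models read the same governed table, the "different versions of the truth" problem described earlier has no room to reappear.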

“When we migrated to Azure Databricks, we prioritized retiring our most compute-intensive workloads to help us contain the growth of our previous on-premises data lake,” says Vemulapalli. “By the time we had gotten all our data out of our old architecture, we had already retired about 40% of the infrastructure we had been using. This greatly accelerated our ROI from the project.”

Celebrating datacenter shutdowns that yield a 300% ROI

By migrating to Azure Databricks, AT&T achieved one of its most important goals: data democratization. The company now supports nearly 90,000 internal customers on one data architecture and has reduced its Hadoop and Teradata footprint while eliminating countless data silos. Within this new architecture, AT&T can spin up new computing environments in hours, rather than the three to four months previously required.

Much of AT&T’s business case for the cloud migration was based on the economic value of shutting down various datacenters. When AT&T migrated its Hadoop data lake to Azure Databricks, the company achieved a five-year ROI of 300% while retiring more than 80 schemas and streamlining its overall data footprint.

Just as significantly, AT&T has reduced the amount of compute resources it must devote to maintaining its data products, leaving more capital available for strategic activities such as data science and data engineering. “We used to have to squeeze our data science activities into a relatively small compute bucket,” recalls Holcomb. “That meant we had to serialize our activities, which extended the timeline for testing a hypothesis with different inputs. By using Azure Databricks clusters to do that processing, we’ve accelerated our data science cycles by about three times.”
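The specifics of those workloads are not public, but a minimal sketch of the pattern Holcomb describes, fanning hypothesis tests out across a cluster instead of running them one after another, might look like the following. The dataset and hyperparameter grid are synthetic, and spark again refers to the notebook's built-in session.

```python
# Hypothetical sketch: evaluate many candidate model configurations in parallel
# across the cluster instead of serially on one machine. Data is synthetic.
# Assumes scikit-learn is available on the workers (as in Databricks ML runtimes).
from itertools import product
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a real feature table.
X, y = make_classification(n_samples=5_000, n_features=20, random_state=0)

# The grid that previously had to be evaluated one run at a time.
grid = [{"n_estimators": n, "max_depth": d}
        for n, d in product([100, 200, 400], [4, 8, 16])]

# Ship the training data to the workers once.
bX = spark.sparkContext.broadcast(X)
bY = spark.sparkContext.broadcast(y)

def evaluate(params):
    """Train and cross-validate one candidate configuration."""
    model = RandomForestClassifier(random_state=0, **params)
    score = cross_val_score(model, bX.value, bY.value, cv=3).mean()
    return {**params, "cv_accuracy": float(score)}

# Each worker evaluates a slice of the grid; results come back together.
results = (spark.sparkContext
                .parallelize(grid, numSlices=len(grid))
                .map(evaluate)
                .collect())

print(max(results, key=lambda r: r["cv_accuracy"]))
```

Running the whole grid at once, rather than one configuration at a time, is the kind of fan-out that shortens the test-a-hypothesis loop Holcomb mentions.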


Adopting a forward focus with modern tools

As AT&T continues to look for ways to serve its millions of customers more efficiently, the company is converging on a single source of truth. Vemulapalli believes a lakehouse architecture—combining powerful aspects of data lakes and data warehouses—will help make this vision a reality.

“AT&T is completely aligned with the lakehouse strategy for our data architecture,” Vemulapalli explains. “We just need to keep working to implement it completely across our environment. For example, we’re in the process of migrating our data marts to the lakehouse. Also, Databricks continues to release new services such as Unity Catalog, which we know will be an excellent solution for enhancing our data governance. We look forward to going live on Unity Catalog in the near future.”
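Unity Catalog was still on AT&T's roadmap at the time of this story, so the following is only a hypothetical sketch of what declaring governance in one place can look like. The catalog, schema, and group names are invented, and the statements run through the notebook's spark session.

```python
# Hypothetical Unity Catalog governance setup; all names are illustrative only.
# `spark` is the SparkSession that Azure Databricks provides in notebooks.
statements = [
    # A catalog gives lakehouse objects a single, governed namespace.
    "CREATE CATALOG IF NOT EXISTS consumer_wireless",
    "CREATE SCHEMA IF NOT EXISTS consumer_wireless.churn",

    # Grants are declared once and enforced wherever the data is accessed:
    # notebooks, SQL warehouses, or BI dashboards.
    "GRANT USE CATALOG ON CATALOG consumer_wireless TO `data-analysts`",
    "GRANT USE SCHEMA ON SCHEMA consumer_wireless.churn TO `data-analysts`",
    "GRANT SELECT ON SCHEMA consumer_wireless.churn TO `data-analysts`",
]

for stmt in statements:
    spark.sql(stmt)

# Auditing who can see what is then a single query.
spark.sql("SHOW GRANTS ON SCHEMA consumer_wireless.churn").show(truncate=False)
```

The point is the one Vemulapalli makes above: permissions live with the data in one place, rather than being re-implemented on every platform that touches it.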

AT&T’s innovation won’t stop there. The company plans to continue working closely with Microsoft and Databricks as it implements new services in Azure and completes its transition to the lakehouse.

“Our collaboration with Microsoft and Databricks has been exceptional throughout this project,” Vemulapalli concludes. “But as much as we’ve accomplished together, the journey continues. Every time Databricks releases a new feature, our teams get together to figure out how it fits into our environment. We continue to work toward having all the applications in our architecture interact directly with a single source of data through the lakehouse. Microsoft and Databricks are helping us get to the future and deliver greater possibilities for our customers.”

