The Vast Frontier of Data Is Driving Next Generation Experiences


What’s most interesting about this year’s AWS re:Invent is the sheer vastness of the platform and capabilities.

In-person events rule.

The fatigue factor had definitely set in after doing remote conference events during the pandemic. It was delightful to see the 11th annual AWS re:Invent back in full force, with over 50,000 registered attendees (it felt like over 100,000) buzzing around more than 10 venues to experience over 2,000 sessions across 50 content tracks. The conference closed out earlier this month.

AWS re:Invent has always been a tech-forward conference packed with new product announcements, rich keynotes and a comprehensive catalog of sessions led by practitioners and customers. What’s most interesting about this year’s event is the sheer vastness of the platform and its capabilities. The technical ecosystem that has grown up around AWS is also extensive, spanning AI/ML, DevOps, observability, security, FinOps and many other areas that support AWS services. Fortunately, the expo floor was organized into neighborhoods of companies with similar capabilities, making them easier to find.

Applications are rapidly becoming more loosely coupled, making it possible to assemble and integrate new capabilities independently and at a speed like never before.

The AWS platform is a perfect example: it innovates at a rapid rate and continues to expand into higher-value managed offerings such as managed services (AWS Managed Services), integrated offerings (Amazon Redshift integration for Apache Spark), SaaS offerings (AWS Supply Chain, Amazon Connect) and many others.

The future of application development, specifically where it intersects data and customer experience, may never be the same again. Here are some key considerations for building a strong data and architectural foundation to deliver next-generation experiences for your customers.

Harnessing Data Proliferation Is the New Foundation of All Businesses

The vastness of a system is often underestimated and sometimes inconceivable. To fully grasp the magnitude of complicated, interconnected systems, continuous exploration with complementary and evolving tools is needed.

Data exploration is becoming like ongoing space exploration: new discoveries are ongoing and evolving. Consider the Eagle Nebula, whose Pillars of Creation were first photographed by the Hubble Space Telescope in 1995 and have since been re-imaged by the James Webb Space Telescope, unlocking new insights into star formation and many other patterns of the universe.

Just like space exploration, data is becoming so vast that it’s nearly impossible for any single set of tools to deal with all the complexities. While new statistics on data production appear all the time, the reality is that more data will be produced in the next five years than has been produced since the dawn of the digital age. To deal with this complexity, you need a variety of integrated data tools, real-time integration, proper governance and actionable insights.

AWS has assembled one of the most comprehensive sets of cloud-based database platforms: eight purpose-built, non-relational databases and five relational databases, including Aurora, which has become the fastest-growing service in the history of AWS. Aurora combines the performance and availability of traditional commercial databases with the simplicity and cost-effectiveness of open source databases.

There is also a full set of serverless analytics services, from large-scale data processing with EMR, to real-time streaming data with Amazon Managed Streaming for Apache Kafka (MSK), to Redshift for a fully managed, petabyte-scale data warehouse.

Machine learning and AI services such as SageMaker are being used to train models with billions of parameters and to make more than a trillion predictions every month.

Many companies are already turning these complexities into opportunities. Expedia processes 600 billion AI predictions per year, powered by over 70 petabytes of data. Pinterest stores over 1 exabyte of data on Amazon S3. Samsung’s 1 billion users make 80,000 requests per second. Netflix processes billions of traffic flows and terabytes of log data each day. Dow Jones has modernized its ML models to determine the best time of day to reach customers, which has doubled subscriber engagement rates.

Data is now at the center of every business, powering critical decisions and enabling rich customer experiences.

Related Article: A Decade of Dramatic Change in Digital Customer Experience

Integrating Data Rapidly Provides Faster Insights

Integration has long been the bane of any data solution, with ETL (extract, transform, load) famously referred to as a “thankless, unsustainable black hole.” AWS has worked to make integration easier with federated query capabilities in Redshift and Athena, which enable running queries across databases and clouds without moving any data.
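As a minimal sketch of what a federated query looks like in practice, the Python snippet below uses boto3 to submit an Athena query that joins an S3-backed Glue table with a table surfaced through an Athena federated data source connector. The catalog, schema, table and bucket names are hypothetical and assume a MySQL connector has already been deployed.

import boto3

athena = boto3.client("athena", region_name="us-east-1")

# Join an S3-backed Glue table with a MySQL table exposed through a
# federated data source connector -- no data is copied beforehand.
response = athena.start_query_execution(
    QueryString="""
        SELECT o.order_id, o.total, c.segment
        FROM awsdatacatalog.sales.orders o     -- hypothetical Glue table
        JOIN mysql_catalog.crm.customers c     -- hypothetical federated catalog
          ON o.customer_id = c.customer_id
        LIMIT 100
    """,
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)
print(response["QueryExecutionId"])  # poll get_query_execution for status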

AWS Data Exchange allows seamless integration of third-party datasets with your own data in Redshift, with no ETL required. SageMaker has been integrated with Aurora and Redshift to enable anyone to access machine learning models. The vision is ultimately a zero-ETL future. Aurora and Redshift now have a fully managed, zero-ETL integration that allows consolidated data from multiple databases to be seamlessly accessed. This unifies transactional data with analytics capabilities, eliminating the need to write custom pipelines between Aurora and Redshift.
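As a hedged sketch of what this unification means in day-to-day work: once a zero-ETL integration is active, the data replicated from Aurora appears as a database in Redshift and can be queried like any local table. The snippet below uses the boto3 Redshift Data API; the workgroup, database and table names are placeholders.

import boto3

rsd = boto3.client("redshift-data", region_name="us-east-1")

# Query Aurora transactional data replicated into Redshift by the
# zero-ETL integration -- no custom pipeline in between.
resp = rsd.execute_statement(
    WorkgroupName="analytics-wg",      # hypothetical Redshift Serverless workgroup
    Database="aurora_zeroetl_db",      # database created by the integration
    Sql="""
        SELECT product_id, SUM(quantity) AS units_sold
        FROM public.order_items        -- table replicated from Aurora
        GROUP BY product_id
        ORDER BY units_sold DESC
        LIMIT 10;
    """,
)
print(resp["Id"])  # statement id; fetch rows with get_statement_result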

Similarly, AWS added Redshift integration for Apache Spark to easily run Spark queries on Redshift data from EMR, Glue and SageMaker within seconds, all without needing to move any data to S3 manually or manage any connectors.
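To make that concrete, here is a minimal PySpark sketch of reading a Redshift table from an EMR, Glue or SageMaker job, assuming the connector packaging (io.github.spark_redshift_community.spark.redshift) that ships preinstalled on recent EMR and Glue images. The endpoint, table, staging path and IAM role below are placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("redshift-spark-demo").getOrCreate()

# Read a Redshift table directly into a Spark DataFrame; the connector
# handles unloading through the S3 staging path given in "tempdir".
df = (
    spark.read.format("io.github.spark_redshift_community.spark.redshift")
    .option("url", "jdbc:redshift://example-cluster:5439/dev?user=awsuser&password=***")
    .option("dbtable", "public.sales")                     # hypothetical table
    .option("tempdir", "s3://my-spark-staging/redshift/")  # hypothetical staging path
    .option("aws_iam_role", "arn:aws:iam::123456789012:role/redshift-spark")
    .load()
)

df.groupBy("region").count().show()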

These are significant steps toward a zero-ETL future.

Enabling Real-Time, AI-Powered Customer Experiences

The immense amount of data and the future of data integration are creating a perfect storm for AI/ML models to become smarter, make better decisions and predictions, and further personalize customer experiences.

Expedia is in the business of creating memorable experiences for its customers. Its platform provides far more than coordinating travel transactions; it has become a way for customers to experience and learn about the world.

Expedia’s reach is vast: over 168 million members, 50,000 B2B partners, 3 million properties and 500 airlines powering travel in over 70 countries. The company has amassed over two decades of travel behaviors, booking patterns, partner activity and much more. It’s fair to say data is its number one competitive weapon.

Rathi Murthy, Expedia Group CTO, provided several insights into the company’s approach to using AWS to power these customer-first experiences. To leverage the data it collects, Expedia has made significant investments in AI/ML and now processes over 600 billion predictions each year, powered by over 70 petabytes of data. Expedia is also enabling extreme personalization by creating over 360,000 permutations of a brand’s page so travelers see what’s most relevant to them.

According to Murthy, accomplishing this meant first modernizing the data and application infrastructure by transforming to a containerized architecture using Amazon EKS and Karpenter. Next, the team had to find a way to process millions of images and reviews with sub-millisecond latency by leveraging Amazon DynamoDB and SageMaker. Finally, they needed to create a new self-service platform that hosts over 29 million virtual conversations, saving over 8 million agent hours while improving customer response times.

Customers backed by the right tools and a strong data foundation can innovate on their customer experiences more readily than the competition.

Related Article: 5 Digital Customer Experience Trends for 2023


