Join us for AWS Analytics Modernization Week!

About the Event

Your organization expects more value from its data: strategic insights that drive better real-time business decisions, balanced against the technical demands of security, governance, volume, and a growing variety of new data sources and formats. AWS Analytics Modernization Week is an online event series designed to accelerate your learning around data, analytics, and machine learning best practices and to help you execute a modern, agile approach to analytics in the cloud. Sessions will focus on connecting disparate data assets with a Lake House approach, highlight real-time use cases that combine analytics and machine learning to drive better insights, and show how to build, manage, govern, and secure your data assets.

Join us July 19–22 for AWS Analytics Modernization Week, which provides deep-dive sessions and demos on how to leverage machine learning to access deeper insights, simplify your infrastructure management, and accelerate your operational analytics. Sessions will highlight Lake House Architecture and the AWS Analytics services and programs available to assist in your migration journey. Live AWS representatives will be available to answer your questions throughout the event, and we invite you to join any or all of the week's sessions. Register for the day(s) that work best for you.


Who Should Attend

This event is ideal for technical decision makers who are looking for guidance on how to become more agile, efficient, and innovative with their data projects. These sessions primarily feature 200-300-level content (Intermediate to Advanced), including technical presentations and demos designed for those experienced with the AWS console.

The intended audience for this event includes Internal IT Leaders, DevOps Leaders, Data Engineers, Data Scientists, Developers, Architects, Database Administrators, and Product Solutions Managers.

Event Agenda

  • Mon, Jul 19
  • Tues, Jul 20
  • Wed, Jul 21
  • Thurs, Jul 22

Mon, Jul 19

Get Insights from All Your Data to All Your Users

Monday, July 19, 2021, 9:00 AM - 12:00 PM PT | 12:00 PM - 3:00 PM ET

Analyze Data Across Your Lake House Architecture with Amazon Redshift (Level 300)
9:00 AM - 10:15 AM PT | 12:00 PM - 1:15 PM ET
Organizations can gain deeper, richer insights by bringing together relevant data of all types and structures, from all sources, and analyzing it with a Lake House Architecture. You can use Amazon Redshift to query data across Redshift clusters, your S3 data lake, Amazon Aurora, and other operational databases with Redshift Data Sharing, Redshift Spectrum, and Redshift Federated Query. Learn how to enable analytics across a broad range of sources without physically moving data between systems.

Enabling Seamless 360-degree View and Sales Planning for a Global Pharmaceutical Company

AWS Partner Agilisium shares how a global pharmaceutical company maximized the impact of their sales and marketing efforts through granular information on product movement and sales progression. Through an integrated solution with S3, Redshift, Spectrum/Athena, and RDS, the company was able to view activities and process logs to boost their marketing and strategic planning, while running the platform in a cost-effective way.

Democratize Machine Learning with Amazon Redshift ML (Level 200)
10:15 AM - 11:00 AM PT | 1:15 PM - 2:00 PM ET
Learn how your data analysts can create, train, and apply machine learning models using familiar SQL commands in Amazon Redshift data warehouses. With Redshift ML, you can take advantage of Amazon SageMaker, a fully managed machine learning service, without learning new tools or languages. Simply use SQL statements to create and train Amazon SageMaker machine learning models using your Redshift data and then use these models to make predictions.
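To make the "familiar SQL commands" concrete, here is a minimal sketch of the general shape of a Redshift ML CREATE MODEL statement, built as a string in Python. The table, column, and S3 bucket names are hypothetical placeholders; in practice you would run the resulting SQL in your Redshift cluster (for example via the query editor or the Redshift Data API).

```python
# Hedged sketch: build the general shape of a Redshift ML CREATE MODEL
# statement. All identifiers below (table, target column, bucket) are
# hypothetical placeholders, not names from the session itself.

def create_model_sql(model_name: str, table: str, target: str, s3_bucket: str) -> str:
    """Return a CREATE MODEL statement that trains on rows from `table`
    to predict the `target` column, staging artifacts in `s3_bucket`."""
    return (
        f"CREATE MODEL {model_name} "
        f"FROM (SELECT * FROM {table}) "
        f"TARGET {target} "
        f"FUNCTION predict_{target} "
        f"IAM_ROLE default "
        f"SETTINGS (S3_BUCKET '{s3_bucket}');"
    )

sql = create_model_sql(
    "customer_churn", "customer_activity", "churned", "my-redshift-ml-bucket"
)
print(sql)
```

Once the model is trained, the generated function (here `predict_churned`) can be called in an ordinary SELECT to score new rows.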

Access Deeper Insights with Machine Learning (Level 100)
11:00 AM - 12:00 PM PT | 2:00 PM - 3:00 PM ET
Learn how your business users can leverage AI/ML capabilities to enhance dashboards and reporting with no coding necessary. Using QuickSight's built-in anomaly detection, forecasting, natural language processing, and the newly announced QuickSight Q, any business user, regardless of technical skill, can ask deeply analytical questions on all of your data and receive answers in seconds. Additionally, integration with Amazon SageMaker opens the door to bringing your own predictive models to enrich your data visualizations.

Tues, Jul 20

Load Streaming Data to Your Lake House Architecture for Real-Time Analytics

Tuesday, July 20, 2021, 9:00 AM - 12:00 PM PT | 12:00 PM - 3:00 PM ET

Deliver Better Customer Experiences With Machine Learning in Real-Time (Level 200)
9:00 AM - 10:15 AM PT | 12:00 PM - 1:15 PM ET
Organizations are increasingly using machine learning to make near-real-time decisions, such as placing an ad, assigning a driver, recommending a product, or even dynamically pricing products and services. Real-time machine learning can substantially enhance your customers' experience, resulting in better engagement and retention. In this session, you will learn how you can use AWS data streaming platforms to support real-time machine learning.

Reveal Key Consumer Insights Using Real-Time Sentiment Analysis (Level 300)
10:15 AM - 11:30 AM PT | 1:15 PM - 2:30 PM ET
In this session, we’ll demonstrate how to perform real-time sentiment analysis on top of incoming customer reviews with serverless AWS technologies and natural language processing. You'll learn how to use Amazon Kinesis Data Streams, Kinesis Data Analytics, and Amazon Comprehend to power this approach and how it can be applied to other use cases such as real-time translation, PII detection, and redaction.
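One practical detail when feeding reviews into Amazon Kinesis Data Streams is that the PutRecords API accepts at most 500 records per request, so incoming reviews need to be chunked before delivery. The helper below is a minimal sketch of that batching step; the record shape (`review_id`, `text`) is a hypothetical example, not a schema from the session.

```python
# Hedged sketch: chunk review records into PutRecords-shaped batches that
# respect the 500-records-per-request limit of Kinesis PutRecords.
# The review dict fields used here are hypothetical placeholders.
import json

KINESIS_PUT_RECORDS_LIMIT = 500  # per-request record limit for PutRecords

def to_put_records_batches(reviews, limit=KINESIS_PUT_RECORDS_LIMIT):
    """Yield lists of {'Data', 'PartitionKey'} entries, each at most
    `limit` records long, ready to pass to a PutRecords call."""
    batch = []
    for review in reviews:
        batch.append({
            "Data": json.dumps(review).encode("utf-8"),
            "PartitionKey": str(review.get("review_id", "unknown")),
        })
        if len(batch) == limit:
            yield batch
            batch = []
    if batch:
        yield batch

batches = list(to_put_records_batches(
    [{"review_id": i, "text": "great product"} for i in range(1200)]
))
print([len(b) for b in batches])  # -> [500, 500, 200]
```

Each batch would then be submitted with a single PutRecords call, with the downstream consumer invoking Amazon Comprehend on the decoded review text.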

Patterns of Streaming Capabilities (Level 200)
11:30 AM - 12:00 PM PT | 2:30 PM - 3:00 PM ET
The use cases that lead organizations to adopt streaming capabilities largely fall into three broad patterns. From real-time hydration of data lakes to running machine learning models on streaming data, AWS Partner Infosys works with customers on the front lines and will share its perspective on these patterns with representative case studies.

Wed, Jul 21

Accelerate Your Operational Analytics

Wednesday, July 21, 2021, 9:00 AM - 12:00 PM PT | 12:00 PM - 3:00 PM ET

Put Out Fires Before They Become Wildfires: Find and Alert on Anomalies With ML Features in Amazon Elasticsearch Service (Level 200)
9:00 AM - 10:15 AM PT | 12:00 PM - 1:15 PM ET
Many organizations use Amazon Elasticsearch Service (Amazon ES) as their centralized infrastructure for operational analytics, leveraging its near-real-time analytics capabilities and visualizations to monitor and find issues in their critical infrastructure and applications. By coupling the power of search and ML, Amazon ES can alert users to anomalies and issues even faster, enabling them to head off problems before they become acute. In this session, you will learn:

1. Why Amazon ES is ideally suited for operational analytics.

2. What anomaly detection can do for you and why it is well suited to glean insight from vast datasets.

3. How to get started with anomaly detection in Amazon ES to identify and find problems faster in your log data.
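To give a feel for what getting started looks like, here is a minimal sketch of an anomaly detector definition for the Open Distro anomaly detection plugin used by Amazon ES. The index pattern, field names, and interval below are hypothetical placeholders; the definition would be POSTed to the plugin's detectors endpoint on your domain.

```python
# Hedged sketch: an Open Distro anomaly detector definition for Amazon ES.
# Index, field, and interval values are hypothetical placeholders; this
# dict is only assembled locally, not sent to a cluster.
import json

detector = {
    "name": "request-latency-detector",
    "description": "Flag unusual average request latency in app logs",
    "time_field": "@timestamp",
    "indices": ["app-logs-*"],
    "feature_attributes": [
        {
            "feature_name": "avg_latency",
            "feature_enabled": True,
            # Each feature is backed by a metric aggregation over the index.
            "aggregation_query": {
                "avg_latency": {"avg": {"field": "latency_ms"}}
            },
        }
    ],
    "detection_interval": {"period": {"interval": 10, "unit": "Minutes"}},
}
print(json.dumps(detector, indent=2)[:60])
```

The plugin evaluates the feature aggregation once per detection interval and raises an anomaly when the observed value diverges from the learned baseline.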

Trace Analytics with Amazon Elasticsearch Service: Combine the Power of Log Analytics and Distributed Tracing in a Single Platform (Level 200)
10:15 AM - 11:00 AM PT | 1:15 PM - 2:00 PM ET
Traditional methods of collecting logs and metrics from individual components and services in a distributed application do not allow for end-to-end insights. With trace analytics in Amazon Elasticsearch Service, developers and IT Ops can easily troubleshoot performance and availability issues in distributed applications. In this session, we'll discuss how this new feature works and allows for faster resolutions.

Analyzing Logs With Kinesis Data Firehose and Amazon Elasticsearch Service (Level 300)
11:00 AM - 12:00 PM PT | 2:00 PM - 3:00 PM ET
Using Amazon Elasticsearch Service for operational analytics has become an enterprise standard for many AWS customers. However, customers have asked for a fully managed approach to ingesting their logs into Amazon ES. In this session, you will learn how to use Kinesis Data Firehose to load your data into an Amazon ES endpoint in a VPC without having to create, operate, and scale your own ingestion and delivery infrastructure.
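The following is a minimal sketch of the request shape for creating such a delivery stream with an Amazon ES destination inside a VPC. The stream name, ARNs, subnet, and security group IDs are all hypothetical placeholders; the dict mirrors the parameters a boto3 `firehose.create_delivery_stream(**params)` call would take, but no call is made here.

```python
# Hedged sketch: parameters for a Firehose delivery stream that delivers
# to an Amazon ES domain in a VPC, with failed documents backed up to S3.
# Every name and ARN below is a hypothetical placeholder.
params = {
    "DeliveryStreamName": "app-logs-to-es",
    "DeliveryStreamType": "DirectPut",
    "ElasticsearchDestinationConfiguration": {
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-es-role",
        "DomainARN": "arn:aws:es:us-east-1:123456789012:domain/app-logs",
        "IndexName": "app-logs",
        "IndexRotationPeriod": "OneDay",
        # Buffer up to 5 MB or 60 seconds before delivering a batch.
        "BufferingHints": {"IntervalInSeconds": 60, "SizeInMBs": 5},
        "S3BackupMode": "FailedDocumentsOnly",
        "S3Configuration": {
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-es-role",
            "BucketARN": "arn:aws:s3:::app-logs-failed-docs",
        },
        # Placing the destination in a VPC keeps delivery off the public internet.
        "VpcConfiguration": {
            "SubnetIds": ["subnet-0abc1234"],
            "SecurityGroupIds": ["sg-0abc1234"],
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-es-role",
        },
    },
}
print(sorted(params["ElasticsearchDestinationConfiguration"]))
```

With the stream in place, producers simply write records to Firehose and the service handles buffering, retries, and delivery into the domain.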

Thurs, Jul 22

Securely Enable Your Data Processing Use Cases

Thursday, July 22, 2021, 9:00 AM - 12:00 PM PT | 12:00 PM - 3:00 PM ET

Deploy Lake House Architecture to Enable Self-Service Analytics with AWS Lake Formation (Level 300)
9:00 AM - 10:00 AM PT | 12:00 PM - 1:00 PM ET
Being data driven requires ubiquitous access to data in a secure and governed way. In this session, you will learn how AWS Lake Formation makes it easy to build, manage, and secure your data lake. We will cover how to set up fine-grained access permissions that enable secure access to data from a wide range of services. We will also cover how to update and delete data using Governed Tables and how to automatically optimize data for better query performance.

Accelerate Apache Spark and Other Big Data Application Development With EMR Studio (Level 300)
10:00 AM - 11:00 AM PT | 1:00 PM - 2:00 PM ET
Amazon EMR is the industry-leading cloud big data platform for processing vast amounts of data using open source tools such as Apache Spark and Presto. In April, we released EMR Studio, a new integrated development environment for data scientists to develop, visualize, and debug applications written in R, Python, Scala, and PySpark. Join this session to learn how to use EMR and EMR Studio to accelerate your big data work.

Simplify Infrastructure Management With Amazon EMR on Amazon EKS (Level 300)
11:00 AM - 12:00 PM PT | 2:00 PM - 3:00 PM ET
As big data and analytics workloads continue to grow, organizations are leveraging the containers and Kubernetes ecosystem to build cutting-edge data platforms. AWS launched Amazon EMR on Amazon EKS to help customers focus on developing applications without worrying about operating the infrastructure. In this session, you'll learn about the architecture of Amazon EMR on Amazon EKS and see a live demo showing how to get started in minutes. AWS experts will also share best practices for setting up monitoring, logging, and security, and how to optimize for cost.

Session Proficiency Levels Explained

  • Level 100

    Introductory

    Sessions will focus on providing an overview of AWS services and features, with the assumption that attendees are new to the topic

  • Level 200

    Intermediate

    Sessions will focus on providing best practices, details of service features and demos with the assumption that attendees have introductory knowledge of the topics

  • Level 300

    Advanced

    Sessions will dive deeper into the selected topic. Presenters assume that the audience has some familiarity with the topic, but may or may not have direct experience implementing a similar solution

  • Level 400

    Expert

    Sessions are for attendees who are deeply familiar with the topic, have implemented a solution on their own already, and are comfortable with how the technology works across multiple services, architectures, and implementations

Speakers

  • Raju Gulabani, VP of Databases, Analytics & AI, AWS

Raju Gulabani is VP of Databases, Analytics & AI within AWS at Amazon.com. He is responsible for P&L, product management, engineering, and operations for database services such as Amazon Aurora and Amazon DynamoDB, analytics services such as Amazon Redshift and Amazon EMR, and AI services like Amazon Lex, Amazon Polly, and Amazon Rekognition. Prior to joining Amazon in his current position in 2010, Raju spent four years at Google, where he built the Google Apps business (now known as G Suite). Earlier in his career, Raju founded an Intel-backed Wi-Fi Voice over IP company and held engineering management positions at Microsoft.

  • Ryan Kelly, Data Architect, Equinox

Ryan Kelly is a data architect at Equinox, where he helps outline and implement frameworks for data initiatives. He also leads clickstream tracking, which helps teams gain insights into their digital initiatives. Ryan loves making it easier for people to reach and ingest their data for business intelligence, analytics, and product/service enrichment. He also loves exploring and vetting new technologies to see how they can enhance what they do at Equinox.

  • Richard Boyd, Cloud Data Engineer, iRobot

    Richard Boyd is a cloud data engineer with the iRobot Corporation’s Cloud Data Platform where he builds tools and services to support the world’s most beloved vacuum cleaner. Before joining iRobot, Richard built discrete event simulators for Amazon’s automated fulfillment centers in Amazon Robotics. His previous roles include cyber warfare systems analyst at MIT and research for the Center for Army Analysis. He holds advanced degrees in Applied Mathematics & Statistics.

Customer Highlights

Epic Games' entire analytics platform runs on AWS. Billions of game events, like player interactions on the map, their accuracy, damage taken and dealt, and what resources they are using, are all sent to AWS.

Epic Games

FAQs

  1. Where is this event?
  2. Do I have to register for the entire week?
  3. How much does this event cost?
  4. What are the prerequisites before attending the event?

Q: Where is this event?

A: This event is an online event, hosted by AWS on the GoToWebinar platform.

Q: Do I have to register for the entire week?

A: No, you can register for the day(s) that work best for you.

Q: How much does this event cost?

A: There is no cost to attend this event.

Q: What are the prerequisites before attending the event?

A: There are no prerequisites for attending the event. We encourage attendees to browse the Analytics page on the AWS website to get a brief overview of the services available to them.